Fives

This is in response to Collin Brooke, who asked for some lists of 5.

5 Books On My Desk

  1. The Bourgeois, Franco Moretti.
  2. In the Footsteps of Genghis Khan, John DeFrancis.
  3. Warriors of the Cloisters: The Central Asian Origins of Science in the Medieval World, Christopher Beckwith.
  4. Medieval Rhetoric: A Select Bibliography, James J. Murphy.
  5. The Invaders, Pat Shipman.

5 Most Played Songs in my iTunes

  1. All Night Long, Lionel Richie
  2. We Are All We Need, Above and Beyond
  3. In My Memory, Tiesto
  4. Two Tickets to Paradise, Eddie Money
  5. We Built This City, Starship

5 Toppings That I Just Put On My Frozen Yogurt

  1. Peanut butter cups
  2. Chocolate chips
  3. M&Ms
  4. Cookie dough
  5. Whipped cream

5 Alcoholic Beverages In The Kitchen

  1. Bud Light Lime
  2. Jose Cuervo Silver
  3. Triple-sec
  4. E&J Brandy
  5. Fireball

5 TV Shows in My Netflix Instant Queue

  1. Mad Men
  2. Human Planet
  3. Don’t Trust the B—- In Apartment 23
  4. Wild India
  5. Blue Planet

Graphing Early Modern Memory Treatises

Rhetoric’s fourth canon—memory—is synonymous with the art of memory. Today, the constructions of the ars memorativa are known as “memory palaces,” and they have been popularized by characters like Hannibal Lecter and Sherlock Holmes. The technique goes back to the Greek sophists, for it is mentioned in the Dissoi Logoi (circa 420 BCE) and is said to have been formalized by the lyric poet Simonides of Ceos (d. 468 BCE).

The art was elaborated upon by the Romans and went through various transformations during the middle ages (in fact, it has been argued by Frances Yates, Lina Bolzoni, and others that the Divine Comedy, with its carefully ordered visualizations of virtue and vice, presents nothing less than a Christianized memory palace). The art flourished during the early modern era despite the rise of print technologies and did not disappear from the intellectual scene until the late eighteenth century, demonstrating that until relatively recently neither writing nor print was seen as a substitute for a strong memory, as both are today.

As part of my dissertation on the art of memory, I searched several bibliographies of rhetoric, eventually compiling 423 memory treatises. Of these, 261 were published between 1430 and 1800; 162 were published in the nineteenth century. Bibliographies consulted were Lawrence D. Green and James J. Murphy’s Renaissance Rhetoric Short-Title Catalogue, 1460-1700; Heinrich F. Plett’s English Renaissance Rhetoric and Poetics: A Systematic Bibliography of Primary and Secondary Sources; George S. Fellows’ “Bibliography of Mnemonics,” published in A.E. Middleton’s Memory Systems New and Old; and the British Library’s Incunabula Short Title Catalogue. At the bottom of this post, there is a download link for an Excel file listing all treatises published between 1430 and 1800, with Date, Title, Author, Author’s Nationality, and Place of Publication. (I’ll upload the nineteenth-century treatises at a later date.)

Of course, that number 423 does not represent 423 unique treatises; many are second or third editions of an original work. But it was important to count multiple editions because I was interested primarily in the rise and fall of this genre’s (the memory treatise’s) popularity over time.

Here are some graphs that have ended up in my dissertation. The first graph displays all memory treatises published, by country, between 1551 and 1600. The second displays all memory treatises published, by country, between 1601 and 1650. The third displays memory treatises published in all countries, by half-century.

[Figure: Memory treatises published, by country, 1551–1600]

[Figure: Memory treatises published, by country, 1601–1650]

[Figure: Memory treatises published in Europe, per half-century]
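
For the curious, here’s a minimal Python sketch of how counts like these can be tallied from the spreadsheet linked at the bottom of this post. The filename is hypothetical, Date is assumed to be a plain four-digit year, and the Place of Publication column stands in for country:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load the treatise list (columns: Date, Title, Author,
# Author's Nationality, Place of Publication)
df = pd.read_excel("memory_treatises_1430-1800.xlsx")

# Bin each treatise into a half-century: 1551-1600, 1601-1650, etc.
df["HalfCentury"] = (df["Date"] - 1) // 50 * 50 + 1

# Treatises published, by place, within one window (here, 1551-1600)
window = df[df["HalfCentury"] == 1551]
window["Place of Publication"].value_counts().plot(kind="bar")
plt.ylabel("Memory treatises published")
plt.show()

# Treatises published in all countries, per half-century
df.groupby("HalfCentury").size().plot(kind="bar")
plt.ylabel("Memory treatises published")
plt.show()
```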

There are several prominent trends in these graphs that I analyze in the dissertation, but the one I want to call attention to here is the drop in English memory treatises between the end of the sixteenth century and the beginning of the seventeenth. Between 1551 and 1600, England keeps pace with the publication of memory treatises on the continent. Between 1601 and 1650, however, with memory treatises flourishing in France and Germany, the publication of treatises in London slows dramatically. This is made especially curious by the fact that 1601–1650 marks the emergence of London as a major publication center. Before 1600, England was producing fewer than 160,000 books per year, compared to 700,000 per year in France and Germany and 800,000 in Italy. After 1601, however, England caught up to the continental powerhouses, publishing ~700,000 books annually from 1601 to 1650 (Buringh and Van Zanden, “Rise” 418). Yet the genre of the memory treatise saw no corresponding rise in publication numbers. Indeed, looking at the total number of memory treatises published in Europe by half-century, the lack of London contributions between 1601 and 1650 appears even more remarkable, for those decades correspond to artificial memory’s most popular period, judging by raw publication counts.

So what could account for the sudden lack of interest in the art of memory in England at the turn of the sixteenth and seventeenth centuries?

One reason, I argue, is that the entire point of the art of memory is to furnish one’s mind with elaborate and emotionally charged mental imagery. The art operates upon the notion that remediating knowledge into the semiotics of sight and dimension makes it easier to remember and to access. However, this was not a precept destined to be popular in an England that had come to be dominated by iconoclastic Anglo Protestants, because imagery, in Protestant England, was synonymous with Catholic religious practice, idolatry, and all manner of irrational superstition—holy relics, indulgences, the veneration of saints. Perhaps the crucial event for understanding this rejection of imagery is Henry VIII’s dissolution of the monasteries, an episode that saw a literal smashing of the idols—“an orgy of iconoclasm,” in Leanda de Lisle’s words (Tudor 244)—with books burned, altar-pieces ripped out, stained glass windows shattered, and art destroyed. The destruction wrought during this period was driven as much by opportunistic plundering and political intimidation as by abstract theological conflict, but the result was the same. Writing not long after those events, the antiquarian John Stow observed that the Protestants “judged every image to be an idol” (qtd. in de Lisle 244), a campaign that resulted in the loss of some ninety percent of the medieval art housed in England (de Lisle, “A sad reminder”). In such an environment, precepts for affective imagery are dead on arrival. As one Protestant minister wrote in condemnation of the art of memory: “A thing faigned in the mind by imagination is an idol.”

In The Art of Memory, Yates analyzes a few early modern texts suggesting that many well-educated English Protestants had a distaste for the art of memory on iconoclastic grounds. I am glad to report that her argument remains intact (and is in fact strengthened) when one takes a more “distant” bibliographic view.

If Anglo iconoclasm did indeed expedite the abandonment of the art of memory in England, then this history provides a good example of the influence of changing social environments on rhetorical practice. All rhetorical precepts are “creatures of historical circumstance,” Nan Johnson has written. Studying the historical transformation of those precepts is as valuable to understanding cultural evolution as is studying the transformation of literary form.

Memory treatises, 1430–1800 (.xlsx)

Some Quick Text Mining of the 2015 CCCC Program

During CCCC last week, Freddie deBoer made a couple of comments about the conference: first, that there weren’t as many panels on the actual work of teaching writing as there were on sexier topics, like [insert stereotypical humanities topic here]; and second, that not much empirical research was being presented at the conference.

Testing these claims isn’t easy, but as a first stab, here are the most frequent unigrams and bigrams in the conference’s full list of presentation titles, as found in the official program. Make of these lists what you will. It’s pretty obvious to me that the conference wasn’t bursting at the seams with quantitative data. Sure, research appears at the head of the distribution, but I’ll leave it to you to concordance the word and figure out how often it denotes empirical research into writers while writing.

Then again, big data was a relatively popular term this year. It was used in titles more often than case studies, though case studies was used more often than digital humanities.

To Freddie’s point, the word empirical only appears 11 times in the CCCC program; the word essay appears only 16 times. Is it therefore fair to say there weren’t many empirical studies on essay writing presented this year? Maybe. Maybe not.

[Figures: most frequent unigrams and bigrams in CCCC presentation titles]
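
For anyone who wants to reproduce or extend these lists, here’s a rough sketch using NLTK, assuming the program’s titles have been dumped into a plain-text file (the filename is mine, not part of the original data):

```python
from collections import Counter
import nltk

# one-time setup: nltk.download("punkt"); nltk.download("stopwords")
with open("cccc_titles.txt") as f:
    words = [w.lower() for w in nltk.word_tokenize(f.read()) if w.isalpha()]

# drop function words so the counts surface content terms
stopwords = set(nltk.corpus.stopwords.words("english"))
content = [w for w in words if w not in stopwords]

print(Counter(content).most_common(25))                # top unigrams
print(Counter(nltk.bigrams(content)).most_common(25))  # top bigrams
```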

One way to get a flavor for the contexts and connotations of individual words and bigrams is of course to create a text network. I’ve begun to think of text networks as visual concordances.

Here is a text network of the tokens writing, write, writer, writers, writing_courses, classroom, and classrooms in the CCCC program. One thing to notice here is that these words are all semantically related to one another, but in the panel and presentation titles, they exist in clusters of relatively unrelated words. I had expected to discover a messy, overlapping network with these terms, but they’re rather distinct, as judged by the company they keep in the CCCC program. Even the singular and plural forms of the same noun (e.g., classroom and classrooms, writer and writers) form distinct clusters.
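
Here’s a rough sketch of one way to build such a co-occurrence network with networkx—not the tool that produced the images below, and the filename is hypothetical—treating any two words that share a title as connected:

```python
import itertools
import networkx as nx

targets = {"writing", "write", "writer", "writers", "classroom", "classrooms"}

G = nx.Graph()
with open("cccc_titles.txt") as f:
    for title in f:
        words = {w.strip('.,:;?!"').lower() for w in title.split()}
        # connect each target token to the words it co-occurs with
        for a, b in itertools.combinations(sorted(words), 2):
            if a in targets or b in targets:
                weight = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
                G.add_edge(a, b, weight=weight)

# a token's "cluster" is its most heavily weighted neighborhood
for t in sorted(targets & set(G.nodes)):
    neighbors = sorted(G[t], key=lambda n: -G[t][n]["weight"])
    print(t, neighbors[:10])
```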

[Figure: text network of selected tokens in the CCCC program]

In relation to Freddie’s point, this network demonstrates that words or bigrams that are prima facie good proxies for “teaching writing” often do lead us to presentations that are pedagogical in nature. However, just as often, they lead us to presentations that are only tangentially or not at all related to the teaching of writing and to the empirical study of writers while writing.

Thus, writer forms a cluster with FYC, student, and reader but also with identity, ownership, and virtual. The same thing occurs with the other terms, though writing by far occurs alongside the most diverse range of lexical items.

[Figures: clusters for writer, writers, classroom, and writing]

This is about as much work as I’m interested in doing on the CCCC program for now. In my last post, I put a download link for a .doc version of the program, for anyone interested in doing a more thorough analysis, whether to test Freddie’s claims or to test your own ideas about the field’s zeitgeist.

However, it’s always important to keep in mind that a conference program might tell us more about the influence of conference themes than about the field itself.

ADDED: Here is a list of all names listed at the end of the CCCC program (CCCCProgramNames). The problem is that it’s a list of FIRST and LAST names, with each given its own entry. If someone is so inclined, they can go through this list and delete the last names, which will leave a file that can be run through a gender-recognition algorithm, to see what the gender split of CCCC presenters was.
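
If anyone does clean the list, a sketch like this could do the counting. It assumes the cleaned file holds one first name per line; gender-guesser is one off-the-shelf package for the classification step, and the filename is mine:

```python
from collections import Counter
import gender_guesser.detector as gender

detector = gender.Detector()

with open("cccc_first_names.txt") as f:
    first_names = [line.strip() for line in f if line.strip()]

# buckets: male, female, mostly_male, mostly_female, andy (ambiguous), unknown
print(Counter(detector.get_gender(name) for name in first_names))
```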

University representation at CCCC

Here’s a list of the universities and colleges best represented at the 2015 CCCC conference. I used NLTK to locate named entities in the CCCC program, so the graph simply represents a raw count of the times each university’s name appears in the program. Some counts might be inflated, but in general, each appearance of a school’s name corresponds to a panel with a representative from that school.
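
In code, the approach looks roughly like this—a sketch rather than the exact script I ran; the filename and the school-name filter are simplifications:

```python
from collections import Counter
import nltk

# one-time setup: download punkt, averaged_perceptron_tagger,
# maxent_ne_chunker, and words via nltk.download()

with open("cccc_program.txt") as f:
    text = f.read()

orgs = Counter()
for sent in nltk.sent_tokenize(text):
    tree = nltk.ne_chunk(nltk.pos_tag(nltk.word_tokenize(sent)))
    for subtree in tree.subtrees(lambda t: t.label() == "ORGANIZATION"):
        orgs[" ".join(word for word, tag in subtree.leaves())] += 1

# crude filter: keep entities that look like schools, named 10+ times
schools = {name: n for name, n in orgs.items()
           if n >= 10 and any(k in name for k in ("University", "College", "State"))}
for name, n in sorted(schools.items(), key=lambda kv: -kv[1]):
    print(n, name)
```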

The graph shows only those schools that were named at least 10 times in the program (i.e., the schools that had at least 10 individual panels). Even in this truncated list, Michigan State dominates. Explanations for this gross inequality in representation are welcome in the comments.

[Figure: universities and colleges named at least 10 times in the CCCC program]

Program (in .docx form because WordPress doesn’t allow .txt files)

All Your Data Are Belong To Us

In the blink of an eye, sci-fi dystopia becomes reality becomes the reality we take for granted becomes the legally enshrined status quo:

“One of our top priorities in Congress must be to promote the sharing of cyber threat data among the private sector and the federal government to defend against cyberattacks and encourage better coordination,” said Carper, ranking member of the Senate Homeland Security and Governmental Affairs Committee.

Of course, the pols are promising that data analyzed by the state will remain nameless:

The measure — known as the Cyber Threat Intelligence Sharing Act — would give companies legal liability protections when sharing cyber threat data with the DHS’s cyber info hub, known as the National Cybersecurity and Communications Integration Center (NCCIC). Companies would have to make “reasonable efforts” to remove personally identifiable information before sharing any data.

The bill also lays out a rubric for how the NCCIC can share that data with other federal agencies, requiring it to minimize identifying information and limiting government uses for the data. Transparency reports and a five-year sunset clause would attempt to ensure the program maintains its civil liberties protections and effectiveness.

Obama seems to suggest that third-party “cyber-info hubs”—some strange vivisection of private and public power—will be in charge of de-personalizing data in between Facebook and the NSA or DHS:

These industry organizations, known as Information Sharing and Analysis Organizations (ISAOs), don’t yet exist, and the White House’s legislative proposal was short on details. It left some wondering what exactly the administration was suggesting.

In the executive order coming Friday, the White House will clarify that it envisions ISAOs as membership organizations or single companies “that share information across a region or in response to a specific emerging cyber threat,” the administration said.

Already existing industry-specific cyber info hubs can qualify as ISAOs, but will be encouraged to adopt a set of voluntary security and privacy protocols that would apply to all such information-sharing centers. The executive order will direct DHS to create those protocols for all ISAOs.

These protocols will let companies “look at [an ISAO] and make judgments about whether those are good organizations and will be beneficial to them and also protect their information properly,” Daniel said.

In theory, separating powers or multiplying agencies accords with the vision of the men who wrote the Federalist Papers, the idea being to make power so diffuse that no individual, branch, or agency can do much harm on its own. However, as Yogi Berra said, “In theory there is no difference between theory and practice, but in practice there is.” Mark Zuckerberg and a few other CEOs know the difference, too. They decided not to attend Obama’s “cyber defense” summit in Silicon Valley last week.

The attacks on Target, Sony, and Home Depot (the attacks invoked by the state to prove the need for more state oversight) are criminal matters, to be sure, and since private companies can’t arrest people, the state will need to get involved somehow. But theft in the private sector is not a new thing. When a Target store is robbed, someone calls the police. No one suggests that every Target in the nation should have its own dedicated police officer monitoring the store 24/7. So why does the state need a massive data sharing program with the private sector? It’s the digital equivalent of putting police officers in every aisle of every Target store in the nation—which is likely the whole point.

Target, of course, does monitor every aisle in each of its stores 24/7. But this is a private, internal decision, and the information captured by closed-circuit cameras is shared with the state only after a crime has been committed. There is no room of men watching these tapes, no IT army paid to track shoppers’ movements on a massive scale, to determine who is a possible threat, to mark and file away even the smallest infraction on the chance that it is needed to make a case against someone at a later date.

What Obama and the DHS are suggesting is that the state should do exactly that: to enter every private digital space and erect its own closed-circuit cameras, so that men in suits can monitor movement in these spaces whether a crime has been committed or not. (State agencies are already doing it, of course, but now the Obama Administration is attempting to increase the state’s reach and to enshrine the practice in law.)

“As long as you aren’t doing anything wrong, what do you care?”

In the short term, that’s a practical answer. In the future, however, a state-run system of closed-circuit cameras watching digital space 24/7 may not always be used for justified criminal prosecution.

The next great technological revolution, in my view, will be the creation of an entirely new internet protocol suite that enables some semblance of truly “invisible” networking, or perhaps the widespread adoption of personal cloud computing. The idea will be to exit the glare of the watchers.

Hindi 101

I’m taking Hindi 101 this semester. The Devanagari script feels mildly ornate in my hand compared to the angularity of alphabets descended from the Phoenician script (including the English alphabet), but it is quite lovely and not as challenging as I had imagined. It is still an alphabet, after all, with a much closer sound-grapheme correspondence than one finds in English, where each letter—particularly the vowels—can correspond to multiple phonemes. (English grammar is absurdly simple compared to most other major languages, but our spelling system must be a nightmare for foreign learners. There’s something to be said for language academies that control the drift between pronunciation and spelling.) Devanagari does, however, omit some vowel sounds and uses secondary or “dependent” vowel forms in most contexts, so it has something of the syllabary about it. In fact, the biggest mistake I make in class is to confuse two dependent vowels, ी and ो. The former is long “ee”, the latter is “o”, but in certain fonts (including my own handwriting), they look nearly identical.
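
However similar the two signs look on the page, they are distinct codepoints, as a couple of lines of Python will confirm:

```python
import unicodedata

# the two dependent vowel signs I keep confusing
for sign in "ी", "ो":
    print(f"U+{ord(sign):04X}", unicodedata.name(sign))

# U+0940 DEVANAGARI VOWEL SIGN II
# U+094B DEVANAGARI VOWEL SIGN O
```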

The script’s biconsonantal conjuncts are mostly intuitive, though a few bizarre ones need to be memorized as separate graphemes. We have conjuncts in English, too, but I believe they are a relatively recent innovation with limited usage. One example is the city logo of Huntington Beach, California. Hindi, by contrast, uses conjuncts constantly.

[Image: the Huntington Beach city logo, an English biconsonantal conjunct]

Apart from learning a new script, the most enjoyable part of Hindi class has been coming across Romance or Germanic cognates. At an intellectual level, I know and have long known that Hindi and English, both Indo-European languages, share a genetic ancestry, which means that at some point in the distant past all Indo-European speakers spoke the same language. It’s easy to get a handle on the concept when talking about Romance languages: Spanish, Italian, and French all used to be Latin. There, we have a well-documented history, stretching back through the Renaissance and middle ages to the familiar world of Rome. However, when it comes to Proto-Indo-European, we are faced with a deeper and wider canyon of time and an ancient world that is mostly unknown to us. The PIE speakers were probably living in the Pontic-Caspian steppe lands, but some evidence suggests that they may have been living in the greater Anatolian region; perhaps the most direct descendants of the Proto-Indo-Europeans are today’s Armenians, Turks, and Persians. They apparently kicked ass and took names, because Indo-European now stretches from the Pacific to the Indian Ocean.

But whoever they were, the PIE speakers are remote in a way that the Romans or Germanic tribes are not. Yet while doing my Hindi homework, every now and again I come across a word that clearly indicates the linguistic (and genetic) connectedness between the Romans, the Germans, and the Hindi speakers. Some arrived relatively recently, through Portuguese or Persian contact: kamiz for shirt; mez for table; kamra for room; darvaza for door. Others are inherited cognates: mata for mother; pita for father; nam for name . . . In Hindi class, when I say one of those older words out loud, I am intoning sounds close to the ones that came from the lips of those ancient Indo-Europeans before they split eastward and westward to conquer Eurasia. To language nerds like me, it’s a chilling sensation.

Distorting time to deny inevitability

The latest issue of Rhetoric Society Quarterly has its authors engaging with “untimely historiography,” which, as near as I can tell, is an attempt to complicate the notion of time as a one-way river of cause and effect. Most of the essays (I’ve read two and skimmed the others) seem to share a common distrust of grand narratives and a distaste for histories that look beyond the contingency of particular events. Cause and effect, linear time—these are human constructs that make sense of (and distort) an otherwise irreducibly complex mess of events.

The chronological anxiety in these essays is of the sort recently addressed by Ted Underwood in Why Literary Periods Mattered. There is of course good reason to be skeptical about grand narratives and historical theories, so I’m sympathetic to much of what is said in these new essays, and I find value in taking a critical look at constructions of linearity in history. However, as genetics blogger Razib Khan notes, acknowledging the dangers of over-generalization presents us with “problems to be grappled with, not a ‘get out of jail’ card to be thrown at any attempts to construct a formal system of interpretation.” Khan’s post is aptly entitled “Human History is Both Contingent and Inevitable,” and I think this both/and worldview is intellectually useful. It makes room for the radical contingency argued for by Michelle Ballif and others without foreclosing on legitimate linear interpretations of history. Thinking about history as both contingent and inevitable leads us to ask where it’s one or the other, to disentangle where it’s more one than the other.

Not everyone would agree with my sentiment, to put it mildly. As an example, I’ll quote from Hans Kellner’s essay “Is History Ever Timely?”*, in which he recounts a talk given by Hayden White:

In 1967, Hayden White . . . journeyed to Colorado to deliver a talk at a conference on biology. At this conference he spoke on the topic “What is a Historical System?” in which he contrasted a historical system with a biological system. In effect, he said that biological—that is, genetic—systems are timely. By this he meant that one’s biological state had been determined in the past by genetic ancestral code. Today we would speak of DNA. But is this true of historical, cultural ancestry? Are we historically determined in the matter of who we are? Is our historical identity as fixed by the timeliness of time and genetic logic as our biological identity is? At that conference, White said, “no.”

A resounding answer, one that, I believe, many scholars in the humanities would echo. It also rejects my olive branch to both sides of the question. It implicitly denies the possibility that culture and history might exhibit large-scale patterns or processes due to the influence of biology, geography, demographics, economics, and so on.

Kellner continues with an example that White used to prove his point: the Christianization of Europe as a culturally created event that needn’t have occurred:

Cultural communities are constituted on the basis of a shared agreement about the choice of historical ancestors. There are times, however, when people lose faith in their chosen identities . . . The example White cited at the time was the crisis of the seventh and eighth centuries in Northern Europe, when a Romanized world saw that the source of their identity had been changed beyond recognition, and a new candidate for that identity had emerged in the teachings of Christian missionaries. As White put it, when the Germanic peoples of northern Europe decided that they were no longer the cultural descendants of ancient Romans or of pagan barbarians, and that their cultural ancestors were Palestinian Jews with whom they had no biological connection at all, a new culture was formed. Backwards. This did not need to happen. Just as the pin on which one sat might have never been noticed if the pain had not caused it to exist for us, so the “Christianization” might have never happened . . .

But is it true that Northern Europe switched identities and cultures as effortlessly as Kellner’s gloss implies? It seems to me a highly contested statement. The Holy Roman Empire was a hegemon among Europe’s warring monarchs and tribes for a time, and, as White describes, the Church Fathers went to great lengths to adopt for themselves and for Europe a foreign Jewish culture and history, but to suggest that the Scots, the Anglos, the Franks, and the Iberians stopped being Scots, Anglos, Franks, and Iberians just because they became Christian is a gross overstatement belied by the constant warfare and power-plays that constitute European history (you’d think White and Kellner would be more careful about hasty generalizations!). It’s like saying the Persians stopped being Persian when they were conquered by the Muslims. Culture runs deep, precisely, I think, because it is tied to and influenced by processes much more intransigent than individual human whim. I don’t believe culture is a costume ready to be changed in a generation or two, and any attempts to do so often result in backlashes or corrections. One might even argue that during the middle ages Europe was just waiting for its monarchs to re-assert their power over Rome so they could all go back to fighting one another again. And indeed they did.

Now, I’m sympathetic to the political sensibility from which I think all this emerges—the idea that if history is not inevitable then the future is, to some extent, in our hands, ready to be constructed in a more just and moral way. On the other hand, if the movement of history is inevitable, then humans can have no agency over their (often unjust) cultures and behaviors, no more agency than they have over their genetics. Such is the “Cormac McCarthy” view of the world, McCarthy having famously said that wishing the species could be “improved in some way . . . will make your life vacuous.” It is an antipathy to this view that brings out the poststructuralist and postmodern tendencies in these RSQ essays, whose authors deny inevitability to history by denying the linear shape of time altogether. Get rid of linear time and any notion of inevitability disappears with it.

I grew up watching wildlife documentaries, so I was inured from a young age to the McCarthy view. It probably didn’t help that I read Blood Meridian in tenth grade. Nevertheless, I try not to err in extremes, so although my default position on culture is determinism of all types—genetic, geographic, demographic, historical—I enjoy challenging and often replacing my default assumptions. I think those who err on the other side—no determinism of any type, history is always contingent—should likewise challenge their default assumption. Hopefully we can meet in the middle.

Hayden White asked: Are we historically determined in the matter of who we are? Is our historical identity as fixed by the timeliness of time and genetic logic as our biological identity is? He answered no, but I think we should answer, Sometimes yes and sometimes no. It depends on what you’re talking about. The intellectual challenge is to figure out what is (or was) contingent and what is (or was) inevitable. Does history exhibit patterns and cycles? What are the large-scale processes which stand outside of but influence cultural expressions? Do certain cultural expressions change according to broadly identifiable patterns, while others exhibit no patterned changes whatsoever? How do irreducibly contingent moments interact with larger historical processes? Interesting questions, in my opinion, ones that the cliodynamicists are trying to answer mathematically. Will they be successful? Maybe, maybe not. But before the fact, I don’t think we should, to quote Khan again, “throw our hands up in the air and assume that all of history is a contingent darkness from which we can’t infer general patterns.”


*Kellner’s essay is a sensible discussion of the ways that texts, films, and images create connections across great gaps of time to re-figure the past in terms of the present. It’s an excellent piece, and I’m simply using these carefully extracted quotes as a foil.