Lecture: Reflecting on the Course

As we wrap up the semester, I thought it might be useful to reflect on where we started by returning to the course introduction, found at the beginning of the syllabus. If you choose to take advantage of the Participation Post or Reading Response make-up work, or of the extra credit opportunity (see the addendum to the Week 14 announcement), think of this introduction as a prompt or frame through which to reflect upon the semester.

Course Introduction

“Any shift in the traffic of information can create not only new thoughts, but new ways of thinking.” – Paul Miller, aka DJ Spooky that Subliminal Kid, Rhythm Science

“It is impossible to understand social and cultural changes without a knowledge of the workings of media.” – Marshall McLuhan, The Medium Is the Massage

“It is the first step in sociological wisdom, to recognize that the major advances in civilization are processes which all but wreck the societies in which they occur […]. The art of free society consists first in the maintenance of the symbolic code; and secondly in fearlessness of revision, to secure that the code serves those purposes which satisfy an enlightened reason. Those societies which cannot combine reverence to their symbols with freedom of revision, must ultimately decay either from anarchy, or from the slow atrophy of a life stifled by useless shadows.” – A.N. Whitehead, Symbolism: Its Meaning and Effect

“As the era of print is passing, it is possible once again to see print in a comparative context with other textual media, including the scroll, the manuscript codex, the early printed codex, the variations of book forms produced by changes from letterpress to offset to digital publishing, and born-digital forms such as electronic literature and computer games.” – N. Katherine Hayles and Jessica Pressman, Comparative Textual Media: Transforming the Humanities in the Postprint Era

“It is the business of the future to be dangerous.” – A.N. Whitehead, Science and the Modern World

In the months of May and June, readers of the New Republic were treated to articles about the end of English departments, soon to be killed off by technology in the guise of the digital humanities. In his article, New Republic senior editor Adam Kirsch decries the doom he believes technology is wreaking. Less alarmist, James Pulizzi also sees the end of the traditional literature department as all but inevitable, not because it must die but because it must shift and adapt to the new digital environment.

It is true, as Pulizzi suggests, that literature departments, especially English departments, are changing, even need to change. But that’s nothing new. English departments have always been changing. We might point to the early 1800s, when at schools like Harvard one of the most prestigious professorships was the Boylston Professor of Rhetoric and Oratory, or to the late 1800s, when American English departments did not teach American literature – the first American professor of American literature had to jump ship from his literature department for a department of history. Or we might point to the 1940s and the rise of the New Criticism, or to the 1960s as the start of a series of waves of post-structuralist and post-modernist theories and perspectives including but not limited to feminism, gender studies, new historicism, postcolonialism, multiculturalism and ethnic studies, ecocriticism, trauma theory, memory studies, new materialism, object-oriented criticism, and speculative realism. Or we might look again to the 1960s and the revival of classical rhetoric and the beginnings of contemporary composition studies, followed later by the growth of professional writing and technical communication.

Kirsch, however, is right in sensing that something is different. This is not just a change in the practice of theory or the object of study, but a change in the very way we are structuring our culture. We are no longer a culture of print. We are, instead, a transitional culture moving from the print age to the digital age. In arguing against the study and use of digital technology, Kirsch asks, “Was it necessary in the past 500 years for a humanist to know how to set type and publish a book?”

Kirsch believes that the answer is no, and therein lies the problem with his attempt to defend the humanities from technology. Renaissance Humanism was born within the newly established printing houses of Europe. The first Humanists did not just learn how to set type and publish books, they embraced the printing press; got their hands on as many hand-written manuscripts of Greek and Roman literature, philosophy, mathematics, and science as they could; set them to type; and published, published, published.

Kirsch is unaware of these facts because he is trapped within a catch-22. To be aware of how the printing press gave rise to Renaissance Humanism, Kirsch would have had to have studied the history of media technologies, something which he seems loath to do because he believes it to be antithetical to humanistic concerns.

As Hayles and Pressman argue, that we are transitioning from a print to a digital culture allows us to more readily recognize that print and its modes of thought, patterns of behavior, and organizational structures were a temporary condition fostered and encouraged by a technology around which we shaped our culture. That era, the Age of Print, is ending, just as the manuscript culture of medieval scholasticism ended with the rise of print.

And that is what this course is about: In recognizing, as DJ Spooky reminds us, that shifts in the traffic of communication will alter modes of thought; in seeking to understand the workings of electronic and digital media, as Marshall McLuhan suggests we need to do, so that we might understand the social and cultural changes around us; in revising the ways we practice English studies even as we maintain our symbolic codes so that we might, as A.N. Whitehead argues, stave off cultural stagnation.

If “the business of the future is to be dangerous,” then the answer is not to hide from it but, as McLuhan suggests, “to contemplate what is happening.” Or, as Michel de Montaigne, the Renaissance writer and inventor of the essay – a genre thoroughly entwined with the rise and logics of print – once wrote, “The thing of it is, we must live with the living.” That is what this course is about: To understand how English studies might live within a digital world.

Lecture: On Hayles’ “The Future of Literature”

Books will not disappear, but neither will they escape the effects of the digital technologies that interpenetrate them. More than a mode of material production (although it is that), digitality has become the textual condition of twenty-first-century literature. — N. Katherine Hayles, Electronic Literature, 186

Having presented the general argument of Electronic Literature: New Horizons for the Literary in chapter 4, N. Katherine Hayles concludes her book on electronic literature by examining three print novels: Jonathan Safran Foer’s Extremely Loud and Incredibly Close, Salvador Plascencia’s The People of Paper, and Mark Danielewski’s House of Leaves. While this might seem at first glance an odd way to finish off a book on electronic literature, it’s entirely consistent with the argument Hayles has made throughout the book. If, as she claims, intermediation interpenetrates computer code and human language, digital and analogue processing, and print and electronic media; and if this occurs through a process of coevolution through recursive feedback loops in which the interaction of A and B means that A and B mutually shape the development of each other, and the change to one invokes new changes to the other, then we should find that not only is electronic literature influenced and shaped by print, but that print literature should be influenced and shaped by digitality. And that is why Hayles ends her book with an examination of three recent print novels: to demonstrate how the digital is exerting its influence upon print.

Hayles begins chapter five, “The Future of Literature: Print Novels and the Mark of the Digital,” by claiming that electronic literature “will be a significant component of the twenty-first century canon,” and she reminds us that this shouldn’t be too surprising when we remember that today nearly all print literature exists as a digital file at some stage in its production (159). However, this is not all Hayles means when she claims that electronic literature will have its impact on twenty-first-century literature. Instead, she argues, we find print textuality imitating electronic textuality while also intensifying features of print, “in effect, declaring allegiance to print” (162).

Having made this claim, Hayles then defines a number of characteristics of electronic textuality “around which the dance between imitation and intensification takes place” (163). They are:

  • Computer-mediated text is layered. That is, there’s the text we read/perform/experience and the code that renders that text (163-64).
  • Computer-mediated text tends to be multimodal. Whereas print books can display only verbal text and images, computer-mediated text can also include sound and video alongside verbal text and images (164).
  • In computer-mediated text, storage is separate from performance. In print, the text is both stored and performed on the same page, whereas a digital text can be stored anywhere a local computer can access it (whether that’s the local hard drive or a web server on the other side of the world), and we can’t access computer code while it performs (164).
  • Computer-mediated text manifests fractured temporality. Electronic text can be programmed to display at its own rate regardless of how fast or how slow we read. Additionally, it can be programmed so that we have to stop reading and perform other tasks such as clicking a mouse or entering text in order to keep reading (164-65).

Having defined these major characteristics of digital textuality, Hayles devotes the rest of her chapter to discussing how these characteristics play out in that “dance between imitation and intensification” in the three novels mentioned above.

Lecture: On Hayles’ “Revealing and Transforming”

Through such intermediations, computation evolves into something more than a technical practice, though of course it is also that. It becomes a powerful way to reveal to us the implications of our contemporary situation, creating revelations that work both within and beneath conscious thought. Joining technical practice with artistic creation, computation is revalued into a performance that addresses us with the full complexity our human natures require, including the rationality of the conscious mind, the embodied response that joins cognition and emotion, and the technological nonconscious that operates through sedimented routines of habitual actions, gestures, and postures. Thus understood, computation ceases to be a technical practice best left to software engineers and computer scientists and instead becomes a partner in the coevolving dynamics through which artists and programmers, users and players, continue to explore and experience the intermediating dynamics that let us understand who we have been, who we are, and who we might become. — N. Katherine Hayles, Electronic Literature, 157

While it is commonplace to suggest that literature is an exploration of the human condition and, therefore, gives us insight into who we are and who we have been, N. Katherine Hayles ends the fourth chapter of Electronic Literature: New Horizons for the Literary with a much bolder claim. Electronic literature, she argues, has the potential to give us insight into what it means for us to coexist – coevolve – with computers, and these insights might give us a glimpse of what we and our computers might be coevolving into. Explaining how electronic literature allows us to explore and examine this coevolution of human and computational machine is the focus of this chapter, “Revealing and Transforming: How Electronic Literature Revalues Computational Practice.”

Hayles begins this chapter by giving us a summary of chapters 2 and 3, which looked at the concepts of intermediation and at the dynamic coevolution of humans and human technologies. Intermediation, she reminds us, is the dynamic, recursive interactions amongst elements within a system. In the case of electronic literature, these intermediations are between computer code and human language, digital and analogue processing, and print and electronic media. As for coevolution, she reminds us of how the development of human language coincided with the creation of complex, multipart tools, and that while humans are not independent of the tools they create and use (the argument represented by Mark B. N. Hansen), they are neither entirely dependent upon nor dominated by those tools (the argument represented by Friedrich A. Kittler). Hayles’ middle ground – her third option, argued in chapter 3 – is something akin to Marshall McLuhan’s own dictum that first we shape our tools and then our tools shape us.

Having reminded us of the previous chapters upon which her argument is based, Hayles offers what she calls the “general framework” for her argument. She suggests that electronic literature builds upon (“extends”) the functions of print literature in “creating recursive feedback loops between explicit articulation, conscious thought, and embodied sensorimotor knowledge” (135). Explicit articulation is our ability to accurately describe a concept or a process through the use of technical vocabulary (a phrase she uses on page 131). Conscious thought is our ability to recognize, understand, and make sense of experience. And embodied sensorimotor knowledge includes both habit and muscle memory and our ability to intuit function from form. For instance, a computer mouse is designed in such a way that if we know it is a device to control a computer, we can guess how to place our hand around the mouse and begin to use it.

What she means by all this might be more clear if we think about the act of typing at a keyboard, which Hayles herself uses as an example a few pages earlier. We can, she notes, read all about how to type in a book, how to place our hands over the keyboard, which fingers we should use to strike which keys, how to memorize the layout, etc.; however, she explains, even with all this explicit articulation (the ability to describe the process of typing using a technical vocabulary) and conscious understanding of how typing works, almost no one can simply read about typing, memorize a keyboard layout, and begin typing with ease (133). Likewise, she explains, experienced typists who can touch type with ease are often at a loss if they are asked to draw a keyboard from memory. A touch typist’s knowledge of the keyboard’s layout exists not in explicit articulation or conscious thought but in embodied sensorimotor knowledge (133).

Having suggested that electronic literature extends the functions of print literature by creating recursive feedback loops between explicit articulation, conscious thought, and embodied sensorimotor knowledge, Hayles then argues that electronic literature differs from print literature in that electronic literature not only does all this but it also “performs the additional function of entwining human ways of knowing with machine cognitions” (135). This is important, she argues, because intermediation, by “re-presenting material in different mediums,” changes the sense ratios by which we perceive material, and these changes in the modes of sensory input mean changes in the “kinds of knowledge represented” (135). The way in which the recursive feedback loops work is that when we interact with a tool, we are actually transforming that tool, and the transformation of that tool means that we have to change how we interact with that tool, which further changes that tool, which in turn further changes us, and so on (135-36).

If all this sounds a bit McLuhanesque – that is, if it reminds you of The Medium Is the Massage – it is, and it should. This is not to say that Hayles and McLuhan are in complete agreement, nor that they come to their conclusions through the same arguments, but nonetheless, they are in general agreement: essentially, both argue that we humans live in a dynamic coevolutionary relationship with our tools. We develop a tool which in turn changes the way we act and think, and that change in the way we think and act leads us to change the tool, which in turn means that we again change the way we think and act, and so on. Moreover, the ways through which we perceive and engage media shape the way we perceive and interact with the world.

Having presented this general framework for her argument, Hayles then offers two propositions which she explores in detail in her discussions of the specific works of electronic literature later in the chapter. The two propositions are:

  1. “verbal narratives are simultaneously conveyed and disrupted by code” (136)
  2. “distributed cognition implies distributed agency” (136)

The first proposition is based on the idea that as we become more and more entwined with computers and computational media, our daily lives are permeated by interactions that increasingly rely upon computer code. This code, however, sometimes breaks down. We might think back to Kenneth Goldsmith’s “Revenge of the Text” and his discussion of the occasional ruptures of source code spilling over into what should be seamless interactions with computers. While we rely upon computer code to go about our daily activities – our alarm clocks; our cars; our communications with friends, colleagues, and family; our shopping for food; our school work; etc. – we also experience moments when code, through a breakdown of some sort, ruptures into and disrupts our daily life.

The second proposition is based on the idea that we interact with complex technological systems through multiple means. Again, she uses the computer mouse as an example:

The mouse by my computer, for example, is shaped so that the tapered end fits neatly in my palm, and the buttons are positioned at just the right distance so that my fingers naturally rest on them. For habitual users, such devices rapidly become integrated by way of proprioceptive, haptic [touch], and kinesthetic feedback loops into the mindbody, so that agency seems to flow out of the hand and into the virtual arena in which intelligent machines and humans cooperate to accomplish tasks and achieve goals. At the same time, however, unpredictable breaks occur that disrupt the smooth functioning of thought, action, and result, making us abruptly aware that our agency is increasingly enmeshed within complex networks extending beyond our ken and operating through codes that are, for the most part, invisible and inaccessible. (137)

By distributing our cognition – the ways we know and know how to do (explicit articulation, conscious thought, sensorimotor knowledge) – and embedding some of that cognition into the design of our tools, we distribute our agency into multiple arenas and even place some of it within our tools. In this way, distributing our cognition also distributes our agency.

Ultimately, Hayles argues that electronic literature allows us not only to experience all of this through reading and performing it but also, as literature based within computational media itself, to explore and examine this interplay between the human and our computational machines. Having presented her argument, she turns to William Poundstone’s Project for Tachistoscope, Millie Niss and Martha Deed’s Sundays in the Park, and John Cayley’s Translation to illustrate her point.

Announcement: Reading/Community Response Posts and Participation Posts

[Dec. 8 addendum: Please see the end of this post for an addendum regarding Week 16.]

As we begin to wind up the semester, a few comments on Reading and Community Response posts and Participation posts:

Reading and Community Response Posts: As per the revised assignment guidelines, during Weeks 2-15, you are required to write one (1) weekly Reading Response post and one (1) weekly Community Response post for a total of 11 Reading Response posts and 10 Community Response posts. If you have blogged regularly as required, you will have now met this goal. (21 posts x 4 points = 84 points.)

With three weeks left in the course, there are six (6) additional blog posts you can make to help catch up if you are behind on your Reading and Community Response posts.

Participation Posts: As per the assignment guidelines, during Weeks 2-13 and Week 15 you are expected to make three (3) weekly posts for a total of 39 posts. (No participation posts are required during Week 14.) If you have been posting diligently, you will have 33 of the 39 required posts. (The last 6 posts are to be done during Weeks 13 and 15.)

While no participation posts are required during Week 14, you are more than welcome, even encouraged, to make up to three participation posts during that week to help make up for any participation posts you may have missed.

Extra Credit

As I’ve reduced the number of required Reading and Community Response posts to help allow some people to catch up, I want to offer a chance to earn some extra credit for those of you who have kept up with your blog posts. For each post beyond the required 21, you can earn .5 points toward your final grade. For those of you who have not missed a Reading or Community Response post, that’s a possible 3 points toward your final grade (6 posts x .5 points = 3 points).

Week 16 Addendum

Participation Posts: I’ve decided to open a Week 16 Participation Post forum in Blackboard for those of you who would like to make up missing discussion posts. You may post up to three times this week.

Reading Response Post: I’ve also decided to offer the opportunity to make a Reading Response post this week on your blogs. As per the guidelines above, you may use this post to make up a missing post or, if you’ve reached the required 21 blog posts, you may use this opportunity to earn extra credit as described above.

While participation posts and reading response posts need to be different, I encourage you to use these opportunities to reflect on the course, and offer this prompt as a suggestion:

For this bonus week of discussion, it seems appropriate to reflect upon what we’ve covered this semester while also looking to the future of English studies in light of digital technologies. A good way to start thinking about this might be to reread the Course Introduction I included on the course syllabus, which would have been one of the first things you read for this class. While you should read all eight paragraphs, pay particular attention to the final one:

“If ‘the business of the future is to be dangerous,’ then the answer is not to hide from it but, as McLuhan suggests, ‘to contemplate what is happening.’ Or, as Michel de Montaigne, the Renaissance writer and inventor of the essay – a genre thoroughly entwined with the rise and logics of print – once wrote, ‘The thing of it is, we must live with the living.’ That is what this course is about: To understand how English studies might live within a digital world.”

Now, here at the end of the semester, what is your understanding of how English studies might live within a digital world? What are your hopes, concerns, suggestions, and predictions?

Lecture: The Oulipo

While we’ve encountered the Ouvroir de littérature potentielle, or Oulipo, a number of times already, we take a closer look at the group this week. The Oulipo, whose name roughly translates as Workshop for Potential Literature, was founded in 1960 by novelist and poet Raymond Queneau and mathematician François Le Lionnais, inspired by their collaboration on Queneau’s Cent Mille Milliards de Poèmes (Hundred Thousand Billion Poems). If you haven’t done so already, I strongly recommend playing with the interactive version linked in the optional texts for this week.1

As mentioned in the introductory lecture for this week, as a group dedicated to potential literature, the goal of the Oulipo “is to invent (or reinvent) restrictions of a formal nature (contraintes)2 and propose them to enthusiasts interested in composing literature” (Jacques Roubaud, “The Oulipo and Combinatory Art” (1991), 38).3 As Jean Lescure explains – quoting Oulipo co-founder François Le Lionnais in the “Brief History of the Oulipo” – the Oulipo sought “new forms and structures that may be used by writers any way they see fit” (176). The contraintes, as Lescure explains, stand in opposition to inspiration. Quoting Queneau, Lescure says that they exist to help writers “escape from that which is called inspiration” (176).

Although said somewhat tongue-in-cheek4, the point here is that as a group dedicated to “potential literature,” the Oulipians are more concerned with all the possible forms of contraintes that might be applied to the creation of literature than with the creation of literature using all the possible contraintes. While many, if not most, Oulipians do produce literature, not all do. As a mathematician, Oulipo co-founder Le Lionnais never produced a literary work but focused his energies on developing contraintes on his own or along with Oulipian writers. This is why we find the essays by Claude Berge, Paul Fournel, and Italo Calvino alongside the poems and short story by Queneau. These essays are as Oulipian as the Oulipian literary works we’ve read this week and in Week 6. It may be worth recalling as well that the N+7 procedure is an Oulipian contrainte created by Jean Lescure.
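Since the N+7 procedure is, at bottom, an algorithm, it can be sketched in a few lines of code. The function name and the ten-word lexicon below are my own invented stand-ins for a real dictionary of nouns, not anything the Oulipo published:

```python
def n_plus_7(text, lexicon):
    """Apply the N+7 contrainte: replace every word found in the
    alphabetized lexicon with the word seven entries later, wrapping
    around at the end of the list."""
    lexicon = sorted(lexicon)
    index = {word: i for i, word in enumerate(lexicon)}
    replaced = []
    for word in text.split():
        if word in index:
            replaced.append(lexicon[(index[word] + 7) % len(lexicon)])
        else:
            replaced.append(word)
    return " ".join(replaced)

# A toy ten-word "dictionary"; Lescure's procedure assumes a real one
# restricted to nouns.
words = ["apple", "boat", "cloud", "dog", "ear",
         "fig", "goat", "hat", "ink", "jar"]

print(n_plus_7("the dog sat on the boat", words))
# "dog" (entry 3) wraps to entry (3 + 7) % 10 = 0, "apple";
# "boat" (entry 1) becomes entry 8, "ink"
```

With a genuine noun list in place of the toy lexicon, this is the substitution the contrainte calls for: the mechanical rule, not inspiration, chooses the words.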

As I suggested in the introduction to this week, we might think of Oulipian contraintes as algorithms (in the broadest sense of the term) used to create literature. While this might actually involve a computer, especially when it comes to writing combinatory and algorithmic works such as Queneau’s Cent Mille Milliards de Poèmes and “A Story as You Like It” and Fournel’s play “The Theater Tree: A Combinatory Play” (see Fournel’s “Computer and Writer” and Calvino’s “Prose and Anticombinatorics”), these are just a few of the multitude of contraintes described and used by Oulipians. Other examples include the lipogram (Georges Perec wrote the 300-page novel La Disparition entirely without the letter “e”), the irrational sonnet, Canada Dry, the snowball, and x mistakes y for z. For a detailed but not comprehensive list of Oulipian contraintes, see the Oulipo Compendium.5

One final note on the subject of the Oulipians and computers. While computers can help manage the creation of certain kinds of Oulipian texts, computers, as Fournel notes, also make reading certain kinds of texts possible. Using Queneau’s Cent Mille Milliards de Poèmes as the example, Fournel explains that when confronted with so many possible readings, a computer can winnow down all the possible options into a readable text. That is, whether we are trying to read selections from Queneau’s possible hundred thousand billion poems, trying to decide how to stage tonight’s performance of Fournel’s “The Theater Tree: A Combinatory Play,” or wanting to explore the versions of Queneau’s “A Story as You Like It,” we can use a computer to give us a solitary, readable text from all the possible versions available to us.
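Fournel’s point about winnowing can be made concrete with a short sketch. The placeholder lines below are invented stand-ins for Queneau’s actual verse, but the structure is the real one: ten sonnets of fourteen lines each, with any sonnet able to supply any line position.

```python
import random

# Ten "sonnets" of fourteen placeholder lines each, standing in for
# Queneau's real text.
sonnets = [[f"sonnet {s}, line {l}" for l in range(14)] for s in range(10)]

def draw_poem(sonnets, rng=random):
    """Winnow the combinatory text down to one readable poem by picking,
    for each of the fourteen line positions, that line from one of the
    ten source sonnets."""
    return [rng.choice(sonnets)[position] for position in range(14)]

poem = draw_poem(sonnets)  # one poem out of 10 ** 14 possibilities
print(len(poem))           # 14
print(10 ** 14)            # 100000000000000: a hundred thousand billion
```

Ten choices at each of fourteen positions yields 10^14 distinct poems, which is where the title’s hundred thousand billion comes from; the computer’s job is simply to hand the reader one of them at a time.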

  1. I use playing here rather than reading in a nod to last week’s reading of Hayles’ “Intermediation” chapter. With one hundred thousand billion possible poems, can we say that anyone really reads Cent Mille Milliards de Poèmes rather than plays with it?
  2. From the Oulipo Compendium: “The usual French word for the basic element in Oulipian practice has been variously translated in this volume as constraint, restriction, restrictive form, and other comparable terms. All these expressions denote the strict and clearly definable rule, method, procedure, or structure that generates every work that can be properly called Oulipian.”
  3.  Oulipo Compendium. Compiled by Harry Mathews and Alastair Brotchie. London: Atlas Press, 1998.
  4. If you get anything out of Lescure’s “Brief History of the Oulipo,” I hope it is a sense of playfulness, which is not to say that Oulipo isn’t a serious endeavor and that the Oulipians aren’t serious artists. It is and they are.
  5. Of course, no complete list of Oulipian contraintes is possible because every conceivable contrainte, even those not yet identified, would need to be included.

Lecture: On Ramsay’s “‘Patacomputing” and “Postconditions”

With the chapter “‘Patacomputing,” Stephen Ramsay returns us to the concept of pataphysics (the science of imaginary solutions), which he introduced us to in the second chapter of Reading Machines, “Potential Literature.” If you recall that reading and its corresponding lecture, you’ll remember that Ramsay drew specifically upon C. P. Snow’s 1959 lecture “The Two Cultures and the Scientific Revolution” and Snow’s call for a third culture, one that combined scientific thought and analysis with the “imaginative experience” of the humanities.1 It is here, in this third culture, Ramsay told us in chapter two, that we would find algorithmic criticism.

As we saw in the first chapter of Reading Machines, computers are really good at certain kinds of textual analysis, such as creating lists of words used by different characters.2 We can ask computers to engage in fairly sophisticated textual analysis, such as the example Ramsay uses with Virginia Woolf’s The Waves, in which he asks a computer program to:

  1. Identify the words spoken by one of the six main characters,
  2. Create a list of the most commonly used words for each of those characters, and
  3. Drop from each list words shared by other characters.

The product was a list of words for each of the six characters that were unique to each of those characters. Using the character of Louis as an example, Ramsay notes that the words unique to Louis reflect issues that critics have often noted to be among Louis’s preoccupations (12). While we can write sophisticated textual analysis programs, their result, Ramsay notes in the “‘Patacomputing” chapter we’re reading this week, is almost invariably a list of some sort. To sum up his discussion of various text analysis programs – WordHoard, David Hoover’s study of “vocabulary richness,” TAPoR, HyperPo, and MONK – what Ramsay has to say about MONK could be said about any of these examples: “The ‘result’ of a system like MONK is the same as that for virtually any text-analytical procedure: a textual artifact that, even if recapitulated in the form of an elaborate interactive visualization, remains essentially a list” (80).
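Ramsay’s three-step procedure is itself short enough to sketch in code. The miniature “speeches” below are invented toy data, not Woolf’s text, and the function is my own reconstruction of the steps he describes:

```python
from collections import Counter

def unique_common_words(speeches, top_n=5):
    """For each character: count their words, rank them by frequency,
    then drop any word that also occurs in another character's speech,
    leaving a list of words unique to that character."""
    counts = {name: Counter(text.lower().split())
              for name, text in speeches.items()}
    result = {}
    for name, counter in counts.items():
        other_words = set()
        for other, other_counter in counts.items():
            if other != name:
                other_words.update(other_counter)
        result[name] = [word for word, _ in counter.most_common()
                        if word not in other_words][:top_n]
    return result

# Invented stand-ins for the characters' speech in The Waves.
speeches = {
    "Louis": "the beast stamps the chained beast stamps",
    "Jinny": "the dance whirls the bright dance whirls",
}
print(unique_common_words(speeches))
```

On this toy data, the shared word “the” drops out, leaving each character’s distinctive vocabulary (“beast,” “stamps,” “chained” for Louis); run over the full novel, that remainder is the kind of list Ramsay examines.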

At this point, it might be all too easy to say “A list? That’s all we’re going to get? Okay, then, I’m done. Back to reading texts the old-fashioned way.” And one could do that, but one would be making the same mistake as the hypothetical user of WordHoard who, upon learning that the three most common words in Homer are “man, ship, and god,” thinks “Homer is about ‘man, ship, and god’ in that order. Stop reading right there” (Martin Mueller, “Digital Shakespeare” 123; qtd. in Ramsay 70). The problem with both approaches is that while each is entirely accurate as a description of the thing (textual analysis, computer-based or otherwise, and the corpus of Homer), neither is a complete description of the thing being described, any more than “an animal that lives in water” is a complete description of a seal.

The purpose of textual analysis, as you might recall, is to describe quantifiable features of a text such as word frequency, rhyme scheme, syntactical structure, words per sentence, adjectival and adverbial density, etc.: Homer is mostly about “man, ship, and god” (69-70); Rudyard Kipling’s children’s book Kim has more “vocabulary richness” than either William Faulkner’s Light in August or Henry James’s The Ambassadors (71-73); Christina Rossetti’s Goblin Market has 3107 words of which 1130 are unique, and averages 40.4 words per sentence (73-74); and in Goblin Market some of the collocations (that is, words which appear together at rates far above mere chance) are “Laura” and “should” and “Lizzie” and “not” (75-76). While we might look at these lists and think “so what,” we might also look at these lists and think “huh, I would have said Homer was about…” or we might ask “why?” or “what does it mean?” It’s these latter responses that lead us somewhere. As Ramsay suggests, “Algorithmic criticism is born at that moment” (71). To quote from the “An Algorithmic Criticism” lecture:

In short, Ramsay is making the argument that while literary computing has focused on using computers to do what computers do best, that is, engage in textual analysis (issues related to understanding the features of a text), we can ask computers to analyze texts in ways that produce data we can then use to identify fruitful lines of interpretive inquiry in order to engage in literary criticism. In other words, while we can’t yet ask computers to interpret literary texts for us, we can use them for more than straightforward analysis to describe the features of a text. While that description (analysis) can be meaningful and important for some kinds of inquiry, we can also use that analytical data as a starting point for our own acts of interpretation. This computer-assisted interpretation is what Ramsay names an “algorithmic criticism.”
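Ramsay’s distinction between analysis and criticism is easier to hold onto once we see how mechanical the analysis step really is. Below is a minimal Python sketch of the kinds of measures discussed in this chapter – total words, unique words (“vocabulary richness”), average sentence length, most frequent words. The two-sentence sample is invented for illustration; it is not a quotation from Rossetti or anyone Ramsay discusses:

```python
import re
from collections import Counter

def text_stats(text):
    """Compute simple textual-analysis measures of the kind Ramsay surveys:
    total words, unique words ("vocabulary richness"), average sentence
    length in words, and the most frequent words."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return {
        "total_words": len(words),
        "unique_words": len(counts),
        "avg_sentence_length": len(words) / len(sentences),
        "most_common": counts.most_common(3),
    }

# A tiny invented sample text (not Rossetti's actual poem).
sample = "Laura should not go. Lizzie should not look at the goblin men."
stats = text_stats(sample)
```

The output is, as Ramsay would predict, just a list of numbers and words; deciding what those figures mean – why, say, “should” clusters around one sister and “not” around the other – is where criticism begins.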

And this brings us back both to pataphysics and to Ramsay’s invoking of C. P. Snow. Textual analysis gives us data about texts, and sometimes that data might surprise us (that Homer’s works are primarily about “man, ship, and god” or that Shakespeare’s works are primarily about “lord, man, and sir”; that Kipling’s Kim is richer in vocabulary than Faulkner’s Light in August or James’s The Ambassadors; that “Lizzie” collocates with “not”; etc.); however, textual analysis doesn’t tell us what these things mean. That is the purview of literary criticism, the narrative explanation we create to account for the data the computer has given us. In other words, the ‘Patacomputing of the chapter’s title is the imaginative explanation (literary criticism) given to the result of one or more sets of scientifically derived observations (computer-assisted textual analysis). In short, ‘patacomputing is algorithmic criticism.

If we think of computers as antithetical to the Humanities, it might be well worth reflecting on the fact that, as Ramsay himself notes, we live “[i]n an age when the computer itself has gone from being a cold arbiter of numerical facts to being a platform for social networking and self-expression” (81). Computers are already deeply a part of the human experience.4 And that has been one of the main points of Ramsay’s book: that computers, like all tools, are human tools to be put to work for human ends. While textual analysis existed long before computers, computers make the work of textual analysis much, much easier. Keeping that in mind, it might be worthwhile to let Ramsay have the last word here. He concludes the “‘Patacomputing” chapter with a note about his hopes for algorithmic criticism:

algorithmic criticism looks forward not to the widespread acknowledgement of its utility but to the day when ‘algorithmic criticism’ seems as odd a term as ‘library-based criticism.’ For by then we will have understood computer-based criticism to be what it has always been: human-based criticism with computers (81).

  1. Snow, you might recall, was both a scientist and a novelist.
  2. Remember, here, Ramsay’s distinction between textual analysis and literary criticism. This may be a good time to reread the lecture on “An Algorithmic Criticism.”
  3. Mueller, Martin. “Digital Shakespeare or Toward a Literary Informatics.” Shakespeare 4.3 (2008): 284-301.
  4. Or, as Walter Ong suggests in “Digitalization Ancient and Modern: Beginnings of Writing and Today’s Computers,”: “there is nothing more natural for a human being than the artificial.”

Lecture: Introduction to Week 12

“In an age when the computer itself has gone from being a cold arbiter of numerical facts to being a platform for social networking and self-expression, we may well wonder whether those new kinds of critical acts are in fact already implicit in the many interfaces that seek only to facilitate thought, self-expression, and community. As with such recent inventions, the transforming effect will come through ‘the change of scale, or pace, or pattern that it introduces into human affairs’ (McLuhan 8).1 Once those changes are acknowledged, the bare facts of the tools themselves will seem, like the technical details of automobiles or telephones, not to be the main thing at all. In this sense, algorithmic criticism looks forward not to the widespread acknowledgement of its utility but to the day when ‘algorithmic criticism’ seems as odd a term as ‘library-based criticism.’ For by then we will have understood computer-based criticism to be what it has always been: human-based criticism with computers.” – Stephen Ramsay, Reading Machines: Toward an Algorithmic Criticism, 81

“Contrainte. The usual French word for the basic element in Oulipian practice has been variously translated in this volume as constraint, restriction, restrictive form, and other comparable terms. All these expressions denote the strict and clearly definable rule, method, procedure, or structure that generates every work that can be properly called Oulipian.” – Oulipo Compendium, 131

The OED definition of algorithm.

Our focus this week might best be summed up by the question, “What does it mean to read and write algorithmically?”

Algorithm, as we can see from the Oxford English Dictionary, comes from the domain of mathematics, with an emphasis on using a procedure or set of rules to solve problems, and has been applied to computing and to other formalized step-by-step protocols.

So, for one to read and/or write algorithmically, we might say that one would need to use a formalized set of rules with the intent of solving a problem or obtaining a specific result. And that brings us to the three sets of readings for this week: the final two chapters of Stephen Ramsay’s Reading Machines: Toward an Algorithmic Criticism, the Oulipo readings, and the Twitter bot readings.

In his exploration of how we might use computers for interpretive purposes – that is, to aid us in literary criticism – Ramsay returns this week to the concept of pataphysics, which was the subject of the chapter “Potential Literature.” In short, the chapter “‘Patacomputing” is about using the algorithmic processes of computers to see texts differently, which, in turn, can lead to new ways of reading them.

By now, the Oulipo and their interest in potential literature should be familiar to us. We’ve read about them in Ramsay’s and Hayles’s books, we’ve encountered them in previous lectures, and we’ve even read a few of their works. As a group dedicated to potential literature, the goal of the Oulipo “is to invent (or reinvent) restrictions of a formal nature (contraintes)2 and propose them to enthusiasts interested in composing literature” (“Introduction: Jacques Roubaud, The Oulipo and Combinatory Art (1991),” 38).3 As Jean Lescure explains in the “Brief History of the Oulipo” – quoting Oulipo co-founder François Le Lionnais – the Oulipo sought “new forms and structures that may be used by writers any way they see fit” (176). These new forms and structures are themselves a kind of algorithm, a procedure or set of rules used to solve a problem, the problem being, according to Raymond Queneau, the Oulipo’s other co-founder, how to “escape from that which is called inspiration” (Lescure 176).
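The claim that a contrainte is a kind of algorithm can be made quite literal. Consider the Oulipo’s well-known N+7 procedure, which replaces every noun in a text with the noun found seven entries later in a dictionary. Here is a toy Python sketch of that rule; the word list and the crude noun test (simple membership in the list) are invented stand-ins for a real dictionary:

```python
# A sketch of the Oulipian N+7 contrainte: replace each noun with the
# noun seven entries after it in a dictionary. The "dictionary" below is
# a tiny invented word list, and a word counts as a noun simply by being
# in that list -- both are stand-ins for the real procedure.
DICTIONARY = sorted([
    "apple", "arrow", "badger", "basket", "beacon", "bell", "bird",
    "boat", "bone", "book", "bottle", "branch", "bridge", "brook",
])

def n_plus_7(text, dictionary=DICTIONARY, n=7):
    out = []
    for word in text.split():
        if word in dictionary:
            i = dictionary.index(word)
            # Wrap around at the end of the dictionary.
            out.append(dictionary[(i + n) % len(dictionary)])
        else:
            out.append(word)
    return " ".join(out)

print(n_plus_7("the bird sat on the branch"))  # → the brook sat on the beacon
```

The procedure is strict, clearly definable, and generates a new text from an old one: exactly what the Oulipo Compendium says a contrainte does, and exactly what we mean by an algorithm.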

If Ramsay’s Reading Machines is a search for an algorithmic form of reading and the Oulipo’s various contraintes are forms of algorithmic writing, then the creation and deployment of Twitter bots, including any number of literary Twitter bots, might be seen as a form of algorithmic reading and writing combined. Consider, for example, Ingrid Lunden’s description of how Ranjit Bhatnagar created the Twitter bot “Pentametron”:

After that, he wrote a program to search for tweets that were written in iambic pentameter (10 beats to the line in a heartbeat rhythm), by referencing every word in each tweet received against the CMU Pronouncing Dictionary, an online resource produced by Carnegie Mellon’s School of Computer Science that identifies the stresses in words. […].

At that point he refined Pentametron to search for tweets that fit the meter and feet but also rhymed with each other, and when two were found, it would retweet them in a couplet. “To get a rhyme you throw away 100 lines that are in pentameter that don’t rhyme,” he notes.

We see here Bhatnagar writing a computer program to read and analyze existing tweets in order to select those which meet specific criteria, and then to combine those tweets and send them out on Twitter as a rhyming couplet of iambic pentameter. In short, the Pentametron Twitter bot engages in a form of algorithmic reading in order to find source material to be combined, as per a specific set of contraintes, in an act of algorithmic writing.
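Lunden’s description translates fairly directly into code. The Python sketch below imitates Pentametron’s first filtering step. Bhatnagar’s actual bot looks every word up in the CMU Pronouncing Dictionary; here a tiny hand-made stress dictionary (with values chosen so that one famous line of pentameter scans) stands in for it:

```python
# A toy imitation of Pentametron's filtering step. The real bot consults
# the CMU Pronouncing Dictionary; this hand-made table of stress patterns
# (0 = unstressed syllable, 1 = stressed) is an invented stand-in, so only
# lines built from these few words can be scanned.
STRESSES = {
    "shall": "0", "i": "1", "compare": "01", "thee": "0",
    "to": "1", "a": "0", "summer's": "10", "day": "1",
}

def stress_pattern(line):
    """Concatenate each word's stress pattern, or return None if any
    word is missing from the dictionary (the real bot skips such tweets)."""
    pattern = ""
    for word in line.lower().split():
        if word not in STRESSES:
            return None
        pattern += STRESSES[word]
    return pattern

def is_iambic_pentameter(line):
    """Ten syllables alternating unstressed/stressed: 0101010101."""
    return stress_pattern(line) == "0101010101"

print(is_iambic_pentameter("Shall I compare thee to a summer's day"))  # → True
```

The second step Lunden describes – holding candidate lines until a rhyming partner turns up, then retweeting the pair as a couplet – would sit on top of this filter, comparing the phonetic endings of each line’s final word.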

  1. McLuhan, Marshall. Understanding Media: The Extensions of Man. 1964. Cambridge: MIT Press, 1994
  2. From the Oulipo Compendium: “The usual French word for the basic element in Oulipian practice has been variously translated in this volume as constraint, restriction, restrictive form, and other comparable terms. All these expressions denote the strict and clearly definable rule, method, procedure, or structure that generates every work that can be properly called Oulipian.”
  3. Oulipo Compendium. Compiled by Harry Mathews and Alastair Brotchie. London: Atlas Press, 1998.

Lecture: On Hayles’ “Intermediation: From Page to Screen”

A Hypertext Editing System Console at Brown University, 1969. Photograph by Greg Lloyd. Some rights reserved.
In the “Intermediation: From Page to Screen” chapter of Electronic Literature: New Horizons for the Literary, N. Katherine Hayles introduces the concept of intermediation as a way of understanding the complex interactions taking place between a human reading a work of electronic literature and the texts themselves as they are performed by “an intelligent machine” (44). Intermediation, she explains, is derived from Nicholas Gessler’s work on dynamic hierarchies, which Hayles renames dynamic heterarchies.

A dynamic hierarchy or heterarchy, Hayles explains, is “a multitiered system of feedback and feedforward loops” in which the feedback and feedforward loops continually circulate throughout a system (45). She offers the fetus in the womb as an example: “The mother’s body is forming the fetus, but the fetus is also re-forming the mother’s body; both are bound together in a dynamic heterarchy, the culmination of which is the emergent complexity of an infant” (45).

This concept of dynamic heterarchies is important for our understanding of human-computer interactions because, as Hayles notes, we have long known that humans both shape their tools and are in turn shaped by their tools, and we have evidence that this has been the case since Paleolithic times (47). This is, I’ll note, one of the driving assumptions behind the work of Marshall McLuhan and Walter Ong: We shape our tools and our tools shape us. It’s worth recalling the OHM Thesis at this point, which states that:

“technologies of representation, communication, and mediation, when adopted widely in any cultural setting, and maintained over at least a generation or two of use, begin to alter fundamentally the cultural epistemologies and discursive practices of that culture” (Casaregola, 211).

Compare the OHM Thesis with Hayles’ description of Gessler’s dynamic hierarchies:

“a first-level emergent pattern is captured in another medium, which leads to an emergent result captured in turn by yet another medium, and so forth. The result is what researchers in artificial life call a ‘dynamic hierarchy,’ a multi-tiered system in which feedback and feedforward loops tie the system together through continuing interactions circulating throughout the hierarchy […]. Distinguished by their degree of complexity, different levels continuously inform and mutually determine each other” (45).

While we have long existed as part of a dynamic hierarchy/heterarchy1 with our tools, Hayles argues that computers are significantly different from tools such as “a hammer or a stone ax” because “a computer has much more flexibility, interactivity, and cognitive power” (48). For example, she notes, unlike other kinds of technologies, computers “are able to handle both natural language and programming code,” a feature that allows for the creation of “complex human-computer networks” (48). In short, because humans and computers can communicate with each other, we exist in a much more complex dynamic system with computers than with other tools.

Intermediation, then, is the movement of feedback and feedforward within a dynamic heterarchy that takes place across mediums such as between humans and computers: “Humans engineer computers and computers reengineer humans in systems bound together by recursive feedback and feedforward loops, with emergent complexities catalyzed by leaps between different media substrates and levels of complexity [that is, between humans and computers]” (48).

So, how does this relate to reading electronic literature such as Michael Joyce’s “Twelve Blue,” Maria Mencia’s “Birds Singing Other Bird’s Songs,” and Judd Morrisey’s “The Jew’s Daughter”? As Hayles explains in her discussion of each – and foregrounds in her discussion of “Twelve Blue” – reading these texts isn’t a straightforward process, especially if we approach them from the perspective of print. In fact, the Oulipo’s combinatory literature (such as Queneau’s “A Story As You Like It” and Fournel’s “The Theater Tree: A Combinatory Play”) and interactive fiction, both of which we looked at during Week 6, conform far more closely to a print-based reading sensibility than the three examples of electronic literature we’ve looked at this week.

As Hayles explains, we don’t read texts like Joyce’s “Twelve Blue,” Mencia’s “Birds Singing Other Bird’s Songs,” and Morrisey’s “The Jew’s Daughter” so much as we explore or play/perform them. We find our way in by interacting with them in some way, and our interactions change the texts, which in turn give us new meanings. Each new interaction brings about new changes to the text, which in turn provide new patterns that we decipher and draw upon as we decide our next form of interaction, and so on. In short, to read a work of electronic literature such as the three texts we’ve looked at this week, we enter into a dynamic heterarchy with that text, and this all happens through the act(s) of intermediation.

  1. Hayles explains that she prefers the term heterarchy to foreground the fact that all levels of the system interact and influence each other in a dynamic fashion in ways that we’re not used to seeing in hierarchical systems (45).

Lecture: On Price’s “Edition, Project, Database, Archive, Thematic Research Collection”

While Kenneth Price’s purpose in writing “Edition, Project, Database, Archive, Thematic Research Collection: What’s in a Name?” is to propose a new term for the kinds of “large-scale, text-based electronic scholarship” (par. 1) now taking place, the article also serves a useful purpose in bringing together and exploring various terms we’ve encountered this term. As he notes, while the various terms – edition, project, database, archive, thematic research collection – have all been applied to these forms of electronic scholarship, each carries different connotations, and the boundaries between them are far from clear.

For example, while we’re using the terms project and edition for the McLuhan Project and the Electronic Edition project, one could easily argue that the McLuhan Project involves creating an edition, and that both projects could just as easily be called thematic research collections. Even as we find the boundaries between these terms permeable, it’s useful to understand how they differ as well.

And while I’m not particularly fond of Price’s suggestion – arsenal – his larger point stands: digital environments are allowing us to reimagine what it means to do text-based scholarship and the ways in which that scholarship can take form.

Lecture: Introduction to Week 11

“The danger in applying critical models developed for print is that the new possibilities opened for literary creation and interpretation will simply not be seen. Whatever limitations intermediation as a theory may have, its virtue as a critical framework is that it introduces computation into the picture at a fundamental level, making it not an optional add-on but a foundational premise from which to launch further interrogation.” – N. Katherine Hayles, “Intermediation: From Page to Screen,” Electronic Literature: New Horizons for the Literary, 83

The common theme that unified our readings this week, both Hayles’ “Intermediation: From Page to Screen” and Price’s “Edition, Project, Database, Archive, Thematic Research Collection: What’s in a Name?,” is the understanding that digital technologies and environments open up new possibilities for how we produce and engage with texts of all sorts. If we fail to recognize this, we risk approaching our new media environment (the digital) with the assumptions and practices of the old (print). To invoke McLuhan, we run the risk of “look[ing] at the present through a rear-view mirror,” and in doing so we risk simply not seeing the new: the new ways of making meaning, the new ways texts can work, the new ways we can explore and analyze texts, and the new ways we are called on to interact with them.