Sunday, September 14, 2008

Response to Hayles' Writing Machines

Hayles distinguishes her concept of technotext from hypertext and cybertext. Those terms focus too narrowly on links, in the case of hypertext, and on the interactivity required of users to traverse ergodic texts (Espen Aarseth's neologism), in the case of cybertext. What is more important to her for developing Media-Specific Analysis is when

a literary work interrogates the inscription technology that produces it, it mobilizes reflexive loops between its imaginative world and the material apparatus embodying that creation as a physical presence. ... Technotexts play a special role in transforming literary criticism into a material practice, for they make vividly clear that the issue at stake is nothing less than a full-bodied understanding of literature. (pages 25-26)

The examples she gives of second-generation electronic literature, such as the Califia project and Lexia to Perplexia, bear out this distinction, as do print examples like A Humument and House of Leaves that foreground the materiality of the book form. Technotexts are the target of her Media-Specific Analysis, a method of literary analysis that "moves from the language of text to a more precise vocabulary of screen and page, digital program and analogue interface, code and ink, mutable image and durable mark, computer and book" and whose power "comes from holding one term constant across media (in this case, technotexts) and varying the media to explore how medium-specific possibilities and constraints shape texts" (pages 30-31).

To see how MSA works, consider the challenge of finding analogues to our contemporary studies of state-of-the-art digital media in antiquity, when the state of the art was alphabetic writing itself. Parts of Plato's Phaedrus, for example, explicitly address (as artistic strategies) the materiality of the text: a text is something carried around as a physical object (the recorded speech of Lysias about love), something from which new interpretations can be derived merely by negating the putatively recorded text, and something whose composition displays interesting combinatorial properties, as with the Midas epitaph (264C-E). There are even places in the Phaedrus where the Socratic dialectic breaks down, the skeptical Phaedrus casting doubt on the sensibility of Socrates' entire line of questioning whenever it broaches the boundaries of the implicit conventions of writing.

Shifting back to the present, a salient feature of some technotexts is creole discourse that combines ordinary spoken English and computer code, as found in Lexia to Perplexia. Hayles states that its purpose, beyond whatever artistic or entertainment value it carries, is to form "the medium through which the origin of subjectivity can be re-described as coextensive with technology" (page 53). I, too, have found myself mixing technical acronyms, strings of signifiers cast in Backus-Naur Form (BNF), pseudocode, and actual program source code in my notes to illustrate a point, convey an idea, or delineate an incomplete position requiring later emendation by my future selves or perhaps by autonomous helpers via cyberspace. It is important to consider that the function of such creole can go far beyond presenting material metaphors when it reveals the very fabric of cyberspace upon which technotexts are rhapsodized. Put another way, by interrogating the alterity (a Feenberg term?) of the non-human, technologically mediated components of the writing machines we encounter, we become better suited to participate in the iterative re-creation of subsequent systems. The flip side of this potential is a drift toward the emptiness of Baudrillard's precession of simulacra when we ignorantly and gratuitously employ the creole language of human and machine codes, throwing in a little C++ here, a little command-line lingo there, but only as an inauthentic spectacle of the bustling activity in the circuits and networks behind the browser screen.
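To make the point concrete, here is a small, hypothetical creole fragment of my own: the grammar of a toy link notation given first in BNF for the human reader, then as C++ for the machine. Nothing in it comes from Hayles; the notation, the sample sentence, and the names are illustrative only.

    // A creole fragment: the grammar of a tiny link notation,
    // first in Backus-Naur Form (for the human reader), then as code
    // (for the machine). All names here are made up for illustration.
    //
    //   <link>   ::= "[" <label> "->" <target> "]"
    //   <label>  ::= one or more characters other than "-" or "]"
    //   <target> ::= one or more characters up to "]"

    #include <iostream>
    #include <regex>
    #include <string>

    int main() {
        const std::string lexia = "The screen glows [threshold->node_42] and waits.";
        // The machine "reads" the sentence only for the bracketed link...
        const std::regex link_pattern(R"(\[([^-\]]+)->([^\]]+)\])");
        std::smatch m;
        if (std::regex_search(lexia, m, link_pattern)) {
            std::cout << "label:  " << m[1] << '\n'
                      << "target: " << m[2] << '\n';
        }
        return 0;
    }

The comments speak to the human reader in ordinary English while the regular expression speaks the machine's idiom; read together, they are a modest instance of the creole described above.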

I see this as precisely the place to expand our philosophical conception of cyberspace, since it plays such an important role in MSA. Hayles takes an ambivalent stance towards Claude Shannon's classic communication model, in which the human participants are cleanly separated and stand outside of the channel as "sender" and "receiver," preferring the view that "artifacts such as this book serve as noisy channels of communication in which messages are transformed and enfolded together as they are encoded and decoded, mediated and remediated" (page 130). Despite the shortcomings of Shannon's model, she argues

House of Leaves suggests that the appropriate model for subjectivity is a communication circuit rather than discrete individualism, for narration remediation rather than representation, and for reading and writing inscription technology fused with consciousness rather than a mind conveying its thoughts directly to the reader. (page 130)

This rejection of "discrete individualism" helps avoid the postmodern pitfalls while strengthening the requirement for a sophisticated understanding of cyberspace. For when MSA shifts into the realm of machine communication, a new territory opens, advancing into the idioms of specific technological systems, that the careful humanist scholar can explore. Therefore, I think it is very important to present technically correct examples when writing creoles of text amenable to natural automata (humans) and, what normally remains a fantasy, amenable to artificial automata (electronic computing machinery, what Michael Heim calls 'general intelligence'), if we actually intend to feed the creole text to both. My response to Hayles is to help this along by expanding the view of cyberspace I previously presented in "The Free Open Source Option as Ethic."
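As a gesture toward such technically correct examples, here is a minimal sketch in C++ of the Shannon-style circuit discussed above: a sender's message is encoded, perturbed by a noisy channel, and decoded by a receiver. The names, the noise model (random character substitution), and the sample message are my own illustrative assumptions, not anything taken from Hayles or Shannon.

    // A minimal sketch of the classic communication circuit:
    // sender -> encoder -> noisy channel -> decoder -> receiver.
    // The "noise" simply swaps characters at random, standing in for
    // the transformations a material medium works on a text.

    #include <iostream>
    #include <random>
    #include <string>

    // Encoder: the sender's message rendered as a transmittable signal.
    std::string encode(const std::string& message) { return message; }

    // Channel: the material medium, which never passes the signal through untouched.
    std::string channel(std::string signal, double noise_rate, std::mt19937& rng) {
        std::uniform_real_distribution<double> flip(0.0, 1.0);
        std::uniform_int_distribution<int> letter('a', 'z');
        for (char& c : signal) {
            if (flip(rng) < noise_rate) c = static_cast<char>(letter(rng));
        }
        return signal;
    }

    // Decoder: the receiver reconstructs a message from whatever arrives.
    std::string decode(const std::string& signal) { return signal; }

    int main() {
        std::mt19937 rng(2008);
        const std::string sent = "the medium transforms the message";
        const std::string received = decode(channel(encode(sent), 0.1, rng));
        std::cout << "sent:     " << sent << '\n'
                  << "received: " << received << '\n';
        return 0;
    }

Even this toy circuit makes the point legible: what arrives is never simply what was sent, and the transformation belongs to the channel itself rather than to either participant.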


This undertaking is greatly aided by free, open source software running in TCP/IPv4 networks of hosts and routers, whose openness offers the promise of access to the human-readable source code underlying the machine-mediated generation of texts delivered to the browser via HTTP from webserver hosts. I call this availability the "affordance of epistemological transparency," and it is a tool for cutting through the useless information about a given technological product to reach the truly unique and unknown (but worth knowing) parts. By defining cyberspace in this way, a potential for radical epistemological transparency arises that could never exist for communication systems involving human computers, listeners, readers, and writers in addition to external media, because the workings of the human mind cannot be completely understood. Technotexts that exist for the most part in cyberspace, delivered via browsers for instance, and that combine human-readable and machine-readable text - not merely as underlying control programs but as something to be 'read' in the narrative sense by artificial intelligence, perhaps in a virtual world co-inhabited by humans, cyborgs, and machines - demand a high degree of technical understanding merely to exist. This is nothing less than collaborative machine- and human-readable narrative.
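To keep that promise of technical correctness, here is a minimal sketch, in C++ over POSIX sockets, of the HTTP exchange just described: the same plain-text request a browser issues, made directly so that the human-readable headers and markup behind the rendered page are laid bare. The host example.com is a placeholder of my own, and I am assuming a Unix-like system with the standard socket headers.

    // Fetch a page the way a browser does, but print the raw text
    // instead of rendering it: a small exercise in epistemological
    // transparency over a TCP/IPv4 network.

    #include <iostream>
    #include <string>

    #include <netdb.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        const char* host = "example.com";  // placeholder webserver host

        // Resolve the webserver host to an IPv4 address.
        addrinfo hints{};
        hints.ai_family = AF_INET;
        hints.ai_socktype = SOCK_STREAM;
        addrinfo* result = nullptr;
        if (getaddrinfo(host, "80", &hints, &result) != 0) {
            std::cerr << "could not resolve host\n";
            return 1;
        }

        // Open a TCP connection to port 80.
        int fd = socket(result->ai_family, result->ai_socktype, result->ai_protocol);
        if (fd < 0 || connect(fd, result->ai_addr, result->ai_addrlen) != 0) {
            std::cerr << "could not connect\n";
            freeaddrinfo(result);
            return 1;
        }
        freeaddrinfo(result);

        // Send the same plain-text request a browser would.
        const std::string request =
            "GET / HTTP/1.1\r\nHost: " + std::string(host) +
            "\r\nConnection: close\r\n\r\n";
        send(fd, request.c_str(), request.size(), 0);

        // Read back the human-readable headers and markup.
        char buffer[4096];
        ssize_t n;
        while ((n = recv(fd, buffer, sizeof(buffer), 0)) > 0) {
            std::cout.write(buffer, n);
        }
        close(fd);
        return 0;
    }

Run against any open webserver, it prints the raw text the browser would otherwise silently render on our behalf.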
