Wednesday, December 8, 2010

Dissertation Prospectus

Influence of Early Personal Computers and Texts on Programming Style
John Bork
University of Central Florida

Summary

Interest in learning computer programming outside of computer science peaked from the mid 1980s through the early 1990s, as personal computers diffused widely into public schools and universities in the United States. During this period, Sherry Turkle revealed through ethnographic research that children learning to program exhibited a number of distinct programming styles. Often these individual styles clashed with the dominant culture, forcing some children to conceal their true stylistic preferences, thwarting the creative impulses of others, and discouraging still others from writing programs altogether. Her theoretical model relies strongly on developmental psychology and post-structuralist cultural critique; it attributes only minor relevance to the specific computer languages being learned, and makes no mention of the different brands and models of computers being used. This dissertation prospectus outlines a strategy for discovering how structural, rhetorical, and media-specific features of early personal computers and their accompanying texts relate to the development of individual programming style, by interviewing people who learned to program on such systems in the late 1970s through the mid 1980s. The purpose of the data collection phase is to determine the relevant platforms, texts, and programming styles. The analysis phase will seek connections among them through qualitative analysis of the empirical interview data, while close readings of Original Equipment Manufacturer (OEM) manuals, books, and magazines associated with these platforms will employ theories and methods from the study of texts and technology to address the gap in Turkle's work. The synthesis stage of the dissertation may yield recommendations for best practices in teaching computer programming in general settings, and may guide the development of future computing platforms and accompanying texts designed for learning.

Literature Review

For a period from the late 1970s through the early 1990s, the rapid proliferation of personal computers in public schools and universities generated substantial interest among humanities scholars in learning computer programming (Papert; Turkle, 1984; Turkle and Papert; Perkins, Schwartz, and Simmons), in how skills related to programming may transfer to other domains (Hoar; Mayer), and in the overall phenomenon of the growing presence of computers in education (Shields). Then the interest diminished, and scholarly study of learning programming devolved to the specific domains of computer science and technology education, addressed mainly to those children considering programming as a career (Gillespie; Ge, Thomas and Greene; Panell). There is no clear picture of the current state of computer technology instruction in American public and private schools; however, the emphasis has likely shifted from programming-centric to applications-centric instruction. (If necessary, this hypothesis will be tested via empirical research, but it is not anticipated as a precondition affecting the validity of the proposed methodology.) Put another way, programming-centric learning environments equivalent to those of the late 1970s through early 1990s (offering at least the same affordances) are likely not offered to the same extent, as a percentage of school age children, as they were during that period. Interest in programming over application-based literacy has diminished since the 1990s, as articulated by David Brin in a 2006 online article, "Why Johnny Can't Code,” among others (Turkle, 1995; Bogost). The few who do promote the casual learning of programming in general academic contexts recommend very simple, modern markup languages such as HTML and XML (Cummings). Recent observers of this phenomenon suggest that learning programming needs a different basis, such as using long defunct, early models of personal computers (PCs), on account of their relative simplicity and operational proximity to the raw hardware (Brin; McAllister; Bogost).
Is such an approach sensible? Has mainstream American culture shifted such that it is not reasonable to expect there to be a great deal of interest in what I would like to call general-purpose, casual programming? Within the realm of cultural studies, a possible means of analysis is to examine whether particular models of early PCs encouraged the development of particular programming styles, and how different programming styles fit into the overall culture of the time. While it may be fanciful to resurrect obsolete PC models to encourage children to learn programming, there may be lessons that can be learned from their design, marketing, and the texts that accompanied them, especially bundled instruction and reference manuals, for the next generation of devices.

Seymour Papert invented the Logo programming language as part of his vision of bringing together the power of easily programmable computers and the innate creativity of small children, as recounted in his 1980 book Mindstorms: Children, Computers, and Powerful Ideas. He set other researchers in motion and helped introduce personal computers to American schools. Sherry Turkle, who performed ethnographic studies of children learning to program in the late 1970s through mid 1980s, and later published with Papert, argued that children naturally express different programming styles. She called them 'hard mastery' and 'soft mastery'; children flourished or floundered in their efforts at learning programming depending on whether the learning environment and culture were compatible with their form of mastery (Turkle, 1984; Turkle and Papert, 1990; Turkle and Papert, 1991). She used the term 'bricolage', taken from Claude Levi-Strauss' anthropological studies of pre-industrial societies, to represent a programming style that “works on a problem by arranging and rearranging these elements, working through new combinations” (1984, p. 105). Essential differences between the formal, canonical programming style that is typically taught and the 'bricoleur' style are the programmer's relationship to the materials (abstract versus concrete) and attitude toward errors (avoidance versus acceptance). Turkle writes, “for planners, mistakes are missteps; for bricoleurs they are the essence of a navigation by mid-course corrections. For planners, a program is an instrument for premeditated control; bricoleurs have goals, but set out to realize them in the spirit of a collaborative venture with the machine” (2000, p. 136). Most of her research focused on the Logo programming language, which she argued was superior to BASIC in affording expression of the soft mastery style, and she made no distinction between the programming language and the particular model of machine her subjects were using. She leaves to a footnote a brief analysis of the affordances of Logo over BASIC: “Not all computer systems, not all computer languages offer a material that is flexible enough for differences in style to be expressed. A language such as BASIC does not make it easy to achieve successful results through a variety of programming styles” (1984, p. 105).

Turkle's interest in programming styles represents one aspect of research in learning computer programming. Studies about what teaching techniques do and do not work, and whether and how particular skills such as algorithm flowcharting and troubleshooting (debugging) may transfer from a programming environment to other domains, are more typical. Richard Mayer edited Teaching and Learning Computer Programming: Multiple Research Perspectives in 1988, a book containing research articles reflecting the rapid proliferation of personal computers in public schools and universities. The history of research on teaching and learning computer programming, as related by Mayer in the introduction, began with strong claims by Papert and others concerning the expected positive outcomes of non-directed methods for teaching programming. This was followed by empirical research revealing disappointing realities: “learning even the rudiments of LOGO appeared to be difficult for children and transfer to other domains seemed minimal” (p. 3). The state of the field in 1988, characterized by multidisciplinary research and theory, had retreated from the early, positive claims. Two trends he notes are “first, instead of advocating discovery methods of instruction, current research suggests that more guidance is needed to insure learning. . . . Second . . . transfer is most likely to occur for skills most similar to those learned in programming” (p. 4). Nonetheless, Papert's influence is evident in the number of chapters that deal with the Logo language. The next four chapters of Teaching and Learning Computer Programming focus on Logo: componential problem solving, cognitive analysis of learning Logo, its influence on intellectual development, and teaching methods that emphasize transfer of general skills. Chapter 11 returns to the topic of skills transfer from programming environments to non-programming contexts, again using Logo, and, contrary to Mayer's assessment in the introduction, claims success in teaching a skill that can be transferred beyond computer programming environments.

The other 'learning languages' besides Logo of note in Mayer's anthology are BASIC and Pascal. Chapter 7 presents the research of Perkins, Schwartz, and Simmons on “Instructional Strategies for Novice Programmers,” in which a 'metacourse' was designed to provide supplementary material to a BASIC programming curriculum. It attempts to address three common sources of difficulty for beginning programmers: fragile knowledge of the domain, lack of elementary problem-solving strategies, and attitudinal problems of confidence and control towards computers. This theme is continued in Chapter 8, which investigates the social context of learning computer programming. In Chapter 9 the results of a particular instructional project (the Autonomous Classroom Computer Environments for Learning) in high school Pascal programming classes are analyzed. Chapter 10 is a case study of typical errors students make in an introductory Pascal class. A glance at the history of scholarly journals combining computing and teaching reveals an expectant atmosphere in which it was assumed by most that programming instruction would continue to expand in public school curricula.

The bulk of current literature on learning computer programming is published within computer science and technology education domains, focusing on classroom settings where the currently popular languages used by professional programmers are taught, and, not surprisingly, on post-secondary instruction. A typical research article is "Exploring the relationship between modularization ability and performance in the C programming language: the case of novice programmers and expert programmers” by Maurice Vodounon. The days of general-purpose, casual programming instruction appear to be gone, along with wood shop, sewing, and other 'home economics'. While research publications about learning computer programming have increased in quantity over the last two decades, by and large they appear in domain-specific journals and conferences rather than venues more popular for 'humanities computing'. This is not to disregard the rich dialog in new disciplines like Software Studies (Manovich) and Critical Code Studies (Wardrip-Fruin). However, these theorists appear more concerned with overall, societal technology consumption than with the topic of learning programming.

A few standouts among humanities scholars who do directly address learning programming can be found, and they will be considered next. It is easier to find direct continuations of the themes developed in Mayer's anthology than of Turkle's work, although Turkle is often invoked in studies of class, race, and gender biases in instructional and work environments, often associating hard mastery with masculine preferences and soft mastery with feminist themes (McKenna; Lynn, Raphael, and Olefsky; Sutton; Lau and Yuen). Recent research into 'pair programming' by Jill Denner and Linda Werner further elaborates on the importance of background knowledge and social support for novices, which was the subject of the metacourse designed by Perkins, Schwartz, and Simmons.

Robert E. Cummings published an interesting article in 2006, “Coding with power: Toward a rhetoric of computer coding and composition,” that explicitly ties programming, as a form of composition targeted towards machine readers, to composition targeted towards human readers. Besides making a theoretical argument linking programming and composition by presenting a parallel of the rhetorical triangle for addressing machines, he offers practical tips for how to implement his ideas in a class. In private correspondence, Cummings supports, as a replacement for writing BASIC programs, having students write Extensible Markup Language (XML) Document Type Definition (DTD) representations of the rhetorical elements of narratives and essays that they are studying. His emphasis on writing software code to clarify conceptual understanding alludes to the object-oriented programming style, which can be seen, in addition to Turkle's bricoleur style, as a rich ground for asserting individual perspectives by focusing on how programming problems map onto their external referents rather than on how to most efficiently code algorithms. Turkle recognized the trend toward object-oriented metaphors in Life on the Screen, exemplified by such groundbreaking changes in human-computer interfaces as the desktop-oriented Apple Macintosh of the mid 1980s. By the mid 1990s her research had turned from studying how people learn to program computers to how people relate to and live in computer environments in general.

From this review of scholarly literature on learning programming a number of gaps are evident, and these gaps appear to be echoed by the popular press, which laments America's loss of its edge in technical innovation attributable to failures of the educational system. The argument of the article by Soloway, Spohrer, and Littman in Chapter 6 of Teaching and Learning Computer Programming is that it is better to focus on the process rather than the product when the goal is to teach that problems can be solved in multiple ways. This idea complements Turkle's notion of epistemological pluralism. Her early investigation of programming styles could be refreshed using more recent programming languages and computing platforms, to see what is happening in elementary and secondary schools today. Alternatively, there is the unasked question of whether the specific, material conditions of learning programming influence the development of individual style. By material conditions I mean both the computer platforms themselves, such as the Apple II+ and Commodore 64, and the texts that were bundled with them by the manufacturer, such as the Applesoft Tutorial and the Commodore 64 Programmer's Reference Guide, as illustrated in Figure 1.

This question about the influence of specific platforms and media can be studied by interviewing people who learned to program during the same period in which Turkle did her research on Logo. First, N. Katherine Hayles (2004) coined the term Media-Specific Analysis (MSA) to describe a critical technique, applicable here, that takes into account the physical and structural properties of different media that putatively perform the same rhetorical function. Second, the site of learning programming can be shifted from formal instruction – classroom settings in colleges, universities, public and private schools – to where many people in America first learned to write programs: at home, alone, self-taught via OEM manuals, library books, and magazines. Many respondents to online surveys and discussion forums state that they were self-taught (see Table 1 in the Data Collection section). In these settings the texts used alongside the computer itself often fulfilled the role of an accompanying instructional regime. A third gap to consider is extending the categories of programming styles beyond Turkle's dichotomy of hard and soft mastery by importing knowledge from mainstream computer science literature and the history of software studies. For instance, there may be a recognizable stylistic difference between procedural and object-oriented approaches. There may also be distinct styles tied to specific computing platforms, a point made by Bogost and Montfort in their cultural and technical study of the Atari 2600 game console. In the spirit of qualitative research, however, these questions are best answered using a mixed method of critical analysis guided by empirical study of human subjects.

Methodology

The research questions I am posing revolve around the gaps in Turkle's early work on programming styles, informed by theorists like Hayles who, with Turkle, recognize that specific technologies and texts can have a broad impact on learning, development, and personality, perhaps including, in this case, programming style. Understanding the relevance of the affordances of specific technological systems and texts serves to remind us that certain obsolete platforms may offer advantages for learning programming over contemporary, putatively superior ones. The outcome of this study may reveal recommendations for best practices in teaching computer programming in general settings, and may guide the development of future computing systems (for example, those using all free, open source software) and their accompanying texts, both printed and online, when they are designed as platforms for learning. These research questions are:

  • What platforms were used in learning to program in the late 1970s through mid 1980s?

  • What printed texts (OEM manuals, popular press books, textbooks, magazines) were used?

  • What programming styles developed?

My methodology is summarized in the following outline. See Appendix D for the overall project milestones and target completion dates.

  1. Data Collection Stage

    1. Refine Interview Protocol

    2. Select Interview Candidates

    3. Submit IRB materials

    4. Conduct Interviews

  2. Data Encoding Stage

    1. Code Interview Data from Forms

    2. Transcribe and Code Significant Audio Recordings

  3. Data Analysis Stage

    1. Qualitative Analysis of Key Platforms, Texts, and Programming Styles

    2. Structural, Rhetorical, and Media-Specific Analysis of Key Platforms and Texts

  4. Synthesis Stage

Data Collection

An interview protocol has been developed that elicits data about what computer platforms and what texts people used while learning to program. It also inquires into when, where, and with whom they did so, as well as what kinds of programs they wrote. At the heart of the interview process is a single page form that provides prompts for the researcher and room to take organized notes. See Appendix A for a blank form and an image of one that was filled out during a trial interview. Trials of the interview process took 30-60 minutes to cover all of the items on the form, including a substantial amount of free discussion at the end of the interview. However, the protocol needs to be adjusted to provide prompts for learning more about the programming style characteristic of the subject. Turkle seems to imply a connection between programs that ask the user questions and soft mastery, although her assignment of style hinges more on how the programmer relates to the computer platform running the program and to the source code than on the actual nature of the programs written. This may be due to her choice of environment (instructional setting) and programming language (Logo) influencing the type of programs being written. It is hoped that Sherry Turkle or one of her students can be solicited to provide guidance. Susan Lammers' interviews with influential programmers in her 1986 book Programmers At Work are helpful for seeing the sorts of responses that may indicate programming style without begging the question, because their wider scope itself gives narrative, biographical evidence of programming style.

Interviews will be conducted over telephone, voice over IP (VOIP), or in person. During the interviews the researcher takes handwritten notes using the interview form, and also makes an audio recording for later transcription if permission has been given to do so. The Informed Consent Form in Appendix C provides a check box where the interview subject can elect to be recorded or not. In either case, the researcher must be diligent in removing all personally identifying information from any quotations or paraphrases that appear in subsequent oral and written presentations based on this research.

The intended subjects for this study are persons likely to have used early personal computer platforms and associated texts (manuals, books, magazines, etc.) that were popular in the United States from the late 1970s through mid 1980s. Interview candidates will be selected based on their participation in existing surveys, polls, and discussion group threads that address how, when, and with what platforms they first learned to program computers. Sample data sources from which to cull potential interview subjects are shown in the following table. Obviously, only those resources that provide email addresses or another way to contact the respondent, such as through the discussion forum messaging system, will be considered.

URL: http://stackoverflow.com/questions/348098/are-you-a-self-taught-programmer-or-did-you-take-a-programming-course

Survey Question: Are you a self-taught programmer or did you take a programming course?

Sample Response: Self taught using ZX BASIC/assembly on a ZX spectrum. Got it for the games but quickly became very interested in what was happening underneath. No internet/forums so just had to make it up as I went along. Then did a university degree which required us to do programming but did not really teach it (apart from some simple Fortran 77). Was good for me as I was really interested in programming anyway. Then used Fortran/C++ in the real world just by continued learning on the job. Have continued self teaching ever since (e.g reading stackoverflow) but I don't get to do as much programming as I would like these days...... (posted Dec 7, 2008)

Respondent: Registered User “luapyad”

URL: http://blogs.msdn.com/b/johnmont/archive/2006/05/03/586766.aspx

Survey Question: How did you learn to program?

Sample Response: I started as a hobbiest in high school typing in programs from Compute! magazine into my C=64. The most programming I did at that point was debugging my type-o's which forced me to understand the source of the problem in order to fix them. I started professionaly with MSAccess (using Mike Groh's Access 95 book). I now work with VB.Net/Sql Server/ASP.Net/etc. Most of the learning I get at this point is through trial-error. I also find user groups to be a good source of free training as well as networking opportunities. For a new programmer, the best learning will be to give them a simple task and have them figure out how to solve it. Sure they will make mistakes along the way, but we often learn best from our mistakes. We can read all the books we want, but actually doing is what counts. As evidence, consider the number of people who read the certification books, pass the tests and still can't code their way out of a shoe-box. (posted May 3, 2006)

Respondent: Registered User “Jim Wooley” (registration required to access additional user profile information)

URL: http://www.quartertothree.com/game-talk/showthread.php?p=2082157

Survey Question: How did you learn to program?

Sample Response: I started learning (and learned about backups) simultaneously; the first while modifying my uncle's TRS-80 Color Computer animation program, and the second by saving my alterations to his original tape in ignorance of cassette tape leaders. (posted February 4, 2010)

Respondent: Username “ciparis”

Table 1: Data sources for selecting interview candidates

Interviews may also be solicited through blog postings and other social media, as well as personal references. There are a number of software development businesses in the greater Orlando area including Toptech Systems, Electronic Arts, Fiserv, and NCR with which the researcher has ties. A sample email invitation to participate in the research is shown in Appendix B. Following University of Central Florida Institutional Review Board (IRB) procedures, all interview subjects must read, sign, and return a copy of the Informed Consent Form prior to being interviewed.

Data Encoding

Data will be encoded from the interview forms and audio recordings. Sets of identifiers will be created for each category (computing platform, texts, types of programs written, and other stylistic indicators) to normalize the data into units represented by machine configurations, printed texts, and programming styles. A spreadsheet may be used to initially encode the handwritten interview forms, although it is anticipated that custom software will be developed in conjunction with the Spring 2011 Digital Editing and Database course to facilitate encoding. Table 2, below, is a spreadsheet representation of six sample interviews, displaying the interview number, the subject's age and the year they started programming, the computer platforms used, and the OEM texts they recall using. The full interview form contains many more dimensions, some of which may take multiple entries for each interview subject to reflect concurrency and temporal change in their programming activities.

Number | Age | Year | Computer Model | OEM Manual
1 | 18 | 1978 | Mainframe PDP |
2 | 18 | 1980 | PDP-10 or 11 minicomputer |
3 | 13 | 87/88 | Apple //e, PC jr (later that year) |
4 | 10 | 1985 | C-64, 486 in college | big thick manual that came with it
5 | 9 | 1979 | Vic-20, C-64, PCs late 80s, Mac 2006 | Reference Manual, 64 had 6510 instruction set, section on architecture, very basic overview of kernel, sprites, sounds
6 | 24 | 1978 | Heathkit H-100 (H8) dual 8085/8080 kit, Zenith Z-100 integrated unit | MSDOS/ROM BASIC came with ROM chips, not the computer

Table 2: Spreadsheet containing interview data from 6 sample interviews

The next level of encoding is to normalize dimensions and category information from the spreadsheet data using an enumerative coding scheme (see Geisler, chapter 4). It is anticipated that the data will be loaded into MySQL relational database tables designed for this project to allow manipulation by custom programs for analysis and display of results. Appendix E contains the Structured Query Language (SQL) commands used to create the sample database and tables that are shown below. The tables will contain fields for the encoded categories of each dimension (as an integer reference to a secondary lookup table), as well as the raw interview transcription if it is available, or the raw researcher notes (as a text field). It is important to retain the raw interview text, since the data will likely be iteratively recoded as more results accrue and emerging patterns reshape the direction of the study. The table structure will also support multiple instances of the same dimension for each interview, reflecting that subjects may recall having used multiple computer platforms over the years they learned to program within the target time frame of the late 1970s through mid 1980s. The two sample table definitions shown in Figure 2 and Figure 3 represent the enumeration of various dimensions and their possible categories.

These are considered secondary lookup tables so that the integer key can be used to consistently reference each category type across all of the interviews. The encoding task requires identifying the relevant, distinguishable computer platforms and assigning each an integer; these are the categories of the dimension 'computer platform'. A similar process must be carried out for the OEM texts, magazines, and other materials recounted during the interviews, and for the types of programs written.
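To make the enumerative scheme concrete, the following sketch shows what one such secondary lookup table might look like in MySQL. The table and column names here are hypothetical illustrations only; the actual definitions appear in Appendix E and in Figures 2 and 3.

    -- Hypothetical lookup table for the 'computer platform' dimension.
    -- Each distinguishable platform recounted in the interviews receives one row,
    -- so that its integer key can be referenced consistently across interviews.
    CREATE TABLE ComputerPlatformLookup (
        ComputerPlatformID INT NOT NULL AUTO_INCREMENT,
        Description        VARCHAR(255) NOT NULL,
        PRIMARY KEY (ComputerPlatformID)
    );

    INSERT INTO ComputerPlatformLookup (Description) VALUES
        ('Commodore VIC-20'),
        ('Commodore 64'),
        ('Apple II+ / IIe');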

Then each interview can be encoded using these enumeration numbers to facilitate automated analysis, as shown in Figure 4. Notice the use of the InstanceEnumeration field to indicate when there are multiple instances of the same dimension for an interview, reflecting the temporal ordering of, for example, the Commodore Vic-20 followed by the Commodore 64 reported by InterviewNumber 5. A utility program is envisioned for entering the data from the interview form directly into the database rather than through an intermediary spreadsheet representation.
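As a sketch of how an encoded interview row might look, the following hypothetical table definition and sample rows illustrate the use of InstanceEnumeration to order multiple instances of the same dimension within one interview. The field names are illustrative only; the actual schema is given in Figure 4 and Appendix E.

    -- Hypothetical table holding one row per instance of a dimension per interview.
    CREATE TABLE InterviewDimension (
        InterviewNumber     INT NOT NULL,
        DimensionType       VARCHAR(64) NOT NULL,  -- e.g. 'ComputerPlatform', 'OEMText'
        CategoryID          INT NOT NULL,          -- integer key into a lookup table
        InstanceEnumeration INT NOT NULL,          -- orders multiple instances in time
        RawText             TEXT,                  -- raw researcher notes or transcription
        PRIMARY KEY (InterviewNumber, DimensionType, InstanceEnumeration)
    );

    -- Interview subject 5 recalled a Vic-20 followed by a Commodore 64 (see Table 2).
    INSERT INTO InterviewDimension VALUES
        (5, 'ComputerPlatform', 1, 1, 'Vic-20'),
        (5, 'ComputerPlatform', 2, 2, 'C-64');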

It is understood that the encoding models for many of the dimensions captured by the form, and that may be derived from transcriptions of the recorded audio, have yet to be defined. In particular, the categories of the dimension 'programming style' have yet to be expanded beyond Turkle's dichotomy of hard and soft mastery, let alone heuristics developed for determining those styles from raw data. Therefore, one of the first tasks in the project time line presented in Appendix D is to refine the interview protocol to achieve these goals.

Data Analysis

Data analysis will be divided into two parts: analysis of the interview data, and analysis of computer platforms and texts. Interview data will be analyzed using custom software written by the researcher to perform qualitative analyses or to prepare the datasets for analysis by third-party software applications. Nominal data about the use of computer platforms, texts, and programming styles will be generated to perform quantitative statistical analysis of correlations between platforms/texts and programming styles, as well as other pairings of variables that emerge as possibly significant. This data will also inform the selection of the specific platforms and texts that will be scrutinized in detail as items of material culture. The analysis of computer platforms and texts represents a study of material culture and is the least developed part of this proposal. The cultural milieu and circumstances of production of many early personal computers are recounted in Freiberger and Swaine's Fire in the Valley. It is anticipated that future coursework in 2011 will flesh out the methods for this content analysis. Any custom software created for this project will be licensed under a popular free, open source license such as the GNU General Public License (GPL) version 2 or higher. The analysis methods will include assessment of the reliability and validity of the measurement system, following the recommendations made by Lauer and Asher (see Chapter 7 on Measurement).
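As an illustration of the kind of automated tabulation the custom software might perform, the following hypothetical MySQL query cross-tabulates computer platforms against programming style codes, counting how many interview subjects fall into each pairing. It assumes the sketched tables above, and a programming style dimension encoded in the same manner, rather than the actual schema in Appendix E.

    -- Hypothetical cross-tabulation of platform against programming style code.
    SELECT p.Description AS Platform,
           s.CategoryID  AS StyleCode,
           COUNT(DISTINCT plat.InterviewNumber) AS Subjects
    FROM InterviewDimension plat
    JOIN InterviewDimension s
      ON s.InterviewNumber = plat.InterviewNumber
     AND s.DimensionType   = 'ProgrammingStyle'
    JOIN ComputerPlatformLookup p
      ON p.ComputerPlatformID = plat.CategoryID
    WHERE plat.DimensionType = 'ComputerPlatform'
    GROUP BY p.Description, s.CategoryID
    ORDER BY Subjects DESC;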

Synthesis

The final stage of the dissertation project will synthesize the findings of the data analysis. In connecting this work with mainstream research in teaching and learning computer programming, I am hypothesizing that certain texts may impose "more guidance" than their absence or other texts during the unmediated discovery of the computer's programming possibilities, and that certain texts may provide other functions of the 'metacourse' supplement to formal programming instruction developed by Perkins, Schwartz, and Simmons for novice programmers. Texts may also function like a partner in the "pair programming" research discussed by Denner and Werner, without the authoritarian auspices of an adult teacher. Also to consider is how media-specific features of colorful, spiral-bound printed texts such as The Applesoft Tutorial differentiate their rhetorical and practical functions from other types of texts, such as browser-based resources or help features built into Integrated Development Environments (IDEs). For example, consider the rhetorical impact of its cover (Figure 5), which contains a program that can be typed in and run; this gave the Apple platform an advantage because the manual could be put to use at face value. In terms of designing an optimal text today, while it is true that printed and video display media can both show the picture on the cover, there may be advantages to having a physical, printed book from which to learn about how to use the computer, versus using the same computer to display the instructional text. At minimum, the study will offer a response to the conundrum of why it may seem better to return to earlier states of the art in computer technology in order to learn programming in general and to develop individual style, both innate and shaped by the environmental affordances of the technologies in use.

Works Cited

Bogost, Ian. (2010, February 19). Pascal spoken here: Learning about learning programming from the Apple ][. Message posted on http://www.bogost.com/blog/pascal_spoken_here.shtml. Retrieved October 3, 2010.

Brin, David. (2006, September 14). Why Johnny can't code. Message posted on http://www.salon.com/technology/feature/2006/09/14/basic. Retrieved October 3, 2010.

Cummings, Robert E. (2006). Coding with power: Toward a rhetoric of computer coding and composition. Computers and Composition, 23(4), 430-443.

Denner, Jill, & Werner, Linda. (2007). Computer programming in middle school: How pairs respond to challenges. Journal of Educational Computing Research, 37(2), 131-50.

Freiberger, P., & Swaine, M. (2000). Fire in the valley: The making of the personal computer. New York: McGraw-Hill.

Ge, Xun, Thomas, Michael K., & Greene, Barbara A. (2006). Technology-rich ethnography for examining the transition to authentic problem-solving in a high school computer programming class. Journal of Educational Computing Research, 34(4), 319-52.

Geisler, Cheryl. (2004). Analyzing streams of language. New York: Pearson Education, Inc.

Gillespie, C.W. (2004). Seymour Papert's vision for early childhood education? A descriptive study of preschoolers and kindergarteners in discovery-based, Logo-rich classrooms. Early Childhood Research and Practice, 6(1).

Hayles, N. Katherine. (2004). Print is flat, code is deep: The importance of media-specific analysis. Poetics Today, 25(1), 67-90.

Hoar, Nancy. (1987). Conquering the myth: Expository writing and computer programming. College Composition and Communication, 38, 93-95.

Lammers, Susan. (1986). Programmers at work: Interviews with 19 programmers who shaped the computer industry. Redmond, WA: Tempus Books of Microsoft Press.

Lau, Wilfred W. F., & Yuen, Allan H. K. (2009). Exploring the effects of gender and learning styles on computer programming performance: Implications for programming pedagogy. British Journal of Educational Technology, 40(4), 696-712.

Lauer, Janice M., & Asher, J. William. (1988). Composition research: Empirical designs. New York: Oxford University Press.

Lynn, Kathleen-M., Raphael, Chad, & Olefsky, Karin. (2003). Bridging the gender gap in computing: An integrative approach to content design for girls. Journal of Educational Computing Research, 28(2), 143-162.

McAllister, Neil. (2008, October 2). Should computer programming be mandatory for U.S. students? Infoworld.

McKenna, Peter. (2000). Transparent and opaque boxes: Do women and men have different computer programming psychologies and styles? Computers & Education, 35(1), 37-49.

Mayer, Richard E. (1988). Introduction. In Richard E. Mayer (Ed.), Teaching and learning computer programming: Multiple research perspectives. Hillsdale, N.J: L. Erlbaum Associates.

Panell, Chris. (2003). Teaching computer programming as a language. Tech Directions, 62(8), 26-27.

Papert, Seymour. (1980). Mindstorms: Children, computers, and powerful ideas. New York: Basic Books.

Perkins, D. N., Schwartz, Steve, & Simmons, Rebecca. (1988). Instructional strategies for the problems of novice programmers. In Richard E. Mayer (Ed.), Teaching and learning computer programming: Multiple research perspectives. Hillsdale, N.J: L. Erlbaum Associates.

Shields, Mark A. (1995). The legitimation of academic computing in the 1980s. In Mark A. Shields (Ed.), Work and technology in higher education: The social construction of academic computing. Technology and education. Hillsdale, N.J: Lawrence Erlbaum Associates.

-----. (1995). The social construction of academic computing. In Mark A. Shields (Ed.), Work and technology in higher education: The social construction of academic computing. Technology and education. Hillsdale, N.J: Lawrence Erlbaum Associates.

Sutton, Rosemary E. (1991). Equity and computers in the schools: A decade of research. Review of Educational Research, 61(4), 475-503.

Turkle, Sherry. (1984). The second self: Computers and the human spirit. New York: Simon & Schuster.

-----. (1995). Life on the screen: Identity in the age of the Internet. New York: Simon & Schuster.

Turkle, Sherry, & Papert, Seymour. (1990). Epistemological pluralism: Styles and voices within the computer culture. Signs, 16(1), 128-157.

-----. (1991). Epistemological pluralism and the revaluation of the concrete. Journal of Mathematical Behavior, 11(1), 3-33.

Vodounon, Maurice A. (2006, June 22). Exploring the relationship between modularization ability and performance in the C programming language: the case of novice programmers and expert programmers. The Free Library.

Weinstein, Peter. (1999). Computer programming revisited. Technology & Learning, 19(8), 38-42.

Werner, Linda, & Denner, Jill. (2009). Pair programming in middle school: What does it look like? Journal of Research on Technology in Education, 42(1), 29-49.

Wednesday, October 13, 2010

Programming Survey

The refined problem that interests me concerns the rhetorical and media-specific functions of the texts that people used to learn (or turn away from) programming computers, particularly (based on initial responses) the OEM manuals packaged with early personal computers that contain programming examples and language tutorials (for example, The Applesoft Tutorial that accompanied Apple //+ computers from 1979-1982). A cursory look at online discussion threads and polls reveals that many people consider themselves self-taught, or started programming at home long before receiving any formal instruction at school. These resources can be analyzed to reveal the computer models used to learn programming. Sometimes they mention using OEM manuals, magazines or other texts; however, a new survey and interview process needs to be created that focuses on discovering these texts and how they were used. Use a blog posting to host the survey and its responses, or some other hosted, online survey instrument. Invite participants of prior discussions and surveys to participate in the new survey (eventually; for now just invite friends, coworkers, and colleagues). Interview candidates can then be selected from the survey respondents. Please answer these questions:

Have you ever written computer programs?

Do you remember how old you were when you started programming?
(Or what grade you were in?)

Can you recall the setting of your first programming experiences? (Was it at school, at home, at a friend's, some other family member, another place?)


Did you work alone or with other people?

And what model did you use? (Was it made by Apple, Commodore, IBM, Tandy, Texas Instruments?)

What was the programming language?

Do you remember if you used any books that were with the computer?

What do you remember about them?

Did they contain programs that you typed in?

Did they contain pictures?

Did you read the text?


Ask similar questions of those who respond negatively to the first question and did not program computers. They will still recall the first places they remember using computers, their approximate age, and, perhaps less clearly, the model. The researcher (or proxy) can guess at which texts they may have used, and for certain models, the fact of their absence. Thus I can collect information about the negative effects of bundled texts for particular models and particular people who never became programmers. Please answer these questions:


If you did not write programs, do you remember how old you were when you first used a computer?

Can you recall the setting of your first computer using experiences?

Were you alone or with other people?


Do you remember what type of computer it was?


Do you remember using any books that were with the computer?


Do you remember anything about them?


In connecting this work with mainstream research in teaching and learning computer programming, I am hypothesizing that certain texts may impose "more guidance" than their absence or other texts during the unmediated discovery of the computer's programming possibilities, and that certain texts may provide the function of the "metacourse" recommended by Perkins, Schwartz, and Simmons for novice programmers. Texts may also function like a partner in the "pair programming" research discussed by Denner and Werner without the authoritarian auspices of an adult teacher. Also consider how media-specific features of colorful, spiral-bound printed texts such as The Applesoft Tutorial differentiate their rhetorical and practical functions from other types of texts, such as browser-based resources or help features built into Integrated Development Environments.


References

  • Bogost, Ian. "Pascal Spoken Here: Learning about Learning Programming from the Apple ][." [Weblog entry.] Ian Bogost Blog. 19 Feb 2010. (http://www.bogost.com/blog/pascal_spoken_here.shtml) 12 Oct 2010.
  • Brin, David. "Why Johnny Can't Code." Salon. Salon, 14 Sept 2006. (http://www.salon.com/technology/feature/2006/09/14/basic). 3 Oct 2010.
  • Cummings, Robert E. "Coding with power: Toward a rhetoric of computer coding and composition." Computers and Composition 23.4 (2006): p. 430-443. Print.
  • Denner, Jill, and Linda Werner. "Computer Programming in Middle School: How Pairs Respond to Challenges." Journal of Educational Computing Research 37.2 (2007): 131-50. Print.
  • Mayer, Richard E. Introduction. Teaching and Learning Computer Programming: Multiple Research Perspectives. Ed. Richard E. Mayer. Hillsdale, N.J: L. Erlbaum Associates, 1988. Print.
  • Perkins, D.N, Steve Schwartz, and Rebecca Simmons. "Instructional Strategies for the Problems of Novice Programmers." Teaching and Learning Computer Programming: Multiple Research Perspectives. Ed. Richard E. Mayer. Hillsdale, N.J: L. Erlbaum Associates, 1988. Print.
  • Vodounon, Maurice A. "Exploring the relationship between modularization ability and performance in the C programming language: the case of novice programmers and expert programmers." The Free Library. 22 June 2006. (http://www.thefreelibrary.com/Exploring the relationship between modularization ability and...-a0144705087) 10 Oct 2010.

Participate in the survey by copying the appropriate set of questions into a comment (programmer or nonprogrammer).

Wednesday, October 6, 2010

Refining Research Questions

Searching for two to three research papers to find the gaps:

  • Brin, David. (2006). Why Johnny Can't Code. Salon. 9/14/2006.

  • Cuban, L. (2001). Oversold and underused: Computers in the Classroom. Harvard U. Press.

  • Cummings, Robert E. (2006). Coding with power: Toward a rhetoric of computer coding and composition. Computers and Composition v. 23 no.4, p. 430-443.

  • Denner, Jill. "The Girls Creating Games Program: An Innovative Approach to Integrating Technology into Middle School." Meridian: a Middle School Computer Technologies Journal Winter 2007. 4 Oct. 2010. http://www.ncsu.edu/meridian/win2007/girlgaming/index.htm

  • Denner, Jill, and Linda Werner. "Computer Programming in Middle School: How Pairs Respond to Challenges." Journal of Educational Computing Research 37.2 (2007): 131-50. Education Full Text. Web. 4 Oct. 2010.

  • "From codex to code: programming and the composition classroom." Computers and Composition 16.3 (1999): 319-436. Education Full Text. Web. 4 Oct. 2010.

  • Ge, Xun, Michael K. Thomas, and Barbara A. Greene. "Technology-Rich Ethnography for Examining the Transition to Authentic Problem-Solving in a High School Computer Programming Class." Journal of Educational Computing Research 34.4 (2006): 319-52. Education Full Text. Web. 4 Oct. 2010.

  • Gillespie, C.W. (2004). Seymour Papert’s vision for early childhood education? A descriptive study of preschoolers and kindergarteners in discovery-based, Logo-rich classrooms. Early Childhood Research and Practice, 6(1).

  • Hoar, Nancy. "Conquering the myth: expository writing and computer programming." College Composition and Communication 38 (1987): 93-5. Education Full Text. Web. 4 Oct. 2010.

  • Kocian, Lisa. (2010). One computer for every student. Boston Globe. 2/11/2010.

  • Lau, Wilfred W. F., and Allan H. K. Yuen. "Exploring the effects of gender and learning styles on computer programming performance: implications for programming pedagogy." British Journal of Educational Technology 40.4 (2009): 696-712. Education Full Text. Web. 4 Oct. 2010.

  • Lynn, Kathleen-M., Chad Raphael, and Karin Olefsky. "Bridging the Gender Gap in Computing: An Integrative Approach to Content Design for Girls." Journal of Educational Computing Research 28.2 (2003): 143-62. Education Full Text. Web. 4 Oct. 2010.

  • McAllister, Neil. (2008). Should computer programming be mandatory for U.S. students? Infoworld. 10/2/2008.

  • McKenna, Peter. "Transparent and opaque boxes: do women and men have different computer programming psychologies and styles?." Computers & Education 35.1 (2000): 37-49. Education Full Text. Web. 4 Oct. 2010.

  • National Center for Education Statistics. (2005). Computer Technology in the Public School Classroom: Teacher Perspectives.

  • Panell, Chris. "Teaching Computer Programming as a Language." Tech Directions 62.8 (2003): 26-7. Education Full Text. Web. 4 Oct. 2010.

  • Solomon, Justin. "Programming as a Second Language." Learning and Leading with Technology 32.4 (2005): 34-9. Education Full Text. Web. 4 Oct. 2010.

  • Sutton, Rosemary E. (1991). Equity and Computers in the Schools: A Decade of Research. Review of Educational Research. Vol. 61, No. 4 (Winter, 1991), pp. 475-503.

  • Weinstein, Peter. "Computer programming revisited." Technology & Learning 19.8 (1999): 38-42. Education Full Text. Web. 4 Oct. 2010.

  • Werner, Linda, and Jill Denner. "Pair Programming in Middle School: What Does It Look Like?" Journal of Research on Technology in Education 42.1 (2009): 29-49. Education Full Text. Web. 4 Oct. 2010.

The Brin article is an opinion piece but hits the subject directly, asking why BASIC or some other language is not provided on home computers for children to type in sample programs found in their math textbooks. He declares that “Microsoft and Apple and all the big-time education-computerizing reformers of the MIT Media Lab are failing, miserably. For all of their high-flown education initiatives (like the "$100 laptop"), they seem bent on providing information consumption devices, not tools that teach creative thinking and technological mastery.” The opinion piece by McAllister attacks the stereotype of the programmer as nerdy, male social misfit by appealing to the new needs of the global marketplace. His argument is that baseline computer literacy, including programming fundamentals, should be a requirement: “Make no mistake; the days when knowledge of computer programming was a ticket to a golden future are over. In today's globalized job market, computer literacy should be seen as a baseline skill for the U.S. workforce, not a differentiator.” The Wikipedia entry for “Educational Programming Languages” fills in many details to bring things up to the present. Its external links include an interesting news story about the Scratch language. A recent article by Lisa Kocian in the Boston Globe describes technology initiatives aimed at providing “one computer for every student,” noting that “Maine is at the forefront of technology in the schools, with all public school students in the seventh and eighth grades receiving a laptop computer.” However, the article does not describe what the children will be doing with the computers, beyond emphasizing the need for teacher buy-in and training. Nor does the U.S. government report on teacher perspectives of the adequacy of computer technology in public school classrooms, although it does introduce some statistics. Nancy Hoar wrote a short piece in 1987 linking expository writing and programming. The most promising article so far is by Robert Cummings, who “explores the connection between computer programming (coding) and traditional composition.” He does not directly cite any quantitative or qualitative research, though (I sent him an email asking for more information). The article by Lynn, Raphael and Olefsky does involve empirical findings, but its focus is the gender gap in computer use in general, not programming in particular. Peter McKenna's article directly picks up on Turkle's work, but does not appear to contain any empirical data. Lau and Yuen's article does contain research findings, but its focus on gender in Hong Kong schools strays from my interest in the current state of programming instruction in general in the U.S. Denner's article on girls creating a computer game contains research data but is still not in the direct sphere of texts and technology studies. Her article with Linda Werner on how pairs of girls respond to programming challenges gets closer still, containing qualitative empirical data and citing Turkle, and may be transportable into the composition studies area as a way to implement Cummings' starter ideas. See also their 2009 article that contains coded transcripts. Then there is the article by Ge, Thomas and Greene that almost enunciates my original research idea.
“Technology-Rich Ethnography for Examining the Transition to Authentic Problem-Solving in a High School Computer Programming Class” states in its abstract, “our findings lend support to the argument that teachers in high school computer programming classes should incorporate the following features in their curricula: open-ended problem solving, real-world clients, group work, student autonomy and ample opportunities for student creative expression. ”

My sense from the first iteration of the research question postings was that my topics did not have clear connections to texts and technology, despite the fact that I was invoking theorists whose work we read in various T&T courses, in particular Sherry Turkle. Therefore, I spent some time this weekend looking for journal articles dealing with the intersection of computer programming and English composition. In the process I found a number of qualitative research studies looking at differences in programming styles between men and women, most of which cited Turkle. Robert E. Cummings published an interesting article in 2006, “Coding with power: Toward a rhetoric of computer coding and composition,” that explicitly ties programming, as a form of composition targeted towards machine readers, to composition targeted towards human readers. Besides making a theoretical argument linking programming and composition by presenting a parallel of the rhetorical triangle for addressing machines, he offers practical tips for how to implement his ideas in a class. I contacted him to see whether he has indeed enacted his idea in any of his classes, and if so, whether he tried to measure anything, or whether he knows of any external studies conducted to test his hypotheses or related to them. But when it comes down to want-to-do-ability, I still wish to focus on improving our understanding of casual, everyday programming as it exists in America, and how children are being exposed to programming instruction, in order to meditate upon the art of programming for humanities scholars and as a subject for texts and technology theorists.

Part I: Framing the Problem and Considering the Context

The problem in which I am interested is that the sense of urgency and value in teaching computer programming as a basic skill to American youth appears to have diminished since the late 1980s and early 1990s, when Turkle was doing her ethnographic studies of programming styles. The disciplinary and professional communities interested in this problem may of course be technology educators, but I believe the problem is also relevant to ethnographers of technology as well as texts and technology theorists, because programming computers is one of the quintessential acts of post-literate culture. Conversational partners include those like the later Turkle who argue that deep structure cognitive styles have been replaced by mastery of surface applications, as well as those who continue to rally for the importance of the former. My purpose for conducting this research is to explore the current diffusion of programming as an everyday skill in the United States, demonstrating its value for non-professional as well as professional achievement, with the hope that it will influence policy and practice to help America regain its competitive edge in the global economy.

Part II: Writing Focused Research Questions for your Problem and Context

  1. What exposure to computer programming instruction did Americans who were the first generation to grow up with personal computers have in their schooling? What about subsequent generations, up to the present? This may entail both quantitative measures (how much exposure? Over what period of time?) and qualitative details about what was learned (different languages) and how this knowledge was applied (for example, in programming contests and after-school activities).

  2. In what ways and to what extent have they written software in a non-professional context, and do they currently do so?

  3. In what ways, if any, do they feel that their programming experience helps them solve everyday problems, come up with novel ideas, and communicate their ideas more effectively?

Part III: Choosing Research Methods to Answer Your Questions

  1. Survey (web-based), interview small number of survey participants from each generation

  2. Survey (web-based), interview small number of survey participants

  3. Interview small number of survey participants

Assessing Your Plan

Do-ability/feasibility: The availability of web-based survey tools and services makes the initial survey feasible, and the population is accessible online, with the exception of the current generation (children in school now). While I can speak the language as a professional programmer and long time technology hobbyist, I need more training in interview and data collection skills.

Should-do-ability: My initial searching has not found any research directly addressing these questions. The topic is significant based on the presence of theoretical articles and opinion pieces that address the current state of general programming instruction in schools and its perceived advantages. There do not seem to be any risks to participants or repercussions for me or the research participants.

Want-to-do-ability: While my personal interest has been shifting the more I get to know the disciplinary boundaries of this program, as long as I can keep relevant linkages, I care deeply about promoting general programming skills and am personally interested in talking to people my age about their experience of learning to use computers.

Monday, September 20, 2010

Forming a Research Question

Is it better to program as a bricoleur (following the famous ethnographer of electronic computing Sherry Turkle) or to not program at all?


inspirations
Epistemological Pluralism and the Revaluation of the Concrete
By Sherry Turkle and Seymour Papert
http://www.papert.org/articles/EpistemologicalPluralism.html

Wednesday, March 10, 2010

Working Code Abstract revised

From Codework to Working Code: A Programmer's Approach to Digital Literacy


John Bork, University of Central Florida


What does it mean to be digitally literate? Obviously it entails a basic familiarity with commonly used technologies, so that one may navigate the technological life world that has permeated nearly every aspect of the human one. One aspect of this knowledge is the recognition of computer languages, communications protocols, syntactic forms, passages of program code, and command line arguments, even when they have been taken out of their operational context for use as literary and rhetorical devices. In addition to the infiltration of the abbreviated language of email and text messaging into mainstream print media, it is now also commonplace to encounter programming keywords, symbols, operators, indentation, and pagination entwined with natural, non-technical, mother tongue expressions. Codework is the term associated with the literary and rhetorical practice of mixing human and computer languages (Hayles; Raley; Cramer). Types of codework span from intentionally arranged constructions intended for human consumption that do not execute on any real computer system, to valid expressions in bona fide programming languages that are meaningful to both human and machine readers. Examples of the former include the work of Mez (Mary-Anne Breeze) and Talan Memmott; of the latter, the work of John Cayley and Graham Harwood (Raley; Fuller). Rita Raley notes, however, that of the popular electronic literature of the early twenty first century, there is “less code per se than the language of code.” In addition to its infusion for literary effect, program source code may be cited in scholarly texts like conventional citations to explain a point in an argument. Although it is more common to encounter screen shots of user interfaces, examples of working source code appear on occasion in humanities scholarship. This study will briefly consider why working code has been largely shunned in most academic discourse, and then identify the types and uses of bona fide code that do appear, or are beginning to appear, in humanities scholarship. Its goal is to suggest ways in which working code – understood both as code that works, and as the practice of working code – plays a crucial role in facilitating digital literacy among social critics and humanities scholars, and to demonstrate through a number of examples how this effect may be achieved.


The first argument in favor of studying computer code in the context of humanities scholarship can be drawn from N. Katherine Hayles' methodological tool of Media-Specific Analysis (MSA). Probing the differences between electronic and print media when considering the same term, such as hypertext, requires comprehension of the precise vocabulary of the electronic technologies involved. A second, more obvious argument comes from the growing disciplines of Software Studies and Critical Code Studies. If critical analysis of software systems is to reveal implicit social and cultural features, reading and writing program code must be a basic requirement of the discipline (Fuller; Mateas; Wardrip-Fruin). As the media theorist Friedrich Kittler points out, the very concept of what code is has undergone radical transformations from its early use by Roman emperors as cipher to a generic tag for the languages of machines and technological systems in general; “technology puts code into the practice of realities, that is to say: it encodes the world” (45). Or, following the title of Lev Manovich's new, downloadable book, software takes command. Yet both Kittler and Manovich express ambivalence towards actually examining program code in scholarly work. A third argument, which will form the focus of this study, is reached by considering the phenomenon of technological concretization within computer systems and individual software applications. According to Andrew Feenberg, this term, articulated by Gilbert Simondon, describes the way “technology evolves through such elegant condensations aimed at achieving functional compatibilities” by designing products so that each part serves multiple purposes simultaneously (217). The problem is that, from the perspective of a mature technology, every design decision appears to have been made from neutral principles of efficiency and optimization, whereas historical studies reveal the interests and aspirations of multiple groups of actors intersecting in design decisions, so that the evolution of a product appears much more contingent and influenced by vested interests. The long history of such concretizations can be viewed like the variegated sedimentation in geological formations, so that, with careful study, the outline of a technological unconscious can be recovered. The hope is that, through discovering these concealed features of technological design, the unequal distribution of power among social groups can be remedied. Feenberg's project of democratic rationalization responds to the implicit oppression of excluded groups and values in technological systems by mobilizing workers, consumers, and volunteers to make small inroads into bureaucratic, industrial, and corporate decision making.


For computer technology in particular, digital literacy is the critical skill for connecting humanities studies as an input to democratic rationalizations as an output. Working code replaces the psychoanalytic session for probing the technological unconscious, offering tactics for freeing the convention-bound knowledge worker and high tech consumer alike. Many theorists have already identified the free, open source software (FOSS) community as an active site both for in-depth software studies and for rich examples of democratic rationalizations (Fuller; Yuill; Jesiek). Simon Yuill in particular elaborates the importance of revision control software for capturing and cataloging the history of changes in software projects. As a corollary to this point, it can be argued that concealed within these iterations of source code are the concretizations that make up the current, polished version of the program that is distributed for consumption by the end users, and from which the technological unconscious may be interpreted. However, even when they are freely available to peruse in public, web-accessible repositories, these histories are only visible to those who can understand the programming languages in which they are written. Therefore, it is imperative that humanities scholars who wish to critically examine computer technology for its social and cultural underpinnings include working code - as practicing programming - in their digital literacy curricula.


References


Cramer, Florian. “Language.” Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, Mass: The MIT Press, 2008. Print.


Feenberg, Andrew. Questioning Technology. New York: Routledge, 1999. Print.


Fuller, Matthew. Introduction to Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, Mass: The MIT Press, 2008. Print.


Hayles, N. Katherine. “Print is Flat, Code is Deep: The Importance of Media-Specific Analysis.” Poetics Today 25.1 (2004): 67-90. Print.


Jesiek, Brent K. “Democratizing Software: Open Source, the Hacker Ethic, and Beyond.” First Monday 8.10 (2003): n. pag. Web. 5 Oct. 2008.


Kittler, Friedrich. “Code.” Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, Mass: The MIT Press, 2008. Print.


Mateas, Michael. “Procedural Literacy: Educating the New Media Practitioner.” On The Horizon. Special Issue. Future of Games, Simulations and Interactive Media in Learning Contexts 13.1 (2005): n. pag. Web. 21 Oct. 2009.


Raley, Rita. “Interferences: [Net.Writing] and the Practice of Codework.” Electronic Book Review 8 Sept. 2002. n. pag. Web. 17 Oct. 2009.


Wardrip-Fruin, Noah. Expressive Processing: Digital Fictions, Computer Games, and Software Studies. Cambridge, MA: The MIT Press, 2009. Print.


Yuill, Simon. “Concurrent Version System.” Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, Mass: The MIT Press, 2008. Print.

Monday, February 1, 2010

Hayles Electronic Literature reread

Notes on N. Katherine Hayles's Electronic Literature

CHAPTER ONE

Electronic Literature: What Is It?

(3) Electronic literature, generally considered to exclude print literature that has been digitized, is by contrast “digital born,” a first-generation digital object created on a computer and (usually) meant to be read on a computer. .. The [Electronic Literature Organization] committee's formulation reads: “work with an important literary aspect that takes advantage of the capabilities and contexts provided by the stand-alone or networked computer.”


What about digitizations of print literature mediated by programs whose source code falls within Hayles's conception of EL? She implies that source code can be considered and interpreted as a part of EL: imagine source code containing 'verbatim' digitizations of print literature such as ancient Greek and Latin texts, texts beyond the grasp of any copyright, patent, trademark, or other law of the kind that put Socrates to death (not the biochemical law of the poison he presumably drank, but the state, government, body politic, collective consciousness). What about custom code consuming print literature? The ELO's formulation does not entail a universal rule, based on the specific example of the most basic digitization of print texts, that we would all agree with Hayles does not rise to the occasion of being sufficiently literary, a term she will soon introduce (and there could be a useful hyperlink here). So despite conjoining exclusion with the softening “generally,” Hayles has really opened the door to electronic texts that are powered, in part, by exact digitizations of what are commonly conceived as the authoritative and canonical originals, all of which, if in the public domain, can be cited. In fact she says as much on page 84. The next thing to consider is the question of style, and whether early versions of programs should be preserved, for we do not have a fixed number of them to consider as we do ancient texts. Can you imagine an index of combinations of key phrases such as “electronic literature” and “code work”, a sort of phasor (a degree beyond vector in physics, and another word I used to use often in my old notes, which I referred to a few sentences ago as “a useful hyperlink” to distinguish it from all the possible combinations, most of which are nonsense)?


(4) Hybrid by nature, it comprises a “trading zone” (as Peter Galison calls it in a different context) in which different vocabularies, expertises, and expectations come together to see what might emerge from their intercourse. .. I propose “the literary” for this purpose, defining it as creative artworks that interrogate the histories, contexts, and productions of literature, including as well the verbal art of literature proper.

GENRES OF ELECTRONIC LITERATURE

(5) The immediacy of code to the text's performance is fundamental to understanding electronic literature, especially to appreciating its specificity as a literary and technical production.

(6) Readers with only a slight familiarity with the field, however, will probably identify it first with hypertext fiction characterized by linking structures, such as Michael Joyce's afternoon: a story, Stuart Moulthrop's Victory Garden, and Shelley Jackson's Patchwork Girl. .. Although Storyspace continues to be used to produce interesting new works, it has been eclipsed as the primary Web authoring tool for electronic literature.

(6-7) Whereas early works tended to be blocks of text (traditionally called “lexia”) with limited graphics, animation, colors, and sound, later works make much fuller use of the multimodal capabilities of the Web; while the hypertext link is considered the distinguishing feature of the earlier works, later works use a wide variety of navigation schemes and interface metaphors that tend to deemphasize the links as such.

(7) hypertext fictions also mutated into a range of hybrid forms, including narratives that emerge from a collection of data repositories such as M.D. Coverley's Califia.

(8) Paraphrasing Markku Eskelinen's elegant formulation, we may say that with games the user interprets in order to configure, whereas in works whose primary interest is narrative, the user configures in order to interpret. .. In his pioneering study [Twisty Little Passages], [Nick] Montfort characterizes the essential elements of the form as consisting of a parser (the computer program that understands and replies to the interactor's inputs) and a simulated world within which the action takes place.

(10) While works like [Donna Leishman's] Deviant use perspective to create the impression of a three-dimensional space, the image itself does not incorporate the possibility of mobile interactivity along the Z-axis.

(10) One kind of strategy, evident in Ted Warnell's intricately programmed Javascript work TLT vs. LL, is to move from the word as the unit of signification to the letter.

(11) David Knoebel's exquisitely choreographed “Heart Pole,” from his collection “Click Poetry,” features a circular globe of words, with two rings spinning at 90 degrees from one another, “moment to moment” and “mind absorbing.”

(11) The next move is from imaging three dimensions interactively on the screen to immersion in actual three-dimensional spaces. As computers have moved off the desktop and into the environment, other varieties of electronic literature have emerged.

(12) The complements to site-specific mobile works, which foreground the user's ability to integrate real-world locations with virtual narratives, are site-specific installations in which the locale is stationary, such as a CAVE virtual reality projection room or gallery site.

(12) Pioneering the CAVE as a site for interactive literature is the creative writing program at Brown University spearheaded by Robert Coover, himself an internationally known writer of experimental literature.

(13) the work has redefined what it means to read, so that reading becomes, as Rita Raley has pointed out, a kinesthetic, haptic, and proprioceptively vivid experience, involving not just the cerebral activity of decoding but bodily interactions with the words as perceived objects moving in space.

(13) the “page” is transformed into a complex topology that rapidly transforms from a stable surface into a “playable” space in which she is an active participant.

(14) CAVE equipment, costing upward of a million dollars and depending on an array of powerful networked computers and other equipment, is typically found only in Research 1 universities and other elite research sites. .. Of the few institutions that have this high-tech resource, even fewer are willing to allocate precious time and computational resources to creative writers.

(15) Like the CAVE productions, interactive dramas are often site specific, performed for live audiences in gallery spaces in combination with present and/or remote actors.

(16) Interactive drama can also be performed online.

(16) How to maintain such conventional narrative devices as rising tension, conflict, and denouement in interactive forms where the user determines sequence continues to pose formidable problems for writers of electronic literature, especially narrative fiction.

(17) the constraints and possibilities of the medium have encouraged many writers to turn to nonnarrative forms or to experiment with forms in which narratives are combined with randomizing algorithms.

(18) The combination of English and Spanish vocabularies and the gorgeous images from Latin American locations [in Glazier's White-Faced Bromeliads on 20 Hectares] further suggest compelling connections between the spread of networked and programmable media and the transnational politics in which other languages contest and cooperate with English's hegemonic position in programming languages and, arguably, in digital art as well.

(18) Philippe Bootz has powerfully theorized generative texts, along with other varieties of electronic literature, in his functional model that makes clear distinctions between the writer's field, the text's field, and the reader's field, pointing out several important implications inherent in the separation between these fields, including the fact that electronic literature introduces temporal and logical divisions between the writer and reader different from those enforced by print.

(19) Naming such works [Noah Wardrip-Fruin's Regime Change and News Reader] “instruments” implies that one can learn to play them, gaining expertise as experience yields an intuitive understanding of how the algorithm works.

(20) As Andrews, Kearns, and Wardrip-Fruin acknowledge, these works are indebted to William Burroughs's notion of the “cut-up” and “fold-in.” They cite as theoretical precedent Burroughs's idea that randomization is a way to break the hold of the viral word and liberate resistances latent in language by freeing it from linear syntax and coherent narrative.

(20-21) “Code work,” a phrase associated with such writers as Alan Sondheim, MEZ (Mary Ann Breeze), and Talan Memmott and with critics such as Florian Cramer, Rita Raley, and Matthew Fuller, names a linguistic practice in which English (or some other natural language) is hybridized with programming expressions to create a creole evocative for human readers, especially those familiar with the denotations of programming languages. “Code work” in its purest form is machine readable and executable, such as Perl poems that literally have two addressees, human and intelligent machines. More typical are creoles using “broken code,” code that cannot actually be executed but that uses programming punctuations and expressions to evoke connotations appropriate to the linguistic signifiers.
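
As a concrete reference point, here is a minimal sketch of the executable end of that spectrum, written in Python rather than Perl (my own toy illustration, not an example Hayles cites): the lines run as a valid program while also reading, loosely, as English addressed to a human reader.

    # hypothetical code poem: valid Python that also scans as rough English
    from time import sleep

    memory = ["what", "the", "machine", "keeps"]
    for word in memory:
        print(word)   # the machine speaks each word it keeps
        sleep(1)      # and waits, as a reader would, between them

Garbling the keywords or punctuation would push the same lines toward the “broken code” creoles described above, still legible to humans but no longer executable.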

(21) The conjunction of language with code has stimulated experiments in the formation and collaboration of different kinds of languages.


Among those languages are programming languages and natural languages, breaking the exclusion Ong makes in Orality and Literacy on the grounds that programming languages could never be natural languages. The door opened by the ready supply of ideological constants, a term I used many years ago when I was groping at the vision that is now much clearer, leads to the idea of “code work” usable by both “human and intelligent machines.”


(22-23) The multimodality of digital art works challenges writers, users, and critics to bring together diverse expertise and interpretive traditions so that the aesthetic strategies and possibilities of electronic literature may be fully understood. .. when a work is reconceived to take advantage of the behavioral, visual, and/or sonic capabilities of the Web, the result is not just a Web “version” but an entirely different artistic production that should be evaluated in its own terms with a critical approach fully attentive to the specificity of the medium.

(23) the computational media intrinsic to electronic textuality have necessitated new kinds of critical practice, a shift from literacy to what Gregory L. Ulmer calls “electracy.”

(24-25) Exemplifying this kind of critical practice is Matthew Kirschenbaum's Mechanisms: New Media and Forensic Textuality. .. He parses the materiality of digital media as consisting of two interrelated and interacting aspects: forensic materiality and formal materiality. Whereas forensic materiality is grounded in the physical properties of the hardware - how the computer writes and reads bit patterns, which in turn correlate to voltage differences - formal materiality consists of the “procedural friction or perceived difference . . . as the user shifts from one set of software logics to another” (ms. 27). Using the important distinction that Espen J. Aarseth drew in Cybertext: Perspectives on Ergodic Literature between scriptons (“strings as they appear to readers”) and textons (“strings as they exist in the text”) (62), Kirschenbaum pioneers in Mechanisms a methodology that connects the deep print reading strategies already in effect with scriptons (letters on the page, in this instance) to the textons (here the code generating the screenic surface). He thus opens the way for a mode of criticism that recognizes the specificity of networked and programmable media without sacrificing the interpretive strategies evolved with and through print.


Come in via this guy by suggesting that subdivisions of forensic and formal materiality cross in the articulation of technological .


(25) In “Writing the Virtual: Eleven Dimensions of E-Poetry,” she focuses on the ways in which E-poetry achieves dynamism, leading her to coin the neologism “poietics” (from “poetry” and “poiisis,” the Greek (sic) work for “making”).


Here is a slip by Hayles and her proofreaders, friends, and editors.


(26) Any work that uses algorithmic randomizers to generate text relies to a greater or lesser extent on the surprising and occasionally witty juxtapositions created by these techniques. It should be noted that algorithmic procedures are not unique to networked and programmable media.

(27) The Demon, Stefans notes, is involved in a two-way collaboration: between the programmer who works with the limitations and possibilities of a computer language to create the program, and between the user and the computer when the computer poem is read and interpreted.

(27) The collaboration between the creative imagination of the (human) writer and the constraints and possibilities of software is the topic of Ian Bogost's Unit Operations: An Approach to Videogame Criticism, in which he develops an extended analogy between the unit operations of object-oriented programming and a literary approach that explores the open, flexible, and reconfigurable systems that emerge from the relations between units.

(28) As Bogost's approach suggests, taking programming languages and practices into account can open productive approaches to electronic literature, as well as other digital and nondigital forms. The influence of software is especially obvious in the genre of the Flash poem, characterized by sequential screens that typically progress with minimal or no user intervention.

(30) Hypertext fiction, network fiction, interactive fiction, locative narratives, installation pieces, “codework,” generative art, and the Flash poem are by no means an exhaustive inventory of the forms of electronic literature, but they are sufficient

ELECTRONIC LITERATURE IS NOT PRINT

(31) Early hypertext theorists, notably George Landow and Jay David Bolter, stressed the importance of the hyperlink as electronic literature's distinguishing feature, extrapolating from the reader's ability to choose which link to follow to make extravagant claims about hypertext as a liberatory mode that would dramatically transform reading and writing and, by implication, settings where these activities are important, such as the literature classroom.

(31-32) Compared to the flexibility offered by the codex, which allows the reader complete freedom to skip around, go backward as well as forward, and open the book wherever she pleases, the looping structures of electronic hypertexts and the resulting repetition forced on the reader/user make these works by comparison more rather than less coercive.


The limitations of current technologies, reflected in state-of-the-art designs, make electronic literature seem much more restrictive than the codex (book) form of literature, overshadowing the unique capability of electronic literature to reform itself dynamically in response to the reader. Whereas following hyperlinks may have a print correlate, this property is unique to the electronic medium.


(32) In conflating hypertext with the difficult and productive aporias of deconstructive analysis, these theorists failed to do justice to the nuanced operations of works performed in electronic media or to the complexities of deconstructive philosophy.


They are two different things, being dynamically reconfigurable and being deconstructive.


(33) Rather than circumscribe electronic literature within print assumptions, Aarseth swept the board clean by positing a new category of “ergodic literature,” texts in which “non-trivial effort is required to allow the reader to traverse the text” (1).

(33) Markku Eskelinen's work, particularly “Six Problems in Search of a Solution: The Challenge of Cybertext Theory and Ludology to Literary Theory,” further challenges traditional narratology as an adequate model for understanding ergodic textuality, making clear the need to develop frameworks that can adequately take into account the expanded opportunities for textual innovations in digital media.

(34) Similar ground clearing was undertaken by Lev Manovich in his influential The Language of New Media. .. Although it is too simplistic to posit these “layers” as distinct phenomena (because they are in constant interaction and recursive feedback with one another), the idea of transcoding nevertheless makes the crucial point that computation has become a powerful means by which preconscious assumptions move from such traditional cultural transmission vehicles as political rhetoric, religious and other rituals, gestures and postures, literary narratives, historical accounts, and other purveyors of ideology into the material operations of computational devices.

(35) Alexander Galloway in Protocol puts the case succinctly: “Code is the only language that is executable” (emphasis in original). Unlike a print book, electronic text literally cannot be accessed without running the code. Critics and scholars of digital art and literature should therefore properly consider the source code to be part of the work, a position underscored by authors who embed in the code information or interpretive comments crucial to understanding the work.


This means any study of electronic literature may, or ought to, include some analysis of the source code and enframing technologies. The implications of the availability of the source code are obvious here. Additionally, where is the boundary between “the” source code of the work and the surrounding operating environment? What is the status of database records and other ephemera produced by running the code?


(35) [Jerome McGann] turns this perspective on its head in Radiant Textuality: Literature after the World Wide Web by arguing that print texts also use markup language, for example, paragraphing, italics, indentation, line breaks, and so forth.

(36-37) Complementing studies focusing on the materiality of digital media are analyses that consider the embodied cultural, social, and ideological contexts in which computation takes place. .. Much as the novel both gave voice to and helped to create the liberal humanist subject in the seventeenth and eighteenth centuries, so contemporary electronic literature is both reflecting and enacting a new kind of subjectivity characterized by distributed cognition, networked agency that includes human and non-human actors, and fluid boundaries dispersed over actual and virtual locations.

(37-39) How and in what ways it should engage with these commercial interests is discussed in Alan Liu's magisterial work The Laws of Cool: Knowledge Work and the Culture of Information. .. Realizing this broader possibility requires that we understand electronic literature not only as an artistic practice (although it is that, of course), but also as a site for negotiations between diverse constituencies and different kinds of expertise.

Among these constituencies are theorists and researchers interested in the larger effects of network culture. .. Adrian Mackenzie's Cutting Code: Software as Sociality studies software as collaborative social practice and cultural process. .. electronic literature is evolving within complex social and economic networks that include the development of commercial software, the competing philosophies of open source freeware and shareware, the economics and geopolitical terrain of the internet and World Wide Web, and a host of other factors that directly influence how electronic literature is created and stored, sold or given away, preserved or allowed to decline into obsolescence.


This choice of wording suggests that a more sensitive study of free, open source cultural movements can expand the perspective taken by Mackenzie and/or Hayles.


PRESERVATION, ARCHIVING, AND DISSEMINATION

(39) whereas books printed on good quality paper can endure for centuries, electronic literature routinely becomes unplayable (and hence unreadable) after a decade or even less. The problem exists for both software and hardware.

(40) The Electronic Literature Organization has taken a proactive approach to this crucial problem with the Preservation, Archiving and Dissemination Initiative (PAD). Part of that initiative is realized in the Electronic Literature Collection, volume 1, co-edited by Nick Montfort, Scott Rettberg, Stephanie Strickland, and me, featuring sixty works of recent electronic literature and other scholarly resources.

(41) [Montfort and Wardrip-Fruin's] “Acid-Free Bits” offers advice to authors to help them “find ways to create long-lasting elit, ways that fit their practice and goals” (3). The recommendations include preferring open systems to closed systems, choosing community-directed systems over corporate driven systems, adhering to good programming practices by supplying comments and consolidating code, and preferring plain-text to binary formats and cross-platform options to single-system options.

(41) More encompassing, and even more visionary, is the proposal in “Born-Again Bits” for the “X-Literature Initiative.” The basic premise is that XML (Extensible Markup Language) will continue to be the most robust and widespread form of Web markup language into the foreseeable future.

(42) The X-Literature Initiative makes startlingly clear that the formation we know as “literature” is a complex web of activities that includes much more than conventional images of writing and reading. Also involved are technologies, cultural and economic mechanisms, habits and predispositions, networks of producers and consumers, professional societies and their funding possibilities, canons and anthologies designed to promote and facilitate teaching and learning activities, and a host of other factors.


CHAPTER TWO

Intermediation: From Page to Screen

DYNAMIC HETERARCHIES AND FLUID ANALOGIES

(44) I focus here on two central conceptual clusters to develop the idea of intermediation: dynamic heterarchies and fluid analogies as embodied in multiagent computer programs, and the interpretive processes that give meaning to information.

(45) One proposal is “intermediation,” a term I have adopted from Nicholas Gessler, whereby a first-level emergent pattern is captured in another medium and re-represented with the primitives of the new medium, which leads to an emergent result captured in turn by yet another medium, and so forth. The result is what researchers in artificial life call a “dynamic hierarchy,” a multi-tiered system in which feedback and feedforward loops tie the system together through continuing interactions circulating throughout the hierarchy.

(45) The potential of this idea to explain multilevel complexity is the subject of Harold Morowitz's The Emergence of Everything: How the World Became Complex.

(46) digital and analog processes together perform in more complex ways than the digital alone. .. They [analog processes] excel in transferring information from one medium to another through morphological resemblance, and the complexity of continuous variation allows them to encode information in more diverse ways than digital encodings.

(47) Now let us make a speculative leap and consider the human and the digital computer as partners in a dynamic heterarchy bound together by intermediating dynamics.

(47) citizens in technologically developed societies, and young people in particular, are literally being reengineered through their interactions with computational devices.

(48-49) In Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, Hofstadter details this research. His mantra, “Cognition is recognition,” nicely summarizes his conclusion that cognition is built on the ability to recognize patterns and extrapolate from them to analogies (pattern A is like pattern B).

(51) Because literature is not limited to factual recreation but rather works through metaphor, evocation, and analogy, it specializes in the qualities that programs like Jumbo and Copycat are designed to perform.

(51-52) the programs function as components in an adaptive system bound together with humans through intermediating dynamics, the results of which are emergent realizations. .. the literal/metaphoric binary becomes a spectrum along which a variety of programs can be placed, depending on their cognitive capacities and the ways in which the patterns they generate and/or recognize are structurally coupled with humans.

The idea of considering meaning making as a spectrum of possibilities with recursive loops entangling different positions along the spectrum has been catalyzed by Edward Fredkin's recent proposal that “the meaning of information is given by the process that interprets it” (my emphasis), for example, an MP3 player that interprets a digital file to produce audible sound. The elegance of the concept is that it applies equally well to human and nonhuman cognizers.

(53) For the MP3 player, “aboutness” has to do with the relation it constructs between the digital file and the production of sound waves. For the music sophisticate, “aboutness” may include a detailed knowledge of Beethoven's work, the context in which it was written and performed, historical changes in orchestral instrumentation, and so on.
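
A toy sketch of Fredkin's claim (my own illustration, not from Hayles): the same bytes take on a different “aboutness” depending on the process that interprets them.

    # the same four bytes, handed to two different interpretive processes
    data = bytes([72, 105, 33, 10])

    as_text = data.decode("ascii")            # one process reads them as characters
    as_number = int.from_bytes(data, "big")   # another reads the same bytes as a single integer
    print(repr(as_text), as_number)

Neither reading is the meaning of the bytes; each interpretive process constructs its own.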

(53) [Dennett's thought experiments demonstrate] that in a certain sense human intentionality too is an artifact that must ultimately have emerged from the subcognitive processes responsible for the evolution of humans as a species.

(54-55) Fredkin's concept also can potentially heal the breach between meaning and information that was inscribed into information theory when Claude Shannon defined information as a probability function. .. The divorce between information and meaning was necessary, in Shannon's view, because he saw no way to reliably quantify information as long as it remained context dependent, because its quantification would change every time it was introduced into a new context, a situation calculated to drive electrical engineers mad. Nevertheless, the probability functions in Shannon's formulations necessarily implied processes that were context dependent in a certain sense - specifically, the context of assessing them in relation to all possible messages that could be sent by those message elements. The difficulty was that there seemed to be no way to connect this relatively humble sense of context to the multilayered, multifaceted contexts ordinarily associated with high-level meanings (for example, interpretations of Beethoven's Fifth). Fredkin's formulation overcomes this difficulty by defining meaning through the processes that interpret information, all the way from binary code to high-level thinking.
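
For reference, a minimal sketch of Shannon's quantification (mine, not from the text): the measure depends only on the probabilities of the message elements, not on any higher-level context of interpretation.

    import math
    from collections import Counter

    def shannon_entropy(message):
        """Average information per symbol, in bits: H = -sum(p * log2(p))."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(shannon_entropy("AAAAAAAA"))   # 0.0 bits per symbol: completely predictable
    print(shannon_entropy("ABCDEFGH"))   # 3.0 bits per symbol: all symbols equally likely

The repetitive string scores lower because its elements are more predictable, regardless of what either string might mean to a reader.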

(55) Putting Shannon's mechanistic model together with MacKay's embodied model makes sense when we see higher-order meanings emerging from recursive lower level subcognitive processes, as MacKay emphasizes when he highlights “visceral responses and hormonal secretions and what have you.” Like humans, intelligent machines also have multiple layers of processes, from ones and zeros to sophisticated acts of reasoning and inference.

(56-57) In electronic literature, this dynamic is evoked when the text performs actions that bind together author and program, player and computer, into a complex system characterized by intermediating dynamics. The computer's performance builds high-level responses out of low-level processes that interpret binary code. These performances elicit emergent complexity in the player, whose cognitions likewise build up from low-level thoughts that possess much more powerful input to high-level thoughts than the computer does, but that nevertheless are bound together with the computer's subcognitive processes through intermediating dynamics. The cycle operates as well in the writing phase of electronic literature. When a programmer/writer creates an executable file, the process reengineers the writer's perceptual and cognitive systems as she works with the medium's possibilities. .. The result is a meta-analogy: as human cognition is to the creation and consumption of the work, so computer cognition is to its execution and performance.

(57-58) The book is like a computer program in that it is a technology designed to change the perceptual and cognitive states of a reader. .. “Recombinant flux,” as the aesthetic of such works [that is, electronic texts] is called, gives a much stronger impression of agency than does a book.

(58) this knowledge is carried forward into the new medium typically by trying to replicate the earlier medium's effects within the new medium's specificities.

(59) In My Mother Was a Computer: Digital Subjects and Literary Texts, I explored intermediation by taking three different analytical cuts, focusing on the dynamics between print and electronic textuality, code and language, and analog and digital processes.


FROM PAGE TO SCREEN: MICHAEL JOYCE'S AFTERNOON: A STORY AND TWELVE BLUE

(60) That evolution is richly evident in the contrast between Michael Joyce's seminal first-generation hypertext afternoon: a story and his later Web work Twelve Blue.

(62-63) The technique of conflicting plot lines is of course not original with Michael Joyce. .. Comparing the two works reveals how printcentric afternoon is, notwithstanding its implementation in an electronic medium. .. Although the reader can choose what lexias to follow, this interaction is so circumscribed that most readers will not have the sense of being able to play the work - hence my repeated use here of the term “reader” rather than “player.”

In Twelve Blue, by contrast, playing is one of the central metaphors. .. Compared with afternoon, Twelve Blue is a much more processual work. Its central inspiration is not the page but rather the flow of surfing the Web.

(63-64) Twelve Blue's epigraph, taken from William H. Gass's On Being Blue: A Philosophical Inquiry, signals that the strategy will be to follow trails of associations, as Gass says, “the way lint collects. The mind does that” (7). .. The second, less explicit, intertext is Vannevar Bush's seminal essay “As We May Think,” in which he argues that the mind thinks not in linear sequences but in associational links, a cognitive mode he sought to instantiate in his mechanical Memex, often regarded as a precursor to electronic hypertext. .. [Twelve Blue] instantiates associational thinking and evokes it for the player, who must in a certain sense yield to this cognitive mode to understand the work (to say nothing of enjoying it). .. Like sensual lovemaking, the richness of Twelve Blue takes time to develop and cannot be rushed.

(69) As Anthony Enns points out in his reading of Twelve Blue, this work challenges Frank Kermode's criterion for “the sense of an ending” that helps us make sense of the world by establishing a correlation between the finitude of human life and the progression through a beginning, middle, and end characteristic of many print narratives. .. I would argue rather that Twelve Blue makes a different kind of sense, one in which life and death exist on a continuum with flowing and indeterminate boundaries.

(70) Gregory Ulmer relates it to the shift from a novel-based aesthetic to a poetics akin to the lyric poem. He also relates it to a change from literacy to “electracy,” arguing that its logic has more in common with the ways in which image and text come together on the Web than to the linearity of alphabetic language bound in a print book. .. The leap from afternoon to Twelve Blue demonstrates the ways in which the experience of the Web, joining with the subcognitive ground of intelligent machines, provides the inspiration for the intermediating dynamics through which the literary work creates emergent complexity.


MARIA MENCIA: TRANSFORMING THE RELATION BETWEEN SOUND AND MARK

(71) In “Methodology,” Mencia comments that she is particularly interested in the “exploration of visuality, orality and the semantic/'non semantic' meaning of language.” On the strength of her graduate work in English philology, she is well positioned to explore what happens when the phone and phoneme are detached from their customary locations within morphemes and begin to circulate through digital media into other configurations, other ways of mobilizing conjunctions of marks and sounds. .. With traditional print literature, long habituation causes visuality (perception of the mark) to flow automatically into subvocalization (cognitive decoding) that in turn is converted by the “mind's eye” into the reader's impression that the words on the page give way to a scene she can watch as the characters speak, act, and interact.


An excellent articulation of the hegemonic computation process of reading to produce virtual realities.


(73) [In Birds Singing Other Birds' Songs] the human is in-mixed with nonhuman life forms to create hybrid entities that represent the conjunction of human and nonhuman ways of knowing.

(74) The analogy-between-analogies suggests that media transformations are like the dynamic interchanges between different kinds of cognizers, thus revealing a deep structure of intermediation that encompasses the history of media forms as well as the emergent complexities of interactions between humans, animals, and networked and programmable machines.


RUPTURING THE PAGE: THE JEW'S DAUGHTER

(74-76) The entire work exists as a single screen of text. .. When the player mouses over the blue letters, some part of the text, moving faster than the eye can catch, is replaced. Reading thus necessarily proceeds as rereading and remembering, for to locate the new portion of the page the reader must recall the screen's previous instantiation while scanning to identify the new portion, the injection of which creates a new context for the remaining text.

(78-79) The “stickiness” of phrases that can ambiguously attach to different sentences and phrases also enacts a difference between modernist stream of consciousness and the kind of awareness represented in The Jew's Daughter. .. narration is both belated and premature, early and late.

(79) Taken as a representation of consciousness, the kind of awareness performed here is not a continuous coherent stream but rather multilayered shifting strata dynamically in motion relative to one another.

This kind of interaction is very similar to the “Multiple Drafts Model” that Daniel C. Dennett, in Consciousness Explained, argues best explains the nature of consciousness. Dennett proposes that consciousness is not the manifestation of a single coherent self synthesizing different inputs (characterized as the “Cartesian Theater,” the stage on which representations are played out and viewed by a central self); rather, interacting brain processes, operating with varying temporal dynamics and different neural/perceptional inputs, are consciousness. .. To explain the subjective impression of possessing a central self, Dennett argues that the self is not synonymous with consciousness as such. Rather, the illusion of self is created through an internal monologue that does not so much issue from a central self as give the impression a central self exists.


This is a very rich but complicated reinterpretation of consciousness as an epiphenomenon in which there is no self, only an illusion of one.


(80) Seen in this perspective, The Jew's Daughter recapitulates the temporal and spatial discontinuities constitutive of consciousness through the (inter)mediation of computer software and hardware.

(81) Without knowing anything about The Jew's Daughter, Dennett sets up the comparison between human and machine cognition by likening the subcognitive agents from which consciousness emerges, and the even simpler processes that underlie them, to mechanical programs that could theoretically be duplicated in a computer.

(82-83) The Error Engine, a collaborative work co-authored by Judd Morrissey, Lori Talley, and computer scientist Lutz Hamel, carries the implications of The Jew's Daughter to another level by functioning as an adaptive narrative engine that initiates a coevolutionary dynamic between writer, machine, and player. .. In the next instantiation of the program, not yet implemented, the authors envision an algorithm whose selection criteria can itself evolve in relation to the player's choices. Such a program would deserve to be called a “genetic algorithm,” a complex adaptive system in which the user's choices and the algorithm responding to those choices coevolve together. .. In this sense intermediating dynamics, whereby recursive feedback loops operate through the differently embodied entities of the computer and human, become an explicit part of the work's design, performance, and interpretation. Adaptive coevolution implies that real biological changes take place in the player's neuronal structure that result in emergent complexity, expressed as a growing understanding of the work's dynamics, thematics, and functional capabilities; these in turn change and evolve in interaction with the player's choices.
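
Since the passage leans on the technical term, here is a minimal, generic sketch of a genetic algorithm in Python (my own illustration of the general technique, not the actual algorithm behind The Error Engine): candidate solutions are scored by a fitness function, the fitter ones are recombined and occasionally mutated, and the cycle repeats.

    import random

    def evolve(fitness, length=8, population_size=20, generations=50):
        """Evolve bit strings toward higher fitness by selection, crossover, and mutation."""
        population = [[random.randint(0, 1) for _ in range(length)]
                      for _ in range(population_size)]
        for _ in range(generations):
            # selection: keep the fitter half of the current population as parents
            population.sort(key=fitness, reverse=True)
            parents = population[:population_size // 2]
            # crossover and occasional mutation produce the next generation
            children = []
            while len(children) < population_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, length)
                child = a[:cut] + b[cut:]
                if random.random() < 0.1:
                    i = random.randrange(length)
                    child[i] = 1 - child[i]
                children.append(child)
            population = children
        return max(population, key=fitness)

    # toy fitness function: prefer bit strings containing more ones
    print(evolve(fitness=sum))

In the coevolutionary scenario the authors envision, the fitness function itself would also shift in response to the player's choices, closing the loop between human and machine.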

(83) Certainly print literature changes a reader's perceptions, but the loop is not closed because the words on the page do not literally change in response to the user's perceptions. .. To fully take this reflexivity into account requires understanding the computer's cascading interpretive processes and procedures, its possibilities, limitations, and functionalities as a subcognitive agent, as well as its operations within networked and programmable media considered as distributed cognitive systems. .. Whatever limitations intermediation as a theory may have, its virtue as a critical framework is that it introduces computation into the picture at a fundamental level, making it not an optional add-on but a foundational premise from which to launch further interrogation.


Gratuitous reference to Phaedrus regarding the closed-loop feedback difference between electronic and print literature. More important is the undeniable influence of computation on the critical framework.


(84) No less than print literature, literary criticism is affected because digital media are increasingly essential to it, limited not just to word processing but also to how critics now access legacy works through digital archives, electronic editions, hypermedia reinstantiations, and so forth.

(85) Contemporary literature, and even more so the literary that extends and enfolds it, is computational.


CHAPTER THREE

Contexts for Electronic Literature: The Body and the Machine

(87) The context of networked and programmable media from which electronic literature springs is part of a rapidly developing mediascape transforming how citizens of developed countries do business, conduct their social lives, communicate with each other, and perhaps most significantly, how they construct themselves as contemporary subjects. . . . The stakes are nothing less than whether the embodied human becomes the center for humanistic inquiry within which digital media can be understood, or whether media provide the context and ground for configuring and disciplining the body.

(88) I argue that both the body and machinic orientations work through strategic erasures. A fuller understanding of our contemporary situation requires the articulation of a third position focusing on the dynamics entwining body and machine together. .. Most importantly, it empowers electronic literature so that it not only reflects but reflects upon the media from which it springs.


THE EPOCH OF TECHNICAL MEDIA

(88-89) No theorist has done more to advance the idea of technical media as an autonomous force determining subjectivity than Friedrich A. Kittler. .. Influenced by Foucault rhetorically as well as methodologically, Kittler departs from him in focusing not on discourse networks understood as written documents, but rather on the modes of technology essential to their production, storage, and transmission.

(89) Literature acts on the body but only within the horizon of the medium's technical capabilities. Especially important in this regard, Kittler argues, was the development of the phonetic method of reading, introduced in Germany by Heinrich Stephani around 1800. The phonetic method transformed the mark into sound, erasing the materiality of the grapheme and substituting instead a subvocalized voice.

(90) [from Gramophone, Film, Typewriter] So-called Man is split up into physiology and information technology. (intro, 16)

(90) With the formation of a new kind of subject, the voice of Mother/Nature ceases to spring forth from the page in a kind of hallucination.

(91) In his excellent foreword to Discourse Networks 1800/1900, David E. Wellbery calls this the “presupposition of exteriority” (DN, xii). . . the crucial move of making social formations interior to media conditions is deeply flawed. . . . Although Kittler's presupposition is fruitful as a theoretical provocation, leading to the innovative analyses that make his work exciting, it cannot triumph as a theoretical imperative because it depends on a partial and incomplete account of how media technologies interact with social and cultural dynamics.

(91-92) One indication of this partiality is the inability of Kittlerian media theory to explain how media change comes about (as has often been noted, this is also a weakness of Foucault's theory of epistemes). In a perceptive article, Geoffrey Winthrop-Young argues that in Kittler's analyses, war performs as the driving force for media transformation.

(93) clearly the real problem is that media alone cannot possibly account for all the complex factors that go into creating national military conflicts. What is true for war is true for any dynamic evolution of complex social systems; media transformations alone are not sufficient.

(93) To be fair to the Kittlerian viewpoint, I have chosen a site where media conditions are unusually strong in determining the interactions that take place within it - namely, the elite world of global finance. The media conditions that prevail here are characteristic of the contemporary period, in that the differentiation between data streams marking early twentieth century media transformations have undergone integration. . . . Contemporary de-differentiation crucially depends on digital media's ability to represent all kinds of data – text, images, sound, video – with the binary symbolization of “one” and “zero.”


MEDIA CONDITIONS FOR GLOBAL FINANCE: WHY MEDIA THEORY IS NOT SUFFICIENT

(94) Among important recent work on global finance are the ethnographic studies of international currency traders by Karin Knorr Cetina and Urs Bruegger. . . . In brief, this is money at its most virtual, moving around the globe in nearly instantaneous electronic exchanges and reflecting rate fluctuations sensitively dependent on a wide variety of fast-changing economic, social, and political factors.

(94-95) Knorr Cetina and Bruegger propose the theoretical concept of global microsociality. . . . Global microsociality represents a new kind of phenomenon possible only with advanced communication technologies allowing for nearly instantaneous exchanges between geographically distant locations; compared to the telephone and teletype, the quantitative differences are so great as to amount to qualitative change. . . . Inflected by the dynamics of global economics, the traders nevertheless operate within microsocial dynamics – hence the necessity for global microsociality.

(95) Recapitulating within their sensoria the media differentiation into separate data flows, the traders develop a form of parallel processing through a division of sensory inputs, using phones to take orders from brokers through the audio channel and the screens to take in visual data and conduct trades electronically. The environment, however, is dominated by the screens.

(95-96) the effect that dominates is watching time unfold. .. As new events appear over the ever-transforming horizon, the traders use their knowledge of past configurations, present statistics, and anticipated tendencies to weave a fabric of temporality, which like the fabled magic carpet is perceived at once as a space one can occupy and as an event as ephemeral and ever-changing as the air currents on which the magic carpet rides.

(96-97) Temporality partakes of these characteristics because the screens function as temporalized “places” traders occupy; time in these circumstances becomes the spatialized parameter in which communities are built and carry out their business. . . . Time thus ceases to be constructed as a universal “now” conceived as a point source moving unambiguously forward along a line at a uniform pace. Driven by globalized business pressures, time leaves the line and smears into a plane.

(97) Greenwich time thus operates as the conventional one-way time that always moves in one direction, whereas local time becomes incorporated into a spatialized fabric that can be traversed in different directions as circumstances dictate.

(97-98) In this spatialized temporality, the traders occupy an ambiguous position. On the one hand, they are participants in the place of temporality they create by watching the screens, helping in significant ways to shape the market and related events as they continuously unfold and affect one another. . . . On the other hand, they are also observers outside the screens, watching the action as it unfolds. . . . The net result of these interactions is perceived by the traders as “the market.” . . . Note that although location enters into the trader's sense of the market, it is the temporal dimension - everything all the time - that constitutes the place of habitation the market creates and the traders occupy.

(98) This sense of the market as “everything” is reinforced by the traders' experience in being so intimately and tightly connected with the screens that they can sense the “mind” of the market. . . . This intuition is highly sensitive to temporal fluctuations and, when lost, can be regained only through months of immersion in current conditions. Attributing a “mind” to the market of course implies it is an entity possessing consciousness, desires, and intentions; more precisely, it is a megaentity whose existence is inherently emergent. Containing the traders' actions with everything else, it comes into existence as the dynamic realization of innumerable local interactions.


Instead of calling the market a mind, call it a megaentity. Does this then exclude interpreting it with respect to Gallagher's body image / body schema distinction?


(99) This is the context in which the screens become objects of intense attachment for the traders.

(99) So far the case study has functioned as an object lesson demonstrating Kittler's dictum, “Media determine our situation.” At this point, however, let us turn to consider how cultural dynamics interact with the media conditions to codetermine their specificities. . . . This gender predominance, far from accidental, is deeply imbricated into the ways in which the media dynamics play out.

(100-101) In this setting, gendered cultural practices proliferate more uniformly and extensively than normally would be the case. . . . Traders see themselves as engaged in combat, if not outright war, with rival banks and other traders. . . . Warfare here does not function to bring about media transformations, as it often does in Kittler's analyses; rather, warfare is encapsulated within the horizon codetermined by media conditions and cultural formations. It is appropriated in part because it expresses - indeed, explains and justifies - the intensified desires and fears aroused by the traders' situation.

(101) The media conditions alone, then, are underdetermining with respect to the culture that actually emerges. Other factors, particularly cultural models linked with masculine dominance, are necessary to explain how the media function to “determine the situation.”

(102) Media provide the simultaneity that spatializes time, creates global microsociality, catalyzes attachment to screens, and gives rise to emergent objects “that are not identical with themselves,” but the emotional tone, dominant metaphors, hypermasculinized dynamics, and capitalist economics codetermine how trading practices actually operate.


EMBODIMENT AND THE COEVOLUTION OF TECHNOLOGY

(102-103) No one has more forcefully argued for the importance of embodiment in relation to new media art than Mark B. N. Hansen. . . . Updating Bergson's idea in Matter and Memory that the body selects from the environment images on which to focus, Hansen contests Deleuze's reading of Bergson in Cinema 1 in order to reinstall affectivity at the center of the body-brain achievement of making sense of digital images.

(103) This is a major intervention that serves as an important counterweight to Kittler's perspective. Hansen posits that “only meaning can enframe information” (82), and in his view it is humans, not machines, who provide, transmit, and interpret meanings.

(103-104) Yet despite my overall sympathy, I cannot help noticing places where the argument, in its zeal to establish that embodiment trumps every possible machine capacity, circumscribes the very potential of the body to be transformed by its interaction with digital technologies for which Hansen otherwise argues.

(105) Vision, then, cannot in Hansen's account be allowed to be the dominant perceptual sense, or even on par with privileged faculties that (not coincidentally) are much more difficult to automate, particularly what he calls “affectivity,” the capacity of the sensorimotor body to “experience itself as 'more than itself' and thus to deploy its sensorimotor power to create the unpredictable, the experimental, the new” (7). To substantiate that the sensorimotor body has this capacity, he draws on the writing of Raymond Ruyer, a French theorist who during the 1950s proposed to combat the mechanist tendency of post-World War II cybernetics by positing bodily faculties that, he argued, are nonempirical and nonobservable. Chief among these is a “transpatial domain of human themes and values” (80) . . . If we were to call the “transpatial domain” by the more traditional name “soul,” its problematic nature would quickly become evident.


Clearly a tie into Gallagher's work since she mentions Francisco Varela. Can't find the reference to Damasio I remember reading.


(106) Although it is undoubtedly true, as Hansen argues citing Brian Massumi (109), that proprioception, kinesthetic, and haptic capacities are involved with vision, this does not mean that they replace vision or even that they become dominant over vision in the VR interface. Indeed, it is precisely because vision plays such an important role in VR that VR sickness arises.

(108) Falling back on such mystified terms as “transpatial domain” and “absolute survey” fails to do justice to the extensive research now available on how synesthesia actually works.

(108) there is an unavoidable tension between Hansen's insight that technology and the body coevolve together and his ideological commitment to the priority of embodiment over technology.

(109) As Hansen uses the term, however, reality becomes “mixed” when the perceptual input for humans comes not from their unaided bodies operating alone in the environment, but rather from their embodied interactions with technologies.

(109) In this account, then, the coevolution of the body and technology is given a teleological trajectory, a mission as it were: its purpose is to show the “constituting or ontological dimension of embodiment.” Largely erased are material specificities and capacities of technical objects as artifacts. It is as though the feedback loop between technical object and embodied human enactor has been cut off halfway through: potentiality flows from the object into the deep inner senses of the embodied human, but its flow back into the object has been short-circuited, leading to an impoverished account of the object's agential capacities to act outside the human's mobilization of its stimuli.

(110) This encapsulation is problematic for several reasons. It ignores the increasing use of technical devices that do not end in human interfaces but are coupled with other technical devices that register input, interpret results, and take action without human intervention.


Thus the machine dimension on my timeline.


(111) Such an account is helpless to explain how technology evolves within the horizon of its own limitations and possibilities.

(111-112) As if assuming a mirror position to Kittlerian media theory, which cannot explain why media change except by referring to war, Hansen cannot explain why media develop except by referring to embodied capacities. . . technologies are embodied because they have their own material specificities as central to understanding how they work as human physiology, psychology, and cognition are to understanding how (human) bodies work.

(112-113) Had early tools developed along different technological lineages, early hominid evolution also might have developed along quite different biological lines. . . . Instead of subordinating the body to technology or technology to the body, however, surely the better course is to focus on their interactions and coevolutionary dynamics.

(113) Central to these dynamics, especially in the context of media theory and electronic literature, are neural plasticity and language ability. . . . Evidence indicates that compound tools were contemporaneous with the accelerated development of Broca's area in the frontal cortex, a part of the brain involved in language use.

(114) Although synaptogenesis is greatest in infancy, plasticity continues throughout childhood and adolescence, with some degree continuing even into adulthood. In contemporary developed societies, this plasticity implies that the brain's synaptic connections are coevolving with environments in which media consumption is a dominant factor. .. Children growing up in media-rich environments literally have brains wired differently than humans who did not come to maturity in such conditions.

(115) [James Mark] Baldwin argued for what he called “organic selection.” In the same way that the brain overproduces neuronal connections that are then pruned in relation to environmental input, so Baldwin thought that “organic selection” proceeded through an overproduction of exploratory behaviors, which are then pruned through experience to those most conducive to the organism's survival. This results in a collaboration between phylogenetic selection (that is, selection that occurs through genetic transmission) and ontogenic mechanisms of adaptation (which occur in individuals through learning).

(115) what begins as ontogenetic adaptation through learning feeds back into selective pressures to affect physical biology.
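
A toy sketch of my own, not from Hayles or Baldwin, of the overproduce-then-prune logic: a pool of randomly generated behaviors is scored against an environmental target and cut down to the few that work, standing in for ontogenic adaptation through learning. The constants and scoring rule are arbitrary placeholders.

    import random

    # Toy illustration only: overproduce candidate behaviors, then prune to those
    # that score well against an "environment," standing in for learning.
    # A fuller model would feed the kept behaviors back into the next generation's
    # starting distribution - the learning-to-selection feedback Baldwin describes.
    random.seed(1)
    ENVIRONMENT = 0.8                                  # arbitrary target a useful behavior must approximate

    behaviors = [random.random() for _ in range(100)]  # overproduction of exploratory behaviors
    pruned = sorted(behaviors, key=lambda b: abs(b - ENVIRONMENT))[:5]  # pruning through "experience"

    print("kept behaviors:", [round(b, 2) for b in pruned])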


What does Gallagher think about synaptogenesis and this explanatory combination of phylogenetic selection (genetics) and ontogenic mechanisms of adaptation (learning)?


(116) Ambrose's scenario linking compound tools with the emergence of language illustrates how technology enters into the psychophysical feedback cycle by changing the ways in which learning occurs and the kinds of learning that are most adaptive. . . . If data differentiation at the beginning of the twentieth century broke the ancient monopoly of writing, the computer at the beginning of the twenty-first century breaks the monopoly of vision associated with reading. Interactive text, reminiscent in some ways of the digital art discussed by Hansen, stimulates sensorimotor functions not mobilized in conventional print reading, including fine movements involved in controlling the mouse, keyboard, and/or joystick, haptic feedback through the hands and fingers, and complex eye-hand coordination in real-time dynamic environments. Moreover, this multisensory stimulation occurs simultaneously with reading, a configuration unknown in the Age of Print. Brain imaging studies show that everyday tool use entails complex feedback loops between cognitive and sensorimotor systems. For humans who habitually interact with computers, especially at young ages, such experiences can potentially affect the neurological structure of the brain.

(117) Steven Johnson, in Everything Bad Is Good For You: How Popular Culture Is Actually Making Us Smarter, cites the studies of James R. Flynn indicating that IQs rose significantly from 1932 to 1978, the so-called Flynn effect that Johnson correlates with increased media consumption. Anecdotal evidence as well as brain imaging studies indicate that “Generation M” (as the Kaiser Family Foundation dubbed the 8- to 18-year-old cohort) is undergoing a significant cognitive shift, characterized by a craving for continuously varying stimuli, a low threshold for boredom, the ability to process multiple information streams simultaneously, and a quick intuitive grasp of algorithmic procedures that underlie and generate surface complexity. The cognitive mode, which I have elsewhere called “hyper attention,” is distinctively different from that traditionally associated with the humanities, which by contrast can be called “deep attention.” Deep attention is characterized by a willingness to spend long hours with a single artifact (for instance, a seven-hundred-page Victorian novel), intense concentration that tends to shut out external stimuli, a preference for a single data stream rather than multiple inputs, and the subvocalization that typically activates and enlivens the reading of print literature.


Deep attention versus hyper attention as examples of ontogenic mechanisms of adaptation. Ironically, hyper attention is what makes me turn away from The Jew's Daughter. Heim discusses this shift, too.


(118) The effects of hyper attention are already being reflected in literary works, for example in John Cayley's Translation and Imposition, discussed in Chapter 5, where text is accompanied by glyphs visually indicating the algorithm's operation.

(118) As media change, so do bodies and brains; new media conditions foster new kinds of ontogenic adaptations and with them, new possibilities for literary engagements. This is the context in which we should evaluate and analyze the possibilities opened by electronic literature.

(119) It is precisely when these multilayered, multiply sited processes within humans and machines interact through intermediating dynamics that the rich effects of electronic literature are created, performed, and experienced.


THE BODY AND MACHINE IN ELECTRONIC LITERATURE

(120) The notorious “nervousness” of this work [Lexia to Perplexia], whereby a tiny twitch of the cursor can cause events to happen that the user did not intend and cannot completely control, conveys through its opaque functionality intuitions about dispersed subjectivities and screens with agential powers similar to those we saw with international currency traders.

(122) In Memmott's rewriting of the myth in the context of information technologies, the “I-terminal,” a neologism signifying the merging of human and machine, looks at the screen and desires to interact with the image, caught like Narcissus in a reflexive loop that cycles across the screen boundary between self/other.

(122) The feedback cycle suggested here between self and other, body and machine, serves as a metaphor for the coconstruction of embodiment and media technologies.

(123) The passage cited above continues with “broken” code, that is, code that is a creolization of English with computer code, evocative of natural language connotations but not actually executable. . . . In particular, the play between human language and code points to the role of the intelligent machine in contemporary constructions of subjectivity, gesturing toward what Scott Bukatman has called “terminal identity,” or in Memmott's lexicon, the “I-terminal.”


A good place to distinguish broken code from pseudocode.
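
A quick contrast of my own to mark that distinction; the two samples below are invented, not quoted from Lexia to Perplexia. Broken code borrows code syntax for expressive effect and cannot execute, while pseudocode plainly describes a procedure that could be implemented in any language.

    # Invented samples, for contrast only (not quoted from Memmott):
    # "broken code" is a creole of English and code syntax, expressive but non-executable;
    # pseudocode describes a runnable procedure without committing to any language.

    broken_code = "I-terminal :: echo.self(Other) <=> sign.sever[ed]"
    pseudocode = "for each sign on the screen, reflect it back across the self/other boundary"

    for label, sample in (("broken code", broken_code), ("pseudocode", pseudocode)):
        print(f"{label}: {sample}")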


(123-124) Engaging the hyper-attentive characteristics of multiple information streams and rapid transformations (images, words, graphics, lightning-quick morphing of screens, mouseovers, and so on), the work reflects upon its own hyper-attentive aesthetics in the final section, where the prefix “hyper” replicates itself with every imaginable stem. At the same time, the work obviously requires deep-attention skills to grasp the complex interactions between verbal play, layered screen design, twitchy navigation, and JavaScript coding. . . . In terms of the complex dynamics between body and machine, we might say that the gamer and the textual critic have had their neural plasticities shaped in different but overlapping ways.

(124) While Lexia to Perplexia is primarily concerned with the transformative effect of information technologies on contemporary subjectivity, Young-Hae Chang Heavy Industries engages the global microsociality and spatialization of temporality characteristic of information-intensive settings such as international currency trading discussed above. . . . Programmed in Flash, their works use timed animation to display sequential blocks of text, with the movement from one screen of text to the next synchronized with an accompanying sound track, typically jazz.

(125) The impression is not that the eye moves but rather that the text moves while the eye remains (more or less) stationary. Agency is thus distributed differently than with the print page where the reader controls the pace of reading and rate at which pages turn.

(125) [Bob] Brown devised a machine that he called the “Readie,” [in the 1920s] which was intended to display text much as it appears in YHCHI's compositions.

(126-127) In Nippon, global microsociality is emphasized by an intimate address that appears on a screen split between Japanese ideograms above and English words beneath. .. The subvocalization that activates the connection between sound and mark in literary reading here is complicated by the text's movement and its interpenetration by sound, becoming a more complex and multimodal production in which embodied response, machine pacing, and transnational semiotics, along with the associated spatialization of temporality, all contribute to construct the relation between text, body, and machine.

(128) The resulting tension mandates that the user intent on comprehending the work will necessarily be forced to play it many times, unable to escape hyper attention by stopping the text-in-motion or deep attention by lapsing into interactive game play.

(130) Hence the rhetoric of imperatives employed by Kittler (“must not think,” “forbids the leap,” and so on) finds its mirror opposite in Hansen's rhetoric of encapsulation (“subordination of technics,” “from within the operational perspective of the organism,” and the like). In contrast, the model herein proposed entangles body and machine in open-ended recursivity with one another. This framework mobilizes the effect recursivity always has of unsettling foundations while simultaneously catalyzing transformations as each partner in the loop initiates and reacts to changes in the other. In this model neither technological innovation nor embodied plasticity is foreclosed. The future, unpredictable as ever, remains open.


CHAPTER FOUR

Revealing and Transforming: How Electronic Literature Revalues Computational Practice

(131) The chapter elucidates further a framework in which digital literature can be understood as creating recursive feedback loops among embodied practice, tacit knowledge and explicit articulation.

(132) in developed societies, almost all communication, except face-to-face talk, is mediated through some kind of digital code.

(132) As the French sociologist Pierre Bourdieu has shown, robust and durable knowledge can be transmitted through social practices and enactments without being consciously articulated.

(134) In this context, literature can be understood as a semiotic technology designed to create - or more precisely, activate - feedback loops that dynamically and recursively unite feelings with ratiocination, body with mind.


But how do these feedback loops operate in nonhuman systems? Hayles does not spend much time at all discussing the source code of any of the examples of EL in this book. Her musings on what feelings/body and ratiocination/mind might be in nonhuman systems trace the same boundary of fantasy as her postulate that nonhuman intelligences exist in Memmott's many representations of them.


(135) while the technological nonconscious has been a factor in constituting humans for millennia, the new cognitive capabilities and agencies of intelligent machines give it greater impact and intensity than ever before.

(136) The first proposition asserts that verbal narratives are simultaneously conveyed and disrupted by code, and the second argues that distributed cognition implies distributed agency.

(136) Stuart Moulthrop, writing on “404” errors, notes that such episodes are not simply irritations but rather flashes of revelation, potentially illuminating something crucial about our contemporary situation.


Deleuze and Guattari make the same point about breakdowns in Anti-Oedipus; the torn sock.


(137) In Natural Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence, Andy Clark argues that we are “human-technology symbionts,” constantly inventing ways to offload cognition into environmental affordances so that “the mind is just less and less in the head.” Edwin Hutchins makes similar points in Cognition in the Wild.

(138-139) Electronic literature can tap into highly charged differentials that are unusually heterogeneous, due in part to uneven developments of computational media and in part to unevenly distributed experiences among users. .. These differences in background correlated with different kinds of intuitions, different habits, and different cognitive styles and conscious thoughts. .. Only because we do not know what we already know, and do not yet feel what we know, are there such potent possibilities for intermediations in the contemporary moment.


This may be an antidote to the suggestion that all narrative themes except science fiction have been exhausted: escape print into electronic media forms.


RECURSIVE INTERACTIONS BETWEEN PRACTICE AND ARTICULATION

(139) William Poundstone's Project for Tachistoscope is modeled after a technology developed for experimental psychology exploring the effects of subliminal images.

(140) In historical context, then, the tachistoscope was associated with the nefarious uses to which subliminal perception could be put by Communists who hated capitalists and capitalists who egged on the persecution of “Reds” and “Commies.”

(140-141) As the kinds and amounts of sensory inputs proliferate, the effect for verbally oriented users is to induce anxiety about being able to follow the narrative while also straining to put together all the other discordant signifiers.

(142-143) Programmed by a human in the high-level languages used in Flash (C++/Java), the multi-modalities are possible because all the files are ultimately represented in the same binary code. The work thus enacts the borderland in which machine and human cognition cooperate to evoke the meanings that the user imparts to the narrative, but these meanings themselves demonstrate that human consciousness is not the only actor in the process. Also involved are the actions of intelligent machines. In this sense the abyss may be taken to signify not only those modes of human cognition below consciousness, but also the machinic operations that take place below the levels accessible to the user and even to the programmer.


Incomprehensible temporal orders.


(143) Without subvocalization, which connects the activity of the throat and vocal cords with the auditory center in the brain, literary language fails to achieve the richness it otherwise would have. [Garrett] Stewart's argument implies that embodied responses operating below the level of conscious thought are essential to the full comprehension of literary language, a proposition enthusiastically endorsed by many poets.

(145) Automating the homophonic variants that are the stock in trade of literary language, [Millie Niss's] Sundays in the Park brings to conscious attention the link between vocalization and linguistic richness.

(145-146) Cayley has been exploring what he calls “transliteral morphing,” a computational procedure that algorithmically morphs, letter by letter, from a source text to a target text. .. Cayley conjectures that underlying these “higher-level” relationships are lower-level similarities that work not on the level of words, phrases, and sentences but individual phonemes and morphemes. .. Just as Mencia invokes the philological history of language as it moves from orality to writing to digital representation, so Cayley's transliteral morphs are underlain by an algorithm that reflects their phonemic and morphemic relations to one another.

(146-147) The complexity of these relationships as they evolve in time and as they are mediated by the computer code, Cayley conjectures, are the microstructures that underlie, support, and illuminate the high-level conceptual and linguistic similarities between related texts.

(147) Cayley suggests that if a user watches these long enough while also taking in the transliteral morphs, she will gain an intuitive understanding of the algorithm, much as a computer game player intuitively learns to recognize how the game algorithm is structured. The music helps in this process by providing another sensory input through which the algorithm can be grasped.
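
A minimal sketch of letter-by-letter morphing, assuming a crude alphabetic-distance rule in place of Cayley's actual phonemic and graphic similarity orderings; the function name and word pair are my own placeholders, and source and target are taken to be the same length.

    import random

    def morph_step(current: str, target: str) -> str:
        # One transliteral step: pick a mismatched position and nudge its letter
        # one alphabetic step toward the target letter. (Cayley orders letters by
        # phonemic/graphic similarity; plain alphabetic distance is a stand-in.)
        mismatches = [i for i, (c, t) in enumerate(zip(current, target)) if c != t]
        if not mismatches:
            return current
        i = random.choice(mismatches)
        c, t = current[i], target[i]
        step = 1 if ord(t) > ord(c) else -1
        return current[:i] + chr(ord(c) + step) + current[i + 1:]

    text, target = "morphing", "language"
    while text != target:
        text = morph_step(text, target)
        print(text)   # each intermediate string is one frame of the morph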

While all this is happening, through embodied and tacit knowledge, the conscious mind grapples with the significance of the transliterating text [Walter Benjamin “On Language as Such and on the Language of Man”].

(149) For Benjamin the transcendent language associated with God ensures translatability of texts, while for Cayley the atomistic structures of computer and human languages are the correlated microlevels that ensure translatability.

(149-150) The “objectivity” of this translation is guaranteed not by God but by the entwining of human and computer cognitions in our contemporary mediascapes.

(151) At the performance, laptops throughout the space began playing versions while Cayley projected the full implementation on the front screen. .. The stunning effect was to create a multimodal collaborative narrative distributed on laptops, throughout the performance space, in which different sensory modalities and different ways of knowing entwined together with machine cognition and agency.

In other recent work, Cayley has focused on the ways in which our intuitive knowledge of letter forms can define space and inflect time.

(152) The temporal interactions, as well as the virtual/actual spatiality of the textual surfaces, create an enriched sense of embodied play that complicates and extends the phenomenology of reading.

(153) Cayley further explores the phenomenology of reading in Lens, designed first as a CAVE installation and then transferred to a QuickTime maquette.

(154-155) For text, the ability to function simultaneously as a window into the computer's performance and as a writing surface to be decoded puts into dynamic interplay two very different models of cognition. .. Mediating between the brute logic of these machinic operations and human intentions is the program that, when run, creates a performance partaking both of the programmer's intentions and the computer's underlying architecture as symbolic processor. In electronic literature, authorial design, the actions of an intelligent machine, and the user's receptivity are joined in a recursive cycle that enacts in microcosm our contemporary situation of living and acting within intelligent environments.


REVALUING COMPUTATIONAL PRACTICE

(156) [Brian Kim] Stefans sees in this confrontation the possibility that the boundaries of the conscious self might be breached long enough to allow other kinds of cognitions, human and nonhuman, to communicate and interact. “The space between these poles - noise and convention - is what I call the 'attractor,' the space of dissimulation, where the ambiguity of the cyborg is mistaken as the vagary of an imprecise, but poetic, subjectivity” (151).

(157) Joining technical practice with artistic creation, computation is revalued into a performance that addresses us with the full complexity our human natures require, including the rationality of the conscious mind, the embodied response that joins cognition and emotions, and the technological nonconscious that operates through sedimented routines of habitual actions, gestures, and postures. Thus understood, computation ceases to be a technical practice best left to software engineers and computer scientists and instead becomes a partner in the coevolving dynamics through which artists and programmers, users and players, continue to explore and experience the intermediating dynamics that let us understand who we have been, who we are, and who we might become.


CHAPTER FIVE

The Future of Literature: Print Novels and the Mark of the Digital

(159) So essential is digitality to contemporary processes of composition, storage, and production that print should properly be considered a particular form of output for digital files rather than a medium separate from digital instantiation.

(160) This engagement is enacted in multiple senses: technologically in the production of textual surfaces, phenomenologically in new kinds of reading experiences possible in digital environments, conceptually in the strategies employed by print and electronic literature as they interact with each other's affordances and traditions, and thematically in the represented worlds that experimental literature in print and digital media perform.

(161) How does the mark of the digital relate to the subjectivities performed and evoked by today's experimental print novels? .. Rather than asking if there is evidence that the “literary” novel may in fact be losing audience share to other entertainment forms, however, [Kathleen] Fitzpatrick asks what cultural and social functions are served by pronouncements about the death of the print novel.

(162) imitating electronic textuality through comparable devices in print, many of which depend on digitality to be cost effective or even possible; and intensifying the specific traditions of print, in effect declaring allegiance to print regardless of the availability of other media.

(163-164) Computer-mediated text is layered. .. the layered nature of code also inevitably introduces issues of access and expertise.

(164) Computer-mediated text tends to be multimodal.

(164) In computer-mediated text, storage is separate from performance. .. code can never be seen or accessed by a user while it is running.

(164) Computer-mediated text manifests fractured temporality.


An excellent example is the 'futz time' required to adequately 'see' ('run') Lexia to Perplexia.


DIGITALITY AND THE PRINT NOVEL

(166) Why write it [numerical code in Jonathan Safran Foer's Extremely Loud and Incredibly Close] in code? Many reviewers have complained (not without reason) about the gimmicky nature of this text, but in this instance the gimmick can be justified. It implies that language has broken down under the weight of trauma and become inaccessible not only to Thomas but the reader as well.

(169) The text, moving from imitation of a noisy machine to an intensification of ink marks durably impressed on paper, uses this print-specific characteristic as a visible indication of the trauma associated with the scene, as if the marks as well as the language were breaking down under the weight of the characters' emotions. At the same time, the overlapping lines are an effect difficult to achieve with letter press printing or a typewriter but a snap with Photoshop, so digital technology leaves its mark on these pages as well.

(170) The novel remediates the backward-running video in fifteen pages that function as a flipbook, showing the fantasized progression Oskar has imagined (327-41).

(172) Further complicating the ontology implicit in the book's materiality is the partitioning of some chapters into parallel columns, typically with three characters' stories running in parallel on a page spread, as if imitating the computer's ability to run several programs simultaneously.

(173) Within the narrative world, however, this apparent imitation of computer code's hierarchical structure is interpreted as the baby's ability to hide his thoughts from the reader as well as from Saturn, an interpretation that locates the maneuver within the print novel's tradition of metafiction by playing with the ontological levels of author, character, and reader.


What ontological levels are available for metafictional play in the genres of electronic literature Hayles has introduced? Relate to Foucault's meditation “What Is an Author?”


(175) In a now-familiar pattern, a technique that at first appears to be imitating electronic text is transformed into a print-specific characteristic, for it would, of course, be impossible to eradicate a word from an electronic text by cutting a hole in the screen.

(175) In House of Leaves, the recursive dynamic between strategies that imitate electronic text and those that intensify the specificities of print reaches an apotheosis, producing complexities so entangled with digital technologies that it is difficult to say which medium is more important in producing the novel's effects.

(177) As if positioning itself as a rival to the computer's ability to represent within itself other media, this print novel remediates an astonishing variety of media, including film, video, photography, telegraphy, painting, collage, and graphics, among others.

(178) Digital technology functions here like the Derridean supplement; alleged to be outside and extraneous to the text proper, it is somehow also necessary.

(180) Although it is true that digital technologies can create objects for which there is no original (think of Shrek, for instance), the technology itself is perfectly representable, from the alternating voltages that form the basis for the binary digits up to high-level languages such as C++. The ways in which the technology actually performs play no part in Hansen's analysis. For him the point is that the house renders experience singular and unrepeatable, thus demolishing the promise of orthographic recording to repeat the past exactly. Because in his analogy the house equals the digital, this same property is then transferred to digital technology.


Enter the concept of epistemological transparency.


(180-181) More important, in my view, is an aspect of digital technology that Hansen's elision of its materiality ignores: its ability to exercise agency. .. the layered architectures of computer technologies enable active interventions that perform actions beyond what their human programmers envisioned.

(182) Increasingly human attention occupies only the tiny top of a huge pyramid of machine-to-machine communication. .. We would perhaps like to think that actions require humans to initiate them, but human agency is increasingly dependent on intelligent machines to carry out intentions and, more alarmingly, to provide the data on which the human decisions are made in the first place.


Old thoughts: bad bots; participation in the process of preference formation.


(183) Although humans originally created the computer code, the complexity of many contemporary programs is such that no single person understands them in their entirety. In this sense our understanding of how computers can get from simple binary code to sophisticated acts of cognition is approaching the yawning gap between our understanding of the mechanics of human consciousness. .. As Brian Cantwell Smith observes, the emergence of complexity within computers may provide crucial clues to “how a structured lump of clay can sit up and think.”


Really we have long been in the position that no single person can comprehend not just many but most programs and communications systems. The tie back to Socrates' question is another old thought (von Neumann).


(185) Like the nothingness infecting the text's signifiers, a similar nothingness would confront us if we could take an impossible journey and zoom into a computer's interior while it is running code. We would find that there is no there there, only alternating voltages that nevertheless produce meaning through a layered architecture correlating ones and zeros with human language.
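
A minimal sketch of my own, not Hayles', of the layered correlation she describes: one phrase walked down from characters to code points to bytes to ones and zeros, and back up again intact.

    phrase = "there is no there there"
    code_points = [ord(ch) for ch in phrase]           # characters as Unicode numbers
    raw_bytes = phrase.encode("utf-8")                 # numbers as bytes
    bits = "".join(f"{b:08b}" for b in raw_bytes)      # bytes as ones and zeros

    print(code_points)
    print(bits)

    # Climbing back up the layers recovers the phrase intact.
    recovered = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8")
    assert recovered == phrase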


This statement may be rich, or a slip. Ties back to Dennett's theory of consciousness.


(186) Overwhelmed by the cacophony of competing and cooperating voices, the authority of voice is deconstructed and the interiority it authorized is subverted into echoes testifying to the absences at the center.



Hayles, N. Katherine. (2008). Electronic Literature: New Horizons for the Literary. Notre Dame, IN: University of Notre Dame Press.