Wednesday, December 10, 2008

The Poller Project

The Poller Project http://sourceforge.net/projects/poller/

This project is intended to be a single PHP script capable of administering a simple, browser-based, multiple-question, multiple-choice poll for small audiences, such as during a presentation or lecture, served directly from the presenter's computer. It is driven by MySQL database tables that contain the poll questions and into which the responses are recorded. Two browser windows are used on the controlling computer: one set to “display” on the projector screen, which refreshes once every second to control the poll (if it is set in automatic mode), and one set to “console,” which the presenter or an associate uses to start the poll and advance the questions if it is not in automatic mode.

The presenter establishes a local, wireless network and seeds the PollInfo table with the IP address of the controlling computer. Members of the audience connect to that IP address using their laptops and mobile browsing devices. When a “question” is asked, they respond by selecting one of the responses, which are displayed on their screens as well as on the main projection screen.
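The request handling is simple enough to sketch. The fragment below is only a rough illustration of the idea, not the actual poller.php shipped in the release; the table and column names (responses, question_id, choice) and the database credentials are placeholders of my own.

<?php
// Rough sketch only: how a response submitted by an audience member's
// browser might be recorded and tallied. The schema shown here
// (a "responses" table with question_id and choice columns) is a
// placeholder, not the actual tables shipped with release 0.3.0.

$db = new mysqli('localhost', 'poller', 'secret', 'poller');
if ($db->connect_error) {
    die('Cannot connect to the poll database: ' . $db->connect_error);
}

// e.g. an audience browser requests poller.php?question=15&choice=E
$question = isset($_GET['question']) ? (int) $_GET['question'] : 0;
$choice   = isset($_GET['choice']) ? $_GET['choice'] : '';

// Record the anonymous response.
$stmt = $db->prepare('INSERT INTO responses (question_id, choice) VALUES (?, ?)');
$stmt->bind_param('is', $question, $choice);
$stmt->execute();

// Tally the results so far, in the spirit of what poller.cgi reports.
$stmt = $db->prepare('SELECT choice, COUNT(*) AS total FROM responses
                      WHERE question_id = ? GROUP BY choice');
$stmt->bind_param('i', $question);
$stmt->execute();
$result = $stmt->get_result();
while ($row = $result->fetch_assoc()) {
    printf("%s) %d responses\n", $row['choice'], $row['total']);
}
?>

In automatic mode, the “display” window simply reloads a page like this once a second to pick up the new totals.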

The current version (0.3.0) allows different lexia to be displayed depending on the statistical mode of the responses to the previous question, letting the presenter adjust the flow of the presentation based on anonymous feedback from the audience. Since the software is free, open source software distributed under the GPL, it can be modified and extended to perform other functions. Here are some facts about the sample data included in this package: the name of the poll is FOSS; the controlling webserver IP address is 192.168.0.10; from the first question to completion, the poll runs for approximately five minutes (300 seconds); there are n discrete questions, a few of which accept responses from the network interactors via HTTP to the PHP program poller.php (only two did, for a total of approximately ten responses out of approximately twelve humans); and four different Ogg Vorbis audio files are played during the course of the poll.

After the poll is complete, running the old CGI script (poller.cgi) yields the results:

Results for Question 13: These FOUR FREEDOMS are guaranteed by

A) Creative Commons 0 responses

B) DMCA 0 responses

C) GPLv2 1 responses

D) Linux 0 responses

E) Microsoft Shared Source 0 responses

Results for Question 15: So what is Linux?

A) Finnish detergent 0 responses

B) Monolithic kernel 0 responses

C) Shareware operating system 0 responses

D) Unix-like suite of system programs 0 responses

E) Free, open source operating system 3 responses

Results for Question 43: The Storyspace software Michael Joyce used to create Twelve Blue is

A) Discontinued 0 responses

B) FOSS 1 responses

C) $295 0 responses

D) Shareware 0 responses

Results for Question 74: Are these oven controls designed well?

A) YES 1 responses

B) NO 0 responses

C) Did not read Norman 0 responses


When I used the “What is Linux” question in a presentation with about 50 attendees of the 2007 Computing and Philosophy conference, the responses were mixed. The participants in the Introduction to Texts and Technology seminar all chose “Free, open source operating system,” so it was appropriate to present the GNU material. A future version of the program could skip or condense the GNU segment if the statistical mode of the responses to the question had been the correct response.
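To make that concrete, here is a hedged sketch, not the released code, of how the branching might look in PHP: compute the statistical mode of the previous question's responses and pick the next lexia accordingly. The function, table, and file names are invented for illustration.

<?php
// Illustrative sketch only: choose the next lexia based on the statistical
// mode (the most frequent choice) of the responses to the previous question.
// The "responses" table and the lexia file names are hypothetical.

function modal_choice(mysqli $db, int $question_id): ?string
{
    $stmt = $db->prepare('SELECT choice, COUNT(*) AS total FROM responses
                          WHERE question_id = ? GROUP BY choice
                          ORDER BY total DESC LIMIT 1');
    $stmt->bind_param('i', $question_id);
    $stmt->execute();
    $row = $stmt->get_result()->fetch_assoc();
    return $row ? $row['choice'] : null;
}

$db = new mysqli('localhost', 'poller', 'secret', 'poller');

// If the mode for "So what is Linux?" (question 15) was already the correct
// answer E, condense the GNU segment; otherwise present the full material.
$next_lexia = (modal_choice($db, 15) === 'E')
    ? 'gnu_condensed.html'
    : 'gnu_full.html';
header('Location: ' . $next_lexia);
?>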


Please visit the project web site on Sourceforge (http://poller.sourceforge.net/) for more information, including how to download the source code. Included in the 0.3.0 release are the SQL statements used to create the questions for the “FOSS” presentation, and a full dump of the database in order to recreate the state of the webserver at the end of the presentation.

Tuesday, December 9, 2008

Giant Underwater Books


I literally drew this on the tablecloth tonight as I was trying to come up with some kind of experimental book that was, for a change, not related to computer technology or free, open source software.

It is a giant underwater ribbon of recycled garbage arranged to be read by divers over many years using flashlights that project three-dimensionally onto the surrounding particles suspended in the water (BS&W is the technical term from petroleum engineering; I'm not sure what the oceanographic term is, even though I had an oceanography course in college). You can read what is woven and otherwise inscribed onto the fabric of the huge text itself, or view the image shimmering around you.

  • This book is like a Moebius strip, or a continuous sheet like the old cloth towels wound like typewriter ribbon that you used to find in restrooms to dry your hands. Floating, suspended in the water, both sides are legible. Out of habit I divided it into a sequence of pages, like large computer display screens or an infinite Escher-like Turing Machine tape. Consider the clustering concept that Michael Heim describes in Electric Language: a four-foot by five-foot paper sheet that is beyond the scale of any imaginable computer display in the 1980s - the 'pages' of this book are larger still. That is why one of the divers exclaims, "It's going to take me 10 years to read this!"

  • Reading tied to swimming underwater instead of playing dead on dry land. At first it seems very unnatural, but before long it is staying still to read that seems odd. DL quote, noting that in BofB no non-sedentary reading methods are described. Doesn't Ulmer play on this idea with his Dreadmill?
  • Blocks out sound and visual distractions. The aural effect of submersion reflects Ong's distinction between the all-aboutness of sound for speaking versus the singular locus of vision for reading.
  • Just as the Kindle is being promoted as a 'green' environmentally sound reading device, this huge underwater ribbon will be woven from garbage floating in the oceans today, by robots or specially trained sea creatures. Over many decades, as the books are slowly 'written', trash will turn into treasure. It is the reversal of the destruction of nature by the side effects of literacy, if you are like Ong and credit the rise of modern science and industry to literacy and print. Crap, technology creeps in...

Monday, November 10, 2008

ENG_6801 Draft for Module 3

Reading Electronic Literature

Part I. Introduction: All Hail Electronic Literature!

The weakness of any survey of electronic literature published as a printed book is how quickly it becomes obsolete. Such may become the case with N. Katherine Hayles' thoroughly researched and remarkably comprehensive Electronic Literature, which, bearing the publication date of 2008, at least carries the mark of the current year. The purpose of its first chapter - Electronic Literature: What Is It? - is to define electronic literature (henceforth EL), survey its genres, expand on its differences from print literature, both in terms of composition and criticism, and provide some ideas on its preservation and dissemination, again emphasizing the new challenges and opportunities provided by this medium. This introduction sets the stage for the remainder of the book. What EL is not is the mere digitization of print literature; there must be important aspects of the work that make it “a first-generation digital object created on a computer and (usually) meant to be read on a computer” (3). She gives some ground by including in the scope of “the literary” “creative artworks that interrogate the histories, contexts, and productions of literature, including as well the verbal art of literature proper” (4). On this reading, EL may include digitizations of originally print literature, such as ancient Greek texts or the Star Wars screenplay, provided there is a creative element to them that is natively digital.

She offers a broad survey of genres of EL, beginning with familiar “first-generation” hypertext-oriented works such as Michael Joyce's afternoon: a story, Stuart Moulthrop's Victory Garden, and Shelley Jackson's Patchwork Girl, many of which were created with proprietary software such as Storyspace (6). Both the nature of their composition - blocks of text (lexia) energized primarily by hyperlinks - and their means of production have been eclipsed by new works leveraging a panoply of multimedia components and navigational mechanisms, as well as delivery over the Web via standard browser-based technologies instead of proprietary, stand-alone solutions. Her survey includes many recent examples of EL in genres including “[h]ypertext fiction, network fiction, interactive fiction, locative narratives, installation pieces, 'codework', generative art, and the Flash poem” (30).

New modes of analysis and criticism have arisen along with the new forms. For example, it is worthwhile to examine the similarities and differences between EL and computer games; in both, the user is required to invest substantial effort to engage the computational mechanisms, but for different purposes: “[p]araphrasing Markku Eskelinen's elegant formulation, we may say that with games the user interprets in order to configure, whereas in works whose primary interest is narrative, the user configures in order to interpret” (8). In many cases it is appropriate to describe works of EL as instruments that users can learn to play in order to fully appreciate their nuances. Furthermore, the program source code and operating environment supporting EL must be accounted for in its analysis, since, as Hayles quotes Alexander Galloway, “Code is the only language that is executable” (35). Widening the scope to include code and operating environments reflects the fact that EL engages many skills beyond literary composition, a point made many times by Drucker and McVarish in Graphic Design History; Hayles likewise speaks of “a site for negotiations between diverse constituencies and different kinds of expertise” (38). Indeed, the appreciation of the collaborative design and production processes of most complex works of EL brings the discipline into contact with wider social practices, such as “the development of commercial software, the competing philosophies of open source freeware and shareware, the economics and geopolitical terrain of the internet and World Wide Web, and a host of other factors that directly influence how electronic literature is created and stored, sold or given away, preserved or allowed to decline into obsolescence” (39).

Critical engagement with the social practices surrounding EL foregrounds the importance of the means by which works are disseminated and preserved, not merely because electronic formats have historically enjoyed much shorter lifespans than printed books, which last for centuries rather than decades, but also because electronic formats involve a host of design decisions that are intimately tied to the early stages of a work's creation, not merely the publication of the finished product. Thus recommendations are offered concerning the choice of open versus closed systems, community versus corporate direction, plain-text versus binary data formats, and so on. In the sections that follow, three major themes - intermediation, the historical context of EL, and computational practice - will be explored using the examples Hayles provides. In the concluding section, a summary table will be presented that orients these works along a number of dimensions, including the social and technological concerns that are not necessarily unique to EL but whose implications are clearly significant for its creators and consumers.

Part II. Intermediation: Interpreting Electronic Literature

In the second chapter of Electronic Literature, Hayles develops the concept of intermediation as a way to examine electronic texts without binding them to the traditional modes of interpretation that have been used for the critical study of print literature. The mark of born-digital works is the non-trivial role played by nonhuman, technological components not only in the preparation of the work, but in its dynamic rendering to readers, viewers, listeners - many prefer the term 'interactors', since many senses may be elicited at once in activities that go beyond passive consumption - and in how it abides and potentially mutates within information systems. Departing from the traditional model in which “it's all in the head of the reader,” meaning develops through the interaction of human and machine in ways that are often emergent, associative, layered, and adaptive, with various levels iteratively feeding back into each other. This happens through the process of intermediation, a term Hayles adapts from Nicholas Gessler as it is often employed in the context of computer software simulating artificial life,

whereby a first-level emergent pattern is captured in another medium and re-represented with the primitives of the new medium, which leads to an emergent result captured in turn by yet another medium, and so forth. The result is what researchers in artificial life call a “dynamic hierarchy,” a multi-tiered system in which feedback and feedforward loops tie the system together through continuing interactions circulating throughout the hierarchy (45).

Such systems typically employ multiple, intermediating levels of processes that may be digital, analog, or a combination of both, so that their overall effect resembles a self-emerging, living system. Hayles' maneuver is to “make a speculative leap and consider the human and the digital computer as partners in a dynamic heterarchy bound together by intermediating dynamics” (47). To support her position, she invokes computational models of consciousness from theorists including Douglas Hofstadter, Daniel C. Dennett, and Edward Fredkin that counter the Cartesian model of an irreducible rationality. Hofstadter emphasizes pattern recognition and extrapolation from analogy as playing roles as important as literal, logical deduction in rounding out models of cognition. Dennett uses thought experiments to demonstrate how human intentionality may emerge as an artifact of subcognitive processes. Fredkin contributes the concept of “aboutness,” aligning the meaning of information with the process that interprets it, whether a music player generating sound from digital files or a human appreciating the details of the musical work reproduced (52-53).

Taking something of a leap, Hayles joins these theories together to posit intermediation as a symbiosis of human and computer, with each partaking in the layered processing of emergent, high-level responses from lower-level operations in both the reading and writing of EL, and she makes the claim that cognition occurs in both. “The result is a meta-analogy: as human cognition is to the creation and consumption of the work, so computer cognition is to its execution and performance” (57). How meaning evolves through the interplay between human and nonhuman cognition will be examined in three works: Twelve Blue by Michael Joyce, The Jew's Daughter by Judd Morrissey, and Birds Singing Other Birds' Songs by Maria Mencia.

Intermediation holds that cognition arising from iterative associations occurs in both human and nonhuman systems. The uninformed reader will bounce around the space of possible hyperlinks and absorb, through repetition, what we might otherwise explain through associations. Jill Walker, in her interpretation of Joyce's earlier Storyspace work afternoon, relays J. Hillis Miller's 'Nietzschean repetition': “[r]e-reading nodes in new surroundings is a form of repetition typical of hypertext. Often, re-reading a node invests it with new meaning.” The work is a series of lexia, generally a page or less, some containing hyperlinks, or images, which are also hyperlinks. The oft-repeated quote from William Gass describes this gentle type of intermediation - many of the other works to be reviewed jar the interactor more forcefully from one component of the text to another - "So a random set of meanings has softly gathered around the word the way lint collects. The mind does that."

Users interact with Twelve Blue via a generic HTML browser parsing 208 text, image, and image-map source files; the only session information that is intentionally part of the literary work as planned by the author is the effect that visiting links has on making parts of certain files disappear from the client display, because the visited-link color is identical to the blue background color. Here is how it works:

Two HTML frames separate the view into a navigation area on the far left, 85 columns wide, and a viewing area on the right that occupies the rest of the screen. In the starting page (Twelve_Blue.html) the two frames contain 'subtexts' (identified via SRC attributes), titlepage.html and twelvepic.html respectively, which are displayed and offer the initial user interface of navigation choices. The left (title) frame offers a single hyperlink called “BEGIN” that points to sl1.html. The main frame contains a large image of twelve colored 'threads' (twelvepic1.gif), eight hyperlinks for the “BARS” referred to in the title frame, and the quote from William Gass. Each of the numbered “BARS” 1-8 below the image points to sl1.html, sl2.html, and so on. The image itself is defined with a map of hyperlinks so that various rectangular regions of the image point to the same eight destinations as the “BARS”. For example, the region spanning the Cartesian grid from the upper left-hand point 3 on the x-axis and 2 on the y-axis to 42 on the x-axis and 227 on the y-axis links to sl1.html. The region from point 43 on the x-axis and 2 on the y-axis to 85 on the x-axis and 227 on the y-axis (coords="43,3,85,227") links to sl2.html, and so on. This means that while a horizontal sequence is established, the threads themselves are indistinct. Subsequent pages will offer different destinations depending on the thread. Using the available navigation links, therefore, the user has seventeen entry points into the work, although there are only eight different destinations. This mobility invites playing the work like a musical instrument rather than passively reading pages.
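The seventeen entry points can be confirmed mechanically. The sketch below is my own illustration, not part of the work: it parses the two frame sources named above from a local copy of the collection and counts the ordinary hyperlinks and the image-map regions.

<?php
// Sketch: enumerate the entry points on Twelve Blue's start screen by
// parsing the two frame sources named above. The file names come from the
// work itself; reading them from the current directory assumes a local copy.

$destinations = array();

foreach (array('titlepage.html', 'twelvepic.html') as $file) {
    $doc = new DOMDocument();
    @$doc->loadHTMLFile($file);   // suppress warnings about loose 1990s HTML

    // Ordinary hyperlinks: the BEGIN link plus the eight numbered BARS.
    foreach ($doc->getElementsByTagName('a') as $a) {
        $destinations[] = $a->getAttribute('href');
    }
    // Image-map regions: the eight rectangles laid over the twelve threads.
    foreach ($doc->getElementsByTagName('area') as $area) {
        $destinations[] = $area->getAttribute('href');
    }
}

// Expect seventeen entry points but only eight distinct destinations.
printf("%d entry points, %d distinct destinations\n",
       count($destinations), count(array_unique($destinations)));
?>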

The literary technique of flash-forward and flash-back includes post-mortem awareness resembling conscious reflection; the narratives of the drowning deaf boy and Ed Stanko continue into a dreamy state as they die. Moreover, there is no clear ending to the work, although the disappearing links do provide a sense of temporality. Hayles describes this break from the traditional plot as “one in which life and death exist on a continuum with flowing and indeterminate boundaries” (69). The overall effect is coming to know the story from multiple perspectives, from different characters at different points in time, piecemeal, so that eventually through associations the intertwined bars come together, collecting like lint or snowflakes.

Interacting with Twelve Blue retains much of the control enjoyed by readers of printed pages, whereas Judd Morrissey's Mac or Windows binary executable The Jew's Daughter (TJD.exe) subverts the interactor's control. At one point it reads, “The broken sum of its parts is a great agonist.” The entire work is 22 lines of black text on a white background, with blue text that behaves like a mouse-over hyperlink, instantly changing some of the text, forcing you to re-read and try to remember what you just read, slowly building the narrative in a more broken manner than Twelve Blue's clean transitions. You learn to read it after a few mouse-overs of the blue text, when “It is an ultimatum, one that” ends the page and continues at the top. The text pretends to be intelligent. Some thoughts finish before they are started. At one point the letters appear one by one: “In Java she had seen a woman decapitated.” Like Twelve Blue, this counts as literature because it tells a story. At some points new words appear one by one. Hayles views the work as a very rich but complicated reinterpretation of consciousness as an epiphenomenon in which there is no self, only an illusion of one. The focalized reader consciousness gets fragmented into strong memories of persistent text on the 22 lines and into weak memories from associations. It can be read as the reverse mirror image of the unintelligent machine fomenting consciousness as the closed-loop, feedback-driven, dynamically self-reprogrammable control system in symbiosis with humans known as cyberspace, the internet, the WWW, and so on. Like Twelve Blue, it takes a while reading the disjointed pages before the story coalesces, although this experience is no different than reading the first hundred pages of a very long novel before grasping the story lines and the identities of the various characters. “Are you going to risk the night?” seems to begin a dream sequence. The whole work now seems like a dream, the piecing together that is a dream to the semi-conscious dreamer, then trying to remember the pieces while awakening. By focusing on a word, mousing over the blue words, like the semi-conscious dreamer trying too hard to focus on something in the dream, the scene is distorted as more connections are made with the other world.

“And now for something completely different,” as the Monty Python meta-narrator used to say. At the far pole of intermediation is the dissolution of narrative into the experimental, such as the investigation of basic aural and visual phenomena in Maria Mencia's Birds Singing Other Birds' Songs. Moreover, like TJD.exe, this work is delivered as binary data in Macromedia Flash format, offering no plain-text, prima facie clues to the underlying source code. skymove.swf plays via the Flash plugin for a generic browser such as Mozilla Firefox. Clouds drift by horizontally, and at the bottom of the screen are 13 numbered sets of buttons using the common symbols for PLAY and STOP. Each button seems to run a program displaying a bird-like silhouette and letters, while accompanying sound is produced through a generic audio subsystem. The letters are sometimes the edge of the bird's outline, sometimes inside it, sometimes pushed in front of it, and so on. Activating the first bird brings forth a silhouette of a bird flying left to right across the screen whose outline is composed of the letters of its song, which is heard on the computer speakers. The second button elicits the formation of a large, single bird image from dispersed symbols that immediately flies off the screen once assembled. By the sixth button, the characters of this large, single, immobile bird are much more difficult to discern, moving around and overlapping each other within the confines of the bird's silhouette. Rather than spelling out the song, one letter at a time shifts into visibility and then back into obscurity. In the seventh, the letters circulate around the form like a serpentine belt, with no spaces in between to suggest distinct words. The twelfth button invokes a single large, white, semi-transparent silhouette whose song emerges one letter at a time, each coming forward and overlaying the other letters, “see see,” until there are too many letters piled up to make sense of it. Activating all the birds at once - the first thing a child would be likely to do - creates a cacophony that eventually regulates itself. The author describes the work as an exploration of “kinetic typography, the animation of images and sound.” She notes similarities between the phonemes of the transcript of birds' songs in The Thinking Ear and phonemes used in her other works. So these are translations of birds' songs into written human language, interpreted back into human voice. Hayles interprets this work as intermediating the habitual, automatic cognitive decoding of print as subvocalization, for here the vocalized sounds dominate and are only suggested by the display of characters.

Part III. Historical Context

To describe a historical context for EL reflects a particular ontology; Hayles places it within the perspective of Friedrich Kittler's media theory, for which the essence of literature lies in its material medium, or media, if multiple, such as visual and aural media, pictures and sounds, and therefore, Hayles argues, “[l]iterature acts on the body but only within the horizon of the medium's technical capabilities” (89). Kittler saw a great technological advance in Heinrich Stephani's phonetic method of reading, which occurred around 1800 - something most readers of the 2000s do not realize, that there had ever been different ways of reading - “erasing the materiality of the grapheme and substituting instead a subvocalized voice” (89). Other great innovations include the gramophone, film, and typewriter, if this can be intuited from the title of one of his most famous books. Kittler's theory reflects a deterministic approach to the philosophy of technology; it ignores the deep interaction between technical processes and human societies and cultures. Hayles points to multiple critics who argue that Kittler focuses on war as the primary influence determining the course of technological change for media precisely because military exigencies seem to offer the only compelling reason to advance the state of the art. No doubt he would say the same for the Internet, as do most networking textbooks when they trace its origins back to the need of the US military to develop internetworking and electronic communications standards. Yet Hayles is adamant: “media alone cannot possibly account for all the complex factors that go into creating national military conflicts. .. media transformations alone are not sufficient” (93). Her counterexample describing the combination of technological and cultural conditions that shape the fascinating lifeworld of global currency traders reveals both a media-specific component, the different sense of time created by their monitors displaying global, real-time market conditions, and social formations such as male aggression, antagonism, and warfare that are common to the high-stakes, rapidly transforming trading floor arena.

The human components, especially the physical attributes of human bodies, also play a role. Mark Hansen is a proponent of the position that embodiment - “proprioception, kinesthetic, and haptic capacities” - plays as important a role in experiencing media as the primary senses like vision and hearing, citing research in “virtual reality sickness” (106). Hayles is not satisfied with this reductive stance because, like Kittler's technological determinism, it fails to keep in view the materiality of the human/machine interface: “[i]t is as though the feedback loop between technical object and embodied human enactor has been cut off halfway through: potentiality flows from the object into the deep inner senses of the embodied human, but its flow back into the object has been short-circuited, leading to an impoverished account of the object's agential capacities to act outside the human's mobilization of its stimuli” (109). Hayles claims her own framework “entangles body and machine in open-ended recursivity” so as not to concretize the possibilities of electronic literature in particular technological or anthropological paradigms (130).

An exemplar of an electronic text that is open-ended, entangling human and machine in recursive intermediation, is Talan Memmott's Lexia to Perplexia, a Javascript-enhanced HTML text that works in standard browsers by combining navigation via the webserver responding to hyperlinks executed by the interactor with the 'client-side' dynamism offered by Javascript, albeit originally programmed using non-standard extensions only supported by current versions of the Microsoft Internet Explorer browser. The author's description notes that “[a]t times its interactive features override the source text, leading to a fragmentary reading experience. .. certain theoretical attributes are not displayed as text but are incorporated into the functionality of the work.” It develops its own terms and is “play between the rigorous and the frivolous.” Hayles observes that “[t]he notorious 'nervousness' of this work, whereby a tiny twitch of the cursor can cause events to happen that the user did not intend and cannot completely control, conveys through its opaque functionality intuitions about dispersed subjectivities and screens with agential powers similar to those we saw with international currency traders” (120). Like Twelve Blue and many lexia-based works of EL, following the default entry into the work offers the interactor a small number of selections akin to chapter headings. Once entered, navigation within one of the headings (“The Process of Attachment,” “Double-Funnels,” “Metastrophe,” and “Exe.Termination”) is suggestive, experimental, and often surprising, through the combination of traditional mouse-click hyperlinks and Javascript code that activates features by mere mouse-over. By “taking fingersteps into the apparatus” the materiality of the text is highlighted, although really mouse gestures more so than keyboarding. Images, diagrams, moving lines and pointers, as well as large and small blocks of text compete for the interactor's attention, often obscuring one another. This work is not narrative in the sense of Twelve Blue or The Jew's Daughter; it is more a quasi-academic exegesis of the four movements from “The Process of Attachment” to “Exe.Termination” that illustrates Hayles' interpretive framework.

The first part, “The Process of Attachment,” seems to depict the human eye attaching itself to the nonhuman, machinic system, not as a proxy of the self but transformed, mediated: “It is never I that enters. .. The screen-bound avatar is a micromental reproduction of the trans|missive hero-agent. .. Though the delivery-machine feels no-thing, the mediation, all co-operation between the I and the apparatus is con.sensual.” Memmott's neologisms can be exasperating to the detail-oriented interactor, who is never sure when their temporary meanings will transform into something different, or whether the 'codework' genuinely reflects the design of the underlying source code or is merely suggestive of imaginary operations. When the possibilities of this part seem to have been played out, the interactor must return to the “main menu” by clicking the ever-present “LEXIA to PERPLEXIA” in the upper left-hand corner.

Once attached to the system, intermediation occurs through a model of “Double Funnels,” the second main heading, symbolized by these expressions:

[local.{[*...(*] | )}(...^...){( | [*)...*]}.remote]

(s)T(ex)(T)(s)

The code-work attempts to encode a mechanism for uniting local and remote (s)'s (eye icons), a way for the “analog and slippery digits of the real” to exit to the remote, distant other, as if poured into the funnel. The screen briefly displays, in very large characters, EXIT, and below it, (s)T(ex)T(s), in which texts unmistakably appears, although it is better understood as another representation of the code-work below it. Mousing around and clicking suggestive hyperlinks again produces a collage of images, diagrams, and lexia, all the while maintaining the auspices of an academic presentation. A lengthy commentary upon the communication process, 'exe.change', reminiscent of Shannon's theory, notes that because it is a shared conduit, “[b]etween the local and the remote, the success and failure of communification in the middle, the mess in the middle is prone to various mechanoid intensities borne from the simultaneous passage of others through the general conduit.”

The third part, “Metastrophe,” begins with “Minifestos,” complete with timestamps (in the future 2000), and mouse-overs producing secondary pop-ups that give definitions of certain words or add more figures to the background diagram. The final part, “Exe.Termination,” presents two columns with a large white word changing to form different combinations, the left-hand side like minutes, the right-hand side like hours, and in the center large grey symbols like seconds. Images that look like pages of printed text, when moused over, add a diagram or explanatory lexia for a few seconds. Some of the images resemble rough blackboard sketches of the design of L2P itself. (Following “TALAN MEMMOTT” from the main page to the “ABOUT” page offers another glimpse into the behind-the-scenes perspective of the work, including a photographed image of a pile of books for the bibliography.)

Part IV. Computational Practice

When referring to computational practice, Hayles means the active role of the “technological nonconscious” embodied in modern electronic computing systems, as opposed to the affordances provided by print technologies. Computational practices intermediate the experience of EL in two particular ways: “verbal narratives are simultaneously conveyed and disrupted by code” and “distributed cognition implies distributed agency” (136). It is important to recognize that knowledge can be transmitted through social practices and enactments, such as craft making, dance, and other demonstrative activities, without being consciously verbalized; otherwise, it is hard to understand the epistemological function of most mixed media. Contemporary computational practices erode the privileged role of the coherent, well-formed lexia while offloading more and more cognition and agency from human to nonhuman mechanisms. The result is that “[e]lectronic literature can tap into highly charged differentials that are unusually heterogeneous, due in part to uneven developments of computational media and in part to unevenly distributed experiences among users” (138). This effect is a result of the combination of the 'ergodic' nature intrinsic to much EL, inexperience with the new stylistic conventions that have been surveyed here, as well as the non-trivial art of literary interpretation that may not be shared by those who are highly skilled in the other areas. Thus the experience of EL will vary greatly from interactor to interactor, depending on their particular computing environments, their ability to use them, their experience with formal theories of print literature, and also their ability to peer into the creator's world if they choose to “hack” the work. While Hayles does not investigate this potential, it is certainly one of those “highly charged” zones that interest the technically savvy who want to know “how it works” or wish to rework or build off of the underlying programmatic aspects.

When it is impossible to peer into the inner workings of a literary work with a high degree of computational complexity, it is nonetheless possible to intuit the underlying algorithms. John Cayley's work is a prime example. In Translation, three blocks of symbols are in constant motion, with the accompaniment of a soundtrack. In a computational procedure he calls “transliteral morphing,” a source text in one language is slowly translated into a target text in another language, letter by letter. Hayles explains:

Cayley conjectures that underlying these “higher-level” relationships are lower-level similarities that work not on the level of words, phrases, and sentences but individual phonemes and morphemes. .. Just as Mencia invokes the philological history of language as it moves from orality to writing to digital representation, so Cayley's transliteral morphs are underlain by an algorithm that reflects their phonemic and morphemic relations to one another. (146)

Like Mencia's Birds, visual language is disconnected from subvocalization and rendered letter by letter. However, Cayley's theory is that microstructures bind translations between different human languages, and also within machines. His selection of Walter Benjamin's “On Language as Such and on the Language of Man” positions this new possibility against a theological interpretation, going back to Berkeley, that it is God who ensures the possibility of translation from one language to another. Of course, viewing Translation conveys the appearance of a complex, underlying algorithm, but it could be a trick. Something presented in a movie format withholds access to its means of production far more profoundly than any plain-text format. While it was possible to delve into the inner workings of hypertext works like Twelve Blue and Lexia to Perplexia, it is not part of the design for the interactor to actually see the source code of the translation algorithm. Moreover, this and all works encoded in Apple Quicktime format are difficult to view on generic x86 GNU/Linux systems. Choosing a commercial, proprietary, patented, or copyrighted encoding format is a gamble that future interactors will incur the cost to compute it.
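Cayley's algorithm itself is not published with the movie, so any reconstruction is conjecture. The sketch below is a deliberately naive, letter-by-letter morph, my own illustration and not Cayley's transliteral morphing, which weights replacements by phonemic and morphemic similarity; the sample strings are arbitrary.

<?php
// Naive illustration of a stepwise, letter-level morph from a source text
// to a target text. This is NOT Cayley's transliteral morphing algorithm;
// it simply replaces one differing character per step.

function naive_morph(string $source, string $target): array
{
    $length  = max(strlen($source), strlen($target));
    $current = str_pad($source, $length);
    $target  = str_pad($target, $length);
    $frames  = array($current);

    for ($i = 0; $i < $length; $i++) {
        if ($current[$i] !== $target[$i]) {
            $current[$i] = $target[$i];   // one letter changes per frame
            $frames[]    = $current;
        }
    }
    return $frames;
}

// Each frame is one intermediate state between the two texts.
foreach (naive_morph('language as such', 'la langue humaine') as $frame) {
    echo rtrim($frame), "\n";
}
?>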

Brian K. Stefans' work Star Wars, One Letter At A Time presents the entire printed screenplay of the 1977 movie Star Wars as typed on an electric typewriter, one key at a time, at an average rate of about 10 characters per second. 10 Hertz is about the human limit of 'intelligibility' for discrete sound units, as distinguished from very low frequency bass sounds. Each typed letter makes the same sound; the space bar and carriage-return keys have distinct sounds, including a bell. The temptation is great to play the audio soundtrack along with it. There are two files: an HTML file, starwars_one_letter.html, that invokes the other one, starwars_one_letter.swf. Shockwave Flash movies play readily on generic x86 architecture GNU/Linux browsers with the appropriate Flash plugin. Running Ubuntu Linux 8.04, this was much more a case of “working off the shelf” than the fussy-to-configure WINE software required to view the Apple Quicktime work Translation. A feature of this work is that it can be viewed in a very small window, since only one character is ever displayed at a time. Like its two counterparts in Birds Singing Other Birds' Songs, there is nothing else to see on the screen but the rapid changes of this small field. It is difficult for an unfamiliar reader to make sense of it without access to a more familiar page version. The movie Star Wars captivated a generation, and combining replay of the original soundtrack, or even playing the movie itself, with executing starwars_one_letter.swf will be experienced differently by different age and social groups, attesting to an example in which there are few deep program secrets to learn about the work, as opposed to exploring the depths of the source code of Twelve Blue and Lexia to Perplexia.

Before digital computer screens, the tachistoscope was a high-speed slide projector that could flash images at 'subliminal rates' in experiments that purportedly demonstrated that viewers picked up subconscious cues such as “buy popcorn” when suggestive text and images were delivered via the tachistoscope. William Poundstone notes in one of the links encircling the START button on Project for Tachistoscope: Bottomless Pit that

The tachistoscope used in early perceptual experiments were slide projectors capable of flashing images as brief as 1 millisecond. This speed was overkill. No matter how brief the image, the retina's afterimage persists for c. 50 milliseconds .. The briefest full-screen image on a 75 Hz CRT lasts 13 milliseconds. .. A given phosphor, or a group of them forming a small image, remains illuminated for about 4 milliseconds. LCD monitors do not scan, but their pixels have a mechanical afterimage of up to 25 milliseconds. .. This site will run on an LCD monitor, but more of the flashed images are likely to be perceptible.
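A quick check of the quoted figure (my arithmetic, not Poundstone's): a 75 Hz CRT redraws the full screen once per refresh period, T = 1/f = 1/(75 Hz) ≈ 13.3 ms, which is where the 13 millisecond duration of the briefest full-screen image comes from.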

Ironically, most CRT monitors have since been retired in favor of energy- and space-saving LCD units. If starwars_one_letter.swf modulates the 10 Hertz range of aural and visual human perception, William Poundstone's Tachistoscope.swf modulates up to the 20 Hertz range of visual perception, while playing music that spans a much broader aural spectrum than the digital tones of Stefans' work, using the same runtime environment (generic x86 architecture GNU/Linux browsers with the appropriate Flash plugin and audio subsystem).

The work presents one word at a time at a quick rate, in the center of the screen, with a pulsating, colorful background and a set of repeating images vaguely related to the story or the current word. Occasionally a second word or additional images or pictures will flash by in very rapid succession, bordering on the subliminal. Meanwhile a minimalist, hypnotic musical accompaniment plays, reminiscent of a Philip Glass composition. It takes a number of viewings to grasp the overall narrative, which is about a large, bottomless pit that has opened near the town of Carbondale during road construction activities. The fictional narrative is often injected with subliminal words such as “Heidelberg”, “Chinatown”, “Chomsky”, “Kuwait” and sometimes what may be complete sentences, but they are impossible to discern because the only control operation is to EXIT and start the work over. Rather than interpreting itself as Lexia to Perplexia does, the work provides, in the outer band of the START screen, seven lexia that explain the tachistoscope principle, the historic controversy surrounding subliminal messages, the system requirements for viewing, and the concept of “semantic priming” that is the key feature of the work.

Summary Table for Reading Electronic Literature

Text | Year | Author | Source Media | Run-time Requirements | Why it's Literature
Twelve Blue | 1996 | Michael Joyce | HTML | Generic browser | narrative
The Jew's Daughter | 2000 | Judd Morrissey | unknown | Windows binary executable | narrative
Birds Singing Other Birds' Songs | 2001 | Maria Mencia | Shockwave Flash | Generic browser, Shockwave Flash plug-in, Audio subsystem | interpretive art
Lexia to Perplexia | 2000 | Talan Memmott | HTML, Javascript, IE extensions | Internet Explorer browser | philosophical essay, narrative
Translation | 2004 | John Cayley | Quicktime movie | Apple Quicktime Player | theoretical demonstration
Star Wars, One Letter At a Time | 2005 | Brian K. Stefans | Shockwave Flash | Generic browser, Audio subsystem | interpretive art
Project for Tachistoscope | 2005 | William Poundstone | Shockwave Flash | Generic browser, Shockwave Flash plug-in, Audio subsystem | narrative

These contemporary examples illustrate the electronic computer playing a much more engaged and dynamic role than the hypertext viewer characteristic of the EL of the 1990s. To Hayles, the unique possibilities that computational practices bring to literary creation invite a revaluation: the technoculture should no longer be seen as an unimaginative outsider, providing merely an updated support mechanism or prop to the codex book, but instead as an integral component of creative endeavor, “a performance that addresses us with the full complexity our human natures require, including the rationality of the conscious mind, the embodied response that joins cognition and emotions, and the technological nonconscious that operates through sedimented routines of habitual actions, gestures, and postures” (157).

References

Hayles, N. Katherine. (2008). Electronic Literature. Notre Dame, IN: University of Notre Dame Press.

Walker, Jill. (1999). “Piecing together and tearing apart: finding the story in afternoon.” Retrieved 10/10/2008 from http://jilltxt.net/txt/afternoon.html.

Note: all of the electronic texts discussed in this essay were included on the Electronic Literature Collection Volume 1 CD-ROM provided with Electronic Literature.

Acknowledgments

Thanks to Sonia Stephens for editorial assistance.

Monday, October 20, 2008

Notes on Norman's Design of Everyday Things

Notes on Donald A. Norman, The Design of Everyday Things


PREFACE TO THE 2002 EDITION

(vii) Far too many items in the world are designed, constructed, and foisted upon us with no understanding - or even care - for how we will use them. Calling something a “Norman door” is recognition of the lack of attention paid by the maker to the user, which is precisely my message.


Would we dare apply Norman's arguments to the creation of texts? He criticizes his own choice of title.


(viii) The problems sound trivial, but they can mean the difference between pleasure and frustration. The same principles that make these simple things work well or poorly apply to more complex operations, including ones in which human lives are at stake. Most accidents are attributed to human error, but in almost all cases the human error was the direct result of poor design.


This seems like a generalization designed to suit his purpose; are the majority of automobile accidents really the result of poor design? The admission, then, is that we routinely engage in risky behavior due to the poor designs in our environment that we cannot really avoid.


The Hidden Frustrations of Everyday Things

The Book Title: A Lesson in Design

(x) I talked with book buyers and clerks. My editor was correct: I needed to change the word “psychology” to “design.” In titling my book, I had been guilty of the same shortsightedness that leads to all those unusable everyday things!

Lessons from DOET

(x) the appearance of the device must provide the critical clues required for its proper operation - knowledge has to be both in the head and in the world.

(xi) The device must explain itself. Even the location and operation of the controls require a conceptual model - an obvious and natural relationship between their location and the operation they control so you always know which control does what (in the book I call this a “natural mapping”).

(xiv) Appropriate, human-centered design requires that all the considerations be addressed from the very beginning, with each of the relevant design disciplines working together as a team.

Technology Changes Rapidly; People Change Slowly

(xiv) in selecting examples, I deliberately kept away from high technology, looking instead at everyday things, things that have been around a while. High technology changes rapidly, but everyday life changes slowly.

(xv) Each time a new technology comes along, new designers make the same horrible mistakes as their predecessors. Technologists are not noted for learning from the errors of the past. They look forward, not behind, so they repeat the same problems over and over again.


PREFACE

(xvii) Humans do not always err. But they do when the things they use are badly conceived and designed. Nonetheless, we still see human error blamed for all that befalls society.

Acknowledgments

(xix) A major argument in POET is that much of our everyday knowledge resides in the world, not in the head. .. People certainly do rely upon the placement and location of objects, upon written texts, upon the information contained within other people, upon the artifacts of society, and upon the information transmitted within and by a culture. There certainly is a lot of information out there in the world, not in the head.


CHAPTER ONE

THE PSYCHOPATHOLOGY OF EVERYDAY THINGS

You Would Need an Engineering Degree to Figure This Out

(2)(figure 1.1) Carelman's Coffeepot for Masochists. The French artist Jacques Carelman in his series of books Catalogue d'objets introuvables (Catalog of unfindable objects) provides delightful examples of everyday things that are deliberately unworkable, outrageous or otherwise ill-formed.

The Frustrations of Everyday Life

(4) The correct parts must be visible, and they must convey the correct message. With doors that push, the designer must provide signals that naturally indicate where to push. .. The vertical plate and supporting pillars are natural signals, naturally interpreted, without any need to be conscious of them. I call the use of natural signals natural design.

(5) Other problems concern the mappings between what you want to do and what appears possible.

(8) Visibility indicates the mapping between intended actions and actual operations.

The Psychology of Everyday Things

AFFORDANCES

(9) affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used.

TWENTY THOUSAND EVERYDAY THINGS

(12) Here is where the designer's knowledge of the psychology of people coupled with knowledge of how things work becomes crucial.

CONCEPTUAL MODELS

Principles of Design for Understandability and Usability

PROVIDE A GOOD CONCEPTUAL MODEL

(16)(figure 1.10) Conceptual Models. The design model is the designer's conceptual model. The user's model is the mental model developed through interaction with the system. The system image results from the physical structure that has been built (including documentation, instructions, and labels). .. If the system image does not make the design model clear and consistent, then the user will end up with the wrong mental model.

MAKE THINGS VISIBLE

(22) Whenever the number of possible actions exceeds the number of controls, there is apt to be difficulty.

THE PRINCIPLE OF MAPPING

(23) Mapping is a technical term meaning the relationship between two things, in this case between the controls and their movements and the results in the world.

(23) Natural mapping, by which I mean taking advantage of physical analogies and cultural standards, leads to immediate understanding.

(25) A device is easy to use when there is visibility to the set of possible actions, where the controls and displays exploit natural mappings.

THE PRINCIPLE OF FEEDBACK

(27) Feedback - sending back to the user information about what action has actually been done, what result has been accomplished - is a well-known concept in the science of control and information theory.

(27) To be fair, these new designs are pushing hard on the paradox of technology: added functionality generally comes along at the price of added complexity.

Pity the Poor Designer

The Paradox of Technology

(30) The development of a technology tends to follow a U-shaped curve of complexity: starting high; dropping to a low, comfortable level; then climbing again.

(31) Whenever the number of functions and required operations exceeds the number of controls, the design becomes arbitrary, unnatural, and complicated. The same technology that simplifies life by providing more functions in each device also complicates life by making the device harder to learn, harder to use. This is the paradox of technology.


CHAPTER TWO

THE PSYCHOLOGY OF EVERYDAY ACTIONS

Falsely Blaming Yourself

(36) If an error is possible, someone will make it. The designer must assume that all possible errors will occur and design so as to minimize the chance of the error in the first place, or its effects once it gets made. Errors should be easy to detect, they should have minimal consequences, and, if possible, their effects should be reversible.

Misconceptions of Everyday Life

ARISTOTLE'S NAIVE PHYSICS

PEOPLE AS EXPLANATORY CREATURES

(39) everyone forms theories (mental models) to explain what they have observed. In the case of the thermostat, the design gives absolutely no hint as to the correct answer. In the absence of external information, people are free to let their imaginations run free as long as the mental models they develop account for the facts as they perceive them.

Blaming the Wrong Cause

(40) The psychology of blame (or, to be more accurate, of attribution) is complex and not fully understood. In part, there seems to have to be some perceived causal relationship between the thing being blamed and the result. The word perceived is critical: the causal relationship does not have to exist; the person simply has to think it is there.

(41) Interestingly enough, the common tendency to blame ourselves for failures with everyday objects goes against normal attributions people make. In general, it has been found that people attribute their own problems to the environment, those of other people to their personalities.

LEARNED HELPLESSNESS

TAUGHT HELPLESSNESS

The Nature of Human Thought and Explanation

How People Do Things: The Seven Stages of Action

(48) Forming the goal; Forming the intention; Specifying an action; Executing the action; Perceiving the state of the world; Interpreting the state of the world; Evaluating the outcome

The Gulfs of Execution and Evaluation

THE GULF OF EXECUTION

(51) The difference between the intentions and the allowable actions is the Gulf of Execution.

THE GULF OF EVALUATION

(51) The Gulf of Evaluation reflects the amount of effort that the person must exert to interpret the physical state of the system and to determine how well the expectations and intentions have been met.

The Seven Stages of Action as Design Aids

(52) The seven-stage structure can be a valuable design aid, for it provides a basic checklist of questions to ask to ensure that the Gulfs of Evaluation and Execution are bridged.


CHAPTER THREE

KNOWLEDGE IN THE HEAD AND IN THE WORLD

(54-55) not all of the knowledge required for precise behavior has to be in the head. It can be distributed - partly in the head, partly in the world, and partly in the constraints of the world. Precise behavior can emerge from imprecise knowledge for four reasons: Information in the world, Great precision is not required, Natural constraints are present, Cultural constraints are present.

(56) There is a tradeoff between the amount of mental knowledge and the amount of external knowledge required in performing tasks.

Precise Behavior from Imprecise Knowledge

INFORMATION IS IN THE WORLD

(56) There is a tradeoff between speed and quality of performance and mental effort.

(57) People function through their use of two kinds of knowledge: knowledge of and knowledge how.

GREAT PRECISION IS NOT REQUIRED

THE POWER OF CONSTRAINTS

(60) Combining the two constraints of rhyme and meaning can therefore reduce the information about the particular word that must be kept in memory to nothing; as long as the constraints are known, the choice of word can be completely determined.

(61) The notion that someone should be able to recite word for word is relatively modern. Such a notion can be held only after printed texts become available; otherwise who could judge the accuracy of a recitation?


Point made by Ong.


Memory is Knowledge in the Head

THE CONSPIRACY AGAINST MEMORY

THE STRUCTURE OF MEMORY

(67) If we examine how people use their memories and how they retrieve information, we discover a number of categories: Memory for arbitrary things, Memory for meaningful relationship, Memory through explanation.

(70) People probably make up mental models for most of the things they do. This is why designers should provide users with appropriate models: when they are not supplied, people are likely to make up inappropriate ones.

Memory Is Also Knowledge in the World

REMINDING

(72) One of the most important and interesting aspects of the role of external memory is reminding, a good example of the interplay between knowledge in the head and in the world.


Is this the distinction Plato intends in Phaedrus, that reminding involves knowledge in the world? Or the two aspects signal and message?


(73) A good reminding method is to put the burden on the thing itself.

(73) There are two different aspects to a reminder: the signal and the message.

NATURAL MAPPINGS

(78) Wherever labels seem necessary, consider another design.

The Tradeoff between Knowledge in the World and in the Head

(80) Reminders provide a good example of the relative tradeoffs between the roles of internal versus external knowledge. Knowledge in the world is accessible. .. Knowledge in the mind is ephemeral.


CHAPTER FOUR

KNOWING WHAT TO DO

A Classification of Everyday Constraints

(84) for different classes of constraints - physical, semantic, cultural, and logical. These classes are apparently universal, appearing in a wide variety of situations, and sufficient.

PHYSICAL CONSTRAINTS

(84) Physical limitations constrain possible operations. .. The value of physical constraints is that they rely upon properties of the physical world for their operation; no special training is necessary.

SEMANTIC CONSTRAINTS

(85) Semantic constraints rely upon the meaning of the situation to control the set of possible actions. .. Semantic constraints rely upon our knowledge of the situation and of the world.

CULTURAL CONSTRAINTS

(85) Cultural issues are at the root of many of the problems we have with new machines: there are as yet no accepted conventions or customs for dealing with them.

(85-86) guidelines for cultural behavior are represented in the mind by means of schemas, knowledge structures that contain the general rules and information necessary for interpreting situations and for guiding behavior.

LOGICAL CONSTRAINTS

(86) Natural mappings work by providing logical constraints. There are no physical or cultural principles here; rather there is a logical relationship between the spatial or functional layout of components and the things that they affect or are affected by.

Applying Affordances and Constraints to Everyday Objects

THE PROBLEM WITH DOORS

THE PROBLEM WITH SWITCHES

WHICH SWITCH CONTROLS WHICH FUNCTION?

HOW ARE THE SWITCHES ARRANGED?

Visibility and Feedback

MAKING VISIBLE THE INVISIBLE

(101) Nothing succeeds like visual feedback, which in turn requires a good visual display.

USING SOUND FOR VISIBILITY

(103) Natural sounds reflect the complex interaction of natural objects.

(103) If they are to be useful, sounds must be generated intelligently, with an understanding of the natural relationship between the sounds and the information to be conveyed.

(103) One of the virtues of sounds is that they can be detected even when attention is applied elsewhere. But this virtue is also a deficit, for sounds are often intrusive.


CHAPTER FIVE

TO ERR IS HUMAN

(105) People make errors routinely. Hardly a minute of normal conversation can go by without a stumble, a repetition, a phrase stopped midway through to be discarded or redone. Human language provides special mechanisms that make corrections so automatic that the participants hardly take notice; indeed, they may be surprised when errors are pointed out. Artificial devices do not have the same tolerance. Push the wrong button, and chaos may result.


Recall von Neumann's discussions about error tolerance of natural and artificial automata.


(105) Slips result from automatic behavior, when subconscious actions that are intended to satisfy our goals get waylaid en route. Mistakes result from conscious deliberations.

(105) Form an appropriate goal but mess up in the performance, and you've made a slip. .. Form the wrong goal, and you've made a mistake.

Slips

(106) Some slips may indeed have hidden, darker meanings, but most are accounted for by rather simple events in our mental mechanisms.


Footnote on Sherry Turkle's The Second Self confirms that Freud is reinterpreted.


(106) Slips show up most frequently in skilled behavior. .. On the whole, people can consciously attend to only one primary thing at a time. .. We can do more than one thing at a time only if most of the actions are done automatically, subconsciously, with little or no need for conscious attention.

TYPES OF SLIPS

(107) We can place slips into one of six categories: capture errors, description errors, data-driven errors, associative activation errors, loss-of-activation errors, and mode errors.

(107) The capture error appears whenever two different action sequences have their initial stages in common, with one sequence being unfamiliar and the other being well practiced. Seldom, if ever, does the unfamiliar sequence capture the familiar one.

(108) Description errors usually result in performing the correct action on the wrong object.

(108) Description errors occur most frequently when the wrong and right objects are physically near each other.

(109) Automatic actions are data driven - triggered by the arrival of the sensory data. But sometimes data-driven activities can intrude into an ongoing action sequence, causing behavior that was not intended.

(109) Associative activation errors are the slips studied by Freud; you think something that ought not to be said and then, to your embarrassment, you say it.


This is an excellent enumeration of the types of slips, but it does not do justice to Freud's insight into their nature. Freud would certainly pay attention to all types of slips, not just this one.


(110) Loss-of-activation errors occur because the presumed mechanism - the “activation” of the goals - has decayed. The less technical but more common term would be “forgetting.”

(110) Mode errors occur when devices have different modes of operation, and the action appropriate for one mode has different meanings in other modes. Mode errors are inevitable any time equipment is designed to have more possible actions than it has controls or displays, so the controls must do double duty.
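
A small PHP sketch of a mode error waiting to happen, with an invented Watch class: one button, two modes, so the same press means different things depending on a state the user may not have noticed.

<?php
// Hypothetical sketch: one control doing double duty across modes.
class Watch {
    public $mode = 'clock';              // 'clock' or 'stopwatch'
    public function pressButton() {
        if ($this->mode === 'clock') {
            return 'advance the hour';
        }
        return 'start the stopwatch';
    }
}

$watch = new Watch();
$watch->mode = 'stopwatch';              // the mode changed earlier...
echo $watch->pressButton();              // ...so the press that was meant to
                                         // set the time starts the stopwatch
?>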


How about some examples of human equivalents? Different uses of body parts, for example, for sex.


DETECTING SLIPS

(110-111) detection can only take place if there is feedback. .. Some trail of the sequence of actions that was performed is valuable.

(111) The most global description (the one at the top of the list) is called the high-level specification. The more detailed descriptions, the ones at the bottom of the list, are called the low-level specifications. Any one of them might be in error.

(112) In all the situations I have examined the error correction mechanism seems to start at the lowest possible level and slowly works its way higher.

DESIGN LESSONS FROM THE STUDY OF SLIPS

(113) In computer systems, it is common to prevent errors by requiring confirmation before a command will be executed, especially when the action will destroy a file. But the request is ill timed; it comes just after the person has initiated the action and is still fully content with the choice. .. The user has requested deletion of the wrong file but the computer's request for confirmation is unlikely to catch the error; the user is confirming the action, not the file name. Thus asking for confirmation cannot catch all slips. It would be more appropriate to eliminate irreversible actions.
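
A PHP sketch of the alternative Norman suggests, with invented function names: instead of demanding confirmation before an irreversible delete, make the delete reversible by moving the file into a trash directory from which it can be restored.

<?php
// Hypothetical sketch: a reversible delete removes the need to catch the
// slip at confirmation time, because the slip can be undone later.
function trashFile($path, $trashDir = '/tmp/trash') {
    if (!is_dir($trashDir)) {
        mkdir($trashDir, 0700, true);
    }
    $target = $trashDir . '/' . basename($path) . '.' . time();
    return rename($path, $target) ? $target : false;
}

function restoreFile($trashedPath, $originalPath) {
    return rename($trashedPath, $originalPath);
}
?>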

(114) When you build an error-tolerant mechanism, people come to rely upon it, so it had better be reliable.

Mistakes as Errors of Thought

SOME MODELS OF HUMAN THOUGHT

(114-115) Even though principles of rationality seem as often violated as followed, we still cling to the notion that human thought should be rational, logical, and orderly. .. Many scientists who study artificial intelligence use the mathematics of formal logic - the predicate calculus - as their major tool to simulate thought.

But human thought - and its close relatives, problem solving and planning - seem more rooted in past experience than in logical deduction. Mental life is not neat and orderly. It does not proceed smoothly and gracefully in neat, logical form. Instead, it hops, skips, and jumps its way from idea to idea. .. it is the difference that leads to creative discovery and to great robustness of behavior.

(115) Human memory is most definitely not like a set of photographs or a tape recording. It mushes things together too much, confuses one event with another, combines different events, and leaves out parts of individual events.

(115-116) Another theory is based on the filing cabinet model, wherein there are lots of cross references and pointers to other records. .. Of course, it is not called a file cabinet theory. It goes by the names of “schema theory,” “frame theory,” or sometimes “semantic networks” and “propositional encoding.” The individual file folders are defined in the formal structure of the schemas or frames, and the connections and associations among the individual records make the structure into a vast and complex network. The essence of the theory consists of three beliefs, all reasonable and supported by considerable evidence: (1) that there is logic and order to the individual structures (this is what the schema or frame is about); (2) that human memory is associative, with each schema pointing and referring to multiple others to which it is related or that help define the components (thus the term “network”); and (3) that much of our power for deductive thought comes from using the information in one schema to deduce the properties of another (thus the term “propositional encoding”).
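
A rough PHP sketch of the file-cabinet picture, using an invented toy network: each schema is a record whose slots can point at other schemas, and a property can be deduced by following the references.

<?php
// Hypothetical sketch: schemas as records, associations as references,
// deduction as walking the network.
$schemas = array(
    'canary' => array('is-a' => 'bird', 'color' => 'yellow'),
    'bird'   => array('is-a' => 'animal', 'can-fly' => true),
    'animal' => array('breathes' => true),
);

function lookup($schemas, $name, $property) {
    while (isset($schemas[$name])) {
        if (array_key_exists($property, $schemas[$name])) {
            return $schemas[$name][$property];
        }
        $name = isset($schemas[$name]['is-a']) ? $schemas[$name]['is-a'] : null;
    }
    return null;
}

var_dump(lookup($schemas, 'canary', 'can-fly')); // bool(true), deduced from "bird"
?>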

THE CONNECTIONIST APPROACH

(116) A newer approach is rooted in the working of the brain itself. Those of us who follow this new approach call it “connectionism,” but it also goes under the names of “neural nets,” “neural models,” and “parallel distributed processing.” .. This approach follows the rules of thermodynamics more than it does the rules of logic.

(117) We can think of the interactions as the computational part of thought: when one set of units sends signals activating another, this can be interpreted as offering support for a cooperative interpretation of events; when one set of units sends signals suppressing another, it is because the two usually offer competing interpretations. The result of all this support and competition is a compromise: not the correct interpretation, simply one that is as consistent as possible with all possibilities under active consideration. This approach suggests that much of thought results from a kind of pattern matching system, one that forces its solutions to be analogous to past experiences, and one that does not necessarily follow the formal rules of logical inference.
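
A crude PHP sketch of the support-and-competition idea, with invented units and weights: evidence units excite their interpretations, the two interpretations inhibit each other, and repeated updates settle on the one most consistent with the active evidence rather than on a logically derived answer.

<?php
// Hypothetical sketch: positive weights offer support, negative weights
// suppress a competing interpretation; activation settles by iteration.
$activation = array('evidenceA' => 1.0, 'evidenceB' => 0.3,
                    'interpA'   => 0.0, 'interpB'   => 0.0);
$weights = array(
    'evidenceA' => array('interpA' => 0.6),
    'evidenceB' => array('interpB' => 0.6),
    'interpA'   => array('interpB' => -0.4),
    'interpB'   => array('interpA' => -0.4),
);

for ($step = 0; $step < 20; $step++) {
    $next = $activation;
    foreach ($weights as $from => $targets) {
        foreach ($targets as $to => $weight) {
            $next[$to] += 0.1 * $weight * $activation[$from];
        }
    }
    foreach ($next as $unit => $value) {        // keep activations bounded
        $next[$unit] = max(0.0, min(1.0, $value));
    }
    $activation = $next;
}
print_r($activation); // interpA ends up far more active than interpB
?>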

(118) Throw everything into memory on top of one another. That is a crude approximation of the connectionist approach to memory.

(118) We mush together details of things that are similar, and give undue weight to the discrepant. We relish discrepant and unusual memories.

(119) This event-based reasoning is powerful, yet fundamentally flawed. Because thought is based on what can be recalled, the rare event can predominate.

The Structure of Tasks

WIDE AND DEEP STRUCTURES

(121) Everyday structures are either shallow or narrow

SHALLOW STRUCTURES

(121) There are many alternative actions, but each is simple; there are few decisions to make after a single top-level choice.

NARROW STRUCTURES

(121) If each possibility leads to only one or two further choices, then the resulting tree structure can be said to be narrow and deep.

(123) Any task that involves a sequence of activities where the action to be done at any point is determined by its place in the sequence is an example of a narrow structure.
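
Two throwaway PHP data structures to make the shapes concrete (my own examples, not Norman's code): a menu is shallow and wide, one choice among many and then you are done; a fixed recipe is narrow and deep, each step leading to essentially one next step.

<?php
// Hypothetical sketch of the two tree shapes.
$shallowAndWide = array(
    'order dinner' => array('soup', 'salad', 'pasta', 'steak', 'fish'),
);

$narrowAndDeep = array(
    'boil water' => array(
        'add pasta' => array(
            'drain' => array(
                'add sauce' => array('serve'),
            ),
        ),
    ),
);
?>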

THE NATURE OF EVERYDAY TASKS

Conscious and Subconscious Behavior

(125) Subconscious thought is biased toward regularity and structure, and it is limited in formal power. It may not be capable of symbolic manipulation, of careful reasoning through a sequence of steps.

(126) Conscious thought tends to be slow and serial. Conscious processing seems to involve short-term memory and is thereby limited in the amount that can be readily available.

(127) Which is exactly what everyday tasks ought to be - boring, so that we can put our conscious attention on the important things of life, not the routine.

EXPLAINING AWAY ERRORS

(128) When there is a devastating accident, people's explaining away the signs of the impending disaster always seems implausible to others.

SOCIAL PRESSURES AND MISTAKES

(129) In industrial settings social pressures can lead to misinterpretations, mistakes, and accidents. For understanding mistakes, social structure is every bit as essential as physical structure.

Designing for Error

(131) Designers make the mistake of not taking error into account. Inadvertently, they can make it easy to err and difficult or impossible to discover error or to recover from it. .. Don't think of the user as making errors; think of the actions as approximations of what is desired.

(131) Design so that errors are easy to discover and corrections are possible

HOW TO DEAL WITH ERROR - AND HOW NOT TO

(132) Warnings and safety methods must be used with care and intelligence, taking into account the tradeoffs for the people who are affected.

FORCING FUNCTIONS

(132) Forcing functions are a form of physical constraint: situations in which the actions are constrained so that failure at one stage prevents the next step from happening.

(125) Forcing functions are the extreme case of strong constraints that make it easy to discover erroneous behavior. .. In the field of safety engineering, forcing functions show up under other names, in particular as specialized methods for the prevention of accidents. Three such methods are interlocks, lockins, and lockouts.

(137) Forcing functions almost always are a nuisance in normal usage. The clever designer has to minimize the nuisance value while retaining the safety, forcing-function mechanism, to guard against the occasional tragedy.
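
An interlock in miniature, sketched in PHP with an invented class: the constrained action simply cannot proceed until the prerequisite step is satisfied, so the erroneous sequence is blocked rather than merely warned against.

<?php
// Hypothetical sketch: failure at one stage (door open) prevents the
// next step (cooking) from happening at all.
class MicrowaveOven {
    private $doorClosed = false;
    public function closeDoor() { $this->doorClosed = true; }
    public function openDoor()  { $this->doorClosed = false; }
    public function start() {
        if (!$this->doorClosed) {
            return 'refused: close the door first';
        }
        return 'cooking';
    }
}

$oven = new MicrowaveOven();
echo $oven->start();    // "refused: close the door first"
$oven->closeDoor();
echo $oven->start();    // "cooking"
?>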

A Design Philosophy

(140) The designer shouldn't think of a simple dichotomy between errors and correct behavior; rather, the entire interaction should be treated as a cooperative endeavor between the person and machine, one in which misconceptions can arise on either side. This philosophy is much easier to implement on something like a computer which has the ability to make decisions on its own than on things like doors and power plants, which do not have such intelligence. .. Put the required knowledge in the world. .. Use the power of natural and artificial constraints: physical, logical, semantic, and cultural. Use forcing functions and natural mappings. Narrow the gulfs of execution and evaluation.


CHAPTER SIX

THE DESIGN CHALLENGE

The Natural Evolution of Design

FORCES THAT WORK AGAINST EVOLUTIONARY DESIGN

(142) Natural design does not work in every situation; there must be enough time for the process to be carried out, and the item must be simple.

THE TYPEWRITER: A CASE HISTORY IN THE EVOLUTION OF DESIGN

(147) In the end, the qwerty keyboard was adopted throughout the world with but minor variations. We are committed to it, even though it was designed to satisfy constraints that no longer apply, was based on a style of typing no longer used, and is difficult to learn.

(150) Once a satisfactory product has been achieved, further change may be counterproductive, especially if the product is successful.

Why Designers Go Astray

(151) First, the reward structure of the design community tends to put aesthetics first. .. Second, designers are not typical users. .. Third, designers must please their clients, and the clients may not be the users.

PUTTING AESTHETICS FIRST

(151-152) Because prizes tend to be given for some aspects of a design, to the neglect of all others - usually including usability.

DESIGNERS ARE NOT TYPICAL USERS

(156) In their work, designers often become expert with the device they are designing. Users are often expert at the task they are trying to perform with the device.

THE DESIGNER'S CLIENTS MAY NOT BE USERS

(157) The state of California requires by law that universities purchase things on a price basis; there are no legal requirements regarding understandability or usability of the product.

The Complexity of the Design Process

DESIGNING FOR SPECIAL PEOPLE

(162) Some problems are not solved by adjustments. Left-handed people, for example, present special problems.


The handwriting example is great: the social convention of left-to-right writing means left-handed writers smudge the ink.


(164) designing for flexibility helps.

SELECTIVE ATTENTION: THE PROBLEM OF FOCUS

(165) When there is a problem, people are apt to focus on it to the exclusion of other factors. The designer must design for the problem case, making other factors more salient, or easier to get to, or perhaps less necessary.

(165) A corollary principle is that designers must guard against the problems of focus in their own design.

The Faucet: A Case History of Design Difficulties

(170) If you can't put the knowledge on the device, then develop a cultural constraint: standardize what has to be kept in the head.

Two Deadly Temptations for the Designer

CREEPING FEATURISM

(173) Creeping featurism is the tendency to add to the number of features that a device can do, often extending the number beyond all reason.

(174) The proper division of a complex set of controls into modules allows you to conquer complexity.

THE WORSHIPPING OF FALSE IMAGES

(177) the false image is the appearance of technical sophistication.

The Foibles of Computer Systems

(177) There is nothing particularly special about the computer; it is a machine, a human artifact, just like the other sorts of things we have looked at, and it poses few problems that we haven't encountered already. But designers of computer systems seem particularly oblivious to the needs of users, particularly susceptible to all the pitfalls of design. The professional design community is seldom called in to help with computer products. Instead, design is left in the hands of engineers and programmers, people who usually have no experience, and no expertise in designing for people.


Typical criticism of FOSS.


HOW TO DO THINGS WRONG

(179) Do you want to do things wrong? Here is what to do: Make things invisible; Be arbitrary; Be inconsistent; Make operations unintelligible; Be impolite; Make operations dangerous.

IT'S NOT TOO LATE TO DO THINGS RIGHT

(180) the best computer programs are the ones in which the computer itself “disappears,” in which you work directly on the problem without having to be aware of the computer.

COMPUTER AS CHAMELEON

EXPLORABLE SYSTEMS: INVITING EXPERIMENTATION

(183) One important method of making systems easier to learn and to use is to make them explorable, to encourage the user to experiment and learn the possibilities through active exploration.

TWO MODES OF COMPUTER USAGE

(184) The point cannot be overstressed: make the computer system invisible. This principle can be applied with any form of system interaction, direct or indirect.

THE INVISIBLE COMPUTER OF THE FUTURE


Heim would argue strongly against this equating of the computer with an ordinary tool. Norman's recommendation of designing software such that the computer disappears and the task is foregrounded allows the concealment of enframing. “A worn sock is better than a mended one; not so with metaphysics.”


CHAPTER SEVEN

USER-CENTERED DESIGN

Seven Principles for Transforming Difficult Tasks into Simple Ones

USE BOTH KNOWLEDGE IN THE WORLD AND KNOWLEDGE IN THE HEAD

THREE CONCEPTUAL MODELS

THE ROLE OF MANUALS

SIMPLIFY THE STRUCTURE OF TASKS

KEEP THE TASK MUCH THE SAME, BUT PROVIDE MENTAL AIDS

USE TECHNOLOGY TO MAKE VISIBLE WHAT WOULD OTHERWISE BE INVISIBLE, THUS IMPROVING FEEDBACK AND THE ABILITY TO KEEP CONTROL

(192) With modern computers and their powerful graphic displays, we now have the power to show what is really happening, to provide a good, complete image that matches the person's mental model of the task - thereby simplifying both understanding and performance.

(193) if the skill is easily automated, it wasn't essential.

(193) In general, I welcome any technological advance that reduces my need for mental work but still gives me the control and enjoyment of the task. .. I want to use my mental powers for the important things, not fritter them away on the mechanics.

AUTOMATE, BUT KEEP THE TASK MUCH THE SAME

CHANGE THE NATURE OF THE TASK

(194) In general, technology can help transform deep, wide structures into narrower, shallower ones.

(194) The introduction of new fastening materials - for example, Velcro hook-and-loop fasteners - has eliminated the need for a complex sequence of skilled motor actions by changing the task to one that is considerably simpler, one that requires less skill.

(195-196) Digital timepieces are controversial: in changing the representation of time, the power of the analog form has been lost, and it has become more difficult to make quick judgments about time.


Heim examines this shift in psychic framework in much more detail.


(196) Today, because we can no longer remember the origins, we think of the analog system as necessary, virtuous, and proper. It presents a horrid, classic example of the mapping problem.


DON'T TAKE AWAY CONTROL

(197) One problem is that overreliance on automated equipment can eliminate a person's ability to function without it. .. A second problem is that a system may not always do things exactly the way we would like, but we are forced to accept what happens because it is too difficult (or impossible) to change the operation. A third problem is that the person becomes a servant of the system, no longer able to control or influence what is happening.

(197) All tasks have several layers of control. .. Sometimes we really want to maintain control at the lower level. .. At other times we want to concentrate on higher level things.


Supervisory control models, and closed-loop feedback.


MAKE THINGS VISIBLE: BRIDGE THE GULFS OF EXECUTION AND EVALUATION

GET THE MAPPINGS RIGHT

(199) Natural mappings are the basis of what has been called “response compatibility” within the fields of human factors and ergonomics. .. Difficulties arise wherever the positioning and movements of the controls deviate from strict proximity, mimicry, or analogy to the things being controlled.
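
The stove example rendered as two PHP arrays (my own toy layout): in the natural mapping the controls' spatial arrangement mirrors the burners, so no label or memorized pairing is needed; in the arbitrary one the pairing has to be learned and is easily slipped on.

<?php
// Hypothetical sketch: control position => burner controlled.
$naturalMapping = array(
    'front-left knob'  => 'front-left burner',
    'front-right knob' => 'front-right burner',
    'back-left knob'   => 'back-left burner',
    'back-right knob'  => 'back-right burner',
);

$arbitraryMapping = array(   // a flat row of knobs for a 2-D cooktop
    'knob 1' => 'back-left burner',
    'knob 2' => 'front-right burner',
    'knob 3' => 'front-left burner',
    'knob 4' => 'back-right burner',
);
?>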

EXPLOIT THE POWER OF CONSTRAINTS, BOTH NATURAL AND ARTIFICIAL

DESIGN FOR ERROR

WHEN ALL ELSE FAILS, STANDARDIZE

(201) standardization is essential only when all the necessary information cannot be placed in the world or when natural mappings cannot be exploited.

STANDARDIZATION AND TECHNOLOGY

(202) Standardization is simply another aspect of cultural constraints.

(202) Today's computers are still poorly designed, at least from the user's point of view. But one problem is simply that the technology is still very primitive - like the 1906 auto - and there is no standardization.

THE TIMING OF STANDARDIZATION

(202) Standardize and you simplify lives: everyone learns the system only once. But don't standardize too soon: you may be locked into a primitive technology, or you may have introduced rules that turn out to be grossly inefficient, even error-inducing. Standardize too late and there may already be so many ways of doing the task that no international standard can be agreed on; if there is agreement on an old-fashioned technology, it may be too expensive to change. The metric system is a good example.

Deliberately Making Things Difficult

DESIGNING A DUNGEONS AND DRAGONS GAME

(208) making things difficult is a tricky business. .. several psychological factors hang in a delicate balance: challenge, enjoyment, frustration, and curiosity.

EASY LOOKING IS NOT NECESSARILY EASY TO USE

(209) Complexity of appearance seems to be determined by the number of controls, whereas difficulty of use is jointly determined by the difficulty of finding the relevant controls (which increases with the number of controls) and difficulty of executing the functions (which may decrease with the number of controls).

(209) Hide the controls not being used at the moment.
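
A PHP sketch of hiding unused controls, using invented poll states in the spirit of the Poller console: the set of controls shown depends on the current state, so the apparent complexity stays low even though the full set of functions is larger.

<?php
// Hypothetical sketch: expose only the controls relevant right now.
function visibleControls($state) {
    switch ($state) {
        case 'idle':    return array('start poll');
        case 'running': return array('next question', 'pause', 'stop');
        case 'paused':  return array('resume', 'stop');
        default:        return array();
    }
}

print_r(visibleControls('running')); // next question, pause, stop
?>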

Design and Society

HOW WRITING METHOD AFFECTS STYLE

FROM QUILL AND INK TO KEYBOARD AND MICROPHONE

(211) Style may change further when we get voice typewriters, where our spoken words will appear on the page as they are spoken.

(211) On the one hand, it is satisfying to be able to type your thoughts without worrying about minor typographical errors or spelling. On the other hand, you may spend less time thinking and planning.

OUTLINE PROCESSORS AND HYPERTEXT

(211) The current fad in writing aids is the outline processor, a tool designed to encourage planning and reflection on the organization of material. .. Outline processors attempt to overcome organizational problems by allowing collapsed views of the manuscript to be examined and manipulated. But the process seems to emphasize the organization that is visible in the outline or heading structure of the manuscript, thereby deemphasizing other aspects of the work. It is characteristic of thought processes that attention to one aspect comes at the cost of decreased attention to others.


Reaches similar conclusions as Heim but not at the same depth of ontological analysis.


(212) Hypertext makes a virtue of lack of organization, allowing ideas and thoughts to be juxtaposed at will.

(212) Imagine that this book was in hypertext. How would it work? Well, I've used several devices that relate to hypertext: one is the footnote, another is parenthetical comments, and yet another is contrasting text.

(213) If hypertext really becomes available, especially in the fancy versions now being talked about - where words, sounds, video, computer graphics, simulations, and more are all available at the touch of the screen - well, it is hard to imagine anyone capable of preparing the material. It will take teams of people.

THE HOME OF THE FUTURE: A PLACE OF COMFORT OR A NEW SOURCE OF FRUSTRATION?

(214) it is difficult to see how the complex instructions required for such a system will be conveyed.

The Design of Everyday Things

(216) Design, therefore, takes on political significance. .. In Western cultures, design has reflected the capitalistic importance of the marketplace, with an emphasis on exterior features deemed to be attractive to the purchaser. .. We are surrounded with objects of desire, not objects of use.

(216) If you are a designer, help fight the battle for usability. If you are a user, then join your voice with those who cry for usable products.


Feenberg tie-in to go along with the Heim tie-in?



Norman, Donald A. (2002). The Design of Everyday Things. New York: Basic Books.