Wednesday, June 29, 2011

Notes for Leonardo to the Internet

Notes for Thomas J. Misa, Leonardo to the Internet: Technology and Culture from the Renaissance to the Present

Establishes history and demonstrates methodology more so than offers theory.

Preface
(x) The Renaissance court system was the conceptual key. . . . The technical projects they commissioned – from the Florence cathedral to the mechanical robots for courtly entertainment – as well as the printed works on science, history, philosophy, religion, and technology, created and themselves constituted Renaissance culture.
(x-xi) There are good reasons to see the industrial revolution as a watershed in world history, but our time-worn inclination to seize on industrial technologies as the only ones that really matter has confounded a proper understanding of the great commercial expansion that followed the Renaissance. . . . I began not only to think of technologies as located historically and spatially in a particular society and shaped by that society's ideas of what was possible or desirable, but also to see how these technologies evolved to shape the society's social and cultural developments. To capture this two-way influence, I took up the notion of distinct “eras” of technology and culture as a way of organizing the material for this book.

Compare to Kittler whom Hayles criticizes for emphasizing military technologies. We are in the age where electronic technologies are now central to interpretation.

(xi) If technologies come from outside, the only critical agency open to us is slowing down their inevitable triumph – a rearguard action at best. By contrast, if technologies come from within society and are products of on-going social processes, we can, in principle alter them – at least modestly – even as they change us.

A participant culture is possible in principle, although the default comportment of the consumer (spectator) is justified by Zizek.

(xii) Beyond Britain, commentators and technologists sometimes looked to copy British models of industry but more frequently adapted industrial technologies to their own economic and social contexts. The result was a variety of paths through the industrial revolution.
(xii) The legacy of the industrial revolution, it seemed, was not a single “industrial society” with a fixed relationship to technology but rather a multidimensional society with a variety of purposes for technology.
(xii) The first of these technology-intensive activities to fully flower was empire building, the effort by Europeans and North Americans to extend economic and political control over wide stretches of land abroad or at home.

He gives interesting accounts of British empire building in India but little detail about American internal activity.

(xiii) A second impulse in technology gathering force from the 1870s onward lay in the application of science to industry and the building of large systems of technology.
(xiv) The achievement of mass-produced steel, glass, and other “modern materials” around 1900 reshaped the aesthetic experience of working or walking in our cities and living in our homes.
(xiv) Technology has been and can be a potent agent in disciplining and dominating. I also discuss the modernists' troubling embrace of a fixed “method” of creativity.
(xiv) In the Cold War decades, scientists and engineers learned that the military services had the deepest pockets of all potential technology patrons.
(xv) The hardest history to write is that of our own time, and yet I believe that “globalization,” or “global culture,” is a force that oriented technology and society in the final three decades of the twentieth century.
(xvi) My corollary [to Moore's Law] states that the size of computer operating systems and software applications has doubled at the same pace as the operational speed of computer chips, soaking up the presumed power of the hardware and blunting its impact.

Do we have any better use for that power as consumers? Does it just mean we would have had internet based television sooner?
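Misa's corollary can be sketched as a toy model (my illustration, not Misa's formalism): if chip speed and software size both double each generation, the performance a user actually perceives never improves.

```python
# Toy model of Misa's corollary to Moore's Law: hardware speed and
# software size both double each generation, so effective speed is flat.
def effective_speed(generations: int) -> float:
    hardware_speed = 2 ** generations      # operational speed of chips
    software_size = 2 ** generations       # size of OS + applications
    return hardware_speed / software_size  # perceived performance

# Every generation the ratio stays at 1.0: the presumed power of the
# hardware is soaked up and its impact blunted.
print([effective_speed(n) for n in range(5)])  # → [1.0, 1.0, 1.0, 1.0, 1.0]
```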

(xvii) It is not so much that our technologies are changing especially quickly but that our sense of what is “normal,” about technology and society, cannot keep pace.
(xvii) These eras appear to be shortening: the Renaissance spanned nearly two centuries, while the twentieth century alone saw the eras of science and systems, modernism, war, and global culture. It is worth mentioning a quickening also in the self-awareness of societies – our capacities to recognize and comprehend change are themselves changing. . . . This self-awareness of major historical change is clearly an instance of “reflexive” modernization in sociologist Ulrich Beck's sense. In this way, then, these eras do capture something real in our historical experience.

Is Beck on the same level as Lacan? McLuhan, Ong, and others recognized this quickening of awareness.

1
Technologies of the Court
1450-1600

(1) Whether from the Medici family or from his numerous other courtly patrons, Leonardo's career-building commissions were not as a painter, anatomist, or visionary inventor, as he is typically remembered today, but as a military engineer and architect.

Who are Leonardos of our recent era? Technology billionaires?

(3) Even the well-known history of movable-type printing needs to be reexamined in the light of pervasive court sponsorship of technical books and surprisingly wide court demand for religious publications.

We are already clever enough to examine Internet history in light of the triangle. Hayles develops a more nuanced and less deterministic narrative than Kittler, whom she criticizes for focusing on war determining technological development.

The Career of a Court Engineer
(4-5) In addition to his work as an architect and sculptor, Brunelleschi was a pioneer in geometrical perspective, especially useful in capturing the three dimensionality of machines in a two-dimensional drawing. From Leonardo's notebooks it is clear that he mastered this crucial representational technique. . . . The multiple-view drawings, done in vivid geometrical perspective, are a signature feature of his notebooks.
(5) His notebooks from Milan are filled with drawings of crossbows, cannons, attack chariots, mobile bridges, firearms, and horses.
(8-9) While certainly not such exciting subjects as muskets or cannon, the varied means for attacking or defending a fortification were at the core of Renaissance-era warfare.
(9) It is often suggested that Leonardo chafed at having to design theatrical costumes, yet scholars have recently found evidence indicating that Leonardo also built moving stage platforms and settings – and perhaps even an articulated mechanical robot for these festivities.
(10) His fascination with self-acting mechanisms is also evident in Leonardo's many sketches of textile machines found in the surroundings of Milan.

Link Leonardo's fascination with autonomous artificial automata to von Neumann. (Here a timestamp operator would reveal a later reading.)

(13) The special character of technological creativity in the Renaissance resulted from one central fact: the city-states and courts that employed Leonardo and his fellow engineers were scarcely interested in the technologies of industry or commerce. Their dreams and desires focused the era's technologists on warfare, city building, courtly entertainments, and dynastic displays. . . . The intellectual resources and social dynamics of this technological community drew on and helped create Renaissance court culture.
(13) Foremost among these intellectual resources was the distinctive three-dimensionality and depth of Renaissance art and engineering.
(14) Leading Florentine artists such as Masaccio were already practicing something like linear perspective a decade or more before Alberti's famous treatise On Painting (1436).
(14) Dürer's most famous “object,” illustrating his 1525 treatise on geometry and perspective and reproduced widely ever since, was a naked woman on her back, suggesting that perspective was not merely about accurately representing the world but about giving the (male) artist power over it.

Throwing a bone to feminists and liberal studies?

(16) Leonardo even copied many of Alberti's distinctive phrases. It is Alberti's ideas we are reading when Leonardo writes that the perspective picture should look as though it were drawn on a glass through which the objects are seen.
(17) Close study of the two men's notebooks has revealed that Francesco was one source of designs for machines and devices that had previously been attributed to Leonardo alone.
(17-18) In a curious way, the presence of Leonardo's voluminous notebooks has helped obscure the breadth and depth of the Renaissance technical community, because researchers overzealously attributed all the designs in them to him. . . . Scholars believe that about one-third (6000 pages) of Leonardo's original corpus has been recovered; these papers constitute the most detailed documentation we have on Renaissance technology. . . . His notebooks record at least four distinct types of technical projects: his specific commissions from courtly patrons; his own technological “dreams,” or devices that were then impossible to build; his empirical and theoretical studies; and devices he had seen while traveling or had heard about from fellow engineers; as well as “quotations” from earlier authors, including Vitruvius.
(18) Perhaps the most distinctive aspect of Leonardo's career was his systematic experimentation, evident in his notebooks especially after 1500. . . . Some objects of Leonardo's systematic investigations were gears, statics, and fluid flow.
Gutenberg's Universe
(19) The first several generations of printers as well as the best-known early technological authors were, to a surprising extent, dependent on and participants in late-Renaissance court culture.
(19-20) Movable type was also “first” developed in the Far East, centuries before Gutenberg. . . . The first truly movable type is credited to Pi Sheng (1041-48), who engraved individual characters in clay, fired them, and then assembled them on a frame for printing.
(20) Islam permitted handwriting the words of Allah on paper but for many years forbade its mechanical printing. The first Arabic-language book printed in Cairo, Egypt, did not appear until 1825.
(22) Gutenberg's principal inventions were the adjustable mold for casting type and a suitable metal alloy for the type.
(22) Printing traveled quickly.
(22-23) The printing press made a little-known German theology professor named Martin Luther into a best-selling author and helped usher in the Protestant Reformation. . . . Yet printers sensed a huge market for his work and quickly made bootleg copies in Latin, German, and other vernacular languages to fill it. It was said that Luther's theses were known across Germany in two weeks and across Europe in a month. . . . Eventually, Luther himself hailed printing as “God's highest and extremest act of grace, whereby the business of the Gospel is driven forward.”

Compare to Busa's praise of magnetic tape.

(23) The Protestant movement's emphasis on individuals' reading the Bible themselves required a massive printing effort. Whatever their personal beliefs, printers thus had material reasons to support Protestantism.
(23) Although it is tempting to see printers as proto-capitalists – owing to their strong market orientation and substantial capital needs – their livelihood owed much to the patronage and politics of the court system.
(25) Plantin's massive output suggests the huge scale of book production at the time. In the first fifty years of printing (1450s-1500) eight million books were produced in Europe. . . . This economy of scale sharply reduced the cost of books, which meant that one scholar could have at hand multiple copies from several scholarly traditions, inviting comparison and evaluation. Eisenstein writes, “Not only was confidence in old theories weakened, but an enriched reading matter also encouraged the development of new intellectual combinations and permutations.” In this way, the availability of vastly more and radically cheaper information led to fundamental changes in scholarship and learning.

Print humanities were born. Compare to relative scarcity and then proliferation of electronic computing machinery.

Technology and Tradition
(26) Transfer of technology before the Renaissance could be hit-or-miss. Machines invented in one time or place might well need to be rediscovered or even reinvented. Indeed, something very much like this occurred after the great technological advances of Song China (960-1279).
(26) Yet these pioneering Chinese technologies were not reliably recorded with the rigorous geometrical perspective that allowed Renaissance engineers to set down their ideas about the crucial workings of machines.

Importance of having technological tools to reflect upon technology.

(27) Eugene Ferguson, a leading engineer-historian, has brilliantly shown how quickly technical drawings might be corrupted, even in the West.
(28) In these terms a permanent and cumulative tradition in technology, enabled by the invention of printing and perspective, appeared first in central Europe's mining industry.
(29) Each of these three authors [Biringuccio, Agricola, Ercker] praised the values of complete disclosure, precise description, and openness often associated with the “scientific revolution.” These books detailed the processes of mining, smelting, refining, founding, and assaying. Biringuccio and Agricola used extensive illustrations to convey the best technical practices of their time.

Value of open standards, technologies and licenses.

(31) The scientific revolution was also surprisingly dependent on printing technology and courtly patronage networks.
(32) The desires and dreams of Renaissance courts and city-states defined the character of the era's technology and much of the character of its culture.

Manovich's two cultures. Consider the microcomputer revolution as desires and dreams of late American capitalism.

2
Techniques of Commerce
1588-1740

(34) The age of commerce, anticipated in Spain and Portugal as well as in China and India, found its fullest expression during the seventeenth-century Golden Age of the Dutch Republic.
Technology and Trade
(37) The emergence of specialized ship designs in the Netherlands was another early signal that the Dutch understood how to bring technology and trade together in the pursuit of commerce.
(42) Impressed with how multiple-share ownership helped raise money and spread the risk of losses, the Dutch took the practice much further.
Creating Global Capitalism
(43) The Dutch – through their East India Company in the Pacific and West India Company in the Atlantic, coupled with the extensive trading in Europe and Africa – in effect created the first global economy.
(43) The commodity traders' guild began publishing weekly lists of prices in 1585. Within a few years, the Amsterdam commodity exchanges – for grain, salt, silks, sugar, and more – had surpassed their regional rivals and become a set of global exchanges.
(45) More to the point, tulip trading embodied several of the classic Dutch financial techniques, including futures contracts, commodity pricing, and multiple-share ownership.
(48) On the southeast coast of India and on the innumerable islands of what is now Indonesia, each of the trading countries sought to establish trading alliances; and when these alliances were betrayed, they tried unarmed trading “factories” (warehouse-like buildings where “factors” - traders – did business).

Interesting, unexpected etymology of factories.

(49) While the VOC [Verenigde Oostindische Compagnie] dealt with spices and cotton, the West India Company traded in slaves and sugar.

Little mention of the ethics of slave trade. See multimedia The Corporation. He is more interested in the difference between overall technological modes, ways of being, Tart's states, “major alterations in the way the mind functions” (1986, 4).

“The Great Traffic”
(51-52) Dutch preeminence came through the targeted processing and selective reexporting of the traded materials. . . . Indeed, high wages, relatively low volumes, and high-quality production typified the traffics, in sharp contrast with early industrial technologies, which emphasized low wages, high volumes, and low-quality production.

Compare Misa's differentiation between Dutch precision and British sloppy massive scale to McConnell's differentiation between systematic engineering and gold rush programming styles.

(55) Not only had Dutch traders captured commercial control over many key raw materials, including Spanish wool, Turkish mohair yarns, Swedish copper, and South American dyestuffs; the “traffic” system had also erected a superstructure of processing industries that added value to the flow of raw materials. The Dutch conditions of high wages and labor scarcity put a premium on mechanical innovation, the fruits of which were protected by patents. Another economic role taken on by the Dutch state (at the federal, state, and municipal levels) was the close regulation of industry in the form of setting standards for quality and for the packaging of goods.
(57) While choosing, developing, and using technologies with the aim of creating wealth had been an undercurrent before, this era saw the flourishing of an international (if nonindividual) capitalism as a central purpose for technology. It is really a set of wealth-creating technologies and techniques that distinguishes the Dutch commercial era.

Consider alongside his evaluation of Renaissance era technology. Does Misa apply Kuhn's methodology to technology?

3
Geographies of Industry
1740-1851

(59) Unprecedented growth in the cotton, iron, and coal industries during the decades surrounding 1800, culminating in the steam-powered factory system, powered a self-sustaining “take-off” in the British economy.
The First Industrial City: London
(65) Beer brewing affords a revealing window into industrial London while illustrating the links between industry and sanitation, consumption, and agriculture. . . . Reducing costs and increasing output – rather than enhancing quality, as in Dutch commerce – was the focus of technology in the industrial era.
(66) The competition between brewers to build ever-larger vats waned after 1814, however, when a 7,600-barrel vat at the Horse Shoe Brewery burst open and flooded the neighborhood, killing eight persons “by drowning, injury, poisoning by the porter fumes or drunkenness.”

An amusing fact.

(67) The porter brewers pioneered industrial scales of production and led the country in the capitalization of their enterprises.
(68) Brewers indirectly fixed a key term of measurement born in the industrial era, since Watt had the “strong drayhorses of London breweries” in mind when he defined “horsepower” at 33,000 foot-pounds per minute.
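Watt's drayhorse figure converts neatly to the modern mechanical horsepower (a quick check of my own, using the standard conversion 1 ft·lbf ≈ 1.3558 J):

```python
# Watt's definition: 1 horsepower = 33,000 foot-pounds per minute.
FT_LBF_TO_JOULES = 1.3558179483314004  # standard conversion factor

ft_lbf_per_second = 33_000 / 60  # 550 ft·lbf per second
hp_in_watts = ft_lbf_per_second * FT_LBF_TO_JOULES
print(round(hp_in_watts, 1))  # → 745.7, the modern mechanical horsepower
```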
(69) These ancillary industries have not received the attention they deserve, for they are key to understanding how and why industrial changes became self-sustaining and cumulative.

Misa lays out opportunities for future scholarship, part of the value of this work.

(70) By the early nineteenth century perhaps half of all London pubs were tied to brewers through exclusive deliveries, financing, or leasing.
(73) By 1825 Maudslay and Bramah were among the London engineers hailed for their use of specialized machine tools to replace skilled handcraftsmanship.
Shock City: Manchester
(77) Early Arkwright machines were small, handcranked devices with just four spindles. The death blow to home spinning came when Arkwright restricted licenses for his water-frame patent to mills with 1,000 or more spindles. . . . Arkwright's mills – with their low wages and skills, their high-volume production of lower-grade goods, and their extensive mechanization – embodied core features of the industrial era.

In addition to ruthless protection of competitive advantage by restricting licenses: an early Microsoft?

(79) While the first generation of them had built textile machines and managed textile factories, the midcentury machine builders – the generation of London transplants – focused on designing, building, and selling machine tools.
(82) For Engels, Manchester was ground zero for the industrial revolution (he wrote specifically of “industriellen Umwälzung”).
(82) His real object was to shock his readers with visceral portraits of the city's horrible living conditions.

Horrible living conditions.

(83) Marx, with no firsthand industrial experience of his own, took Engels' description of Manchester as the paradigm of capitalist industry. Neither of them noticed a quite different mode of industry forming in Sheffield.
Region for Steel: Sheffield
(84) Sheffield was internationally known as a center for high-quality steel and high-priced steel products. . . . Not Manchester-style factories but networks of skilled workers typified Sheffield's industry.

Like the idealized network of small businesses? But then corrupted by scale. Nice to see remediated in Wired magazine stories.

(86) It is crucial to understand that the factory system so important in Manchester was absent in Sheffield.
(87) Some firms did nothing but coordinate such “hire-work” and market the finished goods, at home or overseas. These firms had the advantages of low capital, quick turnover, and the flexibility to “pick and choose to fit things in with whatever you were doing.”
(87-88) In the latter part of the nineteenth century these large steel mills and oversize forging shops symbolized a second generation of Sheffield's heavy industry.
(91) Steam not only directly killed many grinders, through dangerous working conditions, but also indirectly brought the deaths of many who crammed themselves and their families into the poorest central districts of industrial cities.

The indirect danger of steam technology. Would realization of this kill bourgeois interest in Steampunk?

(91) Sheffield's dire sanitary conditions resembled those of London or Manchester for much the same reason: the city's densely packed population lacked clean water.
(92) The geographies of industry surveyed in this chapter – multidimensional urban networks in London, factory systems in Manchester, and sector-specific regional networks in Sheffield – clinch the argument that there were many “paths” to the industrial revolution.
(93) Workers in steam-driven occupations, whether in London, Manchester, Sheffield, or the surrounding regions, were less likely to be in the country, to eat fresh food, to drink clean water, and (especially if female) to be skilled and have reasonable wages.

4
Instruments of Empire
1840-1914

(97) To a striking extent, inventors, engineers, traders, financiers, and government officials turned their attention from blast furnaces and textile factories at home to steamships, telegraphs, and railway lines for the colonies.
Steam and Opium
(101) Accurately mapping the Ganges in the latter eighteenth century had been a necessary first step in transforming the vague territorial boundaries assumed by the company into a well-defined colonial state. To this end one could say that the first imperial technology deployed on the Ganges was James Rennell's detailed Map of Hindoostan.
(102-103) The opium war began when China took determined steps to ban the importation of the destructive substance, and the British government, acting on the demand of Britain's sixty trading firms with business in China, insisted on maintaining free trade in opium and dispatched a fleet to China to make good its demands.
Telegraphs and Public Works
(104) In the industrializing countries of Western Europe and North America, telegraph systems grew up alongside railroads. Telegraph lines literally followed railway lines, since telegraph companies typically erected their poles in railroad right-of-ways.
(105) Telegraph lines were so important for imperial communication that in India they were built in advance of railway lines.
(107) Quick use of the telegraph saved not merely the British in Punjab but arguably the rest of British India as well. Most dramatic was that the telegraph made possible a massive troop movement targeted at the most serious sites of rebellion.
(108-109) By the time of the 1857 Mutiny, British rule in India had become dependent on telegraphs, steamships, roads, and irrigation works; soon to come was an expanded campaign of railway building prompted by the Mutiny itself. . . . The colonial government in India had no choice but to begin large-scale educational programs to train native technicians.
Railway Imperialism
(113) (Fig 4.4 World Leaders in Railways, 1899.)

Interesting graph for 1899; it almost looks like a USA Today-style graph.

(127) Even today one can discern a shadow of the imperialist era in railroad maps of North America (look carefully at Canada, the western United States, and Mexico), in the prestige structure of technical education, and in the policy preferences of the orthodox development agencies in the United States and Europe.
(127) In this respect, we can see that imperialism was not merely a continuation of the eras of commerce and industry; rather, to a significant extent, imperialism competed with and in some circumstances displaced industry as the primary focus of technologists.

5
Science and Systems
1870-1930

(128) By transforming curiosities of the laboratory into consumer products, through product innovation and energetic marketing schemes, science-based industry helped create a mass consumer society. A related development was the rise of corporate industry and its new relationships with research universities and government bureaus.
(129) In these same decades technology took on its present-day meaning as a set of devices, a complex of industry, and an abstract society-changing force in itself.

Important for our definition of technology.

The Business of Science
(130) The chemical structures of these early dyes were unknown at the time. It was German chemists – based in universities and with close ties to industry – who deciphered their chemical structures and set the stage for a science-based industry.
(133) “Mass production methods which dominate modern economic life have also penetrated experimental science,” the chemist Emil Fischer stated in his Nobel Prize lecture in 1902. “Consequently the progress of science today is not so much determined by brilliant achievements of individual workers, but rather by the planned collaboration of many observers.” Duisberg put the same point more succinctly: “Nowhere any trace of a flash of genius.”
(134) In World War I, popularly known as the chemist's war, chemists were directly involved in poison gas manufacture.
(135) The entanglement of the German chemical industry with the Third Reich also has much to do with the system-stabilizing innovation and the corporate and political forms needed for its perpetuation. . . . With all these heavy investments, Farben's executives felt they had little choice but to conform with Hitler's mad agenda after he seized power in 1933. Not Nazis themselves – one-fourth of the top-level supervisory board were Jews, until the Aryanization laws of 1938 – they nevertheless became complicit in the murderous regime.
Flashes of Genius
(136) The singular career of Thomas Edison aptly illustrates the subtle but profound difference separating system-originating inventions from system-stabilizing ones.
(139) Edison wanted his electric lighting system to be cost competitive with gas lighting and knew that the direct-current system he envisioned was viable only in a densely populated urban center. Using Ohm's and Joule's laws of electricity allowed Upton and Edison to achieve these techno-economic goals.
(140) When Edison tested his system in January 1881 he used a 16-candlepower bulb at 104 volts, with resistance of 114 ohms and current of 0.9 amps. The U.S. standard of 110 volts thus has its roots in Edison's precedent-setting early systems.
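Edison's test figures are internally consistent with the Ohm's and Joule's laws mentioned above (a quick check of my own; the small discrepancy reflects rounding in the reported values):

```python
# Edison's January 1881 test bulb: 104 volts, 114 ohms, 0.9 amps.
current = 0.9     # amps
resistance = 114  # ohms
voltage = 104     # volts (reported)

# Ohm's law: V = I * R
predicted_voltage = current * resistance
print(round(predicted_voltage, 1))  # → 102.6, within ~1.5 V of the reported 104 V

# Joule's law: power dissipated by the bulb, P = V * I
power_watts = voltage * current
print(round(power_watts, 1))  # → 93.6 W for the 16-candlepower lamp
```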
Battle of the Systems
(143) Edison was wary of the energy losses of transformers, the high capital costs of building large AC stations, and the difficulties of finding insulators that could safely handle 1,000 volts.
(143) Arc lighting for streets, AC incandescent systems for smaller towns, AC motors for factories, and the pell-mell world of street railways were among the lucrative fields that Edison's diagnosis overlooked.
(144) In the mid-1880s Thomson turned his inventive efforts on incandescent lighting and AC systems. His other notable inventions include electric welding, street railway components, improved transformers, watt meters, and induction motors. These inventions were among the necessary technical components of the universal system of the 1890s.
Tenders of Technological Systems
(148) Edison fought it, Thomson denied it, and Insull embraced it: a new pattern of technological change focused on stabilizing large-scale systems rather than inventing wholly new ones.
(148) Industrial scientists and science-based engineers stabilized the large systems by striving to fit into them and, most importantly, by solving technical problems deemed crucial to their orderly expansion. Neither of these professions existed in anything like their modern form as recently as 1870.
(150) Industrial research became a source of competitive advantage for the largest firms, including General Electric, AT&T, and General Motors. . . . Independent inventors, formerly the nation's leading source of new technology, either were squeezed out of promising market areas targeted by the large science-based firms or went to work for them solving problems of the companies' choosing.
(151) The industrial orientation of electrical engineering at MIT from around 1900 into the 1930s contrasts markedly with its more scientific and military orientation during and after the Second World War.
(155) Hazen's work on the “network analyzer” began with his 1924 bachelor's thesis under Vannevar Bush. Bush, a pioneer in analog computing, was working for [Dugald] Jackson's consulting firm studying the Pennsylvania-based Superpower scheme. . . . By 1929 the measuring problems were solved and GE's Doherty approved the building of a full-scale network analyzer.
(155) Built jointly by GE and MIT and physically located in the third-floor research laboratory in MIT's Building 10, the network analyzer was capable of simulating systems of great complexity.
(156-157) Synthetic dyes, poison gases, DC light bulbs, AC systems, and analog computers such as Hazen's network analyzer constituted distinctive artifacts of the science-and-systems era. . . . The most important pattern was the underlying sociotechnical innovations of research laboratories, patent litigation, and the capital-intensive corporations of science-based industry.
(157) A neat contrast can be made of the British cotton-textile industry that typified the first industrial revolution and the German synthetic dye industry and American electrical industry that together typified the second.
(157) The presence of the financiers, corporations, chemists, and engineers produced a new mode of technical innovation and, not coincidentally, a new direction in social and cultural innovation. The system-stabilizing mode of technical innovation – “nowhere any trace of a flash of genius” – was actively sought by financiers. . . . The system-stabilizing innovations, with the heavyweights of industry and finance behind them, also created new mass-consumer markets for electricity, telephones, automobiles, household appliances, home furnishings, radios, and much else.

6
Materials of Modernism
1900-1950

(158) Modernism in art and architecture during the first half of the twentieth century can be best understood as a wide-ranging aesthetic movement, floated on the deeper currents of social and economic modernization driven by the science-and-systems technologies.
Materials for Modernism
(160) The materials that modernists deemed expressive of the new era – steel, glass, and concrete – were not new.
(163) Glass through most of the nineteenth century was in several ways similar to steel before Bessemer. It was an enormously useful material whose manufacture required much fuel and many hours of skilled labor and whose application was limited by its high cost.
Manifestos of Modernity
(168) Critical to the development of the modern architectural style were the interactions among three groups: the Futurists in Italy, who gave modernism an enthusiastic technology-centered worldview; the members of de Stijl in the Netherlands, who articulated an aesthetic for modern materials; and the synthesis of theory and practice in the Bauhaus in Germany.
(171) Marinetti's provocative avant-garde stance, frank celebration of violence, and crypto-revolutionary polemics landed the Futurists squarely in the middle of postwar fascism.
(173) The task of the artist was to derive a style – or universal collective manner of expression – that took into account the artistic consequences of modern science and technology.
(176) The durable contribution of de Stijl, then, was not merely to assert, as the Futurists had done, that modern materials had artistic consequences, but to identify specific consequences and embed these in an overarching aesthetic theory.
Ironies of Modernism
(184-185) The Stuttgart exposition of 1927 was the first salvo in a wide-ranging campaign to frame a certain interpretation of modernism. It was to be rational, technological, and progressive; historical references and ornamentation were strictly forbidden. In 1932, the Museum of Modern Art in New York gave top billing to its “International Style” show, which displayed and canonized the preponderantly European works representing this strain of modernist architecture. . . . The influential teaching of Bauhaus exiles Gropius, Moholy-Nagy, and Mies van der Rohe in Boston and Chicago raised a generation of U.S.-trained architects and designers who imbibed the modern movement directly from its masters. In the 1950s, in architecture at least, the International Style, or Modern Movement, became a well-entrenched orthodoxy.
(186) The German government agency charged with rationalizing workshops and factories also worked closely with several women's groups to rationalize the household.
(189) In examining how “technology changes culture” we see that social actors, often asserting a technological fundamentalism that resonates deeply in the culture, actively work to create aesthetic theories, exemplary artifacts, pertinent educational ventures, and broader social and political movements that embed their views in the wider society.

Misa focuses on what Manovich calls cultural conventions, saying little even in the final chapters about the technological aesthetics that Manovich attributes to the conventions of software.

7
The Means of Destruction
1936-1990

(190) No force in the twentieth century had a greater influence in defining and shaping technology than the military. . . . Lamenting the decline of classic profit-maximizing capitalism, industrial engineer Seymour Melman termed the new economic arrangement as contract-maximizing “Pentagon capitalism.” During these years of two world wars and the Cold War, the technology priorities of the United States, the Soviet Union, and France, and to a lesser extent England, China, and Germany, were in varied ways oriented to the “means of destruction.”
(191) Such promising technologies as solar power, analog computers, and machinist-controlled machine tools languished when (for various reasons) the military backed rival technical options – nuclear power, digital computers, and computer-controlled devices of many types – that consequently became the dominant designs in their fields.

An interesting position on technological determinism.

A War of Innovation
(192) It may seem odd to distinguish between the two world wars, linked as they were by politics and economics, but in technology the First World War was not so much a war of innovation as one of mass production.
(193) Not merely a military tactic, blitzkrieg was more fundamentally a “strategic synthesis” that played to the strength of Germany's superior mobility technologies, especially aircraft and tanks, while avoiding the economic strain and social turmoil of a sustained mobilization.
(195) Germany had neither the enriched uranium, the atomic physicists, nor the governmental resources to manufacture an atomic bomb.
“Turning the Whole Country into a Factory”
(195-196) If the First World War is known as the chemists' war owing to military use of synthetic explosives and poison gases, it was the Manhattan Project that denominated the Second World War as the physicists' war. . . . In reality, Los Alamos served as the R&D center and assembly site for the bombs. The far greater part of the project was elsewhere, at two mammoth, top-secret factory complexes in Tennessee and Washington State.
(196) After several governmental committees considered its prospects, the project came to rest in the Office of Scientific Research and Development, or OSRD, a new government agency headed by MIT engineer Vannevar Bush.

Bush, who gets so much attention in digital media studies.

(197) Although the point is not frequently emphasized, it was entirely fitting that Roosevelt assigned the construction phase of the bomb project to the Army Corps of Engineers and that the Army assigned command over the Manhattan Engineering District to Brigadier General Leslie Groves, who had been the officer in charge of building the Pentagon complex.
(198) The crucial task at Oak Ridge was to produce enough enriched uranium, somewhere between 2 and 100 kilograms, no one knew precisely how much, to make a bomb.
(204) Many commentators, even Eisenhower and Churchill, miss the crucial point that the two bombs dropped on Japan were technologically quite distinct: the Hiroshima bomb used Oak Ridge's uranium while the Nagasaki bomb used Hanford's plutonium.
(206-207) One hesitates to put it this way, but the two bombs dropped on Japan appear to have been “aimed” also at the U.S. Congress. After all, there were two hugely expensive factories that needed justification. . . . Bohr's observation that the atomic project would transform “the whole country into a factory,” true enough in the obvious physical and organizational sense, may also be insightful in a moral sense as well.
(208) Nautilus, it turned out, was a precedent for more than just the U.S. Navy, which in time fully matched the other military branches with its nuclear-powered submarines capable of launching nuclear missiles.
(210) The enduring legacy of the Manhattan Project above and beyond its contribution to the atomic power effort was its creation of a nuclear weapons complex that framed years of bitter competition between the United States and the Soviet Union.
(210) The cost from 1940 to 1986 of the U.S. nuclear arsenal is estimated at $5.5 trillion. No one knows the fair dollar cost of the former Soviet Union's nuclear arsenal, but its currently crumbling state – nuclear technicians have in effect been told to find work elsewhere, while security over uranium and plutonium stocks is appallingly lax – constitutes arguably the foremost danger facing the planet today.
Command and Control: Solid-State Electronics
(211) Yet, together, the massive wartime efforts on radar, proximity fuzes, and solid-fuel rockets rivaled the atom bomb in cost. . . . Even as its radar aided the Allied war effort, the Rad Lab [Radiation Laboratory at MIT] sowed the seeds for three classic elements of the Cold War military-industrial-university complex: digital electronic computing, high-performance solid-state electronics, and mission-oriented contract research.
(211-212) Vacuum tubes were sensitive only to lower frequency signals, so when the radar project's leaders decided to concentrate on the microwave frequency (3,000 to 30,000 megahertz), they needed an electronic detector that could work in these very high frequencies. . . . Much of the solid-state physics done during the war, then, focused on understanding these semiconductor materials and devising ways to purify them.
(213) In the transistor story, as in that of the Shippingport nuclear reactor, we see how the tension between military and commercial imperatives shaped the emergence of a technology that today is fundamental to our society.
(214) Indeed, instead of classifying transistors, the armed services assertively publicized military uses for them. . . . Each [Bell System] licensee brought home a two-volume textbook incorporating material from the first symposium. The two volumes, composing Transistor Technology, became known as the bible of the industry. They were originally classified by the government as “restricted” but were declassified in 1953. . . . A third volume in the textbook series Transistor Technology resulted from a Bell symposium held January 1956 to publicize its newly invented diffused base transistor. . . . For several years Bell sold these high-performance diffused transistors only to the military services.
(215) The Army Signal Corps also steered the transistor field through its “engineering development” program, which carried prototypes to the point where they could be manufactured.
(215) Bell Laboratories had not forgotten its telephone system, but its commercial applications of transistors were squeezed out by several large high-priority military projects.
(216) The integrated circuit was also to a large degree a military creation.
(216-217) Across the 1950s and 1960s, then, the military not only accelerated development in solid-state electronics but also gave structure to the industry, in part by encouraging a wide dissemination of (certain types of) transistor technology and also by helping set industrywide standards. . . . These competing demands probably delayed the large-scale application of transistors to the telephone system at least a half-dozen years (from 1955 to the early 1960s).
Command and Control: Digital Computing
(217) Code-breaking, artillery range-finding, nuclear weapons designing, aircraft and missile controlling, and antimissile warning were among the leading military projects that shaped digital computing in its formative years, from the 1940s through the 1960s.

Impact of military agenda on digital computing.

(219) Forrester wanted Whirlwind to become another megaproject like the Radiation Laboratory or Manhattan Project.
(221) At the center of this fantastic scheme was Forrester's Whirlwind, or more precisely fifty-six of his machines. . . . With participation in SAGE, IBM gained a healthy stream of revenues totaling $500 million across the project's duration. Fully half of IBM's domestic electronic data-processing revenues in the 1950s came from just two military projects: SAGE and the “Bomb-Nav” analog computer for the B-52 bomber.
(221) As important as this revenue stream was the unparalleled exposure to state-of-the-art computing concepts and the unconstrained military budgets that permitted the realization of those concepts.
(222) Even though the commercial success of IBM's System 360 made computing a much more mainstream activity, the military retained its pronounced presence in computer science throughout the 1960s and beyond. . . . The IPTO [the Pentagon's Advanced Research Projects Agency Information Processing Techniques Office] was far and away the nation's largest funder of advanced computer science from its founding in 1962 through the early 1980s. . . . Among the fundamental advances in and applications of computer science funded by the IPTO were time-sharing, interactive computer graphics, and artificial intelligence. J.C.R. Licklider, head of the IPTO program in the early 1960s, also initiated work on computer networking that led, after many twists and turns, to the Internet.

Bush, Licklider, Engelbart.

(223) A 1964 RAND Corporation report, “On Distributed Communications,” proposed the theoretical grounds for a rugged, bombproof network using “message blocks” - later known as “packet switching” - to build a distributed communications system. . . . These concepts became the conceptual core of the Internet.
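The "message blocks" idea Misa describes can be sketched in a few lines. This is an illustrative toy, not RAND's or ARPA's design: a message is split into independently deliverable blocks that may arrive in any order, and the receiver reassembles them by sequence number. The function names and block size are my own invention.

```python
# Toy sketch of the "message blocks" / packet-switching idea:
# split a message into numbered blocks, deliver them in any order,
# and reassemble by sequence number at the destination.
import random

def to_packets(message: str, size: int = 8):
    """Split a message into (sequence_number, payload) blocks."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message regardless of arrival order."""
    return "".join(payload for _, payload in sorted(packets))

packets = to_packets("the conceptual core of the Internet")
random.shuffle(packets)  # the network may deliver blocks out of order
print(reassemble(packets))  # prints the original message intact
```

The point of the sketch is the resilience Misa notes: because each block carries its own sequence number, no single route or ordering is privileged, so the network survives the loss of any particular path.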
(223) Through the military-dominated era there was an unsettling tension between the West's individual-centered ideology and its state-centered technologies.
(224) Together, these military endeavors were not so much an “outside influence” on technology as an all-pervading environment that defined what the technical problems were, how they were to be addressed, and who would pay the bills. While closed-world, command-and-control technologies typified the military era, the post-Cold War era of globalization has generated more open-ended, consumer-oriented, and networked technologies.

8
Toward Global Culture
1970-2001

(227) Whatever the economic and political consequences of globalization, the threat of cultural homogenization concerns many observers.
(227) While mindful of the possibilities of convergence, I believe there is greater evidence for a contrary hypothesis.
(229) The “divergence hypothesis” is also consistent with what we have learned from earlier eras.
The Third Global Economy
(229) Our present-day global economy is not the first or second global economy we have examined in this book, but the third. The first was in the era of commerce.
(229) A second global economy developed in the 1860s and lasted until around the First World War, overlapping with the era of imperialism.
(231) Since around 1970 there has been a resurgence of global forces in the economy and in society, but who can say how long it will last.
Fax Machines and Global Governance
(232) One might say that in the United States the military market displaced the consumer market, while in postwar Japan it was the other way around. The structure of the global economy can in part be traced to the different paths taken by each nation's electronics industry.
(234) The CCITT, or Comite Consultatif International Telegraphique et Telephonique, was the leading international standards-setting body for all of telecommunications beginning in the 1950s. Its special strength was and remains standards setting by committee.
(235) It was CCITT's success with the 1980 standards that made facsimile into a global technology – and relocated the industry to Japan. . . . The achievement of worldwide standards, digital compression, and flexible handshaking, in combination with open access to public telephone systems, created a huge potential market for facsimile.
(236) This network of students and teachers, along with some journalists and government officials, is notable not only for creatively using fax technology but also for explicitly theorizing about their culture-making use of technology.
(236) The idea of using fax machines for building European identity and youth culture originated with the Education and Media Liaison Center of France's Ministry of Education, which was in the middle of a four-year project to boost public awareness of telematics and videotext. (France's famous Minitel system came out of this same context of state support for information technology.)
McWorld or McCurry?
(238) “McWorld” epitomizes the cultural homogenization and rampant Americanization denounced by many critics of globalization. “McDonaldization” refers to a broader process of the spread of predictability, calculability, and control – with the fast-food restaurant as the present-day paradigm of Max Weber's famous theory of rationalization.
(240) The presence of McDonald's in the conflict-torn Middle East is good news to Tom Friedman, the author of The Lexus and the Olive Tree (1999). In his spirited brief on behalf of globalization, Friedman frames the “golden arches theory of conflict prevention.”
(245) McDonald's corporate strategy of localization not only accommodates local initiatives and sensibilities but also, as the company is well aware, blunts the arguments of its critics.
Internet Culture
(249) Overall, we can discern three phases in the Internet story: the early origins, from the 1960s to mid-1980s, when the military services were prominent; a transitional decade beginning in the 1980s, when the National Science Foundation became the principal government agency supporting the Internet; and the commercialization of the Internet in the 1990s, when the network itself was privatized and the World Wide Web came into being.
(250) The internet conception resulted from an intense collaboration between Vinton Cerf, a Stanford computer scientist who had helped devise the ARPANET protocols, and Robert Kahn, a program manager at ARPA. In 1973 they hit upon the key concepts – common host protocols within a network, special gateways between networks, and a common address space across the whole – and the following year published a now-classic paper, “A Protocol for Packet Network Intercommunication.” Although this paper is sometimes held up as embodying a singular Edisonian “eureka moment,” Cerf and Kahn worked very closely for years with an international networking group to test and refine their ideas.
(254) A good example of how the Internet gained its seemingly effortless “global” character is the so-called domain-name system, or DNS. . . . With the spread of the domain-name system, any single user can be addressed with one simple address. More important, the DNS established an address space that is massively expandable and yet can be effectively managed without any single center.
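The decentralized expandability Misa points to comes from DNS's hierarchical delegation: each zone answers only for its own labels and delegates everything below. A minimal sketch, with made-up names and addresses (not real DNS data or the real protocol):

```python
# Toy sketch of hierarchical name resolution: each nested dict is a
# "zone" that manages only its own labels, so the name space grows
# without any single center. Names and addresses below are invented.
ROOT = {
    "edu": {"example": {"www": "203.0.113.10"}},
    "com": {"example": {"www": "203.0.113.20"}},
}

def resolve(name: str, zone=ROOT):
    """Walk the labels right to left, delegating one zone at a time."""
    node = zone
    for label in reversed(name.split(".")):
        node = node[label]  # each zone hands off to its child zone
    return node

print(resolve("www.example.com"))  # prints "203.0.113.20"
```

Adding a new domain means editing only one zone's table, which is exactly why the address space can expand massively with no central registry of every name.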
(255) The Web is, at least conceptually, nothing more than a sophisticated way of sending and receiving data files (text, image, sound, or video).
(257) From the start, Berners-Lee built in to the Web a set of global and universal values. These values were incorporated into the design at a very deep level.
(257) The second goal, dependent on achieving the first goal of human communication through shared knowledge, is that of machine-understandable information.
(258) These examples – worldwide financial flows, fax machines, McDonald's, and the Internet – taken together indicate that globalization is both a fact of contemporary life and a historical construction that emerged over time.
(259) Indeed, the certainty during the 1990s that globalization would continue and expand, seemingly without borders, ended with the attacks on 11 September 2001. Whatever one makes of the resulting “war on terrorism,” it seems inescapable that the nation-state is, contrary to the globalizers' utopian dreams, alive and thriving as never before. . . . A national security-oriented technological era may be in the offing. It would be strange indeed if the September 11th attackers – acting in the name of antimodern ideologies – because of the Western nations' national security-minded and state-centered reactions, brought an end to this phase of global modernity.

Misa suggests a post-globalization era resulting from the war on terror.

9
The Question of Technology

Science and Economics
(261) However, the centrality of science to technology is often overstated. Scientific theories had little to do with technological innovation during the eras of industry, commerce, and courts.
(263) Much of the frank resentment today aimed at the World Bank, International Monetary Fund, and World Trade Organization stems from their conceptual blindness to the negative aspects of technology in social and cultural change.
Variety and Culture
(265) A more subtle and yet more pervasive example of technology's interactions with the goals and aims of society resides in the process of technical change.
(267) Power does flow from the end of a gun; Europeans' deadly machine guns in the colonial wars proved that point. But there is an important dimension of power that resides in things, in the built world, and in the knowledge about that world that people have access to or are excluded from.
(267) The conceptual muddle surrounding these questions of technology transfer can be cleared up with Arnold Pacey's useful notion of “technology dialogue,” an interactive process which he finds is frequently present when technologies successfully cross cultural or social barriers.

Pacey. How about Feenberg?

Displacement and Change
(268) Displacement occurs when a set of technology decisions has the effect of displacing alternatives or precluding open discussion about alternatives in social development, cultural forms, or political arrangements.
(269) For roughly fifty years, a certain technical perspective on modern architecture displaced alternative, more eclectic approaches.
(269) Displacement, then, is how societies, through their decisions about technologies, orient themselves toward the future and, in a general way, direct themselves down certain social and cultural paths rather than other paths.
(270) Can technologies be used by nondominant actors to advance their alternative agendas?
(271) A second reason for looking closely at the technology-power nexus is the possibility that non-dominant groups in society will effectively mobilize technology.
(272) The new diagnosis coming from ecological modernization is that dealing effectively with the environmental crisis will require serious engagement with technology.
Disjunctions and Divisions
(273) Nevertheless, it is a mistake to follow the commonplace conviction that technology by itself “causes” change, because technology is not only a force for but also a product of social and cultural change.

Misa's main point, countering a naïve perspective of technological determinism. Also need to broaden understanding of how modern technology interacts with other cultures.

(274) This internal disjunction is compounded by the external division between the Moslem-Arab worldview and the Western worldview, made evident by the September 11th attacks.
(275) It is an especially pressing concern that scholars and citizens in the West know all too little about the details and dynamics of how modern technologies are interacting with traditional social forms. This is true not only for the Middle East, Asia, and Africa but also for native peoples in North and South America.


Misa, Thomas J. (2004). Leonardo to the internet: Technology & culture from the Renaissance to the present. Baltimore: Johns Hopkins University Press.


Friday, June 17, 2011

Notes for ECrit

Notes for Marcel O'Gorman E-Crit: Digital Media, Critical Theory and the Humanities

INTRODUCTION

New Media Calls for New Majors

(xiii) E-Crit is an interdisciplinary program that combines English, Communications, Computer Information Systems, and Fine Art.

(endnote 1) the term 'new media' is historically determined.

(xiii-xiv) E-Crit was born out of the Frankfurt School / poststructuralism sensibility of two of my colleagues and their students, who positioned resistance and vigilant critique as the cornerstones in a new media studies curriculum that opposes the compartmentalization of knowledge. .. The goal, then, is to position discourse in such a way that it can play a formative role in reshaping the academic apparatus.

(xiv) The question of the marketability of the humanities is central to this book, and I draw heavily on the work of John Guillory, whose Cultural Capital provides a realistic analysis of the state of the humanities in techno-bureaucratic culture - a culture whose 'fetishization of “rigor”' has led to a 'veritable crisis in the humanities' (ix). .. I think it's time to take a harder look at how disciplines rooted in the study and preservation of printed texts can remain relevant and viable in a digital, picture-oriented culture.

(xv) One way of explaining this sense of disappointment in the 'failure of theory' is to investigate how attempts to apply deconstruction toward the materialization of revolutionary scholarly practices have been largely ineffectual. .. somewhere in the early 1990s, the major tenets of deconstruction (death of the Author, intertextuality, etc.) were displaced into technology, that is, hypertext. Or to put it another way, philosophy was transformed, liquidated even, into the materiality of new media. This alchemical transformation did not result in the creation of new, experimental scholarly methods that mobilize deconstruction via technology, but in an academic fever for digital archiving and accelerated hermeneutics, both of which replicate, and render more efficient, traditional scholarly practices that belong to the print apparatus.

Recall the readings for A Companion to Digital Humanities.

(xv-xvi) Shaping a new apparatus also involves more than a scholarly remediation of printed texts. .. A large-scale institutional change of the type I am envisioning can only come about with a careful and deliberate implementation that targets not only discourse of scholars, but that of students and classrooms (including ergonomics), administrators and buildings (including architecture), campuses and cities (including urban planning).

(xvi) A large portion of this study involves an attempt to create a new method of scholarly research - which I have dubbed hypericonomy - that is more suitable to a picture-oriented, digital-centric culture. .. E-Crit is a glimpse at what 'knowledge production' might look like, after deconstruction, in an age of computer-mediated communication.

1. The Canon, the Archive, and the Remainder: Reimagining Scholarly Discourse

The Remainder: Structural, Material, Representational

(4) All of the linguistic tools that account for the poetics of this study - a poetics I have called hypericonomy - might be classified under what Jean-Jacques Lecercle has termed 'the remainder' of language. Puns, anagrams, false etymologies, macaronics, and metaphor of all breeds fall into this repressed category, this 'other of language' (99). More importantly here, the remainder is the 'other' of academic or scholarly language. It is deemed as nonsense or rubbish, classified as 'cute' or juvenile, the stuff of children's literature, fantasy, and folklore, and lately, as unstylish poststructural writing.

(4) E-Crit attempts to take this teratological science [study of monsters] a step further by viewing the 'remainder' not only as a means of illuminating conventional language, but as a language with a revolutionary potential of its own. If the remainder is the hidden or repressed, monstrous 'other' of the conventional academic discourse, then those who seek to change that conventional discourse might engage in a science of anagnorisis; that is, a science of invention and knowledge-production that depends on a face-to-face encounter with the monster.

(4) Like the relationship between common sense and nonsense, the relationship between scholarly academic language and the remainder is that of master and slave.

(5) By speaking of the remainder in these political terms, as a case of exclusion, repression, and otherness, I am hoping to supplement John Guillory's important study of canon formation in Cultural Capital. .. Guillory suggests that the canon is nothing more than a product of scholarly imaginary, and that the debate points essentially to a crisis in the humanities wrought by a fetishistic clinging to traditional conceptions of literature and scholarship. This is the fate of literary studies in universities dominated by a techno-bureaucratic culture that values 'rigor' above all else.

(5) While Guillory focuses primarily on the permutations of the category of 'literature,' this study is more concerned with the category of 'academic writing,' which is the primary vehicle for mediating the 'imaginary structures' of higher education. As Guillory suggests, the ideology of literary tradition that is at the root of the canon debate is always 'a history of writers and not of writing' (63). Guillory is interested, therefore, in how writing becomes literature. This study, however, asks how writing becomes scholarship, and it does so not only by examining the practices and structures of the academic apparatus, but also by imagining a new method of scholarly writing (hypericonomy) and a new curricular strategy (Electronic Critique).

(5-6) Four years ago, I submitted a hypertext essay, 'A Provisional Treatment for Archive Fever,' to a Web-based humanities journal. .. The journal referees, however, were not so enthusiastic upon first reviewing the hypertext, and the work was not accepted for publication.

(6) Since the essay I submitted to the journal was non-traditional from an academic perspective, the referee's comments, as reproduced here, should act as a sort of warning for the inventors of new modes of academic discourse, namely, this is what to expect when you submit 'remainder-work' to a traditional journal. .. The first type of remainder is taken directly from Lecercle and Deleuze/Guattari, and it relates to the rhizomatic principle of structure disdained by traditional, rigorous humanities scholars: the structural remainder. The second type is more grammatological in nature; it concerns the repressed technological element of humanities scholarship, and the resistance of scholars to certain communications technologies: the material remainder. The third type of remainder, which is closely allied to the second, accounts for a great deal of the theoretical writing in this book: the representational remainder of scholarly discourse, which might also be termed the pictorial remainder.

Tree vs. Roots - Structural Remainder

(7) In punceptual writing, data is organized according to the logic of the pun, the most base and primitive species of remainder; punning is what makes the work of Marshall McLuhan, for example, both brilliant and annoying.

(7) the puncept can also be pictorial.

(8) Hypericonomy emulates the structural characteristic of the rhizome by foregrounding the remainder in scholarly research and writing. The pun, then, even though it may be deemed as 'cute' or 'confusing' to those who are unaccustomed to its rhizomatic ways, can be used as a structuring tool in a scholarly research program.

Print vs. Electronic - Material Remainder

(8) My submission was, to borrow Lev Manovich's term, an attempt to write (in) the language of new media. The suggestion that it should be 'put into conventional essay form ... before it goes deconstructive' is indicative of the referee's oppressive print-centricity.

(9) As I will argue throughout this book, it is a definitive characteristic of traditional scholars to reject any mode of discourse that diverges from the path of the conventional, hierarchical essay format.

(11) However, although digital technologies provide us with the most effective archiving tools to date, archiving should not be the defining task of digital humanities scholars. In these archival projects, scholars are using only a portion of the potential of new media; it is the portion which most appeases: (a) their nostalgia for a print-oriented culture; and (b) the demands of a digital-oriented, techno-bureaucratic culture that values predictable techno-scientific methods (e.g., archiving) over interpretation and, most of all, invention.

Text vs. Picture - Representational Remainder

(11) from the conventional point of view, pictures are entities to be 'added' to an essay or lesson, and not inherent or repressed elements of the processes of writing, reading, and learning. In this particular case, pictures are seen as elements which might change 'readings of canonical texts,' but not as elements which might altogether change the processes of reading and writing.

(12) On several occasions in this study, the term 'heuretics,' borrowed from Gregory Ulmer, will be used to describe a supplementary or alternative logic to hermeneutic discourse, a way out of the hermeneutic circle. In short, heuretics provides us with a logic of invention, 'a form of generative productivity of the sort practiced in the avant-garde' (Ulmer 1994a: xii). What I am attempting to outline in this book is a heuretic approach to discourse that draws on the suggestive power of pictures as a means of generating new modes of writing suitable to an image-oriented culture. .. The purpose of this foregrounding, however, is not to interpret the picture, or to offer an authoritative reading of it in the conventional sense, but to draw on the picture as a tool for invention, as a generator of concepts and linkages unavailable to conventional scholarly practices. This is how hypericonomy breaks out of the hermeneutic circle.

(12) To understand pictures as generators is to view them much in the same way as Lecercle describes the pun and other forms of metaphor, all of which fall into the category of the remainder, which Lecercle describes as instances of 'diachrony-within-synchrony.' .. The notion of 'diachrony-within-synchrony' points to the capacity of the remainder to interrupt our synchronic understanding of a word by invoking a diachronic association.

(13) it may be possible to capture or at least re-create this sense of schizo 'indirection' [where all possible meanings of a metaphorical phrase are present at once] before it is funnelled, before it is transformed into common sense.

(14) I would like to believe that one purpose of hypericonomy is to provoke or mimic the fluidity of creative thought and crystallize it, transforming delire or schizophrenia into a theory and a discursive practice.

The Good Sense of Nonsense

(14) sense, according to Deleuze, is present in every utterance, even in so-called nonsense, which should not be understood as lack of sense (or direction) at all, but as an overproduction of sense (indirection=too many directions at once, no single direction). .. It is in this sense that the language of new media, with its multi-discursive, diachronic structure, is nonsensical.

This seems like a special kind of intellectual, intentional nonsense rather than the ramblings of a drug-crazed, street corner schizophrenic. I think of a certain story by Paul Auster..

(16) The who of good sense is obvious, then, and the why might be answered by pointing to the history and tradition of scholarly discourse, with its roots in early print technology and the structure of the first universities. But there are other, more political, more confrontational answers to this why of scholarly discourse, which have to do with the unlikely coupling of traditionalists who seek to maintain a certain complacent, bourgeois, academic status quo, and techno-bureaucratic university administrators seeking to run a viable business.

2. The Search for Exemplars: Discourse Networks and the Pictorial Turn

Hypericon

(19) Eye Socket, with its cyborgian electrical outlets, provides us with a fine mnemonic device. Consider the Gibb picture above, then, and the limen, the enchanted looking glass, between a network of discourses and the discourse of networks that I am developing here. In this context, Eye Socket has now become a 'hypericon': 'a piece of movable cultural apparatus, one which may serve a marginal role as illustrative device or a central role as a kind of summary image ... that encapsulates an entire episteme, a theory of knowledge' (Mitchell 1994: 49).

Discourse Networks

(20) This is the episteme of what Friedrich Kittler has called the Republic of Scholars, a republic entirely committed to 'endless circulation, a discourse network without producers or consumers, which simply heaves words around' (Kittler 1990: 4). It is this form of scholarly discourse, this discursive circuit, which renders itself visible through the production of banal treatises and dissertations.

(21) To put it in the bluntly economic terms of Katherine Hayles, we are in a situation of 'too many critics, too few texts,' and the result has not been innovation, but repetition, recycling, and reduction.

(endnote 5) a definition of heuretics.. 'Without relinquishing the presently established applications of theory in our disciplines (critique and hermeneutics), heuretics adds to these critical and interpretive practices a generative productivity of the sort practiced in the avant-garde' (Ulmer 1994a: xii).

(21-22) a traditional scholar might spurn Kittler's proposal altogether, and protest its lack of historical rigor. But an alternative reaction - the one I am supporting here - would be to recognize Kittler's methodology as a new way of conducting humanities research, a new method in which a specific scene or textual image (e.g., Faust's sigh, Gibb's Eye Socket, Las meninas) acts as a hypericon, a generative, multi-directional passageway, onto a research project.

(23) the new era demands thinking about the ways in which new media have impacted, and will continue to impact, literary theory. For this reason, Friedrich Kittler, an electrical engineer turned critical theorist, serves as an excellent exemplar of the type of 'fresh thinking' demanded by the new era. Although it's likely that most humanities scholars would shun the idea that in their spare time they should 'pick up the soldering iron and build circuits' (quoted in Griffin 1996: 731).

The old Marxist fantasy of the trans-specialist, jack-of-all-trades. Unfortunately, electronics seems to be a discipline born from print culture and abstract logic, requiring a great deal of learning to grasp. All the same, I like using the references to circuits and relays as electronic metaphors, or, better, hypericons, so that the trigger-image is from a circuit schematic.

(23-24) Kittler draws on a single scene as an inlet into a network of discourses that circulate through the text. .. The text is not something to critique or comment on, but a generator of theories.

Kittler, then, does not write about Faust or about Goethe; he writes with Goethe, just as he writes with Foucault, Lacan, and Derrida. This tendency of Kittler to write with several theorists at once is, according to David Wellbery, an innovation in scholarly method. .. It is by means of this writing with that Kittler departs from the discourse of the Republic of Scholars.

But there are plenty of examples of this style, such as Plato's Symposium, where Plato takes on the identity of each speaker, whose name signals his psychic framework.

The Republic of Scholars

(24) I am writing under the aegis of electracy (elec-trace-y). .. This Republic of Scholars, with its faith in transparent language, scientific proof, and the text-based, linear, sequential essay, provides the methodology and discourse for all who wish to maintain affiliation within the academic apparatus.

(endnote 8) Gregory Ulmer, who coined the term 'electracy,' explains it in the following manner: 'In the history of human culture there are but three apparatuses: orality, literacy, and now electracy. We live in the moment of the emergence of electracy, comparable to the two principal moments of literacy (The Greece of Plato, and the Europe of Galileo)'.


(25) Citing Michael Taussig, Ray suggests that 'what is at stake with such questions is “the issue of graphicness,” a quality generally disdained by materialist critics who associate it with the enemies - commerce and mystification' (9).

(25) If indeed we are in the thralls of a hypervisual, picture-oriented, digital age, then a scholarly discourse suitable to such an age must accept not only poststructuralism as prior knowledge, but also the fact that technologies of representation have induced a pictorial turn in our culture, subsequently placing us on the threshold of a new subjectivation that we are still in the process of understanding.

Picture Theory

(26) imagine the various intersections, linkages, and lines of flight incited by the following plotting of points on a graph: from Jonathan Crary's historical evaluation of 'Scopic Regimes' to W.J.T. Mitchell's identification of a 'pictorial turn'; from E.H. Gombrich's theory of the 'mental set' to Rosalind Krauss's 'optical unconscious.'

You have to be familiar with them in order to visualize the graph.

(26) There is no print-based artifact so accommodating that it could represent the complex network of possibilities posed by the intersection of the various texts that I wish to gather here under the aegis of picture theory.

(endnote 10) The graphic elements of Hayles's text do succeed in pointing to the materiality of her subjects of investigation, but my goal is to have the graphicness drive the production of the text itself. I am attempting to invent a mode of discourse in which the images themselves are theories, and not merely reminders of the materiality of discourse.

(27) Perhaps a more accessible way to visualize such a model is to imagine the non-linear, graphic-rich environment of the Web. Would it complicate things to suggest that, if this essay were a hypertext, its explanation of picture theory would span various nodes?

(29) Gombrich's crucial theoretical contribution to this study is the 'mental set,' a subjective 'horizon of expectation' (60) that guides an individual's optical impressions. Vision, in Gombrich's model, is a form of projection, and each individual possesses mental schemata against which s/he attempts to match the shapes in her/his field of vision. Thus, that which 'we call “reading” an image,' Gombrich suggests, 'may perhaps be better described as testing it for its potentialities' (227).

(30) There are, however, certain methods of classification within 'the filing systems of our mind' (Gombrich 1969: 105) that are not culturally determined, but that are entirely personal and subjective, the result of an individual's psychic experience. These mental images may not even be recognized by the individual herself, although they may have radical effects on the way she organizes visual stimuli.

(31) In order to withstand the image bombardment being deployed in the current mediascape, readers and viewers must possess a means of filtration that will allow them to consciously organize visual information and arrange it into manageable patterns. But in order to develop such an apparatus, it seems that a reader must dismiss the notion of transparent communication, and accept the impossibility of a universal perspective, or of 'a purely responsive act of reading - an act which will decode the transmission in precisely the way that the sender desires' (McGann 1991: 37).

Recall Hayles' attack on Shannon's model of communication, in which neither the sender nor the receiver plays any role in massaging the medium or the message. Of course, this model exists for the sake of emphasizing the external, material, technological components of the system that is the object of electrical engineering.
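Shannon's schematic can be made concrete in a few lines: sender and receiver are fixed mechanical endpoints, and only the channel's noise intervenes. A minimal sketch under my own assumptions (the bit-flip probability and the triple-repetition code are illustrative choices, not Shannon's or Hayles's example):

```python
import random

def encode(bits):
    """Repetition code: the sender mechanically sends every bit three times."""
    return [b for b in bits for _ in range(3)]

def transmit(bits, flip_prob, rng):
    """Binary symmetric channel: each bit flips independently
    with probability flip_prob. Noise lives only here."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0
            for i in range(0, len(bits), 3)]

if __name__ == "__main__":
    rng = random.Random(42)
    message = [1, 0, 1, 1, 0, 0, 1, 0]
    received = decode(transmit(encode(message), 0.1, rng))
    print(message, received)
```

Neither encode nor decode exercises any judgment about meaning; that deliberate emptiness of the endpoints is exactly what Hayles objects to, and exactly what makes the model tractable for engineering.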

Imagetext and the Sister Arts

(33) The image, he [Barthes] insists, is always subordinated to the message imposed upon it by the written text, whether it is a caption, a headline, or some other written form.

(34) Despite the apparent ingenuousness of Magritte's painting [La trahison des images], Foucault identifies it as a dialectical enigma, a scene of seduction into which the viewer is irresistibly drawn.

(34) When, years after painting La trahison des images, Magritte moved his pipe and caption to a blackboard mounted on an easel, it is as if he was directly targeting the academic apparatus, taunting it with a form of discourse which it could not possibly accommodate.

(34) According to W.J.T. Mitchell, Foucault's short essay ['Ceci n'est pas une pipe'] demonstrates that La trahison des images is not only a metapicture, a picture about pictures that instructs us on the 'infinite relation' between image and text; it is also a hypericon that 'provides a picture of Foucault's way of writing and his whole theory of the stratification of knowledge and the relations of power in the dialectic of the visible and the sayable' (1994: 71).

(37) what I am seeking in the development of a new mode of academic discourse lies between Drucker's 'serious' theoretical work and her artist's books.

(39) At the beginning of each chapter of The Optical Unconscious, we find an icon - a detail from a painting, drawing, or photograph - that serves as the title. The title of each chapter, then, is represented by a pictorial mise en abyme, a conceptually - and ideologically - loaded image that captures the central argument of each chapter.

Remediating Theory

(41) I would argue that Krauss's iconic methodology would be easily adaptable to an electronic environment, where the 'icon' appears as frequently as the written word and imagetexts are the most frequent mode of representation.

Likewise imagine starting with an image of the ground symbol in a relay driver circuit that itself is only a small part of the schematic diagram of a large circuit board, which is finally itself just one part of a device such as a pinball machine.

(42) In Heuretics, Gregory Ulmer suggests that electronic media might be used to invent a 'hyperrhetoric,' a rhetoric 'that replaces the logic governing argumentative writing with associational networks' (18). .. My approach here is to put poststructuralism to work as the software required for inventing new theories, new modes of discourse, new poetics capable of short-circuiting the discourse of the Republic of Scholars.

(42-43) There are no Microsoft software bundles that tell us how to invent a new scholarly methodology. .. I wonder what Blake would have done if his desktop was equipped not with burins, acids, and copper plates, but with a Mac (or would Blake prefer a PC?), Photoshop, Netscape, and Flash?

O'Gorman continues to employ electronic metaphors, but somewhat carelessly: short-circuiting is a destructive operation; shunting is better. And his appeal to Microsoft/PC, Macs, and commodity software reflects a consumer attitude toward electronic technology. He needs to take up the soldering iron!

3. The Hypericonic De-Vise: Peter Ramus Meets William Blake

Books for Little Boys: Thomas Murner and Peter Ramus

(46) Agricola's De inventione dialectica (1479) responds to information overload by providing a discourse on method that instructs readers in the ways of logical organization. Agricola's method, a form of pre-Renaissance new media, involves placing 'things' under their proper headings, and distributing them in an external writing space rather than containing them entirely in memory.

(47-48) The gender and youthfulness of MA students during the Renaissance may go a long way in explaining the methodologies and pedagogical materials used by their instructors. .. [Thomas] Wilson's sly attempt to engage students in a virtual foxhunt [in his 1553 The Rule of Reason] may well be one of the very first samples of an educational 'video game.'

Of course, there are more appropriate precedents to the tradition of teaching with visual aids, such as Thomas Murner's Chartiludium logice, or logical card game (1509). Murner provides young students with a woodcut set of iconic flashcards representing the elements of logical discourse. .. these texts document a shift from strictly mnemonic, internalized practices to methodologies that are reliant upon the external spatialization of thought.

(48-49) For Ramus, method referred specifically to the 'orderly pedagogical presentation of any subject by reputedly scientific descent from “general principles” to “species” by means of definition and bipartite division' (Ong 1958: 30). .. According to Ong, Ramus was simply responding to the need of universities to corporatize knowledge delivery.

(49) New media have done little to alter the practices of humanities scholars, except perhaps by accelerating - by means of more accessible databases - the rate at which hermeneutics can be performed. .. Just as Ramus's scholarly method had a great influence in shaping a print apparatus that has persisted for five centuries, might it not be possible to invent scholarly methods to shape the digital apparatus?

Rather than allow the default to prevail.

(49) Ulmer cites Andre Breton's co-option of Freud to invent surrealism. Since my goal is to invent a mode of discourse that challenges Ramist, print-based methods, I might very well co-opt a pre-Ramist methodology and ask the following question: Is it possible to do with Thomas Murner what Andre Breton did with Freud?

Books for Little Boys and Girls: William Blake

(55) Although such a 'booby-trap' beginning, as Geoffrey Summerfield calls it, would cause bells to go off in the head of the least satirically minded reader, this may not be the case if the reader is a child. An Island in the Moon, like many other satirical texts, from Gulliver's Travels to Animal Farm, works on a variety of levels, at least some of which can be appreciated by children. This concern for couching political and cultural critique in a form suitable for both children and adults is yet one more reason why Blake may have chosen to write children's books.

Digitization in the Age of Blake

(57) Unlike other Romantics, such as Rousseau, Blake was not an outright anti-technologist; his critique targets the mechanistic techniques tied into the apparatus, and not the apparatus itself. Rather than rejecting the apparatus of print production, then, he chose to invent his own, based on techniques that subverted the dehumanizing potential of mechanical reproduction.

Compare this to the free, open source software movement as a response to the dehumanizing potential of closed-source, 'cathedral' software epitomized by Microsoft.

(58) As [Morris] Eaves suggests [in The Counter-Arts Conspiracy: Art and Industry in the Age of Blake], 'digitization is not a notion confined to electronic devices but a technological norm that operates across a spectrum of materials and processes. As a rule of thumb, the more deeply digitization penetrates, the more efficient the process becomes' (186).

(58) At the heart of digitization is a praxis of 'division' that Blake strived to denounce through his 'chaosethetics.'

(61) As argued by Viscomi and other Blake scholars before him, Blake's references to 'corrosives ... melting apparent surfaces away' underscore the degree to which the materiality of his mode of production was etched into his visionary philosophy.

(62) Blake's work, informed by his notion of the 'contraries,' involves a unification of form and content, material production and ideology.

(66) By creating exercises such as 'Re-writing Blake,' instructors are not asking students to write about the poet/painter; they are asking students to write with him.

New Media for Everyone: Hypericonomy

(67) My strategy has been to 'show' this method in this chapter rather than explain it away with a series of easily replicable instructions. In this way, I am attempting to provoke a certain degree of misunderstanding, with the hope that readers might produce their own monstrous versions of hypericonomy. This is a strategy that, in Ulmer's terms, is designed to trigger a relay.

(endnote 11) Richard Coyne draws on the term 'technoromanticism' to identify narratives that promote an emancipatory vision of new technologies.

If humanities scholarship ever reaches into software design, then the notion of 'writing with' takes on truly useful material possibilities, such as joining in the work of an FOS project or remediating obsolete technologies like the electronic pinball machine through PMREK; so we do those projects. If the chapter begins with an icon, the ground symbol, for grounding the study of electronic media in a study of electronics itself, then zooming out reveals a relay driver circuit. O'Gorman's rhetoric enticing you to repeat his experiment 'hypericonomy' succeeds like the impulse at the base of the transistor crossing the threshold to initiate current flow between collector and emitter, in turn energizing the solenoid of the relay, which turns out to drive a pop bumper momentarily energized during a game on the Flash Gordon pinball machine.
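The relay-driver hypericon can be made literal in a few lines. A toy model, not a circuit simulator: the 0.7 V base threshold is the conventional silicon-transistor assumption, and the function names are mine, not O'Gorman's:

```python
def relay_driver(v_base, v_threshold=0.7):
    """Toy model of a transistor relay driver: once the voltage at
    the base crosses the threshold, the transistor conducts from
    collector to emitter, energizing the relay's solenoid."""
    conducting = v_base >= v_threshold
    solenoid_energized = conducting  # the collector current drives the coil
    return solenoid_energized

def pop_bumper(v_base):
    """The energized solenoid momentarily fires the pop bumper."""
    return "THWACK" if relay_driver(v_base) else "idle"

if __name__ == "__main__":
    for v in (0.0, 0.3, 0.7, 5.0):
        print(f"{v:.1f} V at the base -> {pop_bumper(v)}")
```

The point of the sketch is the threshold itself: below it, nothing; at or above it, the whole chain fires at once, which is the shape of O'Gorman's rhetorical "relay."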

(68) It is this notion of a 'visual puncept' that is at the foot of hypericonomy, and which is also akin to the aesthetic techniques of William Blake.

4. Nonsense and Play: The Figure/Ground Shift in New Media Discourse

Visualization and Intelligence


(73) Could our visual culture, then, the culture which is making us 'sillier by the minute,' actually be responsible for a certain intellectual (r)evolution? The pedagogical avant-garde, from the U.S. Army to the Baby Einstein Company, seem to think so.

These paragons of collective intelligence, thinking, unconscious collective bargaining, and so on are instantiating Nietzsche's silliness?

(74)(figure 4.1) Wendy and Michael Magnifier, 1998: McDonald's Peter Pan Happy Meal toy.


(76) My argument, then, is not that visual media have made us, or our children, more intelligent than our predecessors, but that developments in the materiality of media lead to shifts in the hierarchy or matrix of cognitive processes.

Heim's historical drift and gains and losses.

(77) The camera obscura, as described by Krauss, might serve as a convenient hypericon for encapsulating the classical understanding of visuality which the avant-garde challenged.

Influence of Lacan?

(78) In The Optical Unconscious, Krauss relies heavily on the work of Max Ernst in order to demonstrate the surrealists' undoing of the figure/ground binary.

(80) the contemporary popularity of surrealist imagery, which stunned and baffled its initial audience, demonstrates the advanced level of optical sophistication possessed by the average contemporary consumer.

Figure/Ground 2: Children's Literature

(80) In general, surrealism involved a sort of psychoanalytic revision of childhood experiences, not as a means of therapy, however, but in order to apply these experiences to the transformation of everyday life.

(81) one can reverse the conventional figure/ground relationship by putting greater emphasis on the frame, or even, on the act of enframing, rather than on the content of an image or text.

Does this permit garbage to slip through the text uncriticized? Isn't that what we mean by “the unconscious”? Can't we detect it with computer programs that analyze what we have written?

(81) Nonsense, then, can take us across cultural and cognitive fields, forcing us to confront the other, and his/her methods of organization. If the form and logic of print textuality began with books for 'little boys' (i.e., Ramus's textbooks), then the model for an electronic textuality might also come from books for children - nonsense books, that is. And not only from children's books, but from their video games and television shows as well.

(endnote 10) It might also be appropriate to consider, here, Heidegger's conception of enframing (gestell) as the essence of technology, and the way in which nonsense thwarts the technological drive toward efficiency.

Invites analysis of computer software, books written for 'the other'. But are they really examples of the intellectual sort of nonsense proposed here, and not just “very stupid phenomena”?

Figure/Ground 3: Digital Media

(81-82) Susan Stewart characterizes nonsense as a strikingly intertextual mode of discourse, one which cannot occur without transgression, without contraband, without a little help of the bricoleur's hand. To view nonsense in this way is to view communication as a constant interplay of 'universes of discourse' which are incessantly 'involved in borrowing from one another and transforming one another at every step as they are employed in an ongoing social process' (ibid.). .. The Web can facilitate a rapid shift between various modes of discourse and cognition, all within the same perceptual field. .. hypertext offers us a form, a material space, in which we can build our own models.


(82) Admittedly, the computer is the most far-reaching new media tool in education, but it is not the only electronic tool that influences learning. Television, video games, even cell phones - marginal electronic media from a scholarly point of view - all play a part in education, even though they may not be an integral element of the classroom experience. Whereas Katherine Hayles rightly calls for an increased emphasis on material-specific critique, I am calling for an increase in material-specific pedagogy, starting with the materiality for the Web.

Experiment with adding web-enabled mobile devices to the classroom experience using the poller software as an integral part of a presentation.

(83) What [Richard] Lanham, author of The Electronic Word, neglects to consider is that hypertext may be used not only as a sort of light switch between the classical, academic binary of rhetoric vs. philosophy, but also as a multivalent switch, or rheostat, if you will, for toggling between cultural, epistemological, autobiographical, political, and historical categories. .. It may be useful here to leave behind the binary, light-switch model of electronic writing and consider another model, that of Gregory Ulmer's argumentative 'tuning knobs.'

(84-85) If, alongside the knobs for narration, exposition, and poetics, we include knobs for politics, popular culture, theory, autobiography, etc., then we have indeed built a machine (a graphic equalizer?) capable of generating a mode of academic discourse more suitable to a culture of computing.

Give the humanists a quiz: tuning knobs are indeed rheostats if [] is what they control. Your choices: (a) capacitance, (b) inductance, (c) resistance, (d) reactance, (e) none of these.
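The answer is (c): a rheostat is a variable resistor. Ohm's law makes the quiz concrete; a minimal sketch, where the 12 V supply and the resistor values are illustrative assumptions of mine:

```python
def current_through(v_supply, r_fixed, r_rheostat):
    """Ohm's law for a rheostat wired in series with a fixed resistor:
    I = V / (R_fixed + R_rheostat). Turning the knob changes only the
    resistance, and with it the current through the circuit."""
    return v_supply / (r_fixed + r_rheostat)

if __name__ == "__main__":
    for knob in (0, 100, 500, 1000):  # rheostat setting in ohms
        i = current_through(12.0, 200, knob)
        print(f"{knob:4d} ohm -> {1000 * i:.1f} mA")
```

Unlike a light switch's two states, the knob sweeps the current continuously, which is exactly the multivalence Ulmer's 'tuning knobs' metaphor wants.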

Figure/Ground 4: 1\0

(86) The semiotic square, employed on its own, is a much too rigid and positivist apparatus. For this reason, 1\0 relies heavily on a more pliable apparatus known as the 'choral square.' The choral square, which first appears in Ulmer's Heuretics, is a descendant of Plato's notion of chora, which was picked up by Jacques Derrida. Like the mnemonic strategy of classical rhetoric or oratory, chorography relies upon the generative potential of a specific place. In Ulmer's chorography, the subject provides the place of invention, with the intention of generating a poetics. The term place here is somewhat inadequate, however, since it actually refers to the space of a quadripode graph which Ulmer calls the popcycle, and within which the chorographer (or mystorian) plots him/herself by filling in the following coordinates or slots: 'Family, Entertainment, School, Discipline.'

Unfamiliarity with Derrida signifies what? Is this the first time O'Gorman really invokes Derrida directly? If so, I'd like to note my astonishment that, as far as I can tell, he never mentions Memoirs of the Blind, Derrida's picture-driven meditation, anywhere in this book or in its bibliography.

(86-87) What really matters for the sake of mystory, however, is that the categories are filled in before the project actually begins, and they are pursued faithfully as if they formed a set of rules for the deployment of the project.

(87) The popcycle first appeared in Ulmer's Teletheory as a set of guidelines employed in the creation of a mystory, a new critical genre which adds autobiography and pop culture to the scholarly mix. .. What remains essential in any case is that: (a) the academic category is forced to collide with other influential aspects of an individual's life; and (b) the categories are staged around the resolution of a specific problem.

This language is intended to be silly, the sort of intentional, indirectional nonsense akin to Blake's children's works.

(87-88) This resolution to not analyse the recurrent pictorial theme deserves further commentary here, since it is an essential element of mystory and of hypericonomy. If, according to Paul Feyerabend, 'by incorporation into a language of the future ... one must learn to argue with unexplained terms and to use sentences for which no clear rules of usage are as yet available' (1975: 256-7), then the commitment to deferring any in-depth analysis of one's thoughts and images during the time of hypericonomising must be considered as a seminal element of the method. In Ulmer's terms, by filling in the slots of the popcycle, we are 'learning how to write an intuition, and this writing is what distinguishes electronic logic (conduction) from the abductive (Baker Street) reasoning of the detective' (1994a: 37).

Isn't this an attempt to snatch meaning from the semi-conscious, like interpreting slips of the tongue but closer to the point where consciousness steers production? I bet Zizek crosses this innovative approach.

(88) Through this process of simulated intuition, or 'artificial stupidity,' the writer, completely unaware, performs an outering of the ideological categories that structure his or her organization of knowledge (1994a: 38). Hypericonomy, then, involves the invention of a new relation to knowledge itself, a techno-ideo-logical relation which Ulmer calls a 'knowledge of enframing' (1989: 183).

(89) The deferred understanding, or 'artificial stupidity,' might be considered as a form of Nachträglichkeit, a psychoanalytic concept championed by Freud and Lacan. .. The point of recognition, then, can only take the form of a deferred understanding, an understanding-too-late, arrived at by means of a detour through the realm of nonsense (puns, anagrams, macaronics, etc.). .. When a hypericonomy such as 1\0 is finished, we are left with a veritable impression of its creator's unconscious (whether it be political, optical, or psychic).

(90) Lacan frequently employed images to instill this 'sublime of stupidity' in his audience. On the cover of each volume of his seminars, for example, is a hypericonic image taken from classical painting - an 'organizing image of the discourse, not to be interpreted but to serve as a point of departure for working through a theoretical problem' (Ulmer 1989: 194). .. In Lacan's mnemonic technique, we have the precursor, the theoretical bud, of which hypericonomy is indeed in full bloom.

This is the connection between O'Gorman's avant-garde method and 'science' (Freud and Lacan).

(94) Could it be that to produce a hypericonomy of this sort is to place oneself in the presence of a sublime object? An object which, in the Kantian/Derridean sense, invokes a 'violence done to the senses' (Derrida 1987: 130)? An object beyond the grasp of comprehension, beyond calculation and without end?

(94) The method reflects the current situation in which computer users approach their extremely complex and powerful machines as dilettantes. The sense of nausea that I feel when confronting 1\0 today has to do with the fact that I am confronting my own assumptions in 1996, my own lack of skill with Web design. .. The point of underscoring this design issue is to demonstrate that, with the rapid and incessant changes in software and hardware manufacturing, the best way to approach digital media pedagogy may well be to train students in the art of 'well-informed dilettantism.' This is why William Blake, a poet, painter, philosopher, printmaker, and visionary, serves as an excellent exemplar for students in the humanities today.

The cyberpunk/cybersage position. He is careful to position himself between ignorant users and the too-entrenched (technostressed); go back to his earlier position of wanting Microsoft to offer him a solution and, lacking that, of permitting dilettantism at the state of the art.

5. From Ecriture to E-Crit: On Postmodern Curriculum

Language of the Future

The 4fold Vision

(102) the 4fold Vision asks students to engage in a form of pattern recognition; it asks them to devise a method for organizing and producing knowledge that is suitable to a culture facing an onslaught of information, much of which is pictorial.

Critical Theory, Digital Media Studies, and the Curriculum of the Future

(103) In The Digital Revolution and the Coming of the Postmodern University [Carl A.] Raschke opposes interactivity (a common term in both pedagogy and new media development) to transactivity, which he sees as the pedagogical future of the 'postmodern university.' .. Web-based distance education has already changed the way we understand the university, but it has simply transposed print-centric habits (with varied success) into a new learning space. I believe that the transformation of the academic apparatus is most likely to occur by means of physical agents that engage directly with the traditional material structures of learning, from the essay, to the classroom, to the entire campus itself.

(104) Perhaps what needs to be developed most of all, however, in programs such as DMS, is the study of metastructure. But, to date, this has been the specialty of English departments, where critical theory found a home a few decades ago and is now ready to migrate from its literary, print-oriented focus to the realm of digital artifacts.

(105) As John Guillory has argued in Cultural Capital, the successful integration of critical theory into university education is a result of its being introduced - as a sort of contraband - at the graduate level.

(106) The 'off-the-radar' status of literary studies is capable of provoking severe self-pity in the traditional Romantics scholar. But to a Romantics scholar with an interest in critical theory and the materiality of visual communication, this state of neglect provides room for much-needed experimentation and revolution.

Electronic Critique: A Case Study in Curricular Remediation

(106-107) This leads Guillory to the conclusion that 'the moment of theory is determined, then, by a certain defunctioning of the literary curriculum, a crisis in the market value of its cultural capital occasioned by the emergence of a professional-managerial class which no longer requires the (primarily literary) cultural capital of the old bourgeoisie' (xii). .. The answer, I propose, lies in new media.

(107) The study of new media artifacts must coincide with the development of new media research methods.

(108) From the perspective of university administrators, however, E-Crit remains a nebulous sort of computer art studio that encourages students to take on Web and video projects on campus (this has earned them a reputation as the institution's leading media developers). These projects allow the administrators to tout E-Crit as a cutting-edge program when questioned by alumni and suspicious senior faculty, but the conversation usually ends there.

(109) In many ways, theory's failure has much in common with a culture that identifies with online dating, genetic engineering, and self-replication through increasingly sophisticated recordable media. When I propose that critical theory needs digital media and vice versa, I am proposing a curriculum that supports the thoughtful application of theory to the production of digital media artifacts, the creation of humane technologies and tech-related policies, and the investigation of the impact of technology on human being; or, to borrow Eagleton's shamelessly simple-minded words, I am proposing that educators can combine media and theory to 'find out how life can become more pleasant for more people' (2003: 5).

Notice that the E-Crit program has a 15-credit “Programming Track” that includes data communications and networks, and requirements and design, with no focus on any particular programming language.

(111) For example, the Department of Electrical Engineering might offer a class in microcontroller programming that would involve students from both engineering and liberal arts in the creation of electronic devices suitable, for example, for a critical/digital art installation.

(112) The 'dot.com' bust only serves to increase resistance to changes in academia, but university administrators still recognize the powerful cultural capital of digital media, as evidenced in the persistence of distance education projects. But a legion of University of Phoenixes will certainly not spur a knowledge revolution.

E-Crit and ecriture

(113) As [Hugh] Culik indicates, E-Crit was formed out of a need for resistance, specifically, resistance to 'the ideologies that make up electronic culture.' .. E-Crit requires students and faculty to take an ironic stance toward technology; to be 'in technology, but not of technology,' as a deceased colleague of ours once said.

(114) As Lev Manovich has suggested, 'One general effect of the digital revolution is that avant-garde aesthetic strategies came to be embedded in the commands and interface metaphors of computer software. In short, the avant-garde became materialized in a computer' (2000: 306-7). Rather than turn the political and intellectual dynamics of poststructuralism or the avant-garde into a selection of menu items in a design program, E-Crit is an attempt to remotivate those dynamic strategies, and recouple them with their (now digitized) aesthetic strategies.

It is not at all surprising that avant-garde techniques have been used in consumer-oriented human-machine interface design. O'Gorman's strategy is to reincorporate the philosophies behind avant-garde aesthetics into the new mode of material critique through imitation (writing-with). Be sure to consider this in the context of Drucker and McVarish.

(115) As a final statement on 'postmodern curriculum,' then, I will suggest that it must be as agile and ironic as the ecriture of Barthes and Derrida.

A Final Note on Techno-Romantic Idealism

(116) Between the repressive constraints of 'legacy' and the techno-fetishistic demand for 'progress' levied by the ruling managerial class, curricular innovation has very little chance of leaving the confines of an idealistic vision statement.


O'Gorman, Marcel. E-crit: Digital Media, Critical Theory and the Humanities. Buffalo, NY: University of Toronto Press, 2006. Print.