Is tooth brush one word or two? In any case, I opened a new tooth brush this morning. It’s a Colgate Active Angle Soft Full Head #56 (“For a Noticeably Intense Feeling of Clean”). And I was immediately struck by the affinities between tooth brushes (toothbrushi?) and athletic shoes. Same color palette: greens, blues, and reds on a synthetic base of white. Same kinds of curves and contours, same balance and proportions. Whereas once upon a time toothbrushes were made from a single plastic cast, contemporary models, like contemporary athletic shoes, are built up out of inscrutable deposits of layers and sediment that speak to some elusive yet exquisitely refined ergonomic principle. And like athletic shoes with their fractal patterns of cleats and treads, my tooth brush is lovingly detailed. It has little rubber traction nubs on the grip—as though I might brush my teeth with such force that it would go flying out of my hand were it not for the precisely calibrated tribological measure of resistance those nodules afford.
If I were teaching a creative writing workshop I’d have my students write descriptions of such an artifact; if I were teaching art I’d have them draw a still life. I would love to unleash a Nicholson Baker or a Henry Petroski on all this: what drives us to so overdesign an object most of us use for just a couple of minutes a couple of times a day? It’s not just marketing: my Colgate Active Angle Soft Full Head #56 was made with love.
Meanwhile, over at Invisible Adjunct, a place I generally admire, it’s time for a seasonal round of MLA bashing. The impetus is a slight piece in the Chronicle, which, while meant to be taken lightly, nonetheless seems a bit too taken with its own slights. The chief offenders in the subsequent discussion turn out to be a clutch of academic hate mongers hanging around the comments section. Someone who goes by “Geraldine” is particularly crude. Anyway. Kathleen says it all, much more eloquently than I could (or did, or tried to in the comments section at IA’s). The author of the Chronicle story weighs in there too, and cannot be said to distinguish himself. As I opined over at IA’s: Sad.
Word Circuits is pleased to announce the publication of a set of new, original critical essays on electronic literature:
E-LIT UP CLOSE
These seven short essays by students from Matthew Kirschenbaum’s graduate course at the University of Maryland, College Park, eschew general musings on the nature of electronic textuality and instead attempt detailed close readings that treat individual instances of electronic literature seriously as literary creations. They include accomplished new readings of work by Scott Rettberg, mez, and geniwate, among others. Word Circuits hopes that these texts might lay the groundwork for an evolving critical archive featuring similar contributions from other new media and electronic literature courses around the world.
Word Circuits also offers other essays and reviews, as well as hypertext and Flash pieces by Milorad Pavic, Stephanie Strickland, Deena Larsen, Rob Swigart, Bill Bly, Peter Howard, Komninos Zervos, and Jackie Craven.
The students’ original assignment is available for anyone who might want to adapt it and contribute to the collection. I plan to repeat it this spring.
Peter G. Beidler presents some useful data in the new issue of the MLA journal Profession (2003). He surveys what former Lehigh University English majors considered most beneficial about their undergraduate English experience. Writing skills were at the top of the list, with almost 70% choosing them as one of their two most important benefits. Next were “critical thinking skills,” with 59%. Interestingly, when one gets down to the nitty-gritty of literary analysis and history—what many faculty consider the real meat of the major—the numbers fall off dramatically. Only 22% of respondents selected “literary appreciation and analysis” as one of their two most valued take-aways, and knowledge of the history of literature garnered a mere 1.4%.
The numbers, which Beidler acknowledges are specific to Lehigh as a “small, private, selective, eastern university,” are nonetheless noteworthy in that they represent an inversion of how the English major at a wide variety of institutions is typically structured, with its emphasis on literary genres and historical periods. Nearly all faculty would maintain that in the process of teaching literary history and appreciation they also teach critical thinking and writing: but Beidler’s numbers suggest that other approaches to the major should be given equally serious consideration.
. . . that one of the
gargoyles, or rather grotesques, atop the Washington National Cathedral is Darth freakin’ Vader?
This is the second in a series of occasional excerpts from my current book project, Mechanisms: New Media and the New Textuality. Like the first such excerpt, “Grammatology of the Hard Drive,” this material is drawn from the chapter entitled “Extreme Inscription”; it is a portion of a longer section that argues for the origins of interactive computing in random access storage devices, particularly the magnetic hard disk, as opposed to the slightly later genealogy (usually traced through SAGE and Douglas Engelbart) that emphasizes real-time screen displays, direct manipulation, the GUI, and the mouse. The material here presents background on what may be both the first digital library and the first computational character. This work is still very much in draft, and I’d greatly appreciate comments and feedback.
For simplicity of formatting I have omitted the notes.
Mechanisms is under contract to the MIT Press. All material is offered here as copyright © Matthew G. Kirschenbaum, all rights reserved. This copyright notice supersedes the Creative Commons license in place for the rest of the blog.
Among the attractions at the 1958 World’s Fair in Brussels, Belgium, visitors would have beheld “Professor RAMAC,” a four-ton IBM machine capable of offering up responses to users’ queries across a two-thousand-year historical span, from “. . . the birth of Christ to the launching of Sputnik 1.” Described as an “electronic ‘genius’” with “almost total historical recall and the ability to speak 10 languages,” the Professor offered the general public its first encounter with the magnetic disk storage technology today called the hard drive. Technically known as the RAMAC 305, the machine had been developed at IBM a few years earlier and was then in use by a handful of corporate clients, notably United Airlines. It was typically paired with an IBM 650, a general-purpose business computer. The RAMAC was capable of storing five million 7-bit characters on 50 vertically stacked disks, each two feet in diameter and rotating at 1200 RPM. In contemporary parlance this means that the first hard drive had a capacity of about 5 megabytes. The machine leased for $3200 a month, ran on vacuum tubes, and was taken off the market by 1961; some 1500 were manufactured in all.
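To put those figures in context, here is a quick back-of-the-envelope check, a sketch of my own that uses only the numbers quoted above and treats one stored character as roughly one byte:

```python
# Back-of-the-envelope arithmetic for the RAMAC 305, using only the figures
# cited above. The RAMAC stored 7-bit characters, so equating one character
# with one byte slightly flatters the total, but the order of magnitude holds.

total_characters = 5_000_000   # five million characters
disks = 50                     # vertically stacked platters

characters_per_disk = total_characters // disks
approx_megabytes = total_characters / 1_000_000   # ~1 character per byte

print(f"Characters per disk: {characters_per_disk:,}")    # 100,000
print(f"Approximate capacity: {approx_megabytes:.0f} MB")  # about 5 MB
```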
When the RAMAC was first announced in 1956, Thomas J. Watson, Jr., President of IBM, opined that it was “the greatest product day in the history of IBM.” The remark was arguably not an overstatement. The RAMAC, which stood for Random Access Method of Accounting and Control, was, as its name implies, a random access storage device. This was fundamentally different from the strips of punched paper and magnetic tape that then dominated computer storage. As Paul E. Ceruzzi notes, “[i]n time, the interactive style of computing made possible by random access disk memory would force IBM, as well as the rest of the computer industry, to redefine itself” (70). This is a powerful insight, and not often grasped by students of new media, who tend to ascribe “interactivity” to the advent of the mouse and the graphical screen, typically via Douglas Engelbart’s NLS demo a decade later. (Incidentally, the NLS system Engelbart introduced was based upon a Scientific Data Systems SDS 940 time-sharing computer with approximately 96 MB of magnetic disk storage.) Professor RAMAC, I want to argue, inaugurated an important new trend in human-computer interaction, one in which the hard drive’s importance is still not widely appreciated. According to the IBM press kit for the Brussels pavilion:
Visitors to the fair will be able to ask the machine what were the most important historical events in any year from 4 B.C. to the present and RAMAC will print out the answers on an electronic typewriter in a matter of seconds. . . . A query to the professor on what events took place in the year 30 A.D., for example, would yield answers like this: “Salome obtained the head of Saint John the Baptist.” In 1480? “Leonardo da Vinci invented the parachute.” In 1776? “Mozart composed his first opera at the age of 11.”
There are several observations to make here, starting with the Professor’s title and occupation. In 1950 Edmund C. Berkeley had published a book entitled Giant Brains: or Machines That Think, the first work to introduce computers to a general audience. The shift from Berkeley’s anthropomorphism to the RAMAC’s full-fledged personification as a “Professor” or “genius” hints at the kinds of synthetic identities that would culminate with Arthur C. Clarke’s HAL 9000 only a decade later. Second, we should note that while the Professor’s “almost total historical recall” was strictly hardwired, the notion of a computer endowed with the kind of encyclopedic capacity we today take for granted in an era of world wide webs and electronic archives would have then seemed quite novel. Much of the American public, for example, had first encountered computers during the 1952 presidential election, when the UNIVAC (correctly) forecast Eisenhower’s victory over Adlai Stevenson early on election night, on live TV. Computers were thus on record as instruments of prediction and prognostication, not retrospection. The RAMAC, by contrast, represented what was perhaps the first digital library. Its multi-lingual capability, a brute force flourish clearly meant to impress, is also worth a comment: in the context of the World’s Fair it no doubt served to reinforce the machine’s supposed objectivity, its omniscient command of the human record and its status as an impartial observer—at least until one realized that with the exception of Interlingua, an artificial language, the languages in question were all those of the major European or imperial powers: English, French, Italian, Dutch, Spanish, Swedish, Portuguese, German, and Russian. (That these were also all alphabetic languages compatible with the text processing technology of the day reinforces rather than diminishes the point.) As perhaps the earliest computational personality on record (almost a decade before Weizenbaum’s ELIZA), the Professor was thus marked out as a first-world citizen of the post-colonial present as well as a trans-historical rememberer of things past.
The RAMAC 305 was an instance of what is generally classed as a storage device. While storage technology has been well chronicled in corporate histories of the computer industry, it has not received much attention in critical media studies. Friedrich Kittler now writes his groundbreaking essays about Intel processors, not the contemporary storage devices that are the heirs to his gramophones, filmstrips, and typewriters. Lev Manovich, Matthew Fuller, and others in the nascent software studies movement tend to focus their attention on interfaces and end-user applications. Perhaps this is to be expected. The word itself, storage, is dull and flat sounding, like footfalls on linoleum. It has a vague industrial aura—tape farms under the fluorescents, not the Flash memory sticks that are the skate-keys of the wifi street. But it is precisely random access disk storage, as inaugurated by the RAMAC, which enabled the database paradigm Manovich sees as fundamental to contemporary new media.
For Manovich, new media productions are characterized by the discrete nature of their constituent objects, and the lack of an essential narrative or sequential structure for how those objects are accessed and manipulated: “In general, creating a work in new media can be understood as the construction of an interface to a database” (226). And while Manovich is reluctant to associate database and narrative with specific storage technologies in any deterministic sense—the codex book, he notes, is the random access device par excellence, yet it is a haven for some of our most powerful narrative forms (233)—the fact remains that computers could not have evolved from war-time calculators to new media databases without the introduction of a non-volatile, large-volume, inexpensive storage technology that afforded the operator near-instantaneous access. Magnetic disk media, more specifically the hard disk drive, was to become that technology, and, as much as bitmapped GUIs and the mouse, to usher in a new era of interactive, real-time computing.
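The difference at stake between tape and disk can be illustrated with a toy sketch of my own (not drawn from the RAMAC itself; the record size and file path are arbitrary assumptions): on a sequential medium, reaching the n-th record means streaming past everything before it, while a random access device can jump straight to the record’s offset.

```python
# Toy contrast between sequential (tape-style) and random (disk-style) access
# to fixed-length records. The record size and file path are hypothetical.

RECORD_SIZE = 100  # bytes per record (an arbitrary assumption for the sketch)

def read_record_sequential(path: str, n: int) -> bytes:
    """Tape-style: every record before the n-th must stream past the head."""
    with open(path, "rb") as f:
        for _ in range(n):
            f.read(RECORD_SIZE)
        return f.read(RECORD_SIZE)

def read_record_random(path: str, n: int) -> bytes:
    """Disk-style: seek directly to the record's byte offset, then read."""
    with open(path, "rb") as f:
        f.seek(n * RECORD_SIZE)
        return f.read(RECORD_SIZE)
```

The first function’s cost grows with n; the second is effectively constant, which is what makes a query-driven, database-style system thinkable in the first place.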
I’ve said it before and I’ll say it again: digital preservation represents a significant technical challenge, but it’s first and foremost a social challenge.
Simson Garfinkel, whose work I’ve followed, says much the same in the current issue of the MIT Technology Review: “The Myth of Doomed Data.” His piece deftly deconstructs the much ballyhooed example of the Domesday Book, which was originally digitized in the mid-1980s on 12” video disks—a format now well and truly extinct. The punch-line irony is of course that the original, handwritten on parchment leaves in 1086, is still perfectly legible. The digitized version, meanwhile, was painstakingly rescued by a team of dedicated research scientists—at great cost—only a decade and a half later. So digital preservation must be a fool’s errand, right? Wrong. Here’s Garfinkel:
To be sure, this has all been an expensive and time-consuming process. But it has been done, proving that the process is possible. Not all digital material is worth preserving—most, in fact, is not. But Domesday was worth preserving and, as a result, it has been.
Indeed, for every Domesday Project that has lost its data to proprietary equipment and file formats, it is easy to point to another project for which information created decades ago is still available. The Internet “Request For Comment” (RFC) series, started back in the 1970s, is readable on practically every computer on the planet today because the RFCs were stored in plain ASCII text. Similarly, you can download images sent back from the Voyager space probes 30 years ago and view them on your PC because NASA stored those pictures as bitmaps—pixel-by-pixel copies of the images without any compression whatsoever. Some argue that it’s impossible to look into the future and determine which of today’s formats will survive and which will go the way of the VP 415. Poppycock! As a society we have a very good understanding of what will make one file format endure while another one is likely to perish. The key to survival is openness and documentation.
I’ve said it before and I’ll say it again: digital preservation represents a significant technical challenge, but it’s first and foremost a social challenge.
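To make Garfinkel’s two survivors concrete: plain ASCII text and raw, uncompressed bitmaps endure because reading them demands almost nothing beyond the documentation itself. A minimal sketch (the file names and image dimensions here are invented for illustration):

```python
# Reading the two kinds of files Garfinkel cites as survivors. File names and
# dimensions are hypothetical; the point is how little prior knowledge a
# reader of the format needs.

# A plain ASCII document, such as an Internet RFC: any machine can decode it.
with open("rfc791.txt", "r", encoding="ascii") as f:
    text = f.read()

# A raw, uncompressed grayscale image: given only the documented width and
# height, the pixels come back byte by byte, with no codec or vendor software.
WIDTH, HEIGHT = 800, 800   # documented dimensions (hypothetical)
with open("voyager_frame.raw", "rb") as f:
    pixels = f.read(WIDTH * HEIGHT)
rows = [pixels[y * WIDTH:(y + 1) * WIDTH] for y in range(HEIGHT)]
```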
I have a confession to make. I was a teenage grognard. It’s something I’ve long repressed, but Thanksgiving weekend back home I fell off the wagon and I fear there may be no help for it. Deep down I think I like it.
So what is a grognard, you ask? What deep and terrible secret have I been keeping? A grognard is a wargamer. The name was a term of affection that Napoleon bestowed on his grizzled veterans, the ones who had been with him from the Peninsula campaign to the snows of Russia to the rain-soaked fields at Waterloo. In a sense, I’ve been to those places too. On many a day after school you would have found me hunched over a topographical map overlaid with a hexagonal grid, pushing around dozens if not hundreds of little pieces of cardboard with cryptic markings, rolling dice, and consulting charts and a rulebook, thick and closely typed. The materials were crude indeed, but in retrospect I realize these were some of my first virtual realities. Books too, I suppose, but this was different. This was interactive. You would have seen me huddled over a mess of maps and markers, but I would have been busy turning Blucher’s flank at Ligny or plotting the movements of the third armored corps racing across the Sahara desert, or even (this was the eighties) working to counter a sudden Soviet air drop into the cities of western Europe.
Let’s get a few things straight. Wargames are not Dungeons and Dragons. I played my share of weekend D&D, but wargames were a solitary obsession: the rulebooks were thicker, the action was, if possible, even more opaque and plodding, and I alone among my friends seemed compelled by the chimera of interactive history. We’re not talking about miniatures (i.e., toy soldiers) here either. Miniatures are a type of wargame with their own often intricate rule sets, and they’re gorgeous to behold. I coveted them. But I had neither the money nor the patience and skill to acquire and paint rank after rank of lead figures, or mold and shape the sandtable terrain on which they did battle. The games I played were manufactured by companies you probably won’t have heard of unless you were part of the hobby: Avalon Hill (most famous perhaps for a once-popular game called Diplomacy), the late great SPI (already defunct by the time I started gaming in the early eighties), Victory Games, GDW, and a handful of others. Dollar after dollar, first from my allowance and then from my after-school jobs, went into acquiring these cardboard universes.
Wargaming, let it be said, has nothing to do with warmongering, or the glorification and celebration of war. For one thing, a good simulation—watching what happens when you send a column of troops across a wide open field to break an enemy position because, suicidal as it is, this is the bottleneck for the entire front and there’s no other option—can teach someone more than reading can about the irreducible brutality of warfare, from sticks and stones to tactical nukes. But wargames were admittedly interesting to me for less high-minded reasons. They’re almost irresistible for anyone interested in military history, especially alternative history: could I have done it better? Would things have turned out differently if the German armor hadn’t been a hundred miles up the coast from Normandy? If Pickett’s charge had worked? If it hadn’t rained the night before Waterloo, bogging down Ney’s cavalry? Wargames are also of obvious interest to people who appreciate the art of modeling and simulation; a wargame, like any game, is a mechanism, a finely-tuned formal system. At the heart of a good wargame is a delicately wrought balance between playability and historical accuracy. But wargames foreground the interplay among their internal components, like a watch with its movement exposed, through their conspicuous apparatus of charts and tables, endless lists of modifiers and special cases. (No surprise, then, the cross-over between wargamers and computer geeks.)
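For anyone who never pushed cardboard, that conspicuous apparatus of charts and tables usually boils down to something like the sketch below: a generic, drastically simplified combat results table of my own invention (not the rules of any published game), where odds, a die roll, and a terrain modifier yield an outcome.

```python
import random

# A generic, much-simplified combat results table (CRT), the core mechanism of
# hex-and-counter wargames. Columns, modifiers, and outcomes are invented for
# illustration and do not reproduce any published title's rules.

CRT = {
    # odds column: outcomes indexed by modified die roll 1-6
    "1:2": ["AE", "AE", "AR", "AR", "NE", "DR"],
    "1:1": ["AE", "AR", "AR", "NE", "DR", "DR"],
    "2:1": ["AR", "NE", "DR", "DR", "DR", "DE"],
    "3:1": ["NE", "DR", "DR", "DE", "DE", "DE"],
}
# AE = attacker eliminated, AR = attacker retreats, NE = no effect,
# DR = defender retreats, DE = defender eliminated

def resolve_attack(attack: int, defense: int, terrain_modifier: int = 0) -> str:
    """Round the odds down to a column, roll a die, apply modifiers, look up."""
    ratio = attack / defense
    if ratio >= 3:
        column = "3:1"
    elif ratio >= 2:
        column = "2:1"
    elif ratio >= 1:
        column = "1:1"
    else:
        column = "1:2"
    roll = random.randint(1, 6) + terrain_modifier
    roll = max(1, min(6, roll))   # clamp the modified roll to the table
    return CRT[column][roll - 1]

# e.g. resolve_attack(8, 3, terrain_modifier=-1) might return "DR"
```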
Perhaps most of all, though, wargames fed my interest in narrative, which in turn has something to do with why I eventually went to graduate school in English and not military history. Here, I realize, I’m treading on a raging debate in contemporary game studies (that may or may not have taken place according to the latest accounts), but my wargame experience compels me beyond a shadow of a doubt to believe that games can be, can become, narrative. A key move or assault, a well-played defense, a deft maneuver or a tenacious holding action: some badly-printed die-cut little cardboard square (labeled “Second Armored Division”; “Third Platoon”; “82nd Airborne”; “The Horse Guards”) would take on a life of its own as the rest of the game ebbed and flowed around its aura. I played the games solitaire and tried not to take sides—I never deliberately made a “bad” move or fudged the dice—but if a favorite unit was suddenly cut down by enemy fire or suffered an ignominious defeat I was shaken by it.
There is, of course, a thriving grognard subculture on the Web, though the hobby is but a shadow of its former self (you can read that story here if you want to: “A Farewell to Hexes”). Last weekend at my parents’ house I dug some games out of the closet in my old room and spent an hour flipping through the rulebooks and maps. Most of it was opaque to me. Most of the games, like Wellington’s Victory (a grand tactical-level simulation of Waterloo involving thousands of units and four maps, and requiring upwards of fifteen hours to play), I know I’ll never touch again. I certainly don’t have the time, or the space, or even the will to ramp back up on the rules. Nor do I necessarily have any more patience or skill (or money) for miniatures than I once did. And as I now realize from reading Scott McCloud, as tantalizing as the brightly colored miniatures were in an HO model train kind of way, there’s a reason why I found the cardboard and paper so compelling: closure, the way we imaginatively project our own idiosyncratic verisimilitudes onto more abstract modes of representation. So I left Wellington’s Victory but took some of the simpler Napoleonic games back with me, and have since passed an hour or two at Marengo, Wagram, and Quatre Bras (yes, the old SPI versions). I also think I’ll play Squad Leader again, Avalon Hill’s brilliant simulation of World War II infantry combat that struck a near-perfect balance between playability and realism. There was little chance of holding the ridge, but Major Everson had one last desperate hope. The remains of one platoon he sent to the far slope to screen the enemy advance. Corporal Kelly, meanwhile, a veteran of over a dozen major actions with a rucksack full of antitank mines, led a second, smaller group to spring a trap for the enemy armor moving slowly up the sunken road. It might work. Probably not, but it might—the terrain was in their favor. Just then the first shells from a previously hidden gun battery began bursting around Everson’s command post . . . I think I might also pay a visit to Dream Wizards up in Rockville, a legendary game store that I used to read about in the hobby’s magazines. Or maybe I will start collecting a few miniatures after all, slowly. And yes, of course, I’ve been researching computer wargames, though a part of me resists: the whole point is to get up and away from the keyboard.
My present professional self finds no lack of ways to critique the project of using grids and rules and dice to “simulate” the ghastly phenomenon of war. It’s hard to read critical theory, or for that matter read a newspaper in today’s world and still want to be an armchair general. No surprise that when I went to grad school I left the games at home. But in the six degrees of separation that are the hexagonal grid of life, nothing is truly isolate. The author of “A Farewell to Hexes” turns out to have a blog I sometimes read. In fact, I suspect there may be one or two ex-grognards lurking on my blogroll (I’d love to hear from you). And perhaps one day these flimsy pieces of bellicose cardboard will be my springboard to some future project.
Not only are we suddenly talking about going back to the moon, but, according to a CNN report, Taito Corporation is planning to manufacture and sell 10,000 brand new Space Invaders units. Cabinets will go for $2800 a pop, an individual game fifty cents. There will be no changes to the ROM.
The news media and lots of folks around the blogosphere are having their fun with Donald Rumsfeld, who won a satirical “award” from something called the Plain English Campaign for the following statement:
Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.
Let me first say that you won’t find me silkscreening T-shirts for the Donald Rumsfeld fan club any time soon. His statement, however, is in fact perfectly lucid, and parses according to some classic rhetorical structures. The appearance of obfuscation is obviously the result of the repetition of the word “known” (and its variants), but the logical basis of the sentence is never at issue. This strikes me as a good example of the public sphere’s general lack of tolerance for subtlety or complexity in language—essentially the same impulse, I would argue, that leads people to rail against “jargon” in the academic humanities (but somehow it’s okay when doctors or engineers use big words we don’t understand). Again, I’m not much interested in holding up Rummy as a paragon of discursive grace, but one wants to ask: which words in his sentence are not plain English?
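For what it’s worth, the underlying logic reduces to a two-by-two grid, awareness of a question crossed with possession of its answer; the enumeration below is my gloss on the statement, not Rumsfeld’s.

```python
from itertools import product

# Rumsfeld's taxonomy as a 2x2 grid: whether we are aware of a question,
# crossed with whether we hold its answer. This gloss is mine, not his.

for aware, answered in product(["known", "unknown"], repeat=2):
    print(f"{aware} {answered}s")

# The four cells, in print order:
#   known knowns      -> things we know we know
#   known unknowns    -> things we know we do not know
#   unknown knowns    -> the fourth cell, which the statement leaves unnamed
#   unknown unknowns  -> things we do not know we do not know
```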
Update: Dennis Jerz seems to agree.
The electronic William Blake Archive has won the Modern Language Association’s 2003 award for a Distinguished Scholarly Edition. The Archive is edited by three of the most important Blake scholars working today, Morris Eaves, Robert N. Essick, and Joseph Viscomi. I’ve been associated with the site since 1997, first as project manager during my time at Virginia, then as a technical editor, and now as a consultant. This is the first time this important award has been bestowed on an electronic edition, and it’s noteworthy not least because the Blake Archive challenges the very idea of an “edition.”
The award is also significant in a related but more subtle respect. One of the reasons “Web sites” (and the Blake Archive is a good deal more than a Web site—it’s really a structured electronic environment rendered in SGML and delivered via HTTP) suffer from academic illegitimacy is that they don’t have any entrée into the prevailing channels of scholarly communication. They’re not listed in library catalogs (though this is changing), they’re not received for review by journals and serials, and they’re typically not on the radar screen for awards like this—not necessarily because of any overt prejudice against the medium, but simply because as a mode of scholarly production a Web site does not circulate in the same way as a monograph. So here’s hoping this is the start of a trend. Kudos to the MLA for its progressive actions on behalf of electronic scholarship, and warm congratulations to the editors and current project team. The award will be officially conferred at the upcoming convention in San Diego.
I had jury duty today. Montgomery County, Maryland has a one day or one trial system, meaning that if you don’t get picked for a jury you’re done. I wasn’t picked.
The day begins with a video that features a reenactment of some wretch of a medieval peasant undergoing trial by ordeal. They throw him into a pond: if he sinks (and drowns) he’s innocent; if he floats (and lives) he’s guilty (and burned at the stake or worse). We’ve come a long way, baby, or so narrator Diane Sawyer, whose custom-fitted behind I seriously doubt has ever touched a juror’s chair, tells us. The rest of the morning in the jury lounge is a matter of waiting for your number to be called. Your zone of attention, other than whatever it is you’re doing to occupy yourself (for me it was a copy of Paul N. Edwards’ The Closed World, an alternative history of computing which I’ve been meaning to get to for a while now), narrows to listening for your number over the sound system, whereupon you’re sent upstairs to a courtroom for a voir dire. My number, never called, was 258A.
I’ve never had jury duty before (sheltered life or dumb luck I guess—I’ve been voting regularly since I came of age) and I have mixed feelings about the experience. On the one hand, the appeal to civic duty resonates strongly with me. If you’re not willing to sit on a jury you lose the right to complain the next time a case you care about goes the other way. On the other hand, however, we were told that one trial would last eight business days, another twelve (that’s two and a half full weeks, plus deliberations). For someone to object that they simply can’t take that amount of time away from their regular responsibilities is not unreasonable. Nor is it enough to piously say, “well that’s the system and we all benefit from it.” And herein lies the problem: many people, I suspect, would be happy to sit on a jury for a day or two, but they start to get queasy when contemplating the major life disruptions that a two-week absence from work or family would entail. The courts, however, recognize no formal distinction between these two situations, and while they’re very appreciative and conscientious about thanking jurors for their time—a judge came down to greet us, for example—it’s also clear that until your summons is released your butt belongs to them. This leads to all sorts of conniving and shenanigans for getting out of jury duty, and it really can’t serve the system very well in the long run. Our local jury commissioner used her head and handled the morning’s situation by asking who would be available for consideration for one of the shorter trials, even if not the longer two. This greatly cut back on the number of people who had suddenly appeared at the front of the room with pressing reasons for needing to be elsewhere.
Out of the seven trials scheduled to begin that day, four ultimately resolved themselves without a jury needing to be impaneled. Apparently that’s mildly unusual, so most of us got to go home at noon. I collected my $15.00 for expenses and stopped at a taco place to have lunch with my tax dollars. There my number finally came up. Order 52.