Lots to say about this kind of bean counting, obviously, but the research is fascinating and quite, um, informative . . .
My own favorite definition of information, incidentally, comes from something I once read, and now can no longer find, by Umberto Eco: information is the confirmation of unlikely facts (I’m paraphrasing). Of course that’s based on classic Shannon . . . but my own inability to locate the quotation is a perfect example. If you were to tell me (as I devoutly hope someone will) that this is from page 83 of such and such a text, well, that would be information because it’s unlikely the passage was on page 83 of such and such a text, as opposed to page 82 or page 84 or page 83 of some other text or page 82 or 84 of some other text . . . and so forth.
Last night Jason Nelson blew the doors (not to mention the speakers and a few minds as well) off a packed house at the University of Maryland. The set list included new work like Dreamaphage and Conversations, old favorites (Plush, Nine Attempts to Clone a Poem, and Panhandle), and some early stuff in HTML. Plus stories about trains and Myrtle Beach seafood buffets. Eighty or more people left the room with the new-found notion that “electronic literature” is not an oxymoron, not to mention that Jason is a pretty strange (and brilliantly talented) guy. Thanks to the Jiménez-Porter Writers’ House, MITH, and everyone who helped put this together.
Update: A fresh interview (of sorts) with Jason by Scott Rettberg, interpolated by a dragon.
Demian Katz’s Gamebooks is one of those plain, no-nonsense Web sites that projects an air of instant authority for its treasure trove of resources, including a complete catalog of the classic Choose Your Own Adventure series.
Most interesting to me was his description of a recent book by one Kim Newman called Life’s Lottery (Pocket Books, 1999):
This is an amazing book. It contains more possibilities than you could possibly expect, and the paths through it run from funny and touching to grim and disturbing. The more you read it, the more the paths rebound off of one another, increasing the meaningfulness of all that happens. . . . Even the mechanics of the book are somewhat innovative. The book uses the “go to x, then y” instruction, which requires the reader to read two sections in a row — this means that events that happen in the middle of several different paths don’t have to be pasted repeatedly into different parts of the book. A nice space-saver. Even more interesting is the fact that the book works if you ignore the instructions and simply read it from cover to cover — there are intermediate sections which can only be found if you read it this way and which give meaning to the proceedings. In my opinion, this is a book that everyone (gamebook fan or not) should read. It shows the remarkable power of the interactive format, and it’s more than just a little bit thought-provoking.
A gamebook with real literary merit? That sounds enormously promising (and long overdue). The only drawback, according to Katz, is that right now it’s only available in the UK; so not practical for classroom use.
Kodak has recently announced it plans to stop making slide projectors:
“The Kodak slide projector has been a hallmark for quality and ubiquity, used for decades to produce the best in audio visual shows throughout the world,” the company said. “However, in recent years, slide projectors have declined in usage, replaced by alternative projection technologies.”
This morning the Web is abuzz, rightly so, with the news of Amazon’s full-text search option on 120,000 titles in its catalog (with plans for very rapid expansion). The implications are extraordinary, but I want to home in on one specific aspect of the project as reported in Wired’s “The Great Library of Amazonia”:
The copyrights to these titles are spread among countless owners. How was it possible to create a publicly accessible database from material whose ownership is so tangled? Amazon’s solution is audacious: The company simply denies it has built an electronic library at all. “This is not an ebook project!” Manber says. And in a sense he is right. The archive is intentionally crippled. A search brings back not text, but pictures — pictures of pages. You can find the page that responds to your query, read it on your screen, and browse a few pages backward and forward. But you cannot download, copy, or read the book from beginning to end. There is no way to link directly to any page of a book.
What intrigues me here is that the characteristics of different data formats are being deliberately (and rather cannily) leveraged against the copyright issue. It’s not a “library” or even a “book” because computationally the data is expressed as an image rather than machine-readable text. (I recently wrote about the differences between image and text as data types in an essay in MIT’s Eloquent Images volume.) This has very important implications for much of the current work in fields like textual studies, and indeed makes that work practically relevant to the legal and commercial sphere. Of course it also raises a more fundamental question: what is an ebook, or indeed a book—functionally, computationally, and imaginatively?
Wanted to put the discussion down here so as not to intrude on the textual space of the poem.
This is, if I can adapt a phrase from Deena Larsen, a vicious little cybertext, one which I think I’m going to use to kick off my Computer and Text course next semester. There’s just so much to do (and I do mean do) with it: self-referentiality, obviously; the poem as Williams’ “machine made of words”; the question of what is a poem (and a text); the address to (and implication of) the reader; the visual layout; even typing it in revealed my own conditioning by media—try it yourself and see if you can maintain the discipline to do a hard return (instead of space) after each word.
Another one I might throw at them opening day is Arthur C. Clarke’s one-page “The Longest Science-Fiction Story Ever Told.” Go look it up, it’s in the Collected Stories.
Crack the case, let in some light: the interior of a computer is a dense black box teeming with inscriptions seen and unseen, macro- and microscopic, alphanumeric and symbolic, electromagnetic and photolithographic, tumbling at all angles and orientations—up, down, over, under—through the internal subdivisions of boards and components: a graphemic riot of hieroglyphics and graffiti etched and stamped and dyed and pasted. Machined language not machine language (but to see it all takes more than human eyes). Beside such riches the pressed decals of the keyboard can seem poor indeed, and the flickering signifiers of the screen dim and wan.
Got this via the RISKS Digest:
Nine out of 10 computer users are stressed out by such regular occurrences as performance slowdown, spam overload and lost files, and the time wasted fixing problems just makes it worse, according to security firm Symantec. Anger management experts say computer stress must be alleviated before it affects productivity and human-to-human interactions. “If you are suffering from stress, the best thing to do is to breathe deeply, and remind yourself to keep your cool,” says Mike Fisher, of the British Association of Anger Management. The top five stress triggers, according to Symantec, are: 1) Slow performance and system crashes; 2) Spam, scams and e-mail overload; 3) Pop-up ads; 4) Viruses; and 5) Lost or deleted files. Men tend to freak out over viruses, spam and general information pollution, while crashing systems and sluggish performance really irk women. More than a third of both sexes will resort to extreme behavior during computer-related meltdown, including violence, swearing, shouting and desperately hitting random keys. The good news is that 40% will actually try to fix the problem, often asking someone else for help. Symantec’s Kevin Chapman suggests a few ways to reduce the potential for problems: “For example, don’t download lots of large files and applications, and remove the clutter left behind by long periods on the Internet. To avoid spam, don’t sign up for lots of mailing lists, and if you do receive spam-mail, never reply to it asking to be removed from the list as this will confirm your e-mail address.” [BBC News 23 Oct 2003; NewsScan Daily, 23 Oct 2003]
The real reason I’m posting this, though, is just to have an excuse to link to an old MPEG file that demonstrates some of the above principles in action. Guess I’m easily amused.
Three recent finds:
First, via Slashdot a few days ago, Molecular Expressions, a gallery of microscopic images etched alongside the copper traces of silicon chips—the illuminated printing of our age.
Next, needle drops, a well-written and extremely well-informed column on electronic music to which I’ll regularly return.
Finally, via my colleague Bill Sherman, the “Industrious Clock.”
I’m deep into the anthology of Harvey Pekar’s American Splendor comics, and it’s really remarkable, mesmerizing. Harvey goes to work, Harvey jives, Harvey hustles records, Harvey hangs out at the corner, Harvey goes down to the supermarket and buys bread. Not Superman but Everyman. It’s slice of life stuff, obviously, but it’s also more than that. Was it Harvey whom Lee Ranaldo had in mind when, at the end of a Sonic Youth track called “Small Flowers Crack Concrete,” he chants the line Fucked up in Cleveland?
Seems like everybody who’s anybody is up at AoIR this weekend. I’ve been enjoying all the blog coverage, much too abundant to link to here, but wanted to take special note of the posts (and another) on the MMRPG panel which included not only Jason Rhody on textual practices in Asheron’s Call but also another student of mine, D. Snyder, who’s doing major work on The Sims.
Last night I watched part of the cable TV movie on the DC-area sniper shootings. I was here and lived it, so I’ve earned the right to some trash TV. Anyway, the most unintentionally funny line in the movie was when, during that terrible first morning, the police are trying to figure out if there’s a pattern to how the killer is moving so easily through the area and one of the detectives yells out “He must be using the Beltway!” All but one of the first day’s shootings, you’ll recall, took place during the morning rush hour, and anyone who’s ever sat in the parking lot that is the Capital Beltway at rush hour knows why this one’s a howler.
What follows is the first of what I hope will be a series of occasional excerpts from my current book project, Mechanisms: New Media and the New Textuality. This material is drawn from the first chapter, entitled “Extreme Inscription.” Readers here will note that I’ve been posting a fair amount on magnetic media and disk storage, the subject for this chapter; indeed, bits and pieces of those posts are finding their way into the manuscript (in that regard the blog has been an invaluable freewriting tool). This excerpt is presented in a more polished state, but of course it’s still very much in draft and I’d greatly appreciate comments of any kind.
For simplicity I have omitted most of the notes.
Mechanisms is under contract to the MIT Press. All material is offered here as copyright © Matthew G. Kirschenbaum, all rights reserved. This copyright notice supersedes the Creative Commons license in place for the rest of the blog.
I am referring to the devices we call hard drives, which I will be examining in some detail as the pre-eminent digital storage technology of our day. The hard drive and magnetic media more generally are mechanisms of extreme inscription—that is, they offer a practical limit case for how the inscriptive act can be imagined and executed. The kind of inscription a hard drive actually performs is non-linguistic, invisible without highly specialized instrumentation, and recursively encoded. Nonetheless, we will see that what happens at the surface of the disk is ultimately inscription, and that the hard drive is very literally a writing machine. Hard drives are in fact unique among magnetic storage technologies in that the mark-making instrument (the read/write head of the drive) does not make physical contact with the inscription surface: the two are separated by a space a fraction of the width of a human hair. To examine the hard drive at this level is to enter a looking glass world where the Kantian manifold of space and time is measured in millionths of a meter (microns) and milliseconds, a world of experimental-edge engineering rooted in the age-old science of tribology, the study of interacting surfaces in relative motion. Inside the hermetically sealed recesses of the drive the behaviors of magnetic fields are pushed to their physical limits by technologies with names like giant magnetoresistive cores, while the individual bits themselves are subjected to recondite data encoding schemes to impose digital structure on the relentlessly analog tendencies of magnetic media. Some of the material I will be presenting is unapologetically technical; this is necessary because my ultimate goal is to situate the hard drive in a critical history of inscription that is millennia old, but which more specifically descends from the telegraph, the telephone, the phonograph, and other extrasensory engines of Victorian modernity.
As students of old new media such as Friedrich Kittler or more recently Lisa Gitelman see so clearly, writing, for quite some time now, has meant more than visual transcription, and inscription has meant more than alphabetization—indeed, one way of reading mechanical writing machines, notes Gitelman, is as artifacts of a culture’s “consensual, embodied theories of language” (5). I will not be advancing any such ambitious “reading” of the hard drive here, but I will hope to show that the device has an aesthetic or symbolic as well as a functional dimension and that it is an emblem of a certain kind of human-computer interaction that has its roots in recoverable discourses about technologies and the body. This will involve us in critiques of inscription and instrumentation akin to those advanced by Bruno Latour and Timothy Lenoir in their work on the history of science, for the hard drive represents a rich site for examining technologies of inscription in relation to scientific practices of instrumentation.
Some may object that a decision to focus on hard drives, the most overtly mechanical portion of the computer, is arbitrary, even tendentious. The personal computer era was well underway without them, though the technology has actually been around since the 1950s. They are also, of course, by no means the only storage media in common use today, and there is increasing evidence that they will be surpassed, not only by solid state or laser optical devices, but also by more advanced techniques such as holography. Nonetheless, hard disk devices have been the primary storage media for personal computers since the mid-1980s, and also for countless internet and intranet servers; they are historically central to any narrative of computing and inscription in the twentieth century. Though their speed, capacity, and reliability have all increased dramatically—increases in the capacity of drives have in fact outstripped the famous Moore’s Law for processor speeds—basic drive technology remains remarkably unchanged since it was first introduced by IBM.
Rather than offer up yet another generalized account of electronic textuality, then, my objective in this chapter will be to examine one specific new media writing technology in its unique technical and imaginative milieu, and thereby connect to the kind of new histories of inscription being rendered by such diverse critics as Kittler, Gitelman, Latour, Lenoir, Patricia Crain, and Adrian Johns. Put another way, “the computer” as a generic appellation will not do as a starting point for the kind of investigation of electronic writing I am interested in, any more than “the book” by itself suffices as a useful rubric for serious students of earlier periods of textuality. Here we will follow the bits all the way down to the metal.
[ . . . ]
What, then, are the essential characteristics—the grammatological primitives, as it were—of the hard disk drive as inscription engine? I propose the following: it is random access; it is a signal processor; it is differential (and chronographic); it is volumetric; it is rationalized (and atomized); it is motion-dependent; and it is non-volatile (but also variable). I gloss each of these in further detail below, while also explaining something of the technical operation of the drive.*
It is random access. Like the codex and vertical file cabinets and vinyl records, unlike the scroll or magnetic tape or a filmstrip, hard drives permit (essentially) instantaneous access to any portion of the physical media, without the need to fast-forward or rewind a sequence. We will discuss the specific technological climate that led to the development of magnetic disk storage in more detail later in the chapter, but here the point is simply to align the hard drive with one of two age-old traditions in the history of recordable media.
It is a signal processor. The conventional wisdom is that what gets written to a hard disk is a simple magnetic expression of a bit: a one or a zero, aligned as a north or south polarity. In fact, the process is a highly condensed and complex set of symbolic transformations, by which a “bit,” as a binary value in the computer’s memory, is converted to a voltage passed through the drive’s read/write head where it creates an electromagnetic field reversing the polarity of not one but several individual magnetic dipoles—a whole pattern of flux reversals—embedded in the material substrate of the platter. Likewise, to read data from the surface of the disk, these patterns of magnetic fields (actually patterns of magnetic resistance), which are received as analog signals, are interpreted by the head’s detection circuitry as a voltage spike that is then converted into a binary digital representation (a one or a zero) by the drive’s firmware. The relevant points are that writing to and reading from the disk are ultimately forms of digital-to-analog and analog-to-digital signal processing—not unlike the function of a modem—and that the data contained on the disk is a second-order representation of the actual digital values the data assumes for computation.
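The read path just described can be sketched in miniature: the head hands back an analog signal, and detection circuitry classifies each sample as a voltage spike (a flux reversal) or its absence. This toy model, with its arbitrary threshold and clean sample values (both assumptions for illustration), stands in for circuitry that is in reality vastly more sophisticated:

```python
# Toy model of the drive's detection circuitry: classify each analog
# sample from the head as a flux reversal (1) or no reversal (0).
# The 0.5 threshold and the sample values are illustrative assumptions.

THRESHOLD = 0.5

def detect(samples):
    """Turn analog voltage readings into a binary reversal pattern."""
    return [1 if abs(v) >= THRESHOLD else 0 for v in samples]

# One (noisy) reading per bit-cell position:
samples = [0.9, -0.05, 0.8, 0.7, -0.1, 0.85]
assert detect(samples) == [1, 0, 1, 1, 0, 1]
```

The point of the model is only the second-order relation it dramatizes: what lives on the platter is an analog pattern, and the digital values exist on the far side of an interpretive circuit.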
It is differential. The read/write head measures reversals between magnetic fields rather than the actual charge of an individual magnetic dipole. In other words, it is a differential device—signification depends upon changes in the value of the signal being received rather than the substance of the signal itself. (Readers may recognize similarities to the classic Saussurean thesis of differential relation in linguistic meaning.) As noted above, the magnetic patterns on the surface of the disk are not a direct representation of bit values but an abstraction of those values, filtered through a range of encoding schemes that have evolved from basic frequency modulation to the current state of the art, which is known as PRML (Partial Response Maximum Likelihood). There are several reasons for this, but the most important concerns the drive head’s need to separate one bit representation from another: if the disk were to store a long, undifferentiated string of ones or zeros, the head would have no good way to determine precisely where in that long string it was located—was it at the 45th zero or the 54th zero? Frequency modulation, which was the first encoding scheme to address the issue, began each bit representation with a flux reversal, and then added another reversal for a one while omitting a second reversal to represent a zero. The result was that even a long string of absolute ones or zeros would consist of frequent flux reversals that the head could use to measure and orient its position. This is known as clock synchronization or simply “clocking,” and thus we can say that there is a sense in which the hard drive is also a chronographic inscription device. Subsequent encoding schemes have found various ways of improving upon the efficiency of these reversal patterns, such that a variable, and always minimal, number of reversals is used to encode a given bit value.
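The frequency modulation scheme described above is simple enough to model directly: every bit cell opens with a clock reversal, and a one adds a second, mid-cell reversal. A minimal sketch (the list-of-integers representation of the reversal pattern is an assumption made for illustration; real drives operate on analog waveforms):

```python
# FM (frequency modulation) encoding as described above: each bit cell
# begins with a clock flux reversal, and a data reversal follows only
# if the bit is a one. Reversals are modeled here as 1s in a list.

def fm_encode(bits):
    """Map each data bit to a (clock, data) pair of flux reversals."""
    out = []
    for b in bits:
        out.append(1)               # clock reversal: always present
        out.append(1 if b else 0)   # data reversal: present only for a one
    return out

def fm_decode(reversals):
    """Recover the data bits by reading every second position."""
    assert all(reversals[i] == 1 for i in range(0, len(reversals), 2)), \
        "missing clock reversal -- the head has lost synchronization"
    return reversals[1::2]

bits = [0, 0, 0, 1, 0, 1]
encoded = fm_encode(bits)
# Even a run of zeros still produces regular reversals, which is what
# lets the head keep its place ("clocking"):
assert encoded[0::2] == [1] * len(bits)
assert fm_decode(encoded) == bits
```

Note the cost that later schemes improved upon: FM spends a full reversal per bit purely on synchronization, doubling the number of transitions the platter must accommodate.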
Success in developing more efficient encoding schemes is one important factor in the rapidly escalating storage capacity of hard disk drives. PRML is especially interesting because, as its name implies, it is predictive rather than iterative in nature: rather than detecting the voltage spikes associated with each and every flux reversal, the firmware makes guesses as to the value of the bit representation from a sample of the overall pattern. Obviously this sampling, coupled with sophisticated error detection and correction routines built into the signal processing circuitry, works extremely well—users don’t notice that there is any “guesswork” involved in reading their data—but the performance does not change the essential characteristics of the process, which at this very low level are interpolative and stochastic.
It is volumetric. A hard disk drive is a three-dimensional writing space. The circular platters, sometimes as many as ten, are stacked one atop another, and data is written to both sides (like a vinyl record but unlike a CD-R). The read/write heads sit on the end of an actuator arm known as a slider, and are inserted over and under each of the individual platters. The slider arms themselves all extend from a common axis. Thus, a drive with four platters will also have eight vertically aligned slider arms and a total of eight separate read/write heads. That the hard disk offers a volumetric space for data storage is reflected in commonplace expressions, such as when we say a drive is “empty” or “full.”
The physical capacity of the platter to record bit representations is known as its areal density (sometimes also bit density or data density), and innovations in drive technology have frequently been driven by the desire to squeeze more and more flux reversals onto ever decreasing surface space (for example, IBM now markets a hard disk device called a Microdrive, a single platter one inch in diameter). Typical areal densities are now at around 10,000,000,000 bits (not bytes) per square inch. Technologies or techniques that heighten the sensitivity of the drive head’s detection circuitry are critical to increasing areal density because as bits are placed closer and closer together their magnetic fields must be weakened so that they don’t interfere with one another; indeed, some researchers speculate that we are about to hit the physical limit of how weak a magnetic field can be and still remain detectable, even by new generations of magnetoresistive drive heads and stochastic decoding techniques such as PRML. It is important to recognize that bit representations have actual physical dimensions at this level, however tiny: measured in units called microns (a millionth of a meter, abbreviated µm), an individual bit representation is currently a rectangular area about 4.0 µm high and 0.15 µm wide; by contrast, a red blood cell is about 8 µm in diameter, an anthrax spore about 6 µm. Individual bit representations are visible as traceable inscriptions using instrumentation like Magnetic Force Microscopy, which I will be discussing in more detail later in the chapter (the images are striking). While all storage media, including printed books, are volumetric—that is, the surface area and structural dimensions of the media impose physical limitations on their capacity to record data—the history of magnetic media in particular has been marked by a continuous struggle with areal densities.
It is rationalized. There is no portion of the volumetric space of the drive that is left unmapped by an intricate planar geometry composed of tracks (sometimes called cylinders) and sectors. Put another way, the spatial tolerances within which data is written onto the drive (and read back from it) are exquisitely rationalized, much more akin to a Cartesian matrix than a blank canvas. Tracks may be visualized as concentric rings around the central spindle of each platter, tens of thousands of them on a typical disk. Sectors, meanwhile, are the radial divisions extending from the spindle to the platter’s edge. The standard size for a sector is 512 bytes or 4096 bits; if we remember that areal densities of 10,000,000,000 bits per square inch are common, we can get some idea (however abstract) of just how many sectors there are in each of the disk’s many thousands of tracks. (A technique called zoned bit recording allows the outermost tracks, which occupy the greatest linear space, to accommodate proportionately more sectors than the inner tracks.) Formatting a disk, an exercise which many will have performed with floppies, is the process by which the track and sector divisions—which are themselves simply flux reversals—are first written onto the media. There is thus no such thing as writing to the disk anterior to the overtly rationalized gesture of formatting. There is in addition a very low-level type of formatting, always done at the factory, called servo writing. This entails writing a unique identifier (called a servo code) for each separate track so that the head can orient itself on the surface of the platter. Formatting a disk in the way that most are familiar with the process does not alter the servo codes, which the drive’s firmware prevents a user from even accessing. This information is permanently embedded in the platter for the practical life of the drive.
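The Cartesian character of this geometry can be made concrete with the classic cylinder/head/sector-to-LBA translation, which flattens the drive's rationalized coordinates into a single linear block address. The drive parameters below are hypothetical round numbers, not those of any particular drive, and the fixed sectors-per-track figure ignores zoned bit recording, which varies it across zones:

```python
# Sketch of the planar geometry described above: a (cylinder, head,
# sector) triple names one 512-byte region, and the classic mapping
# below flattens that geometry into a single linear address.
# HEADS and SECTORS_PER_TRACK are illustrative assumptions.

HEADS = 8               # e.g., two heads per platter on a 4-platter drive
SECTORS_PER_TRACK = 63  # fixed here; zoned bit recording varies this

def chs_to_lba(cylinder, head, sector):
    """Classic CHS-to-LBA translation; sectors are numbered from 1."""
    return (cylinder * HEADS + head) * SECTORS_PER_TRACK + (sector - 1)

assert chs_to_lba(0, 0, 1) == 0        # the very first sector on the drive
assert chs_to_lba(0, 1, 1) == 63       # same track position, next head
assert chs_to_lba(1, 0, 1) == 8 * 63   # first sector of the next cylinder
```

Every addressable region on the platters falls somewhere in this enumeration; there is no residue of unmapped surface, which is precisely the rationalization at issue.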
Thus, digital inscription, even on the scale of flux reversals embedded in magnetic media, is never a homogeneous act.
Every formatted hard disk stores its own self-representation, a table of file names and addresses known (on Windows systems) as the File Allocation Table (FAT). The FAT, which dates back to DOS (which itself stands for Disk Operating System, the software layer that moved data back and forth between disk storage and the computer’s semiconductor RAM), is the skeleton key to the drive’s content. It lists every file on the disk, together with its address. The notorious eight character/three character file naming convention of DOS and early Windows systems was a direct artifact of their FAT. The basic unit for file storage is not the sector but rather the cluster, a larger grouping of typically 32 or 64 contiguous sectors in a track. Since the size of a file rarely corresponds exactly to a multiple of the size of a cluster, most files have empty sectors appended after the logical end of the file—these unused sectors are called slack space and sometimes contain data remnants from previous files. Clusters, furthermore, are not necessarily contiguous; larger files may be broken up into clusters scattered all over the volumetric interior of the drive. Thus, a file ceases to have much meaning at the level of the platter; instead the links of its cluster chain are recorded in the FAT, where files exist only as strings of relative associations. Defragmenting a disk, another maintenance task with which readers will be familiar, is the process of moving far-flung clusters closer to one another in order to improve the performance of the drive (note that the only active mechanical motion the slider arm performs is moving the heads from one track to another; the more this motion is kept to a minimum, therefore, the faster the access times). The FAT, and the data structures it maps, are arguably the apotheosis of a rationalization and atomization of writing space that began with another random access device, the codex.
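The cluster bookkeeping described above reduces to two small mechanisms, sketched here with illustrative numbers (the cluster values, cluster size, and end-of-chain marker are simplifications of a real FAT volume): the FAT maps each allocated cluster to the next cluster in a file's chain, and any file whose size is not an exact multiple of the cluster size leaves slack space in its final cluster.

```python
# Minimal model of FAT cluster chains and slack space. Cluster numbers,
# cluster size, and the end-of-chain marker are illustrative assumptions.
import math

EOC = -1  # end-of-chain marker (a reserved value in a real FAT)

# FAT fragment: a file starting at cluster 2 occupies clusters 2 -> 7 -> 3.
fat = {2: 7, 7: 3, 3: EOC}

def cluster_chain(fat, start):
    """Follow a file's chain of clusters through the FAT."""
    chain = [start]
    while fat[chain[-1]] != EOC:
        chain.append(fat[chain[-1]])
    return chain

CLUSTER_SIZE = 32 * 512  # 32 contiguous sectors of 512 bytes each

def slack_space(file_size):
    """Bytes left unused in the file's final cluster."""
    clusters = math.ceil(file_size / CLUSTER_SIZE)
    return clusters * CLUSTER_SIZE - file_size

assert cluster_chain(fat, 2) == [2, 7, 3]  # the file is fragmented on disk
assert slack_space(20_000) == 12_768       # nearly a whole cluster of slack
```

The chain is the file, as far as the FAT is concerned: nothing at the level of the platter marks clusters 2, 7, and 3 as belonging together.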
One final point: it is well known that “deleting” a file does not actually remove it from the disk, even after emptying the so-called Recycle Bin. Instead, in keeping with the volumetric nature of disk storage, the delete command simply tells the FAT to make the clusters associated with a given file available again for future use—a special hex character (E5h) is affixed to the beginning of the file name, but the data itself stays intact on the platter. Common desktop utilities work by removing the special character and restoring files to the FAT as allocated clusters; more advanced techniques are sometimes capable of deeper recoveries, even after the clusters have been rewritten. We will be looking at these matters in more detail later in the chapter, but for now the point is simply the master role played by the FAT, itself a purely grammatological construct, in legislating the writing space of the drive.
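The delete behavior described above amounts to two bookkeeping gestures, neither of which touches the data itself. The sketch below is a deliberate simplification (the directory layout and the FREE marker are illustrative assumptions, far simpler than a real FAT volume), but the E5h marker is the genuine convention:

```python
# Sketch of FAT-style deletion: overwrite the first byte of the file's
# directory entry with the E5h marker and free its clusters in the FAT.
# Nothing is written to the clusters themselves. The directory layout
# here is an illustrative simplification.

DELETED_MARK = 0xE5
FREE = 0

def delete_file(directory, fat, name):
    entry = directory[name]
    # Mark the directory entry as deleted:
    entry["name_bytes"] = bytes([DELETED_MARK]) + entry["name_bytes"][1:]
    for cluster in entry["clusters"]:
        fat[cluster] = FREE  # clusters become reusable...
    # ...but the file's content remains on the platter until overwritten.

directory = {"NOTES.TXT": {"name_bytes": b"NOTES   TXT", "clusters": [2, 7, 3]}}
fat = {2: 7, 7: 3, 3: -1}

delete_file(directory, fat, "NOTES.TXT")
assert directory["NOTES.TXT"]["name_bytes"][0] == 0xE5
assert all(fat[c] == FREE for c in (2, 7, 3))
```

An undelete utility need only reverse the first gesture and rebuild the second; hence the recoveries described above.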
It is motion-dependent. As many commentators have pointed out, computing is a culture of speed, and hard drives are no exception. Motion and raw speed are integral aspects of their operation as inscription technologies. Once the computer is turned on, the hard disk is in near constant motion. The spindle motor rotates the platters at up to 10,000 revolutions per minute. This motion is essential to the functioning of the drive for two reasons. First, while the read/write head is moved laterally across the platter by the actuator arm when seeking a particular track, the head depends upon passive motion to access individual sectors: that is, once the head is in position at the appropriate track it simply waits for the target sector to rotate past. (Incidentally, platters spin counter-clockwise; note that this means the head actually reads and writes right to left.) In the past, heads were not sensitive enough to read successive sectors as quickly as they spun by, which led to elaborate schemes that “interleaved,” or staggered, the sectors such that sequential pieces of a file were accessed over the course of multiple rotations. Due to a number of factors, heads are now more than sensitive enough to read each sector in passing, and interleaving is no longer necessary.
Motion is also fundamental to the operation of the drive in a second and even more basic sense. Unlike other forms of magnetic media such as video or audio tape, or even floppy disks, where the read/write heads physically touch the surface of the recording medium, the head of a hard disk drive “flies” above the platter at a distance a tiny fraction of the width of a human hair. (The actual distances are measured in units called nanometers. Earlier we encountered microns; one micron equals 1000 nanometers. Thus, even the length and breadth of bit representations vastly exceed the flying height of the drive head. If these distances are scaled upward we arrive at a picture of a jumbo jet flying a few millimeters above the surface of the Earth.) The rapid motion of the disk creates an air cushion that floats the head of the drive. Just as a shark must swim to breathe, a hard drive must be in motion to receive or return data. This air bearing technology, as it is called (pioneered at IBM in the 1950s), explains why dust and other contaminants must be kept out of the drive casing at all costs. If the heads touch the surface of the drive while it is in motion the result is what is known as a head crash: the head, which, it must be remembered, is moving at speeds upward of one hundred miles per hour, will plow a furrow across the platter, and data is almost impossible to recover. Thus, a key aspect of the hard drive’s materiality as an agent of digital inscription is quite literally created out of thin air.
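The scale comparison above can be checked with back-of-the-envelope arithmetic. The 10-nanometer flying height below is an assumed figure (actual flying heights vary by drive generation); the bit dimensions are those cited earlier in the chapter:

```python
# Comparing an assumed ~10 nm flying height against the bit-representation
# dimensions cited earlier (0.15 micron wide, 4.0 microns high).
# 1 micron = 1000 nanometers.

flying_height_nm = 10    # assumption; varies by drive generation
bit_width_nm = 150       # 0.15 micron, expressed in nanometers
bit_height_nm = 4_000    # 4.0 microns, expressed in nanometers

# Even the smaller bit dimension dwarfs the gap the head flies across:
assert bit_width_nm // flying_height_nm == 15
assert bit_height_nm // flying_height_nm == 400
```

Hence the jumbo-jet image: the inscriptions themselves are one to two orders of magnitude larger than the gap across which they are written.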
It is non-volatile (but variable). Though magnetic media are subject to physical deterioration, tape reels kept in archival storage conditions have been known to preserve their data for upwards of fifty years. The advent of a random access storage device with non-volatile recording capabilities was a crucial catalyst for what we now consider “interactive” computing, a point to which I shall return shortly. Magnetic core memories, which were bulky precursors to disk storage, were random access but offered much lower storage capacities, and because reading was destructive, data had to be rewritten with each successive access. The development of magnetic tape storage was roughly contemporaneous with disk technologies, but of course magnetic tape (and paper tape, which was used earlier) is a serial medium.
Just as important as magnetic disk storage’s non-volatility was the fact that the same volumetric area could be recycled and rewritten. Though the tendency in discussions of storage media is to fixate on permanence and preservation, it is worth remembering that the ability to erase and change data rapidly was a key characteristic of the computer as envisioned by pioneers like Norbert Wiener. Punched cards and paper tape clearly did not meet these criteria. Therefore, alongside its non-volatility, we must also acknowledge magnetic media’s variability. Interestingly, holographic storage, which some see as eventually replacing magnetic media—data is stored in a solid array of crystals—is not generally reusable. One speculation is that holographic storage will be so cheap and capacious that it will not be functionally or economically necessary to ever erase anything. (With holographic storage, areal density thus becomes a three-dimensional metric.) Such a technology would explode current conventions of data storage, re-conceiving human-computer interaction as fundamentally as random-access non-volatile (but variable) storage media did in the 1950s. A glimpse of that future is perhaps to be had in Microsoft researcher Gordon Bell’s MyLifeBits project, described as “a lifetime store of everything. . . . Gordon Bell has captured a lifetime’s worth of articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, and voice recordings and stored them digitally. He is now paperless, and is beginning to capture phone calls, television, and radio.”
Come one, come all (feel free to write me for directions):
The Jiménez-Porter Writers’ House and MITH present FLASH artist and writer Jason Nelson.
Thursday, October 30, 7:00 p.m.
MITH, The Maryland Institute for Technology in the Humanities
McKeldin Library 6107
University of Maryland, College Park
Out of the Oklahoma plains comes the swirling, oddly crafted, poetic world of Jason Nelson’s New Media poetry and prose. He will read, click, and shudder in person and on the screen, over the speakers and through the keyboard. Join us for his electronic literature array.
Jason Nelson was raised an Oklahoma poet, but found the allure of electronic bits far too strong to remain moored in print. His projects include Hyperrhiz, a hypermedia literary journal [http://www.heliozoa.com], and Secret Technology. His work has appeared in a variety of print and online journals including Beehive (Brown University), Boomerang (UK), Epitome (Madrid), 3rdbed (NYC), Nowculture, Blue Moon Review and others. In addition his work has been featured in art galleries worldwide. Nelson has a B.A. in Cultural Geography from the University of Oklahoma and an M.F.A. in Poetry from Bowling Green State University. Next year he’ll be missing the Plains while moving to Queensland, Australia to join the new media faculty in an innovative interdisciplinary program at Griffith University.
Complete streaming video (and a very nice presentation too) of a round table on “New Directions in Humanities Computing” from the 2002 ALLC/ACH conference in Tübingen, Germany. Speakers include David Robey, László Hunyadi, Thomas Rommel, John Dawson, Susan Hockey, Jean Anderson, Willard McCarty, Harold Short, John Unsworth, and Bill Kretzschmar.
It’s kind of like picking up the trash that gets into your yard. You do it to keep the place looking nice.
In this week’s City Paper, our free alternative weekly here in DC (page 129 for those of you playing along at home):
A black-and-white photo ad showing a guy and a girl, young and dressed to impress. The guy’s blond, moussed, he’s got his sleeves rolled. The girl’s in a crop top and jeans, her lower back tattooed. (You know the look.) Both turn their bare upper arms to face the camera. The copy:
“This is not your parent’s smallpox vaccine!”
Well. Bet you didn’t see that one coming. Turns out it’s the National Institutes of Health and some other federal agencies recruiting “healthy adults 18-31 years old” to participate in an experimental vaccination program. Homeland security meets the Gap.
Here’s what the visual cortex—those couple of pounds of meat at the back of your skull—can do with 87,000,000,000 dollars.
According to this interview (via Slashdot), Neal Stephenson admits to writing the first draft of his new 900-page novel Quicksilver longhand, using a fountain pen.
This revelation is followed by something rather more pedestrian:
Paper’s a really advanced technology. That was brought home to me by working on this, when I read a lot of documents from that era, which were put down on really good, acid-free paper. They’re all pretty much as good as they were the day they were made 300 or 350 years ago. This is not going to be true of today’s electronic media in 300 years. There’s a lesson there.
The tacit assumption in statements like this is always that printed documents somehow survive without preservation. That happens, of course: we all have our favorite message in a bottle story. But I bet most of the seventeenth-century documents Stephenson looked at were kept in big buildings called libraries, where they are sheltered from the wind and the rain, and kept out of the grubby, grasping hands of the general public.
That, my friends, is called preservation.
ENGL 467: Computer and Text
This course will explore what one recent critic has called cybertexts: works of literature, primarily but not exclusively digital, that are meant to be played, navigated, and manipulated in addition to “read” in the conventional sense. Choose Your Own Adventure books are examples of printed cybertexts with which you might be familiar, though as we will see they only scratch the surface—quite unimaginatively—of what is possible within the form. Specific topics will include: interactive fiction; chatterbots and intelligent agents; MUDs and MOOs; writing and/as code; hypertext, both stand-alone formats and networked on the World Wide Web; literary games and simulations; and emergent literature or “smart” texts. We will read/play/explore works from all of these genres and formats, and our discussions will focus on both identifying the cybertextual traits they have in common as well as discriminating each form’s unique achievements and significance. These discussions will be set within a broader consideration of textuality, including the question of what a text actually is—an old question which digital technologies now ask us to ask anew. You will leave the course with a sense of the literary and digital tradition of cybertext, hands-on experience of some of the most innovative literature being produced today, and (hopefully) some fundamentally new ways of thinking about texts and textuality.
Requirements: class participation, weekly responses, short papers, one longer paper or digital project, mid-term and final exams.
A note on expectations: there are no technical prerequisites for this course. You do not have to be—nor should you expect to become!—a computer professional. Students seeking only practical instruction in software, programming, or Web design would be best advised to look elsewhere. We will, however, be using a computer-equipped classroom for weekly exercises and experiments to build on our theoretical understandings and try our hands at producing some cybertexts ourselves.
. . . for letters of recommendation, that is. Yep, the requests have started to come in. I post the following not for the sake of my regular readers here but as an archival entry to which I can direct letter-seeking students. (On the other hand, if regular readers have comments or suggestions to add, please do.)
Writing letters is one of the best things a professor does (really—it’s a way to reconnect with past students and play a part in their future plans), but it can also quickly get out of control—not just the time spent crafting the letters themselves, but the correspondence back and forth to negotiate all the details and information.
So, if you’ve asked me for a letter . . .
Here’s a taste:
As part of the spring ritual of National Poetry Month, poets are symbolically dragged into the public square in order to be humiliated with the claim that their product has not achieved sufficient market penetration and must be revived by the Artificial Resuscitation Foundation (ARF) lest the art form collapse from its own incompetence, irrelevance, and as a result of the general disinterest among the broad masses of the American People.
The motto of ARF’s National Poetry Month is: “Poetry’s not so bad, really.”
CNN brings us the news: a team of British researchers has apparently discovered exactly why packaged cookies are prone to breaking and crumbling:
. . . as biscuits cool down after coming out of the oven, they pick up moisture around the rim which causes them to expand. At the same time, moisture at the center makes them contract. The difference results in a build-up of strain forces that can pull a cookie apart. Cracks appear that weaken cookies so they easily break apart when handled, moved or packaged.
I post this not just for the novelty of the subject, but because of my standing interests in instrumentation and visualization. The technique used to gather this data is called digital speckle pattern interferometry (the actual paper, in Measurement Science and Technology, is entitled “A novel application of speckle interferometry for the measurement of strain distributions in semi-sweet biscuits”).
The process apparently has something to do with using light waves to record very tiny measures of spatial displacement, with the end product being a digital image that expresses the change in graphical form. Of course such an image is “mediated,” but I’m profoundly interested in how to talk about this nexus of vision, instrumentation, and representation: is what we’re seeing still a “cookie” in any phenomenological sense?
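The underlying arithmetic, at least, is simple to sketch. In out-of-plane speckle interferometry, a change in optical phase corresponds to a surface displacement via d = λΔφ/(4π), so each full 2π fringe in the image marks half a wavelength of movement. The numbers below (a helium-neon laser wavelength and sample phase values) are my own illustrative assumptions, not figures from the biscuit study:

```python
import math

# Hypothetical sketch of converting an interferometric phase change to an
# out-of-plane displacement: d = wavelength * delta_phi / (4 * pi).
# The HeNe wavelength (632.8 nm) and phase values are illustrative
# assumptions, not data from the paper discussed above.
WAVELENGTH_NM = 632.8

def displacement_nm(delta_phi_rad: float) -> float:
    """Out-of-plane displacement implied by a phase change, in nanometers."""
    return WAVELENGTH_NM * delta_phi_rad / (4 * math.pi)

# One full 2*pi fringe corresponds to half a wavelength of displacement:
print(displacement_nm(2 * math.pi))   # 316.4 nm
```

The point for my purposes is that the "image" of the cookie is not a photograph at all but a rendered field of these computed displacements, which is exactly what makes its mediated status so interesting.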
I’ve just been made aware that a special issue of Computers and the Humanities on image-based humanities computing that I guest edited a year or two ago is available in its entirety online, as the journal’s free digital sample [select “Journal Contents” from the left-hand menu, and then the February 2002 issue]. Articles or contributions on images and imaging from Kevin Kiernan, Joseph Viscomi, Eric Lecolinet, Laurent Robert, François Role, Mary Keeler, Jerome McGann, Bethany Nowviskie, and myself.
I’m giving serious thought to turning off the comments here. It’s not just the comment spam, which is getting worse by the day—but I know there are solutions out there—it’s also the just plain ugly sh*t that comes in out of left field (I’m a rotten censor, I don’t know what I’m talking about, etc.). Yeah, I’ve been online since the early nineties, I have a pretty thick skin, but still . . . that’s not the reason I post here. Plus I think there’s a way in which open comments on every post alters the reception of the blog as a whole: the worth of an entry is implicitly measured by how many comments it garners. I do it myself: if I just have time to skim someone else’s blog for the day I’m much more likely to linger over an entry that’s been tagged with a dozen comments than one with a goose egg.
Trackback, meanwhile, provides a mechanism for others who want to take up the issues raised in a given post, but with a measure of individual accountability that open comments lack. Indeed, if MT didn’t support comments at all I suspect trackback would be used even more widely than it is now, resulting in a much more densely woven blogosphere.
Anyway, just something I’m thinking about.
Um . . . comments, anyone?