I first got interested in computers back in the late seventies, when I was eleven or twelve -- you know, the prehistoric days when you had to shovel coal in the back of computers the size of a Buick. The hobby persisted through my college years from '85 to '89, even though I majored in English. My computer experience, of course, went a long way toward getting a real-world job for a poor English major, so I pretended I knew more than I really did, and worked for a big insurance company for a few years after graduation. But I got tired of Lotus and dBASE, and decided to follow my heart to English literature and smaller paychecks. In 1993, when I left the corporate world to go back to school for a Ph.D. in English, I thought that trading the title "Senior Systems Specialist" for "underpaid graduate student" would mean I'd bid the computers goodbye.
I didn't. I was surprised to see just how important computers had become, even in what seems so un-technological a place as an English department. I now use computers even more often, and certainly more innovatively, than I ever did when I was a corporate programmer. I'm now a nearly geriatric twenty-eight years old, so more than half my life has been spent hunched over keyboards and squinting into monitors. I'd like to give you some tips on just how revolutionary the Information Revolution can be.
What, then, does a computer have to offer someone in an English department? Neither as much as the computer enthusiasts would have you believe, I say, nor as little as the grumpy old cyberphobes insist. The skeptics, convinced the computer has no place outside the math and science departments, are right to be skeptical, just as the enthusiasts are right to be enthusiastic: it's a long way from "Super Mario Brothers" to the Great American Novel. We take it for granted that the computer is somehow "naturally" mathematical. Computers, after all, think in ones and zeroes, right? -- they string these ones and zeroes together in mysterious patterns to make bigger numbers, and numbers have nothing to do with Huck Finn or Pride and Prejudice.
That's a good explanation, and its only drawback is that it's completely wrong. Computers don't use ones and zeroes: they use patterns of ons and offs. A computer is just a bunch of switches connected together: in fact the simplest computer is a single light switch, which has an input device (the switch), a central processing unit (the wires in the back), and an output device (the light bulb). It makes a simple decision based on the status of the input device. It's a little more complicated when two switches control one bulb -- both up means on, or whatever. Now connect seven million light switches and shrink them down to the size of a cornflake, and you have the modern computer.
The reason we pretend these ons and offs stand for numbers is a historical accident: the first computers happened to be built to solve math problems (missile trajectories in the Second World War). Besides, the people who built them were engineers, and they brought their math along with them. So we can call the ons one and offs zero, but we can just as easily say the ons are up and the offs down, or black and white, or forks and spoons, or Beavis and Butthead -- whatever tickles your fancy.
Put them together into patterns, and the meanings are just as arbitrary. Just as all the literature in the language was put together from combinations of twenty-six letters, we can represent anything we like with combinations of ons and offs. On-off-off-on-off-on-on can be the number seventy-five, but it can just as easily be the letter J, or a shade of reddish-purple, or the Queen of Clubs, or the sound of a bass guitar playing a G-sharp. A computer, that is, can represent anything that we can reduce to a symbol.
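For the programmers in the audience, the point can be sketched in a few lines of Python (a modern scripting language). The pattern is the one from the paragraph above; note that when we happen to read it under the ASCII convention it comes out as the letter K -- a different code could just as well call it J, or the Queen of Clubs, which is exactly the arbitrariness at issue.

```python
# The same pattern of ons and offs means whatever we agree it means.
# Here "on" is 1 and "off" is 0 -- itself purely a convention.
pattern = "1001011"          # on-off-off-on-off-on-on

as_number = int(pattern, 2)  # read the pattern as a base-two number
print(as_number)             # 75

# Read the very same pattern as a letter, under the ASCII convention.
as_letter = chr(as_number)
print(as_letter)             # K -- in ASCII; another code could call it J
```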
I know, I know: you'd have to be crazy to read Moby-Dick on a screen. Even the best screens are awful compared to typesetting machines; a bad typesetter is, if I haven't lost all my math ability while studying English, about 14,400% clearer. That's not the idea. You can, however, print out whatever part of Moby-Dick you want to read, or the Allen Ginsberg poem you're discussing in class tomorrow. (It's no crime to print things out, by the way, and Lord knows I'm not delivering a funeral oration for the printed book.) You can also do things nearly impossible on paper, such as searching books, a godsend when it comes time to write papers about passages you remember hazily but can't find. I'll race anyone here in finding the chapter of Moby-Dick where Steelkilt first appears; you thumb through a paper copy, and I'll type "Steelkilt" in the "Find" box. (It's chapter fifty-four, by the way.) And let's count the number of times Shakespeare uses "thou" versus "you" in his complete works: I'll give you a sixty-day head start.
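The "Find" box and the Shakespeare tally above both come down to the same trick, and it takes only a few lines of Python to show it. The snippet of text here is a made-up stand-in; with a full electronic Moby-Dick or Shakespeare, you'd read the file in instead.

```python
import re

# A stand-in for an electronic text; a real search would load a whole book.
text = """Thou art a sailor, and you know the sea.
Thou hast seen Steelkilt; you remember him."""

def count_word(word, text):
    """Count whole-word occurrences, ignoring capitalization."""
    return len(re.findall(r"\b" + re.escape(word) + r"\b", text, re.IGNORECASE))

print(count_word("thou", text))   # 2
print(count_word("you", text))    # 2
```

The same `findall` call that counts "thou" against "you" will also tell you, in a fraction of a second, every place "Steelkilt" turns up -- which is all the "Find" box is doing.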
The Internet, which seems to be in every third news story these days, is where the action is. It's pretty easy to understand: it's just a zillion stand-alone computers connected together with special cables, over which you can send information from any machine on the Net to any other, almost instantly. A message from Philadelphia will reach Tel Aviv in maybe fifteen seconds. E-mail is an obvious use of the Net, but don't underestimate just how momentous a new communications medium can be: at Penn, for instance, we have E-mail discussion groups for every course in the department, where students and instructors can talk with one another as equals outside class, and there are many thousands of international discussion groups in which no-name schmoes like me can ask questions of world-renowned authorities. And though text files are the most common kind of information on the Net, computers can contain anything that can be reduced to symbols: pictures, sound, even movies.
As the Internet grew, more and more people created more and more information, but it was scattered to the winds. Because the Internet has no center, no one in control, there's no index or guide to finding the good stuff. So a group of particle physicists in Switzerland developed the next big nineties buzzword, the World Wide Web -- which, when you get down to it, isn't much more than a naming convention. It provides a simple and consistent way of saying where this information is on the Internet, and tells your machine how to fetch it. Whereas just a few years ago it took real expertise to sniff out what you wanted and even more to figure out how to get it, the Web wisely lets the computer take care of the intimidating details, and arranges the information in a more manageable shape for human beings. Now you read things with what's called a Web browser, and by simply pointing and clicking you can find related information without worrying about the tricky computer details.
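That "naming convention" is concrete enough to take apart. A Web address bundles together how to fetch the information, which machine on the Internet holds it, and where it sits on that machine; Python's standard library will split one up for you. The address below is invented for illustration.

```python
from urllib.parse import urlparse

# A made-up Web address, split into the parts the browser actually uses.
url = "http://www.example.edu/lit/moby-dick/chapter54.html"
parts = urlparse(url)

print(parts.scheme)   # http  -- how to fetch it
print(parts.netloc)   # www.example.edu  -- which machine on the Internet
print(parts.path)     # /lit/moby-dick/chapter54.html  -- where on that machine
```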
That's the technical side. In human terms, though, much more is going on. Anyone with access to an Internet computer -- and recent estimates are around thirty million people in America -- can read, see, or hear any of this information, wherever it is in the world, with just a few mouse clicks. What's even more important -- indeed, radical -- is that almost anyone can make information available on the Web just by putting it on one of these machines. And "make information available" is just a long way of saying "publish." "The freedom of the press," said a wise-cracker, "extends only to those who own one," and that's true; but with the Web, anyone with twelve bucks a month can be a publisher with a larger potential audience than Gutenberg ever imagined.
The result is an explosion of information in every field you can name. I provide a number of collections of information on-line: an index to literary material on the Web; another on the eighteenth century; a thirty-page guide to grammar and style for my students; the syllabus for my course, along with many of the books I have my students read, and so on. Anyone in the world can read them, and, as it turns out, around three thousand people do every day. Other people provide big archives of Modernist paintings, medieval manuscripts, books that have been out of print for centuries, Italian lessons, recordings of Renaissance madrigals, you name it. Some of it's put together by enthusiastic amateurs, while some is serious and scholarly -- such as the huge edition of Mary Shelley's Frankenstein I'm working on, which takes the two-hundred-page novel and embeds it in over ten thousand pages of electronic footnotes, including other complete works, paintings, and movie clips. Penn's Van Pelt Library, with its four million volumes, is in no danger of becoming obsolete any time soon, but it will certainly have to think through the implications of all this new information flooding the market.
Much of what's on the Web, to be sure, is awful. Then again, most of what you'll find in a bookshop is awful too. The only difference is that we've learned where to look for the good stuff. The printed book is approaching its five hundred fiftieth birthday, so we've learned, pace the old saying, to judge books by their covers. We know that a cover sporting a shirtless Fabio holding a lusty, busty, pouting nymphette promises a certain kind of reading experience, and a sober blue cloth volume stamped with the insignia of Oxford University Press promises another. I have friends who publish both, so I'm not knocking either one: the publishing world is big enough for them and for countless others. But institutional structures, like publishers' colophons and The New York Review of Books, have taught us how to distinguish information we want from information we don't.
The Web can be even bigger, bigger by orders of magnitude, and there's room for great heaps of stuff, good and bad. What hasn't emerged yet is the same sort of institutional structures: anyone can "publish" anything, no matter how stupid, and by gum, anyone usually does. Nastiness, too, thrives almost as well in the new medium as stupidity, and even though the news reports are exaggerated and alarmist, it's true that the Web is crawling with pornography, tips for terrorists, dishonest advice on medical problems, scams that rip off the defenseless, more copyright violations than you can imagine, and other unpleasantness.
But I've read enough history to be convinced that technology will shake up everything we do, in and out of school. Not necessarily because computers will make anything better, but simply because technology always takes over everything: it's just what technology does. We can rush out to meet it or we can cower in a corner, but it's coming. It's coming to our classrooms, and not just the computer or math or science classrooms. It might even be transforming what the classroom is: one Penn professor held a Latin class that met only in cyberspace, never in person, with people enrolled from across the country and across the world.
As I see it, these changes can come from three groups of people. One group is in the engineering schools, the pocket-protector crowd that builds the machines and writes the programs. The second is in the big corporations, the blue-suited men (they're almost all men) with seven-figure salaries. (Bill Gates straddles the line, the king of the nerds with a nine-figure salary.) The third group is us -- the rest of us. I make a living teaching English literature to college students, and if anyone's going to tamper with how I do my job, I want it to be me. I want to be the one who has some control over the form information takes in the next century, and that's what drives me, skeptic that I am, to produce and sort through all this information on the Web.
I've been giving you examples from an English department, but that's just because it's what I know best. There's not a topic in the world that can't benefit from an innovative thinker with a new way of manipulating symbols. You've all doubtless heard that "computer skills" are the key to getting a good job, and that's true. But it means more than knowing how to use WordPerfect or a spreadsheet. The real revolutionaries are the ones who'll learn to take advantage of the new medium. One computer visionary, Mitch Kapor, points out, "The best uses for personal computers haven't been invented yet. To disagree with that is simply a failure of imagination." Believe him.
For those who've waved their fists and cursed at Unrecoverable Application Errors, for those who've stared in disbelief at the machine's smug belch as it swallowed twenty-three of your best pages, I'm with you -- I feel your pain, brothers and sisters. "The Good News," writes Ted Nelson: "With your personal computer, anything is possible. The Bad News: Nothing is easy." They're everywhere, but if we're honest, we have to admit computers have made almost no one's life easier. But the desktop computer isn't much older than most of you in this room, and things are bound to change. I like to think computers are at roughly the same stage of development as automobiles in, say, 1926 -- just starting to become mass-market items, but still more trouble than a horse. Being a pioneer might be exciting, but it's just as often a pain in the ass.
That's where there's room for you, for anyone willing to wrestle with the new technology and think of new ways to stretch our minds. Computers, after all, are nothing more than tools, and they help with mental tasks the same way hammers and pulleys help with physical ones. But whereas a hammer has (more or less) one use, whacking things, the computer is infinitely flexible: it can do anything we're smart enough to ask it to do. The reason it's still odd to think of computers and literature together is that previous generations have suffered from the failure of imagination Kapor talked about. Keep on imagining, and you'll find that a symbol-manipulating tool will take you to places my generation never thought of.