The Search for Respectability:
Cutting-Edge Anxieties in a Digital Age

By Jack Lynch,
Rutgers University -- Newark

Delivered 28 December 1999
at the MLA meeting in Chicago

I'm not certain I belong here, in a panel sponsored by the Committee on Computers and Emerging Technologies in Teaching and Research. That's because I'm not at all certain I work with computers and emerging technologies in my teaching or research. My shtick is eighteenth-century literary and intellectual history, where the only emerging technology is the steam-driven printing press. Most of my work on the Net is just taking advantage of a new medium to distribute traditional scholarship. I'm no expert on humanities computing.

I'm here, though, because a little knowledge goes a long way toward making one a de facto expert. My Web pages on literary resources, eighteenth-century studies, and grammar and style made me a part of the humanities computing avant garde while I wasn't looking. And while the term "avant garde" flatteringly implies being the first hero on the scene, it also has a more frightening undercurrent: the avant garde is the first to fall when the bullets start flying. I'd like to discuss how we might dodge some of those bullets. My little jeremiad is admittedly long on exhortation and short on specifics, but perhaps we're still at a stage where a good harangue is as needful as a detailed five-year plan.


The attention my Web pages and mailing lists have attracted is gratifying. My personal pages boast over ten thousand hits a day, and five thousand scholars see my name on calls for papers daily, which has made me uncommonly visible for a graduate student. A word to the job-seeking wise: when I went on the market two years ago, at least a dozen members of interviewing committees knew of my Web efforts. In a business where success comes largely from self-promotion, it doubtless helped me to secure a job. Judicious use of the new media can be a wonderful way to gain professional visibility.

But while the Internet can secure visibility, that's not the same thing as respectability. I'm now an assistant professor in an English department that's curious about my electronic work, but still unsure of what to make of it. They do know, however -- despite my protests that my intellectual work is mostly traditional -- that I'm the computer guy, the one who does the mysterious stuff with the gizmos and the gadgets. Today's talks are especially interesting to me, because the standards they propose will be applied to me very soon. My interest in evaluating electronic scholarship is therefore self-interest; I'm concerned first of all with securing my own future in the academy. And anxiety on that front may be warranted. Frankly, I took a short-cut to a scholarly reputation I haven't yet earned. I therefore spend much of my time trying to convince a dubious department that what I do on-line has some merit so that, when I come up for tenure in four years, I won't be dismissed as a one-trick pony. And the problem isn't simply mine: anyone who does research in and on the new media is confronted with questions about its value. For junior scholars, disproportionately represented in this new arena, the problems are exacerbated by tenure timetables and the need to show credentials or be sent packing.

Let me play, for a moment, the anti-technological devil's advocate. Why worry about setting standards at all? -- why not just let them emerge, as it were, "organically"? Most of us here agree that computers are, if not the future, at least a big part of the future; there's a sense of inevitability which few of us shared even five years ago. So one might make the case that we needn't worry about standards, because they'll arise naturally. Some might go even further and suggest that standards are a bad idea. There's something to that: we should be cautious about establishing standards in a rapidly changing field, lest our future colleagues (and our future selves) complain in 2004 about those naive idiots who in 1999 carved such short-sighted guidelines in stone.

Why, then, bully our Luddite colleagues now, when they're bound to come around eventually? There are several reasons. First, clocks are already ticking, not least my own: at my back I always hear the tenure committee's wing'd chariot hurrying near. We haven't the leisure to wait for a generation to pass. Second, and more important, the standards we set now will help to determine the direction the profession takes in the future. If the fact of electronic scholarship is inevitable, the specific direction it takes most certainly is not. And as I see it, the new directions in research and teaching can come from three places. The first is Silicon Valley (and, of course, its outpost in Redmond): technology itself might shape our scholarly activity and determine its value. We can wait for the programmers to come up with new tools (and new toys), and shape our research and pedagogy accordingly. The second possible source of change is Wall Street: we can let our work be directed by Mammon. Administrators are understandably keen on this. When we think about high-tech pedagogy, we think about release time for curriculum development; when they think about high-tech pedagogy, they think about reaching more paying students with fewer paid faculty. You can almost see the dollar signs in their eyes. In my more cynical and dystopian moods, I worry that we cyberjockeys may be building more Western Governors Universities and Universities of Phoenix -- perhaps some future Bill Gates U. -- where all the administration's talk is of the educational "market."1 And administrators are as susceptible as anyone to dreams of scrillion-dollar dot-com IPOs. Who can blame them for thinking that a well-marketed academic CD-ROM might pay for the new football stadium?

I'm not so naive as to think we can, or even should, resist every change that comes from Silicon Valley or from Wall Street. Few of us can do without Palo Alto's gizmos, and if we can save our universities a few bucks, why not be a good sport? But I'd like to see most of the changes come from a third place: the campus. We teachers and researchers have not only the opportunity but the obligation to put our hand on the tiller, because if we just float down the current, we'll probably go in a direction we resent. That means educators must take responsibility for wresting control of educational criteria from the programmers and the bean-counters. Technological inevitability, far from authorizing us to put our feet up, is the reason we need to address the problems of evaluation with such urgency.

Any convincing solution to these problems will have at least two components. The first will perhaps sound like heresy among those of us who know the secret E-scholarship club handshake, but I think we need to cultivate a healthy distrust of our own work. Anyone who wants to take scholarship in a new direction should never stop asking whether it's the right direction. The digital revolution isn't necessarily a good thing. It's downright foolish, for instance, to assume that democratizing publishing will produce a thousand Clarendon Presses; more likely it will produce a thousand vanity presses. Scholarly institutions are wise to be circumspect about schemers who might take their rejected essays, click on "Save As HTML," and declare them so many publications. We all agree, I hope, that no one should be tenured on the strength of a few vanity publications. We therefore have to resist the self-righteous urge to dismiss our critics as know-nothing Luddites: they can instill in us a healthy self-doubt. Let's avoid the tempting enlightened-us-versus-benighted-them approach to electronic scholarship: let's never stop taking our critics seriously, even the most reactionary.

The second part of the solution is the greater challenge: to change institutional attitudes. That's never easy: any au courant academic touting the latest theory or method faces similar problems in seeking legitimacy. The establishment is often reactionary in its adherence to familiar forms, and even the young Turks eventually become the old guard, eager to protect their turf from the rising generation. Our story is an old one.

But it's also a new one, because wired scholarship poses special problems. It brings with it changes not only in theories and methods, but also in medium, and maybe the academy isn't ready for that -- it goes straight to the heart of how universities do their business. Senior faculty members may be dubious about the cultural materialist and the queer theorist coming up for tenure, but when his articles come out in PMLA and her book bears a Harvard imprint, institutional inertia is at least nudged by a colophon that inspires awe. Will an "http," however distinguished the attached ".edu," have a similar effect?

The academy may not be willing to rethink the new media without prodding, and those of us who hope to do more than write about digital issues in traditional print publications have a difficult case to make. Tenure committees are notorious for their reliance on shortcuts. Who can blame them? The chemists and the physicists who'll pass judgment on my work are no more able to evaluate the complexities of my research than I am to evaluate theirs. It's much easier to look for reputable forums for refereed publication than to engage with the research itself. In the humanities, that often means that a book from one of only about a dozen major university presses will pass muster. When can we expect any form of electronic publication to be admitted to that exclusive fraternity? Another favorite habit of tenure committees is simply counting publications: how will they adapt when the very idea of publication changes, and one scholar boasts of seven hundred published lexia, while another spends ten years on a project that still can't be called "finished"?

And so, after answering some of these questions to our own satisfaction, we need to prosecute the case convincingly in the rest of the university. The change will be helped by already-reputable scholarly societies adding an electronic component to their print publications -- recognizable names thereby lend an imprimatur to all electronic publications and help to make the digital media palatable. Certainly not even the grumpiest traditionalist will balk at publication in a Hopkins journal just because it also shows up in Project Muse. Other projects, though, will start from scratch in cyberspace, never having passed through a print stage, and these will naturally face greater obstacles in their search for recognition. It will sometimes be in their interest to mimic traditional publications, at least in the usual paraphernalia of scholarly respectability -- high-profile editorial boards, referees, blind submissions, and so on. But we also have to make a convincing case for the value of the new media themselves, treating them as more than print journals displayed on a screen. Project Muse helps in one way, but may hinder us in another, by suggesting that the digital revolution is nothing more than a neutral change in medium. The patron saint of humanities computing, Marshall McLuhan, has taught us there is no "neutral" change in medium. That's the word we have to get out. We need to demonstrate that what we're building is more than a belletristic Nintendo game, that the Web is more than the world's largest repository of time-wasters. Until we convince our least wired colleague that we're about more than high-tech gimmicks, all the criteria we turn out will be for nought.

This sea-change will inevitably be brought about with the same old blend of confrontation and conciliation by which universities always evolve. But it's in our own interest to strategize about how to effect the change, to make an intelligent argument that our work and the form in which it is presented deserve the attention of the academy. If we want to stay out of Bill Gates U. -- eight million students and thirteen full-time faculty -- our future depends on it.


Note

1. A depressing excursus: whereas Arizona State spends an average of $247 per credit hour taught, the University of Phoenix -- by relying on poorly paid part-time lecturers reaching thousands of students through the very technology we champion -- gets by on just $46 a credit hour. See Lisa Gubernick and Ashlea Ebeling, "I Got My Degree through E-mail," Forbes (19 June 1997).