DHQ: Digital Humanities Quarterly
2008
Volume 2 Number 1

Something Called “Digital Humanities”

Wendell Piez, Mulberry Technologies, Inc.

Abstract

Are the Digital Humanities only a concession to fashion, and as such a sign of decline? A professor of English, off the cuff and online, suggests as much, but there is reason to wonder.

Sometimes I find myself procrastinating by surfing the web. I like to think this is worthwhile. It’s a way of clearing mental space and collecting energy for work to be done, while looking out and enlarging my world. Reading short articles and essays works best. It needs to be something I mainly agree with, but not entirely. There should at least be some insight or perspective that complicates and challenges my own to a degree, which I can integrate with some reward and satisfaction. Finding it unnerving to have my prejudices simply confirmed, at such moments I want to change my mind to whatever extent it can be changed without being troubled, which is to say, slightly.
For these purposes, the online edition of The Nation is just the thing. And so one day I found myself reading a lament about the present and future of academic English departments. William Deresiewicz’s Professing Literature in 2008 is ostensibly a review of a new reissue of Gerald Graff’s Professing Literature. As such, I found it to be trenchant enough, if not about its subject (which it merely glances at, and I have never read) then about its world (which preoccupies it, and with which, at one time, I was very familiar). It stood out in my mind for two things. There was a tidy analogy: by the author’s account, the curricula now offered in English departments are fragmented by fashion and identity politics to such an extent as to reflect nothing so much as efforts, ingenuous or not, to win students by flattery. “If grade schools behaved like this, every subject would be recess, and lunch would consist of chocolate cake.” And by way of depicting the symptoms of the problem, there is an interesting summary of how this fragmentation manifests itself in the want ads posted by departments in the annual MLA Job Information List. “Contemporary lit, global lit, ethnic American lit; creative writing, film, ecocriticism — whatever. There are postings here for positions in science fiction, in fantasy literature, in children’s literature, even in something called ‘digital humanities.’ ”
This got my attention. Fair enough, I thought, I can take it. Unless the analysis is completely wrong and wrong-headed, I can allow that “Digital Humanities” (the name “Humanities Computing” no longer serves in an era when the computer club has become an in-group: the rule of identity politics seems to be that for one’s old identity to become fashionable, you need a new name for it) could well be seen by an outsider as yet another canapé served up at the humanistic buffet. Yes, as long as things are being compared to food, Digital Humanities may indeed be the smoked salmon, sour cream, capers and dill of the English Department smorgasbord. I imagine one might well feel jaded by this, especially when it is served from a table at which one has already had more than one’s fill in any case. So I found myself sympathetic. And thus Prof Deresiewicz had met the need: he had made me think, but not too hard, and he’d put me in my place about this thing called Digital Humanities. Procrastination had been accomplished, I had been stimulated, and I could turn my attention back to the task at hand.
But as I discovered a couple of days later, something still rankled. And it wasn’t the casual put-down of my own discipline, such as it is. I’m fortunate enough to be making a happy living at it outside the academy, so what it is called and whether it is academically respectable are secondary concerns to me. Rather, it was the dismissal of all these fragmentary disciplines and sub-disciplines, not only the computer club. I had the sense that however amused I was by his polemic, my new friend Deresiewicz had missed the point. He characterizes this motley chaos as entropy on the way to death. (We are invited, in an essay that disparages those who would pay attention to their wishes and tastes, to make much of the fact that the number of students opting for an English major is declining.) But from a greater distance, I see something more like a field where native plants and wildflowers are overtaking a tidy lawn. I prefer to imagine that the efforts of departments to broaden their offerings may reflect more than just a rear-guard attempt to market to naïve and self-interested undergraduates, but rather more nobly, an effort to cultivate the most committed and imaginative of younger faculty, while also redressing some very old imbalances, and thus to lead, however clumsily, this most esoteric, inexplicable and vital discipline back to relevance and connection. For all the incomprehensible mix of offerings in media consciousness and post-colonial sensibility and historiographical critique and cultural studiousness, it seems to me they all share a kind of genetic code. Maybe all the work of close reading, of abstracting away from context to study the form itself, and then bringing the all-important context of the reading back to the reading, is actually beginning to work itself into seeing the world at large. 
If so, this reasoning goes, it isn’t simply an arid intellectual exercise: maybe the old-fashioned humanists were right when they claimed that these methods and habits of mind, the practiced powers of an encouraging, engaged but critical and self-critical sensibility, are the only way we have to loosen the truly hard, knotted problems, the ones that are complicated by interests and wet with sticky emotion and identification and self-concern. Keep the faith, I wanted to answer: a time when the humanities seem not only to be forgotten, but to be forgetting themselves, is exactly the time when all of us, from bleeding-heart animal liberationists, to neo-Marxists offended by concentrations of capital, to addled Queer Theorists, to cool and collected, technology-competent Digital Humanists with our grand visions and enthusiasms for acronyms, should be enrolling in one another’s courses — or at the very least, reading each other’s blogs sympathetically while procrastinating — with the deliberate purpose of reminding ourselves what we have in common.
So I returned to the topic, at least in mind. But I had lost the link. It hadn’t been important enough to bookmark, and who could remember where it was from? There was this essay to which I wanted to respond, maybe even write about, somewhere out on the Internet, but nowhere in sight. Fortunately, we have search engines to help with this sort of thing, and in only a few seconds spent between things I should have been doing instead, I had not only the original article, but much more.
So let me pass to a couple of the more incisive responses I came upon (at Margaret Soltan’s blog, as returned from a search for “something called digital humanities” in March of 2008):

Back in grad school in history (one of the Ivies), a friend made what struck me as a cogent observation. At this high-end university, where students could be sure of a comfortable income regardless of undergrad concentration, history was the most popular major, far ahead of English. His explanation was that students who wanted a real sense of the development and dynamics of their own civilization could no longer get that from the English department. It was still possible in history.  (Dave Stone)

This struck home because it was one of the reasons I turned to study English, enrolling in a PhD program after I’d finished an undergraduate degree in Classics (Ancient Greek) at one of these same high-end universities. I don’t think the explanation captures the whole of it (more on this below), but it does resonate. Certainly, in my own case, the attraction of the humanities was in their promise of some such sense of something called “Western civilization”. Nor was the promise empty. Yet the main difference between my own case, and that of my classmates who studied History and then went on to master law or finance, is also telling. Ten years was hardly enough to teach me that four years of anything so substantial is nothing more than the barest beginning of it.
Now this does not at all either contradict the thesis, or mitigate the concern expressed in Deresiewicz’s review. But it does get closer to the heart of it. Given that four years is all that most students will or can choose to take, and that they are not just teenagers but also free adults, what is the best a university can offer them, and especially those who are lucky or bold enough to approach their schooling with a long view? Maybe, something that touches them, something they find meaningful, and that has enough of the real stuff in it that they can take a taste with them for intellectual engagement, for the satisfaction and usefulness of recognizing the macrocosm in the microcosm. An understanding of why you look close up, and also why you step back and look from a distance. And perhaps most importantly, a willingness to have casual assumptions broken down before they are built up again, with care and deliberation. As an exile from the academy, I can vouch for how inestimably valuable such a cast of mind turns out to be even in less high-brow (though no less cerebral) pursuits than academic scholarship.
Or, as another of my virtual interlocutors has it,

Let’s take “the digital humanities”. In even the most traditional conception of an English Department, the development of print literature in successive forms was an absolutely core subject. That’s what you studied if you studied Beowulf or Chaucer. It’s what you studied if you studied Shakespeare. It’s what you studied if you studied Richardson and Fielding. It’s what you studied if you studied Dickens. It’s what you studied if you studied Joyce. You read closely, did the close work of interpretation, but you also looked at the history of the book, of publication, of annotation, of circulation. This is not a fancy new trendy concern. How could you read Beowulf in an English course and not ask about the connection between oral literature and writing? Shakespeare and the connection between Elizabethan theater and writing? Fielding and the development of the novel as a popular form? Dickens and serialization? (Timothy Burke)

This takes us much further, quite close to the essence of it. By implication, in Burke’s telling, the proper object of Digital Humanities is what one might call “media consciousness” in a digital age, a particular kind of critical attitude analogous to, and indeed continuous with, a more general media consciousness as applied to cultural production in any nation or period. Such an awareness will begin in a study of linguistic and rhetorical forms, but it does not stop there. Yet even this is only half of it. Inasmuch as critique may imply refiguration and reinvention, Digital Humanities has also a reciprocal and complementary project. Not only do we study digital media and the cultures and cultural impacts of digital media; we are also concerned with designing and making them. In this respect (and notwithstanding how many of its initiatives may prove short-lived), Digital Humanities resembles nothing so much as the humanistic movement that instigated the European Renaissance, which was concerned not only with the revival of Classical scholarship in its time but also with the development and application of high technology to learning and its dissemination. Scholar-technologists such as Nicolas Jenson and Aldus Manutius designed typefaces and scholarly apparatus, founded publishing houses and invented the modern critical edition. In doing so they pioneered the forms of knowledge that academics work within to this day, despite the repeatedly promised revolutions of audio recording, radio, cinema and television. Only now are these foundations being examined again, as digital media begin to offer something like the same intimacy and connection that paper, ink and print media have offered between the peculiar and individual scholar, our subjects of study, and the wider community — an intimacy and connection (this cannot be overstressed) founded in the individual scholar’s role as a creator and producer of media, not just a consumer.
And yet, when we look at their substance, how digital media are encoded (being symbolic constructs arranged to work within algorithmic, machine-mediated processes that are themselves a form of cultural production) and how they encode culture in words, colors, sounds, images, and instrumentation, it is also evident that far from having no more need for literacy, they demand it, fulfill it, extend and raise it to ever higher levels.
But this will be challenging, even upsetting. And so, a view that sees the proliferation of curricula in academic English departments as a catering to the marketplace, or even a sign of decline, is understandable if those who hold such a view are looking out for their own specialties or interests. Within the academic setting, the zero-sum competition over scarce resources like faculty lines, fellowships, students, publication opportunities, awards and recognition must tend to suggest that any broadening of attention must mean a diffusion; thus, the old order, which had thought things were settled well enough, finds itself embattled. Yet these subdisciplines, both individually and collectively, can now be flourishing only if they can all draw strength from broader and deeper bases than before. There are more publication venues, more channels for access, more ways of reaching audiences and hearing from them, more opportunities for engagement, dialogue and learning than are offered even in the classroom, lecture or dining hall (medieval institutions) or by monograph or journal article (since the age of print) or conference paper (since the age of mobility). Many of these opportunities, especially opportunities for the creation and maintenance of geographically dispersed communities of interest, are created by networked digital media: mailing lists, blogs, wikis, online publishing projects, digests, courseware, shareware, groupware. But that is not the primary point here. Rather, it is that this great variety could never arise from a zero-sum game, but must reflect positive-sum outcomes among and between participants in communities of knowledge that reach far beyond individual departments. And further, this wider economy of knowledge, curiosity and concern offers fantastic opportunities for more such outcomes, if only supposed rivals or antagonists can find common cause in mutual interest. No, faculty lines are not created out of nothing. 
Yet neither has intellectual wealth ever been created simply by the blind operation of faculty doing their jobs. It is generated in the combustion of passion and community under the compressive force of discipline. And a department that finds a way, while remaining a community, to include in its offerings a range that reflects the breadth as well as depth of interests of students, fellows and faculty, may discover that all its engagements are strengthened. Within this context, “something called digital humanities” — something so obscure and esoteric that it is almost beneath an English professor’s notice — may be more than just another subspecialty (although it is that, for the same reason that not all English professors will be experts in print technology): it also works directly at the ground where this new, larger, more elaborate, more entangled and variegated culture is rooted.
Which brings me back to the intuition that all these avenues, whether area or period studies, genre studies (including science fiction, fantasy, children’s literature, romance, the graphic novel or what have you), concentrations of concern such as eco-criticism, political or socio-historical or epistemological critiques, or simply diversions as obscure as they are compelling, as when students and faculty become caught up in the creative possibilities of the new media, nevertheless share something vital, which belies the notion that they must induce an irremediable fragmentation. Indeed, when they are pursued conscientiously, I think what they share is what the humanities have always shared. That may not, it is true, always be what the critics warn the new specialties will deny us, a “shared culture”; but then it is something better. The study of the humanities has always offered two things. Or rather, it has offered one thing and then, like a con artist or a fairy-tale trickster, been ready to switch it under your nose for something you didn’t think you wanted, but which turns out to be far more valuable than what you had thought you had put down money for. When you signed up, you had thought you were to be awarded a validating and affirming narrative, some account of origins that would banish your doubts and prove your boundless worth. (Maybe it’s for this that the ambitious but complacent scions of the well-to-do turn to major in History, if the English Department no longer offers them a satisfying foundation myth.) But if you are lucky, you are initiated instead into a world view that is not only critical, but tolerant of criticism and therefore capable of vitality, creativity and growth. The self-knowledge you are offered does not raise you and your tribe above humanity, but implicates you in it. Seen in this light, “Digital Humanities” does not need to be a catchphrase or a cause, unless the cause is the humanities themselves in a digital age.