Working Digitally

Contributed by Willard McCarty, Professor of Humanities Computing at King's College London
January 26, 2011

It is about a search for daily meaning as well as daily bread, for recognition as well as cash, for astonishment rather than torpor; in short, for a sort of life rather than a Monday through Friday sort of dying.

Studs Terkel, Working (1972: xiii)

For Sinéad O'Sullivan

In Working: People Talk About What They Do All Day and How They Feel About What They Do, the oral historian Studs Terkel writes movingly of job-related violence "to the spirit as well as to the body" of the ordinary American working men and women whom he interviewed. Their stories derive power not just from that violence but more from the humanity revealed in their aspirations to a better life. Reading these stories positively, it is difficult not to hear Raymond Williams' declaration that "culture is ordinary" and to bring back, as it keeps coming back, the passionate and articulate concern he shared with Richard Hoggart for improving the lives of workers by improving how those lives are understood. Without much difficulty one can recognise not just the evidence of blows but also intimations that a life worth living just might be possible. In his essay "Expanding Eyes" (1975), Northrop Frye observes that only a hell with hope is a real hell. Hence the damage to the spirit of which Terkel speaks. But hope also empowers endurance and morally compels our attention — not from on high (where we are not) but from the common ground we share.

Decades of persuasion and strong market-forces have made Williams' and Hoggart's arguments so much a part of our intellectual furniture that we are apt to be forgetful of them. Williams' essay "Culture is ordinary" (1958), his Culture and Society (1958) and Hoggart's The Uses of Literacy: Aspects of Working Class Life (1957) are of their time. Times have changed. The study of culture, altered forever by what they wrote, has moved on to produce new arguments for new situations. Yet, as I said, these works keep coming back despite disciplinary evolution — for example, Hoggart's book in a new edition introduced by a 21st-century champion, Lynsey Hanley (2009).

Williams wrote that "there are… no masses, but only ways of seeing people as masses" (2001/1958: 18). The way of seeing I want to adopt for my purposes here is the one that masses us together with Terkel's workers in respect of their frustrated hope for "a sort of life rather than… a sort of dying". I cannot see this hope as confined to the mid 20th century; it is just as keenly felt and keenly disappointed now. True, we are incomparably privileged. But does this cheapen our human solidarity with Terkel's workers? We at least share, I hope, their longing for something better, especially when, bruised by one or another of the academy's muscular gatekeepers or exiled in spirit from its current polity, we pace outside the walls or turn to wonder what might be done in the open fields beyond. Our lives are, it's true, easy in comparison, but ease isn't the point, is it?

There's a strong autobiographical element lurking here that I had better make explicit. It is meant only to illustrate that which is common enough. It will surface here and there in what follows.

I received my doctorate at the University of Toronto in 1984, when as now tenure-track jobs were scarce and Milton's Paradise Lost, the subject of my dissertation, no longer the sexiest of topics. Despite the most strenuous efforts of kind and influential people, I failed to find an academic appointment, and so took up a para-academic position, which I held until 1996. One mantra that served me well during that time, as I paced about outside the walls and came repeatedly up against their abrasive solidity, was: "It's the work that matters". By this I did not mean the paid employment I then, as we say, enjoyed. Rather I meant that work for which I stole as much time from my employer as I could. Then, unexpectedly, I was snatched from my outcast state for an academic position in London. Nevertheless, that same mantra has remained central, especially in the last few years of steep moral decline in British academia, as its plot-lost rulers turn it ever more into a parody of itself. So, even from the inside (as my doctoral supervisor used to say), it makes sense to think about those open fields and what might be done there.

Others here and elsewhere know far more about such extramural possibilities and limitations, so I will leave that discussion to them.

I started this essay with Terkel to help bring attention not merely to the question of shared humanity, but more specifically to how it emerges in the implications of our collective work — on the great project that we humanities scholars are a part of. We are in fact elbow-deep in working out what computing means for the disciplines of the literae humaniores, the learning that pertains to human beings and so is, or should be, humane. In doing so, we are comrades in arms in a crowded assault on inherited ideas of the human, along with artificial intelligence, neurobiology et al. Several candidates — and, if we get what we do right, we are not the least among them — stand ready to join the list of corrosive ideas that Sigmund Freud made when he added psychoanalysis to the lineage of Copernican cosmology and Darwinian evolution (1920: 246-7). This may seem over-the-top to assert, but as some early practitioners of the digital humanities clearly saw, machines that continually get smarter, with no end to their improvement in sight, bring ever more into question what's left for us to do, what we are for.

"Culture is ordinary: that is where we must start" (Williams 2001/1958: 11). I would like to think that, sometime in the future, we could return to that starting-point with language clear enough to enable a conversation that would add our voices at their most radical to those of the thoughtful common people Williams knew. I would like to think that a life worth living is also our aim and that our lives would make sense to them.

My assignment here is otherwise, however. It is to discuss the institutional re-figuration parallel with, if not a product of, the re-figuration of the humanities in which we participate. What is happening in our working lives? What could?

Over the last quarter-century the institutional relation between computing and older disciplines has appeared to change considerably for the better. An involvement that, in the mid 1980s, would have stained a scholar's reputation if not poisoned his or her career (as seemed to happen to me) can now be advertised to good effect. Indeed, without digital abilities — if not accomplishments, or at least research interests that lend themselves to digital methods — a young scholar's chance for employment in many if not all fields is likely to be diminished. Within the last two decades, senior academics have enhanced or even made reputations on the basis of such interests (though not all of them have done their homework or even realised that homework is required of them and not just of those they hire, in this as in other intellectually demanding subjects). But even if we grant that we are in a transitional period and subtract the non-scholarly motivations handed down by government and received here below with all too little resistance, all is not as well as it could be — for individuals or for scholarship. Better historical awareness of what has been happening and increased critical attention to it are needed. We must ask: toward what institutional and scholarly conditions, good for the work as well as for ourselves, do we want to be moving? What is to be done to get us closer to them?

In the para-academic position I held from 1984-1996 (that's 12 years, by the way) the tenure-divide ruled my working life, sharply because I was on the wrong side of it — an un-tenurable junior administrator, and therefore more easily disposed of than the union-protected staff who emptied the bin in my office (as I discovered during a particularly uncomfortable period). By 1996, when I arrived in the United Kingdom to take up an academic appointment, tenure there had been abolished. Its absence proved a good thing both for me and for our nascent field, since in the UK the creation of a new appointment or new department in the humanities is not made a nearly impossible undertaking by the budgetary constraints of a tenure-line, which by definition runs into the indefinite future. For universities in the UK, as Holm and Liinason note, "it is easy to set up courses and degrees [and therefore to create positions] in disciplines that can demonstrate market demand" (2005: 7). Under the economic conditions at the time of my arrival here, all that was required was to be able to demonstrate such demand at some point in the future. Experimentation was thus encouraged. So, it seemed to me, a best of all possible worlds, or at least far better than the one I had left behind.

What was not apparent at the time was the expanding reach of Thatcherite market demand, from measures of success according to student-numbers to a redefinition of the lecturer's role as a form of customer service. (This term is actually used in some UK institutions and, given cuts in research funding for the humanities, is likely to gain currency and strength.) But cuts to funding have drawn attention away from the deeper problem and cause: the managerial assumption that vox pupilari vox Dei, or more accurately, "the customer is always right". Behind this assumption is a massive change in cultural attitudes that raises the fundamental question of what education is for, in a world where cultural authority has been flattened — where, as John Hartley has argued for the UK, since the Annan Report on the Future of Broadcasting (1977) no one in the public sphere has taken humanists seriously other than themselves (2009: 5).

Tenure is designed to guarantee intellectual autonomy by protecting academics from external pressure to conform. It grounds their personal authority in a contractual right that is in practice extremely difficult to challenge. Thus, although tenure makes the academic system considerably more rigid with respect to the creation of new departments and positions in them (and so distressingly conservative to upstarts like us), there's another side to it. Tenure also shields a nascent department and its tenured members from the distracting and enervating demands for proof of usefulness and service — proof in many cases, one suspects, of usefulness the evaluators are incapable of recognizing, and service merely to those who become customers. My point is that, for the establishment of a new discipline like ours, tenure is an expression of the problem: not the problem itself, and not a remedy either. A tenure-line in the digital humanities, devoutly to be wished for, removes one difficulty by creating another.

Differences in national and even local academic systems make institutional models difficult to transport. Institutional forms of the digital humanities are not only diverse in part because of local conditions, but are also still developing. We should look, I think, not at these models, therefore, but at a lower level — that is, to the individuals who have successfully found or created a niche, then ask what they have done. My experience suggests this: that we must consider the socio-intellectual qualities of the environment where the niche has been found or created. Despite my own autobiographical interjections I am not advocating that we do biography of individuals and sociology of groups, but rather that we pay attention to our disciplinary ideals and historical trajectory. (I use the contentious word "discipline" not to assert success at achieving a place on a canonical list, but merely to name what we numerous discipuli do.)

Along with our colleagues in the literae humaniores, we must continually worry the ideals by which we live — because, a close look suggests, we don't really know what they are or can be. We also cannot yet write the genuine history that would chart our disciplinary trajectory. But emerging from the experience of the last thirty years are without doubt two central qualities or modes of working: the collaborative and the interdisciplinary. Both are taking place on a daily basis; both are far more like questions than answers. Both are, as Peter Galison has said (2004: 380), often invoked as transcendental virtues, which we must make into qualified virtues.

Collaboration in the digital humanities remains mostly unstudied despite abundant activity and many precedents in studies of scientific collaboration and of "laboratory life," as Bruno Latour and Steve Woolgar named their 1979 book. Collaboration in our field goes back to the time of probing conversation between the academic humanist and the dedicated "humanities programmer," who were not social equals but, in my experience, had to become intellectual equals for the conversation to succeed. It's clear that our greatest potential can only be realised if that social equality becomes our norm. It should also be clear that when we define the meaning of collaboration in contrast to the pernicious caricature of the "lone scholar" — none is a scholar and alone in the intended sense — we damage its value irretrievably. All else, however, remains either a vague question or an unsupported claim. Collaboration is happening here and there, perhaps on some occasions even well. But we need to know how to steer for success in it.

Interdisciplinary research is just as poorly understood. Those who study "interdisciplinarity", having reified a dynamic and changeable process and named it with an abstract noun, are typically bemused by ontological distinctions between the reified "it" and other abstractions, e.g. multi-disciplinarity, trans-disciplinarity and so forth ad nauseam. Very few ever ask how interdisciplinary research is done, i.e. how best to think one's way into a discipline other than one's own. The enormity of the challenge, described by Thomas Kuhn and Dame Gillian Beer, for example, tends to be ignored; actual studies of interdisciplinary research projects, attesting to the difficulties, are too few;[1] and the absurdity of a neutral standing point, argued persuasively by Stanley Fish,[2] is likewise ignored. Hence the anti-disciplinary imperialism Fish documents and the endless talk of breaking down barriers. Collaboration in the digital humanities has been interdisciplinary from the beginning — we have no choice in the matter — but collaboration in its canonical form follows the model of the Manhattan Project, each discipline or specialisation tending to be represented by a separate person or persons. Hence Myra Strober's recent book, Interdisciplinary Conversations: Challenging Habits of Thought (2010), which is perhaps the best study to date of interdisciplinary work at the social level. Her conclusions are sobering, and being sober is helpful, but she does not deal with collaboration in the mind.

What remains untouched is the form of interdisciplinary research and the form of collaboration closest to the humanities as we find them, i.e. individual behaviour and solitary reasoning. The humanities are surely changing, indeed must adapt to changing social and institutional conditions, but the arrogance of those who brush aside our intellectual traditions with claims of a "new humanities" is damaging. Seeing the magnitude of change afoot, we think wrongly of replacing ways of working and reasoning rather than augmenting them. We make bandwagons for people to jump on rather than observe what they are already doing or trying to do. Bandwagons go in only one direction. Many explorers find many clues.

What, then, does all this have to do with our working lives? It offers, I suppose, a counsel to do what one can, from where one can, for the work that comes within reach, and not to be repelled by canonical forms, such as the funded project or tenure-bar or academic position. It is to ask, given what I have called the great project (for which funding etc. are helpful but not necessary), what can I do here and now, with what I have? But however good solitude may be for some kinds of work, communication is essential. Thus Humanist — not alone in this — which began in rebellion and resistance but has survived even acceptance.

[1] For the difficulties of actually doing interdisciplinary research see Kuhn 1977/1976: 5-6; Beer 2006; Catney and Lerner 2004; Oksen, Magid and de Neergaard 2004.

[2] Fish's argument that there can be no neutral standing-point, and so no perfectly interdisciplinary research, is persuasive, but the fact that perfection is impossible does not mean, as he suggests, that trying for it is absurd. This is essentially Liu's argument (2008).

Works Cited

Beer, Dame Gillian. 2006. "Dame Gillian Beer's Speech on the Challenges of Interdisciplinarity". Institute of Advanced Study, University of Durham, 27 April. (15/6/11).

Catney, Philip and David N. Lerner. 2004. "Managing Multidisciplinarity: Lessons from SUBR:IM". Interdisciplinary Science Reviews 34.4: 293-312.

Freud, Sigmund. 1920. A General Introduction to Psychoanalysis. Trans. G. Stanley Hall. New York: Boni and Liveright.

Galison, Peter. 2004. "Specific Theory". Critical Inquiry 30: 379-83.

Hartley, John. 2009. The Uses of Digital Literacy. Brisbane: University of Queensland Press.

Hoggart, Richard. 2009/1957. The Uses of Literacy: Aspects of Working Class Life. London: Penguin.

Holm, Ulla M. and Mia Liinason. 2005. Disciplinary Boundaries between the Social Sciences and the Humanities: Comparative report on Interdisciplinarity. Research Integration, Universities of York and Hull. Göteborg: University of Goteborg.

Kuhn, Thomas S. 1977/1976. "The Relations between the History and the Philosophy of Science". The Essential Tension: Selected Studies in Scientific Tradition and Change. 3-20. Chicago: University of Chicago Press.

Liu, Alan. 2008. "The Interdisciplinary War Machine". In Local Transcendence: Essays on Postmodern Historicism and the Database. 166-85. Chicago: University of Chicago Press.

Oksen, Peter, Jakob Magid and Andreas de Neergaard. 2004. "Thinking Outside the Box: Interdisciplinary Integration of Teaching and Research on an Environment and Development Study Programme". Interdisciplinary Science Reviews 34.4: 313-31.

Strober, Myra H. 2010. Interdisciplinary Conversations: Challenging Habits of Thought. Stanford: Stanford University Press.

Terkel, Studs. 1972. Working: People Talk About What They Do All Day and How They Feel About What They Do. New York: Ballantine Books.

Williams, Raymond. 2001/1958. "Culture is ordinary." In The Raymond Williams Reader. Ed. John Higgins. Blackwell Readers. Oxford: Blackwell.

–. 1958. Culture & Society 1780-1950. New York: Doubleday.


Response from George Brett

June 22, 2011

Re: Working Digitally

Hullo Willard,

Thanks for a thoughtful piece.

Glad to see Humanist mentioned. Boy that was a time. You did a smash up job editing it. Thanks! — George