Rethinking and Rebuilding: Grand Narratives in the History of Computing
We are gathering in Siegen and online this July for a rather unusual purpose: two days of discussion of the historiography of computing. The event is prompted (with a pandemic-related delay) by the publication last year of my book with Paul Ceruzzi, A New History of Modern Computing, but the idea is not so much to celebrate our achievement as to discuss everything it leaves undone.
You can see the full program for the event at https://www.socialstudiesof.info/grandnarratives/.
Grand Narratives
This workshop is themed around the concept of “grand narratives,” a term that can be traced back to the work of Jean-François Lyotard in the late 1970s. Coming early in the turn to postmodernism, the term was always pejorative, something that comes through still more clearly in the alternative formulation of the “master narrative.” Grand narratives are too neat, explaining history in terms of broad trajectories of progress and enlightenment; they are in fact myths granted the status of universal truths by self-serving elites. Only localized narratives rooted in the specifics and diversity of human experience can overcome the oppressive power of these totalizing stories. (Lyotard emphasized the role of computer technologies and artificial intelligence in postmodernity, so the framing seemed particularly appropriate.)
Historians focus more on craft than theory, but they followed a parallel course, away from consensus views of US history and towards stories that stressed conflict, contingency, the hidden agency of marginalized groups, culture, and complex intersections of identity. Influential historians of the 1980s and 1990s routinely framed their work as a fight against generalization, to show that the actual stories of particular communities were richer and more complex, more deeply rooted in the specifics of time and place, than we might guess from the discredited narratives of progress, freedom, or democracy.
For several generations, then, humanities scholars have been rewarded more for undermining large-scale narratives than for constructing them, though as historians are inherently in the narrative construction business we’ve been less content than other humanists simply to smash a big story to pieces before declaring victory and going home to put our feet up. Some big stories have even come back into fashion. Global history demands an audacious willingness to generalize, and environmental histories often construct new narratives on a broad scale. But most historians remain much more comfortable undermining the efforts of journalists and popularizers to repeat discredited myths than building up big narratives of their own.
This reluctance has been, I venture to suggest, a problem for the history of computing and particularly for the history of computer science. Within the history of science and technology, an aesthetic preference for richly detailed, narrowly drawn studies that uncovered the agency of overlooked actors by looking closely at culture and practice led to a shift away from ploddingly comprehensive histories of disciplines, professional societies, or academic departments. For topics such as the history of evolution, quantum mechanics, or the scientific revolution, where well-trodden narratives were ripe for reinterpretation, the new approaches worked well. Yet because scholarly incentives no longer rewarded the production of comprehensive narrative histories of scientific disciplines, there was never a grand narrative of the history of computer science, or any of its constituent communities, for ambitious young scholars to audaciously undermine on their way to tenured positions at great universities.
I do not, of course, mean to suggest that our book is itself intended to create a new “grand narrative.” For one thing this would be reactionary, as all right-thinking humanists know that grand narratives are bad; for another, it would be presumptuous to the point of madness. Grand narratives are the products of entire cultures, not of individuals. More modest terms such as “overview history,” “synthetic history” or, at the most, “comprehensive narrative” are better descriptions of our intent. Books of this kind are better at consolidating perspectives that have already emerged in a field than creating entirely new narratives.
But to write a long and broad account of electronic computing since 1945 is to run up against various grand narratives in a way that challenges one to engage with them. The narrative of technological progress has in recent decades been kept alive almost entirely by developments in computing and digital media technologies, to the extent that many people have been fooled into believing we live in an age of unprecedentedly rapid and disruptive innovation rather than in the most profound era of technological stasis since the beginning of the industrial revolution. Then there is the narrative of the last fifty years as an epoch defined by growing inequality and triumphalist neoliberalism, the narrative of the US as a global hegemon sustained by its military-industrial-academic complex, the replacement of the US by Asia as the dominant force in global manufacturing, the story that Western societies moved from social cohesion to atomized individualism after the 1940s, and so on. As computing moves from the margins to center stage it is implicated in all the stories our societies tell themselves to explain their presents and predict their futures.
Did we as authors fully and triumphantly situate the story of computing within these narratives? For the most part, no. We engaged with some more than others, most squarely with those concerning technological progress and the military-industrial complex. Some surface clearly in the epilogue but only implicitly in the main text. The story we address most directly is the idea that the computer was born universal, which is not as grand a narrative as the myth of human progress but nevertheless looms large in the imagination of computer scientists and popular writers. But neither, I believe, has the community on whose work we drew been consistently successful in contextualizing its stories against these urgent and pervasive narratives.
What’s New About Our “New History”?
The two classic scholarly overview histories of computing were published back in the late 1990s: Campbell-Kelly and Aspray’s Computer and Ceruzzi’s A History of Modern Computing. In the final session we’ll be hearing from those three authors, in conversation with JoAnne Yates, whose Structuring the Information Age takes a long slice through time to explore the use of IT in the insurance industry. Our book replaces Paul’s earlier volume, making it a somewhat unusual hybrid between a new book and a revised edition. I discussed our process in a 2018 working paper, “Finding A Story for the History of Computing.” Meanwhile a significantly expanded fourth edition of Computer, with three additional authors, is currently in preparation and will likely be a major topic of discussion during the session.
To allow myself one paragraph of unequivocal satisfaction, I do believe that A New History of Modern Computing makes some changes to the traditional narrative that will help it to resonate with a new generation of readers, many of which consist simply of integrating work from a secondary literature that is hugely more voluminous and intellectually diverse than that of the 1990s. Our book brings the narrative up to 2020, where the earlier books settled into wrap-up mode somewhere around the introduction of the original PC and Mac in the 1980s. It fully integrates the history of computing and the history of networking. Rather than address software in a separate chapter, it integrates discussion of software and programming throughout the book. It looks at users as well as producers, and in fact users are integral to the structure of the book since each chapter follows a user-driven transformation in which new application areas drove the creation of new technologies and practices. It has far more to say about the history of personal computing than any previous overview history, it integrates the stories of home computing and videogaming into the story of the computer, and it departs more thoroughly than previous overview histories from the classic insider narratives of personal computing that focus on California in the mid-1970s. It talks more deeply about gender than any of the earlier comprehensive histories, which you might rightly suspect is a low bar to clear, and it engages sporadically with the social history of computing. It casts a broader net than earlier histories in its treatment of digital media devices, including computer-based devices such as cars, digital cameras, music players, and televisions as well as the boxes more obviously identifiable as computers. It’s been fact-checked by experts and has so far held up fairly well in the hands of even the most nitpicking of readers.
A book like this leaves out many details, but we tried to make the sentences that we did write literally true even as each says only a small part of what could be said on the subject at hand.
If you like this sort of thing, as the saying almost goes, then it’s probably the sort of thing you will like quite a bit. Yet much less of the academic world likes this sort of thing than it used to. Some scholars may actively dislike this sort of thing: an effort by two white men to center a big story on the evolution of a technology. To write a book like A New History of Modern Computing is to be faced with the problem of deciding, as Michael Mahoney put it, what the history of computing is the history of. Like it or not, one must attempt to weave together the work our community has produced over decades into a coherent story. The one we chose was the incremental development of the computer towards the status of universality: beginning as a highly specialized scientific tool and finishing as a technology used by most humans daily to do almost everything. That’s a very big story, and one centered around the co-evolution of computer technologies and applications. As we say in the introduction, you can’t plausibly attempt to give a comprehensive answer to the question “how did the computer change the world,” whereas the question of how the world changed the computer is comparatively tractable.
Our approach centers the computer, albeit the computer as a stack of technologies and practices rather than a simple artifact. But there have been many calls for us as a community to do the opposite: to decenter the computer. My response to that is pragmatic: while individual histories can and should decenter the computer, we must recognize that computer technology is at this point intertwined with most of what happens in the world. A broadly conceived history of computing will at some point become the history of (almost) everything, which is hard to write coherently in a single volume. As time goes by, there are some jarring shifts of scale, from dozens of computers in the world in the first chapter to hundreds of billions by the end of the book. A focus on technology and architecture made our task feasible by creating a web of connections between chapters.
The other choice we made was to stick with the idea of a comprehensive narrative history. We tried to include as many as possible of the platforms, applications, people, technologies and programming languages that historians or, in the more recent chapters, other experts have identified as particularly important. That leads to a big book, something like a three-dimensional jigsaw with thousands of pieces that must be somehow worked into a self-supporting structure.
I can imagine others telling big stories more selectively, through discontinuous case studies that together illustrate points in the trajectory of a long arc. Perhaps a user-centered history of human interaction with computers told in five case studies over five decades, a history of programming practice in six snapshots from Lovelace to the present day, or a media-centric story told by examining the storage of data in seven representative platforms plucked from different eras. Jacob Gaboury’s recent Image Objects seems to me an exemplary effort of this kind.
Many such histories will surely be written, but I hope the existence of comprehensive narrative histories will help their authors, both in guiding the periodization and choice of systems and in giving them something to push against. “Haigh & Ceruzzi do not even mention X…,” “Haigh & Ceruzzi situate Y as a step in the development of Z, but in fact the story is much more complex …” and so on. I’ve seen Paul’s earlier book used in just this way, and I feel like we may be doing the community a service if only by making a better straw man that can be more productively set alight at regular intervals. Any worthwhile dissertation engaging with the history of computing must be able to claim to do something we didn’t do, and for that purpose we provide a more useful measuring stick than, for example, Walter Isaacson’s simplistic, unreliable, and spectacularly popular history The Innovators.
Five Questions For Five Panels
The event is structured primarily as five round-table discussions, each one titled with a question. The panelists are all smart, interesting and accomplished and most probably they will say whatever it is they want to say without worrying too much about the exact question asked or why I picked it. I did not put myself on any of the panels, and want to let the discussion proceed freely whether it happens to focus on what we did in A New History of Modern Computing or shift quickly into the articulation of hypothetical new narratives. But in case anyone does care, here are a few musings on why I picked each of the questions.
Panel 1: Could we structure a big story around the materialities of data, computation and networks?
You may wonder why materialities and networks go together here. Some consolidation of topics was inevitable – there were only so many slots on the program – but materiality has been rather underexplored in the history of computing and I felt its lack particularly strongly when writing about networking.
The grand narrative of technological progress in computing is usually referenced via Moore’s Law, which of course serves to hide and naturalize the phenomenal amounts of money and human effort that went into keeping it trueish for decades, as well as all the tiny advances in materials and processes. The story we tell of the computer’s journey toward ubiquity depends fundamentally on constant improvements in computational capabilities and decreases in cost, size, and power consumption. Yet we, and the literature we draw on, only fitfully explain how and why this happens.
We are reasonably thorough and, I hope, reliable in discussing some of the origins – the delay lines, drums, cores, and disks of vacuum tube computers, the invention of the integrated circuit, the first microprocessors and the shift to CMOS. I’ve recently accumulated some interesting stories on chip design in the 1970s and 80s that I’d add to any new edition to illustrate the connection between chips and photography. But, like other histories, we zoom right out as the story continues through the 1980s and 1990s, telling the story on the level of processor architecture (RISC vs CISC) and operating systems rather than transistor technologies. We say something about VLSI and ASICs but nothing about whatever is actually occurring to make transistor densities increase so spectacularly. We also have nothing much to say about flash memory, SSDs, and other modern storage and memory technologies. We do weave the shift of manufacturing to Asia into the story, though not Intel’s recent struggles to keep up with cutting-edge technologies. So there’s a strange disjunction here: modern semiconductor process technologies have achieved enough geopolitical significance that you can read about them in The Economist but have largely vanished from the stories told by historians of computing.
The lack of attention to materiality is particularly notable in the history of networking. The literature we drew on, particularly Janet Abbate’s book Inventing the Internet, gives a clear idea of what was going on in the early days of the ARPANET and timesharing: leased lines, ASR-33 teletypes, modems, IMPs and so on. Later on, the commercialization of the Internet and our current age of digital ubiquity were made possible by the plummeting costs of bandwidth. How else could an unmetered domestic Internet become the norm? But beyond the 1990s mania for laying long distance fiber optic cables, I haven’t come across any detailed historical explanations for why this happened, or of the fundamental shifts in wireless communication that took mobile data from an expensive curiosity to the dominant mode of Internet usage.
Likewise the emergence of cloud computing, a central theme in our last two chapters, relies (as many scholars have pointed out) on an enormous and inescapably material infrastructure of data centers. We do talk a little bit about some of the software (Hadoop) and hardware (Google’s shift to cheap commodity components) transitions that made this possible. But as the bounds of “the computer” have shifted from specific boxes to world-wide redundant clusters (as we spin this: Sun’s slogan of “The network is the computer” finally came true), there is a real challenge in thinking about platforms and infrastructures in new ways and integrating this into the story of the computer itself. One might also think about Paul Edwards’ data-centric view of the same history in A Vast Machine (which we don’t even cite) as a narrative that deals with changes in the affordances of scientific data processing over an even longer history, grappling with these spectacular changes in scale and scope. In a sense, the history of computing has become the materiality of the history of science.
Panel 2: What if we don’t center the United States?
Twelve years ago I wrote the following:
American scholars tend to view the history of information technology as a fundamentally American narrative. The US is the only country with a sufficiently central role in most areas of information technology that a coherent (if skewed) overall history of computing can, and often has, been written with minimal reference to the world outside its borders. There are a few Englishmen who force their way into the narrative: Charles Babbage, Alan Turing, the teams behind the Manchester Mark 1 and the EDSAC, the LEO group that pioneered administrative computing, and Tim Berners-Lee. Konrad Zuse flies the flag for Germany. Jacquard’s automatic loom earns France a paragraph somewhere in an early chapter. But these can be dismissed in passing as brilliant figures whose seminal technical accomplishments quickly passed into the hands of Americans for practical exploitation. The history of personal computing, in particular, is told by Americans entirely without reference to the existence of a world beyond the oceans, or in most cases beyond the Valley.
Historians based in the United States have therefore been much less likely than those working in other countries to attempt to isolate peculiarities of their own national experience or relate the development of information technology industries to the influence of government policy…. Scholars in other countries have tended to focus on national narratives, generally framed with perceived differences between local developments and those in America.
Given the opportunity to create a new kind of history of computing narrative, what did we do? In all honesty: the US story remains our default, yielding a strange hybrid of national and global histories. We go overseas more often than previous overview histories, but we still make the journey only when necessary to illustrate the origins of some important new application of computer technology. For example, we visit the UK for LEO, the early computers at Cambridge and Manchester, the Sinclair and Acorn machines of the 1980s and the BBC’s computer literacy project, ARM, and so on. Continental Europe shows up mostly for CYCLADES, Minitel, and the World Wide Web, while the USSR exists mostly as an off-stage bogeyman to justify US government investments in computer technology. Asia shows up later in the narrative, as the new home of microelectronic manufacturing, a source of challenges for text input and display and, eventually, as the world’s most important ecommerce marketplace. Africa is mentioned only very briefly, as a pioneer of cellphone-based financial transactions.
This is, in part, a pragmatic matter. The US installed far more computers in the 1950s than the rest of the world combined, and in later decades US firms such as IBM, DEC, Microsoft, Apple and Google exerted an outsize impact in shaping the hardware and software platforms used worldwide. Any history concerned primarily with explaining where widely used technologies come from must recognize this. It’s also true that Paul’s existing text included important case studies of US-based user organizations such as NASA and the IRS. Some of these could have been replaced with non-US examples, as David Gugerli did in his recent book How the World Got Into the Computer, though his overall narrative appeared to be a universal one illustrated with US and German examples rather than being explicitly coupled with any national story.
There have, of course, been efforts to write national histories of computing for countries outside the US. But what would an international history of computer technology that didn’t center the US even look like? Would it be basically the same story, with examples of computer use chosen from a broader range of countries? Would it periodize differently, focusing much more on the latter part of our story and less on the 1940s to 1960s? What important modes of computer use, kinds of technology, or applications do we slight by treating the US story as the default?
Panel 3: What can we gain by reconnecting the history of computing with the histories of computer science and mathematics?
Back in the 1970s, the history of computing was written mostly by computer scientists or aging computing pioneers and concerned itself primarily with devices built to carry out numerical mathematics. Both these things changed, as the field developed primarily as a subfield of the history of technology and, more recently, media studies. Historians of computing famously moved Donald Knuth to tears by shifting away from the technical analysis of computer hardware and software. More recently, Liesbeth de Mol and Maarten Bullynck called for a history of computing grounded more deeply in the history of mathematics. Meanwhile the history of computer science as an academic discipline remains largely unwritten: historians of science have shown remarkably little interest in the topic, and a recent flurry of work on early electronic computing wraps up in the mid-1960s just as the institutional history of computer science begins.
Here's how I described that in an unpublished essay:
Historians of technology, of business, and more recently of media studies have produced a vibrant and fast-growing literature on the history of computing, broadly conceived. But because computer science is a scientific discipline, the task of understanding its evolution falls to historians of science, who have so far shown little interest in the topic (S. Dick 2016). Princeton historian Michael S. Mahoney spent many years working on a history of theoretical computer science but produced only a set of densely suggestive essays (Mahoney and Haigh (ed.) 2011). Few scholars have focused on scientific computing or the history of computer science, which has for the most part been explored only in fragments where it touches other stories such as the development of biomedical science (November 2012), the Internet (Abbate 1999), programming work (Ensmenger 2010), bioinformatics (Stevens 2013), computer graphics (Gaboury 2021), missile defence (Slayton 2013), computational support for natural science (Akera 2007) or timesharing systems (Rankin 2018). The most technically engaged recent work on the history of computing such as (Haigh, Priestley, and Rope 2016; Daylight 2012; De Mol, Carle, and Bullynck 2013; Priestley 2018) has focused on the era just before the establishment of computer science as a discipline during the mid-1960s, or, in some cases (Jones 2016), before the arrival of the electronic computer itself. More scholarly attention has been paid to the meaning of Soviet efforts to conceptualize computer science as part of cybernetics (Peters 2016; Gerovitch 2002; Tatarchenko 2013) than to the core story of the growth of computer science in the US. The closest thing to a comprehensive history of computer science, useful but schematic, comes from a computer scientist rather than a professional historian (Tedre 2015).
Our new book is more deeply grounded than most in the history of computer technology, to the extent that this may delegitimate it in the eyes of some historians. As I will discuss during the event, we found in Paul’s existing interest in computer architecture a through-line that could tie together the affordances of the very different systems considered over the course of our narrative. Yet this is not a history of computer science (as I wrote in the copy I sent to Donald Knuth, emphasis included). We talk about the contributions of many computer scientists, but only where we could draw a direct line from their contributions to developments in computing practice and technology. For that reason, for example, there is little on the history of AI or of theoretical computer science. We talk about the importance of numerical mathematics and computer simulation in driving developments in computer technology through the late 1970s in chapters 1 and 2, but touch only occasionally on scientific computing and supercomputing after that point.
So what opportunities did we miss to explain how later developments in scientific computing shaped the broader development of computing? Why is the history of computer science almost entirely unwritten? What might serve as grand themes and organizing narratives for an ambitious history of computing grounded more deeply in the history of mathematics and of computer science?
Panel 4: How could media theory and STS underpin new historical ways of understanding the story of the computer?
Our new book integrates the stories of digital communication and digital media into the core history of computing narrative far more thoroughly than any previous overview history. Yet, as readers with a media studies background will undoubtedly notice, we neither engage explicitly with existing media theory nor erect a theoretical apparatus of our own. This is both strategic, as we did not want to make our story less accessible by encumbering it with further jargon, and unavoidable, since neither of us is a media studies scholar. The closest we come to offering a theoretical generalization is the distinction between the conceptual universality that theoretical computer scientists see in all programmable computers, going back to ENIAC, and the slow historical progress of the actual computer towards practical universality as it was adapted to suit the needs of different user communities. This is also a matter of craft: we built our ideas on conceptualizing the history of computing into the structure of the book rather than presenting them primarily as bullet points or a theory-driven manifesto.
The issues we dealt with are, however, of unmistakable interest to media scholars. The book’s creation was enthusiastically supported by the Media of Cooperation project at Siegen, reflecting a long interest in the intersection of media and computing technologies and the affordances of digital materiality. In addition, the history of computing community, as represented by SIGCIS, has taken an unmistakable demographic turn in the direction of media studies and communication in the last decade, and at this point a clear majority of the people in the US with stable academic jobs teaching and researching on topics related to the history of computing are doing so outside history departments (and STS/history of science programs).
So what did we miss? How could integrating media theory more deeply into the core narrative of the history of computing provide better historical explanations? What would the story itself look like if reconceived as a narrative in which media technologies take over computation rather than, as we have it here, one in which computers dissolve the insides of media technologies?
Panel 5: Can we integrate issues of gender, justice and embodiment into the story of the computer itself or must these narratives remain separate and particular?
I’ve had a weird career: educated first in computer science and then in the history of technology, with brief spells teaching in administrative science, STS, and informatics before settling down to a soul-crushing thirteen years in a school of library and information science followed by a very welcome transfer to the history department. But since graduate school my core identity has been fairly stable, as a US social historian of technology and business who specializes in the history of information technology. That was a weird thing to claim in a library school that had me teaching systems analysis and project management, but I still claimed it. It also meant I found myself identifying more with the social history courses I took in the regular history department at the University of Pennsylvania (which was also the home to my dissertation advisor) than with the history of science, despite having earned my Ph.D. in the history and sociology of science department.
Five years ago the gods heard my boasts and called my bluff. I found myself still in the same chronically underfunded public university but now teaching in a history department that has shed about half of its faculty in little more than a decade. With many gaps in our coverage, my teaching has been more about filling in the holes with undergraduate surveys than building up advanced courses in my specialized areas. Last semester I taught an undergraduate history of computing class for the first time in 21 years. In contrast, every semester I teach a class on the history of race and health in the US that begins with the Columbian Exchange and ends with the racial dynamics of Covid. Every year I teach a survey seminar on the history of capitalism, which also leans heavily into race and gender (including two weeks on slavery). Those topics are also prominent in my graduate research methods class.
This isn’t some recent transformation in the historical profession based on an embrace of “critical race theory”: class, race, and gender have been central to scholarship on US social history for generations. Not foregrounding them would be professional malpractice. That is perhaps a difference between teaching US history and media studies, as in the latter such topics can be embraced or ignored according to personal preference, creating an environment in which it might seem more plausible to claim that the story of any group of people may be told (or rather retold, or synthesized, or summarized) only by someone identifying as part of the group concerned. One can’t recapitulate the story of the US semester after semester in class after class while speaking only about the experience of cisgendered middle class white men who grew up in the English Midlands in the 1970s and 80s.
And yet, here I am as the lead author of a big heavy book which is by no stretch of the imagination structured around class or race and is only intermittently concerned with gender. All three categories enter our narrative explicitly at times. But we focus the structure of the story around the development of computer technologies themselves. Neither do we stop as often as we might to remark upon the overwhelming whiteness of our cast of characters.
Aside from an apocalyptic epilogue set amid the techlash and the pandemic, we generally avoid much overt editorializing about the social consequences of the adoption of computer technology. This mirrors the sudden loss of faith, within the communities we are writing about, in the generally positive trajectory of computer technology. Astute readers will notice the unmistakable prevalence of military and aerospace applications in the early chapters, driving everything from the invention of the modern computer to the SAGE project, Minuteman missiles, and the earliest markets for silicon chips. Yet we did not, for example, include discussion of the role of statistics and big data in racist social and criminal justice policies, or of the political agendas of US computer companies. Neither do we get far into discussing the role of IT in economic transformations that have increased inequality.
The SIGCIS community has taken some highly visible turns in the direction of dealing with issues of justice, diversity, and embodiment during recent years, complementing the longer established focus on gender. One of our panelists, Jeff Yost, worked with Gerardo Con Diaz to organize the recent Just Code event and establish a new book series at the intersection of computing and social justice. Yet work on these topics has so far been less likely to be deeply historical, or fully engaged with the body of scholarship on US history.
So, to me at least, some questions remain open. One is the extent to which focusing an overview narrative of computing around the categories used in US social or cultural history is compatible with the ideal of finding a global narrative of computing. Both would be worthy goals, but they might lead to very different stories. Another is the extent to which a synthetic history of computing more deeply engaged with the core narratives of US history would aim primarily to show that the computing communities under discussion were microcosms reflecting the broader organization of US society during the period (pointing out, for example, heteronormativity and racism), or would focus instead on arguing that computer technology had influenced those dynamics (online discourse, economic inequality, etc.). A related question explains the title of this panel, given increasing acceptance of the idea that authors may only legitimately tell the stories of communities of which they are themselves a part. Will a concern with issues of social justice inherently draw us away from narratives structured around the evolution and affordances of computer technologies and towards incommensurable narratives rooted in the irreducible specifics of particular communities? Or perhaps towards broad narratives rooted more in the history of capitalism than the history of technology, or towards a renewed focus on the evils of the military-industrial complex and neoliberalism?
Embodiment is an interesting category here, and one that makes me realize how little concerned narratives around the history of computing have usually been with the biological materiality of human existence. This has been central to the work of Liz Petrick, who is on the panel, on the history of accessibility in computer interfaces, and I was struck by the response of my students last semester to Laine Nooney’s essay on the ills wreaked on human bodies by the demands of computer technologies. It also has an obvious connection to the history of gender and sexuality. Yet we all have bodies, which differ, of course, but as variations on a common theme, so perhaps working them more systematically into the history of computing might offset both the persistent hostility to the realities of human existence found in geek culture and the difficulties of writing history across lines of class, race, gender, and sexual orientation.