We are pleased to welcome back Berkman Klein Fellow alumnus, Dennis Tenen, who joins us to discuss his new book, Plain Text: The Poetics of Computation (Stanford UP, 2017).
This book challenges the ways we read, write, store, and retrieve information in the digital age. Computers—from electronic books to smart phones—play an active role in our social lives. Our technological choices thus entail theoretical and political commitments. Dennis Tenen takes up today's strange enmeshing of humans, texts, and machines to argue that our most ingrained intuitions about texts are profoundly alienated from the physical contexts of their intellectual production. Drawing on a range of primary sources from both literary theory and software engineering, he makes a case for a more transparent practice of human–computer interaction. Plain Text is thus a rallying call, a frame of mind as much as a file format. It reminds us, ultimately, that our devices also encode specific modes of governance and control that must remain available to interpretation.
Notes from the Talk
The history of computing has its roots in not only mathematics, but also in the history of hermeneutics, or the interpretation of texts. In his talk, Dennis Tenen, Berkman Fellow Alumnus and Assistant Professor of English and Comparative Literature at Columbia University, drew on his recent book, Plain Text: The Poetics of Computation, and identified key intersections between literary theory and computation.
Tenen’s talk centered on three major points supporting his central argument that literary theory is a useful critical tool for examining computation. The first point is that computation is metaphoric. To convey this idea, Tenen proposed the concept of "transmediation," which describes the process of relaying information from one distinct medium into another, where it is then reconverted. Metaphors largely structure how we think about technologies. However, they can also conceal what happens beneath the surface, unless the user has advanced technological knowledge. For example, desktop icons of folders and wastebaskets imitate real-life material objects, but may obscure the computational processes they represent.
The second point was that texts are laminate and composite. By this, Tenen explained, he means to call attention to the idea that texts are no longer “in one place.” For example, one can identify that text written in a notebook exists in that notebook, but text projected on a screen simultaneously exists on the screen, in the projector, on the screen of the computer from which it originates, and potentially other places, as well.
Tenen’s third major point was that a reading of any text should always be infrastructural. He stated that “content is never disembodied; it always exists in a particular medium.” Therefore, studying both the medium and its related infrastructures matters. Tenen concluded by suggesting that this third point be taken up as a methodological imperative: in studying any kind of text, researchers must consider not only the content in isolation, but also the material, social, and historical contexts of the text.
notes by Donica O'Malley
Dennis Tenen's research happens at the intersection of people, texts, and technology.
His recent work appears in the pages of Amodern, boundary 2, Computational Culture, Modernism/modernity, New Literary History, Public Books, and the LA Review of Books, on topics ranging from book piracy to algorithmic composition, unintelligent design, and the history of data visualization.
He teaches a variety of classes in the fields of literary theory, new media studies, and critical computing in the humanities.
Tenen is a co-founder of Columbia's Group for Experimental Methods in the Humanities and author of Plain Text: The Poetics of Computation (Stanford UP, 2017).
For an updated list of projects, talks, and publications, please visit dennistenen.com.