October 10, 2024
In a new book, Todd Presner examines how AI can help preserve and analyze survivors’ stories.
In the 21st century, digital tools have opened new avenues for exploring Holocaust testimonies, while also raising new ethical questions and concerns.
In his book “Ethics of the Algorithm: Digital Humanities and Holocaust Memory,” Todd Presner, chair of UCLA’s Department of European Languages and Transcultural Studies, argues that computational and data-driven methods can ultimately be key in preserving and analyzing mass amounts of Holocaust testimonies. The book was published in September by Princeton University Press.
“I think the possibilities unlocked by big data and AI are just tremendous in every field. We’re in the middle of an epistemological earthquake that we haven’t figured out — and not in a bad way,” Presner said. “I hope some of the ideas in ‘Ethics of the Algorithm’ will inform the way we’re approaching AI. I’m cautiously optimistic that we’re going to see an exciting revolution before us in terms of how we study history, culture and society.”
For this installment of the UCLA College’s Bruin Bookshelf Spotlight series, we spoke with Presner about the book’s collaborative and interdisciplinary nature, digital humanities in the age of AI, and how his research has influenced his teaching.
What inspired you to write this book?
We have before us tens of thousands of Holocaust and genocide testimonies that have been recorded by various institutions over time, and yet it’s almost impossible for a single individual to listen to them one by one.
The questions became, “Could computational tools be used to help us access, understand and listen in new ways — and can we use them ethically?” Especially when you have something as sensitive as a Holocaust testimony, you have to really think about how these tools can be abstracting — they can turn people into data or numbers, which we were determined not to do.
Another thing that inspired me was that for the last 15 years, I have served as a faculty member for a project called Bearing Witness, where students enroll in a Fiat Lux class and talk to Holocaust survivors one-on-one. One of the things that sadly has happened over the years is that most survivors have passed on, so very soon our understanding of this historic event is going to come through recorded testimonies and digital archives. This creates an urgent need to develop the tools and the ethics to understand these histories once they are no longer available through first-person testimony.
Can you describe the writing process for “Ethics of the Algorithm”?
“Ethics of the Algorithm” has contributions by a number of people who were students at the time in the digital humanities program at UCLA, including current Ph.D. students like Anna Bonazzi, alumni like Rachel Deblinger M.A. ’09, Ph.D. ’14 and Kyle Rosen M.A. ’18, Ph.D. ’23, as well as undergraduate digital humanities researchers like Michelle Lee ’22, Campbell Yamane ’18 and Leo Fan ’20. They contributed to the development of methodologies, data visualizations and, in some cases, writing.
It’s exciting to have a book that’s not just single-authored, but actually deeply collaborative. The work of this book was also deeply experimental, and it was done in collaboration with so many folks in other disciplines.
What did the student contributors bring to the table that was so special?
I think it’s an eagerness to work with new technologies and apply them in ways that are unexpected. In a sense, that’s kind of at the core of the digital humanities.
I call this an integrated methodology in the book, which means bringing qualitative and quantitative analysis together. I think it opens up new horizons of interpretation, because you can bring social, scientific, computational, linguistic and historical archive-based research together. The students are, in some ways, at the forefront of these kinds of mixed and integrated methodologies. We as faculty are learning.
How has your research surrounding digital humanities and the Holocaust intertwined with teaching the cluster course “Data, Justice, and Society”?
The “Data, Justice, and Society” class was also an exciting collaborative opportunity. I’m in the humanities, but also got to work with faculty in the social sciences, information studies, political science and other areas to imagine new ways of using data analysis to understand social and historical questions through a social justice lens. That has honestly been a through-line of a lot of my research and teaching over the years; it goes back to the HyperCities project [a digital mapping platform that explores the layered histories of city spaces].
Data and data collection practices have always had social, political and epistemological dimensions, and we’re trying to understand the way that data can be used to advance justice.
I also have this new AI and Cultural Heritage Lab, and there are four undergraduate students from the cluster course who are in it — Elliot Cao, Billy Peir, Aditya Patil and Aarya Khanna — and also Ulysses Pascal, who was one of the TAs. We’re undertaking a number of partnerships with different museums and libraries right now, and not just on Holocaust testimonies — we’re also potentially looking at court records from Los Angeles, or large collections of digital records that the UCLA Library has available to us.
What would you say to a Bruin who’s thinking about minoring in the digital humanities?
Marrying the humanities and the sciences has always been so exciting to me. When I was an undergrad, I actually started off as a chemistry major; I ended up dropping it my junior year and studied philosophy and literature. However, I’ve always been intrigued by the methodologies of the sciences and bringing those together with narrative-based fields.
There’s something beautiful about the humanities, the arts and design fields, because they’re about imagining new possibilities and telling stories — and bringing that to bear on computational fields, which are also about bringing something into reality but through different tools and techniques. I love approaches that do both, and that’s the basis of all work at UCLA, and central to the digital humanities.
As chair of UCLA’s Department of European Languages and Transcultural Studies, what would you say the department offers that the typical Bruin may not know about?
We teach about nine different languages, some of which are obviously very highly enrolled, like French, Italian and German, but also lesser-taught languages like Dutch, Yiddish and Swedish. We also teach media, film, literature and history — the expressivity around those languages.
The second part of the department, transcultural studies, is actually really interesting, because that’s about translation: trans-historical, trans-linguistic, transcultural, trans-media. It’s about movement. That’s the beauty of the world, right? It’s about migration, but also about the mobility of ideas.
Films, people, capital — everything moves across borders, and emphasizing that transcultural element is a different way of thinking. It’s certainly the hallmark of the globalized, digital world.