The Significance of the Twitter Archive at the Library of Congress

It started with some techies casually joking around, and ended with the President of the United States being its most avid user. In between, it became the site of comedy and protest, several hundred million human users and countless bots, the occasional exchange of ideas and a constant stream of outrage.

All along, the Library of Congress was preserving it all. Billions of tweets, saved over 12 years, now rub shoulders with books, manuscripts, recordings, and film among the Library’s extensive holdings.

On December 31, however, this archiving will end. The day after Christmas, the Library announced that it would no longer save all tweets after that date, but would instead choose tweets to preserve “on a very selective basis,” such as those documenting major events, elections, and matters of political import. The rest of Twitter’s giant stream will flow by, untapped and ephemeral.

The Twitter archive may not be the record of our humanity that we wanted, but it’s the record we have. Because of Twitter’s original terms of service and the public availability of most tweets, which set it apart from many other social media platforms, such as Facebook and Snapchat, we are unlikely to preserve anything else like it from our digital age.

Undoubtedly many would consider that a good thing, and that the Twitter archive deserves the kind of mockery that flourishes on the platform itself. What can we possibly learn from the unchecked ramblings and ravings of so many, condensed to so few characters?

Yet it’s precisely this offhandedness and enforced brevity that makes the Twitter archive intriguing. Researchers have precious few sources for the plain-spoken language and everyday activities and thought of a large swath of society.

Most of what is archived is indeed done so on a very selective basis, assessed for historical significance at the time of preservation. Until the rise of digital documents and communications, the idea of “saving it all” seemed ridiculous, and even now it seems like a poor strategy given limited resources. Archives have always had to make tough choices about what to preserve and what to discard.

However, it is also true that we cannot always anticipate what future historians will want to see and read from our era. Much of what is now studied from the past are materials that somehow, fortunately, escaped the trash bin. Cookbooks give us a sense of what our ancestors ate and celebrated. Pamphlets and more recently zines document ideas and cultures outside the mainstream.

Historians have also used records in unanticipated ways. Researchers have come to realize that the Proceedings of the Old Bailey, transcriptions from London’s central criminal court, are the only record we have of the spoken words of many people who lived centuries ago but were not in the educated or elite classes. That we have them talking about the theft of a pig rather than the thought of Aristotle only gives us greater insight into the lived experience of their time.

The Twitter archive will have similar uses for researchers of the future, especially given its tremendous scale and the unique properties of the platform behind the short messages we see on it. Preserved with each tweet, but hidden from view, is additional information about tweeters and their followers. Using sophisticated computational methods, it is possible to visualize large-scale connections within the mass of users that will provide a good sense of our social interactions, communities, and divisions.

Since Twitter launched a year before the release of the iPhone, and flourished along with the smartphone, the archive is also a record of what happened when computers evolved from desktop to laptop to the much more personal embrace of our hands.

Since so many of us now worry about the impact of these devices and social media on our lives and mental health, this story and its lessons may ultimately be depressing. As we are all aware, of course, history and human expression are not always sweetness and light.

We should feel satisfied rather than dismissive that we will have a dozen years of our collective human expression to look back on, the amusing and the ugly, the trivial and, perhaps buried deep within the archive, the profound.

Institutionalizing Digital Scholarship (or Anything Else New in a Large Organization)

I recently gave a talk at Brown University on “Institutionalizing Digital Scholarship,” and upon reflection it struck me that the lessons I tried to convey were more generally applicable. Everyone prefers to talk about innovation, rather than institutionalization, but the former can only have a long-term impact if the latter occurs. What at first seems like a dreary administrative matter is actually at the heart of real and lasting change.

New ideas and methods are notoriously difficult to integrate into large organizations. Institutions and the practitioners within them, outside of and within academia (perhaps especially within academia?), too frequently claim to be open-minded but often exhibit a closed-mindedness when the new impinges upon their area of work or expertise. One need only look at the reaction to digital humanities and digital scholarship over the last two decades, and the antagonism and disciplinary policing they are still subject to, often from adjacent scholars.

In my talk I drew on the experience of directing the Roy Rosenzweig Center for History and New Media at George Mason University, the Digital Public Library of America, and now the Northeastern University library. The long history of RRCHNM is especially helpful as a case study, since it faced multiple headwinds, and yet thrived, in large part due to the compelling vision of its founder and the careful pursuit of opportunities related to that vision by scores of people over many years.

If you wish to digest the entire subject, please watch my full presentation. But for those short on time, here are the three critical elements of institutionalization I concluded with. If all three of these challenging processes occur, you will know that you have successfully and fully integrated something new into an organization.

Routinizing

At first, new fields and methods are pursued haphazardly, as practitioners try to understand what they are doing and how to do it. In digital scholarship, this meant a lot of experimentation. In the 1990s and early 2000s, digital projects that advanced scholarly theories eclectically tried out new technologies. Websites were often hand-coded and distinctive. But in the long run, such one-off, innovative projects were unsustainable. The new scholarly activity had to be routinized into a common, recognizable grammar and standardized formats and infrastructure, both for audiences to grasp genres and for projects to be technically sustainable over time.

At RRCHNM, this meant that after we realized we were making the same kind of digital historical project over and over, by hand, we created generalized software, Omeka, through which we could host an infinite number of similar projects. Although it reduced flexibility somewhat, Omeka made new digital projects much easier to launch and sustain. Now there are hundreds of institutions that use the software and countless history (and non-history) projects that rely on it.

Normalizing

To become institutionalized, new activities cannot remain on the fringes. They have to become normalized, part of the ordinary set of approaches within a domain. Practitioners shouldn’t even think twice before engaging in them. Even those outside of the discipline have to recognize the validity of the new idea or method; indeed, it should become unremarkable. (Fellow historians of science will catch a reference here to Thomas Kuhn’s “normal science.”) In academia, the path to normalization often—alas, too often—expresses itself primarily around concerns over tenure. But the anxiety is broader than that and relates to how new ideas and methods receive equal recognition (broadly construed) and especially the right support structures in places like the library and information technology unit.

Depersonalizing

The story of anything new often begins with one or a small number of people, like Roy Rosenzweig, who advanced a craft without caring about the routine and the normal. In the long run, however, for new ideas and methods to last, they have to find a way to exist beyond the founders, and beyond those who follow the founders. RRCHNM has now had three directors and hundreds of staffers, but similar centers have struggled or ceased to exist after the departure of their founders. This is perhaps the toughest, and final, aspect of institutionalization. It’s hard to lose someone like Roy. On the other hand, it’s another sign of his strong vision that the center he created was able to carry on and strengthen, now over a decade after he passed away.

Humility and Perspective-Taking: A Review of Alan Jacobs’s How to Think

In Alan Jacobs’s important new book How to Think: A Survival Guide for a World at Odds, he locates thought within our social context and all of the complexities that situation involves: our desire to fit into our current group or an aspirational in-group, our repulsion from other groups, our use of a communal (but often invisibly problematic) shorthand language, our necessarily limited interactions and sensory inputs. With reference to recent works in psychology, he also lays bare our strong inclination to bias and confusion.

However, Jacobs is not by trade a social scientist, and having obsessed over many of the same works he has (Daniel Kahneman’s Thinking, Fast and Slow looms large for both of us), I find it a relief to see a humanist address the infirmity of the mind, with many more examples from literature, philosophy, and religion, and with a plainspoken synthesis of academic research, popular culture, and politics.

How to Think is much more fun than a book with that title has the right to be. Having written myself about the Victorian crisis of faith, I am deeply envious of Jacobs’s ability to follow a story about John Stuart Mill’s depression with one about Wilt Chamberlain’s manic sex life. You will enjoy the read.

But the approachability of this book masks only slightly the serious burden it places on its readers. This is a book that seeks to put us into uncomfortable positions. In fact, it asks us to assume a position from which we might change our positions. Because individual thinking is inextricably related to social groups, this can lead to exceedingly unpleasant outcomes, including the loss of friends or being ostracized from a community. Taking on such risk is very difficult for human beings, the most social of animals. In our age of Twitter, the risk is compounded by our greater number of human interactions, interactions that are exposed online for others to gaze upon and judge.

So what Jacobs asks of us is not at all easy. (Some of the best passages in How to Think are of Jacobs struggling with his own predisposition to fire off hot takes.) It can also seem like an absurd and unwise approach when the other side shows no willingness to put themselves in your shoes. Our current levels of polarization push against much in this book, and the structure and incentives of social media are clearly not helping.

Like any challenge that is hard and risky, overcoming it requires a concerted effort over time. Simple mental tricks will not do. Jacobs thus advocates for, in two alliterative phrases that came to mind while reading his book, habits of humility and practices of perspective-taking. To be part of a healthy social fabric—and to add threads to that fabric rather than rend it—one must constantly remind oneself of the predisposition to error, and one must repeatedly try to pause and consider, if only briefly, the source of views one is repulsed by. (An alternative title for this book could have been How to Listen.)

Jacobs anticipates some obvious objections. He understands that facile calls for “civility,” which some may incorrectly interpret as Jacobs’s project, are often just repression in disguise. Jacobs also notes that you can still hold strong views, or agree with your group much of the time, in his framing. It’s just that you need to have a modicum of flexibility and an ability to see past yourself and your group. Disagreements can then be worked out procedurally rather than through demonization.

Indeed, those who accept Jacobs’s call may not actually change their minds that often. What they will have achieved instead, in Jacobs’s most memorable phrase, is “a like-hearted, rather than like-minded,” state that allows them to be more neighborly with those around them and beyond their group. Enlarging the all-too-small circle of such like-hearted people is ultimately what How to Think seeks.

Roy’s World

In one of his characteristically humorous and self-effacing autobiographical stories, Roy Rosenzweig recounted the uneasy feeling he had when he was working on an interactive CD-ROM about American history in the 1990s. The medium was brand new, and to many in academia, superficial and cartoonish compared to a serious scholarly monograph.

Roy worried about how his colleagues and others in the profession would view the shiny disc on the social history of the U.S., and his role in creating it. After a hard day at work on this earliest of digital histories, he went to the gym, and above his treadmill was a television tuned to Entertainment Tonight. Mary Hart was interviewing Fabio, fresh off the great success of his “I Can’t Believe It’s Not Butter” ad campaign. “What’s next for Fabio?” Hart asked him. He replied: “Well, Mary, I’m working on an interactive CD-ROM.”

Roy Rosenzweig

Ten years ago today Roy Rosenzweig passed away. Somehow it has now been longer since he died than the period of time I was fortunate enough to know him. It feels like the opposite, given the way the mind sustains so powerfully the memory of those who have had a big impact on you.

The field that Roy founded, digital history, has also aged. So many more historians now use digital media and technology to advance their discipline that it no longer seems new or odd like an interactive CD-ROM.

But what hasn’t changed is Roy’s more profound vision for digital history. If anything, more than ever we live in Roy’s imagined world. Roy’s passion for open access to historical documents has come to fruition in countless online archives and the Digital Public Library of America. His drive to democratize not only access to history but also the historical record itself—especially its inclusion of marginalized voices—can be seen in the recent emphasis on community archive-building. His belief that history should be a broad-based shared enterprise, rather than the province of the ivory tower, can be found in crowdsourcing efforts and tools that allow for widespread community curation, digital preservation, and self-documentation.

It still hurts that Roy is no longer with us. Thankfully his mission and ideas and sensibilities are as vibrant as ever.

Introducing the What’s New Podcast


My new podcast, What’s New, has launched, and I’m truly excited about the opportunity to explore new ideas and discoveries on the show. What’s New will cover a wide range of topics across the humanities, social sciences, natural sciences, and technology, and it is intended for anyone who wants to learn new things. I hope that you’ll subscribe today on iTunes, Google Play, or SoundCloud.

I hugely enjoyed doing the Digital Campus podcast that ran from 2007 to 2015, and so I’m thrilled to return to this medium. Unlike Digital Campus, which took the format of a roundtable with several colleagues from George Mason University, on What’s New I’ll be speaking largely one-on-one with experts, at Northeastern University and well beyond, to understand how their research is changing our understanding of the world, and might improve the human condition. In a half-hour podcast you’ll come away with a better sense of cutting-edge scientific and medical discoveries, the latest in public policy and social movements, and the newest insights of literature and history.

I know that the world seems impossibly complex and troubling right now, but one of the themes of What’s New is that while we’re all paying closer attention to the loud drumbeat of social media, there are people in many disciplines making quieter advances, innovations, and creative works that may enlighten and help us in the near future. So if you’re looking for a podcast with a little bit of optimism to go along with the frank discussion of the difficulties we undoubtedly face, What’s New is for you.

Age of Asymmetries

Cory Doctorow’s 2008 novel Little Brother traces the fight between hacker teens and an overactive surveillance state emboldened by a terrorist attack in San Francisco. The novel details in great depth the digital tools of the hackers, especially the asymmetry of contemporary cryptography. Simply put, today’s encryption is based on mathematical functions that are really easy in one direction—multiplying two prime numbers to get a large number—and really hard in the opposite direction—figuring out the two prime numbers that were multiplied together to get that large number.
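To make that asymmetry concrete, here is a minimal sketch in Python (illustrative only, with toy-sized primes rather than anything a real cryptosystem would use): the multiplication is a single step, while the naive reverse search has to grind through roughly a million candidates even at this small scale.

```python
# Illustrative sketch of the asymmetry: multiplying two primes is instant,
# while recovering them from the product by trial division takes far longer.
# (Toy-sized numbers; real cryptography uses vastly larger primes and is
# attacked with much cleverer algorithms than this.)
import time

p, q = 1_000_003, 1_000_033        # two modest primes
n = p * q                          # the "easy" direction: one multiplication

def factor_by_trial_division(n):
    """Recover the two factors of n the slow, obvious way."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i, n // i
        i += 1
    return n, 1

start = time.perf_counter()
factors = factor_by_trial_division(n)   # the "hard" direction
elapsed = time.perf_counter() - start

print(f"{p} x {q} = {n}")
print(f"factored back into {factors} in {elapsed:.3f} seconds")
```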

Doctorow’s speculative future also contains asymmetries that are more familiar to us. Terrorist attacks are, alas, all too easy to perpetrate and hard to prevent. On the internet, it is easy to be loud and to troll and to disseminate hate, and hard to counteract those forces and to more quietly forge bonds.

The mathematics of cryptography are immutable. There will always be an asymmetry between that which is easy and that which is hard. It is how we address the addressable asymmetries of our age, how we rebalance the unbalanced, that will determine what our future actually looks like.

Irrationality and Human-Computer Interaction

When the New York Times let it be known that their election-night meter—that dial displaying the real-time odds of a Democratic or Republican win—would return for Georgia’s 6th congressional district runoff after its notorious November 2016 debut, you could almost hear a million stiff drinks being poured. Enabled by the live streaming of precinct-by-precinct election data, the dial twitches left and right, pauses, and then spasms into another movement. It’s a jittery addition to our news landscape and the source of countless nightmares, at least for Democrats.

We want to look away, and yet we stare at the meter for hours, hoping, praying. So much so that, perhaps late at night, we might even believe that our intensity and our increasingly firm grip on our iPhones might affect the outcome, ever so slightly.

Which is silly, right?

*          *          *

Thirty years ago I opened a bluish-gray metal door and entered a strange laboratory that no longer exists. Inside was a tattered fabric couch, which faced what can only be described as the biggest pachinko machine you’ve ever seen, as large as a giant flat-screen TV. Behind a transparent Plexiglas front was an array of wooden pegs. At the top were hundreds of black rubber balls, held back by a central gate. At the bottom were vertical slots.

A young guy—like me, a college student—sat on the couch in a sweatshirt and jeans. He was staring intently at the machine. So intently that I just froze, not wanting to get in the way of his staring contest with the giant pinball machine.

He leaned in. Then the balls began dropping from the top at a measured pace, chaotically bouncing down the wall, hitting peg after peg until each fell into one of the slots at the bottom. A few minutes later, those hundreds of rubber balls had formed a perfectly symmetrical bell curve in the slots.
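(A minimal sketch in Python, assuming the machine behaves like a standard Galton board—the numbers of balls and pegs below are guesses, not the lab’s actual dimensions—shows why the pile settles into a bell curve: each ball makes a long series of fifty-fifty bounces, and the sums of those bounces stack up around the middle.)

```python
# Simulate balls bouncing down rows of pegs: at each peg a ball goes left
# or right with equal probability, and its final slot is the number of
# rightward bounces. Hundreds of such balls approximate a bell curve.
# (Ball and peg counts are illustrative guesses.)
import random
from collections import Counter

ROWS = 20        # rows of pegs each ball passes
BALLS = 500      # how many balls to drop

def drop_ball(rows=ROWS):
    """Return the slot a single ball lands in (number of rightward bounces)."""
    return sum(random.random() < 0.5 for _ in range(rows))

counts = Counter(drop_ball() for _ in range(BALLS))

for slot in range(ROWS + 1):
    print(f"slot {slot:2d}: {'#' * counts[slot]}")
```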

The guy punched the couch and looked dispirited.

I unfroze and asked him the only phrase I could summon: “Uh, what’s going on?”

“I was trying to get the balls to shift to the left.”

“With what?”

“With my mind.”

*          *          *

This was my first encounter with the Princeton Engineering Anomalies Research program, or PEAR. PEAR’s stated mission was to pursue an “experimental agenda of studying the interaction of human consciousness with sensitive physical devices, systems, and processes,” but that prosaic academic verbiage cloaked a far cooler description: PEAR was on the hunt for the Force.

This was clearly bananas, and also totally enthralling for a nerdy kid who grew up on Star Wars. I needed to know more. Fortunately that opportunity presented itself through a new course at the university: “Human-Computer Interaction.” I’m not sure I fully understood what it was about before I signed up for it.

The course was team-taught by prominent faculty in computer science, psychology, and engineering. One of the professors was George Miller, a founder of cognitive psychology, famous for observing that human short-term memory can hold only about seven items (plus or minus two). It also included engineering professor Robert Jahn, who had founded PEAR and had rather different notions of our mental capacity.

*          *          *

One of the perks of being a student in Human-Computer Interaction was that you were not only welcome to stop by the PEAR lab, but you could also engage in the experiments yourself. You would just sign up for a slot and head to the basement of the engineering quad, where you would eventually find the bluish-gray door.

By the late 1980s, PEAR had naturally started to focus on whether our minds could alter the behavior of a specific, increasingly ubiquitous machine in our lives: the computer. Jahn and PEAR’s co-founder, Brenda Dunne, set up several rooms with computers and shoebox-sized machines with computer chips in them that generated random numbers on old-school red LED screens. Out of the box snaked a cord with a button at the end.

You would book your room, take a seat, turn on the random-number generator, and flip on the PC sitting next to it. Once the PC booted up, you would type in a code—as part of the study, no proper names were used—to log each experiment. Then the shoebox would start showing numbers ranging from 0.00 to 2.00 so quickly that the red LED became a blur. You would click on the button to stop the digits, and then that number was recorded by the computer.

The goal was to try to stop the rapidly rotating numbers on a number over 1.00, to push the average up as far as possible. Over dozens of turns the computer’s monitor showed how far that average diverged from 1.00.
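For the statistically minded, here is a minimal sketch of that setup in Python, with made-up parameters rather than PEAR’s actual protocol: a fair generator producing values between 0.00 and 2.00 should average out very close to 1.00, and the z-score tells you how surprising any observed divergence is. A z-score beyond roughly two would happen by chance only about five percent of the time, which is the sense in which a sustained rise could count as statistically significant.

```python
# Simulate one session on a fair random-number generator and measure how
# far the session mean drifts from the expected 1.00. (The number of stops
# and the uniform distribution are assumptions, not PEAR's documented setup.)
import math
import random

def run_session(stops=100):
    """Simulate one session of button presses on a fair generator."""
    values = [random.uniform(0.0, 2.0) for _ in range(stops)]
    mean = sum(values) / stops
    # A Uniform(0, 2) draw has standard deviation 2 / sqrt(12).
    std_error = (2 / math.sqrt(12)) / math.sqrt(stops)
    z = (mean - 1.0) / std_error       # standard errors away from 1.00
    return mean, z

mean, z = run_session()
print(f"session mean = {mean:.3f}, z-score vs. 1.00 = {z:+.2f}")
```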

That’s a clinical description of the experiment. In practice, it was a half-hour of tense facial expressions and sweating, a strange feeling of brow-beating a shoebox with an LED, and some cursing when you got several sub-1.00 numbers in a row. It was human-computer interaction at its most emotional.

Jahn and Dunne kept the master log of the codes and the graphs. There were rumors that some of the codes—some of the people those codes represented—had discernable, repeatable effects on the random numbers. Over many experiments, they were able to make the average rise, ever so slightly but enough to be statistically significant.

In other words, there were Jedis in our midst.

Unfortunately, over several experiments—and a sore thumb from clicking on the button with increasing pressure and frustration—I had no luck affecting the random numbers. I stared at the graph without blinking, hoping to shift the trend line upwards with each additional stop. But I ended up right in the middle, as if I had flipped a coin a thousand times and gotten 500 heads and 500 tails. Average.

*          *          *

Jahn and Dunne unsurprisingly faced sustained criticism and even some heckling, on campus and beyond. When PEAR closed in 2007, all the post-mortems dutifully mentioned the editor of a journal who said he could accept a paper from the lab “if you can telepathically communicate it to me.” It’s a good line, and it’s tempting to make even more fun of PEAR these many years later.

The same year that PEAR closed its doors, the iPhone was released, and with it a new way of holding and touching and communicating with a computer. We now stare intently at these devices for hours a day, and much of that interaction is—let’s admit it—not entirely rational.

We see those three gray dots in a speech bubble and deeply yearn for a good response. We open the stocks app and, in the few seconds it takes to update, pray for green rather than red numbers. We go to the New York Times on election eve and see that meter showing live results, and more than anything we want to shift it to the left with our minds.

When asked by what mechanism the mind might be able to affect a computer, Jahn and Dunne hypothesized that perhaps there was something like an invisible Venn diagram, whereby the ghost in the machine and the ghost in ourselves overlapped ever so slightly. A permeability between silicon and carbon. An occult interface through which we could ever so slightly change the processes of the machine itself and what it displays to us seconds later.

A silly hypothesis, perhaps. But we often act like it is all too real.

What’s the Matter with Ebooks: An Update

In an earlier post I speculated about the plateau in ebook adoption. According to recent statistics from publishers we are now actually seeing a decline in ebook sales after a period of growth (and then the leveling off that I discussed before). Here’s my guess about what’s going on—an educated guess, supported by what I’m hearing from my sources and network.

First, re-read my original post. I believe it captured a significant part of the story. A reminder: when we hear about ebook sales, we mostly hear about sales from large publishers, and I have no doubt that ebooks are a troubled part of their sales portfolio. But there are many more ebooks than those reported by the publishers that release their stats, and many other ways to acquire them, so there’s a good chance that considerable “dark reading” (as I called it) accounts for the disconnect between surveys showing that e-reading is growing and sales (again, from the publishers that reveal these stats) that are declining.

The big story I now perceive is a bifurcation of the market between what used to be called high and low culture. For genre fiction (think sexy vampires) and other genres where there is a lot of self-publishing, readers seem to be moving to cheap (often 99 cent) ebooks from Amazon’s large and growing self-publishing program. Amazon doesn’t release its ebook sales stats, but we know that they already have 65% of the ebook market and through their self-publishing program may reach a disturbing 90% in a few years. Meanwhile, middle- and high-brow books for the most part remain at traditional publishers, where advances still grease the wheels of commerce (and writing).

Other changes I didn’t discuss in my last post are also happening that impact ebook adoption. Audiobook sales rose by an astonishing 40% over the last year, a notable story that likely impacts ebook growth, since for the vast majority of those with smartphones audiobooks are a substitute for ebooks (see also the growth in podcasts). In addition, ebooks have gotten more expensive in the past few years, while print (especially paperback) prices have become more competitive; for many consumers, a simple Econ 101 assessment of pricing accounts for the ebook stall.

I also failed to account in my earlier post for the growing buy-local movement that has impacted many areas of consumption—see vinyl LPs and farm-to-table restaurants—and is, in part, responsible for the turnaround in bookstores—once dying, now revived—an encouraging trend pointed out to me by Oren Teicher, the head of the American Booksellers Association. These bookstores were clobbered by Amazon and large chains late last decade but have recovered as the buy-local movement has strengthened and (more behind the scenes, but just as important) they adopted technology and especially rapid shipping mechanisms that have made them more competitive.

Personally, I continue to read in both print and digitally, from my great local public library and from bookstores, and so I’ll end with an anecdotal observation: there’s still a lot of friction in getting an ebook versus a print book, even though one would think it would be the other way around. Libraries still have poor licensing terms from publishers that treat digital books like physical books that can only be loaned to one person at a time despite the affordances of ebooks; ebooks are often not that much cheaper, if at all, than physical books; and device-dependency and software hassles cause other headaches. And as I noted in my earlier post, there’s still not a killer e-reading device. The Kindle remains (to me and I suspect many others) a clunky device with a poor screen, fonts, etc. In my earlier analysis, I probably also underestimated the inertial positive feeling of physical books for most readers—which I myself feel as a form of consumption that reinforces the benefits of the physical over the digital.

It seems like all of these factors—pricing, friction, audiobooks, localism, and traditional physical advantages—are combining to restrict the ebook market for “respectable” ebooks and to shift them to Amazon for “less respectable” genres. It remains to be seen if this will hold, and I continue to believe that it would be healthy for us to prepare for, and create, a better future with ebooks.

The Digital Divide and Digital Reading: An Update

Last month I wrote an article for The Atlantic on the state of the digital divide, the surprisingly high rate of device (smartphone and tablet) adoption at all socio-economic strata, and what these new statistics mean for ebooks and reading. An excerpt:

According to Common Sense, 51 percent of teenagers in low-income families have their own smartphones, and 48 percent of tweens in those families have their own tablets. Note that these are their own devices, not devices they have to borrow from someone else. Among middle-income families (that is, between $35,000 and $100,000), 53 percent of tweens have their own tablets and 69 percent of teenagers have their own smartphones, certainly higher but by a lot less than one might imagine.

If we pull back and look at households in general, the gap narrows in other ways. This winter, the Joan Ganz Cooney Center at Sesame Workshop published the first nationally representative telephone survey of lower-income parents on issues related to digital connectivity. The study, conducted by the research firm SSRS, included nearly 1,200 parents with school-aged children, interviewed in both Spanish and English, via landlines and cell phones. It was weighted to be representative of the American population.

In this comprehensive survey, a striking 85 percent of families living below the poverty line have some kind of digital device, smartphone or tablet, in their household. Seventy-three percent had one or more smartphones, compared to 84 percent for families above the poverty line. These are vastly changed numbers from just a few years ago. A 2011 study by Common Sense showed that in lower-income (under $30,000) households with children, only 27 percent of them had a smartphone, compared to 57 percent for households with children and income over $75,000.

It’s worth pondering the significance of these new numbers, and how we might be able to leverage widespread device adoption to increase reading. My conclusion:

We must do everything we can to connect kids with books. Print books, ebooks, library books, bookstores—let’s have it all. Let’s give children access to books whenever and wherever, whether it’s a paperback in the backpack, or a phone in the back pocket.

[Read the full article at The Atlantic.]

Ken Burns and Mrs. Jennings

As the Chairman of the National Endowment for the Humanities, William Adams, noted at the beginning of last night’s Jefferson Lecture, Ken Burns was an extraordinarily apt choice to deliver this honorary talk in the celebratory 50th year of the Endowment. Tens of millions of Americans have viewed his landmark documentaries on the Civil War, jazz, baseball, and other topics pivotal to U.S. history and culture.

Burns began his talk with a passionate defense of the humanities. The humanities and history, by looking at bygone narratives and especially by listening to the voices of others from the past—and showing their faces in Burns’s films, as Chairman Adams helpfully highlighted—prod us to understand the views of others, and thus, we hope, expand our capacity for tolerance. We have indeed lost the art of seeing through others’ eyes—perspective-taking—to disastrous results online and off. It was good to hear Burns’s fiery rhetoric on this subject.

His sense that the past is still so very present, especially the deep scar of slavery and racism, was equally powerful. As Burns reminded us, the very lecture he was giving was named after a Founder and American president who owned a hundred people and who failed to liberate even one during his lifetime.

While there were many grand and potent themes to Burns’s lecture, and many beautiful and haunting phrases, in my mind the animating and central element in his talk was a personal story, and a person. And it is worth thinking more about that smaller history to understand Burns’s larger sense of history. (Before reading further, I encourage you to read the full lecture, which is now up on the NEH website.)

* * *

When Burns was just a small boy, only 9 years old, his mother became terminally ill with cancer, and the family needed help as their lives unraveled. His father hired Mrs. Jennings, an African-American woman who was literally from the other side of the tracks in Newark, Delaware. Burns clearly bonded strongly with Mrs. Jennings; he loved her as a “surrogate mother” and someone who loved him and stood strong for him in a time of great stress and uncertainty.

Then came a moment that haunts Burns to this day, a moment he admits to thinking about every week for over 50 years. His father took a job at the University of Michigan, in part so that his deteriorating wife could get medical care at the university hospital. The family would have to move. They packed up, and on the way out of town, took a final stop at Mrs. Jennings’ house. As Burns recounts the moment:

She greeted us warmly, as she always did, but she was also clearly quite upset and worried to see us go, concerned about our family’s dire predicament. Just as we were about to head off for the more than twelve-hour drive to our new home, Mrs. Jennings leaned into the back of the car to give me a hug and kiss goodbye. Something came over me. I suddenly recoiled, pressed myself into the farthest corner of the back seat, and wouldn’t let her.

Burns sees this moment, which he had never recounted publicly before last night and which immediately hushed the audience, as a horrific emergence of racism in his young self. Internalizing the “n-word” that was used all around him in the early 1960s, he couldn’t bring himself, at this crucial moment, to simply lean forward and hug and kiss Mrs. Jennings.

In this way, and in this story, Ken Burns’s Jefferson lecture was, perhaps more than anything, a plea for forgiveness. In the largely white audience, you could sense, at that tense, core moment of his talk, the self-recognition of those in the darkness, who knew that they, too, had had moments like Ken’s—a deep-seated inability to treat a black friend or colleague or neighbor with the humanity they deserved and desired.

* * *

Upon further reflection, I think there is something in the story of Ken Burns and Mrs. Jennings that Burns may not have fully articulated, but that, even through his painful self-criticism, he may understand.

That moment of “recoil” is, I believe, more emotionally complex. Undoubtedly it includes the terrible mark of racism that Burns identified. But he was also a 9-year-old boy whose mother was dying, who was being driven away from his childhood home, the address of which he still remembers by heart as a 62-year-old.

Young children respond to intensely stressful moments in ways that adults cannot understand. Surely Ken’s recoil also included feelings of not wanting to leave, not wanting to acknowledge that he was being driven away from all that he knew, with another, certain, grim loss on the horizon. Perhaps most of all, Ken didn’t want to be separated from someone he deeply loved as a human being: Mrs. Jennings. Kids don’t have the same coping mechanisms or situational behavior that adults have. Sometimes when they don’t want to affirm the horror of their present, they retreat into themselves. I hope that Ken Burns can let that possibility in, and begin to forgive himself, as much as he wishes that Mrs. Jennings and his father, who lashed out at him for his recoil, could return and do the forgiving.

If he can begin to forgive himself and recognize the complex feelings of that moment, then the story of Ken Burns and Mrs. Jennings can serve as both an example of the cruel, ongoing impact of racism in the United States, and also as a source of how change happens, albeit all too slowly. Surely Ken Burns’s unconscious reflection on this moment with Mrs. Jennings has been writing itself, subliminally, into his documentaries, and through them, into our own views of American history.

Burns mentioned toward the end of the lecture how African-American pioneers and geniuses such as Louis Armstrong and Jackie Robinson changed the racial views of many white Americans. But just as important, and perhaps more so, are the more complicated, daily interactions such as that between boyhood Ken Burns and Mrs. Jennings, experiences in which cold, dehumanizing stereotyping battles warm, humanizing sentiment. It takes constant work from us all for the latter to win.

[With thanks to my always insightful wife for our conversation about the lecture.]