They thought closure had already come. Years ago, they had fought petty lawsuits or administrative orders in Spanish courts. Their cases were settled or dismissed, and the regrettable episodes were forgotten, consigned to the yellowing pages of newspapers and dusty government records. But as those archives moved online, some Spaniards discovered that these distant chapters of their lives had become top search results for their names. Long-dead legal disputes—accusations of medical malpractice, a university-age arrest—were now embarrassing first impressions.
In January 2011, after receiving complaints from ninety people about old documents revealed online, Spain’s Data Protection Agency ordered Google to remove the incriminating websites from its search database. The watchdog invoked Spanish citizens’ “right to be forgotten”—a relatively new legal concept that draws on a bundle of pre-existing privacy rules. Although it has yet to be upheld in court, the “right to be forgotten” is increasingly seen as a novel way to manage personal information on the internet.
Google challenged the order, arguing that it should not be forced to serve as a censor, and Spanish courts referred the dispute to the European Court of Justice, the European Union’s highest judicial body. If the court rules along lines favoured by the Spanish agency, websites all over Europe could face similar orders—assuming EU legislators don’t act first; they are already weighing a proposal to extend the right to be forgotten to the rest of the union. The ethos of the information age—more data means more progress—may be facing its most serious challenge yet.
In a decade defined by WikiLeaks and 9/11 remembrance, it may seem as if we’ve always fought to pry open secrets and preserve memories. But the impulse to forget is also a recurring feature of Western history. “Remembering is only a new form of suffering,” Baudelaire lamented; “history,” groaned Joyce’s Stephen Dedalus, “is a nightmare from which I am trying to awake.” Inconvenient truths have often gone unaddressed: martial monuments, from Trajan’s Column to the Arc de Triomphe, valued vainglory over reflection; silence about erstwhile iniquities—like slavery, after the American Civil War—was seen as the price of peace.
Attitudes swung sharply in memory’s favour after World War II, a conflict so terrible that many thinkers could not accept Western Europe’s amnesiac return to normalcy. Artists and critics championed remembrance as a source of moral purity; testimonial works by Holocaust survivor Elie Wiesel and gulag diarist Aleksandr Solzhenitsyn earned their authors Nobel Prizes. Commemorations of war, like Maya Lin’s minimalist Vietnam Veterans Memorial in Washington, DC, increasingly focused on the grieving process.
Belief in the cleansing power of recollection infiltrated politics as deeply as culture. War-crimes tribunals multiplied, providing stages for grief as much as accountability. Truth and Reconciliation Commissions—first established after the fall of repressive Latin American dictatorships and, more famously, in South Africa after the end of apartheid—sprouted worldwide, prescribing testimonial unburdening after periods of injustice. In 1985, on the fortieth anniversary of the end of World War II, West German president Richard von Weizsäcker summarized the impulse behind the memory-laden zeitgeist: The secret of redemption, he intoned, “lies in remembrance.”
Then came the internet. The web makes it clear that not all memories are redeeming—consider the Spaniards, or the US Library of Congress’ plan to archive, in perpetuity, all the instant reactions and drunken ephemera dashed off on Twitter. So much remembering may make us less free. “The fact that the internet never seems to forget is threatening, at an almost existential level, our ability to control our identities,” writes Jeffrey Rosen in a 2010 New York Times Magazine cover story, titled “The Web Means the End of Forgetting.”
Partial remedies have arisen almost as quickly as the new landscape of permanent remembrance itself. In his article, Rosen cites scholar Viktor Mayer-Schönberger’s suggestion of expiration dates for online information; Harvard internet-law expert Jonathan Zittrain’s proposed “reputation bankruptcy,” which would let people wipe their search-engine slates clean; and ReputationDefender (now Reputation.com), a company that, for a fee, floods the net with mentions which turn the algorithmic tide in clients’ favour.
So far, the Canadian and American governments have been more reluctant to take action than their European counterparts. Free speech isn’t their only concern—transparency is, too. Last February, the Association of Canadian Archivists sponsored a symposium, subtitled “The Right to Be Forgotten, The Duty to Remember,” that discussed, among other issues, the tension between personal privacy and government accountability. Allowing aggrieved individuals to conceal public records that mention them also makes it difficult for other citizens to monitor overall government activity—and hold elected officials to account. In countries with a history of official secrecy, especially, upholding the “duty to remember” is a source of legitimacy; their governments’ own reputations depend on operating as an open book.
Yet the internet has even more prying eyes than most repressive regimes—one reason why Europe is erring on the side of personal protection. Canada’s privacy commissioner, Jennifer Stoddart, has hinted that she supports the right to be forgotten. Calling the internet’s endless memory a “challenge for humanity,” she’s argued that today’s web users still need “space in which to grow up, to flirt with different ideas,” without those experiments tying them down later in life.
The backlash against excessive remembrance has now gone far beyond concerns about personal history. Some argue that privileging the past doesn’t only affect individuals—it’s counterproductive for society as a whole. Last May, a New York University symposium called “Second Thoughts on the Memory Industry” gathered prominent advocates of remembrance, including graphic novelist Art Spiegelman (author of the Holocaust memoir Maus) and journalist Philip Gourevitch (known for chronicling Rwanda’s genocide), asking them to reconsider the effects of their work. The event poster outlined a few choice subjects, ranging from “competitive victimization” to “Holocaust tourism; genocide Olympics; memory as avoidance and distraction; the instrumentalization of guilt; retribution” and, simply, “memory fatigue.” “Come on, get over it,” the poster pleaded.
At the symposium, historian Timothy Snyder argued that too much “bearing witness” to the past meant draining focus from contemporary concerns like climate change. Others claimed different memories could cannibalize one another in their quest for attention. Cartoonist Ben Katchor and musician Mark Mulcahy presented a comic opera, Memorial City, about a town whose residents go mad building too many memory totems. It wasn’t far from reality: in Berlin, three Holocaust memorials within walking distance of one another compete for tourists.
Kanan Makiya—founder of the Iraq Memory Foundation, which records and archives Iraqis’ experiences of Baathist rule—has also cautioned against unrestrained remembrance, observing that, without critical distance, memory can provoke emotional trauma and distorted impressions. That points toward perhaps the most serious consequence of elevating remembrance, one critics have scantly addressed: we can end up reproducing the very catastrophes we’ve vowed never to forget.
Political leaders have long used painful memories to manipulate citizens. Zimbabwe’s president, Robert Mugabe, recounts colonial atrocities to maintain support; the African National Congress, South Africa’s oft-corrupt ruling party, depicts opponents as agents of apartheid. The Holocaust’s omnipresence in Israel’s historical memory means its government perceives even minor threats as apocalyptic—a paranoia that makes compromise seem impossible. American cheerleaders of the Iraq War didn’t hesitate to accuse critics, however illogically, of forgetting what had happened to the Twin Towers. The Iraq Memory Foundation’s Makiya, too, initially rushed to support the invasion. The abuses he’d recorded were, no doubt, weighing on his mind.
Yet memory’s most fatal attractions lie at a much deeper level. Writer Ian Buruma, who spoke at the NYU symposium, has claimed communities now form through a “quasi-religious identification with shared suffering.” In other words, memory is scripture for a secular society; spiritual vacuity has been filled by a devotional attachment to bygone, seemingly more serious times. Horror can become conflated with poignancy, vengeance with meaning.
Subconsciously, the cult of memory demands more of what would be honoured and memorialized: more suffering, more sacrifice, more war. The ever-present past, available online, renders these processes unremitting. Never-ending memory makes it impossible to forgive, both between individuals and across borders. Like common search terms, history’s cycles of suspicion and recrimination now autocomplete.
For all the evidence of memory’s troubling impact on politics and culture, though, the exposure of our personal histories still rankles more. Take Günter Grass: the Nobel-winning novelist relentlessly reminded his fellow Germans of their country’s shameful past, revealing only in 2006 that he himself had served in the Waffen-SS, the armed wing of the organization responsible for Nazism’s worst crimes. For decades, memory’s most powerful advocate had stopped short of fully confessing his own history.
When search results cast our names in an unflattering light, we feel as if we’ve been singled out for our mistakes. Claiming a personal “right to be forgotten” may just be a way to promote our side of the story. But so, too, is our reverence for collective memory. Its slipperiness often protects the guilty better than forgetting ever could: it helps them shift the blame.