
Erasing History


One of my favorite coffee-table books is an odd volume called “The Commissar Vanishes,” a portfolio of doctored photographs from Stalin’s Russia. When Stalin purged one of his fellow Bolsheviks, the comrade who fell from favor was duly cropped or airbrushed out of official photographs. “The Commissar Vanishes” juxtaposes the before and after. Here is the party stalwart grinning alongside Lenin in Red Square; and now — poof! — he’s gone. Person, un-person. History, un-history.

For a contemporary take on the subject of un-history, I take you now to a lawsuit scheduled for argument next month in a Connecticut courtroom. The case tests the proposition that in America in the Internet age, there are benign, even humane reasons that sometimes history should be erased.

Connecticut has a law that allows people accused of crimes to expunge the official record if a case is dismissed. Most states have some version of expungement laws, or erasure laws as they are sometimes called. They are intended to let those whose cases have been dropped or overturned get on with their lives, unencumbered by the taint of arrest. Thus, under the Connecticut law, any person whose record is erased “shall be deemed to have never been arrested” and “may swear so under oath.”

Lorraine Martin, a nurse in Greenwich, was arrested in 2010 with her two grown sons when police raided her home and found a small stash of marijuana, scales and plastic bags. The case against her was tossed out when she agreed to take some drug classes, and the official record was automatically purged. It was, the law seemed to assure her, as if it had never happened.

But Martin found that when she applied for jobs that should have been well within her reach, she got the cold shoulder. She Googled herself and discovered what any vigilant employer would have seen: stories still sitting in online news archives with headlines like “Mother and sons charged with drug offenses.”

“It’s essentially a scarlet letter,” her lawyer, Mark Sherman, told me. “She’s become unemployable in spite of the fact that she has no criminal arrest record.”

So Martin filed a class action against local news outlets, claiming that they had defamed her and everyone in a similar situation. Defamation is the publication of information that is both damaging and false. The arrest story was obviously true when it was first published. But Connecticut’s erasure law has already established that truth can be fungible. Martin, her suit says, was “deemed never to have been arrested.” And therefore the news story had metamorphosed into a falsehood.

There are passages in the court briefs that make you think the lawyers were possessed by the ghost of Lewis Carroll. They debate the difference between “historical fact” and “legal fact.” They dispute whether something that was true when it happened can become not just private but actually untrue, so untrue you can swear an oath that it never happened and, in the eyes of the law, you’ll be telling the truth. Several pages and copious footnotes are devoted to considering what the meaning of “publish” is. Martin’s lawyers insist that every time a search engine delivers the old story to a new reader, it amounts to republishing, and constitutes a new libel. The defending news companies say that is ridiculous.

The plaintiff’s brief concedes that the suit is “novel,” and most lawyers I talked to predicted the case would probably be dismissed. It seems to collide head-on with the First Amendment. The closest thing I could find to a similar case, in New Jersey’s Supreme Court, was thrown out with a ruling that suggested the plaintiff’s logic was “Orwellian.”

But the dilemma underlying this case is real, and not so simple. The Connecticut case is just one manifestation of an anxious backlash against the invasive power of the Internet, a world of Big Data and ever more powerful search engines, in which it seems almost everything is permanently recorded and accessible to almost anyone — potential employers, landlords, dates, predators. In Europe, where press freedoms are less sacred and the right to privacy is more ensconced, the idea has taken hold that individuals have a “right to be forgotten,” and those who want their online particulars expunged tend to have the government on their side. In Germany or Spain, Lorraine Martin might have a winning case.

I sense that the idea is gaining traction here. Erasure laws seem to be proliferating. States feel greater pressure to put public records offline. (After a New York newspaper published names and addresses of local handgun permit-holders, the Legislature in Albany sharply limited access to that information.) Google’s latest transparency report shows a sharp rise in requests from governments and courts to take down potentially damaging material. Editors tell me they are increasingly beset by readers who once cooperated with a reporter on a sensitive subject — nudism, anorexia, bullying — and years later find that old story a recurring source of distress. (It’s called “source remorse.”)

Greg Brock, who handles reader complaints for The Times, says he now gets about four pleas a week from readers who want something purged. Most involve items from the police blotter, though he also gets requests to delete announcements of weddings that have since ended in ugly divorce. The most heartbreaking appeal he has heard involved the story of a toddler who apparently mistook her newborn twin siblings for dolls, pulled them from their cribs to play with them and inadvertently killed them. Nearly 30 years later, the toddler is now a teacher. When her students plugged her name into Google, the first thing that popped up was a headline tying her to the death of the twins.

As if that were not enough, there is a growing flock of buzzards to feed on your distress. Mugshots.com, for example, collects and posts arrest records from all over, complete with lurid mug shots, and then offers to redact your information — for $399.

However understandable the yearning to escape painful memories, The Times’s policy is not to censor history, because it’s history. The paper will update an arrest story if presented with evidence of an acquittal or dismissal, completing the story rather than deleting it.

Some papers have compromised by agreeing in certain instances to insert a bit of code in online articles that prevents them from being fetched by the major search engines. The stories can still be found in the paper’s digital archive, just as they can be found in bound volumes at the local library, but they do not show up on Google. The Hearst Corporation, a defendant in that Connecticut libel suit, is experimenting with such a program. The Times considered a similar policy a few years ago, and we decided it was a slippery slope. But perhaps it’s time to reopen that discussion.
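The “bit of code” in question is typically a robots “noindex” directive, delivered either as a meta tag in the article page or as an X-Robots-Tag HTTP header. Purely as a minimal sketch (assuming a hypothetical Flask-based archive server for illustration, not any news organization’s actual system), it might look like this:

```python
# Minimal sketch of the de-indexing compromise described above (hypothetical
# Flask-based archive server): the story is still served to readers, but a
# "noindex" robots directive asks major search engines not to list it.
from flask import Flask, make_response

app = Flask(__name__)

ARCHIVED_STORY_HTML = """<!doctype html>
<html>
  <head>
    <!-- The "bit of code": a robots meta tag asking crawlers not to index this page -->
    <meta name="robots" content="noindex">
    <title>Archived story</title>
  </head>
  <body>
    <p>The story remains readable in the archive, just not in search results.</p>
  </body>
</html>"""

@app.route("/archive/<story_id>")
def archived_story(story_id):
    response = make_response(ARCHIVED_STORY_HTML)
    # The same directive sent as an HTTP header, for crawlers that fetch headers only
    response.headers["X-Robots-Tag"] = "noindex"
    return response

if __name__ == "__main__":
    app.run()
```

The design point is the one the papers are weighing: the page is not deleted or altered, it simply stops surfacing when someone types a name into a search box.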

A number of privacy protection firms use another technique. They prepare packets of non-damaging information about their clients — academic records, philanthropic work — and “optimize” them so they pop up in search-engine rankings before the more embarrassing stuff.

Last month Owen Tripp, a co-founder of Reputation.com, which has made a business out of helping clients manage their digital profile, advocated a “right to be forgotten” in a YouTube video. Tripp said everyone is entitled to a bit of space to grow up, to experiment, to make mistakes.

“How do we give people a chance to go back and to edit, to trim around the edges, or at least drop the veil across those things that are most private?” he asked.

“This is not just a privacy problem,” said Viktor Mayer-Schönberger, a professor at the Oxford Internet Institute, and author of “Delete: The Virtue of Forgetting in the Digital Age.” “If we are continually reminded about people’s mistakes, we are not able to judge them for who they are in the present. We need some way to put a speed-brake on the omnipresence of the past.”

What that brake might be, he says, is not entirely clear. He is wary of legislation, but would like to see search engine companies — the parties that benefit the most financially from amassing our information — offer the kind of reputation-protecting tools that are now available only to those who can afford paid services like those of Reputation.com. Google, he points out, already takes down five million items a week because of claims that they violate copyrights. Why shouldn’t we expect Google to give users an option — and a simple process — to have news stories about them down-ranked or omitted from future search results? Good question. What’s so sacred about a search algorithm, anyway? After all, nobody ever called Google the first draft of history.