In 1915, Gabrielle Darley killed a New Orleans man who had tricked her into a life of prostitution. She was tried, acquitted of murder and within a few years was living a new life under her married name, Melvin. Then a blockbuster movie, The Red Kimona, splashed her sensational story across America’s silver screens.
The 1925 film used Darley’s real name and details of her life taken from transcripts of the murder trial. She sued for invasion of privacy and won.
In deciding in favour of Darley, a California court said that people have a right to rehabilitation. “We should permit [people] to continue in the path of rectitude rather than throw [them] back into a life of shame or crime,” the court said. It is a sentiment that is harder to put into practice today, when information is much more readily available. Nonetheless, policymakers and media outlets are looking at the issue.
As a scholar of media history and law, I see Darley’s story as more than an interesting slice of legal and cinematic history. Her case provides an early example of how private people struggle to escape their pasts and how the idea of privacy is linked to rehabilitation.
‘Unpublishing’ old news
Protecting privacy for the sake of rehabilitation is much harder today, with information only a click away on the internet. Amid concern that the availability of past indiscretions may now be a permanent barrier to employment, some news organisations are, on request, taking down old stories about small crimes by private people.
The Cleveland Plain Dealer adopted such a policy in 2018.
“Not a week goes by anymore, it seems, that we don’t hear from people who are blocked from improving their lives by … stories about their mistakes in Google searches of their names,” explained Plain Dealer editor Chris Quinn at the time.
Earlier this year, the Boston Globe announced it too would be “unpublishing” old information as part of its “Fresh Start” programme. The intention is to “address the lasting impact that stories about past embarrassments, mistakes, or minor crimes, forever online and searchable, can have on a person’s life,” the newspaper said. Other newspapers, such as the Bangor Daily News, have begun similar programmes.
These voluntary efforts are in step with one of the main tenets of the Society of Professional Journalists’ code of ethics: to “minimise harm”. But they also come at a time when the news media is examining how it has served Black and Hispanic communities. The industry has long suffered a racial gap, with minorities underrepresented in the newsroom.
There is growing concern that this has affected coverage, and that the reporting of local crime has been racially biased. Crime reporting has tended to rely too heavily on police contacts and police explanations of events. And in a country where Black and Hispanic men and women are disproportionately criminalised, it contributes to negative stereotyping of minorities.
This push to allow people involved in minor crimes to move on with their lives by scrubbing news reports seems to contradict the principle of freedom of information.
However, the legal system already recognises important exceptions to public access: confidential pretrial negotiations, for example, and trials of juvenile offenders, which are closed to help shield a young offender’s rehabilitation.
There are also ethical exceptions to the publication of criminal incident information. For example, ethical journalists do not publish the names of witnesses to crimes or survivors of sexual assaults. But this is voluntary. The courts have said that the First Amendment protects journalists who do publish these names.
The new dimension in this controversy is the longevity of this information on the internet and the ease of accessing it. Victims and offenders remain in the public eye long after any useful purpose has been served.
Removal on request
In contrast to the voluntary programmes at US news organisations like the Boston Globe and Cleveland Plain Dealer, the European Union has enacted broad privacy regulations. These began with the Data Protection Directive of 1995 and culminated in the General Data Protection Regulation, which took effect in 2018. One provision, the “right to erasure”, allows individuals to ask that search engine links be delisted. It applies when the information is outdated, involves minor issues or is irrelevant to the public interest and potentially harmful to the individual.
The right to be forgotten was established in May 2014, when the EU Court of Justice, ruling under the 1995 directive, decided the case brought by Mario Costeja González against Google Spain. González had sued to have Google delist information about a forced auction held to pay his debts. The court ordered that the links be erased from Google’s results, but it specifically exempted the original publication by La Vanguardia, a daily newspaper in Barcelona. Although Google argued against the delisting requirement, the court held that Google is a “data controller” and not a news organisation that would be protected under the Charter of Fundamental Rights of the EU.
Since then, Google Europe has complied with the ruling. To date it has received more than 1 million requests to delist close to 4 million links, according to Google’s own data. Over 88% of requests have come from private individuals, and news items make up about 20% of the URLs flagged for removal. Almost half of the flagged links have been removed by the company after review.
The right to be forgotten has sparked concern over “erasing” history. But neither the EU regulation nor the voluntary US programmes are aimed at protecting public figures, or those who have committed serious crimes.
The question in the US is whether the preliminary efforts toward self-regulation by the newspaper industry are sufficient in the long run, or whether a delisting privacy law may be warranted.
The principle at the heart of the Red Kimona court decision a century ago was that everyone deserves the chance for rehabilitation. Darley was acquitted of murder, and at the end of the movie, she symbolically threw away her red kimono and moved on to a better life.
But that kind of journey is a lot harder when the public is just a click away from your past life – a fact that poses a conundrum for media organisations, search engines and regulators alike.