
Evidence of war crimes and more lost as social media companies scrub posts

Social media platforms work to remove offensive and criminal posts and photographs from their sites for obvious reasons. But in the process, these platforms are hindering the work of investigators seeking to prosecute war criminals, pedophile rings and other crimes against humanity, according to a new report from Human Rights Watch.

Without properly archiving these posts and making them accessible to researchers and investigators, the platforms are removing photographs, videos and other media content central to identifying perpetrators and prosecuting war crimes and other international crimes, the report found.

[Photo: a screen from the Facebook app seen on an iPhone in front of a Facebook logo]

Such content is regularly used by the International Criminal Court and in European court proceedings to help document atrocities and prosecute those who commit them. That includes chemical weapons attacks in Syria, security crackdowns in Sudan and police abuse in the United States.

RELATED: Domestic terrorism prosecutions reach all-time high

RELATED: War crimes investigations aided by artificial intelligence

Human Rights Watch investigates and reports on abuses happening across the globe.

Because these posts violate community guidelines or platform standards, the companies use algorithms to identify the content and remove it. Some platforms block such posts from being uploaded at all.

Access is then denied or lost because the posts and photographs are deleted rather than archived.
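To make the mechanics concrete, here is a minimal sketch, in Python, of the flag-and-delete flow the report describes: a hash check against known violating media (blocking some uploads outright) plus a classifier score, with deletion as the only outcome. Everything here is illustrative; it is not any platform's actual system, and the classifier and hash list are stand-ins.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    media: bytes

# Hash list of media already known to violate policy. Real platforms use
# perceptual hashes shared across the industry; exact SHA-256 matching is a
# simplification for this sketch.
KNOWN_VIOLATING_HASHES: set[str] = set()

def classifier_score(text: str) -> float:
    """Stand-in for a trained model that scores policy-violation likelihood."""
    flagged_terms = ("execution footage", "beheading")  # toy heuristic only
    return 1.0 if any(t in text.lower() for t in flagged_terms) else 0.0

def moderate(post: Post, store: dict) -> None:
    media_hash = hashlib.sha256(post.media).hexdigest()
    if media_hash in KNOWN_VIOLATING_HASHES or classifier_score(post.text) > 0.9:
        # The post is deleted outright. Nothing is archived: once this line
        # runs, investigators have no copy left to request.
        store.pop(post.post_id, None)
```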

“Governments globally have encouraged this trend, calling on companies to take down content as quickly as possible, particularly since March 2019, when a gunman live-streamed his attack on two mosques in Christchurch, New Zealand that killed 51 people and injured 49 others,” the report states.

“Companies are right to promptly remove content that could incite violence, otherwise harm individuals, or jeopardize national security or public order. But the social media companies have failed to set up mechanisms to ensure that the content they take down is preserved, archived, and made available to international criminal investigators,” Human Rights Watch said.

[Photo: Belkis Wille]

“These take-downs and lack of access are having significant effects for non-law enforcement,” said Belkis Wille, Senior Crisis & Conflict Researcher for Human Rights Watch and author of the report.

“So, that means researchers from Human Rights Watch and other organizations, academics, journalists, as well as international investigators with The United Nations and other international justice mechanisms” lose access. “We know this because, during the interviews for this report, I spoke to dozens of war crimes investigators who were talking about how more and more of the basis of their investigations are pieces of content that are posted online. And when those pieces of content have come down because they hadn’t been able to get access to them from the companies, it stopped them in their tracks.”

Most countries allow national law enforcement to use warrants, subpoenas and court orders to compel companies to release the content, but international investigators have much more limited access.

Wille said there is no way to know how much evidence is being lost as a result of these practices. Artificial intelligence systems are taking content down, in some instances before those investigating crimes even know it exists.

“I can’t give you a smoking gun case where there was an investigation that fell apart because the content has come down,” she said. “The content is coming down so quickly, you don’t know what you don’t know and you don’t know what you are missing. Investigators say there are many crimes they could have pursued, but because the content came down, they could not know the crimes even existed.”

Human Rights Watch and numerous other organizations have met with YouTube, Facebook and Twitter, companies that are at least willing to discuss the problem and consider solutions. Still, so far, only U.S.-based companies have been ready to go even that far, Wille said.

“We haven’t engaged with TikTok and others, which might pose their own challenges because they are not U.S.-based,” she said. “So, the conversations around legal liability in transferring content might be different.”

To date, other than occasionally reposting these photographs and other evidence, social media platforms have done nothing to provide access to journalists or independent “civil society,” the report says.

“Independent civil society organizations and journalists have played a vital role in documenting atrocities in Iraq, Myanmar, Syria, Yemen, Sudan, the United States, and elsewhere — often when there were no judicial actors conducting investigations,” the report says. “In some cases, the documentation of organizations and the media has later triggered judicial proceedings. However, they also have no ability to access removed content.”

There have been instances where investigators have managed to copy content before it was removed, Wille said. But it remains an open question whether courts will accept those copies as evidence in place of the original posts.
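One common safeguard when capturing such copies is to record a cryptographic hash and capture metadata alongside the content, so the copy's integrity can later be argued in court. The sketch below is a hypothetical illustration of that practice; the function and field names are invented for this example and do not come from any forensic standard.

```python
import hashlib
import json
from datetime import datetime, timezone

def preserve(url: str, content: bytes, captured_by: str) -> dict:
    """Save a copy of online content with a hash and capture metadata."""
    record = {
        "source_url": url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "captured_by": captured_by,
        # The hash fixes the content at capture time; any later alteration
        # of the saved bytes would no longer match it.
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    with open(record["sha256"] + ".bin", "wb") as f:
        f.write(content)
    with open(record["sha256"] + ".json", "w") as f:
        json.dump(record, f, indent=2)
    return record
```

Even so, the hash only shows the saved file has not changed since capture, not that the capture itself was faithful, which is why, as Wille notes, the evidentiary question remains open.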

It is not clear how long content is saved once posts are removed from social media sites. In one instance, YouTube restored content it had taken down two years prior, at the urging of Human Rights Watch.

“Holding individuals accountable for serious crimes may help deter future violations and promote respect for the rule of law,” the Human Rights Watch report states. “Criminal justice also assists in restoring dignity to victims by acknowledging their suffering and helping to create a historical record that protects against revisionism by those who will seek to deny that atrocities occurred.”

How the U.S. preserves evidence of child sexual exploitation posted online shows how the system could work more efficiently. “U.S.-registered companies operating social media platforms are required to take down content that shows child sexual exploitation, but also preserve it on their platforms for 90 days and share a copy of the content, as well as all relevant metadata — for example, the name of the content’s author, the date it was created, and the location — and user data, with the National Center for Missing and Exploited Children.”

The private nonprofit NCMEC has a legal right to possess the material indefinitely. In turn, it notifies law enforcement about any relevant content.
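As a rough illustration of how that duty differs from delete-only moderation, the sketch below models a preservation record with the 90-day retention window and the metadata fields the article names (author, creation date, location). The class and function names are hypothetical; NCMEC's actual intake system is not shown.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # the statutory window cited in the article

@dataclass
class PreservationRecord:
    content_id: str
    content: bytes
    author: str            # metadata fields named in the article
    created_at: datetime
    location: str
    removed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    @property
    def retain_until(self) -> datetime:
        return self.removed_at + RETENTION

def take_down(record: PreservationRecord, archive: list) -> None:
    # Unlike the delete-only flow sketched earlier, removal here always
    # produces a preserved, shareable record for investigators.
    archive.append(record)
```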

“Social media opens a new frontier in such investigations,” The Economist reported. “In 2016, a court in Frankfurt convicted a German national of war crimes after photos were posted to Facebook of him posing with the severed heads of enemy combatants impaled on metal poles in Syria. But social-media firms are in a tricky position. They are under pressure to protect users from horrific content and extremist propaganda and keen to stay on governments’ good side. That leads them to adopt stringent content-moderation policies. But their policies have also led to the loss of evidence of human rights violations. As a result, opportunities to bring the perpetrators of appalling atrocities to justice may be missed.”