War crimes investigations aided by artificial intelligence
Proving war crimes can be a tedious process in which investigators spend many hours scouring thousands of images from war-torn countries like Yemen. Human rights groups across the globe are working to incorporate artificial intelligence into the process to speed up the time it takes to present evidence of war crimes in court.
With more and more digital documents available to study, the time it takes to analyze them has grown far beyond what human investigators alone can manage.
“The disturbing imagery can also traumatize the investigators who must comb through and watch the footage,” according to the Massachusetts Institute of Technology’s Tech Review. “Now an initiative that will soon mount a challenge in the UK court system is trialing a machine-learning alternative. It could model a way to make crowdsourced evidence more accessible and help human rights organizations tap into richer sources of information.”
The focus right now is on Saudi Arabia's airstrikes in Yemen. Saudi Arabia launched an air campaign in 2015 to defeat what it considered a threatening rise of the Shias. The escalation was to last weeks. Five years later, it continues. Crowdsourced evidence has also been used to show forced displacement and burning of homes in Myanmar, attacks on Syrian hospitals, executions in Libya and Cameroon, and attacks on protesters in Hong Kong.
Database documents abuses
Working with numerous human rights groups, Swansea University in Wales is leading the initiative to monitor war crimes using a database of photos and videos. Yemeni Archive began building this database in 2017 to document the alleged abuses. The content was gathered from thousands of sources, including journalists and volunteers, and by scouring social media platforms and YouTube videos. The data is preserved on a blockchain, a tamper-evident ledger, so no one can alter it undetected.
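The article does not describe the archive's actual blockchain design, but the core idea behind tamper-evident preservation is simple: each record commits to a hash of its content and to the previous record's hash, so changing anything breaks the chain. A minimal illustrative sketch (not the Yemeni Archive's real implementation) might look like this:

```python
import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class HashChain:
    """Append-only chain of content hashes; each entry commits to the previous one."""

    def __init__(self):
        self.entries = []

    def add(self, content: bytes, metadata: dict) -> dict:
        prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        record = {
            "content_hash": sha256(content),   # fingerprint of the photo/video bytes
            "metadata": metadata,              # e.g. source, date collected
            "prev_hash": prev_hash,            # link to the previous entry
        }
        record["entry_hash"] = sha256(json.dumps(record, sort_keys=True).encode())
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        """Recompute every hash; any alteration anywhere breaks verification."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("content_hash", "metadata", "prev_hash")}
            if e["prev_hash"] != prev:
                return False
            if e["entry_hash"] != sha256(json.dumps(body, sort_keys=True).encode()):
                return False
            prev = e["entry_hash"]
        return True

chain = HashChain()
chain.add(b"<video bytes>", {"source": "social media", "collected": "2017"})
chain.add(b"<photo bytes>", {"source": "journalist", "collected": "2018"})
print(chain.verify())  # True: archive intact
chain.entries[0]["metadata"]["collected"] = "2019"  # simulated tampering
print(chain.verify())  # False: tampering detected
```

A real blockchain adds distribution across many parties so no single archive holder can rewrite the chain, but the hash-linking shown here is what makes tampering evident.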
“By some estimates, the (Saudi) coalition has since carried out over 20,000 air strikes, many of which have killed Yemeni civilians and destroyed their property, allegedly in direct violation of international law,” Tech Review reports. “Human rights organizations have since sought to document such war crimes in an effort to stop them through legal challenges. But the gold standard, on-the-ground verification by journalists and activists, is often too dangerous to be possible. Instead, organizations have increasingly turned to crowdsourced mobile photos and videos to understand the conflict and have begun submitting them to court to supplement eyewitness evidence.”
The Global Legal Action Network, a nonprofit that legally challenges states and powerful actors on human rights violations, is also a partner in the effort.
“We heard from many Yemeni sources that getting reliable information is particularly difficult if independent (non-government organizations) or journalists cannot get timely access for on-the-ground investigations,” GLAN states on its website. “Sometimes, a video or photograph posted on social media can give vital clues as to the location, cause, perpetrator or effects of an attack, or can corroborate witness evidence. Occasionally, attacks themselves are even captured on video and posted online.”
Legal challenges under way
GLAN is mounting legal challenges in various courts to get authority to use these digital archives to prove war crimes.
“It’s really important to emphasize that the aim here is not to replace human investigators in determining what evidence is relevant, nor judges in determining the guilt or innocence of an accused,” Swansea University Professor Yvonne McDermott Rees said. “Its rather more modest aim is to address the challenges posed by the volume of evidence out there and to help investigators and lawyers find the most relevant content.
“Just to illustrate this problem, the Syrian Archive, which archives content posted on social media platforms from the Syrian conflict, has archived over 3 million videos. As they say, ‘there are more hours of videos documenting the conflict than there have been hours in the conflict itself.’
“If things are coming through courtroom accountability processes, it’s not enough to show that this happened,” Rees said. “This project sought to look at what steps need to be put in place to ensure that this ‘citizen evidence’ can be used to support legal accountability processes.”
The focus right now is on a U.S.-manufactured cluster munition called the BLU-63. Cluster munitions scatter many smaller explosives over a wide area. They are banned by 108 countries, including the UK.
“If the partners could prove in a UK court that they had indeed been used to commit war crimes, it could be used as part of mounting evidence that the Saudi-led coalition has a track record for violating international law, and make a case for the UK to stop selling weapons to Saudi Arabia or to bring criminal charges against individuals involved in the sales,” Tech Review states.
“If an investigator wanted to find evidence of, say, the use of cluster munitions, he or she could not possibly watch every one of those videos,” Rees said. “So VFRAME developed a machine-learning algorithm to detect cluster munition strikes in videos. Once the algorithm flags a set of videos as potentially containing evidence of cluster munitions, the next step is for a lawyer or investigator to view the filtered content, and determine whether they do indeed contain such evidence, and if so, whether this video is relevant to their case.”
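The workflow Rees describes is a triage pipeline: a detector scores each video, high-scoring videos are flagged, and only the flagged set goes to a human reviewer. A minimal sketch of that pipeline, with a placeholder scoring function standing in for VFRAME's actual trained model:

```python
from dataclasses import dataclass

@dataclass
class Flagged:
    video_id: str
    max_score: float  # detector confidence for the strongest frame

def triage(videos, score_frame, threshold=0.8):
    """Flag videos whose highest frame score meets the threshold.

    `score_frame` stands in for a trained detector (here, a hypothetical
    stand-in; VFRAME's real model analyzes video frames for munitions).
    Flagged videos are NOT treated as evidence -- they go to a human
    reviewer, as the article emphasizes.
    """
    flagged = []
    for video_id, frames in videos:
        max_score = max((score_frame(f) for f in frames), default=0.0)
        if max_score >= threshold:
            flagged.append(Flagged(video_id, max_score))
    # Highest-confidence videos first, so reviewers see likeliest matches.
    return sorted(flagged, key=lambda f: f.max_score, reverse=True)

# Toy stand-in scorer: reads a precomputed score from each "frame".
fake_scorer = lambda frame: frame["munition_like"]
videos = [
    ("vid_001", [{"munition_like": 0.95}, {"munition_like": 0.20}]),
    ("vid_002", [{"munition_like": 0.10}]),
]
for hit in triage(videos, fake_scorer):
    print(hit.video_id, hit.max_score)  # only vid_001 is queued for review
```

The design point is that the model only filters and ranks; the threshold trades reviewer workload against the risk of missing relevant footage, and the final relevance call stays with the lawyer or investigator.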
Human intervention still needed
Human experts still must verify the footage after the system filters it. But it could be a game changer for the human rights organizations going to court.
Open-source intelligence, as this type of database is called, can be evidence of the lawfulness of attacks, GLAN states. “If a passer-by films civilian rescuers in an open area being hit by a second ‘double-tap’ airstrike, this can, in conjunction with other information, serve as evidence that the attacker knowingly killed civilians. Likewise, if an investigator verifies through (open-source intelligence) videos, the exact location where a bomb landed in a crowded market, it can be combined with satellite imagery to infer that the attacker knew they would kill high numbers of civilians when the attack was launched.”
Sam Gregory, program director of the human rights nonprofit Witness, which works to help people use video and technology to defend human rights, called this initiative “a tremendous opportunity,” even though it is still an emerging field. He said it can level the playing field to turn both eyewitness evidence and perpetrator-shot footage into evidence that can be used to bring justice.