AP News: Detailed ‘open source’ news investigations are catching on – in Ukraine and elsewhere
BY DAVID BAUDER – May 8, 2022
NEW YORK (AP) — One of the more striking pieces of journalism from the Ukraine war featured intercepted radio transmissions from Russian soldiers indicating an invasion in disarray, their conversations even interrupted by a hacker literally whistling “Dixie.”
It was the work of an investigations unit at The New York Times that specializes in open-source reporting, using publicly available material like satellite images, mobile phone or security camera recordings, geolocation and other internet tools to tell stories.
The field is in its infancy but rapidly catching on. The Washington Post announced last month it was adding six people to its video forensics team, doubling its size. The University of California at Berkeley last fall became the first college to offer an investigative reporting class that focuses specifically on these techniques.
The Ukraine radio transmissions, in which soldiers complained about a lack of supplies and faulty equipment, were verified and brought to life with video and eyewitness reports from the town where the soldiers were operating.
At one point, what appears to be a Ukrainian interloper breaks in.
“Go home,” he advised in Russian. “It’s better to be a deserter than fertilizer.”
The Times’ visual investigations unit, founded in 2017 and now numbering 17 staff members, “is absolutely one of the most exciting areas of growth that we have,” said Joe Kahn, incoming executive editor.
The work is meticulous. . . . Video sleuthing also contradicted an initial Pentagon story about an American drone strike that killed civilians in Afghanistan last year. “Looking to us for protection, they instead became some of the last victims in America’s longest war,” the report said.
“There’s just this overwhelming amount of evidence out there on the open web that if you know how to turn over the rocks and uncover that information, you can connect the dots between all these factoids to arrive at the indisputable truth around an event,” said Malachy Browne, senior story producer on the Times’ team.
“Day of Rage” [about the January 6 insurrection] has been viewed nearly 7.3 million times on YouTube. A [Washington] Post probe into the deaths at a 2021 Travis Scott concert in Houston has been seen more than 2 million times, and its story on George Floyd’s last moments logged nearly 6.5 million views.
The Post team is an outgrowth of efforts begun in 2019 to verify the authenticity of potentially newsworthy video. There are many ways to smoke out fakes, including examining shadows to determine if the apparent time of day in the video corresponds to when the activity supposedly captured actually took place.
“The Post has seen the kind of impact that this kind of storytelling can have,” said Nadine Ajaka, leader of its visual forensics team. “It’s another tool in our reporting mechanisms. It’s really nice because it’s transparent. It allows readers to understand what we know and what we don’t know, by plainly showing it.”
Still new, open-source storytelling isn’t bound by rules that govern story length or form. A video can last a few minutes or, in the case of “Day of Rage,” 40 minutes. Work can stand alone or be embedded in text stories. Pieces can be investigations or experiences; The Times used security and cellphone video, along with interviews, to tell the story of one Ukraine apartment house as Russians invaded.
Leaders in the field cite the work of the website Storyful, which calls itself a social media intelligence agency, and Bellingcat as pioneers. Bellingcat, an investigative news website, and its leader, Eliot Higgins, are best known for covering the Syrian civil war and investigating alleged Russian involvement in shooting down a Malaysia Airlines flight over Ukraine in 2014. . . .
The commercial availability of satellite images was a landmark, too. The Times used satellite images to quickly disprove Russian claims that atrocities committed in Ukraine had been staged.
Other technology, including artificial intelligence, is helping journalists who seek information about how something happened when they couldn’t be on the scene. The Times, in 2018, worked with a London company to artificially reconstruct a building in Syria that helped contradict official denials about the use of chemical weapons.
Similarly, The Associated Press constructed a 3D model of a theater in Mariupol bombed by the Russians and, combining it with video and interviews with survivors, produced an investigative report that concluded more people died there than was previously believed. . . .
As efforts expand, journalists need to make sure their stories drive the tools that are used, instead of the other way around, said Alexa Koenig, executive director of Berkeley’s Human Rights Center. She now hears regularly from news organizations looking to build their own investigative units, seeking her advice or her students. Berkeley grad Haley Willis is on the team at The Times.
It feels, Koenig said, like a major shift has happened in the past year.
Browne said the goal of his unit’s reporting is to create stories with impact that touch on broader truths. A probe into a Palestinian medic shot by an Israeli soldier in the Gaza Strip, for example, was as much about the conflict in general as about her death.
“We have similar mandates,” the Post’s Ajaka said, “which is to help make sense of some of the most urgent news of the day.”
OPINION
War narrative a fable not fit for the times
Gwynne Dyer
INDEPENDENT JOURNALIST – 9 MAY 2022
We were talking recently about how clever the Ukrainians had been to call the invading Russian troops “Orcs” even before all the atrocities in the Russian-occupied towns around Kyiv came to light. Then Tina said: “If Putin’s troops are Orcs, then he must be Sauron.”
You can guess what happened next. We started trying to link other characters in the current drama with other characters from The Lord of the Rings, which many have begun to see as a tract for our times. Frodo was easy: that’s Ukraine’s president Volodymyr Zelensky — diminutive, vulnerable but also very brave.
We couldn’t figure out who plays Aragorn, but France’s newly re-elected president, Emmanuel Macron, is a dead ringer for Legolas. Britain’s Prime Minister Boris Johnson is one of the more boastful and self-serving dwarves, not Gimli but maybe Bombur.
Alexander Dugin, also known as “Putin’s Brain”, is the obvious candidate for the role of Saruman. He’s the Christian fascist philosopher who advises Sauron/Mr Putin on how to destroy the West, “the kingdom of the antichrist” that seeks to submerge Russia in “the abyss of chaos and corruption”. But I’m getting too technical here.
Joe Biden is Treebeard, the eldest of the Ents, and I’ll leave it to you to flesh out the rest of the characters in this low-budget remake of LOTR. But do let me know if you figure out where the hell Gandalf is when we need him. Probably late, as usual.
But here’s the thing. Comparing the war in Ukraine to The Lord of the Rings is a harmless after-dinner game, but it’s a very poor guide to policy. Yet many Western leaders are starting to sound as though JRR Tolkien were their speechwriter. That’s clear evidence that they’re losing the plot.
It’s perfectly normal for war aims to expand after an early success, but it’s usually a mistake. Ukraine didn’t collapse in the face of the Russian invasion, which was what both Mr Putin and everybody in Western leadership positions expected it to do.
So Western pundits (and even Western politicians) are now predicting that Ukraine will reconquer not just the land Russia has conquered since February, but also the territory it seized in 2014. That may be possible despite Russia’s three-times-larger population and tenfold-larger economy, though I doubt it.
But are the Ukrainians sure they want to push a nuclear-armed enemy who has shown himself to be irrational and unstable into such a humiliating corner? Are they sure their Western supporters would still back them if that leads to a nuclear showdown (as it probably would)?
Moreover, are the Ukrainians sure they really want the lost provinces of 2014 back? The people who remain in them now are not only Russian-speakers, but mostly people who actually identify as Russian. If Kyiv tried to forcibly reintegrate them into a victorious Ukrainian state, it would guarantee that state at least a generation of instability.
We will probably be spared all these awkward questions, because such a decisive Ukrainian victory is unlikely. What is of greater concern is the way that Western leaders have slipped so easily into a Tolkienesque mindset that lets them see themselves as the embattled defenders of a West that faces mortal peril from a great evil in the east.
Tolkien had an excellent excuse for writing that sort of book, because he wrote his Lord of the Rings trilogy between 1937 and 1949, when the West did indeed face a great threat, first from Nazi Germany and then from the Soviet Union. It was a fitting fable for his times. It is not relevant to ours.
How is it possible to see Russia one moment as a power so weak that it could be pushed out of all former Ukrainian territory by force, and the next moment as a mighty threat to all of Europe or “freedom” or “democracy”?
If “the test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function”, then the current leadership class of the West are all geniuses.
Russia is not Mordor. It is a second-rate great power that must be respected because it has a lot of nuclear weapons, but it poses no serious threat to the security of the rest of Europe or to democracy. Its invasion of Ukraine was a squalid smash-and-grab raid that is being repelled with the help of Ukraine’s friends and neighbours, that’s all.
Gwynne Dyer is an independent journalist whose articles are published in 45 countries. His new book is ‘The Shortest History of War’.
NYTimes: How to Avoid Sharing Misinformation on the War in Ukraine
Here are warning signs to look for before you retweet.
People fleeing Ukraine at a border crossing in Slovakia. Experts in misinformation say everyone has a responsibility to verify information before sharing it.
By Daniel Victor – March 21, 2022
Sorting out what is real in Ukraine and what is misinformation designed to provoke an emotional response is hard enough for professional journalists. For everyday people seeing photos and videos cascade through their social media feeds, it is even harder.
But the stakes can be high for anyone with an audience, no matter how big or small, if sharing false information — reposting a link on Facebook or retweeting a story that feels urgent — means unwittingly playing into war propaganda. Experts in misinformation say everyone has a responsibility to pause and do a bit of work to verify content before sharing it, even if it would benefit the side you support in a conflict.
“It matters because we all have the right to truth, and the more we do to pollute the information environment, the worse it’s going to get,” said Joan Donovan, the research director at Harvard’s Shorenstein Center on Media, Politics and Public Policy, which has studied the proliferation of misinformation.
Claire Wardle, a co-founder of First Draft News, a nonprofit that focuses on misinformation, said your credibility matters, even if you’re not a journalist.
“If we all keep doing that, it means we’re all going to stop believing anything anyone else is saying,” she said.
There are simple steps you can take to limit the misinformation that circulates online. If you can’t verify the authenticity of something you’re tempted to share, you can at least look for warning signs that would give you pause.
Here are some quick red flags to think about before you share:
Who’s sharing it?
Are they verified? On Twitter, Instagram or Facebook, many people, including journalists, have blue check marks next to their names to indicate their identities have been confirmed. These accounts make mistakes, too, and good information can come from people who are not verified, but the absence of a check could give you a reason to look for other red flags and pause before hitting that retweet button. Also be wary of parody or impostor accounts.
Even when you come across verified accounts, look for hints that they have some reason to know what they’re telling you: Are they reporters on the ground or researchers who have studied the area? Or are they a celebrity having the same quick-twitch reaction you’re trying to avoid?
Beware TwitterBot120362824. A user name consisting of a noun followed by a long series of numbers is often a sign that an account has been created inauthentically, Dr. Donovan said. A brand-new account with few prior or unrelated tweets or a low follower count might be a sign to move along.
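The bot-pattern red flag described above (a word followed by a long run of digits, plus a thin posting history) can be captured as a rough heuristic. The sketch below is purely illustrative: the function name, digit threshold, and account fields are invented for this example, and real platforms use far more sophisticated detection.

```python
import re

def looks_inauthentic(username: str, follower_count: int, tweet_count: int) -> bool:
    """Rough heuristic for the red-flag pattern the article describes:
    a noun-like word followed by a long series of digits, combined with
    a thin account history. Thresholds here are illustrative guesses."""
    word_then_digits = re.fullmatch(r"[A-Za-z]+\d{6,}", username)
    thin_history = follower_count < 10 or tweet_count < 5
    return bool(word_then_digits) and thin_history

# The bot-like handle from the article, with a hypothetical thin history
print(looks_inauthentic("TwitterBot120362824", follower_count=3, tweet_count=1))  # True
```

A heuristic like this only supplies a reason to pause and look for other red flags; plenty of genuine accounts have auto-generated handles, which is why the article frames it as a sign, not proof.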
#Excessive #hashtags
When an Instagram post seems a bit desperate for engagement, adding unrelated hashtags that might be popular like #catoftheday, it’s likely the post is coming from a disreputable place, Dr. Donovan said.
Google first.
If you do a quick web search and can’t find any news articles about what you’re seeing, it’s possible you could be looking at miscaptioned images from a previous war, Dr. Wardle said. If you’re feeling especially Sherlockian, you can search for the original source of a viral image yourself.
In one recent example, a 2012 video of a Palestinian girl confronting Israeli soldiers was widely recirculated by people suggesting it happened in Ukraine.
Seek out the fact-checkers.
Many news organizations have special teams to fact-check or debunk claims that spread during high-intensity news moments. Reuters, The Associated Press, the BBC and Agence France-Presse all have dedicated hubs that you can check first to see if that post you’re about to share was debunked days ago.
Do they want your shares, or your money?
Scammers prey on emotional responses and might say they’re raising funds for victims. Carefully look into any organization you’re tempted to donate to or post about, using a site like Charity Navigator, to ensure it is legitimate.