After months of speculation about Kate Middleton’s absence from public life, a digitally altered family photo released by Kensington Palace only fanned the flames as eagle-eyed observers identified details like a strange cuff and mismatched zipper.
Though the Princess of Wales later admitted to editing the photograph, experts say that, most of the time, image verification is best left to the pros.
The original print of the engraving of John Calhoun by Alexander Hay Ritchie, circa 1852. With the help of modern reverse image search, it's easy to find information on past iterations of images like this one. (Print by Alexander Hay Ritchie, courtesy Library of Congress)
“It’s very easy to go way too far here and suddenly everything is suspicious,” says Hany Farid, a professor at the University of California, Berkeley, who specializes in media forensics. “By eye, you’re not going to be able to figure out what’s real with any reliability whatsoever.”
Meanwhile, experts have specialized tools, including “digital forensic techniques and resources like geolocation, satellite imagery, and sensor data to make sure we—and our audience—can trust what we’re seeing,” says Christopher Looft, coordinating producer of the visual verification unit at ABC News. “Generative artificial intelligence is advancing at such a rate that the ‘tells’ we’d look for a year ago are probably out of date by now.”
That said, if you’re worried about being misled by misinformation, these are some tips that experts say you can start using today.
What’s the source?
Your first line of defense is questioning where a photograph comes from. Something posted by an individual, a business, or a political entity might be altered to make them look good, promote a product, or otherwise benefit them.
But what about trusted news outlets, which are businesses too? Farid says you can still rely on those with a history of objective coverage. “They’re incentivized to get it right,” he says. When photo agencies had to notify customers to stop distributing the Kensington photo, it was “bad for business” because it hurt their credibility, Farid says.
“Ninety-nine point nine percent of the time they get it right,” Farid says of trusted news organizations. “You may take issue with any perceived political bias, but these are serious people doing a serious job.” He adds that the standards these outlets must meet before publishing a photograph are far above those of any individual or commercial entity, if those entities have standards at all.
Beware of hot topics
Political fakes also run rampant, particularly in election years. There’s satire, Farid says, which is protected as political commentary—then there are robocalls using a presidential candidate’s voice, which could be illegal. It’s a fine line.
Conflicts are another charged topic. “In the fog of war, it’s very hard to figure out what’s going on. Emotions are running spectacularly high. And so infusing that with disinformation, fake images, makes things very messy,” Farid says. “Within days, we see fake images, fake audio, fake video, and then the real thing is being claimed as fake.”
Think about lighting
One of the ways you can tell the Kensington photo wasn’t manipulated by AI is that “the lighting on their faces is consistent,” Farid says. It’s a detail that Mark Thiessen, a National Geographic staff photographer, is constantly thinking about.
He was able to identify which cheetah photo was generated by AI—a riddle that stumped many on our staff and among our readers—thanks to decades of experience in lighting and photography.
Thiessen notes that while the two real photos had realistic lighting—one ambiently lit by the sun slightly to the right of the cheetah, the other captured with head-on flash—the AI cheetah had an unnatural, strong blue highlight in one eye. Blue isn’t necessarily a strange highlight color, especially outdoors when the eyes reflect the sky, but without any other blue hue in the picture, he says this was a dead giveaway.
“The photographer who taught me lighting back in college told me to look at every image and figure out how it’s lit—every movie scene, magazine shoots, commercials,” Thiessen says. “You can figure out how things are lit generally when looking at the eye—it’s reflective and round, it’s like a mirror of the lighting setup.”
Fact check yourself
If you see a photograph that’s giving you pause, Farid suggests searching to see whether any research or articles have been published about it, or whether outlets like Snopes or FactCheck.org have investigated it. If you’re only seeing the image on community platforms like Reddit, 4chan, and Twitter, “you should probably just ignore it.”
“Many so-called ‘fakes’ aren’t necessarily fabrications; they’re just old images misleadingly passed off as new,” Looft says. “Reverse image search is an easy way to tell whether an image already exists. There are many options, but TinEye has a great reverse search, and some other tools for those looking to go a little deeper.”
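For readers comfortable with a little scripting, a reverse search on an image that is already hosted online can be kicked off by handing its URL to a search engine's lookup page. The sketch below is a minimal, hedged example: the query-parameter names for TinEye and Google Lens reflect their public search pages at the time of writing and could change, and the image URL is a made-up placeholder.

```python
# Minimal sketch: build reverse-image-search lookup URLs for an image
# that is already hosted online. The "url" query parameters below are
# assumptions based on the engines' public search pages and may change.
from urllib.parse import quote


def reverse_search_urls(image_url: str) -> dict:
    """Return lookup URLs for a couple of reverse-image-search engines."""
    encoded = quote(image_url, safe="")  # percent-encode the whole URL
    return {
        "tineye": f"https://tineye.com/search?url={encoded}",
        "google_lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
    }


# Hypothetical image URL, for illustration only.
urls = reverse_search_urls("https://example.com/suspicious-photo.jpg")
for engine, url in urls.items():
    print(engine, url)
```

Opening either generated link in a browser shows where else the image has appeared, which is often enough to reveal that a "new" photo is years old.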
The bigger picture
The distinction between “photojournalist” and “photographer” has always been an important one, Thiessen says. For photojournalists, the content of the photo cannot be changed.
There’s a difference between standard editing practices—such as adjusting contrast levels from a raw file—and manipulation. Instead of shooting an object suspended by a fishing line and editing out the fishing line later, he says, “we’d figure out how to shoot it without a fishing line.”
When it comes to marketing or commercial use, however, those rules can be bent to allow editing in post-production. And as the line between promotional and editorial imagery becomes blurrier than ever, Thiessen says, the delineation of photojournalism is critical.
Rather than relying on a handout from the royal family, a brand, or anyone else with their own interests, it’s all the more important to have photojournalists we can trust not to manipulate photos. Even better, he says, is to have several independent journalists attending the same event so they can check one another.
“If we erode trust in the very institutions—the media and academics and the scientists and the government—we are in a lot of trouble as a society,” Farid adds. “Because when that happens, the very mechanisms that we have to talk about a global health crisis, global climate change, election integrity, are gone.”
Copyright for syndicated content belongs to the linked source: National Geographic – https://www.nationalgeographic.com/photography/article/digitally-manipulated-ai-altered-photo-images