Visual Disinformation Can Be Especially Persuasive, Expert Warns
(“disinformation” by Flickr user Focal Foto via CC BY-NC 2.0 license)
By Danielle Parenteau-Decker
After the Buffalo Bills’ Damar Hamlin collapsed during a game, many people took to social media to express concern or speculate on what went wrong. Some were sure they knew why, posting footage from the field with claims blaming the COVID vaccine. That Hamlin’s vaccination status was not public knowledge did not matter.
That is just one example of the visual COVID disinformation on the internet, which manages both to be nearly ubiquitous and to fly under the radar. Social media helps it spread and reach many people. The nature of visual misinformation makes it easy for viewers to be taken in and hard for those fighting it to even find.
“Visual content is particularly powerful in its reach,” said Kathryn Heley, lead author of “Missing the Bigger Picture: The Need for More Research on Visual Health Information,” published last August by SAGE Journals.
Generally, we are more likely to pay attention to, understand and remember visuals than text alone, she added.
“Compared to written content alone, the addition of visual content enhances emotional arousal and, in some cases, persuasive impact,” Heley said.
Visual disinformation may be recontextualized, manipulated or fabricated, according to the Journalist’s Resource, a project of the Harvard Kennedy School. In the first case, the image is real but taken out of context. With the others, the visuals have been falsified in some way.
The posts about Hamlin are an example of recontextualized disinformation. The clips used were actual TV footage, and he really did suffer a medical emergency. But the claims that it resulted from the COVID-19 vaccine were unfounded.
Medical experts said one plausible explanation for Hamlin’s sudden collapse could be a condition called commotio cordis, in which chest trauma with just enough force in just the right spot at just the right time can cause the heart to stop.
It is rare but has happened in professional sports. Hockey player Chris Pronger suffered commotio cordis and went into cardiac arrest during a 1998 NHL playoff game early in his Hall of Fame career.
After Hamlin dropped, “we saw an increase … in social media postings about how this condition was caused by the vaccine, and no one knows his vaccine status,” said Multiethnic Press Secretary Yurina Melara Valiulis of the California Governor’s Office of Planning and Research. “It’s clearly misinformation with maybe a tone of mal-information, the kind of information that is malicious.”
There are also many other “images and videos [that] show athletes dropping down,” said Tamoa Calzadilla, managing editor of Factchequeado, a fact-checking website focused on Spanish-language media.
The Journalist’s Resource reported that those images were accompanied by text blaming the athletes’ collapses on the COVID vaccine, even though many of the players were unvaccinated, and for those who were vaccinated, the vaccine was shown to be unrelated to why they fell.
>>>Read: Mistrust and Misinformation Hold Back Black Vaccination Rates
Manipulated visuals have been altered in some way — Photoshopped, for example — to change how people are likely to interpret them.
Calzadilla gave the example of a photo of the Bill & Melinda Gates Foundation that had been doctored to add the words “Center for Global Human Population Reduction.”
A fabricated visual is just that: made up. It is not real. “Though it is likely produced with representations of people, events, or things that make it appear as authentic, legitimate information,” writes Naseem S. Miller in the Journalist’s Resource.
An example of this is a “deepfake,” in which technology has been used to create a video that seems to show someone saying something they never did. The audio may be electronically invented or spliced together from actual recordings, then seamlessly synced with footage of the person.
“Visual manipulations can be hard to detect — they are often imperceptible and easily overlooked,” Heley said.
Deepfakes can be extremely realistic-looking, but there are some signs to watch out for, according to Norton Security. Among the signs are the speaker’s eyes or body not moving naturally, teeth without discernible outlines, or facial expressions that don’t look right or match the emotion implied by the speech. Basically, if something seems wrong, there is a good chance it is wrong.
Videos can also be manipulated to trick people in multiple languages.
“Because visual content has the ability to transcend things like linguistic or literacy barriers, it may facilitate the spread of visual misinformation across cultural and linguistic contexts or among those with lower literacy levels,” Heley said.
Pfizer CEO Albert Bourla told the World Economic Forum last May that his company had set a goal in 2019 “that by 2023 we would reduce the number of people in the world who cannot afford our medicines by 50%.” Then, a video circulated that eliminated the crucial phrase “who cannot afford our medicines.”
It was disseminated with Spanish subtitles that reflected the falsified speech.
“This is a common kind of mis- and disinformation targeted at Latino communities in the U.S.,” Calzadilla said.
She said Latinx people are often deceived intentionally for political, economic and ideological reasons. She added that it could be harder for people in the U.S. whose native language is not English “to find good information and to have access to traditional media outlets with quality content.”
“Some studies show that Latinos and Latinas inform themselves through platforms such as YouTube, WhatsApp and other social media networks, which represents a problematic situation because a lot of mis- and disinformation circulates there,” she said.
Many young people also fall prey to the proliferation of disinformation on social media.
“While the young tend to be more comfortable online than older people, that hasn’t inoculated them against the spread of false claims,” Time magazine reported in 2021. “In fact, some studies have shown they seem even more prone to believe misinformation about the pandemic.”
>>>Read: I Faced Down Fear and Misinformation to Get Vaccinated
Some of the misinformation on social media isn’t meant to be taken seriously — but that doesn’t mean it doesn’t have serious consequences.
Fact-checker PolitiFact NC analyzed two videos in which women showed themselves shaking significantly and blamed it on the COVID vaccines.
“Public health officials and vaccine experts told PolitiFact they are not aware of any link between either COVID-19 vaccine and uncontrollable shaking,” the report said.
Those women apparently were not joking, but parody posts exist in which someone writes in a mock-serious tone about the vaccine, expressing concern for themselves or a loved one who has supposedly lost control of their body.
Sometimes, the joke is obvious. One post is accompanied by a GIF of Jim Carrey’s Lloyd Christmas dancing goofily in “Dumb and Dumber.” Another is a bit more ambiguous — until the person starts doing the robot.
“Most of the scenarios are clearly fictional,” writes Sophia Ankel in an article about TikTokers pretending to have bad experiences with COVID vaccines. “But they embrace tropes from real-world anti-vaccine conspiracy theories, and medical experts warned Insider that they may help normalize the idea that vaccines are dangerous.”
Morgan McSweeney, a scientist known as Dr. Noc on TikTok, said in the article that many people “don’t know much about vaccines, and are not sure what to do with all the information that is fed to them on a daily basis.”
Experts like him are trying to turn things around by using social media to debunk COVID myths and spread accurate health information.
But unless it is brought to their attention, experts may be unaware of the disinformation floating around.
“That doesn’t mean it isn’t out there,” Melara Valiulis said.
One part of the problem is that technology is not well equipped to pick it up.
“Existing content moderation tools are mostly designed to catch misinformation in texts and not images and videos, making it difficult to catch and stop the spread of visual misinformation,” writes Miller in the Journalist’s Resource.
Another is the way social media sites work.
“The algorithm picks up on what you’re looking for,” Melara Valiulis said. So each person’s feed looks different. “If you click on something that is misinformation, the algorithm might pick it up and give you [the] same content and just send you in this hole, misinformation hole.”
That is why people should avoid engaging with suspected misinformation, even to disagree with it. Instead, Melara Valiulis said they should report it to the platform and can also report it to the state by emailing rumors@cdph.ca.gov.
This story was produced as part of Ethnic Media Services’ COVID Myth Busters series with funding from the California Department of Public Health.