Google is experimenting with "pre-bunking," the preemptive debunking of "misinformation," in an attempt to "inoculate people against manipulation," according to a report released Wednesday.
"[S]ocial scientists from Cambridge University and Google reported on experiments in which they showed 90-second cartoons to people in a lab setting and as advertisements on YouTube, explaining in simple, nonpartisan language some of the most common manipulation techniques," NBC News' David Ingram reported.
He said the study was "part of a broad effort by tech companies, academics and news organizations to find new ways to rebuild media literacy, as other approaches such as traditional fact-checking have failed to make a dent in online misinformation."
"In the days before the 2020 election, social media platforms began experimenting with the idea of ‘pre-bunking,’" Ingram wrote. "Interest in ‘pre-bunking’ misinformation has been percolating for a few years. Twitter used ‘pre-bunking’ on subjects including ballot security in the days leading up to the 2020 election, while Facebook and Snapchat put resources into voter education. Other efforts have focused on Covid misinformation."
As Jon Roozenbeek, lead study author and a postdoctoral fellow at Cambridge University’s Social Decision-Making Lab, explained to NBC News, "Words like ‘fact-checking’ themselves are becoming politicized, and that’s a problem, so you need to find a way around that."
Ingram said, "The researchers compared the effects to vaccination, ‘inoculating’ people against the harmful effects of conspiracy theories, propaganda or other misinformation."
The research paper itself claimed the videos were made to "inoculate people against manipulation techniques commonly used in misinformation: emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks."
The study described a process where "[e]ach video instantiates the inoculation procedure by first providing a forewarning of an impending misinformation attack, then issuing a preemptive refutation of the manipulation technique used in this attack, and lastly presenting a ‘microdose’ of misinformation in the form of innocuous and humorous examples."
Ingram noted that Google has already used the research to respond to world events, such as an effort to "‘pre-bunk’ anti-refugee sentiment around people fleeing Ukraine."
"The company said it doesn’t have plans to push ‘pre-bunk’ videos in the United States ahead of the midterm elections this fall but said that could be an option for future election cycles," he wrote.
Beth Goldberg, co-author of the study and head of research at Jigsaw, a Google subsidiary that does research into misinformation and other subjects, hoped to "help people gain resistance to manipulation online." According to ResearchGate, Goldberg has a record of research into topics such as "far-right extremist propaganda," "COVID-19 vaccine acceptance," and "scientific racism."
Jigsaw’s extensive page on the history of conspiracy theories claims that "Conspiracy theories have legitimized violence, impaired public health, and undermined democratic governance."
Goldberg explained that pre-bunking is supposed to aid content moderation, which "hasn’t been enough given the volume of misinformation."
Rather than reacting after the fact, pre-bunking is meant to get ahead of disinformation before it spreads. "We don’t have to anticipate what a politician is going to say or what the vaccine disinformation campaign is going to say next week. We just have to say, ‘We know there’s always going to be fearmongering,’" she said.
Some outside academics, however, are skeptical about how effective the technique will be against prominent right-wing personalities.
Ingram paraphrased Shannon McGregor, a senior researcher in communication at the University of North Carolina, who thought a ‘pre-bunking’ campaign "might do little to stem the tide of disinformation from prominent sources such as far-right influencers on YouTube."