The most recent lesson on misinformation came from an unlikely teacher: the frozen meat company, Steak-umm. And Facebook, Twitter, and Google should take notes.
The Pennsylvania-based brand on Thursday posted a “beefy thread” on Twitter about “societal distrust in experts and institutions, the rise of misinformation, cultural polarization, and how to work toward some semblance of mutually agreed upon information before we splinter into irreconcilable realities.”
Steak-umm said “one universal goal everyone should prioritize is getting people from across the ideological spectrum closer to the same reality of baseline facts and evidence,” a difficult feat given in part the expansive nature of online platforms and “people’s access to infinite information.”
The company also said there can be shortcomings with experts and institutions, but that doesn’t make “fringe sources equally credible or trustworthy.”
Steak-umm didn’t name any names in its thread, but the challenges it described are ones Facebook, Google, and Twitter have faced for years. The difficulty has only intensified since March 2020, as these companies control how much misinformation people see – and can therefore be influenced by – online. False information about the coronavirus, political conspiracies, and the election in particular has taken center stage.
The companies have attempted to flag information they deemed to be misleading, including from those spreading fringe ideology, prompting backlash from some on the right who claim the platforms are censoring viewpoints they don’t agree with.
But some groups and posts, like those centered around the “Stop the Steal” campaign alleging the election was stolen from former President Donald Trump, were allowed to proliferate before Facebook cracked down on them.
A recent report found that most COVID-19 disinformation online is spread by just 12 people – and one of them is the nephew of former President John F. Kennedy.
Robert F. Kennedy Jr. cemented himself as a prominent anti-vaccine advocate well before the pandemic. But his rhetoric took on a whole new meaning when the COVID-19 pandemic took over the world in March 2020.
Here’s how 67-year-old Kennedy, who is also the son of former Attorney General Robert F. Kennedy, became one of the “disinformation dozen” spreading COVID-19 conspiracy theories online.
From environmental law student to Fauci foe and anti-vaxxer
Kennedy graduated from Harvard, attended law school at the University of Virginia, and then earned his master’s in environmental law at Pace University, according to Vanity Fair.
He gained a reputation for defending Indigenous groups and fighting against the use of fossil fuels, all while rubbing shoulders with Hollywood elite at climate change awareness events and other social functions.
He founded the World Mercury Project in 2016, which became the Children’s Health Defense in 2018, an activist organization devoted to anti-vaccine initiatives. The group alleges, among other things, that administering some vaccines to children can cause conditions such as autism and cancer.
And Kennedy will release a new book this fall entitled “The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health.”
The book explores what Kennedy describes as Fauci’s botched handling of the pandemic and how he staged an “assault on our First Amendment guarantee of free speech.”
“Fauci’s Silicon Valley and media allies dutifully censored criticism of his policies on mainstream social media and collaborated to muzzle any medical information about therapies and treatments that might end the pandemic and compete with vaccines,” it alleges.
But Kennedy remains very active on other sites, such as Twitter.
He regularly shares posts from his anti-vax organization, such as one on Wednesday featuring a scientist who purported to have found evidence of a link between COVID-19 vaccines and neurodegenerative disorders.
Others in the Kennedy bloodline notably don’t share his stance on vaccines.
In 2019, Kennedy’s siblings – Kathleen Kennedy Townsend and Joseph P. Kennedy II – and his niece Maeve Kennedy McKean penned an essay in Politico entitled “RFK Jr. Is Our Brother and Uncle. He’s Tragically Wrong About Vaccines.”
After Joshua Barbeau’s fiancée passed away, he spoke to her for months. Or, rather, he spoke to a chatbot programmed to sound exactly like her.
In a story for the San Francisco Chronicle, Barbeau detailed how Project December, software that uses artificial intelligence to create hyper-realistic chatbots, recreated the experience of speaking with his late fiancée. All he had to do was plug in old messages and give some background information, and suddenly the model could emulate his partner with stunning accuracy.
It may sound like a miracle (or a Black Mirror episode), but the AI creators warn that the same technology could be used to fuel mass misinformation campaigns.
It’s some of the most sophisticated – and dangerous – language-based AI programming to date.
When OpenAI released GPT-2, the predecessor to GPT-3, the group wrote that it could potentially be used in “malicious ways.” The organization anticipated that bad actors using the technology could automate “abusive or faked content on social media,” “generate misleading news articles,” or “impersonate others online.”
GPT-2 could be used to “unlock new as-yet-unanticipated capabilities for these actors,” the group wrote.
OpenAI staggered the release of GPT-2, and still restricts access to the more powerful GPT-3, in order to “give people time” to learn the “societal implications” of such technology.
Early in the pandemic, data scientists at Facebook asked for resources to monitor COVID-19 misinformation on the platform, but were ignored by leadership, according to a report from The New York Times.
The Times spoke to two people who were present at a meeting where data scientists asked for resources to study the spread of COVID-19 misinformation. The data scientists asked for new hires and to assign some current employees to the project, but management never approved it, and never gave an explanation, the people told The Times.
White House officials and experts have urged Facebook to share its own data about the spread and prevalence of misinformation.
It is not clear whether Facebook packages that data so it can be usefully studied.
One source told The Times that Facebook has the raw data, but hasn’t put resources towards defining and labeling misinformation.
A Facebook spokeswoman told The Times: “The suggestion we haven’t put resources toward combating Covid misinformation and supporting the vaccine rollout is just not supported by the facts.
“With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes – measuring whether people who use Facebook are accepting of COVID-19 vaccines.”
Facebook did not immediately respond when contacted by Insider for comment on The Times’ report.
US Surgeon General Vivek Murthy doubled down Sunday on his criticism of tech companies for their role in the spread of misinformation related to COVID-19 and vaccines.
“What all of us have the right to is accurate information, so we can make the right decisions for us and our families. That is not the reality for far too many people,” Murthy told Fox News’ Chris Wallace during an appearance on “Fox News Sunday.”
“They’re inundated with misinformation, and all of us – technology companies, individuals, health professionals, and government – have roles they can play in addressing and slowing the spread of misinformation,” he added.
On Thursday, Murthy issued his first advisory as surgeon general: a 22-page report that deemed misinformation “an urgent threat to public health” and called out tech companies for their role in hosting misinformation on their platforms, some of which casts doubt on the safety and effectiveness of coronavirus vaccines.
Health experts say the spread of misinformation about the vaccines and COVID-19 has played a part in millions of people avoiding getting vaccinated – even as new variants of the virus are spreading.
On Friday, a day after the report, President Joe Biden told NBC News that social-media companies were “killing people” because of misinformation hosted on their platforms.
“The only pandemic we have is among the unvaccinated, and they’re killing people,” Biden said.
“We will not be distracted by accusations which aren’t supported by the facts,” a Facebook spokesperson told Insider and other outlets. “The fact is that more than 2 billion people have viewed authoritative information about COVID-19 and vaccines on Facebook, which is more than any other place on the internet. More than 3.3 million Americans have also used our vaccine finder tool to find out where and how to get a vaccine.”
But Murthy on Sunday doubled down on his advisory.
“The reality is misinformation is still spreading like wildfire in our country, aided and abetted by technology platforms,” he said.
“There are pathways that tech companies can take to address misinformation that’s flowing on their side,” Murthy added. “I acknowledge they’re taking steps, and I appreciate that. But I’m also very clearly saying it is not enough. The intention is good but at the end of the day, it doesn’t save the life of someone who was misled by misinformation on these sites.”
He added: “I’m asking these companies to step up and take responsibility for what is happening on their sites. I’m asking them to look out for the people all across this country whose lives depend on having access to accurate information.”
The CCDH analyzed 812,000 anti-vaccine posts shared on Facebook and Twitter between February 1 and March 16, 2021. It found that 65 percent of this content could be attributed to what is being dubbed the “disinformation dozen.”
On Facebook alone, the CCDH found that those 12 people were responsible for 73 percent of the anti-vaccine content on the platform.
Kennedy’s account was removed by Instagram, the CCDH said, but he remains active on Facebook and Twitter.
Fewer than half of the members of the disinformation dozen – Kennedy, Sherri Tenpenny, Rizza Islam, Sayer Ji, and Kelly Brogan – have had one of their social media accounts removed or partially removed, the study said.
The CCDH is now calling on Facebook, Instagram, Twitter, and YouTube to de-platform every member of the disinformation dozen with haste.
“The most effective and efficient way to stop the dissemination of harmful information is to de-platform the most highly visible repeat offenders, who we term the disinformation dozen,” the study said. “This should also include the organizations these individuals control or fund, as well as any backup accounts they have established to evade removal.”
In a statement provided to Insider, Facebook defended itself.
“The fact is that more than 2 billion people have viewed authoritative information about COVID-19 and vaccines on Facebook, which is more than any other place on the internet. More than 3.3 million Americans have also used our vaccine finder tool to find out where and how to get a vaccine,” a Facebook spokesperson said. “The facts show that Facebook is helping save lives. Period.”
In an additional statement provided to NBC’s Dylan Byers, a Facebook official said: “In private exchanges the Surgeon General has praised our work, including our efforts to inform people about COVID-19. The White House is looking for scapegoats for missing their vaccine goals.”
The White House fell short of its goal to have 70% of adults vaccinated by July 4, which some have blamed on vaccine hesitancy.
The White House wants Facebook to act quicker in removing posts containing vaccine misinformation.
During a press briefing, White House press secretary Jen Psaki said Facebook takes too long to remove “violative posts.” Surgeon General Vivek Murthy said during the briefing that misinformation is slowing the pace of vaccinations in the US.
“Facebook needs to move more quickly to remove violative posts,” Psaki said. “Posts that will be within their policies for removal often remain up for days. That’s too long. The information spreads too quickly.”
CNN reported meetings between the Biden administration and Facebook have been “tense” in recent weeks.
In a statement to Insider, Facebook said, “We’ve partnered with government experts, health authorities and researchers to take aggressive action against misinformation about COVID-19 and vaccines to protect public health.”
The statement pointed to the more than 18 million pieces of COVID misinformation Facebook has removed, as well as “accounts that repeatedly break these rules, and connected more than 2 billion people to reliable information about COVID-19 and COVID vaccines across our apps.”
White House Chief of Staff Ron Klain recently said, “I’ve told Mark Zuckerberg directly that when we gather groups of people who are not vaccinated, and we ask them, ‘Why aren’t you vaccinated?’ and they tell us things that are wrong, tell us things that are untrue, and we ask them where they’ve heard that. The most common answer is Facebook.”
The White House estimates misinformation could have dire consequences: Anthony Fauci said the “disparity in the willingness to be vaccinated” could lead to a surge of the Delta variant in the US.
Facebook was not immediately available for comment.
Science denial is not new, of course. But it’s more important than ever to understand why some people deny, doubt, or resist scientific explanations – and what can be done to overcome these barriers to accepting science.
Action No. 1: Each person has multiple social identities. One of us talked with a climate change denier and discovered he was also a grandparent. He opened up when thinking about his grandchildren’s future, and the conversation turned to economic concerns, the root of his denial. Or maybe someone is vaccine-hesitant because so are mothers in her child’s play group, but she’s also a caring person, concerned about immunocompromised children.
We have found it effective to listen to others’ concerns and try to find common ground. Someone you connect with is more persuasive than those with whom you share less in common. When one identity is blocking acceptance of the science, leverage a second identity to make a connection.
Challenge 2: Mental shortcuts
Everyone’s busy, and it would be exhausting to be vigilant deep thinkers all the time. You see an article online with a clickbait headline such as “Eat Chocolate and Live Longer” and you share it, because you assume it’s true, want it to be, or think it is ridiculous.
Action No. 2: Instead of sharing that article on how GMOs are unhealthy, learn to slow down and monitor the quick, intuitive responses that psychologist Daniel Kahneman calls System 1 thinking. Instead turn on the rational, analytical mind of System 2 and ask yourself, how do I know this is true? Is it plausible? Why do I think it is true? Then do some fact-checking. Learn to not immediately accept information you already believe, which is called confirmation bias.
Action No. 3: Recognize that other people (or possibly even you) may be operating with misguided beliefs about science. You can help them adopt what philosopher of science Lee McIntyre calls a scientific attitude, an openness to seeking new evidence and a willingness to change one’s mind.
Recognize that very few individuals rely on a single authority for knowledge and expertise. Vaccine hesitancy, for example, has been successfully countered by doctors who persuasively contradict erroneous beliefs, as well as by friends who explain why they changed their own minds. Clergy can step forward, for example, and some have offered places of worship as vaccination hubs.
Challenge 4: Motivated reasoning
You might not think that how you interpret a simple graph could depend on your political views. But when people were asked to look at the same charts depicting either housing costs or the rise in carbon dioxide in the atmosphere over time, interpretations differed by political affiliation. Conservatives were more likely than progressives to misinterpret the graph when it depicted rising CO2 levels, but not when it displayed housing costs. When people reason not just by examining facts, but with an unconscious bias toward a preferred conclusion, their reasoning will be flawed.
Action No. 4: Maybe you think that eating food from genetically modified organisms is harmful to your health, but have you really examined the evidence? Look at articles with both pro and con information, evaluate the source of that information, and be open to the evidence leaning one way or the other. If you give yourself the time to think and reason, you can short-circuit your own motivated reasoning and open your mind to new information.
Challenge 5: Emotions and attitudes
When Pluto got demoted to a dwarf planet, many children and some adults responded with anger and opposition. Emotions and attitudes are linked. Reactions to hearing that humans influence the climate can range from anger (if you don’t believe it) to frustration (if you’re concerned you may need to change your lifestyle) to anxiety and hopelessness (if you accept it’s happening but think it’s too late to fix things). How you feel about climate mitigation or GMO labeling aligns with whether you are for or against these policies.
Action No. 5: Recognize the role of emotions in decision-making about science. If you react strongly to a story about stem cells used to develop Parkinson’s treatments, ask yourself if you are overly hopeful because you have a relative in early stages of the disease. Or are you rejecting a possibly lifesaving treatment because of your emotions?
Feelings shouldn’t (and can’t) be put in a box separate from how you think about science. Rather, it’s important to understand and recognize that emotions are fully integrated ways of thinking and learning about science. Ask yourself if your attitude toward a science topic is based on your emotions and, if so, give yourself some time to think and reason as well as feel about the issue.
Everyone can be susceptible to these five psychological challenges that can lead to science denial, doubt, and resistance. Being aware of these challenges is the first step toward taking action to meet them.
“And so we know it has become a giant source of misinformation and disinformation about the vaccines,” Klain added.
The last time Klain and Zuckerberg spoke, Klain urged the CEO to “do better” at moderating COVID-19 vaccine misinformation on Facebook.
“His response was he cited the efforts Facebook was undertaking to try to put out good information, and I told him I recognize that Facebook is a source of a lot of good information about vaccines,” Klain said on the podcast. “But it also unfortunately is a source of a lot of bad information about vaccines.”
In a statement to Insider, a Facebook representative said the company has “removed more than 18 million pieces of content on Facebook and Instagram that violate our COVID-19 and vaccine misinformation policies, and labeled more than 167 million pieces of COVID-19 content rated false by our network of fact checking partners.”
Facebook has struggled and occasionally outright refused to moderate speech on its platforms.
Klain said he urged Zuckerberg to be extra vigilant on vaccine misinformation given the seriousness of the situation: Nearly 4 million people have died worldwide from COVID so far, according to the World Health Organization.
“I’ll let Mark Zuckerberg speak for himself, he certainly can,” Klain said. “But there is just no question that a lot of misinformation about the vaccines is coming from postings on Facebook. And this is a life or death situation here.”