Russia is using the power of ‘Black PR’ to destroy political reputations and spread disinformation in the West

Vladimir Putin

    • Russia has been linked to an attempt to peddle coronavirus vaccine misinformation in France.
    • While the Russian link has not yet been proved, it follows a growing pattern of disinformation spread by Moscow.
    • Experts say Putin is now using the twin powers of social media and so-called “Black PR” to destroy reputations and undermine the West.
    • See more stories on Insider’s business page.

In May, a mysterious marketing agency contacted French influencer Léo Grasset and made a strange request.

The agency told Grasset, a popular science blogger, that it would pay him a “colossal” amount of money if he publicly cast doubt on the effectiveness of Pfizer’s coronavirus vaccine.

The agency, Fazze, asked Grasset to publish videos to his social media channels suggesting, falsely, that the western-made vaccine had caused over 1,000 deaths. The deal required that Grasset not disclose the sponsorship behind the posts and not ask who the client making the request was.

The Wall Street Journal later reported that Fazze – which contacted at least two other influencers – had ties with Russia. French counterintelligence authorities believe the campaign may have had Russian involvement, according to the report.

While claims of links to Moscow have not yet been proven, the attempt fits a distinctly Russian pattern, dating back decades, of using disinformation to sow division and doubt among people living in western democracies.

Russia’s use of disinformation for such purposes dates back to the Soviet era. In the 1970s, the KGB ran a disinformation campaign to plant the idea that the United States had invented HIV/AIDS in a laboratory as a biological weapon.

Since then, Russia has continued to spread disinformation in the west, sowing division and doubt among western populations and undermining faith in democracy.

The major thing that has changed since the 1980s is the arrival of a new weapon in Russia’s disinformation arsenal: social media.

“The big difference is that in the last 10 to 15 years, [Russia’s disinformation efforts] have bled into mainstream life – political life, news, media, particularly social media,” said Christopher Steele – the author of the infamous Trump dossier – in a rare interview in November on the Infotagion podcast.

The sheer scale of Moscow's disinformation efforts through social media is remarkable. A Facebook report published last week found that Russia remains the largest peddler of disinformation around the world. It was responsible not only for large-scale operations during the 2016 US presidential election and the UK's Brexit referendum campaign; Facebook said Russia had also run disinformation campaigns in more than 50 countries since 2017.

The report said that Russian military intelligence created networks of increasingly sophisticated fake profiles that operated across multiple social media networks and blog platforms to avoid detection, peddling disinformation about topics including Russia's proxy war in eastern Ukraine.

“It’s become a much more encompassing approach to trying to achieve your political and socio-economic objectives,” Steele said.

In one typical instance, Russian military intelligence created fake profiles that operated across blogs and multiple social media platforms to target Ukraine and neighboring countries. Some accounts posed as citizen journalists and tried to contact officials and other public figures, and others published blogs picked up by other journalists, Facebook said.

The objectives of these disinformation campaigns are not neatly defined. But they broadly represent attempts to undermine people’s faith in democracy and create partisanship and division, said Steele.

“What it does is undermine people’s faith in democracy which, as I’ve said before, should be the apogee of our democracy, not the weak point of it,” Steele said.

“The other thing I think it’s designed to do in its modern form is to create great polarity, great partisanship, and divisions.”

Disinformation is not the only decades-old Russian tactic gaining traction in the west in the social media era.

The rise of ‘Black PR’


So-called “Dark PR” or “Black PR” is broadly defined as the practice of ruining reputations through dishonest public relations, vexatious court battles, and other shady tactics. It first emerged in post-Soviet 1990s Russia as a means for political operators acting on behalf of state actors to destroy their opponents’ reputations.

However, the Kremlin, other state-owned entities, and Russian oligarchs with links to the state are now increasingly deploying those tactics in the west, harnessing the power of social media to spread them further than ever before, according to a report by Dr. Andrew Foxall for the Henry Jackson Society.

“A lot of the time now, black PR campaigns tend to be on social media,” said Jade McGlynn, director of research at the Henry Jackson Society.

One example is the Bitkov family, who owned the highly successful North-West Timber Company in St Petersburg. Igor Bitkov, who built the company, made an enemy of Putin and was forced to flee the country with his wife and daughter after Russian state banks called in loans they had issued his company.

They sought refuge in Guatemala, where they were met with an intense and vitriolic Spanish-language social media campaign against the family. “They were accused of all sorts of crimes – in that sense, it was a more typical disinformation campaign,” said McGlynn.

Whether the social media element of Russia’s disinformation efforts is actually effective is another question. In the case of Russia’s Black PR campaigns, the accompanying attempts to prosecute individuals through the courts appear to have been most effective.

The Bitkovs, for instance, were arrested and imprisoned in Guatemala in 2018, more than a decade after they fled Russia, on what they said were trumped-up charges following a decade of persecution from Russia – which had seen their daughter kidnapped. While Igor’s conviction was overturned, a Guatemalan appeals court upheld 14-year sentences against his wife and daughter only last year.

In the case of Russia’s more general disinformation campaigns, the effectiveness of its social media efforts has also been called into question, along with similar disinformation attempts backed by Iran’s government.

“Despite their relatively sophisticated nature, both of these operations reveal one of the fundamental challenges of “retail” [targeted] IO [information operations] – without a lucky break, they go nowhere,” Facebook’s report last week said. Russia’s disinformation effort in Ukraine, the company said, gained no “significant traction or attention.”

Read the original article on Business Insider

A top Facebook exec told a whistleblower her concerns about widespread state-sponsored disinformation meant she had ‘job security’

In this April 11, 2018, file photo, Facebook CEO Mark Zuckerberg pauses while testifying before a House Energy and Commerce hearing on Capitol Hill in Washington.

  • Facebook let dictators generate fake support despite employees’ warnings, the Guardian reported.
  • Whistleblower Sophie Zhang repeatedly raised concerns to integrity chief Guy Rosen and other execs.
  • But Rosen said the amount of disinformation on the platform meant “job security” for Zhang.

Facebook allowed authoritarian governments to use its platform to generate fake support for their regimes for months despite warnings from employees about the disinformation campaigns, an investigation from the Guardian revealed this week.

A loophole in Facebook’s policies allowed government officials around the world to create unlimited amounts of fake “pages” which, unlike user profiles, don’t have to correspond to an actual person – but could still like, comment on, react to, and share content, the Guardian reported.

That loophole let governments spin up armies of what looked like real users who could then artificially generate support for and amplify pro-government content – “the digital equivalent of bussing in a fake crowd for a speech,” as the Guardian put it.

Sophie Zhang, a former data scientist on Facebook’s integrity team, blew the whistle dozens of times about the loophole, warning Facebook executives including vice president of integrity Guy Rosen, according to the Guardian.

BuzzFeed News previously reported on Zhang’s “badge post” – a tradition where departing employees post an internal farewell message to coworkers.

But one of Zhang’s biggest concerns was that Facebook wasn’t paying enough attention to coordinated disinformation networks in authoritarian countries, such as Honduras and Azerbaijan, where elections are less free and more susceptible to state-sponsored disinformation campaigns, the Guardian’s investigation revealed.

Facebook waited 344 days after employees sounded the alarm before taking action in the Honduras case, and 426 days in Azerbaijan; in some cases it took no action at all, the investigation found.

But when Zhang raised Facebook’s inaction in Honduras with Rosen, he dismissed her concerns.

“We have literally hundreds or thousands of types of abuse (job security on integrity eh!),” Rosen told Zhang in April 2019, according to the Guardian, adding: “That’s why we should start from the end (top countries, top priority areas, things driving prevalence, etc) and try to somewhat work our way down.”

Rosen told Zhang he agreed with Facebook’s priority areas, which included the US, Western Europe, and “foreign adversaries such as Russia/Iran/etc,” according to the Guardian.

“We fundamentally disagree with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform. We aggressively go after abuse around the world and have specialized teams focused on this work,” Facebook spokesperson Liz Bourgeois told Insider in a statement.

“As a result, we’ve already taken down more than 100 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world, including those in Latin America, the Middle East and North Africa, and in the Asia Pacific region. Combatting coordinated inauthentic behavior is our priority. We’re also addressing the problems of spam and fake engagement. We investigate each issue before taking action or making public claims about them,” she said.

However, Facebook didn’t dispute any of Zhang’s factual claims in the Guardian investigation.

Facebook pledged to tackle election-related misinformation and disinformation after the Cambridge Analytica scandal and Russia’s use of its platform to sow division among American voters ahead of the 2016 US presidential elections.

“Since then, we’ve focused on improving our defenses and making it much harder for anyone to interfere in elections,” CEO Mark Zuckerberg wrote in a 2018 op-ed for The Washington Post.

“Key to our efforts has been finding and removing fake accounts – the source of much of the abuse, including misinformation. Bad actors can use computers to generate these in bulk. But with advances in artificial intelligence, we now block millions of fake accounts every day as they are being created so they can’t be used to spread spam, false news or inauthentic ads,” Zuckerberg added.

But the Guardian’s investigation showed Facebook is still delaying or refusing to take action against state-sponsored disinformation campaigns in dozens of countries, involving thousands of fake accounts that created hundreds of thousands of fake likes.

And even in supposedly high-priority areas, like the US, researchers have found Facebook has allowed key disinformation sources to expand their reach over the years.

A March report from Avaaz found “Facebook could have prevented 10.1 billion estimated views for top-performing pages that repeatedly shared misinformation” ahead of the 2020 US elections had it acted earlier to limit their reach.

“Failure to downgrade the reach of these pages and to limit their ability to advertise in the year before the election meant Facebook allowed them to almost triple their monthly interactions, from 97 million interactions in October 2019 to 277.9 million interactions in October 2020,” Avaaz found.

Facebook admits that around 5% of its accounts are fake, a number that hasn’t gone down since 2019, according to The New York Times. And MIT Technology Review’s Karen Hao reported in March that Facebook still doesn’t have a centralized team dedicated to ensuring its AI systems and algorithms reduce the spread of misinformation.


Twitter CEO Jack Dorsey was caught red-handed trolling Congress by tweeting a sarcastic poll during a Big Tech hearing

Twitter CEO Jack Dorsey.

  • Jack Dorsey was called out for tweeting during a congressional hearing about misinformation online.
  • The Twitter CEO tweeted a poll that appeared to mock the simple “yes or no” answers lawmakers demanded.
  • Rep. Kathleen Rice told Dorsey that his “multi-tasking skills are quite impressive.”

Twitter boss Jack Dorsey on Thursday was busted tweeting a sarcastic poll during a congressional hearing about misinformation on social media platforms.

Lawmakers grilled Dorsey, as well as Google CEO Sundar Pichai and Facebook CEO Mark Zuckerberg, about their sites’ handling of vaccine misinformation, election fraud claims, and online extremism.

Congress asked the three CEOs to answer “yes or no” to a range of complicated, extensive questions. Lawmakers sometimes interrupted if the CEOs tried to give longer answers.

During the hearing, Dorsey took a jab at the tactic by tweeting a poll that was simply a question mark, asking Twitter users to vote “yes” or “no.”


Democratic Rep. Kathleen Rice picked up on Dorsey’s tweet and asked him: “Mr Dorsey, what is winning, yes or no, on your Twitter account poll?”

Dorsey said that “yes” was in the lead. Rice replied: “Hmm, your multitasking skills are quite impressive.”

At the time of publication, the poll had more than 97,000 votes.

While facing Congress, the 44-year-old was also liking tweets that pointed out that lawmakers were mispronouncing Pichai’s name and cutting the CEOs off mid-sentence.

Dorsey, who founded Twitter in 2006, confirmed to another Twitter user that he was barefoot in the hearing.

Dorsey also retweeted a Twitter user’s post that said: “It would be awesome if some Member engaged [Jack] in a substantive discussion on Twitter’s ‘protocols’ idea.” Dorsey had tweeted about the protocols idea before the hearing, saying the company had started working on a decentralized, open-source social media protocol called Bluesky, which could allow users to build social media platforms that they own outright.

Social-media platforms have faced heavy scrutiny over the past year for the way they have policed misinformation during the pandemic, particularly around the presidential election and the Capitol riots. The five-hour hearing on Thursday was the first time the tech CEOs had faced Congress since President Joe Biden’s inauguration.

Twitter said March 1 that it would ban users who repeatedly post misinformation about COVID-19 vaccines on the platform. It also said tweets that contain misleading information would be labeled.

One month before the election, the company said it changed some features to prevent the spread of false political claims, including prompting users to post a comment about a tweet before retweeting it.

Lawmakers in Thursday’s hearing said the changes to the platform didn’t go far enough. They could still easily find anti-vaccine content on both Twitter and Facebook, Rep. Mike Doyle, chair of the House subcommittee on Communications and Technology, said, per CNN.


US officials believe Russia launched a disinformation campaign against the Pfizer COVID-19 vaccine to boost the status of its own: Report

Oil markets surged in the hours after Pfizer announced positive results from its coronavirus vaccine study.

  • Russian intelligence is sowing disinformation about the Pfizer coronavirus vaccine, the WSJ reported.
  • Four foreign-owned outlets are disseminating info that questions the Pfizer vaccine’s efficacy and safety.
  • US intelligence believes this effort to undermine Pfizer is a way to bolster Russia’s vaccine.

Russian intelligence officials are attempting to cast doubt on the Pfizer coronavirus vaccine, according to a new report from the Wall Street Journal. 

Four publications acting as fronts for Russian intelligence are disseminating information that questions the efficacy and safety of the Pfizer vaccine, State Department officials told the Journal. 

Russia is peddling misleading information designed to make Americans question whether the US rushed the approval process for the Pfizer COVID-19 vaccine.

“We can say these outlets are directly linked to Russian intelligence services,” an official at the State Department’s Global Engagement Center told the newspaper. “They’re all foreign-owned, based outside of the United States. They vary a lot in their reach, their tone, their audience, but they’re all part of the Russian propaganda and disinformation ecosystem.”

Back in November, Russian President Vladimir Putin said the country hoped to distribute its controversial Sputnik V coronavirus vaccine to other countries.

Russia announced a successful coronavirus vaccine in August, but Sputnik V was approved under questionable circumstances: it was released before it went through phase 3 trials. In the United States, a drug or vaccine must complete phase 3 before it can be vetted and approved by the Food and Drug Administration.

The rushed timeline led health officials to speculate that the Kremlin had pressured vaccine makers into putting out Sputnik V quickly to gain a leg up in the global race for a vaccine against the novel coronavirus.

US intelligence officials now believe this effort to undermine the Pfizer vaccine coming out of the Kremlin is another way to bolster the status of Sputnik V, the Journal reported. 

Johnson & Johnson is the latest company to enter the vaccine game. The healthcare giant is offering a single-dose vaccine that the company expects to distribute to 4 million Americans shortly.

Johnson & Johnson, whose vaccine gained FDA approval toward the end of February, said it expects to vaccinate 20 million people by the end of March and 100 million by the end of June. 

Including Johnson & Johnson’s vaccine, the United States is now distributing and touting three effective vaccines to Americans. 

Pfizer and Moderna – the two companies whose coronavirus vaccines preceded Johnson & Johnson’s – have efficacy rates of 94% and 95%, respectively. 

Vaccines against the coronavirus have been rolling out in the United States since December 2020, when Pfizer became the first company to receive FDA authorization to distribute its shot.

With this third vaccine on the market, the US is expected to have enough doses to immunize 300 million people. 

More than 57 million people in the United States have already received at least one dose of a coronavirus vaccine. The Pfizer and Moderna vaccines require two doses, while Johnson & Johnson’s requires one.

Last week, President Joe Biden said the US plans to have enough doses of coronavirus vaccines for “every adult in America” by the end of May. Biden’s announcement sped up the timeline to reach this threshold by about a month, Insider’s Eliza Relman and Sonam Sheth reported.

It’s been almost a year since the WHO declared the coronavirus a pandemic. Since then, more than 28 million people in the United States have contracted the virus, according to the latest data compiled by Johns Hopkins University, and more than 500,000 Americans have died.

The State Department did not immediately respond to a request for comment from Insider.


The global ‘infodemic’ will be Biden’s biggest challenge

Democratic presidential nominee Joe Biden holds his phone as he arrives at Atlanta International Airport on October 27, 2020 in Atlanta, Georgia. Biden is campaigning in Georgia on Tuesday, with scheduled stops in Atlanta and Warm Springs.

  • Misinformation is one of the greatest national threats to American democracy. 
  • Biden has an opportunity to turn the tables on misinformation by uniting the private sector and public sector in a way that disincentivizes false information.
  • It is far past time for the White House to make a real commitment to fighting misinformation.
  • Theresa Payton is CEO and founder of Fortalice Solutions and author of “Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth.”
  • This is an opinion column. The thoughts expressed are those of the author.

Whether it’s the COVID-19 vaccine rollout, the outcome of the 2020 elections, or the violence that followed in its wake, misinformation is shaking the foundations of America’s public institutions. 

An entire ecosystem of social media and “news” outlets is building and spreading an alternate reality for Americans. In his inaugural address, President Joe Biden said, “We must reject the culture in which facts themselves are manipulated and even manufactured.”  He correctly recognized that America’s “infodemic” represents the biggest threat not just to the Biden presidency, but to the future of democratic governance in the United States.

Some 72% of Republicans believe a version of the conspiracy theory that the election was “stolen,” and 27% of Americans are hesitant to get the COVID-19 vaccine, due in part to the spread of online conspiracy theories that the vaccine is harmful to public health. A poll in the UK found that 8% of people believe 5G technology spreads the virus, and 17% believe in QAnon, the conspiracy theory spread by online trolls which claims the government is secretly waging a war against pedophile rings in Hollywood. All of these theories are part of an infodemic that has spread largely unchecked on social media platforms.

It doesn’t have to be this way. President Joe Biden has an opportunity to turn the tables on misinformation by uniting the private sector and public sector in a way that disincentivizes false information while protecting every American’s right to free speech, even if it has no basis in reality.

The social network of misinformation

There’s no place misinformation spreads faster and reaches more people than social media. As private entities, companies like Twitter, Facebook, and Instagram have the full power to craft their own policies about misinformation, but to date those policies have been ad hoc and reactive, rather than strategic and forward-looking.

Twitter has added “disputed” labels on dubious tweets about election fraud, for example, and Facebook has removed pages for several groups dedicated to election disinformation. Misinformation campaigns spreading slightly altered or completely fictional news reports are cost-effective and pay off: Research shows that a false story about any topic, not just politics, reaches 1,500 people six times faster than legitimate news does. One rough estimate shows that misinformation on public health generated billions of views on Facebook in just one year. While social media platforms have taken steps to curb misinformation, they need to do more. 

President Biden has the opportunity to take a different approach than the previous administration by working with social media platforms to promote healthier and factual online discourse. The Biden Administration should convene a task force of tech CEOs and cybersecurity experts to examine how new policies can flag misinformation and discourage its virality. 

President Biden can demand quarterly transparency reporting on governance processes from Big Tech and social media companies and can work with lawmakers on Capitol Hill to determine once and for all whether these companies are merely platforms that must remain neutral with respect to viewpoints, publishers with editorial standards, or critical infrastructure essential to the health of our democracy.

In recent months, there have been increased calls to hold Big Tech accountable for the content on their platforms — including calls to repeal or review Section 230 of the Communications Decency Act. While the Biden Administration is likely to review this legislation, repealing Section 230 alone would be unlikely to stop the spread of misinformation online. Biden will have to work in partnership with Big Tech to establish actionable policies to slow the spread of misinformation online. 

It’s also true that to curb this infodemic, the United States cannot act alone. The Biden Administration will need to work with international leaders to create accords and standards that punish state actors guilty of spreading misinformation in other countries. Just as under NATO’s Article V, a misinformation attack against one person, issue, or country is an attack on all of us and should be treated as such. International, third-party oversight can help walk the fine line between protecting freedom of speech and recognizing the dangers of misinformation.

Misinformation campaigns are often not about elections, picking winners or losers, or even specific issues. They are designed to make you disbelieve the truth even when you are presented with it. They are designed to discredit all authoritative sources, leaving a vacuum to be filled by even more misinformation.

As we saw in 2016, foreign adversaries like Russia play a critical role in boosting and supporting posts on social media. By sowing discord, these autocratic countries aim to undermine American power abroad and show that democracy is unstable.

America is in the midst of a digital arms race with misinformation as its chief weapon, and we are losing. The defense of global freedoms and democracy is now being fought in the digital domain. Just as our nation and the world came together to fight back against the COVID-19 pandemic, we need a similar effort against cybercrime, hacking, the spread of misinformation, and online manipulation in the digital space. It is far past time for the White House to make a real commitment to fighting misinformation – and the Biden administration has an opportunity to do so now, before it’s too late.

Theresa Payton is CEO and founder of Fortalice Solutions and author of “Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth.”


Congress wants a closer look at US special operations after 2 decades of secret missions and scandals

Rep. Ruben Gallego speaks during House debate of an objection to Arizona’s Electoral College vote, after protesters stormed into the US Capitol, January 6, 2021.

  • Sprawling and secretive military operations over the past two decades have been a target for criticism.
  • With a new House Armed Services subcommittee, Congress hopes to provide more scrutiny.
  • “The landscape has changed in terms of what threats are out there,” Rep. Ruben Gallego told Insider.

The US military’s special-operations units have fought around the world over the past two decades, a period during which their successes have been marred by scandals and misconduct.

Now, with a new subcommittee on the House Armed Services Committee, lawmakers hope to exercise greater oversight over those shadowy operations and other emerging challenges.

“The landscape has changed in terms of what threats are out there and what the capabilities of our near-peer competitors are,” Rep. Ruben Gallego told Insider.

Gallego – the highest-ranking person of color on the Armed Services Committee, a Marine veteran, and a progressive Democrat – will chair the new subcommittee.

“We’re right now having to be able to continue with the traditional roles [of the] military but then also having to figure out how to deal with hybrid warfare,” Gallego added.

Gallego and committee chairman Rep. Adam Smith announced the new subcommittee, officially called the Subcommittee on Intelligence and Special Operations, on February 3.

ISO emerges from a split of the Intelligence and Emerging Threats and Capabilities subcommittee, along with the Subcommittee on Cyber, Innovative Technologies, and Information Systems.

“A lot of the work in warfare that’s going to be coming up is going to be found in these two subcommittees,” Gallego said.

‘Very serious and sticky situations’

Gallego and Dolores Huerta, right, outside the Supreme Court during oral arguments on then-President Barack Obama’s executive actions to help defer deportation for undocumented people, April 18, 2016.

The ISO subcommittee is responsible for military and national intelligence, countering weapons of mass destruction, counterterrorism, and special-operations forces. Special operations and military intelligence are likely to get the most attention.

“They both feed into each other” and “into the bigger portfolio in terms of preparing us for the great-power competition,” Gallego said. “We’re going to not neglect our actions in other areas, but making sure that those two areas are primed and ready to go, I think, is going to be really important.”

Demand has grown for more oversight of military operations conducted under the banner of counterterrorism. Special-operations forces, such as the Navy SEALs, are a minority among troops overseas but carry out many of those missions.

The lack of clarity about what they’re doing and the legal justification for it has been a major point of criticism.

“It definitely is a problem,” Gallego said of that opacity. “They are special operators, but they are still under the purview of civilian authority, and I also don’t appreciate that they’ve been essentially used to … go around Congress’s ability to wage war.”

“So we are going to bring that under control as much as possible. We want to see more transparency when it comes to their usage,” Gallego said. “At the same time, we also want to make sure that we guard their usage, because their consistent rotations, I think, [are] actually debilitating towards their effectiveness.”

Lawmakers have expressed concern about that high operational tempo. Like other troops, special operators face increasing mental and physical strain from frequent deployments. That strain, plaudits heaped upon those forces, and a lack of accountability have been blamed for repeated cases of misconduct – especially among SEALs.

Navy SEAL Chief Edward Gallagher with wife Andrea after being acquitted of most of the serious charges during his court-martial at Naval Base San Diego, July 2, 2019.

Those units’ high profile may help recruiting, Gallego said, but it can also make policymakers “more likely to use them in very serious and sticky situations that they don’t necessarily want ‘normal’ forces in.”

In January, the Pentagon announced an evaluation of whether US Special Operations Command, which oversees those forces, and US Central Command, which oversees military operations in the Middle East, implemented programs to reduce potential violations of the laws of war and whether violations that did occur were reported.

Accountability is needed for “any type of abuse” uncovered by that probe, Gallego told Insider, “but mostly what we want to see come from this probe are steps and checks to make sure that we don’t find ourselves going into mission creep in terms of use of our special forces.”

Policymakers have a habit of deploying those forces without public debate, hoping that “they never get ‘caught’ or create situations where then they have to answer to the public,” Gallego said.

In that respect, the Pentagon’s review “will be extremely important,” Gallego added, pointing to Congress’ inquiries after the October 2017 ambush in Niger that killed four US Army Special Forces members. (That incident prompted a restructuring of special-operations leadership to allow more civilian oversight, which was implemented by the Trump administration and is now being reviewed by the Biden administration.)

“Members of Congress were surprised that we had military in Niger,” Gallego said. “The fact that it is that pervasive, abuse of our military, that even people in the Armed Services Committee did not know that we were actively involved there is a problem.”

‘Toe-to-toe with any military’

While special operations will be a priority for the subcommittee, challenges related to intelligence-gathering, cyber intrusions, and disinformation loom large after the 2016 and 2020 elections.

In a joint statement announcing the new subcommittee, Gallego and Smith singled out “the disruptive impact of disinformation attacks” among the “unprecedented threats” the US faces from “adversaries and competitors.”

Disinformation is a particular challenge because it spans “the civilian-military divide” and is created by both domestic and international actors, Gallego said.

“We are going to have to address it. How we address it with the assets that we have currently on deck, I think, is going to be really important,” Gallego added. “We have the capability. We have the talent. We don’t necessarily have the authorities nor the true understanding of how deep and problematic this is.”

Gallego mentioned the Defense Intelligence Agency as a partner for the subcommittee. DIA is one of 18 organizations in the US intelligence community, the size of which has been a source of internal confusion and external criticism.

The community’s size isn’t the problem but rather its responsiveness, Gallego said.

“If you’re big and you don’t move, that’s a problem. If you’re small and you don’t move, that’s still a problem,” Gallego added. “So I’d love to be able to work with all these different elements and make sure that they are interoperable, they’re talking to each other, and they actually want to have action and operations, instead of just informing the military … and us what’s going on.”

Gallego directs traffic as staffers and House members get safety hoods from under desks as protestors breach the Capitol building, January 6, 2021.

The Trump administration resisted assessments from those agencies about the role foreign influence operations had in the 2016 election. Disputes about those assessments persist, and domestic actors, including Republican lawmakers, continue to invoke baseless allegations about the integrity of the 2020 election.

Gallego said he didn’t see that as an obstacle to working with Republicans on matters before his subcommittee.

“I think that was very much a Trump administration-led problem,” Gallego told Insider. “Now that Trump has gone, I think that is no longer an issue, and I think people want to work together across party lines to make sure we take care of that serious threat.”

The new subcommittee was announced a day before the Pentagon unveiled a review of the US military’s “footprint, resources, strategy, and missions” around the world, which Defense Secretary Lloyd Austin said will inform his advice to President Joe Biden “about how we best allocate military forces in pursuit of national interests.”

The relevance of that review extends to information warfare and emerging technologies, Gallego told Insider.

“I’m sure we can go toe-to-toe with any military when it comes to man-to-man, hand-to-hand combat, but are we going to be able to win the hacking war of the next 20 years? Are we going to be able to win the quantum-computing competition that we may be already losing right now? What happens if China turns the corner when it comes to AI?” Gallego said. “These are the things that would have to have a full review.”

Read the original article on Business Insider

TikTok removed 400,000 videos in the 2nd half of 2020 to combat election and COVID-19 misinformation

TikTok has had to confront the spread of misinformation and disinformation, especially relating to the 2020 US election and the coronavirus pandemic.

  • TikTok announced Wednesday it removed nearly 90 million videos globally in the second half of 2020. 
  • Of those, more than 11 million videos were removed in the United States.
  • Almost 350,000 videos were removed for misinformation about the US election, while more than 50,000 were removed for spreading COVID-19 misinformation.
  • Visit the Business section of Insider for more stories.

TikTok removed approximately 400,000 videos for misinformation related to the US election and COVID-19 in the second half of 2020, the company announced Wednesday.

In total, from July 1 to December 31 last year, the company said it removed 89,132,938 videos globally, with 11,775,777 of those being removed in the United States. TikTok said these videos were removed for violations of its community guidelines and its terms of service.

The stats were published Wednesday in a press release authored by Michael Beckerman, TikTok’s vice president and head of US public policy, and Eric Han, the company’s head of safety in the US.

About 92% of the removed videos were deleted before a user reported them, the company said. Approximately 83% were removed before anyone had seen them, and about 93% were taken down within 24 hours of being posted.

Of the nearly 12 million videos removed in the US, the company said 347,225 were taken down for misinformation, disinformation, or “manipulated media” related to the 2020 election. TikTok said it deleted 51,505 videos for misinformation about the COVID-19 pandemic.
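For readers who want to check the release’s arithmetic, the two misinformation categories do add up to the “approximately 400,000” headline figure, and they account for only a small slice of overall removals. A minimal sanity check using the figures quoted above:

```python
# Figures as reported in TikTok's transparency release (Jul 1 - Dec 31, 2020).
election_removals = 347_225   # 2020 US election misinformation
covid_removals = 51_505       # COVID-19 misinformation
us_removals = 11_775_777      # all US removals in the period
global_removals = 89_132_938  # all removals worldwide in the period

misinfo_total = election_removals + covid_removals
print(misinfo_total)  # 398730, i.e. "approximately 400,000"

# Misinformation removals as a share of all US removals, in percent.
print(round(100 * misinfo_total / us_removals, 1))

# US removals as a share of global removals, in percent.
print(round(100 * us_removals / global_removals, 1))
```

The two categories sum to 398,730 videos, roughly 3% of all US removals, while the US itself accounted for about 13% of removals worldwide.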

Like all major social media platforms, TikTok has had to confront the spread of misinformation and disinformation on its app, especially last year relating to the US election and the coronavirus. As Insider previously reported, TikTok has become a key tool for discussing politics among politicians, candidates, and TikTok users.

Scientists and doctors have also used TikTok to debunk falsehoods and conspiracies and educate users about COVID-19 and vaccines amid misinformation on the platform. The company last year began labeling videos about COVID-19 with a link to a pandemic information hub. The press release on Wednesday said the label was applied to more than 3 million videos in the second half of 2020.

TikTok, which is owned by the Chinese company ByteDance, in August updated its rules on disinformation and misinformation, creating new guidelines to prohibit synthetic or manipulated content and defining existing policies on “coordinated inauthentic behavior” relating to the election. The company also said at the time it expanded its relationships with fact-checking partners and launched a partnership with the US Department of Homeland Security.

The company on Wednesday said it was working to further bolster its effort to combat misinformation and disinformation on the platform to “better identify altered versions of known disinformation.” TikTok also said it was working to develop tools to prevent “repeat offenders” from evading or otherwise circumventing its moderation decisions.

Congress to hold hearing on news media’s role in promoting conspiracy theories about COVID-19 and the 2020 election

Supporters of US President Donald Trump fly a US flag with a symbol from the group QAnon as they gather outside the US Capitol January 06, 2021 in Washington, DC.

  • House Democrats are holding a hearing on the media’s role in promoting “disinformation and extremism.”
  • The hearing, by the Energy and Commerce Committee, will be held remotely on February 24.
  • It will feature testimony from as-yet-unnamed “media experts.”
  • Visit the Business section of Insider for more stories.

Conspiracy theories are no longer the domain of fringe websites, but have aired on major cable news networks. Now Democrats in Congress say they want to examine the role that the mainstream media has played in promoting false and outlandish claims.

“The prolonged severity of the COVID-19 pandemic and the attack on our Capitol on January 6 have driven home a frightening reality: the spread of disinformation and extremism by traditional news media presents a tangible and destabilizing threat,” Reps. Frank Pallone and Mike Doyle said in a joint statement on Thursday.

Pallone, a Democrat from New Jersey, chairs the House Energy and Commerce Committee, while Doyle, a Democrat from Pennsylvania, leads the Communications and Technology subcommittee. On February 24, the two lawmakers will host a remote hearing, featuring unnamed “media experts,” examining the issue.

“Some broadcasters’ and cable networks’ increasing reliance on conspiracy theories and misleading or patently false information raises questions about their devotion to journalistic integrity,” the lawmakers said.

Though not stated, it’s possible that lawmakers will discuss Fox News, which until this year had long been the top-rated cable news network. In January, host Steve Hilton promoted a claim that Dr. Anthony Fauci played a direct role in creating the coronavirus. Many of the network’s anchors and commentators also promoted false claims about election fraud, assertions that were challenged by some of its more fact-oriented, straight-news personalities.

Have a news tip? Email this reporter: cdavis@insider.com

Parler website appears to back online and promises to ‘resolve any challenge before us’

  • The website of controversial social media platform Parler was back online Sunday, following a nearly week-long outage after it was booted from Amazon Web Services and kicked off Apple and Google’s app stores.
  • The website popped back up on Sunday with a message from CEO John Matze, asking “Hello, world. Is this thing on?”
  • Visit Business Insider’s homepage for more stories.

The website of controversial social media platform Parler was back online Sunday, following a nearly week-long outage after it was booted from Amazon Web Services and kicked off Apple and Google’s app stores.

Amazon, Google, and Apple cut ties with Parler for what they said was a failure to moderate content and threats of violence by some of its right-wing user base. Some of the pro-Trump rioters who descended on the US Capitol on January 6, fueled by baseless allegations of voter fraud, had been planning the event and spreading misinformation about the presidential election on Parler.

The website popped back up on Sunday with a message from CEO John Matze, asking “Hello, world. Is this thing on?”

A statement on the site indicated it intends to be back soon.

“Now seems like the right time to remind you all – both lovers and haters – why we started this platform. We believe privacy is paramount and free speech essential, especially on social media,” it said. “We will resolve any challenge before us and plan to welcome all of you back soon. We will not let civil discourse perish!”

Read more: Online misinformation about the US election fell 73% after Trump’s social media ban

A WHOIS search indicates that Parler is now hosted by Epik. Parler last week registered its domain with the Washington-based hosting provider known for hosting far-right extremist content, though Epik denied in a statement that the two companies had been in touch.

Parler has faced massive fallout in the days following the siege on the US Capitol, with various business partners cutting ties.

Apple and Google were first to remove Parler’s app from their stores, also citing its alleged refusal to take down violent content. Not long afterward, many of Parler’s service providers, including Twilio, Okta, and Zendesk, removed Parler from their platforms as well.

Apple CEO Tim Cook said in an interview on Fox News Sunday that Parler had been suspended and could be back in the App Store if they “get their moderation together.”

Parler rose to notoriety in recent months as mainstream social media sites have faced increasing pressure to crack down on hate speech, misinformation, and calls for violence. Both Twitter and Facebook have banned President Donald Trump after the deadly Capitol riot, citing the risk of further violence. 

Parler, which has maintained that its deplatforming was intended to stamp out competition, filed an antitrust lawsuit against Amazon last week, seeking to get its website restored.

In a court filing, Parler disputed claims made by Amazon that it had repeatedly warned Parler that violent content on its site – and the company’s lax approach to removing it – were grounds for Amazon to suspend Parler’s AWS contract.

Parler claimed that Amazon, in effect, terminated its contract completely rather than simply suspending it, and that Amazon did not warn the social-media company about potential contract breaches until after the Capitol riot, even as it continued trying to sell Parler additional services as late as December.

As an autism researcher, I’ve dealt with anti-vax misinformation for years. Here’s how we can combat it during the COVID vaccine rollout

Diana Carolina, a pharmacist at Memorial Healthcare System, receives a Pfizer-BioNtech Covid-19 vaccine at Memorial Healthcare System, on December 14, 2020 in Miramar, Florida.

  • Anti-vax misinformation has spread dangerously far and wide.
  • We have a social responsibility to recognize and call it out, especially amidst COVID-19 vaccinations.
  • Here’s how to do so.
  • Lior Brimberg, PhD, is an assistant professor in the Institute of Molecular Medicine at the Feinstein Institutes for Medical Research.
  • This is an opinion column. The thoughts expressed are those of the author.
  • Visit Business Insider’s homepage for more stories.

We have all seen the false headlines claiming that vaccinations cause autism. This loud, misinformed fake news has fueled the ever-growing firestorm of the anti-vaccination movement and broader disbelief in vaccines in general. That misinformation has now dangerously spread to the discussion of the impending COVID-19 vaccine.

The anti-vax movement threatens the health and well-being of the global population. Skepticism is swirling around the COVID-19 vaccines; a recent Kaiser Health COVID-19 Monitor poll reports that 27% of Americans say they definitely or probably would not get a coronavirus vaccine. As a scientist who has confronted the anti-vax movement head-on, I find this statistic alarming.

For more than a decade, my research has focused on understanding how antibodies in moms-to-be could interfere with fetal development and may even lead to autism. Misled parents and so-called experts continue to spread misinformation and unsubstantiated claims that routine vaccines lead to autism, all of which stemmed from a fraudulent and now-debunked research study. However, the damage has been done. This fake news continues to flood the internet, causing serious confusion for parents trying to navigate parenthood and putting our youth at serious risk of contracting life-threatening illnesses like mumps, measles, polio, and, more recently, whooping cough. Now, in the middle of a global pandemic, is not the time for confusion.

While some hesitation is to be expected, especially with any new vaccine, everyone must understand the process and care that went into developing the COVID-19 vaccines and have faith in the science and the federal regulations. The Food and Drug Administration recently approved Pfizer’s vaccine through an Emergency Use Authorization. It was a momentous occasion, and I take special pride in being part of Northwell Health, which delivered the first dose in America, to see that injection of hope. It’s been weeks since that first injection, with thousands more frontline workers across the nation having received it, and it’s up to us now to trust it even more.

We have a social responsibility to the millions who were let go from work because of the pandemic, to our kids who are socially and educationally deprived, to the elderly and at-risk populations who are concerned for their daily lives, and to the hundreds of thousands who lost their lives.

How do we combat this ever-growing chorus of vaccine naysayers?

First, it all comes down to communication. Our local, state, and federal officials have constantly emphasized the need to wear masks, maintain social distance, and practice proper hygiene, all of which are critical in slowing the spread of the virus. While we continue to do just that, it is time for the media to pivot to vaccine effectiveness, rollout plans, and potential side effects. The messages need to come from the local, state, and national levels, including influential community leaders, faith-based organizations, and anyone with a megaphone to promote the safety and efficacy of these vaccines.

Second, we must explain the science. As we have seen with the confusion around autism, if people don’t understand the basics of what a vaccine is or does, distrust is bound to follow. It seems like scientists developed these COVID-19 vaccines in a pressure cooker, and in a sense, they did. However, it is imperative to communicate that all of the safety regulations and strict guidelines were adhered to, just as they would have been for any other drug. One of the main differences in the development of the COVID-19 vaccines, made possible by the federal “Operation Warp Speed” initiative, is that the phases of drug testing and manufacturing ran simultaneously. This was a gamble for the drug manufacturers: if trials showed a vaccine was unsafe, all of the vials already produced would be useless. But the approach allowed a new vaccine to be funded, tested, and created in less than a year.

We need to have a clear understanding of the vaccine. The RNA-based platforms are not the “novel” methods the media often present them as; in fact, they have been studied since the 1990s. People should take comfort in knowing that this science has been tested before.

We also need to understand how it was tested and who should get it. For example, I am hearing from parents that they will not vaccinate their kids because they do not trust the safety of the vaccine. The fact is that the vaccines have not been tested in kids and won’t be offered to them at this moment. The same is true for any population that was not included in the clinical trials, including pregnant women.

Third, we should encourage questions. Questions are good – they are what drive our science. Will the vaccine hurt? Why do I need two shots? How was it developed? While it is hard to sway someone’s beliefs, informed answers from medical experts and trusted community leaders will help achieve some level of comfort and understanding. Even doctors and vaccinators need to be educated on the various vaccines and the details around them, and they are learning all of the specifics now as the drug companies continue to release their clinical-trial data. While we may not have all the answers right now, medical professionals and science experts have everyone’s health and safety in mind, including their own.

And finally, it’s important to make an effort and stay in the know. Doctors, health systems, and the media need to report on the numbers of those vaccinated and on the effects, both positive and negative outcomes, as the vaccines roll out. This continuous flow of data will empower more people who may have been on the fence about taking the vaccine to get it. There needs to be a vaccine movement, a groundswell of support from every community and every neighborhood, in order for all of us to receive the vaccine’s benefits.

This is a difficult time. It seems like we were all asked to grasp concepts like herd immunity, infectious disease, and vaccine development in less than a year. It’s not easy, but if we trust in the science, trust in the facts, and trust in the leaders making the decisions, we will have the opportunity to return to normalcy very, very soon.

Lior Brimberg, PhD, is an assistant professor at the Feinstein Institutes for Medical Research, whose research focus centers around the role of the in utero environment and specifically maternal brain auto-antibodies in autism.
