GOP Sen. Roger Marshall, who voted to overturn the 2020 presidential election, says he’s ‘so ready to move on’

Sen. Roger Marshall of Kansas.

  • GOP Sen. Roger Marshall said that he wanted to “move on” from discussing his challenge of the election results.
  • “I made a decision based upon the facts that I knew at that point in time,” he said.
  • Former President Trump and his campaign spent months trying to overturn President Biden’s victory.

GOP Sen. Roger Marshall of Kansas on Saturday said that he was “ready to move on” when asked about his support of former President Donald Trump’s efforts to overturn the 2020 presidential election results.

During an interview with CNN’s Pamela Brown, Marshall was questioned about whether his actions played a role in continued Republican distrust of the 2020 election.

Brown cited a CNN poll conducted in late April, which found that 70 percent of Republicans believe President Joe Biden didn’t legitimately win last year’s presidential race, while only 23 percent think he won the election fairly.

“Republicans continue to believe in the lie that this election, the last election was stolen,” Brown said. “You voted to toss out millions of votes in Arizona and Pennsylvania. You also joined the Texas lawsuit attempting to throw out votes cast in four states.”

She added: “I’m curious. Looking back, do you have any regrets about your actions and any concern that they contributed to misinformation about the election?”

“We’re just so ready to move on,” Marshall replied. “I made a decision based upon the facts that I knew at that point in time. I was concerned then, and I still am today that six states broke their own laws or their own constitution. But it’s time to move on. It’s time for this country to heal. It’s time for a spirit of forgiveness to be happening.”


Days before the January 6 certification of Biden’s 306-232 Electoral College victory, Marshall joined a group of GOP senators led by Sens. Josh Hawley of Missouri and Ted Cruz of Texas who sought to challenge the results.

“I cannot vote to certify the electoral college results on January 6 without raising the fact that some states, particularly Pennsylvania, failed to follow their own state election laws,” Hawley said at the time.

The repeated maligning of the vote count by Trump and his campaign fueled the deadly insurrection at the US Capitol on January 6, which disrupted lawmakers as they sought to certify the results.

Later in the interview, Brown continued to press Marshall about how his challenge of the election results adhered to his ideological support of states’ rights.

“We want voting to be easier, cheating to be harder,” he said. “By us standing up to our concerns about those elections, about the election integrity … it has forced those states with their problems to come back to the table and have those legislatures work together to make sure we have safer elections with higher integrity.”

He added: “In my heart, I did what I thought was the right thing. I think the country is moving in a better direction.”

Read the original article on Business Insider

A top Facebook exec told a whistleblower her concerns about widespread state-sponsored disinformation meant she had ‘job security’

In this April 11, 2018, file photo, Facebook CEO Mark Zuckerberg pauses while testifying before a House Energy and Commerce hearing on Capitol Hill in Washington.

  • Facebook let dictators generate fake support despite employees’ warnings, the Guardian reported.
  • Whistleblower Sophie Zhang repeatedly raised concerns to integrity chief Guy Rosen and other execs.
  • But Rosen said the amount of disinformation on the platform meant “job security” for Zhang.

Facebook allowed authoritarian governments to use its platform to generate fake support for their regimes for months despite warnings from employees about the disinformation campaigns, an investigation from the Guardian revealed this week.

A loophole in Facebook’s policies allowed government officials around the world to create unlimited numbers of fake “pages” which, unlike user profiles, don’t have to correspond to an actual person – but could still like, comment on, react to, and share content, the Guardian reported.

That loophole let governments spin up armies of what looked like real users who could then artificially generate support for and amplify pro-government content, what the Guardian called “the digital equivalent of bussing in a fake crowd for a speech.”

Sophie Zhang, a former Facebook data scientist on the company’s integrity team, blew the whistle dozens of times about the loophole, raising her concerns with Facebook executives including vice president of integrity Guy Rosen, according to the Guardian.

BuzzFeed News previously reported on Zhang’s “badge post” – a tradition where departing employees post an internal farewell message to coworkers.

But one of Zhang’s biggest concerns was that Facebook wasn’t paying enough attention to coordinated disinformation networks in authoritarian countries, such as Honduras and Azerbaijan, where elections are less free and more susceptible to state-sponsored disinformation campaigns, the Guardian’s investigation revealed.

Facebook waited 344 days after employees sounded the alarm to take action in the Honduras case, and 426 days in Azerbaijan, and in some cases took no action, the investigation found.

But when Zhang raised Facebook’s inaction in Honduras with Rosen, he dismissed her concerns.

“We have literally hundreds or thousands of types of abuse (job security on integrity eh!),” Rosen told Zhang in April 2019, according to the Guardian, adding: “That’s why we should start from the end (top countries, top priority areas, things driving prevalence, etc) and try to somewhat work our way down.”

Rosen told Zhang he agreed with Facebook’s priority areas, which included the US, Western Europe, and “foreign adversaries such as Russia/Iran/etc,” according to the Guardian.

“We fundamentally disagree with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform. We aggressively go after abuse around the world and have specialized teams focused on this work,” Facebook spokesperson Liz Bourgeois told Insider in a statement.

“As a result, we’ve already taken down more than 100 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world, including those in Latin America, the Middle East and North Africa, and in the Asia Pacific region. Combatting coordinated inauthentic behavior is our priority. We’re also addressing the problems of spam and fake engagement. We investigate each issue before taking action or making public claims about them,” she said.

However, Facebook didn’t dispute any of Zhang’s factual claims in the Guardian investigation.

Facebook pledged to tackle election-related misinformation and disinformation after the Cambridge Analytica scandal and Russia’s use of its platform to sow division among American voters ahead of the 2016 US presidential elections.

“Since then, we’ve focused on improving our defenses and making it much harder for anyone to interfere in elections,” CEO Mark Zuckerberg wrote in a 2018 op-ed for The Washington Post.

“Key to our efforts has been finding and removing fake accounts – the source of much of the abuse, including misinformation. Bad actors can use computers to generate these in bulk. But with advances in artificial intelligence, we now block millions of fake accounts every day as they are being created so they can’t be used to spread spam, false news or inauthentic ads,” Zuckerberg added.

But the Guardian’s investigation showed Facebook is still delaying or refusing to take action against state-sponsored disinformation campaigns in dozens of countries, where thousands of fake accounts have created hundreds of thousands of fake likes.

And even in supposedly high-priority areas, like the US, researchers have found Facebook has allowed key disinformation sources to expand their reach over the years.

A March report from Avaaz found “Facebook could have prevented 10.1 billion estimated views for top-performing pages that repeatedly shared misinformation” ahead of the 2020 US elections had it acted earlier to limit their reach.

“Failure to downgrade the reach of these pages and to limit their ability to advertise in the year before the election meant Facebook allowed them to almost triple their monthly interactions, from 97 million interactions in October 2019 to 277.9 million interactions in October 2020,” Avaaz found.

Facebook admits that around 5% of its accounts are fake, a number that hasn’t gone down since 2019, according to The New York Times. And MIT Technology Review’s Karen Hao reported in March that Facebook still doesn’t have a centralized team dedicated to ensuring its AI systems and algorithms reduce the spread of misinformation.
