“Genshin Impact,” an adventure video game, is among the breakout hits of pandemic-era gaming. Released in September 2020 by Chinese developer miHoYo for mobile, PC, and PlayStation 4, the game, similar in style to “The Legend of Zelda: Breath of the Wild,” quickly found a devoted fan base. In its first month, the free-to-play title, which earns money through in-game purchases, made over $245 million. It has since brought in over $1 billion in revenue.
But the massively popular game is facing backlash over what many Twitter users have referred to as problematic features, including its adult characters engaging in romantic relationships with characters that appear to be minors and language that many said was racially insensitive.
On Monday evening, #BoycottGenshin trended on Twitter due to the criticism. Over 65,000 tweets used the hashtag in under 24 hours.
miHoYo, the game developer, did not respond to Insider’s request for comment.
Players are questioning the inspiration for ‘humanoid monster’ characters
Within the world of “Genshin Impact,” Hilichurls are some of the first enemies players face, with variants found throughout the title. The game describes them as “primitive humanoid monsters” that are “simply incapable of communication.”
Critics claim, based on footage released by the studio, that the bestial characters could be based on indigenous peoples.
An October 2020 studio tour video released by miHoYo shows an animator creating the Hilichurls’ dance by watching footage of what appears to be an indigenous community dancing. The source of the footage was not immediately clear in the blurry video.
The clip went viral, reaching more than 600,000 views on Twitter in less than one day. One Twitter user whose tweet went viral said that as an “indigenous person,” the villainous characters appearing to take inspiration from a native culture “is absolutely not okay.”
The game’s depiction of characters with darker skin is facing backlash
“Genshin Impact” features 30 playable characters, most of whom are pale-skinned anime archetypes. Xinyan and Kaeya are the only two currently available characters with darker skin tones.
Kaeya, an ice-wielding cavalry captain, is described as “exotic” in the game, while Xinyan, a musician with a penchant for flames, is considered “scary” in the lore of the game world, according to the game’s fan wiki.
Many Twitter users using the “#BoycottGenshin” hashtag referenced these characters and their descriptions, accusing the game of colorism.
Some players criticized what they viewed as a relationship between an adult and a child
There are also claims that “Genshin Impact” included a relationship between an adult and a child. In the game, Ulfr, an adult character, says they want to build a “dandelion boat” for the childlike Flora character. The game does not specify Flora’s age, but she appears to be a minor.
“Genshin Impact” is also criticized for the money it rakes in as a gacha game, which generates revenue by having users spend real-world money for a chance at unlocking a powerful character within the world of the game. This system, referred to as “gambling” by some players on Twitter, leads to people paying hundreds of dollars without unlocking the characters they are looking for.
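The complaint comes down to a repeated low-probability draw: each paid pull has a small chance of granting the wanted character, so large sums can be spent with nothing to show for it. A toy simulation illustrates the math; every rate, price, and budget below is invented for illustration and is not miHoYo's actual pricing or drop rate.

```python
import random

# Hypothetical gacha "banner": each pull costs real money and has a small,
# fixed chance of granting the featured character. Numbers are illustrative
# only, not Genshin Impact's actual rates or prices.
PULL_COST_USD = 2.00      # assumed cost per pull
FEATURED_RATE = 0.006     # assumed 0.6% chance per pull

def spend_until_unlock(rng, max_budget_usd):
    """Pull until the featured character drops or the budget runs out.

    Returns (unlocked, dollars_spent).
    """
    spent = 0.0
    while spent + PULL_COST_USD <= max_budget_usd:
        spent += PULL_COST_USD
        if rng.random() < FEATURED_RATE:
            return True, spent
    return False, spent

rng = random.Random(42)
trials = [spend_until_unlock(rng, 300.00) for _ in range(1000)]
missed = sum(1 for unlocked, _ in trials if not unlocked)
print(f"{missed}/1000 simulated players spent $300 without the character")
```

With these toy numbers a $300 budget buys 150 pulls, and the chance of never hitting the 0.6% draw is (1 − 0.006)^150, roughly 40% — which is why some players describe the system as gambling.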
Though the hashtag has plenty of people upset at the title, a fair number of users believe the criticism is unfounded or unnecessary, and urged players to instead focus on the game’s other issues, like its lack of two-factor authentication, which would help protect accounts.
On Thursday, April 1, James Charles – one of the biggest beauty influencers in the world, with more than 25 million YouTube subscribers and 36 million TikTok followers – posted a video acknowledging that he had sexually charged conversations with two 16-year-old boys whom he added on Snapchat. He said that at the time he believed them to be 18 and that he blocked them once he became aware of their age.
A number of similar accusations against Charles have been made in recent weeks, primarily in the form of viral TikTok videos posted by the boys who say they had flirtatious or sexual conversations with Charles. In his apology video, Charles said he had been “reckless” and that he needed to “take accountability for my actions and most importantly apologize to the people that were affected by them.”
Now, pressure is mounting on brands that have previously aligned themselves with Charles to speak out and distance themselves from him. Followers are singling out Morphe, a makeup company that has been closely associated with Charles since it released a hugely successful eyeshadow palette in collaboration with him in November 2018. Morphe has since launched a mini version of the palette, as well as two brush sets in collaboration with Charles. One of the brush collections, which is priced at $149, is currently sold out in the US; the rest of the products are still available to purchase on Morphe’s website.
On Sunday, internet reporter Def Noodles, who has close to 100,000 Twitter followers and has amplified the accusers’ stories since they began to emerge, tagged Morphe in a tweet which laid out the accusations against Charles.
It follows similar criticism from Def Noodles over a post on Morphe’s Instagram on Saturday, April 3, which promoted the James Charles palette. Fellow influencer Trisha Paytas, who has used her platform of 1.4 million YouTube subscribers and 5.6 million TikTok followers to frequently criticize Charles, also tweeted to denounce Morphe, calling its continued association with Charles “embarrassing.”
Morphe has yet to address the allegations against Charles or his statements in his apology video. On the brand’s website, a page entitled Morphe x James Charles says: “Step into the crazy-colorful world of our Morphe Babe, James Charles. It’s so, so good!”
“Morphe Babe” is the term used by the brand for loyal customers and influencers it works with. Morphe is known for being one of the first beauty brands to heavily work with online content creators, both in the form of product collaborations and also affiliate programs, which allow influencers to make a small percentage of the revenue of each sale which comes via their unique code. This form of monetization has been criticized for potentially encouraging influencers to promote brands they are affiliated with over other products.
This is not the first time Morphe has found itself facing criticism for its links to controversial influencers. Just last year the brand cut ties with YouTuber Jeffree Star, whose makeup brand was sold in Morphe’s stores, amid controversy following the release of his makeup collaboration with fellow YouTuber Shane Dawson. Dawson and Star – two of the biggest names on the platform with a combined YouTube following of more than 35 million people – became the focus of intense scrutiny last year, much of which focused on Dawson’s past behavior in YouTube videos, which ranged from using blackface to making jokes about pedophilia.
Insider has reached out to Morphe for comment on its ties to James Charles and has yet to hear back.
As a young woman, Benscoter spent five years in the Unification Church but left in 1979 after her increasingly desperate family arranged to have her deprogrammed.
Reflecting on this time, Benscoter said she is all too familiar with the “shame and indignation” that a person experiences when they leave a cult.
This led her to set up a nonprofit called Antidote, which runs support groups for people who have been immersed in cultic ideologies and are trying to reconnect with reality.
In recent months, and especially since the Capitol riot on January 6, Benscoter has become the last hope for relatives of people lost to the QAnon delusion.
Her work is as relevant as ever. According to a recent Ipsos poll, more than 40% of Americans said they believe the deep state, a term used by QAnon and other conspiracy theories, is actively working to undermine former President Donald Trump.
Around 17% of participants said they think that a “group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media” – another core belief of the discredited conspiracy theory.
Recent news reports have described how the QAnon conspiracy theory has seeped into all corners of society, ripping apart families, devastating friendships, and breaking up romantic partnerships.
Benscoter told Insider that her inbox has been flooded with around 100 new emails a day. Most of them are from concerned family members of QAnon followers who are desperate to build bridges.
While there is no quick fix, Benscoter said she is able to give people the tools to “begin to break down barriers … so that they can be more effective in helping their radicalized loved one get to the point where they might consider the possibility that they’ve been taken advantage of or lied to,” she said.
One approach Benscoter teaches family members is to speak to their loved ones in a gentle, non-judgmental, and open-minded manner.
“You can’t argue facts, because they’ve already come to believe that any source of information outside of what they’re digesting is a lie, and is evil,” Benscoter said. “But if you can have them take a look at the possibility that maybe they were tricked or taken advantage of or lied to, then that’s already a good starting point.”
Another former Moonie, Dr. Steven Hassan, a mental health counselor and author of “The Cult of Trump”, has made educating people about mind control his life’s work after he left the Unification Church in the 1970s.
He told Insider he too has witnessed a dramatic increase of interest in his work ever since QAnon. Like Benscoter, Hassan believes it’s important to not stigmatize those who have fallen prey to disinformation.
“The public tends to blame the victim when trying to understand why they get involved with unhealthy cult groups and they believe people are weak or stupid or something’s wrong with them,” said Hassan.
“When really, it’s more a case of they were lied to, and were incrementally influenced to adopt certain beliefs and behaviors. And that, in a sense, is not their fault.”
Hassan recently launched a hashtag campaign, #IGotOut, which he hopes will make it easier for QAnon followers to seek help.
Leaving a cult is by no means easy, he said, but it is possible. For him, that moment came when he woke up as he was driving into the back of a trailer truck after he had been deprived of sleep for days.
The near-fatal accident kept him away from the group for three weeks, during which he reached out to his family, who then organized an intervention.
Hassan is hopeful that other family members can also help those who have been lost to QAnon but believes it is ultimately up to the person themselves to leave.
“Family members can help, the media can help, but in the end, I think people almost always get themselves out of cults. Not because someone’s pushing them, but because they realize with time that it isn’t what they thought it was,” Hassan said.
“In the meantime, I’m encouraging family members and friends to get educated about cults, and how to talk effectively and strategically with people involved in these groups and engage them with love and respect, and curiosity and asking questions in a non-confrontational way.”
Facebook-owned Instagram is planning to build a version of its app targeted specifically toward children under 13, BuzzFeed News reported Thursday.
“We have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Instagram vice president of product Vishal Shah said in an internal memo, according to BuzzFeed.
“We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time,” Shah added, according to BuzzFeed.
Currently, Instagram’s policies prohibit children under 13 from using the app, though a parent or guardian can manage an account on their behalf.
BuzzFeed News reported the kid-focused version will be overseen by Instagram head Adam Mosseri and led by Pavni Diwanji, a Facebook vice president who previously led YouTube Kids and other child-focused products at the Google subsidiary.
“Increasingly kids are asking their parents if they can join apps that help them keep up with their friends. Right now there aren’t many options for parents, so we’re working on building additional products – like we did with Messenger Kids – that are suitable for kids, managed by parents,” a Facebook spokesperson told Insider in a statement.
“We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more,” they added.
But Facebook’s push to draw young children into its app ecosystem is likely to draw scrutiny given its track record on privacy, preventing abuse and harassment, and scandals involving its Messenger Kids app.
BuzzFeed’s report comes just days after Instagram published a blog post introducing new child safety features, including AI-powered tools to guess users’ ages – despite acknowledging “verifying people’s age online is complex and something many in our industry are grappling with.”
Facebook’s stepped-up efforts to protect children follow years of reports that rampant bullying, child sex abuse material, and child exploitation exists on its platform, and some research suggests the problem may be getting worse.
Facebook said in January that its AI systems “proactively” catch 99% of child exploitation content before it’s reported by users or researchers – however, that number doesn’t account for content that goes unreported.
In 2019, the Federal Trade Commission hit Facebook with a $5 billion fine over privacy violations – though privacy advocates have argued that it did little to prevent Facebook from scooping up user data.
Other tech platforms have had missteps as well when it comes to protecting children’s privacy online. Google reached a $170 million settlement with the FTC to settle allegations that YouTube illegally collected kids’ data without their parents’ consent. In September, a British researcher filed a $3 billion lawsuit against YouTube, alleging it illegally showed “addictive” content to children under the age of 13 and harvested their data for targeted ads.
DrWitnesser, the online alias of internet pastor and streamer Joseph Hennig, announced on Twitter that he intends to pursue legal action against Twitch.
Hennig, a member of the Seventh-day Adventist Church, a strict Christian denomination, started streaming on the Amazon-owned platform in April 2020 under the DrWitnesser moniker. Influenced by the popularity of the mustached alpha bro DrDisrespect, Hennig joined random Fortnite squads and preached the word of God, whether his teammates wanted to hear it or not. Dressed in a black button-up and an orange tie, his image is eye-catching, if not dramatic.
Over the next five months, he amassed over 140,000 followers on the platform, who would watch him tell random teenagers and strangers not to say the lord’s name in vain or that they need to repent for their sins. His TikTok grew alongside his Twitch fame, reaching over 350,000 followers in just the latter half of the year.
In July 2020, Hennig received his first Twitch suspension, lasting seven days. The action came after he told a young Muslim player in his game that “if you were to die in your sins today, you would be sentenced to hell.”
In a Twitter video, Hennig says that sometime after his first ban, his Twitch account received another seven-day suspension, for reasons that remain unknown.
An indefinite ban would come Hennig’s way in January 2021, for “engaging in hateful conduct against a person or group of people” according to a message he received from Twitch. In a now-deleted tweet, Hennig wrote “Twitch has yet again showed they have no tolerance for a #Christian streamer who preaches what the #Bible teaches. They are a bias, hypocritical organization that shoves their agenda down everyone’s throats.”
After his Twitch ban, the preacher moved to the controversial platform DLive before transitioning to YouTube, where he has 14,000 subscribers. His clips still gained the attention of critics, with YouTuber Kurtis Conner sharing a video about the DrWitnesser streams with his 2.9 million subscribers.
“My guess is he plays Fortnite to get to these kids when they are young and impressionable and scared,” Conner says in the video. “It’s really gross, it’s really f— weird man.”
On Sunday, Hennig announced on Twitter that he is planning on suing Twitch for “unlawful termination of my Twitch account on the basis of religious discrimination.” The screenshot he shared shows the Superior Court of California, County of San Francisco, which currently does not have a lawsuit on file under Hennig or the DrWitnesser name. The streamer also tweeted that he is looking to find the “right firm for legal representation.”
“Don’t be disappointed,” wrote one subscriber on a popular QAnon Telegram channel late Thursday night. “The race is not run yet and I have reason to believe March 20 is also possible.”
Another believer posted a similarly optimistic message. “We still have 16 days,” they wrote. “Lots can happen between now and then!”
With the uneventful passing of March 4, a highly anticipated date for the conspiracy group, QAnon followers spent Friday morning urging one another to look forward and “keep the faith.”
QAnon’s March 4 failure
When “the Storm” – the promise of mass arrests and executions on Joe Biden’s Inauguration Day – amounted to nothing, followers of the QAnon conspiracy theory scrambled for a new date to imagine Trump’s fictional swearing-in ceremony.
March 4, like several fruitless dates that preceded it, was born out of a convoluted political fantasy.
QAnon adherents borrowed from the obscure US-based sovereign-citizen movement to suggest that Trump would return to power on March 4, 2021. Sovereign citizens “believe that they get to decide which laws to obey and which to ignore,” according to the Southern Poverty Law Center, a nonprofit organization that tracks extremism.
The conspiracy-theory movement will continue to invent new dates to look forward to, or else its years of obsessive belief will all have been for naught, experts on the far right say.
“Reality doesn’t really matter,” Nick Backovic, a contributing editor at fact-checking website Logically, where he researches misinformation and disinformation, told Insider. “Whether QAnon can survive another great disappointment, there’s no question – it can.”
The March 4 theory is rooted in a bizarre belief that argues all laws after the 14th Amendment, ratified in 1868, are illegitimate.
The 20th Amendment, which moved Inauguration Day from March 4 to January 20, is viewed by sovereign citizens as invalid.
Therefore, proponents of this conspiracy theory insisted that Trump would restore a republic that has been out of action for over 150 years, on the day when presidents were once sworn in.
Travis View, a conspiracy theory expert and host of the QAnon Anonymous podcast, previously told Insider that it’s based on a “blind faith” that Trump can “fix everything.”
A series of no-shows
Before March 4, the QAnon calendar was marked with a string of dates once hailed as moments of reckoning that came and went without incident.
In 2017, the first “Q drop” – the cryptic messages from the anonymous “Q” figure whose guidance runs the movement – claimed that former Secretary of State Hillary Clinton would be arrested because of an unfounded allegation that she was involved in child sex trafficking. This, of course, never happened, but the QAnon conspiracy theory was born.
Then, in a bid to reconcile their belief that Trump would remain president, followers cast January 6 – the day that became a deadly insurrection at the US Capitol – as a precursor to “The Storm,” a violent event that would supposedly end in the execution of child-abusing elites.
The goalpost was then moved to January 20, based on the claim that Trump would seize power prior to Biden taking his oath.
But Trump was not inaugurated again on January 20 and instead left Washington to move down to his Florida home. In the hours after Biden’s inauguration, some QAnon believers were left confused and crestfallen.
Mental gymnastics ensued, with some QAnon influencers arguing that Biden’s inauguration had happened in a Hollywood studio and was therefore invalid; others claimed that Trump sent signals during his final pre-inauguration address indicating that he’d remain in office. These influencers again promoted to their followers the idea that somehow, their theory was not yet over.
“QAnon is dealing with a very difficult cognitive dissonance situation,” Michael Barkun, professor emeritus of political science at Syracuse University, told Insider.
Naturally, some believers become fed up with failures
A Wednesday post on a QAnon Telegram channel with nearly 200,000 subscribers called the plan “BS,” though the same page had previously told its followers that the “new Republic” would begin on March 4.
Another top conspiracy theorist told their 71,000 subscribers on Wednesday morning that a “Q drop” contained a hint that the March 4 conspiracy theory was a false flag. “March 4 is a Trap,” the post said.
Whenever QAnon’s prophecies are proven wrong, the movement does lose some support, Backovic said.
In the days after President Biden’s inauguration, many QAnon believers did express a desire to leave the movement, fed up with the lies they’d been told. Even Ron Watkins, once QAnon’s top source for voter-fraud misinformation, told his 134,000 Telegram subscribers in the afternoon of January 20, “Now we need to keep our chins up and go back to our lives as best we are able.”
QAnon influencers calling the March 4 conspiracy a “false flag” also helps place blame on others in case things go awry like they did on January 6. Finding a scapegoat is a common tactic for extremists, according to Backovic.
After the Capitol insurrection, QAnon supporters and other pro-Trump protesters – and several Republicans in Congress – spread the false claim that antifa, the anti-fascist movement, staged the deadly coup attempt on the Capitol.
In addition to focusing on specific dates, QAnon has evolved and adapted to include other conspiracy theories and enter more conventional spaces.
Last spring, the movement pivoted to focus on ending human trafficking, making “Save the Children” its new battle cry. QAnon spread across mainstream social media, including Instagram, where lifestyle influencers amplified it.
With nothing happening on March 4, believers look forward (again)
The latest disappointment has already resulted in new dates being introduced with increasingly desperate explanations.
Some QAnon influencers have suggested that March 20 is when Trump will seize control, misinterpreting the Presidential Transition Enhancement Act of 2019, which streamlines the presidential transition by providing certain services to the previous administration 60 days after the inauguration.
The claim, first made on a popular QAnon Telegram channel, appeared to be gaining ground with supporters offline, too. A QAnon supporter interviewed by The Washington Post’s Dave Weigel said he believes Trump remains in command of the military and will be inaugurated on the 20th.
But core followers of the conspiracy theory are reluctant to throw all their weight behind a particular date.
In another Telegram message board for QAnon believers, one post encouraged people to remain open-minded about Q’s plan. “Dates for late March, April, May, and more dates in the fall have been tossed out there,” the post said. “While we can speculate and hope, no specific dates have been landed on… don’t get caught up in the dates, watch what’s happening.”
Some, tempered by repeated disappointment, are simply set on a resounding victory for Trump in 2024.
“Whether it’s some date in March or whether ultimately it will be a second Trump term after an election in 2024,” Barkun told Insider. “There will be some further set of explanations and a further set of dates.”
Almost a year ago, Chelsea Brickham posted on TikTok for the first time.
The video got more than 500,000 views. Brickham, a 38-year-old trans woman living in Florida, posted photos of her transition after seeing other trans creators do the same. About 2,000 positive and encouraging comments appeared underneath the video.
“My initial reaction to TikTok was that it was such a positive and nurturing environment,” she told Insider. “And that’s why that actually saved me. It pulled me out of that dark place at that moment. It really did wonders for my mental health.”
Brickham was days away from getting her long-awaited gender-affirming surgeries when the coronavirus pandemic caused the hospital to cancel them. Facing the cancellation, the costs of private health insurance, and a shift to telework, Brickham turned to TikTok for “some kind of distraction, and maybe brief levity,” she said.
“It kind of takes my breath away – even now, thinking back in retrospect – because every single one of those 2,500 comments was supportive and positive and just telling me things I needed to hear,” Brickham said.
Months later, the positivity came to a screeching halt.
One of her recent videos, which got more than 1 million views and wasn’t unlike the rest of her content, led to a flood of transphobic and other attacks on her appearance.
Brickham said the experience shattered her perception of the app. She wasn’t sure why this video, in which she responded to a commenter who had misgendered her, had elicited such a different response.
Most of the comments appeared to come from young, straight, cisgender men who misgendered her, she said. For these types of comments, Brickham said, she often visited the commenter’s profile to educate them.
“I just kind of deal with it with a factual, straightforward approach,” she said, adding that she often tells transphobic commenters they don’t have the “credentials” to make claims about her gender.
In one particularly egregious comment, which Brickham reported, a TikTok user said it was a “shame” that the cancer she’d recently had didn’t kill her.
Trans TikTokers find community, but also abuse and harassment
In January 2020, The Washington Post dubbed TikTok “the soul of the LGBTQ internet,” adding that young LGBTQ people used TikTok “to share their raw feelings with each other” in a way not seen on legacy social-media platforms. As of this February, videos using the hashtag “#lgbtq” had more than 665 million views.
TikTok, owned by the Chinese company ByteDance, has publicly aligned itself with LGBTQ communities, and last year it donated $3 million to LGBTQ-focused organizations such as GLAAD and the Trevor Project.
But transgender creators say TikTok is an unwitting accelerator for transphobia and harassment.
Half a dozen trans TikTok creators, with a combined follower count of more than 3.1 million, told Insider that while the app had allowed them to build impressive followings and find a sense of community, its design appeared to perpetuate a culture of transphobia and harassment.
Creators detailed the harassment and abuse they’d experienced on the app; they all said they had experienced it to a greater degree on TikTok than on other social media platforms, in part because of the app’s central algorithm-driven feed. TikTok features – like duets, which allow users to respond to another user’s videos – have also been a tool for harassers.
Their concerns and experiences raise questions about TikTok’s ability to moderate content on the app.
Trans creators said their experience soured weeks after they began posting on TikTok
Last spring, before COVID-19 travel restrictions were imposed, Madelyn Whitley, a 20-year-old transgender woman and model living in New York, joined TikTok. She and her twin sister had traveled to France for fashion week and were living there.
Whitley, who had about 10 followers, posted a video of her and her sister, also a trans woman and model, for Trans Day of Visibility, a holiday that honors and recognizes trans people. The video “barely took off,” gaining about 20,000 likes, she said. But the attention skyrocketed from there.
“Everyone was so nice on that first video,” said Whitley, who as of February had 300,000 followers on the app. “And then I think down the line, maybe in September, I posted another one that had the complete opposite reaction – most of it was negative.
“I remember my first hate comment,” she added. The commenter had misgendered her and told her she was going to hell, Whitley said.
“I don’t know why,” Whitley added, “but some of the comments can get really, really transphobic, and we haven’t really experienced this anywhere else.”
Hateful behavior on TikTok can take many forms, including comments, collaborations, and direct messages.
A TikTok representative told Insider in an emailed statement that the platform “is a community with millions of diverse creators, and the platform wouldn’t be what it is today without the range of voices and experiences our users bring.”
“There is no place for hate and harassment on TikTok, and we’re committed to creating a safe space for our users, continually improving our protections for the LGBTQ community and other underrepresented groups, and being an active ally,” the statement said.
In December, the company said it was updating its community guidelines to make them more “inclusive and thoughtful,” adding rules and updating policies to prohibit doxxing, cyberstalking, and sexual harassment.
But negative messages are “as small as ignorant comments of people just commenting one word, ‘woman,’ or, like, saying I can never be a man,” Aiden Mann, a 26-year-old transgender man from Tennessee who has 2.2 million TikTok followers, told Insider.
“I’ve had people who messaged me on an anonymous account and, in detail, explain to me if they had the opportunity to kill me how they would do it,” he added. Mann said others had suggested he end his life by suicide.
The ‘wrong side’ of TikTok
In essence, TikTok functions as a custom cable network. The app’s algorithm acts as a network executive, deciding which videos are shown to which users, based largely on what it infers about each user’s taste from their past behavior.
Central to TikTok is the “For You” page. While users can watch videos from a list of accounts they follow, the primary means of consumption is the seemingly infinite stream of videos found on the page.
Platforms like Facebook, Twitter, Snapchat, and especially YouTube similarly offer content, curated by an algorithm, from sources beyond the users a person has followed. But none uses such a system as its primary content driver in the way that TikTok does.
“I like to think of TikTok as a broadcast platform, like a channel that you’re watching rather than a social network,” said Daniel Sinclair, an independent researcher who studies TikTok and other social-media platforms, “because although you do have access to your following, TikTok is still controlling what you see.”
It is “entirely possible” that the “For You” page, led by TikTok’s algorithm and human moderators, could inadvertently lead to harassment, Sinclair said.
“I think the broadcast-first distinction is big because it’s TikTok that’s directing content and directing what you see more than many other platforms,” he added.
TikTok first shows a video to a small batch of people it thinks will be interested in it, based on a list of factors the company has outlined in a blog post. Some of these users already follow the creator, while others don’t. Videos from accounts with larger followings may have an advantage, but “neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system,” the company said.
If the video performs well (users like or share the clip, or watch the entire video), the algorithm recommends it to more people. The process is repeated; if a video continues to be popular, it can quickly go viral.
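That test-batch-then-expand process can be pictured as a loop: show a video to a small audience, measure engagement, and widen the audience only while the response stays strong. The sketch below is an illustrative simplification, not TikTok's actual system; the batch sizes, growth factor, and engagement threshold are invented for the example.

```python
# Toy sketch of a staged "test batch, then expand" recommendation rollout.
# This is NOT TikTok's real algorithm; all parameters here are invented.

def staged_rollout(video, users, engagement_rate, batch_size=100,
                   threshold=0.1, growth=10, max_reach=1_000_000):
    """Show `video` to successively larger batches while engagement holds up.

    `engagement_rate` stands in for the real system's blend of likes,
    shares, and watch time; here it is a single supplied function
    returning a rate between 0.0 and 1.0 for a batch of users.
    """
    reach = 0
    batch = batch_size
    while reach < max_reach and batch <= len(users) - reach:
        shown = users[reach:reach + batch]
        engaged = engagement_rate(video, shown)
        reach += len(shown)
        if engaged < threshold:
            break                # weak response: stop spreading the video
        batch *= growth          # strong response: widen the next audience
    return reach
```

Under this toy model, a video that keeps performing well reaches exponentially more users with each round, which matches the article's point that small accounts can hit thousands or millions of strangers within hours, for better or worse.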
The page has been credited with driving TikTok’s meteoric rise since the app emerged from Musical.ly in 2018, about a year after ByteDance purchased it. It has paved the way for TikTok’s culture of uber-fast virality and blink-and-you’ll-miss-it trends. Users with small followings have the opportunity for viral fame – their videos can be distributed to thousands and sometimes millions of strangers within hours.
Carolina Are, who researches online moderation and algorithmic bias at City, University of London, told Insider that TikTok differed from other platforms like Instagram because the app’s algorithm and focus on short videos made it easier for content to go viral.
“Because of that, because there’s no meaningful interaction, it feels like creators do not look human to people who comment, and therefore it feels very easy to just hate,” she said.
Trans creators told Insider that the “For You” page allowed them to quickly find a community and support on the app.
But there’s a hefty con to the page, they said, in that they have little control over their audience, and the audience has little control over what shows up on their screen.
“I’m starting to understand that there are different facets of going viral on TikTok,” Brickham said, alluding to the concept of “straight TikTok” versus “alt TikTok,” in which users experience vastly different types of content, memes, trends, and creators.
“There’s obviously the GLBT-positive sort of feed. And then there’s obviously, like, the conservative side and the Trump feed. And you’ve got the heterosexual sort of feed as well,” she said.
Mann also described an “LGBTQ side” of the platform where his videos often remained. But he said his experience would swiftly sour if his content ended up elsewhere.
Fletcher Furst, an 18-year-old from Alberta, Canada, argued that the algorithm behind the “For You” page was just part of the story. Furst speculated that transphobic users search for content from trans creators via hashtags like #lgbtq or #trans, leading TikTok to recommend similar content to them in the future.
Creators told Insider that the transphobia they faced on TikTok was more intense than on other social-media platforms. Unlike other apps that rely largely on a connection between creators and their followers, TikTok creators broadcast to communities that can include not only their followers but legions of people with similar interests who’ve never seen their content before.
Suddenly, the creators said, TikTok videos can end up in an entirely different community.
“Sometimes for some reason – I have no idea why – my transgender videos end up on straight TikTok, or the conservative side of TikTok, or religious TikTok,” Mann said. “And then I get the really bad bashing and hateful comments and death threats and stuff like that.”
In contrast, he said, “on Instagram, the only people who are going to see your posts, more than likely, are the people that are following you, and then same with Twitter and Facebook.”
On other platforms, “people can share stuff, and they can get to the wrong side, but it’s a lot more difficult,” he said. “With TikTok, your video can end up on the wrong side of TikTok any day, at any time. Then when it blows up, it goes on and on.”
Trans creators say their videos were removed while abusive content remained
Samuel Monger, a 17-year-old trans man from Oregon, estimated that about 10 of his videos had been removed from TikTok, for reasons that weren’t exactly clear to him. He said the deleted videos weren’t sexual or violent but involved him speaking about his experiences as a trans person.
He said TikTok had told him that these videos violated its community guidelines. He appealed, but the videos weren’t reinstated, leaving him frustrated. He tried to re-upload videos, and they were deleted again, Monger said. He was confused about why the videos were removed in the first place, but he moved on to new content.
In one video, which the company reinstated after Insider inquired about its removal, Monger showed off different facets of his style, modeling dressed-down and dressed-up outfits.
He said other trans creators had faced similar punishments when trying to, for example, educate trans youth on how to safely bind their chest to create a more masculine or nonbinary appearance.
Monger said that while he’d never shown his chest on TikTok, it was frustrating to see cisgender men – often some of TikTok’s biggest stars – appearing shirtless in videos, “advertising their bodies.”
TikTok has previously been criticized over its moderation policies. Last March, The Intercept reported that a company memo had, in some markets, directed moderators to keep videos from users they judged to be disabled, poor, or ugly off the “For You” page. At the time, the company said that the policies were an early attempt at preventing bullying, that they were no longer in use, and that they had never been implemented in the US.
A study by the Australian Strategic Policy Institute’s International Cyber Policy Centre published in September found that hashtags related to LGBTQ issues had been suppressed on the platform in at least eight languages.
Sinclair speculated that the policies were designed not to mitigate harassment but to limit content the company viewed as “unsightly.” He said it pointed toward a larger issue as TikTok’s Chinese parent company expanded to new markets and navigated content moderation.
Furst told Insider he’d also had several videos removed and was told that they’d violated TikTok’s community guidelines.
“I had a video where I tried to speak up on my experience being bullied in high school for being transgender,” he said. “And that video got taken down right away. I don’t know why. They just said it went against their guidelines.
“Maybe ’cause I just mentioned being trans, but then there are videos that are still up of people encouraging harm towards trans people, and it’s just insane,” he added. “It’s like, how come that stays up and my content gets taken down?”
The video, first uploaded in November, was reinstated by TikTok in February after Insider asked the company about its removal.
In August, Eric Han, TikTok’s US head of safety, said that since January it had removed more than 380,000 videos, 64,000 comments, and 1,300 accounts for violating its policies on hate speech.
“To be clear, these numbers don’t reflect a 100% success rate in catching every piece of hateful content or behavior, but they do indicate our commitment to action,” Han said.
Han said TikTok was updating its hate-speech policy, removing hateful content from the app, “increasing cultural awareness” in content moderation, improving transparency, and working with its teams and partners “to invest in our ability to detect and triage hateful or abusive behavior to our enforcement teams as quickly as possible.”
Han also said the company was training its content moderators on the difference between a marginalized group using a slur “as a term of empowerment” and a person using the same word hatefully.
“Educating our content moderation teams on these important distinctions is ongoing work, and we strive to get this right for our users,” Han said.
Mann said he’d been frustrated by TikTok’s inaction after he reported multiple videos he found transphobic.
“A lot of the videos that I reported come back saying that it’s not against community guidelines. I’m kind of in shock,” he said. “This person is literally making transphobic comments or making transphobic jokes. How is that not discrimination?”
Furst said TikTok would be more inclusive if it allowed creators to designate their videos as “educational,” to “be able to educate people about trans stuff without it being taken as sexual and then be taken down.”
All the creators who spoke with Insider said TikTok could change its community guidelines to better protect trans users.
Otherwise, Furst said, “it definitely feels like that app just wasn’t created for you.”
Hateful comments on TikTok can have real-life effects on trans communities
“Trans youth are continually being retraumatized through harassment that they experience both in the world that they live in and also when they show up online,” Dr. Ric Matthews, a psychotherapist in New York who works with LGBTQ communities, told Insider.
But when it comes to apps like TikTok, “not using these platforms really isn’t an option at this point,” Matthews said. “It’s an inescapable way of connecting and communicating and a necessity for social survival.”
Matthews added that “when harassment, bullying, and different types of violence that they experience in these spaces happens, it’s exacerbating isolation and alienation to people who are already battling to have safety in spaces that they occupy physically.”
Harassment on social media can also set back trans youth who are developing their identities, said Dr. Melissa Robinson-Brown, a psychologist in New York who works with young people.
“I think one of the reasons it’s pretty harmful is because especially with our generation today, so much of their time is spent on social media and on platforms like TikTok,” she told Insider.
“They’re building their communities, finding their tribe and their friends,” she added. “And so to see the transphobia, to see the negativity and the discrimination, can really be harmful to self-esteem-building and that sense of self-worth that is really just so critical for youth in general.”
Monger told Insider that while he could typically brush off hate-filled comments, he worried that the transphobia could affect other young and impressionable trans people on TikTok.
“I’m confident in myself, but there are kids who are not confident in their identity,” Monger said. “And seeing people say that they want to kill people really does not help them.”
Trans youth are at a higher risk than their peers of attempting suicide. A study published by the American Academy of Pediatrics in 2018 detailed a survey of about 120,000 young people, conducted from 2012 to 2015, in which 51% of trans boys and 30% of trans girls said they had attempted suicide, compared with 18% of cisgender girls and 10% of cisgender boys.
Trans TikTokers told Insider that hateful messages targeted their appearance or included problematic phrases like “What’s your real name?”
Mann, who said he had struggled with his appearance after multiple top surgeries – procedures to reshape the chest and remove breast tissue that left him with scars – said users left nasty comments about his body.
“People attack that all the time,” Mann said. “I’m still hoping to get them fixed.”
Mann shared with Insider four TikTok videos posted from April to June, each with more than 6,000 likes, that made fun of his chest. He said that he’d reported the videos to TikTok but that they weren’t removed.
TikTok removed all four of the videos after Insider flagged them.
“It makes me mad that they only removed them to seem to cover their a–,” Mann said.
Jade Marie Eichelberger, a 19-year-old trans woman from South Carolina, told Insider that her experience with transphobia on TikTok involved users’ desire to hear details of her transition and the trauma of being a Black trans woman.
“They really want you to talk about everything trans-related, down from the surgeries to how it makes you feel and how people treat you,” she said. “And sometimes you don’t really want to think about that or create about that, because cisgender people are not pressured to make videos about their trauma.
“Especially trans women of color, we’re always pressured to tell stories of things that have happened to us, because people want to use us as an example as to why people should be nicer to trans folks,” she added. “They always go for the people who were the most marginalized within the community to hear those sad and traumatic stories.”
Eichelberger said her videos that homed in on her transition or her experience as a trans woman performed well, but her videos about other topics seemed to fall out of favor.
She and other trans creators often field inappropriate and transphobic requests from TikTok users asking them to show their “real voice” or to upload pictures from their childhood, she said.
She added that while some creators might not have a problem with that, the requests and pressure to make that kind of content were rude and transphobic, implying that her identity is part of a performance.
“I don’t really feel comfortable with doing that,” Eichelberger said. “Am I ashamed of my childhood pictures? Hell to the hell no. I was a cute kid. But because I know why people want to see them, it makes me uncomfortable.”
Whitley said she’d had to alter how she operates on TikTok after receiving a series of transphobic comments that were fueled by a popular TikTok creator.
Chris, known on the app as @Donelij, would use the split-screen duet feature to react to videos of gay and trans creators. Chris’ smile would turn into a frown as videos of people skirting gender norms or photos displaying a person’s transition appeared. His videos would often get more than a million views. When his account was banned, he had 2.5 million followers.
“He just kept dueting them over and over and sending thousands of transphobes to me,” Whitley said.
While TikTok banned his first account last year, Chris continued to post videos to millions of followers using other accounts. TikTok banned an account he was using in February after Insider inquired about it.
Chris told Insider he was “not transphobic” and declined to comment further. Last year, he told The New York Times that he had been the target of racist harassment on TikTok. His facial expressions were meant as jokes, he said.
For Whitley, the videos had consequences that were far from funny.
Whitley said she’d had to limit comments on her content to prevent users from sharing her deadname (the name she went by before her transition), her address, and her phone number, all of which she said people had threatened to post.
Whitley said that since the duets stopped, some of the negative attention had subsided – but she estimated that about half of the comments she receives are negative or outwardly transphobic. She said she’d become “desensitized” to them.
“I don’t take them to heart,” Whitley said. “I’m stronger than that, and it kind of just boosts my engagement. I’m just going to take it as a positive and move on instead of focusing on their negativity.”
Top anti-vaccine advocacy groups received PPP funding from the Trump administration, The Washington Post reported.
American distrust in the safety of COVID-19 vaccinations continues to pose a threat to public health.
K. “Vish” Viswanath, a professor of health communication at the Harvard T.H. Chan School of Public Health, told Insider that anti-vaccine groups are “likely to perpetuate the adverse impacts of the pandemic.”
Five top anti-vaccine advocacy organizations that have spread medical misinformation throughout the COVID-19 pandemic received funding from the Trump administration’s Paycheck Protection Program (PPP), The Washington Post reported Monday.
The loans from the Small Business Administration totaled more than $850,000, according to the report.
K. “Vish” Viswanath, a professor of health communication at the Harvard T.H. Chan School of Public Health, told Insider that to call the loans ironic “doesn’t do justice to my feelings.” He said anti-vaccine groups are “likely to perpetuate the adverse impacts of the pandemic.”
The groups that reportedly received PPP funding were the National Vaccine Information Center (NVIC), Mercola Com Health Resources LLC, Informed Consent Action Network, Children’s Health Defense Co., and the Tenpenny Integrative Medical Center, The Post reported, citing an exclusive report from the Center for Countering Digital Hate, a UK-based advocacy group that fights hate and misinformation online.
“Lending money to these organizations so they can prosper is a sickening use of taxpayer money,” Countering Digital Hate CEO Imran Ahmed told The Washington Post. “These groups are actively working to undermine the national COVID vaccination drive, which will create long-term health problems that are felt most acutely in minority communities and low-income neighborhoods.”
The largest loan – $335,000 – was given to Mercola, a website published by the anti-vaccine activist Joseph Mercola. NewsGuard, a nonprofit that tracks misinformation, reported that the site has “published false claims about standard medical practices such as vaccinations.”
Mercola, a businessman and doctor of osteopathic medicine, is himself a major donor of the NVIC. The Washington Post reported in 2019 that Mercola gave the NVIC $2.9 million, making up roughly 40% of the group’s funding. Mercola has millions of followers on Facebook.
The Pew Research Center said in a December report that about 39% of Americans said they would definitely not, or probably not, get the vaccine. And 21% of American adults surveyed said they were “pretty certain” that new information about COVID-19 vaccination would not change their minds.
Anti-vaccine advocacy groups have played a major role in propagating that distrust, Viswanath said.
Even if these groups qualified for the loans legally – as the Small Business Administration told The Washington Post – it’s a question of whether the loans are “morally” correct, Viswanath said, as they are providing the country with “additional ammunition” to question medical professionals “by exploiting the tremendous scientific achievement of developing the vaccines.”
As supporters of President Donald Trump stormed the US Capitol Building in a riot on Wednesday, one figure stood out among the mob: the “Q Shaman,” aka Jake Angeli.
Angeli, known for wearing red, white, and blue face paint and a horned helmet, has become a notable figure in the QAnon conspiracy-theory movement, popping up at far-right rallies in Arizona in the past year, The Arizona Republic reported.
On Wednesday, Angeli took photos on the Senate dais and marched through the Capitol with a megaphone.
The Arizona man is charged by the United States Attorney’s Office for the District of Columbia with knowingly entering or remaining in any restricted building or grounds without lawful authority, and with violent entry and disorderly conduct on Capitol grounds.
The case is being investigated by the FBI’s Washington field office and the United States Capitol Police.
Angeli’s presence at the riot, along with others wearing QAnon paraphernalia, comes as the conspiracy-theory movement has been responsible for the popularization of Trump’s voter-fraud conspiracy theories.
Angeli told Globe and Mail correspondent Adrian Morrow that police “politely asked him to leave” after they reportedly let him into the building.
QAnon is a far-right conspiracy theory baselessly alleging that Trump is fighting a “deep-state cabal” of pedophiles and human traffickers. The movement behind it has played a massive role in organizing nationwide “Stop the Steal” protests in the two months since President-elect Joe Biden won the 2020 election.
The “Q Shaman” is one of many figures in the world of QAnon whose actions inspire and influence the movement. QAnon originated with an anonymous figure called “Q” who writes cryptic messages on the fringe message board 8kun (previously known as 8chan). As Q has become increasingly hands-off, giving fewer and fewer messages to his devotees, QAnon leaders like Angeli have gained fame and power in the movement.
Later on Wednesday afternoon, Angeli grabbed a microphone outside the Capitol and told people to go home, according to a tweet from Kevin Roose, a technology columnist at The New York Times.
As Trump seeks to undermine the election results, he has been getting much of his information on baseless voter-fraud allegations directly from the QAnon movement. The Dominion voter-fraud conspiracy theory, which baselessly alleges that Dominion Voting Systems interfered with the election, was popularized by Ron Watkins, a previous administrator of 8kun. Watkins’ father, Jim Watkins, has been suspected by some of being “Q,” or at least being associated with the figure (or group).
Wednesday’s riot included Trump supporters espousing QAnon, as well as members of other far-right groups like the Proud Boys. Many popular QAnon accounts were celebrating the Capitol siege on Wednesday, saying it was the first step in some kind of civil war.
Others at the riots were seen wearing QAnon paraphernalia. In one video shared on Twitter, a man in a QAnon shirt appears to be one of the first in a massive group of rioters entering the Capitol.
Amid DC rioting on Wednesday, a pipe bomb was found at the Republican National Committee (RNC) headquarters, and the Democratic National Committee (DNC) headquarters was evacuated over a suspicious package, The New York Times reported.
The RNC and DNC are blocks away from the US Capitol, where pro-Trump rioters attempted a coup.
Another pipe bomb was found and safely detonated in the Capitol complex, CNN reported.
A pipe bomb was found at the headquarters of the Republican National Committee (RNC) in Washington, DC, as pro-Trump rioters stormed the US Capitol on Wednesday, The New York Times reported. Also on Wednesday afternoon, a suspicious package was found at the nearby headquarters of the Democratic National Committee (DNC), The Times said.
The explosive device at the RNC was destroyed by a bomb squad, according to The Times, citing an RNC official. A Democrat told The Times anonymously that the contents of the package had not yet been identified.
Authorities also found a pickup truck parked outside the RNC headquarters that reportedly held rifles, ammunition, and shotguns, The Washington Post reported, citing two sources familiar with the investigation who spoke on the condition of anonymity. Agents were investigating whether the truck was linked to the pipe bombs, the sources said, according to The Post.
The RNC and DNC are both only blocks away from the Capitol.
CNN anchor Jim Sciutto reported that another suspected pipe bomb was found within the Capitol complex, citing an unnamed federal law enforcement official.
Representatives for the RNC and DNC’s press offices did not immediately respond to requests for comment.
At the same time, Trump supporters attempting a coup entered congressional chambers in the Capitol. Vice President Mike Pence, whom Trump had angrily tweeted about earlier on Wednesday, and members of Congress were evacuated from the building, where they had been meeting to certify President-elect Joe Biden’s election win.
As members of Congress urged the mobs to leave the Capitol, many pleaded with Trump to tell his supporters to leave. The FBI and National Guard have been deployed to Washington, and Mayor Muriel Bowser ordered a citywide curfew beginning at 6 pm on Wednesday.
Update: This story has been updated to note The Washington Post’s reporting regarding a truck parked outside the RNC’s headquarters.