Robinhood has settled a wrongful death lawsuit brought over the suicide of a college student who believed he had lost $730,000 on the app.
The company said in its public filing for an IPO on Thursday that the lawsuit was “dismissed with prejudice following a settlement between the parties.” The terms of the settlement were not disclosed in the S-1 filing.
Twenty-year-old Alex Kearns killed himself last June after seeing a negative balance of $730,000 in his account on the stock-trading app. The balance, however, did not reflect his portfolio’s actual value or the debt he owed. It likely stemmed from complex options trades, which can show temporary negative balances while the trades settle over multiple days.
Before his death, Kearns emailed Robinhood asking for help after seeing his balance, according to CBS News.
“I was incorrectly assigned more money than I should have, my bought puts should have covered the puts I sold. Could someone please look into this?” he wrote.
Robinhood replied with an automated message saying the company would get back to him but that its response could be delayed, CBS reports.
The company issued a statement addressing Kearns’ death last June.
“All of us at Robinhood are deeply saddened to hear this terrible news and we reached out to share our condolences with the family over the weekend,” Robinhood said at the time. “We are committed to continuously improving our platform and are reviewing our options offering to determine if any changes may be appropriate.”
Kearns’ parents filed the wrongful death suit in February. Robinhood’s S-1 says the lawsuit “asserted claims for wrongful death, negligent infliction of emotional distress and unfair business practices under a California statute, and sought damages and other relief.” Kearns’ parents reportedly said the app targets young people and encourages them to take part in risky trading.
The widow of John McAfee, the British-American tycoon who died in a Spanish prison this week while awaiting extradition to the US, said Friday that her husband was not suicidal when she last spoke to him hours before he was found dead.
“He would never quit this way, he would never take his life in this way,” Janice McAfee told reporters outside the Brians 2 penitentiary where she recovered her late husband’s belongings. “I don’t believe he did this. I want answers, I will get answers, of how this was able to happen.”
“His last words to me were ‘I love you and I will call you in the evening,’” she said in her first public remarks since the software entrepreneur’s death on Wednesday. “Those words are not words of somebody who is suicidal.”
Authorities in Spain are conducting an autopsy on McAfee’s body but have said that everything at the scene pointed to the 75-year-old having killed himself.
“We were prepared for that decision and had a plan of action already in place to appeal that decision,” Janice McAfee, 38, told reporters. “I blame the US authorities for this tragedy. Because of these politically motivated charges against him, my husband is now dead.”
“All John wanted to do was spend his remaining years fishing and drinking,” she added. “He did not deserve to die in a filthy prison like a caged animal.”
Results of McAfee’s autopsy could take “days or weeks,” authorities have said.
According to his wife Janice, McAfee was well-liked by his fellow inmates, who called him “Papa America” and particularly appreciated his full set of teeth – useful for opening sauce packets and taking the filters out of cigarettes.
He also shared his thoughts about love and power in a video posted six days before his death. It is unclear when the video was made, as it appears to have been recorded in a soundproofed room rather than in prison.
“Humans are compassionate, loving, gracious, kind, generous people,” he said to the camera. “We are simultaneously greedy, jealous, envious, angry – we’re a mixed bag. And if you give one of the human species power, which part of ourselves uses it? Love?” He laughed. “Love does not need power, people.”
A 2019 tweet from McAfee’s account read: “Getting subtle messages from U.S. officials saying, in effect: ‘We’re coming for you McAfee! We’re going to kill yourself.’ I got a tattoo today just in case. If I suicide myself, I didn’t. I was whackd. Check my right arm.”
At the time of the tweet, McAfee was actively evading US tax authorities. He did not explain what “subtle messages” he meant, or provide any evidence of his extraordinary claim that he was a marked man.
On Thursday, the day after his death, McAfee’s lawyer, Javier Villalba, told Reuters that the death was a surprise.
“At no point had he shown any special worry or clue that could let us think this could have happened,” he said.
Earlier in the legal process, McAfee argued against being sent back to the US on the grounds that it would likely mean spending the rest of his life in prison.
The 2019 tweet had been shared more than 33,000 times as of early Thursday, including by Wikileaks and Michelle Malkin, a host on the hard-right channel Newsmax, who retweeted the post with the hashtag “#IBELIEVEJOHNMCAFEE.”
McAfee’s Instagram account, which had recently posted an image of the letter “Q,” was deleted after his death. It is unclear who ran the account or why it was deleted.
McAfee founded his namesake antivirus software company in 1987; Intel bought it in 2010. The mogul had left the company in 1994 and later moved to Belize, where he became a subject of interest in the case of his neighbor’s murder, for which he was later found legally liable.
Before his death, he was due to be extradited as part of a criminal investigation into his tax affairs. McAfee testified against the move on June 15, saying the charges were politically motivated.
Several top QAnon influencers on Telegram, some of whom have hundreds of thousands of followers on the far-right-favorite messaging platform, shared posts on Wednesday afternoon including the word “suicide” in quotes.
“Word on the street, only time will tell if this report was true or not,” an account with 61,000 subscribers shared on the app. Conspiracies alleging that McAfee had a “dead man’s switch” – a mechanism set to release information automatically upon its owner’s death – were also being shared online.
McAfee was imprisoned in Spain pending extradition on tax evasion charges before his death was reported on Wednesday.
A 2019 tweet from the antivirus software mogul’s verified Twitter account appeared to be fueling some of these claims: “If I suicide myself, I didn’t,” the tweet said.
QAnon, a wide-ranging, far-right conspiracy theory based on the false notion that former President Donald Trump had attempted to take down a “deep state” cabal of human traffickers and pedophiles, notably spread a similar theory when disgraced financier and convicted sex offender Jeffrey Epstein died by suicide in 2019: “Epstein didn’t kill himself” eventually became a major online meme.
Hours after McAfee’s death, a similar phrase became a popular hashtag on Twitter.
Online discourse about the McAfee Associates founder also focused on an Instagram post from his verified account on Wednesday afternoon – hours after his death was reported by Reuters – that showed the letter “Q.”
It was not immediately clear who had control of McAfee’s Instagram account. Instagram told Insider it was “looking into” the situation.
McAfee was an American software engineer who founded the anti-virus software company McAfee Associates and ran it until he resigned in 1994.
In 2012, Belize police considered him a “person of interest” in the murder of Gregory Viant Faull, a neighbor of his while he lived on the island. McAfee denied he was involved in the death and fled Belize.
McAfee also ran for president in the US as a libertarian in 2016 and 2020.
In March 2021, the U.S. Attorney’s Office for the Southern District of New York indicted McAfee on charges of fraud and money laundering for running what prosecutors said was a fraudulent cryptocurrency scheme. Federal prosecutors in Tennessee separately charged him with tax evasion.
QAnon leaders weighing in on McAfee’s death comes as no surprise, as the conspiracy theorists frequently repackage news stories to promote their own beliefs. When a rare winter storm hit Texas this year, QAnon influencers claimed that Bill Gates was behind the inclement weather; QAnon hotshots were infuriated when Lil Nas X made headlines for grinding with the devil in a music video and selling shoes that reportedly contained a drop of human blood.
The Catalan justice department confirmed McAfee’s death to Reuters.
Earlier in the day, Spain’s National Court approved the extradition of McAfee to face the US-based charges.
McAfee, who made his fortune selling antivirus software, had been arrested at Barcelona’s airport last October and held in jail while awaiting the outcome of the extradition hearings.
McAfee was accused of evading taxes in Tennessee by failing to disclose money he made from cryptocurrency, speaking fees, and the sale of the rights to his life story for a documentary. The charges carried a sentence of up to 30 years in prison, according to the Associated Press.
McAfee was fighting the extradition, and claimed to the court earlier this month that he would be forced to spend his life in prison if he was sent to the US. Despite being jailed, the outspoken mogul’s Twitter account remained active, praising cryptocurrency and railing against his prosecution.
After launching the antivirus software, McAfee turned his attention to cryptocurrency and political activism.
McAfee, born in the UK and raised in Roanoke, Virginia, made a name for himself after founding an antivirus software company in 1987. Large US corporations began using McAfee’s software by the late 1980s, and Intel acquired the company for $7.6 billion in 2010.
Earlier, in 1994, the founder had left the company and soon after sold all his shares for $100 million.
After reportedly losing much of his fortune in the 2008 financial crisis, McAfee moved to Belize and founded the biotech firm Quorumex. In Belize, the mogul admitted to bribing members of the coast guard to stop them from hassling his ferry business, and largely withdrew from society.
“My fragile connection with the world of polite society has, without a doubt, been severed,” McAfee wrote in an email reviewed by Wired. “My attire would rank me among the worst-dressed Tijuana panhandlers. My hygiene is no better.”
In 2015, shortly after McAfee returned to the US, Tennessee police officers arrested him for driving under the influence and possession of a handgun while intoxicated, according to The Jackson Sun. McAfee had previously told news outlets he struggled with drug and alcohol addiction during his life.
That year McAfee also filed paperwork to run in the 2016 presidential race as a Libertarian candidate. The tech mogul lost the party nomination to former New Mexico Gov. Gary Johnson.
Recently, McAfee had been actively posting his musings on politics and tech on Twitter. He frequently touted cryptocurrency, saying “crypto is the key to unlocking our prisons.” McAfee later said he lost his entire crypto fortune.
“The US believes I have hidden crypto. I wish I did but it has dissolved through the many hands of Team McAfee,” he tweeted. “My friends evaporated through fear of association. I have nothing. Yet, I regret nothing.”
Data from the Centers for Disease Control and Prevention show that suicide attempts among teenage girls increased drastically during the COVID-19 pandemic.
The study, published Friday, found that suspected suicide attempts among girls aged 12 to 17 went up by 50.6% between February 21 and March 20 of this year, compared with the same period in 2019, before the pandemic. Suspected suicide attempts among boys of the same age range also went up, but only by 3.7%.
“Self-reported suicide attempts are consistently higher among adolescent females than among males, and research before the COVID-19 pandemic indicated that young females had both higher and increasing rates of ED visits for suicide attempts compared with males,” researchers wrote in the study, suggesting this new data falls in line with previous research.
“However, the findings from this study suggest more severe distress among young females than has been identified in previous reports during the pandemic, reinforcing the need for increased attention to, and prevention for, this population,” the study continued.
To conduct the study, researchers examined emergency room visits between January 1, 2019, and May 15, 2021. Visits to the emergency room by adolescents, especially girls, across 49 states and Washington, DC, began to increase around May 2020, the researchers noted, and the rates at which adolescent girls visited the ER remained elevated after that.
“Young persons might represent a group at high risk because they might have been particularly affected by mitigation measures, such as physical distancing (including a lack of connectedness to schools, teachers, and peers); barriers to mental health treatment; increases in substance use; and anxiety about family health and economic problems, which are all risk factors for suicide,” researchers who conducted the CDC study wrote.
Tunisian forces were tracking an extremist group in the Mount Salloum area of Kasserine, the AP reported officials as saying, when they encountered a suspected jihadi and his family.
According to the reported statement, Tunisian forces killed the man, whereupon his wife detonated her suicide belt, killing herself and the baby in her arms. An older daughter who was also at the scene survived, officials reportedly said.
The mother was the first woman that Tunisian authorities say they have seen among jihadists in the mountainous area, according to the AP.
The other operation, in which two other extremists were killed, took place in the Mount Mghila area, according to the reported statement. It said the chief of Jund Al Khilafa, a Tunisian brigade that has pledged allegiance to the Islamic State, was killed.
The US State Department designated Jund Al Khilafa’s Tunisian branch as a terrorist entity in 2018. According to the AP, the group is believed to be responsible for several attacks in Tunisia.
Almost a year ago, Chelsea Brickham posted on TikTok for the first time.
Brickham, a 38-year-old trans woman living in Florida, posted photos of her transition after seeing other trans creators do the same. The video got more than 500,000 views, and thousands of positive and encouraging comments appeared underneath it.
“My initial reaction to TikTok was that it was such a positive and nurturing environment,” she told Insider. “And that’s why that actually saved me. It pulled me out of that dark place at that moment. It really did wonders for my mental health.”
Brickham was days away from getting her long-awaited gender-affirming surgeries when the coronavirus pandemic caused the hospital to cancel them. Facing the cancellation, the costs of private health insurance, and a shift to telework, Brickham turned to TikTok for “some kind of distraction, and maybe brief levity,” she said.
“It kind of takes my breath away – even now, thinking back in retrospect – because every single one of those 2,500 comments was supportive and positive and just telling me things I needed to hear,” Brickham said.
Months later, the positivity came to a screeching halt.
One of her recent videos, which got more than 1 million views and wasn’t unlike the rest of her content, led to a flood of transphobic and other attacks on her appearance.
Brickham said the experience shattered her perception of the app. She wasn’t sure why this video, in which she responded to a commenter who had misgendered her, had elicited such a different response.
Most of the comments appeared to come from young, straight, cisgender men who misgendered her, she said. For these types of comments, Brickham said, she often visited the commenter’s profile to educate them.
“I just kind of deal with it with a factual, straightforward approach,” she said, adding that she often tells transphobic commenters they don’t have the “credentials” to make claims about her gender.
In one particularly egregious comment, which Brickham reported, a TikTok user said it was a “shame” that the cancer she’d recently had didn’t kill her.
Trans TikTokers find community, but also abuse and harassment
In January 2020, The Washington Post dubbed TikTok “the soul of the LGBTQ internet,” adding that young LGBTQ people used TikTok “to share their raw feelings with each other” in a way not seen on legacy social-media platforms. As of this February, videos using the hashtag “#lgbtq” had more than 665 million views.
TikTok, owned by the Chinese company ByteDance, has publicly aligned itself with LGBTQ communities, and last year it donated $3 million to LGBTQ-focused organizations such as GLAAD and the Trevor Project.
But transgender creators say TikTok is an unwitting accelerator for transphobia and harassment.
Half a dozen trans TikTok creators, with a combined follower count of more than 3.1 million, told Insider that while the app had allowed them to build impressive followings and find a sense of community, its design appeared to perpetuate a culture of transphobia and harassment.
Creators detailed the harassment and abuse they’d experienced on the app; they all said they had experienced it to a greater degree on TikTok than on other social media platforms, in part because of the app’s central algorithm-driven feed. TikTok features – like duets, which allow users to respond to another user’s videos – have also been a tool for harassers.
Their concerns and experiences raise questions about TikTok’s ability to moderate content on the app.
Trans creators said their experience soured weeks after they began posting on TikTok
Last spring, before COVID-19 travel restrictions were imposed, Madelyn Whitley, a 20-year-old transgender woman and model living in New York, joined TikTok. She and her twin sister had traveled to France for fashion week and were living there.
Whitley, who had about 10 followers, posted a video of her and her sister, also a trans woman and model, for Trans Day of Visibility, a holiday that honors and recognizes trans people. The video “barely took off,” gaining about 20,000 likes, she said. But the attention skyrocketed from there.
“Everyone was so nice on that first video,” said Whitley, who as of February had 300,000 followers on the app. “And then I think down the line, maybe in September, I posted another one that had the complete opposite reaction – most of it was negative.
“I remember my first hate comment,” she added. The commenter had misgendered her and told her she was going to hell, Whitley said.
“I don’t know why,” Whitley added, “but some of the comments can get really, really transphobic, and we haven’t really experienced this anywhere else.”
Hateful behavior on TikTok can take many forms, including comments, collaborations, and direct messages.
A TikTok representative told Insider in an emailed statement that the platform “is a community with millions of diverse creators, and the platform wouldn’t be what it is today without the range of voices and experiences our users bring.”
“There is no place for hate and harassment on TikTok, and we’re committed to creating a safe space for our users, continually improving our protections for the LGBTQ community and other underrepresented groups, and being an active ally,” the statement said.
In December, the company said it was updating its community guidelines to make them more “inclusive and thoughtful,” adding rules and updating policies to prohibit doxxing, cyberstalking, and sexual harassment.
But negative messages can be “as small as ignorant comments of people just commenting one word, ‘woman,’ or, like, saying I can never be a man,” Aiden Mann, a 26-year-old transgender man from Tennessee who has 2.2 million TikTok followers, told Insider.
“I’ve had people who messaged me on an anonymous account and, in detail, explain to me if they had the opportunity to kill me how they would do it,” he added. Mann said others had suggested he end his life by suicide.
The ‘wrong side’ of TikTok
In essence, TikTok functions as a custom cable network. The app’s algorithm acts as a network executive, deciding which videos get spread to which users, based largely on what it infers about each user’s taste from their past behavior.
Central to TikTok is the “For You” page. While users can watch videos from a list of accounts they follow, the primary means of consumption is the seemingly infinite stream of videos found on the page.
Platforms like Facebook, Twitter, Snapchat, and especially YouTube similarly offer content, curated by an algorithm, from sources beyond the users a person has followed. But none uses such a system as its primary content driver in the way that TikTok does.
“I like to think of TikTok as a broadcast platform, like a channel that you’re watching rather than a social network,” said Daniel Sinclair, an independent researcher who studies TikTok and other social-media platforms, “because although you do have access to your following, TikTok is still controlling what you see.”
It is “entirely possible” that the “For You” page, led by TikTok’s algorithm and human moderators, could inadvertently lead to harassment, Sinclair said.
“I think the broadcast-first distinction is big because it’s TikTok that’s directing content and directing what you see more than many other platforms,” he added.
TikTok first shows a video to a small batch of people it thinks will be interested in it, based on factors the company has outlined in a blog post about its recommendation system. Some of these users already follow the creator, while others don’t. Videos from accounts with larger followings may have an advantage, but “neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system,” the company said.
If the video performs well (users like or share the clip, or watch the entire video), the algorithm recommends it to more people. The process is repeated; if a video continues to be popular, it can quickly go viral.
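The staged process described above can be sketched as a simple feedback loop. The sketch below is purely illustrative – the function name, engagement threshold, and batch sizes are assumptions made for demonstration, not parameters TikTok has published:

```python
# Toy model of TikTok-style staged distribution: a video is first shown
# to a small batch of viewers, and each time engagement is strong it is
# pushed to a larger audience. All names, thresholds, and batch sizes
# here are illustrative assumptions, not TikTok's real parameters.

def rollout(engagement_rate, threshold=0.1, initial_batch=100,
            growth=10, max_rounds=5):
    """Return the audience size reached in each distribution round."""
    audiences = []
    batch = initial_batch
    for _ in range(max_rounds):
        audiences.append(batch)
        if engagement_rate < threshold:  # weak response: stop spreading
            break
        batch *= growth                  # strong response: widen the batch
    return audiences

# A well-received video keeps expanding round after round; a poorly
# received one stalls after the first batch.
viral_path = rollout(engagement_rate=0.3)     # [100, 1000, ..., 1000000]
stalled_path = rollout(engagement_rate=0.05)  # [100]
```

Even in this crude model, the compounding effect is visible: a video that clears the engagement bar in every round reaches an audience four orders of magnitude larger than one that does not, which mirrors the "blink-and-you'll-miss-it" virality the creators describe.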
The page has been credited with driving TikTok’s meteoric rise since the app emerged from Musical.ly in 2018, about a year after ByteDance purchased it. It has paved the way for TikTok’s culture of uber-fast virality and blink-and-you’ll-miss-it trends. Users with small followings have the opportunity for viral fame – their videos can be distributed to thousands and sometimes millions of strangers within hours.
Carolina Are, who researches online moderation and algorithmic bias at City, University of London, told Insider that TikTok differed from other platforms like Instagram because the app’s algorithm and focus on short videos made it easier for content to go viral.
“Because of that, because there’s no meaningful interaction, it feels like creators do not look human to people who comment, and therefore it feels very easy to just hate,” she said.
Trans creators told Insider that the “For You” page allowed them to quickly find a community and support on the app.
But there’s a hefty con to the page, they said, in that they have little control over their audience, and the audience has little control over what shows up on their screen.
“I’m starting to understand that there are different facets of going viral on TikTok,” Brickham said, alluding to the concept of “straight TikTok” versus “alt TikTok,” in which users experience vastly different types of content, memes, trends, and creators.
“There’s obviously the GLBT-positive sort of feed. And then there’s obviously, like, the conservative side and the Trump feed. And you’ve got the heterosexual sort of feed as well,” she said.
Mann also described an “LGBTQ side” of the platform where his videos often remained. But he said his experience would swiftly sour if his content ended up elsewhere.
Fletcher Furst, an 18-year-old from Alberta, Canada, argued that the algorithm behind the “For You” page was just part of the story. Furst speculated that transphobic users search for content from trans creators via hashtags like #lgbtq or #trans, leading TikTok to recommend similar content to them in the future.
Creators told Insider that the transphobia they faced on TikTok was more intense than on other social-media platforms. Unlike other apps that rely largely on a connection between creators and their followers, TikTok creators broadcast to communities that can include not only their followers but legions of people with similar interests who’ve never seen their content before.
Suddenly, the creators said, TikTok videos can end up in an entirely different community.
“Sometimes for some reason – I have no idea why – my transgender videos end up on straight TikTok, or the conservative side of TikTok, or religious TikTok,” Mann said. “And then I get the really bad bashing and hateful comments and death threats and stuff like that.”
In contrast, he said, “on Instagram, the only people who are going to see your posts, more than likely, are the people that are following you, and then same with Twitter and Facebook.”
On other platforms, “people can share stuff, and they can get to the wrong side, but it’s a lot more difficult,” he said. “With TikTok, your video can end up on the wrong side of TikTok any day, at any time. Then when it blows up, it goes on and on.”
Trans creators say their videos were removed while abusive content remained
Samuel Monger, a 17-year-old trans man from Oregon, estimated that about 10 of his videos had been removed from TikTok, for reasons that weren’t exactly clear to him. He said the deleted videos weren’t sexual or violent but involved him speaking about his experiences as a trans person.
He said TikTok had told him that these videos violated its community guidelines. He appealed, but the videos weren’t reinstated, leaving him frustrated. He tried to re-upload videos, and they were deleted again, Monger said. He was confused about why the videos were removed in the first place, but he moved on to new content.
In one video, which the company reinstated after Insider inquired about its removal, Monger showed off different facets of his style, modeling dressed-down and dressed-up outfits.
He said other trans creators had faced similar punishments when trying to, for example, educate trans youth on how to safely bind their chest to create a more masculine or nonbinary appearance.
Monger said that while he’d never shown his chest on TikTok, it was frustrating to see cisgender men – often some of TikTok’s biggest stars – appearing shirtless in videos, “advertising their bodies.”
TikTok has previously been criticized over its moderation policies. Last March, The Intercept reported that a company memo had in some markets directed moderators to keep users that they judged to be disabled, poor, or ugly from the “For You” page. At the time, the company said that the policies were an early attempt at preventing bullying, that they were no longer in use, and that they had never been implemented in the US.
A study by the Australian Strategic Policy Institute’s International Cyber Policy Centre published in September found that hashtags related to LGBTQ issues had been suppressed on the platform in at least eight languages.
Sinclair speculated that the policies were designed not to mitigate harassment but to limit content the company viewed as “unsightly.” He said it pointed toward a larger issue as TikTok’s Chinese parent company expanded to new markets and navigated content moderation.
Furst told Insider he’d also had several videos removed and was told that they’d violated TikTok’s community guidelines.
“I had a video where I tried to speak up on my experience being bullied in high school for being transgender,” he said. “And that video got taken down right away. I don’t know why. They just said it went against their guidelines.
“Maybe ’cause I just mentioned being trans, but then there are videos that are still up of people encouraging harm towards trans people, and it’s just insane,” he added. “It’s like, how come that stays up and my content gets taken down?”
The video, first uploaded in November, was reinstated by TikTok in February after Insider asked the company about its removal.
In August, Eric Han, TikTok’s US head of safety, said that since January the company had removed more than 380,000 videos, 64,000 comments, and 1,300 accounts for violating its policies on hate speech.
“To be clear, these numbers don’t reflect a 100% success rate in catching every piece of hateful content or behavior, but they do indicate our commitment to action,” Han said.
Han said TikTok was updating its hate-speech policy, removing hateful content from the app, “increasing cultural awareness” in content moderation, improving transparency, and working with its teams and partners “to invest in our ability to detect and triage hateful or abusive behavior to our enforcement teams as quickly as possible.”
Han also said the company was training its content moderators on the difference between a marginalized group using a slur “as a term of empowerment” and a person using the same word hatefully.
“Educating our content moderation teams on these important distinctions is ongoing work, and we strive to get this right for our users,” Han said.
Mann said he’d been frustrated by TikTok’s inaction after he reported multiple videos he found transphobic.
“A lot of the videos that I reported come back saying that it’s not against community guidelines. I’m kind of in shock,” he said. “This person is literally making transphobic comments or making transphobic jokes. How is that not discrimination?”
Furst said TikTok would be more inclusive if it allowed creators to designate their videos as “educational,” to “be able to educate people about trans stuff without it being taken as sexual and then be taken down.”
All the creators who spoke with Insider said TikTok could change its community guidelines to better protect trans users.
Otherwise, Furst said, “it definitely feels like that app just wasn’t created for you.”
Hateful comments on TikTok can have real-life effects on trans communities
“Trans youth are continually being retraumatized through harassment that they experience both in the world that they live in and also when they show up online,” Dr. Ric Matthews, a psychotherapist in New York who works with LGBTQ communities, told Insider.
But when it comes to apps like TikTok, “not using these platforms really isn’t an option at this point,” Matthews said. “It’s an inescapable way of connecting and communicating and a necessity for social survival.”
Matthews added that “when harassment, bullying, and different types of violence that they experience in these spaces happens, it’s exacerbating isolation and alienation to people who are already battling to have safety in spaces that they occupy physically.”
Harassment on social media can also set back trans youth who are developing their identities, said Dr. Melissa Robinson-Brown, a psychologist in New York who works with young people.
“I think one of the reasons it’s pretty harmful is because especially with our generation today, so much of their time is spent on social media and on platforms like TikTok,” she told Insider.
“They’re building their communities, finding their tribe and their friends,” she added. “And so to see the transphobia, to see the negativity and the discrimination, can really be harmful to self-esteem-building and that sense of self-worth that is really just so critical for youth in general.”
Monger told Insider that while he could typically brush off hate-filled comments, he worried that the transphobia could affect other young and impressionable trans people on TikTok.
“I’m confident in myself, but there are kids who are not confident in their identity,” Monger said. “And seeing people say that they want to kill people really does not help them.”
Trans youth are at a higher risk than their peers of attempting suicide. A study published by the American Academy of Pediatrics in 2018 detailed a survey of about 120,000 young people, conducted from 2012 to 2015, in which 51% of trans boys and 30% of trans girls said they had attempted suicide, compared with 18% of cisgender girls and 10% of cisgender boys.
Trans TikTokers told Insider that hateful messages targeted their appearance or included problematic phrases like “What’s your real name?”
Mann, who said he had struggled with his appearance after multiple top surgeries – procedures to reshape the chest and remove breast tissue that left him with scars – said users left nasty comments about his body.
“People attack that all the time,” Mann said. “I’m still hoping to get them fixed.”
Mann shared with Insider four TikTok videos posted from April to June, each with more than 6,000 likes, that made fun of his chest. He said that he’d reported the videos to TikTok but that they weren’t removed.
TikTok removed all four of the videos after Insider flagged them.
“It makes me mad that they only removed them to seem to cover their a–,” Mann said.
Jade Marie Eichelberger, a 19-year-old trans woman from South Carolina, told Insider that her experience with transphobia on TikTok involved users’ desire to hear details of her transition and the trauma of being a Black trans woman.
“They really want you to talk about everything trans-related, down from the surgeries to how it makes you feel and how people treat you,” she said. “And sometimes you don’t really want to think about that or create about that, because cisgender people are not pressured to make videos about their trauma.
“Especially trans women of color, we’re always pressured to tell stories of things that have happened to us, because people want to use us as an example as to why people should be nicer to trans folks,” she added. “They always go for the people who were the most marginalized within the community to hear those sad and traumatic stories.”
Eichelberger said her videos that homed in on her transition or her experience as a trans woman performed well, but her videos about other topics seemed to fall out of favor.
She and other trans creators often field inappropriate and transphobic requests from TikTok users asking them to show their “real voice” or to upload pictures from their childhood, she said.
She added that while some creators might not have a problem with that, the requests and pressure to make that kind of content were rude and transphobic, implying that her identity is part of a performance.
“I don’t really feel comfortable with doing that,” Eichelberger said. “Am I ashamed of my childhood pictures? Hell to the hell no. I was a cute kid. But because I know why people want to see them, it makes me uncomfortable.”
Whitley said she’d had to alter how she operates on TikTok after receiving a series of transphobic comments that were fueled by a popular TikTok creator.
Chris, known on the app as @Donelij, would use the split-screen duet feature to react to videos of gay and trans creators. Chris’ smile would turn into a frown as videos of people skirting gender norms or photos displaying a person’s transition appeared. His videos would often get more than a million views. When his account was banned, he had 2.5 million followers.
“He just kept dueting them over and over and sending thousands of transphobes to me,” Whitley said.
While TikTok banned his first account last year, Chris continued to post videos to millions of followers using other accounts. TikTok banned an account he was using in February after Insider inquired about it.
Chris told Insider he was “not transphobic” and declined to comment further. Last year, he told The New York Times that he had been the target of racist harassment on TikTok. His facial expressions were meant as jokes, he said.
For Whitley, the videos had consequences that were far from funny.
Whitley said she’d had to limit comments on her content to prevent users from sharing her deadname (the name she went by before her transition), her address, and her phone number, all of which she said people had threatened to post.
Whitley said that since the duets stopped, some of the negative attention had subsided – but she estimated that about half of the comments she receives are negative or outwardly transphobic. She said she’d become “desensitized” to them.
“I don’t take them to heart,” Whitley said. “I’m stronger than that, and it kind of just boosts my engagement. I’m just going to take it as a positive and move on instead of focusing on their negativity.”