YouTube removes Trump content and bans him from uploading new videos for a minimum of 7 days, citing ‘ongoing potential for violence’

Sundar Pichai and Donald Trump.

  • YouTube said Tuesday that it has “removed new content” from President Donald Trump’s official channel and banned him from posting new videos for a “minimum” of one week for violating its policies.
  • YouTube also gave Trump’s channel its first “strike,” and is “indefinitely disabling” comments over “safety concerns.”
  • YouTube’s actions come days after Facebook and Twitter banned Trump from their platforms entirely, and amid pushback from Google’s newly formed union, which slammed the company’s response to recent violence as “lackluster.”

YouTube has suspended President Donald Trump’s account for at least one week after removing a video that the company said incited violence.

The offending video was uploaded Tuesday and violated YouTube’s policies on inciting violence, a spokesperson said, although the company did not share details of the video’s contents.

YouTube said it had issued the account a single strike, preventing it from uploading new videos for seven days, but said that timeframe could be extended.

The company said it has also disabled comments under videos on the channel indefinitely.

“After careful review, and in light of concerns about the ongoing potential for violence, we removed new content uploaded to the Donald J. Trump channel and issued a strike for violating our policies for inciting violence,” a spokesperson told Business Insider.

“As a result, in accordance with our long-standing strikes system, the channel is now prevented from uploading new videos or livestreams for a minimum of seven days, which may be extended. We are also indefinitely disabling comments under videos on the channel; we’ve taken similar actions in the past for other cases involving safety concerns.”

Although Trump’s account is suspended, the channel is still active along with previously uploaded videos, some of which falsely claim that President-elect Joe Biden did not win the election.

A spokesperson said that a second strike on the channel will lead to a two-week ban, and three strikes means permanent suspension.

YouTube is the last major internet platform to suspend Trump’s account after pro-Trump insurrectionists stormed the US Capitol last week. Facebook suspended Trump’s account for at least two weeks, while Twitter banned him permanently.

While YouTube removed a video message posted by Trump last week, in which he spoke to the rioters, it stopped short of suspending his account entirely. Instead, the Google-owned service introduced a new strike policy.

The decision has drawn criticism from within and outside of Google. The recently formed Alphabet Workers Union slammed YouTube for its “lackluster” response to the siege on the Capitol, while civil rights groups and celebrities including Sacha Baron Cohen publicly called for the account to be suspended.

Google was swifter to pull Parler, the social media app that’s popular with Trump supporters, from its Play Store. Google said the app did not have sufficient moderation policies in place to curb content that could also incite violence.

Are you a Google employee with more to share? You can contact the reporter Hugh Langley securely using the encrypted messaging app Signal (+1-628-228-1836) or encrypted email (hslangley@protonmail.com). Reach out using a nonwork device.


How YouTube is Supporting Black Creators and Artists

Last summer, amid the Black Lives Matter movement and the protests in support of George Floyd, YouTube announced the launch of a multi-year $100 million fund dedicated to amplifying and developing the voices and stories of Black creators and artists. More specifically, the fund has supported programs such as 2 Chainz’ “Money Maker Fund” series highlighting HBCU entrepreneurs and Masego’s “Studying Abroad” livestreamed concert series.

Today, the platform is using capital for that effort to create a global grant program for Black creators.

“The painful events of this year have reminded us of the importance of human connection and the need to continue to strengthen human rights around the world. In the midst of uncertainty, creators continue to share stories that might not otherwise be heard while also building online communities,” YouTube CEO Susan Wojcicki wrote in a blog post detailing the decision and reflecting on 2020.

The #YouTubeBlack Voices Class of 2021

Per Billboard, the program is kicking off with an inaugural class of 132 individuals spanning musicians and lifestyle vloggers, including Kelly Stamps and Jabril Ashe, also known as Jabrils, who shares educational videos about the emerging gaming, technology, and AI spaces.

The musicians named to the group include Brent Faiyaz, BRS Kash, Fireboy DML, Jean Dawson, Jensen McRae, Jerome Farah, Joy Oladokun, KennyHoopla, Mariah the Scientist, MC Carol, Miiesha, Myke Towers, Péricles, Rael, Rexx Life Raj, Sauti Sol, serpentwithfeet, Sho Madjozi, Tkay Maidza, Urias and Yung Baby Tate.

Each grant recipient will be provided an undisclosed funding amount to be used in support of their channels, which can encompass needs such as editing, lighting, or other equipment to amplify and enhance the quality of their content. YouTube will also offer additional resources such as workshops, training, and networking opportunities to boost skills and fuel meaningful collaborations. “We are not only supporting them in the moment, but this is seed funding that will help them to thrive on the platform long-term,” he added.

Hailing from across the United States, Kenya, Brazil, Australia, South Africa and Nigeria, the cohort was selected in part based on their past participation in #YouTubeBlack, a campaign and event series promoting Black creators launched in 2016.

Paving a future for change

“These creators and artists have been doing this work already and are known by their communities, but we’re really excited to invest in them, and we believe that they can and will become household names with this support,” shared Malik Ducard, YouTube Vice President of Partners, on the #YouTubeBlack community.

In today’s landscape, influencers are themselves a media channel. The budgets put against them shouldn’t just be production-driven but rather emphasize a broader commitment to diverse and authentic stories driven by co-communication and co-creation. For YouTube, this effort is not only beneficial in ensuring these creators have their voices heard, but in allowing the platform to stay true to its goals and values and its commitment to its community.

“This is not a flash-in-the-pan Instagram moment. This is about keeping the drumbeat of change alive, and in the DNA of our organization,” added Lyor Cohen, YouTube’s Global Head of Music, reiterating his confidence in the group’s ability to lead and find long-term success through raw passion, creativity, and an entrepreneurial spirit. “Our expectation is that these artists are going to be significant and important voices and make music even more enjoyable.”

The future of brand-artist collaborations

For brands partnering with music artists, the takeaway here is that social listening requires responsiveness, flexibility, and mindfulness when it comes to integrating culture. People want to be heard, not sold to, and efforts should extend offline. This is only achieved through a full understanding of a new age of partnerships – one where brands have a bigger role to play in artists’ lives and artists are crossing the threshold to become true digital marketers monetizing the whole self.



YouTube rival Rumble is suing Google for at least $2 billion, saying the search giant abuses its monopoly power

Rumble has become popular with conservatives, who say they are suppressed by Big Tech.

  • Canadian video-sharing platform Rumble is suing Google over claims the tech giant is “unfairly rigging its search algorithm” to favor YouTube videos in its search results.
  • Rumble is a direct rival to YouTube, and has become popular with conservative US figures who say they are being censored by established tech platforms.
  • The lawsuit indicates that Google and other major tech players may face antitrust headaches from smaller, conservative-friendly rivals.
  • Rumble’s lawsuit accuses the tech giant of having “willfully and unlawfully created and maintained a monopoly in the online video-sharing platform market.”

Video-sharing site Rumble has accused Google of “unfairly rigging its search algorithm” to favor YouTube’s videos in search results, the latest in a series of antitrust headaches for the tech giant.

Rumble, based in Toronto, filed a lawsuit in California on Monday claiming that Google’s actions, including unfair search algorithms and the pre-installation of the YouTube app on Android devices, had cost it viewers and advertising revenue. 

The complaint reads: “Google, through its search engine, was able to wrongfully divert massive traffic to YouTube, depriving Rumble of the additional traffic, users, uploads, brand awareness and revenue it would have otherwise received.”

Google has faced a string of antitrust actions over its search dominance in the past few years, drawing attention from US authorities, EU legislators, and market competitors alike.

Rumble has become popular with conservatives in the past year or so, encouraged by Republican congressman Devin Nunes and other conservative figures. The company says it has more than 2 million creators using the site.

Influential right-wingers in the US have taken an aggressive stance against established US tech firms such as Facebook, Google, Apple, and Amazon, a position lately intensified by the platforms teaming up to essentially block right-wing social messaging platform Parler.

Rumble’s list of most-watched videos currently features content from conservative political commentator Dan Bongino, Fox presenter Sean Hannity, and conservative YouTubers Diamond and Silk. Its CEO, Chris Pavlovski, regularly posts updates on Twitter about right-wing figures joining the platform.

In its complaint, Rumble accuses Google of having “willfully and unlawfully created and maintained a monopoly in the online video-sharing platform market in at least two ways.”

It adds: “First, by manipulating the algorithms by which searched-for-video results are listed, Google insures that the videos on YouTube are listed first, and that those of its competitors…are listed way down the list…

“Second, by pre-installation of the YouTube app as the default online video-sharing app on Google smart phones, and by entering into anti-competitive, illegal tying agreements with other smartphone manufacturers to do the same.” 

The firm indicated it was seeking damages of at least $2 billion.

Rumble’s complaint comes shortly after Parler sued Amazon, and marks a potentially troubling new antitrust front for the major platforms. Amazon had hosted Parler’s service on its AWS cloud, but booted the firm off after the US Capitol riots last week. Parler claimed in its subsequent suit that Amazon was behaving anticompetitively. The lawsuit indicates that sites and apps banned or penalized by the US tech giants for hateful or violent speech are willing to use the emerging antitrust sentiment in court.

A Google spokesperson told the Wall Street Journal: “We will defend ourselves against these baseless claims.” 

Business Insider approached Rumble and Google for further comment. 


Here are the most prominent people who got banned from social media platforms after the Capitol riots

Riots at the US Capitol Building.

  • Donald Trump, Sidney Powell, and Michael Flynn were among the people whose accounts were banned following the attack on the Capitol.
  • These accounts, social media platforms said, violate their rules of engagement and pose a risk to the public. 

Almost immediately after the attack on the Capitol building last Wednesday, social media platforms began suspending and permanently disabling accounts they say disseminate violent rhetoric.  

The most prominent ban was Twitter’s permanent suspension of President Donald Trump’s account Friday night. 

After Trump’s account was disabled, top conservatives began sharing their Parler accounts on Twitter, encouraging their followers to migrate there. Parler has become a mainstay in alt-right communication, advertising itself as a platform for unregulated language and “free speech.”

Days after the presidential election, Parler download counts surged, signaling that the platform was at the time seeing an influx of new users. 

After Twitter banned Trump, Gab, another far-right website that bills itself as a “free speech” platform, reported massive growth. About 10,000 new users signed up every hour on Saturday, according to Gab, signaling a migration from mainstream social media platforms to less-popular ones like Gab that are known for circulating alt-right speech.

Alt-right content is still available on mainstream social media platforms like Twitter. But after the Capitol riots, social media companies have begun removing accounts they suspect will incite violence. Some users whose accounts have been removed had previously spread misinformation related to the 2020 election results and QAnon content.

These accounts, social media platforms said, violate their rules of engagement and pose a risk to the public. 

Here are the people who’ve been banned since the Capitol riot attacks: 

Donald Trump


Trump was suspended from multiple social media platforms almost immediately after the Capitol riots.

He was permanently suspended from Twitter on Friday “due to the risk of further incitement of violence,” the company said in a tweet. 

Facebook blocked Trump “indefinitely” a day earlier, saying the ban will last at least until President-elect Joe Biden gets sworn into office on January 20. 

Snapchat also banned Trump’s account for concerns about his rhetoric.

Reddit banned r/DonaldTrump, a popular subreddit that violated the platform’s “rules against inciting violence,” a spokesperson told Insider.

Sidney Powell


Twitter on Friday said it suspended the account of Sidney Powell, the lawyer Trump tasked with proving his baseless claims of election fraud. 

Powell, in her attempt to alter the results of the 2020 presidential election, has been accused of spreading misinformation about Dominion Voting Systems, an electronic voting supplier.

She was sued for $1.3 billion on claims that she facilitated the spread of misinformation. 

 

Steve Bannon


YouTube removed Steve Bannon’s “War Room” podcast Friday night for “violation of YouTube’s Terms of Service.”

Trump’s personal lawyer Rudy Giuliani had appeared on the podcast hours before the ban. During his appearance, he blamed Democrats for the Capitol riots. 

Twitter banned Bannon, a former White House strategist, in November after he posted a tweet calling for the decapitation of Dr. Anthony Fauci.

Michael Flynn


Former National Security Advisor Michael Flynn was booted off Twitter earlier this week.

Flynn used Twitter in part to urge Trump to impose martial law to overturn the results of the presidential election.

He’s also been one of the most visible backers of QAnon. In 2019, Flynn was scheduled to speak at a QAnon-organized conference.

Ron Watkins

Ron Watkins was interviewed by OAN’s Chanel Rion as a “cyber analyst.”

On Friday, Twitter also banned the account of Ron Watkins, a central QAnon figure who ran the alt-right platform 8kun.

Watkins’ misinformation posts have frequently been amplified by Trump himself. While his own account was still active, Trump retweeted posts from Watkins.

Other QAnon accounts were also suspended on Friday, and Twitter has been taking steps to reduce the group’s influence and the misinformation it spreads. The same day, for example, Twitter removed thousands of QAnon-affiliated accounts.

Still, there are several other QAnon accounts that continue to thrive on the platform. 


Twitter and Facebook both banned Trump from their platforms. Here’s why that doesn’t violate the First Amendment – or any other laws

Activist Mike Merrigan holds a piñata shaped like the Twitter logo with hair to look like U.S. President Donald Trump during a protest outside of Twitter headquarters on May 28, 2020 in San Francisco, California.

After months of escalating tensions between President Donald Trump and social media companies, Twitter and Facebook finally decided this week that the president had crossed a line too far.

On Wednesday, after Trump incited a mob of his supporters, thousands of them violently stormed the US Capitol, where Congress was voting to certify the results of the election, in an attempted insurrection that left five dead.

Though Trump posted a video briefly denouncing the violence, he then continued to use social media platforms to praise his supporters and once again repeat debunked conspiracy theories about the election.

Twitter and Facebook, both of which have policies against inciting violence, undermining democratic processes, and spreading election misinformation, decided that – given the impact that the president’s comments were having and continue to have – they would no longer let him use their platforms.

Twitter suspended Trump’s personal account, @realDonaldTrump, permanently, citing “the risk of further incitement of violence.” Facebook and Instagram suspended Trump “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”


Within hours of Twitter’s ban on Friday, Trump tried to bypass it by tweeting from the official presidential account, @POTUS. He posted a series of tweets railing against the social media company for “banning free speech” and taking aim at one of his favorite targets, Section 230. (Twitter quickly removed the tweets.)

But Trump’s implication – that Twitter somehow violated his First Amendment right to free speech – is a complete misunderstanding of what the First Amendment says.

Here’s why Twitter and Facebook, like other social media companies, have the right to ban Trump, and why Trump and other far-right politicians often take it out on Section 230.

What is the First Amendment?

The First Amendment to the US Constitution says: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances” [emphasis added].

In other words, it bans the government from infringing on free speech (with some limited exceptions).

What does that mean for social media companies?

Not much.

“The First Amendment is a constraint on the power of government. It doesn’t apply to Twitter,” said Daphne Keller, an attorney and internet law expert who leads the program on platform regulation at Stanford University’s Cyber Policy Center, adding: “Twitter is not a state actor.”

Why are Trump and his allies so mad then?

Trump, his allies, and others who have been hit with account suspensions, had warning labels applied to their posts, or had their advertising revenue shut off by companies like Facebook, Twitter, and YouTube may disagree with those companies’ rules or approach to enforcing them – or they may just be mad that they can’t get their message out or make money from their audience or advertisers.

But legally, there’s very little they can do.

Section 230 of the Communications Decency Act of 1996 gives legal protections to “interactive computer services” – like social media companies – that: 1) prevents them from being held liable for content posted by their users (with some limited exceptions), and 2) allows them to moderate content on their sites as they see fit.

“Section 230 makes it relatively easy for platforms to go to court and win saying ‘we have the right to enforce whatever policies we want,'” Keller said. But even without Section 230, she said, Twitter would win if Trump sued “based on their own First Amendment right to set editorial policy on the platform.”

So, why do Trump and his allies still want to get rid of Section 230?

Trump and many far-right politicians have repeatedly claimed (without evidence) that social media companies are systemically biased against them, and they believe repealing or curbing Section 230 would allow them to use the government to deny Section 230’s legal protections to platforms that aren’t “politically neutral.”

Ironically, that’s exactly what the First Amendment prohibits, which legal experts quickly pointed out when Trump tried to use executive orders to accomplish that last summer. (Still, Trump loyalists in the Federal Communications Commission tried to implement it anyway.)

What would happen if they did repeal Section 230? 

Ignoring for a second that it’s legal for social media companies to be “biased” when enforcing content rules, right-wing politicians’ criticisms of Section 230 tend to ignore several key facts about who social media currently benefits – and who it would benefit if they repealed the law.

First, the evidence has consistently shown that conservatives tend to enjoy some of the widest reach and engagement on sites like Facebook, Twitter, and YouTube – or at the least, conservatives have failed to produce evidence that their views are being silenced or their reach is being throttled.

Second, if social media companies lost the legal protections offered by Section 230, they would be more, not less, likely to remove questionable content from their sites, because they’d (rightfully) be fearful of getting sued.

That purge could very likely hurt far-right accounts – something Facebook itself has implicitly acknowledged, according to reports from The Wall Street Journal and The Washington Post.

And increased legal liability could also make it harder for new competitors, like “alternative” social media sites Parler, Gab, and MeWe – where Trump supporters have flocked due to their lax approaches to regulating content – to get off the ground in the first place.


All the actions big tech companies have taken against Trump’s social media accounts following the US Capitol siege

The rioters during the Capitol siege.

  • The US Capitol siege by pro-President Donald Trump rioters on Wednesday has set off a wave of actions from big tech companies.
  • Platforms like YouTube, Facebook, and Twitter have removed a video of Trump telling rioters “we love you, you’re very special” but “go home in peace.”  
  • Twitter and Facebook have both locked Trump’s respective social media accounts. 
  • Here’s a list of all the actions big tech companies have taken against Trump in response to the Capitol siege.

The US Capitol siege by President Donald Trump supporters on Wednesday has set off a range of responses and actions from big tech companies, including deleting the infamous Trump response video and temporarily freezing Trump’s social media accounts.

Prior to the historic Capitol siege, which left four people dead, according to CBS News, and the Capitol building ransacked, the biggest action tech companies like Facebook and Twitter had taken to moderate Trump was to add fact-checking labels to some of his baseless claims about topics like mail-in ballot fraud.


However, following the riots, more big tech platforms have taken serious and actionable steps towards temporarily quieting Trump, although people calling to ban the president from social media platforms say these short-term freezes may not be enough.

See all of the actions various companies have taken in response to the Capitol siege:

YouTube


YouTube has removed a video of Trump disputing the 2020 presidential election results while telling rioters “we love you, you’re very special” but “go home in peace.” 

Farshad Shadloo, a YouTube spokesperson, told Insider in an email on Wednesday that the video violated YouTube’s policies on “content that alleges widespread fraud or errors changed the outcome of the 2020 US presidential election.”

“We do allow copies of this video if uploaded with additional context and sufficient educational, documentary, scientific, or artistic (EDSA) value,” Shadloo continued.


Following this removal, on Thursday, YouTube announced it would give channels a “strike” if their videos violated the platform’s policies. Following the first strike, a channel will be banned from posting for a week. A second strike within 90 days will result in a two-week ban. A third strike within the same 90-day window will result in a permanent ban.
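For readers who want the escalation spelled out, here is a minimal sketch of that three-strike logic in Python. It is purely illustrative: the one-week, two-week, and permanent penalties and the 90-day window come from YouTube’s announcement above, while the class and method names are hypothetical and do not correspond to any actual YouTube API.

```python
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # strikes escalate only if they fall within 90 days

class Channel:
    """Toy model of the three-strike policy described above (illustrative only)."""

    def __init__(self):
        self.strikes = []          # timestamps of strikes inside the current window
        self.banned_until = None   # end of any temporary upload ban
        self.terminated = False    # True once a third strike lands within the window

    def add_strike(self, when: datetime):
        # Drop strikes older than the 90-day window, then record the new one.
        self.strikes = [t for t in self.strikes if when - t <= STRIKE_WINDOW]
        self.strikes.append(when)

        if len(self.strikes) == 1:
            self.banned_until = when + timedelta(days=7)    # first strike: one-week upload ban
        elif len(self.strikes) == 2:
            self.banned_until = when + timedelta(days=14)   # second strike: two-week ban
        else:
            self.terminated = True                          # third strike: permanent removal

# Three strikes within 90 days escalate to termination.
channel = Channel()
channel.add_strike(datetime(2021, 1, 7))
channel.add_strike(datetime(2021, 2, 1))
channel.add_strike(datetime(2021, 3, 1))
assert channel.terminated
```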

The strike policy announcement came out of the “disturbing events that transpired yesterday,” a YouTube spokesperson told Insider.

Facebook


The same Trump video that YouTube removed was also removed by Facebook on Wednesday. According to a tweet by Guy Rosen, Facebook’s vice president of integrity, the video removal decision was made “because on balance we believe it contributes to rather than diminishes the risk of ongoing violence.”

On Thursday, Facebook went one step further and decided to freeze Trump’s Facebook and Instagram accounts “indefinitely and for at least the next two weeks until the peaceful transition of power is complete,” Mark Zuckerberg, Facebook’s CEO, wrote in a post.

“His decision to use his platform to condone rather than condemn the actions of his supporters at the Capitol building has rightly disturbed people in the US and around the world,” Zuckerberg wrote. “We believe the risks of allowing the President to continue to use our service during this period are simply too great.”

Snapchat


Snapchat has also “locked” Trump’s account following the Capitol siege, a Snap spokesperson told Insider on Thursday.

This isn’t the first action Snap has taken against Trump. In June, the social media platform stopped promoting Trump’s account in its Discover section after he called for violence against protestors amid demonstrations following George Floyd’s death.

“We will not amplify voices who incite racial violence and injustice by giving them free promotion on Discover,” a Snap spokesperson told Insider in June. “Racial violence and injustice have no place in our society and we stand together with all who seek peace, love, equality, and justice in America.”

Shopify


On Thursday, Shopify removed stores with ties to Trump, including shop.donaldjtrump.com and trumpstore.com.


“Shopify does not tolerate actions that incite violence,” a Shopify spokesperson said in a statement to Insider on Thursday. “Based on recent events, we have determined that the actions by President Donald J. Trump violate our Acceptable Use Policy, which prohibits promotion or support of organizations, platforms or people that threaten or condone violence to further a cause. As a result, we have terminated stores affiliated with President Trump.”

Twitch 


Twitch has also frozen Trump’s account and will make further decisions about it after Biden is inaugurated, The Verge reported.

Previously, Twitch placed a temporary two-week ban on Trump’s account due to “hateful conduct” policy violations, a Twitch spokesperson told Insider in June.

Twitter 


On Wednesday, Twitter removed the same one-minute video that YouTube and Facebook had taken down. Shortly after, the social media platform locked Trump’s Twitter account and removed three tweets – including one with the aforementioned video – replacing the posts with “this Tweet is no longer available” messages. As a result, Trump could either delete the tweets to regain access to his account after 12 hours or remain frozen out of his Twitter account.

On Thursday, Trump deleted the three tweets in question, and the tweets now read: “This Tweet is no longer available because it violated the Twitter Rules.” Twitter did not confirm with Insider the time the tweets were deleted, but if the original statement still holds, the 12-hour countdown until Trump has access to his Twitter account has already begun.

However, the social media platform isn’t ruling out more serious actions in the future.

“Future violations of the Twitter Rules, including our Civic Integrity or Violent Threats policies, will result in permanent suspension of the @realDonaldTrump account,” Twitter said in a statement. 


Nearly a dozen major tech firms can trace their roots to PayPal. From Palantir to Tesla, here are the companies launched by members of the ‘PayPal Mafia.’

Peter Thiel, left, and Elon Musk.

  • Early employees of payments company PayPal went on to create nearly a dozen major tech startups after leaving the company.
  • The PayPal Mafia, as its early employees came to be known, were directly responsible for Tesla, SpaceX, LinkedIn, Yelp, and more. 
  • The latest company with PayPal roots to make a major splash is Palantir, the big data company that went public on the New York Stock Exchange in October. 

Without PayPal, there may not have been Palantir. Or YouTube. Or SpaceX, LinkedIn, and Yelp. 

The payments company – launched as Confinity in 1998 by Peter Thiel, Max Levchin, and Luke Nosek – grew to become a Silicon Valley giant. It was acquired by eBay in 2002 for $1.5 billion in a deal that altered Silicon Valley history and helped spawn the careers of some of tech’s most famous names.

The PayPal Mafia, as its early employees came to be known, have gone on to become venture capitalists, tech founders, and even a US ambassador.

Here are the tech companies that may not have gotten their start without the success of PayPal. 

Secretive data company Palantir was founded in part by Peter Thiel, PayPal’s cofounder.


When it was founded: 2003

What it does: Palantir creates software that manages and analyzes data. Its software helps other companies and agencies like law enforcement find patterns in large swaths of data.

How it’s related to PayPal: Thiel founded Palantir after PayPal’s sale to eBay, and the idea for the company was born out of Thiel’s experience dealing with credit card fraud at PayPal. 

Joe Lonsdale, who worked as a finance intern at PayPal while still in college at Stanford University, is also a Palantir cofounder. 

Affirm was launched by Max Levchin, a PayPal cofounder.


When it was founded: 2013

What it does: Affirm offers instant lines of credit to customers shopping online, allowing them to buy a product and pay for it over time. The company raised a $500 million Series G round last month.

How it’s related to PayPal: Affirm is the brainchild of Max Levchin, one of the original PayPal founders. The company launched out of Levchin’s startup incubator, HVF — Levchin took over as CEO in 2014.

Levchin founded the company along with a team that includes Nathan Gettings, who also cofounded Palantir. 

Fertility tracking company Glow was also born out of Levchin’s startup incubator.


When it was founded: 2013

What it does: Glow makes a family of apps that use data science to help track periods, ovulation, fertility, pregnancy, and children’s growth.

How it’s related to PayPal: Glow was also founded in Levchin’s HVF startup incubator, and Levchin now serves as executive chairman. 

YouTube’s founders worked together at PayPal during the early days.

YouTube founders Steve Chen, left, and Chad Hurley.

When it was founded: 2005

What it does: YouTube is a platform for hosting and sharing videos. It was sold to Google in November 2006.

How it’s related to PayPal: Founders Steve Chen, Chad Hurley, and Jawed Karim were all early employees at PayPal.

When PayPal sold to eBay for $1.5 billion, it sparked a “healthy competition” among the company’s alumni, early YouTube investor Roelof Botha told Business Insider earlier this year. When it came time for YouTube to sell, the team intentionally chose a price of $1.65 billion — 10% more than what PayPal had sold for.

Elon Musk founded SpaceX after working at PayPal.


When it was founded: 2002

What it does: The goal of SpaceX, short for Space Exploration Technologies, is to make space flight cheaper and eventually colonize Mars. 

How it’s related to PayPal: In 1999, Musk launched an online banking company called X.com. That company merged with Thiel’s Confinity in 2000, then became PayPal in 2001. Musk was briefly PayPal CEO before being replaced by Thiel. But when PayPal sold, Musk netted $165 million from the deal, which he used to start SpaceX. 

Musk was an early investor in and cofounder of Tesla.


When it was founded: 2003

What it does: Tesla manufactures electric vehicles, batteries, and solar panels. 

How it’s related to PayPal: Musk was an early Tesla investor and cofounder. He became CEO in 2008.

Musk launched The Boring Company after becoming irritated by Los Angeles traffic.

Boring Company Hawthorne tunnel
The Boring Company’s Hawthorne Tunnel.

When it was founded: 2016

What it does: The Boring Company builds underground tunnels with the intention of housing high-speed transit systems to reduce traffic in cities. 

How it’s related to PayPal: Musk initially proposed The Boring Company in a white paper in 2013 and launched the company three years later. 

Musk also created OpenAI and Neuralink.


When they were founded: 2015 and 2016, respectively

What they do: OpenAI is an artificial intelligence research lab, while Neuralink’s goal is to make computers that can be implanted in people’s brains.

How they’re related to PayPal: Musk founded both companies to fight against what he sees as the dangers of AI.

LinkedIn was founded by early PayPal exec Reid Hoffman.


When it was founded: 2002

What it does: LinkedIn is a social network for professionals. 

How it’s related to PayPal: Hoffman was an executive vice president at PayPal in its early days. He founded LinkedIn and initially served as its CEO before later becoming executive chairman. 

Yelp was founded by two early PayPal employees, Jeremy Stoppelman and Russel Simmons.


When it was founded: 2004

What it does: Yelp is a platform for hosting reviews and recommendations about local businesses. 

How it’s related to PayPal: Stoppelman and Simmons met while working at PayPal in the early 2000s — Stoppelman came from X.com and served as vice president of technology while Simmons worked as an engineer. Levchin provided the initial investment in the company. 


TikTok’s app is launching on TVs for the first time

A girl holds her smartphone in her hands on which she has opened @madelainepetsch’s profile in the short video app TikTok.

  • TikTok will be available on TVs for the first time.
  • A partnership with Samsung means the app is available to download on certain smart TV models in Europe, and will be pre-installed on new TVs going forward.
  • TikTok’s TV experience will highlight the most popular content on the app, and the company says it’ll keep to family-friendly material.
  • TikTok’s move to TV echoes that of YouTube, which has seen a significant uptick in TV viewership.

TikTok is launching an app on Samsung smart TVs – the first time its short-form video content has ended up on televisions, the company announced on December 14.

The app will be available for owners of Samsung TVs in Europe, beginning with a UK rollout from Monday. The app can be accessed from models dating from 2018 onwards via Samsung’s app store, and will be pre-installed in all new Samsung TVs going forward. 
 
Users will be able to watch TikTok videos on their TV, including by scrolling through the For You feed. A horizontal banner will allow viewers to look at videos posted by creators they have chosen to follow, as well as videos under different categories. Viewers can also interact with videos by liking them.

Users won’t need to log in to scroll through content on TikTok’s TV app.

 
“All the best entertainment experiences are shared experiences,” said Rich Waterworth, TikTok’s European managing director, announcing the product reveal to journalists on Monday. “TikTok on Samsung TV is the next extension of this experience.”
 
Waterworth previously worked at UK broadcaster ITV. “I’ve seen the dramatic changes in the way people consume content. Gone are those days when the only way to find entertainment at home was through a single, shared TV in the living room,” he said. “And never have we needed entertainment more than in 2020.”
 
TikTok’s TV experience will feature the most liked and viewed content on the platform, suitable for families and for users in their teens and up.

“It’s all about giving whole households together the opportunity to enjoy big screen bursts of joy in their day,” said Waterworth.
 
The move shows TikTok’s attempts to move into the longer-form video space.

Earlier this month, the app announced it was considering extending the maximum length of videos from 60 seconds to three minutes. And the shift to embrace viewing on TVs echoes that of YouTube, which in the last five years has seen significant increases in viewing on TV, rather than smartphones.

Fateha Begum, an analyst at Omdia, told Business Insider that the launch indicated the continued importance of television in the household, and the growing popularity of free internet content.

“In recent years, the ability to cast content from the likes of Facebook has been growing in popularity,” she said. “We’ve seen the arrival of curated short-form online video platforms and even YouTube now sees more than 30% of its consumption occur on TV screens. Omdia’s research reveals that nearly half of Samsung smart TV app users access YouTube on the device.”
 
Educational content, branded in the app as #LearnonTikTok, will be a key part of the TV-based programming, Waterworth said, adding TikTok on TV will operate in restricted mode, filtering out inappropriate content.
 
“TikTok on TV has been specifically created for that home viewing experience,” Waterworth added. “This will be a totally new kind of TikTok experience.”
 
Deep Halder, head of retail and content services at Samsung, said: “By offering TikTok TV, we are giving customers yet another fantastic choice.”


YouTube Music Ads: What Your Brand Should Know

Music has always played a special role in culture, but this year in particular people are tuning into more audio content via YouTube and YouTube Music. This is largely an effort to combat Zoom fatigue and to make it easier to absorb content, whether tutorials, lectures, classes, or meetings, while juggling the obstacles of a remote work environment.

In response to this trend, the platform rolled out several updates to help brands efficiently expand reach and grow brand awareness with audio-based creative, including ads specifically designed for non-video consumption.

Elevating your brand’s message with audio

According to YouTube, more than 50 percent of logged-in viewers who consume music content in a day consume more than 10 minutes of music content.

The company also shared that in the early testing phase of the update, more than 75 percent of audio ads yielded a significant lift in brand awareness. An ad from Shutterfly, for example, garnered a 14 percent lift in ad recall and a two percent increase in favorability among its target audience.

“Regardless of when and how people are tuning in, we have ways to help advertisers connect, even when they’re consuming music in the background. Now you can complement the moments your consumers are watching, by engaging them in moments when they’re listening, with newly announced audio ads,” YouTube’s Head of Music Lyor Cohen explained in a separate blog post.

Enhanced targeting via dynamic music lineups

Also as part of its audio push, YouTube is announcing dynamic music lineups, which allow marketers to target their campaigns at collections of music channels on YouTube.

This will allow advertisers to more easily reach audiences based on specific music genres spanning ‘Latin music‘, ‘K-pop‘, ‘hip-hop‘ and ‘Top 100.’ In addition, brands can leverage these music lineups to focus on particular moods or interests, like ‘fitness.’

Audio ads best practices

To be clear, these new YouTube ads are designed for the viewer who is looking to “squeeze in a living room workout before dinner, catch up on a podcast or listen to a virtual concert on a Friday night.” These are not audio-only ads; rather, they rely on audio to do the majority of the communicating, on the understanding that people may only be glancing at the visual image sporadically or not at all. The visual side of these new ads will therefore be limited to “a still image or animation.” Put differently, if a person were to close their eyes, they would still clearly understand the ad’s message.

The future of music marketing and audio conversations

More than 2 billion logged-in viewers are watching at least one music video each month. Over half (60%) of YouTube’s music viewing happens on mobile, where background viewing or listening is disabled.

Stats aside, innovations in social media and shifts in consumer behavior are fundamentally reshaping how music is made, consumed, and shared. Brands will need a music strategy to keep pace with culture, and they have a powerful opportunity to lead at this intersection and create meaningful partnerships with consumers. With podcasts on the rise over the past few years, it makes sense that audio content would be of interest to YouTube, despite it being primarily a video service, as well as to other platforms.

Over on Twitter, a test of an audio-only virtual meeting room option, built on top of Fleets, its new Stories-like tool, is underway and set to launch by year’s end. Audio Spaces will enable users to start rooms where certain people can lead a discussion and others can join, either to listen in or to actively participate. The user who creates the space will have full moderation controls — an attempt by the platform to prioritize safety and prevent misuse and harassment.

