The YouTube Effect (2022) Movie Script
[soft music playing]
[dramatic music]
Here we go!
[explosion]
[upbeat music]
[laughs]
-Oh, my God.
-Oh, my God!
Whoa!
YouTube has video.
[man] I'm gonna spend
the next 50 hours buried alive
in this coffin.
[boys laughing]
[woman] So I'm really excited
to be launching
this YouTube channel.
I run nine different
YouTube channels.
[man] So hit that like button!
Smash that like button!
[upbeat music continues]
Woo!
[upbeat music continues]
Yo, it's just a prank!
[screams]
-Whose streets?
-Our streets!
[upbeat music continues]
-[repeating Donald Trump's name]
-Holy shit!
[upbeat music continues]
Please don't shoot me, man.
[upbeat music continues]
[crowd chanting] Stop the steal!
[upbeat music continues]
[music fades]
[static crackling]
[horns honking]
[soft music]
[soft music continues]
[Steve Chen] It's kind
of an interesting past.
Born in Taiwan
and traveled to the U.S.
when I was eight years old.
[modem dialing]
In 1992, 1993,
we had exposure to the Internet.
So you had a bunch
of 14, 15 year-olds
without parental supervision,
having access to the Internet.
And the Internet was new.
It was, I mean, even
more experimental
than what you know
as the Internet today.
People were still
just trying to figure out,
"What do you do with
these interconnected machines?"
And so that began sort
of already my long-term career
of going to sleep at 3 a.m.,
4 a.m. every night.
It's just getting online,
talking with people
and kind of just figuring
out things all together.
[jet engine roaring]
I left the University
of Illinois in 1999.
Do you actually drop out
of school and do
the startup thing?
And, you know, that means
it was a major life decision,
but it was made
within minutes of saying,
"Okay, forget what
I'm going to be doing.
I can always
finish off my senior year
if this startup doesn't work
out, but I need to buy a ticket
to get out to Silicon Valley
as quickly as possible."
[dial-up modem connecting]
[female host] By now, you've
heard the buzz.
[male host] It spans the globe
like a superhighway.
Telephones, TV, and
computers are merging.
Whoa!
[upbeat music]
You know, people who
had access to the Internet
were able
to communicate with each other
across borders.
It was exciting, it was
an enabler of really kind of
a unique type
of communication that hadn't
been able to exist
in that way before
without travel.
Wave your hands, Brian,
so we can get a sense
of how this works.
Okay. Hi there.
I was trying to describe
to one of my undergraduates
what the world looked
like in '97, just 20 some
odd years ago.
And I was explaining to them
that the way we took pictures,
the dominant medium was film.
They're like, "What's that?"
I was like, "Oh man,
I have to define film."
So I explained, you know,
there's this canister
with this photosensitive
material, and you put it
in the camera,
and you take 20 pictures,
and then you take it out,
and you take it
down to the drugstore,
and then, you know,
somebody developed it,
and four days later,
you get your pictures.
I swear to God, I think
they thought I was pulling
their leg.
Like, they couldn't fathom
what the world looks like today
compared to that in '97.
[Jeff Bezos] There are gonna be
lots of successful companies
born of the Internet.
[boys cheering]
[keys typing]
Well, I was getting pretty
depressed like towards the end
of last week.
And I was like, "Dude,
we have, like maybe
40, 50, 60 videos
on the site, but..."
[Steve] I started at PayPal
in 1999
and I left PayPal in 2005.
So during the course
of that period was when I met
the two other co-founders
of YouTube, Jawed and
Chad Hurley.
[Steve] Of the search results,
or of the set that the image
came from?
Yeah, where it came
from in the results, so...
One of the guys that we met
early on was the co-founder
of another service.
And it was called hotornot.com.
This service can be explained
pretty easily.
You look at a photo
and there were two options.
You voted "Hot"
or "Not" on the photo.
We just thought,
"You know, what a great
opportunity it would be
if you created a video version
of this instead of just seeing
a photo."
[laughs]
So that's the true story behind
the YouTube service
as we know it today.
[whimsical music]
We started building the service.
And I would say that much
of the work was
on the engineering side.
What became of YouTube
is not something
that's fundamentally
a new and novel idea
in that people have considered
and thought about,
are there ways to be able
to share video online?
Because plenty of people
at the time had videos
that they wanted to share.
It was just sitting
on their hard drives,
but there was no way
to be able to share them.
[keys clicking]
All right, so here we are
in front of the elephants.
Cool thing about these guys
is that they have...
[Steve] A large part of it
early on
was actually looking
through and kind of seeing
what kind of videos would
people want to be able
to share that they didn't know
were shareable before.
Look, here's finally a service
where if you wanna get
a political message out there,
if you want to get some
kind of message out there,
you can use this platform.
You're not going to be able
to turn on the TV and be able
to see this.
I remember it showing up
as like a place
with goofy videos...
[boy laughs] Charlie.
...that didn't strike me as
something that would
revolutionize the world...
[crowd yelling]
[boy laughing]
[Talia] ...be livelihoods
for so many people...
[yells]
100 million subscribers.
Thank you, YouTube.
Ow!
[Talia] ...spawn everything
from social movements...
I can't breathe!
[Talia]
...to really Baroque dramas.
I don't like 'em putting
chemicals in the water
that turned
the friggin' frogs gay.
Charlie, that really hurt.
[Talia]
That all came a bit later.
[baby laughing]
[crowd screaming]
[Anthony Padilla]
I was like, "Video.
Video is the next big thing
that's gonna be on the Internet.
This is so cool.
"Like, you could record
something on
God, you couldn't even record
on a cell phone then.
It was like,
steal someone's webcam.
[Power Rangers theme music]
Well, we actually started
making videos
before YouTube even existed.
[lip syncing to music]
[Anthony] We don't even have
broadband Internet yet.
[lip syncing to music]
Videos are not a thing
that you do on the Internet.
[Mortal Kombat music playing]
[lip syncing to music]
[slapping sound effect]
[Ian] You know, we had just
graduated from high school.
We were 17 years old,
all of our friends were going
off to colleges,
and we decided to stay
in our hometown of Sacramento,
go to community college.
[Mortal Kombat music]
Then somebody took one
of our videos and put it
on YouTube.
I did a Google search,
and I found
that it was uploaded
to this website called YouTube.
Someone just ripped it
and posted it there.
[Steve] What YouTube really
brought to the game was,
I think, the ability to be able
to utilize the latest technology
that was available, and then
building that
into a cohesive system
that made it much easier
for the user experience.
Millions of viewers now
from all over the world are
logging on
to view their short films
and video projects.
[female reporter] Their Pokémon
video, the second most watched
clip on YouTube.com,
more than 15 million views
so far.
[Anthony] I'm like
this introverted kid who
is terrified of speaking
in front of a class
of, like, 30 kids, but yet...
I'm like, I can do
this on my own,
in my bedroom,
upload it, people can enjoy it.
[skates rolling]
When YouTube first arrived,
we didn't really have
high-speed Internet connections
the way we do now.
I didn't have a smartphone.
[documentary narrator]
They're off.
[Jillian] And so back then,
it was really exciting
to be able
to go online and watch
a video that maybe
you never would have
been able to access before.
[documentary narrator]
This competitor is
determined to finish.
[Jillian]
And yet at the same time,
it was slow.
I think the first video I ever
uploaded to YouTube
was of my cat drinking out
of my water glass.
And I'd taken it with maybe
my first digital camera,
and it took hours to get
it online, and I don't even
know why I did it.
But it was just this exciting
moment to be able
to share something
with the world like that.
[cat licking]
[bell ringing]
These are from a gentleman
at another table.
-What?
-Enjoy.
[Steve] " Dear sirs,
we would like to buy
your video sharing site.
Milkshakes on us!"
[bell ringing]
Awesome. Let's do it.
Hi, YouTube.
This is Chad and Steve.
We're the co-founders
of the site.
And we just want
to say thank you.
Today, we have some exciting
news for you.
We've been acquired by Google.
Yeah, thanks. Thanks to
every one of you guys that,
have been contributing
to YouTube, the community.
We wouldn't be anywhere
close to where we are
without the help
of this community.
Thanks a lot.
[soft tense music]
[Brian Williams] There has been
a big merger in
the business world,
one that may say a lot
about our world these days.
[male host] Web giant
Google will pay 1.6 billion
to gobble up YouTube.
[soft tense music]
[soft music]
[Susan Wojcicki] The light bulb
for me when I realized
that YouTube
was going to be really big,
was with this
very specific video,
which was the first hit
that I saw, actually,
on Google Video.
[lip syncing
to Backstreet Boys music]
Of these two young students,
singing the Backstreet Boys
in their dorm room,
and their roommate is doing
homework in the back.
I still laugh
when I see it today.
[lip syncing]
And I realized, "Wow,
this is going to be a big thing.
People wanna see other people
like them on YouTube.
This is going to be important."
[soft tense music]
One of the things that
I did was I did this model
where I forecasted
that it was actually gonna be
a really big business,
and we could justify the price
that we paid for it,
which was 1.65 billion dollars,
which was a huge price.
I had a hundred
percent conviction.
There was no doubt in my mind
that this was gonna be
a huge trend of the future.
But there were many around
me that questioned it, for sure.
I mean, we saw it in the press,
we saw other people who were
supposedly, like,
tech experts say,
"This was, like,
one of the stupidest
acquisitions ever."
Like, someone told me
how that day
they were with a bunch
of different Nobel Prize winners
in economics,
and they were all talking about
what a bad acquisition this was,
and how could Google have
made such a huge mistake?
So I think people across
the board did not, like,
recognize that this was
gonna be a really,
really big opportunity.
[Chad] This is the dinner
at YouTube. This is what it's
like to work here.
We're gonna feature this video
tomorrow so...
[soft tense music]
[indistinct chattering]
-[man] Cheers.
-[All] Cheers. Woo.
I find it fascinating
the way I think
younger Millennials,
and certainly Gen Z use YouTube
almost the way the rest
of us use Google search.
You know, that's where they go
to find information.
For those of you that
are new here, Monday, Tuesday,
Wednesday, Thursday,
I like to talk about
the news, world events,
and then Friday, it is all
about the conversation.
[Nitasha] I think that it can be
really wonderful, but it's just
always
been baffling to me that
here is the most massive
video platform
influencing billions
around the world,
[horns honking]
and, you know, for years,
Google wouldn't even break out,
like, basic statistics about it
in its quarterly earnings calls.
You know I don't have any
specific metrics to give but...
We definitely...
are seeing impact and
we think we are in early days of
the impact we can see.
[Nitasha] And even just from
a financial standpoint,
how did they just tuck this
under the Google umbrella
as though this is just a,
like a side bet,
when it is, like,
kind of the default portal
to information?
[Steve] Yeah, the making
of YouTube...
Here's your new keys
to your new office.
-[Steve] All right, thanks.
-[Chad] Present you your key.
[Steve] When Chad and I were
running things,
they still gave us
the decision-making power:
we could utilize the resources
of Google after the acquisition,
but we were still
allowed to make the decisions
of where we wanted
to take the product.
[soft music]
I think when it got
acquired by Google,
that was part of the general
narrowing of the Internet.
There used to be web surfing,
where you would just, like,
bounce around
all these different sites
from all these different
creators,
and really be looking
at different things.
And now we're much
more centralized on
our social media feeds,
and there's a sense
of much more controlled chaos.
Like, everyone's talking
and talking over each other
and contributing,
but within these silos, there's
the central platform
of the whole Internet.
[calm music]
[male narrator] The wheels
of the tech industry never
stop turning.
And in 2007, we saw glimpse
after glimpse of new technology.
[soft music]
That was, like, the pinnacle
of the moment when everyone
was so excited about
the potential of technology
and social media to change
the world for the better.
This is one device...
[crowd cheering]
and we are calling it iPhone.
[Becca] There was this idea,
of, like, organizations,
these old bureaucratic
institutions aren't going
to matter anymore
because people can
kind of come together
just using the Internet,
and can change the world
that way.
[crowd chanting]
[police screaming]
[woman] I like to think
of this as the beginning
of the revolution.
So this is a bunch of people
coming together to create
a new cultural impulse.
[engine buzzing]
[Jillian] States would control
what foreign media can do,
and I think we really saw
this at the beginning
of the Egyptian uprising,
when there were only
a few networks that already had
people in the country.
And those networks,
they didn't have the same kind
of access that people
posting to YouTube did.
I think because
of a lack of trust
that the people had
in foreign media
and reasonably so, because
foreign media,
whether intentionally or not,
had ignored a lot
of these issues for a long time.
[crowd screaming]
[interpreter] President
Mohamed Hosni Mubarak has
decided
to waive the office
of the president
of the Republic.
[crowd cheering]
[male reporter] A voice which
ushered in a social revolution
in Egypt
where the most powerful
weapon was social networking.
[crowd cheering]
[calm music]
We were able to connect
with people all over the world.
We were able to have
these conversations that would
expose things in our country,
that helped us to understand
the connections between
all of our countries
and the things that
we had in common.
[calm music]
This on?
I'm Fred and...
Mom, I'm not using your camera!
[Steve] I think one major change
that happened
after the acquisition
was what eventually led
to this concept of the YouTuber.
-Hey bitches!
-What's happening, forum?
Hi, we're the Fine brothers.
For those of you who don't
know me, I'm Brookers,
and I'm an Internet celebrity.
[woman] He's an international
superstar, the most downloaded
man on the planet.
You pretty much can stop
anyone here and ask them
how they found out about K-pop,
they found out on YouTube.
[K-pop music]
[Steve] Korean K-Pop, that's
all possible because
all of a sudden
you have content that
was being created
in one part of the world
that just happened to have
viewers from the rest
of the world.
[upbeat music]
Pow!
You became able to see
actual people
living in other parts
of the country,
other parts of the world.
Hi, everyone. [speaking Korean]
...who were experiencing
the same things you were,
and maybe, you know,
were boldly speaking about them
in a different set of terms
than you'd been able to access
through mainstream media.
I'm back, Black, and ready
to lay the facts.
[Jillian] If you're a person
of color who's not living
in a major city,
who doesn't have connections
to other people
who look like you,
then YouTube enables you
to see people who look like you
who might share your perspective
or who might have a totally
different perspective.
And so it allows you to see
a spectrum of people
who maybe had some
of the same features as you,
or the same proclivities as you,
but who were engaging
in different activities
than the people in your own
community were.
Y'all are gonna be
my best friends.
-Oh, my God!
-Oh, my God.
-Oh, my God!
-Oh, my God.
[Anthony] It was a couple
of years into us
doing what we were doing
before we ever met anyone else
who created content on YouTube.
-I'm Toby.
-Hey everybody.
It's Michael Buck
from the What the Buck Show.
What's up? It's Brittani Taylor.
I'm here at VidCon
with Josh Rhymer.
[Anthony] And it was such
a surreal experience, like,
seeing someone face to face
who also had a following,
who understood this world.
'Cause before that,
it was really, it was a lot
of talk about, like,
how can you use this to get
into mainstream media?
Like, New Media wasn't
a term, it was like,
YouTube was just a place
for trash, is how they saw it.
So get yourself
onto Nickelodeon or MTV.
And like, we literally had
a meeting with MTV
and we pitched
all these ideas and they didn't
know what to do with us.
Hey, everyone.
Welcome back to my channel.
It's your girl Jackie Aina.
If you are new here, welcome.
[Susan] We've really had no
gatekeepers that have said,
you know, "Send me your script
and I'll decide whether or not
we fund you."
We just say, "Hey, post it
on YouTube."
Tonight, my guests are
two transgender YouTubers
from opposite ends
of the political spectrum.
But I've brought them here
together to engage in
a rational, free-thinking debate
about a timeless...
[Natalie Wynn] YouTube is so
individual, right?
Generally, video is something
that is produced with a big team
of people.
[piano music]
It's a very solitary project
for me.
Like, I write the script,
I'm doing the wardrobe,
I'm doing the makeup,
I'm setting the camera up,
I'm positioning the lights,
I designed the set.
I turned the camera on,
I'm sitting in a room alone
talking to the camera,
and then I edited it myself.
I mean,
it's a very solitary project.
[piano music]
[piano music continues]
I don't know, maybe this is
a little bit of vanity,
but it's like, it feels like
a very personal like,
more like writing
a book would feel
than directing
a movie, you know?
[piano music]
I think I first uploaded
to YouTube in 2007.
And it was a video
of me playing the piano.
So YouTube as a website
has been in my life
a long time,
since I was in high school.
What do I look like to you,
some kind of philosopher?
[harp notes]
I majored in philosophy
and psychology,
then I started getting
a PhD in philosophy,
got two years into that
before realizing
"I don't want to be an academic.
This does not suit me at all."
Gentlemen of the Academy,
it is my duty to submit to you
the findings of my latest
memetic research.
What are traps, and be they gay?
I was using a philosophy
background and making arguments,
but I'm also making
entertainment.
[crowd cheering
and cameras clicking]
[male reporter] YouTube
celebrities are laughing along
with their fans
all the way to the bank.
The last one of you to take
your hand off this million-dollar
stack of cash keeps it.
[female interviewer] How do you
go from working in a supermarket
five years ago
to earning more than 12 million
pounds this year?
Part of me is not sure.
I just do something that
I absolutely love and put it
out there for anyone to watch.
I don't need money
I don't need cars
Girl you're my heart
[Steve] You know,
some of the most recognized
musician names out there,
they created their content
when they were 13, 14 years old,
uploading it as sort
of an amateurish piece
of content onto YouTube.
What's up guys, my name
is Shawn Mendes.
[Steve] And then you fast
forward a few years,
and they're performing
in front of hundreds
of thousands of people.
[crowd screaming]
[female newscaster] It's amazing
making money off of YouTube,
and two Sacramento
20-somethings are on that list.
[Anthony] Our channel was one
of the first ten channels
to ever be monetized.
They said we will give you
$10,000 each month
if you have
these ads as part
of this partnership program.
Obviously, that's
a shit ton of money for,
God, we were 19 years old.
[crowd screaming]
[Natalie] In the early days,
no one was doing
this professionally,
or very few people were.
And if they were doing it,
it's because they were making
ad money.
-[man] What the heck?
-I've never seen
a dead person.
- You haven't?
-No, bro.
[Natalie] That really affects
the kind of content that
you can make,
because it incentivizes you
to make content
that gets as many clicks
as possible by any means
necessary.
[reporter] For more than a
month, Mona Lisa Perez said
her boyfriend
had been begging her to launch
his YouTube channel
with a bang.
I may fail, but if I fail,
I wanna die trying.
[Boy] He has a microwave stuck
to his head.
I know this sounds like
a prank call, but he was trying
to film a YouTube video.
[Natalie] It doesn't have
to be good.
People don't have
to wanna pay for it.
You just have to get eyes
on the video.
Dear fat people. Ah! Some people
are already really mad
at this video.
What are you going
to do, fat people?
What are you gonna do?
What are you, gonna chase me?
All right,
pull up with the gang.
You know, and look at who
they offer partnerships to.
No channel grows to hundreds
of thousands or millions
of followers incidentally.
Be sure to drop a like,
hit that subscribe button.
15,000 likes on this video.
[Talia] So much of it is about
generating clicks.
Like, it's even rougher than
the news clickbait economy
because you sink or swim
based on your impressions.
And for a lot of people,
it's their sole livelihood.
And so everything is
in capital letters,
everything's sort
of in bold, and italics,
and exclamation points.
We've been getting 50,000 likes
on all of my videos,
so go hit that like button.
[soft music]
[music continues]
Everything on YouTube
changed when
the recommender algorithm
was introduced in 2011.
YouTube started rewarding
the content that had
the highest click-through rates
and the highest watch time.
And like, some kind
of combination of that.
[upbeat music]
[Anthony] The algorithm doesn't
differentiate between
a positive piece of content
that makes you feel good,
walk away with a smile
on your face,
versus something
that pisses you off
and makes you super angry.
In fact, it tends to sway
toward the stuff that makes
you angry,
because you're more likely
to click on other videos
to learn more about this subject
that's upsetting you,
you're more likely to respond
to a negative thing
that gets under your skin
if it's in the comments.
Unless you are sort
of seeking out specific content,
what YouTube serves you
is sort of a stew of the stuff
that's meant to excite
and engage,
you know, whether
positively or negatively.
[Anthony] YouTube also didn't
want clickbait to be the number
one thing
that was dominating
the platform.
I think that they cracked
down by saying,
"Well, we'll reward whoever
sticks around the longest."
So you can't just have that
enticing thumbnail and
trick people into getting there.
You have to also captivate them
and keep them around.
Part of me is scared
about what impacts
these algorithms
will have on the future
because the algorithm
is a beast that really can't
be tamed
once it's been unleashed
and it's already been unleashed.
We have some really big
projects, we got Food Battle,
of course, coming up,
we've got some big music
videos we're working on.
-Yeah.
-We're upping our game.
[Anthony] But it's like,
it's an endless loop.
You step away from YouTube
for a little while as
a content creator
and you feel a little...
pain in your heart.
Like, what if
people stop...
What, are people gonna
forget about me
if I take two or three
weeks off,
or even a month off,
are people...?
Is the algorithm not going
to show my content
to anyone again?
So the algorithm is controlling
the creator in that sense.
It makes you feel obligated
to continue your momentum.
You can't break your momentum
is kind of the feeling
that most people have.
[eerie music]
These platforms are in
the engagement business.
At the end of the day, they're
giving away their products
for free.
And the way they make money is
by keeping you on the platform
for as long as possible,
extracting data from you,
and delivering ads.
Now Facebook, it's very easy
to see the recommendation
algorithms.
Your newsfeed is
100% curated for you
to provide you with articles
that will keep you
on the platform for
as long as possible.
YouTube, it's a little bit less
obvious.
But if you look at the viewing
patterns on YouTube,
a full 70%, 7-0,
of all YouTube videos that
are watched today
are recommended by YouTube
through Watch Next,
so the autoplay that is
by default on,
and then the Recommended
For You panel
and the recommendations
on the right hand side.
How's it choosing those?
So of course,
part of that algorithm is,
"This is something
that is relevant to this."
But some of it is that,
"Look, we know people will click
on this, and if you click on it,
I get more data
and I get more ads."
And that has led to lots
of problems.
The rabbit hole effect,
the echo chamber.
So for example, if you go
onto YouTube today
and click on the moon landing,
within a couple of clicks,
you will be in La-la Land
talking about conspiracy
theories
about how the moon
landing is faked.
A full 10% of recommendations
were conspiratorial.
That's insane.
Well, I just got kicked out
of Starbucks for asking
a NASA employee questions
because he's lying.
[Hany] So it's not just that
these platforms are neutral.
You know,
it'd be one thing to say,
"Hey, look, we are sitting back,
people can upload videos,
you get to look
at what you want."
That's not how
these platforms work.
They are choosing
to amplify things that engage.
And what we know is that
the more conspiratorial,
the more hateful,
the more divisive,
the more it engages.
Dipshit.
And get that camera out of my...
[engine roaring]
[calm music]
[Caleb Cain] I realized that
the Internet was a place
that a lot of people
describe as escapism,
and I kind of resent that.
Because it's not just that,
I had video games for escapism.
I had, you know, going out
with my friends for escapism.
But what the Internet had was,
especially in the early days
of YouTube,
you could find
dissident information.
You could find information
on there that was outside
the culture.
And the reason that's
so important is because
you can't see
contradictions in your society
unless you get
outside the culture.
[calm music]
Suddenly, I could start to see
how things in my life
weren't working
because I could see what
was going on in other places.
It didn't always lead you
to the right place.
And sometimes the truth comes
with a lie and a price tag.
But you had freedom
that you never had before,
and that I never experienced
in my day-to-day life.
Hi, everybody. My name is Caleb.
You know, welcome to my channel.
So you can probably tell
by the description
on the video, in the title,
what this is gonna be about.
So I'm just gonna get
right into it.
I fell down
the alt-right rabbit hole.
So it begins.
You're in a death battle,
New World Order. We know.
If I lived in Saudi Arabia,
they'd kill me.
If I lived in China,
they'd kill me.
You still gotta sell your soul.
You gotta write
your name in blood.
[man] Whites are getting fed up,
Whites are getting tired
of being disenfranchised
and neglected.
Hail our people. Hail victory.
[laughs]
[screams]
[Caleb]
Well, at first I was kinda just
depressed skimming the web.
But I think at some point,
I had found this video called
"God is in the Neurons,"
and it was from this YouTuber
that I used to watch in
high school.
And it gave me this concept that
I could rewire my brain
and really self-improve.
[man] Our beliefs have
a profound impact on
our body chemistry.
[Caleb] And that took me
to self-help content.
And then the algorithm just
started feeding up
more self-help content.
There is no such thing
as mental illness.
[Caleb] You start with someone
like Stefan
because you're looking
for self-help.
And so we call that
an on-ramp, right?
In the study of extremism,
we would call that an on-ramp.
And you go from that self-help,
which has now validated
your identity,
it's given you
a direction to go,
it's given you an algorithmic
list, to use the word
a bit ironically,
of things that you have to do.
And I'll tell you what,
people are looking for
that in a complicated world
and especially young
men are looking for that.
It's now time
for the return of men.
[Caleb] It seems that
the algorithm would always
pull you towards
the more hyperbolic content,
towards the content that
was a bit more extreme.
Muslims by the million
have been pouring into Europe.
And you can see
the terrible problems
that have arisen
as a consequence.
[Caleb] And there was
an uncomfortability when
you went a bit deeper
into this specific ideology,
because not only is
it depressing, but it's hateful.
That specific rabbit hole,
I feel like what it was truly
doing was, it wasn't
radicalizing me,
I was already radicalizing
because of how
my life was going.
But it was killing my empathy.
It was turning me
into a sociopath.
[insects buzzing]
YouTube is now like a Walmart in
a town where there's a Walmart
and a Dollar General and a CVS.
Like it's one of the sole
organizing destinations.
And in the case
of the far right,
and like, right-wing
political content,
radicalizing political content,
it's like Walmart having
a big old gun aisle.
[Natalie]
Well, I think a lot of people
come to YouTube to do
commentary, to do politics
because they feel like
their perspective is not
represented elsewhere.
Words like racist, misogynistic,
and transphobic are not insults,
nor are they stereotypes
or generalizations.
Rather, they are facts about
the way we are socialized
in a Western society.
I'll be honest with you,
if I hate anything about
Black culture,
it's that it's such
a victim culture,
almost a victim cult.
[Natalie] I do think in general
that people who feel
marginalized
or who feel like they're
on the fringe, tend to gather
on YouTube.
You don't deserve to get laid.
[Caleb]
Then along comes the alt-right,
and they really took
that whole dynamic
and just revolutionized it
and innovated it.
And then they took many
of those Internet nihilists
and they took
that radical energy,
and they directed it,
well,
towards dominance, and fear,
and hate, to be honest with you.
[sigh]
Well, this is my last video.
It all has to come to this.
You forced me
to suffer all my life,
and now
I'll make you all suffer.
[female TV host] An angry and
psychologically twisted young
man whose public rantings
exploded into sheer terror.
[gunshots]
[policeman]
Shots fired. Shots fired.
[female TV host]
Taking six young lives
and injuring 13.
A virgin vowing revenge
in a twisted video he posted
Friday, addressed at his
perceived enemies,
young women he says
rejected him.
You will finally see that I am,
in truth, the superior one,
the true alpha male.
[woman] Aggression, violence
and unbridled ambition can't be
eliminated
from the male psyche.
They can only be harnessed.
Rape, murder, war.
They all have two things
in common.
Bad men who do the raping,
murdering and warring,
and weak men
who won't stop them.
We need good men who will.
PragerU is an interesting
example because
they pitch themselves
as an educational channel.
They very much mimic
online course stuff,
but what they're feeding you
is very, very radical
right-wing ideology.
My message has been simple.
Islam is not
a religion of peace.
[Talia] I think one
of the things to debunk
is this idea that
it's solely an algorithm issue.
One big mechanism
of radicalization,
it's not just algorithmic,
it's very deliberate.
These channels, they build
parasocial relationships
with their viewers
so they feel a deep
connection with you.
I think it's more visceral
with video content.
I think it's more intense
with video content.
My name is Candace Owens,
and you are watching
my vlog series.
[Talia] You know, the sort
of confessional vlog-y, casual,
"This is my living room,
you're looking at me,"
kind of style, really, really
facilitates that kind
of parasocial relationship.
I just wanted to say thank you.
I've seen some of the comments.
Some of you are a marvel.
You send actual letters.
[Talia] And it's just, like,
a Suggested Follow.
And that's its own issue.
Because these are very
well-funded channels,
these are big moneymakers.
[Steven Crowder]
There are two videos completed
here at Louder with Crowder
that this team never would've
been able to do,
if not for this man, Kevin,
lending us his personal plane.
[woman]
My skin's been pretty good,
to be honest.
I think YouTube is
a very intimate format.
If you're watching
a YouTube video,
people tend to watch
it by themselves,
you watch it while you're
eating, you watch it while
you're going to bed.
You're watching
one person talk to you.
There's something very
intimate about that.
You feel like you know
this person.
It's a little bit uncanny,
you know, because they feel
that they know me,
but I don't know them.
We've talked about some heavy
stuff on this channel.
But I was gonna say,
"Oh, it's one of th..."
You know, if someone's
broadcasting from their bedroom,
you feel this real sense
of intimacy with them,
you feel like you know
the intimate details
of their lives.
I remember once I had
a girlfriend who was not
jealous to begin with,
but became kind of jealous,
particularly as I became
more successful
as an entrepreneur.
She became more jealous
and more insecure.
[Becca] And then when they start
kind of telling you
about their beliefs
and views,
that packs a real punch
that other delivery
mechanisms wouldn't.
The right are the producers,
the makers,
and the left are the takers
or those who manage that
taking and thereby gain
political power
by being able to hand out gifts
that they have not earned
but are taking from the makers.
[Becca] And so there's become
this kind of cottage industry of
far-right creators on YouTube
who take advantage of that
and broadcast to their audiences
and are quite effective
at radicalizing them.
I'm a weapon.
I'm made to be thrown at you.
[laughs]
[uplifting music]
Just between you and I,
I know the cameras aren't
on right now, right?
-Say, "What?"
-What?
I know the cameras are not
on right now.
-Keep saying, "What"?
-What?
[laughs]
[Anthony]
Yeah, I mean, it was
really hard for me to,
like, even think about where
YouTube was going because
we were so caught up in it.
But I definitely thought that
it was going to replace
mainstream media.
All right. Love you guys.
Thanks again for supporting.
-Thank you.
-Love you. Bye.
I knew that this was going
to be the way
that the Internet
was going to go.
I've had offers to do a show
for Netflix, or for Amazon
or cable TV, or whatever.
I always declined them.
Because I don't really see why
I would give up
my own creative control,
and this, like, child that
I've raised, basically.
You know, I want to keep it.
[saw buzzing]
[Susan] We're a platform
that enables creators
who are really next
generation media companies.
Welcome back to my channel,
everyone.
Today I wanna show you guys
some life hacks. Now, before...
[Susan] We've seen so many
people take their passions,
whether it's about cooking,
woodworking, music,
and turn that into a business
and become a creator
and become a global media
company on YouTube.
Hello. Hi. How's it going?
I finally got it.
I got the 10 million
subscriber plaque.
But this is special because
it represents, like,
how far we have come
as a community,
and it represents, you know,
how much we have accomplished.
[birds chirping]
[Brianna Wu] What I thought was
so amazing about YouTube
when it first came out
is it really allowed
this hyper-segmentation.
So if you were a gamer
that wanted to understand tricks
from an obscure Japanese game
from 20 years ago,
[upbeat music]
boom, that exists out there.
[upbeat music continues]
And it was just
this massive explosion.
But I think for me,
obviously Gamergate was
when I really realized
something was going off course.
[dramatic music]
I had just finished shipping
my very first game.
And, you know, getting into
a political fight with
the Internet
was the last thing
I wanted to do.
But people on YouTube had
started to come together
in a way
to silence women
in the game industry
that were starting to ask
for more representation,
asking to be hired,
asking to be given
more opportunity.
[gentle music]
It started out with
this kind of Tumblr-style
call-out post about Zoe Quinn.
[Natalie] I guess
her ex-boyfriend accused her
of sleeping with journalists
and cheating on him.
And I started to speak
out against it.
[calm music]
I was sitting
at my home one day,
and I get this really credible
list of death threats that
went mega viral on the Internet,
and they are burned
into my brain.
[gasps]
"Guess what, bitch,
I know where you and
your husband live."
They give my address.
"You're going to die tonight."
"I'm going to rape you
with your husband's
tiny Asian penis
until you bleed."
"If you have any children,
they're going to die, too."
That was the moment that
I decided I wasn't safe
at my home.
And I left, and I got the police
and the FBI involved.
Brianna Wu is a games developer
who says she was forced
to leave her home over
the weekend after receiving
targeted threats.
She joins me now from Boston.
[Zoe] Instantly,
they dived in to,
"Find where she lives.
Find where all these people
live.
What are we going
to do about her?
Can we hack her e-mail?"
Like, instantly?
[mouse clicks]
And then all of these accounts
started being made
to talk about really
disgusting personal details
and talk about like, make really
disgusting sexual comments.
And the first thing
you lose is all perspective.
[soft tense music]
[Brianna]
There was an organized effort
to basically do SEO warfare,
search engine warfare.
You know this as well as I do.
If I type your name
into the Internet,
a bunch of YouTube videos are
going to be the first thing
listed.
So what Gamergate figured out
was that they could game the SEO
and start putting out
these videos, you know,
Brianna Wu is a terrible person
for A, B, C, D, and E,
and basically malign my name.
YouTube was a major part
of the harassment
that I received.
[seagulls squawking]
[horns honking]
[engine revving]
[horns honking]
[Carrie Goldberg]
Yeah, I hear that a lot
where people are like,
"Listen, the Internet,
it's just a mirror on society.
And so why should
we be focusing on fixing
the Internet instead
of on society?"
And the thing is,
that, in my opinion,
and my experience
as a lawyer for people who've
been fucked over by tech,
is that it is itself the weapon.
The Internet has
created this, like,
really convenient
mechanism to harm.
And the platforms are just,
like, minting money
off of it.
And so I don't agree that
it is just a reflection
of society.
It's changed society.
[dramatic music]
The thing about the algorithm is
what they're recommending
is a harm.
Their lust for hijacking
people's attention
is an additional harm.
The other issue is them
publishing content
that is itself harmful,
that they're not removing.
[Norah O'Donnell] We begin with
breaking news in Virginia.
A gunman opened fire during
a live TV news interview.
A reporter and photographer
with our CBS affiliate WDBJ
in Roanoke were killed.
[male reporter] Andy Parker
couldn't get video
of his daughter's murder
off of Google's YouTube.
[soft tense music]
[birds chirping]
[Andy] I went to YouTube,
typed in Alison Parker,
and literally there were pages
and pages and pages.
There must have been
25 to 50 pages of, you know,
thankfully it didn't autoplay,
but you knew what it was.
And in the titles of it,
a lot of it was,
"See, the whole thing was fake."
You know, it was, like,
one thing after another.
It's not like you can
just call up Google customer
service and go,
"Hey, you know,
I got a problem here."
[soft tense music]
[Susan] Well, we've invested
a huge amount in the area
of responsibility.
We also have thousands
of people who help us
with the enforcement.
We have machines
that remove, you know,
almost 90% of the videos
before anyone has to see them.
So we have a huge initiative
across the board
to make sure that we're
staying current in terms
of the creation of the policies
as well as the enforcement
of those policies
to find those videos
as quickly as possible,
and make sure they're removed.
If you need me to work
on the op papers,
I'm free and available.
It just makes me feel
like individuals don't stand
a chance.
You've got this, like,
multi-billion dollar company
that's, you know, just, like,
lording ownership of your dead
daughter's murder video
over you,
that's saying, "We're not
going to take it down because,
you know, like, you aren't
the copyright holder of it.
And even though our terms
of service say that we ban
this kind of violent imagery,
we don't have
the legal requirement
to even enforce
our terms of service.
Like, you can't make us
do anything."
We have kind of a mantra
of don't be evil,
which is to do the best things
that we know how for our users,
for our customers
and for everyone.
And so, I think if
we were known for that,
it would be a wonderful thing.
[Larry] When Google started,
the two founders said,
"Our motto is don't be evil."
And for a while they weren't.
But now they are
the personification of evil,
and they don't care.
You can flag this stuff,
and then
you can make that go away,
but then something
else will pop up.
That's why I said we have
to go after the great
white whale, Google.
[soft piano music]
So YouTube likes to say
it's a completely different
company from Google.
What's your opinion about that?
My experience is
the two have always been
treated as synonymous
in every interaction
I've ever had at Google.
I can tell you the impression
I've gotten is that
these products
are kind of all under
one umbrella.
I realize it's more complicated
than that when you get to,
you know, product managers
and who owns what,
but, I personally don't find
that excuse credible.
[tense music]
I think that Google has really
been far ahead of the game
from its competitors
in terms of anticipating
these legislative fights.
[Alan Davidson] We've realized
that we need to have
a larger presence here,
that the Internet is affecting
a lot of how people live today.
And we need, our industry
needs to be here
to help explain that
to members of Congress.
[Nitasha] You know, the company
has spent more than a decade
kind of cultivating politicians
on the right, on the center.
If you look at pretty much
any policy decision where
Google could potentially
be impacted, you know,
sometimes I think about
it as like a big circle,
and you know,
you have the left and the right,
and Google has given money
to just everyone
involved in the debates.
[laughs]
[Eric Schmidt] Google's mission
is to connect the world,
right, to get all that
information out.
We want a free open Internet
for every citizen of the world.
[somber music]
[male]
Leaders in government
and tech want to rewrite
a law that shapes the Internet.
Why we're sitting here today,
it's Section 230.
Because there's only
so much you can do.
We filed a complaint
with the FTC
claiming that Google
violates their own terms
of service,
which they do,
and they don't care.
[somber music]
There's the Communications
Decency Act,
Section 230,
which completely protects
the websites, the search engines
from any responsibility
to users.
When they talk about kind
of maximizing speech
on their platform,
that also means maximizing
monetizable content for them.
With Section 230,
what it means is that
they can do that without
any real legal responsibility
for what happens on
the platform.
Section 230 has
two key components.
So, one of them
is that it gives us
protection from liability
for content that's posted
on our platform.
But the second is
that it enables us also
to remove content
that could be harmful
to our community.
But I think now,
20 plus years on,
we have to look at 230
and say, "Look, guys,
230 made sense when
you thought about
platforms as neutral platforms."
[man] Listen, Operator,
it's a very private call.
Now, you're not going
to be listening, are you?
[operator] Well, I won't even be
on the line. Just...
[man] You're going to get off
the line as soon as I get
the party, right?
- Right. There will be
no one on the line.
- Okay.
[Hany] It's like the wire on
the telephone, right?
I'm just creating a mechanism
for people to communicate.
If they're planning
to commit a crime,
you can't hold AT&T responsible.
Well, sure, that makes sense.
But that's not what
social media is anymore,
and it hasn't been
for a really long time.
They are not a neutral arbiter
of the material that is
uploaded to their sites.
They pick and choose
the winners and the losers
through the recommendation
algorithms.
Yeah, it raises a lot
of questions like
is kicking someone
off Twitter censorship?
Are we okay with that?
Like, I think that, you know,
a kind of obvious
response is to say that,
"No, the First Amendment
protects speech, protects
from the government,
doesn't give corporations
an obligation to host
your content."
If there is no Section 230,
there is no free speech, period.
I think we need to be
tremendously skeptical
and careful in tearing this up,
because it is the foundation
of the Internet.
That said, I do think that
we can look at amending it
and changing it and
updating it in different ways.
[Carrie] They don't see us
as important, we're not
their actual customers.
We're just their money makers.
They advertise at us.
They collect our data.
If there's content that
we want down
because we own
the copyright as an individual,
that is not meaningful to them.
This week, I got myself
a death threat.
This is messed up stuff
that people are saying,
[sobs]
Like, people telling
me to hang myself,
people just, like, blatantly
disrespecting the fact that
I'm still a human being
is not okay at all. [sobs]
I'm kind of simplistic in this,
it all goes back to us
having the right
to sue if we're injured.
Because most people,
like, most of the time
on these websites,
are not injured.
When I told him to leave me
alone and asked if he had
a knife,
he didn't respond. And when I...
[Carrie] But the rare occasion
when somebody is,
which on scale is
a lot of people...
You are an ugly, untalented,
fat [bleep] [laughs]
[Carrie] ...we need to be able
to hold them responsible.
The algorithm, it really
fostered this community
of videos that
people would kind of love
to watch because they loved
to hate the thing
that was in there.
They'd be like, "Yeah,
that person fucking sucks."
And I think it's time we take
a look at what canceling
really is.
I am so used to reading mean
things about myself online.
Most of you probably can't even
begin to imagine how used
to it I am.
[gong clangs]
It was just kind of driving
this community of, like,
pent up rage.
[music building]
[somber music]
[music continues]
[man] So YouTube has
set up spaces
around the world where
YouTube creators can go
and create their own videos.
[Susan]
We're a video-first platform
and everything we do
is about video,
enabling new content
creators to come onto
the platform
and to be successful.
And so I expect us to continue
to really focus on the video.
I think technology will,
of course,
will continue to change
how we communicate
and the content that we see.
But YouTube will stay
focused on video.
[mouse clicks]
[Anthony] They want growth.
Growth. Growth.
Everything's about growth.
There's so much content
on YouTube right now
that it's completely
overwhelming to ever seek
stuff out.
You're not just going into
a search bar looking for things,
you're clicking on what's
being presented to you.
So I think that YouTube is going
to have to get to a point
where they make people feel
safe to be on the platform.
But then also,
they're having to tread
really murky territory where
they can't be kind of, like,
the arbiters of truth in
the sense that they choose
what is to be believed.
[crowd cheering]
[crowd cheering]
I'm going to fight to bring us
all together as Americans.
We're living in
a divided country,
it's not going to be divided.
We're going to love everybody
like we love the people
in this room.
Specifically on YouTube, the
algorithm was really rewarding
the stuff that would get
people up in arms and feeling
like they had to fight
against something for a cause.
America was built by and for
the white Christian people
of this nation.
[crowd screaming]
[Anthony] And politically,
that was a huge drive.
I know that having what felt
like good versus evil,
really gave people a reason
to keep clicking on more
and more of those videos.
So leading up
to the 2016 election.
[male reporter]
Donald Trump will be the 45th
President of the United States.
[man] We have to get better
at listening to each other,
and challenging each other
constructively and generously.
But I worry that the very
architecture
of the social Internet might
make that impossible.
[male reporter] In the week or
so since the election,
there has been mounting
criticism of whether web giants
like Facebook and Google
used enough discretion
and editorial responsibility in
screening out fake news sites.
Comet pizza. Here we go.
[male reporter]
According to police,
Welch said that he had read
online that the Comet Ping
Pong restaurant
was harboring child sex slaves.
Tell me why this pizza place
isn't even open at noon.
[gentle music]
[male reporter] DC police say
Welch fired at least one round
into the restaurant floor
with an AR-15 rifle
like this one.
[emotional music]
[music continues]
[Dave Lauer] There must not be
a tipping point
because there is no doubt
that YouTube bore
a huge responsibility
for the spread of misinformation
in 2016.
[Paul Joseph Watson] One expert
told me that Hillary has
high functioning autism
with attendant sociopathy.
[crowd screaming]
[Dave] Now we're in the sort
of misinformation apocalypse.
[chanting angrily]
I think there's blame
on both sides.
[crowd yelling]
What started off
as this niche field
in the burgeoning digital
revolution, turned into the,
"Crap, everything I read, see,
hear online is now suspect."
Everything we're seeing
in the news right now
is just insane,
fake news garbage.
If the Proud Boys are left alone
to do their thing,
they wave little American flags,
and then go to the bar to have
a drink, and nothing happens.
-You are fake news. Go ahead.
-[Reporter] Sir, can you...
[Brianna]
There were people at YouTube
that were aware that
the tools that they built
were being misused.
So they started to put together
basically task forces,
looking at the problem,
talking to women like me,
getting our experiences
and doing what Google does,
which is talking to experts
and trying to come up
with concrete,
realistic things they can
implement to solve the problem.
Unfortunately,
despite participating
in three of these task forces,
none of them were
successful in getting
Google to change their policy.
My name's Anthony Padilla,
and today I'll be spending
a day with Susan Wojcicki,
the CEO of YouTube,
who began as one of
Google's first employees.
If there's anyone in
the comments right now angrily
claiming that you, Susan,
are the key reason
that YouTube isn't the way
they wish it were,
what would you say to them?
It's much more complicated.
[laughs]
It's not just me.
You're not going
to blame someone else?
[both laugh]
I'll ultimately take
responsibility for everything.
But it has to do with the fact
that these issues are
much more complex
than people understand,
that maybe the changes
that we implemented,
for example, are due
to regulation or
they're due because
these changes we've done
will enable more advertisers
to come and spend more revenue
with YouTube creators.
I haven't been there for
close to a decade now.
But I still think that
the decisions that
they make are often prioritized
by what the end user wants,
not necessarily by,
you know, quarterly earnings,
or the quarterly
financial side of things.
If they wanted to,
there's a lot of opportunity
to be able to monetize,
but it would be at the cost
of the end user experience.
They are obligated to maximize
shareholder value.
If they don't do that,
they are exposed
to liability and lawsuits.
When that's your incentive
and you don't have liability
for the content on your system,
it leads to some very
perverse incentives,
such as building machine
learning systems
and algorithms that attempt
to capture people's attention,
addict them to content
and keep them on your site,
so that you can show them
more advertising,
because that's how
you make your money.
Well, I would just disagree
with that point of view.
And the reason
I would is because
when there is any kind
of harmful misinformation,
that is bad for us
and bad for our business
and bad for us financially
and doesn't work
with our business model.
We are an advertising
supported platform,
so no advertiser
is going to want
to be on that type of content.
And we have seen advertisers
pull back their spend
when they see that we're not
managing our platform
to keep our users
and community safe.
That reason, along with
wanting to be on the right
side of history,
wanting to do the right
thing from a brand,
from a PR,
from our employee standpoint,
from just thinking about
what's the right thing to do
for our users
in our communities.
[upbeat music]
[applause]
[Robert Kyncl]
Digital video is exploding.
Already the youngest
millennials are watching
more digital video than TV.
And in fact,
it has overtaken social media
as the top online activity.
If you were to ask
college students,
"What do you wanna bring
when you have a small dormitory?
Do you want to bring
a sort of a laptop
that you can watch YouTube on,
or do you want
to bring a plasma TV?"
It's always going
to be the laptop
with how many things
you can do with it.
[calm music]
[waves crashing]
[Ryan Kaji] It started
when I was three.
I was seeing a lot
of other people on YouTube
and I wanted to be
on YouTube too.
[kids yelling]
So I asked my mom
and she said yes.
[Loan Kaji] Okay, Ryan,
are you ready to find
the egg surprise?
Ready? Go, go, go.
[Jared Reed]
This kid, Ryan, who's on the top
of the Forbes list, he earned
$22 million this year.
-[Loan] Whoa!
-Whoa.
[reporter] He has
more than 12 billion views
on his YouTube page.
If you have children,
you know him.
-[Loan] Ryan?
-What?
-I have a surprise for you.
-What?
-Look, look over here.
-Whoa!
[Loan] And you don't need much
to get started on YouTube.
I started filming
with just my phone.
-Hi Ryan!
-Hi Mommy.
And I still use it to this day.
And, you know, when I started
to do more research,
I didn't even know, like,
what a green screen was.
We couldn't afford
a green screen.
We just happened to have
a tablecloth that was green.
And we're like, "Hey,
we can make this work."
And that's what we use.
[Shion Kaji]
Definitely, four months
after we started the channel,
we started seeing the tipping
point in the viewership.
We started seeing views from,
India, East Asia,
all around the world.
Okay, guys, let's open up
the Ryan's World
World Tour Globe.
[Shion] Around the same time,
we started our own channel
and I think that timing
really matched so well.
Which one? Which one?
The purple one?
Are you sure you can do this?
Oh, good job!
[Loan] The most important
thing we tried
is just to make sure that he
still has time being a kid.
And make sure
that even though yes,
he's well known
around the world,
to me personally, I don't think
he notices it that much
because, again, we really try
to keep him grounded
and really try
to make sure that YouTube
is not the essential part
of who he is.
[all yelling]
[child giggling]
[whimsical music]
There was this offering
that YouTube
had called YouTube Kids,
and in 2017 it had 11
million weekly viewers,
which is just a huge number.
Wow!
[Dave] Now, it was really maybe
a couple years later
that it came out
that there was this effort
by content producers to use
YouTube and YouTube Kids
to basically
generate content that would
leverage what they knew
about the algorithm
so that it would be
recommended to children.
There were videos
that were frightening
and traumatizing children,
that depicted abuse.
They were hijacking
Disney characters
or Paw Patrol and showing
them being stabbed
or hit by cars or killed
or committing suicide.
And all of these videos
were being promoted to children.
So it was really a disturbing
phenomenon and really just goes
to show how dangerous this
kind of technology can be.
It really gets into this idea
that when you codify
something like that
into a recommendation algorithm,
results end up being dominated
by particular behaviors and
especially those behaviors
that make YouTube
the most money.
[Shion]
I guess the biggest challenge
of the parents
and of the producers too,
is the kids influencer business
is still untouched territory.
It's still very new.
You know, the regulation
is still not 100% perfect.
There are many things
that we have to figure out
as we go and then make sure
it's a safe place for kids.
[Dave] There was this
weird thing where, like,
this scary character was just
popping up in kids' videos.
A terrifying video that
targets your kids.
The online challenge encouraging
children to kill themselves.
[Dave] You know, that was
something that happened
after YouTube
had supposedly gone in
and fixed the problem.
When we talk about the YouTube
recommendation algorithm,
that's just a great example
of something where
the focus on fixing a system
for recommending content,
if we can just figure out
that right piece of technology,
everything will get fixed.
And that's a very Silicon Valley
way of approaching things.
And I don't think
it's the right one
because these
are not technology problems.
These are far broader problems.
Even putting up a stoplight
at an intersection,
it only gets made
when there's like
a certain amount
of deaths or something
or enough people complain
and people are only complaining
when there's
a reason to complain.
And it's kind of like that here.
I don't...
It's kind of dangerous
how we are just waiting
for there to be enough
of these digital car crashes
in this digital intersection
with no stoplight.
[slow music playing]
[sirens blaring]
[reporter] Injured people
are rushed to hospital.
They were gunned down during
Friday prayers at Al Noor Mosque
in the center of Christchurch.
[reporter 2]
Police confirm the gunman
was livestreaming the killings
at one mosque on social media.
[woman]
It was the most devastating...
[singing "Imagine"]
In December of 2020,
the New Zealand government
released the results
of this research
that they had done
into the shooter
and his radicalization process,
and they found that he
had donated money
to a series of YouTubers.
He basically spoke about
being radicalized on YouTube.
It wasn't in these...
We like to talk about
the deep,
dark corners of the Internet.
You know, it conjures
up these scary images
of the dark web
and all of these things.
But this was actually
right out in the open,
you know, right on YouTube
is where he got radicalized.
We bring our ideas to you.
Our ideas have power.
Our time has come.
There is nothing that
can stop an idea
whose time has come.
And that time is now.
[applause]
[slow music]
[Caleb] That shooting,
it really disturbed me.
And I, you know,
I realized in that moment that
this ideology would literally
kill my friends, you know?
And so, yeah,
I had to make a change.
I had two individuals message me
and they said,
"Hi, I'm not going
to tell you who I am,
but I'm inside
all the far right chats
and they're pissed off at you
and they've got your address."
And then he posted my
address and I was like,
"Why are you
telling me all this?"
And he said, "Because, man,
you're a fucking cuck.
But what they're saying,
what they're talking about
doing to you is fucked up."
I think what YouTube
needs to do is they need
to have very clear terms
of service on their websites
of what's acceptable
and what isn't.
You know,
I am a free speech advocate,
but what I saw on the
platform was people
taking advantage
of this algorithm.
And the algorithm does not care
about what your politics are.
It cares about watch time
and keeping you on platform.
And I think I left it
the same way I went in it,
is that I'm curious and I was
always looking for,
the world's
really, really screwed up
and how can I help
make it a little bit better?
And so that was always
the motivation going in
and that
was the motivation coming out.
[Destiny]
Basically what you want to do
is you want to find
somebody that feels like
they're fucked in some way.
"Make America Great Again,"
right?
You want to find somebody
that feels like they're fucked
in some way
and you want to speak to that.
And when you hone in on that,
you can bring a person onto
your side to believe anything
that you want them to believe.
[Caleb] And once I found,
you know, content debating
with these alt-right figures,
found Destiny,
later on I would
find ContraPoints.
Hi. In this video
I'm going to talk about
how to recognize
a fascist.
[Caleb] Yeah, I got obsessed
with learning about that,
because how are these people
disproving all my gurus,
you know, disproving all these
people who are supposed to be
my warriors
that are out there fighting
to save my civilization?
How are they getting slayed
on the rhetorical battlefield
by a transgender woman?
The strategic fascist knows it's
better to start with realistic,
achievable goals,
and that means focusing first
on stopping non-white
immigration,
something they'll try
to get you, the centrist,
conservative or liberal,
on board with
by emphasizing the danger
and criminality of non-white
immigrants and refugees.
I always viewed
the channel as a kind of
attempted intervention
that didn't assume that,
I never assumed that YouTube
was going to step in
and save us.
I never assumed
that these people
were going to be banned.
The question was,
is there some part of this
audience that can be kind of
pulled away from the edge
of this extremism, basically?
I started to meet people,
trans people, Muslim people,
gay people, all these people.
And it's not just I met them
and now I'm like,
"Oh, I feel comfortable
around you,"
because I was never,
I never had a
strict phobia of any individual.
But what I, when I
would speak to them,
I would get to hear their
experience and I would actually
listen to their perspective
and I would ask them
challenging questions.
If there's anything
you guys can teach me,
if there's anything
I can teach you,
let's correspond,
let's get together,
and let's figure this thing out,
because it's a problem I believe
that we can figure out.
They do it like Tyler Durden
does in Fight Club.
They take you through your
existential moment
and destroy your old identity,
which is your false identity
that your masters gave to you.
Just let go!
[Caleb] And then they give you
a new false identity,
which you are now
beholden to them.
Without pain, without sacrifice,
you would have nothing.
[Caleb]
And so now you just
run down the pike.
That's what's kind
of going on here.
And of course,
the trick is they lie to you
about a bunch of stuff, too.
They reframe the truth.
[man] There's been an awakening.
Have you felt it?
[male 2] Have you ever wondered
why we go to war?
Or why you never seem to be able
to get out of debt?
Why there is poverty,
division and crime.
What if I told you there
was a reason for it all?
What if I told you it
was done on purpose?
What if I told you that those
who were corrupting the world,
poisoning our food and igniting
conflict were themselves
about to be permanently
eradicated from the Earth?
[birds chirping]
[Tyler] My mom started prepping
for World War 3, apocalypse,
you name it.
If you can imagine
like a person just prepping
for a natural disaster as such,
she would go
ten times beyond that.
She started arming herself
with weapons, guns.
But she just started wearing
the weapon around the house.
[door thuds]
-[Alex Winter] Hey, Tyler.
-Hello.
You can just tell us a bit about
when you first started
experiencing the changes
that were happening at home.
It was, conspiracy theories
to me have always been
just part of the social
media platforms,
since, you could say,
the dawn of time
from the digital age and such.
Once it started approaching
closer and closer
to the elections itself,
the conspiracy theories
started becoming more
and more wild from things like
microchips and the vaccination.
China having
troops at the borders.
Trump is basically fighting
against the cabal or such.
[ominous music]
She said that QAnon were two
separate entities
named Q and Anon,
run by two quantum computers.
That they weren't even people,
they were AIs that
helped recruit Trump
to fight the good fight.
From there, it just started
getting even worse because
now she started preaching
these conspiracy theories
like they're the Bible
and started getting in my face
about it all the time.
I couldn't say no.
I couldn't say yes.
Either one would
trigger a confrontation
between herself and me.
And especially
since I didn't hold
the same beliefs as she did.
She said things like, "Oh,
I'm just trying to protect you."
And I'm like, "Yes, from what?
The neighbors are not
going to harm us.
I'm afraid sometimes
that you're going to go
out there and start killing
people or something."
She took offense to that
and eventually
it devolved to, the argument
devolved into me saying,
"I'm done.
I don't need you anymore."
And from there I just
called up some friends
and left on
just whatever I can grab.
[reporter]
The World Health Organization
has now confirmed
the coronavirus is a pandemic.
[reporter 2] A quarter
of the world's population
is now living under
some form of lockdown
due to coronavirus.
[Donald Trump]
Now the Democrats are
politicizing the coronavirus.
[doctors]
Let's raise the, raise the bed.
[Donald Trump]
And this is their new hoax.
The coronavirus pandemic
was the perfect storm
for radicalizing people.
There is no evidence that I can
see that a pandemic exists.
[Robert Lewis]
I don't personally know anybody
that's had it.
I don't know anybody that knows
anybody that's had it.
So you either wear the mask...
And I'm not doing it because
I woke up in a free country.
[Talia]
It created a lot of isolation.
It gave people a lot more time
to be on their computers.
You cannot
make people wear a mask.
It's not our laws at all.
This is just made up
by Bill Gates and them.
Go online and look it up.
[Talia] You know,
there were these
massive social disruptions,
like over the nature
of the fabric of reality.
-[woman] Are you laying 5G?
-Yeah.
[woman] You know,
when they turn this on,
it's going to kill everyone.
And that's why they're building
the hospitals.
[crowd chanting]
[Talia] In person, the reopen
rallies like were attended by
militias and white nationalists.
Texans know this is a Chi-Com
globalist bio weapon
meant to shut down our economy.
[Talia] The same was true
on the Internet,
that the people pushing
reopen content
or talking about COVID
restrictions as state tyranny.
We're talking about
the COVID lockdown.
What we're really talking about
is the great reset.
Everything that you're
seeing right now
is a giant political conspiracy.
[machine beeping]
We're going to do
everything we can.
That's what
I can promise you, okay?
[Anthony]
When you meet someone that says,
"Oh, I'm somewhere
in the middle,"
which I feel like what
many people were at some point,
now it's like,
"Oh, you're in the middle,
then you're against me,
fuck you."
And the algorithm
really plays into that.
[yelling]
Before we get to the topic
at hand, which is COVID-19,
we have to address the protests
currently going on
and the murder of George Floyd
at the hands of police.
[crowds chanting]
[reporter]
The outrage over the death
of George Floyd is global.
[woman]
We're here to enact change.
We're here to say
that our lives matter.
We need the community
to understand that.
[crowd]
We want change! We want change!
Certainly, all of those factors
were pumped up to 11.
[yelling and screaming]
[reporter] Dozens of American
cities up in flames
after some protests
turned into riots.
[crowd yelling]
I think that the way
that the platforms
and YouTube in particular looked
at these algorithms was
much more
from the perspective of,
"Oh wow, let's find a
way to get people
to see content coming
from like-minded people."
And I don't think it
had occurred to them
that like-minded people might
be people who were angry...
Start a riot?
Black lives matter? [bleep] you!
[Jillian]
...and looking for the kind of
content that would help them
connect with other people
who were angry in that same way
and not oriented
towards justice.
A lot of people turned
to conspiracy
for the first time.
This is experimental
bio warfare on the people.
[Talia]
Some of them became out and out
white nationalists
and some of them
became sympathizers
and others became
QAnon supporters.
This is like a global takeover.
They're trying to create
a new world order, right?
They thought they could easily
get their great reset.
Little did they know!
Little did they know!
They thought they
could easily have it.
Pandemic's a hoax!
[Talia] It grew and grew
and grew and grew.
[Caleb] On January 6th,
I went down to the Capitol.
[screaming]
One thing that I've
tried to understand
is Internet subcultures.
And I was at the Capitol
to take pictures
of the different
extremist groups.
[crowd]
Stop the steal! Stop the steal!
They chased me around,
called me antifa.
Tried to get
the normal MAGA people,
because I could tell these were
some Proud Boys types.
We're going to storm
the fucking Capitol.
Fuck you fuckers.
Bellingcat did a great
study of looking
at sort of neo-Nazis on
Telegram who discussed
how they'd been radicalized.
And just like so many of them,
a majority of them,
cited YouTube specifically
as their
means of radicalization.
This is probably
our last opportunity
to actually organize
against the New World Order.
[reporter]
Yesterday, conspiracy theorists,
the alt right, the far right,
QAnon followers,
and others stormed Capitol Hill
and livestreamed it.
[man]
A lot of the blame for inciting
these riots is being placed
on social media sites.
[reporter] YouTube announced
Tuesday it suspended
U.S. President
Donald Trump's channel
as it violated policies
against inciting violence.
[woman]
He was suspended from both
Twitter and banned from Facebook
and Instagram as well.
It's a sad day when
Big Tech has more power
than big government,
that they can censor
the President of
the United States.
[Harris Faulkner]
YouTube extending its suspension
of former President Trump's
account now indefinitely.
[screaming]
[Jillian] They saw that these
democratic institutions
that they were trying to protect
through freedom of expression
were actually being destroyed
right in front of their eyes
on their platforms
for the whole world to see.
Patriots are inside
Nancy Pelosi's office!
Hey, everybody, Stefan Molyneux
from Freedomain radio.
[woman] YouTube is continuing
to ban the accounts
of white supremacists
in an effort to combat
hate speech on its platform.
We are going under
digital martial law.
[Caleb] Oversaturation of social
communication technology
is going
to cause a lot of conflict.
You know, radicalization getting
played out on YouTube
and everybody
throws their hands up and says,
"Oh, my God,
it's the new satanic panic."
It's the same old thing
that's been happening,
except now it's hyperlinked.
If we don't figure out
this problem,
we're going to lose
what it means to be human.
Your platforms have changed
how people across the planet
communicate, connect,
learn and stay informed.
The power of this technology
is awesome and terrifying,
and each of you has failed to
protect your users and the world
from the worst consequences
of your creations.
This is the first
time the three of you
have appeared before Congress
since the deadly attack
on the Capitol on January 6th.
That event was
not just an attack
on our democracy
and our electoral process,
but an attack on every member
of this committee
and in the Congress.
I want to start
by asking all three of you
if your platform bears
some responsibility
for disseminating disinformation
related to the election
and the Stop the Steal movement
that led
to the attack on the Capitol.
Just a yes or no answer.
We always feel a deep
sense of responsibility.
But I think we worked hard.
This election effort was one
of our most substantive efforts.
[Mike Doyle]
Is that a yes or a no?
Congressman,
it's a complex question.
-We...
-Okay, we'll move on.
[Natalie]
It's an interesting situation
where you have
what has basically
become the public forum
run by about three
corporations.
That is a level of power,
a level of political power
and social power
and economic power that is a
little bit terrifying, I think.
[slow music]
[Brianna] Google is one
of the largest companies
in the entire world.
So, you know, one of the reasons
I ran for Congress was I thought
that I could be a force of good.
Hi, I'm Brianna Wu.
And I'm running to be your
congresswoman right here
in Massachusetts District Eight.
I thought that having
someone that had been affected
by these issues,
I hoped that I could make
a difference directly.
Clearly just tweeting
about these issues,
it's not working.
We need legislators that really
care about this issue
and will put
skin on the line for it.
I want to be very clear that the
prosecution's decision
to abandon my client's claims
does not invalidate
the truth of her claims.
Well, because I am a litigator,
my whole like purpose
is this idea that,
you know, one lawyer
and one client can pay $210
for an index number
to start a lawsuit.
And that team of people can
create law that rules the land.
And that process comes about
by starting a case, appealing
and appealing when you lose,
and then petitioning
to the Supreme Court.
And so that's
why I think litigation
to reform the Internet
is really powerful
and is like
where I put my effort.
So, yeah, I do think it needs
to go to the Supreme Court.
Can you bring it here?
I'd like to think I've
been making a dent in it.
The NRA when I started was,
you know, all powerful.
Now they're,
they haven't gone away,
but they're somewhat in retreat.
With Google,
it's the same thing.
I have started
a Change.org petition.
People are calling in to say,
help Andy Parker,
give him the co-copyright.
What good is it for you
when he can at least use this
in his fight against Google?
And so it's a worthwhile fight
and I'm not giving up.
And whether it comes from
the Supreme Court or Congress,
I think it's more likely
to come from legislation,
congressional legislation.
Google profits massively
off of lack of regulation.
If it cannot properly
protect citizens
from online harassment,
hate speech
and moment of death videos,
I call on Congress to step in
and make sure
that proper protections
are in place
for private citizens like me
who are continually
harassed and exploited.
Well, it's a great saying:
with great power
comes great responsibility.
We did tell him that you have,
you have the influence, right,
to influence kids.
So we, hopefully
we teach him right
and use that influence for good.
So what we did,
partnered with the YouTube team,
is we interviewed
health experts.
If somebody sneezes or coughs
and they have the coronavirus,
how far can it spread?
That's such a cool question.
These particles are so tiny
that they even happen
when we talk.
They come out
of our mouth when we talk.
I know that sounds really gross.
[laughter]
[Shion] Ryan asks all those
questions for his fans.
And now, until when do you want
to do the YouTube thing?
Until when you're 20-something?
Yeah, 20, I think 6 or 5.
So he wants to continue this
until at least he's 26.
Just for now. [laughs]
And why? He had a reason why.
I don't remember.
I know you've probably already
read the title by now,
but I feel like we should
just come out and say it.
I'm leaving Smosh.
I know a lot of you guys
are probably going to assume
he's leaving because
we got in some sort of big fight
or because we hate each other.
But I can guarantee you guys
it has nothing to do with that.
-That did not happen.
-No.
[Anthony]
I started to really look at
what I put out there
into the ether as...
really having some kind of
influence on the way that people
go about their days after
watching a piece of content.
My name is Anthony Padilla,
and today I'm going to be
sitting down with survivors
of school shootings
to learn what it's really like
to live through such a traumatic
and Earth-shattering event.
-Thank you so much, Shelby.
-Thank you!
I feel like I fully understand
the wondrous
world of asexuality.
-I'm so glad.
-And congratulations on coming
out to the entire world.
-Oh my God, I forgot already.
-[laughter]
[Anthony]
It made me realize
that the world
would be so much
of a better place if people came
into any interaction first with,
from a place of curiosity
and trying to understand
rather than judgment.
And it kind of made me
feel like there was
a point to what I was doing,
which really helped.
[cork popping]
You know what, America? Things,
things have not gone well.
Well, as long as I can remember
being on YouTube,
there have always been people
who are prognosticating the end.
"Right, oh, this is
the end of the free Internet.
Like the corporations
are going to come in,
they're going to take over."
Like it used to be
the Wild West,
it used to be fun,
it used to be free.
But it's all about to end.
You know, I've been hearing
people say that for 14 years.
[chuckles] As for me,
yeah, I'm not just some person
filming on a webcam
in my bedroom anymore,
there's a budget.
There's, you know,
a million subscribers,
but it's still me.
And because I do commentary
on multiple-year trends,
that's kind of
the time frame I try to work on,
I will respond to those trends
as they happen.
Like, am I going
to be agile enough
that I can stay interesting
and continue
to know what I'm talking about?
I hope I can. I intend to try.
[chuckles]
Ideally, where do you see
YouTube in the future?
And also just personally,
where do you see
the role of big tech
in terms of the fact that
it has had so much power?
What do you think
the ideal scenario is
moving forward as both
someone who is overseeing
YouTube and as a mom,
as someone who's just
a member of this crazy society
we find ourselves in
at this moment?
[chuckles]
So I see, I mean,
I see a lot of the benefits.
I see the benefits
of the educational communities,
of connecting
diverse communities
that otherwise could have never
been connected
and at the same time
be thinking about
what are the risks,
what are the downsides,
how can we manage that?
I know that right now there's
a lot of discussion about it,
but I see that for many reasons
that will get worked out.
I think there's
a demand from society,
from the press,
certainly from governments.
And I do believe that technology
has a tremendous opportunity
for good in the long term.
[upbeat music playing]
[crowd counting down]
[all cheering]
[opening bell ringing]
What I would like to see us do
is to fundamentally
rethink the business model.
I would like to see us return
to something more sane,
which is say, look,
if there is value to Facebook
and to Twitter and to YouTube,
I should just pay
a subscription for it.
And if I'm paying
a subscription,
my business model
looks really different.
My incentives look different.
At Google, the past year
has given renewed purpose
to our mission
to organize
the world's information
and make it universally
accessible and useful.
Building a more helpful
Google for everyone.
[slow music]
At other times in our history,
we have seen dramatic changes,
breaking up oil
and railroad trusts.
Addressing big tobacco,
you know, instituting
automotive safety reforms.
We have stepped up and taken on
corporate behemoths in the past.
But it was
a very different time.
Until there's really
popular momentum
for some of these changes,
they're just
not going to happen.
[dramatic music]
If YouTube,
the company went down tomorrow,
I think there would be
a lot of harmful content
that was no longer
on the Internet.
But there also
would be a ton of creators
just trying to make a living
who'd no longer have
the venue to do that, right?
Days like this is
why I started vlogging,
because otherwise
no one would see this.
It really just kind of depends
on what your outlook in life is.
If you're the kind of person
who's looking for differences,
who's looking
for ways to condemn
other people
and to divide us out,
then you're going to see
YouTube as an enabler
of these
terrible movements, right,
as a way of justifying
the things that you're doing.
Whereas if you're someone
who's seeking those connections,
who's seeking to find
what we have in common
or to expose injustice,
then you're going to use it
for those purposes.
And so really, it's
the great enabler of all
of these different things.
[man] Liftoff of the 25th
space shuttle mission,
and it has cleared the tower.
I really feel like
this is the commons.
YouTube is our public library.
[excited basketball commentary]
[Brianna] It's a critical part
of human civilization.
[Martin Luther King Jr]
We've learned to fly
the air like birds.
We've learned to swim
the seas like fish.
And yet we haven't
learned to walk the Earth
as brothers and sisters.
[Brianna]
But I think that corporations
don't do the right thing
until they're forced
to do the right thing.
My greatest fear is that we all
become citizens of big tech.
And that there comes a point
where they're far more powerful
than our courts or lawmakers
and that they have
more information
than our law enforcers
and that we're all just
kind of at their mercy.
[dramatic rock music]
We have built
a technological grid system
that we essentially
can't destroy
because we're relying on it now.
And they've got
these things buried
at the bottom of the ocean.
Is anybody stopping for a second
and thinking about what it means
to build something
and rely on something like that
and to never question it?
Are we speeding
things up too fast?
And what are we
speeding towards?
[slow music playing]
[slow music playing]
[slow music playing]
[slow music playing]
[dramatic music playing]
[dramatic music playing]
[slow music playing]
[slow music playing]
the two other co-founders
of YouTube, Jawed and
Chad Hurley.
[Steve] Of the search results,
or of the set that the image
came from?
Yeah, where it came
from in the results, so...
On of the guys that we met
early on was the co-founder
of another service.
And it was called hotornot.com.
This service can be explained
pretty easily.
You look at a photo
and there were two options.
You voted "Hot"
or "Not" on the photo.
We just thought,
"You know, what a great
opportunity it would be
if you created a video version
of this instead of just seeing
a photo.
[laughs]
So that's the true story behind
the YouTube service
as we know it today.
[whimsical music]
We started building the service.
And I would say that much
of the work was
on the engineering side.
What became of YouTube
is not something
that's fundamentally
a new and novel idea
in that people have considered
and thought about,
are there ways to be able
to share video online?
Because plenty of people
at the time had videos
that they wanted to share.
It was just sitting
on their hard drives,
but there was no way
to be able to share them.
[keys clicking]
All right, so here we are
in front of the elephants.
Cool thing about these guys
is that they have...
[Steve] A large part of it
early on
was actually looking
through and kind of seeing
what kind of videos would
people want to be able
to share that they didn't know
were shareable before.
Look, here's finally a service
where if you wanna get
a political message out there,
if you want to get some
kind of message out there,
you can use this platform.
You're not going to be able
to turn on the TV and be able
to see this.
I remember it showing up
as like a place
with goofy videos...
[boy laughs] Charlie.
...that didn't strike me as
something that would
revolutionize the world...
[crowd yelling]
[boy laughing]
[Talia] ...be livelihoods
for so many people...
[yells]
100 million subscribers.
Thank you, YouTube.
Ow!
[Talia] ...spawn everything
from social movements...
I can't breathe!
[Talia]
...to really Baroque dramas.
I don't like 'em putting
chemicals in the water
that turned
the friggin' frogs gay.
Charlie, that really hurt.
[Talia]
That all came a bit later.
[baby laughing]
[crowd screaming]
[Anthony Padilla]
I was like, "Video.
Video is the next big thing
that's gonna be on the Internet.
This is so cool.
"Like, you could record
something on
God, you couldn't even record
on a cell phone then.
It was like,
steal someone's webcam.
[Power Rangers theme music]
Well, we actually started
making videos
before YouTube even existed.
[lip syncing to music]
[Anthony] We don't even have
broadband Internet yet.
[lip syncing to music]
Videos are not a thing
that you do on the Internet.
[Mortal Kombat music playing]
[lip syncing to music]
[slapping sound effect]
[Ian] You know, we had just
graduated from high school.
We were 17 years old,
all of our friends were going
off to colleges,
and we decided to stay
in our hometown of Sacramento,
go to community college.
[Mortal Kombat music]
Then somebody took one
of our videos and put it
on YouTube.
I did a Google search,
and I found
that it was uploaded
to this website called YouTube.
Someone just ripped it
and posted it there.
[Steve] What YouTube really
brought to the game was,
I think, the ability to be able
to utilize the latest technology
that was available, and then
building that
into a cohesive system
that made it much easier
for the user experience.
Millions of viewers now
from all over the world are
logging on
to view their short films
and video projects,
[female reporter] Their Pokmon
video, the second most watched
clip on YouTube.com,
more than 15 million views
so far.
[Anthony] I'm like
this introverted kid who
is terrified of speaking
in front of a class
of, like, 30 kids, but yet...
I'm like, I can do
this on my own,
in my bedroom,
upload it, people can enjoy it.
[skates rolling]
When YouTube first arrived,
we didn't really have
high-speed Internet connections
the way we do now.
I didn't have a smartphone.
[documentary narrator]
They're off.
[Jillian] And so back then,
it was really exciting
to be able
to go online and watch
a video that maybe
you never would have
been able to access before.
[documentary narrator]
This competitor is
determined to finish.
[Jillian]
And yet at the same time,
it was slow.
I think the first video I ever
uploaded to YouTube
was of my cat drinking out
of my water glass.
And I'd taken it with maybe
my first digital camera,
and it took hours to get
it online, and I don't even
know why I did it.
But it was just this exciting
moment to be able
to share something
with the world like that.
[cat licking]
[bell ringing]
These are from a gentleman
at another table.
-What?
-Enjoy.
[Steve] " Dear sirs,
we would like to buy
your video sharing site.
Milkshakes on us!"
[bell ringing]
Awesome. Let's do it.
Hi, YouTube.
This is Chad and Steve.
We're the co-founders
of the site.
And we just want
to say thank you.
Today, we have some exciting
news for you.
We've been acquired by Google.
Yeah, thanks. Thanks to
every one of you guys that,
have been contributing
to YouTube, the community.
We wouldn't be anywhere
close to where we are
without the help
of this community.
Thanks a lot.
[soft tense music]
[Brian Williams] There has been
a big merger in
the business world,
One that may say a lot
about our world these days.
[male host] Web giant
Google will pay 1.6 billion
to gobble up YouTube.
[soft tense music]
[soft music]
[Susan Wojcicki] The light bulb
for me when I realized
that YouTube
was going to be really big,
was with this
very specific video,
which was the first hit
that I saw, actually,
on Google Video.
[lip syncing
to Backstreet Boys music]
Of these two young students,
singing the Backstreet Boys
in their dorm room,
and their roommate is doing
homework in the back.
I still laugh
when I see it today.
[lip syncing]
And I realized, "Wow,
this is going to be a big thing.
People wanna see other people
like them on YouTube.
This is going to be important."
[soft tense music]
One of the things that
I did was I did this model
where I forecasted
that it was actually gonna be
a really big business,
and we could justify the price
that we paid for it,
which was 1.65 billion dollars,
which was a huge price.
I had a hundred
percent conviction.
There was no doubt in my mind
that this was gonna be
a huge trend of the future.
But there were many around
me that questioned it, for sure.
I mean, we saw it in the press,
we saw other people who were
supposedly, like,
tech experts say,
"This was, like,
one of the stupidest
acquisitions ever."
Like, someone told me
how that day
they were with a bunch
of different Nobel Prize winners
in economics,
and they were all talking about
what a bad acquisition this was,
and how could have Google
made such a huge mistake?
So I think people across
the board did not, like,
recognize that this was
gonna be a really,
really big opportunity.
[Chad] This is the dinner
at YouTube. This is what it's
like to work here.
We're gonna feature this video
tomorrow so...
[soft tense music]
[indistinct chattering]
-[man] Cheers.
-[All] Cheers. Woo.
I find it fascinating
the way I think
younger Millennials,
and certainly Gen Z use YouTube
almost the way the rest
of us use Google search.
You know, that's where they go
to find information.
For those of you that
are new here, Monday, Tuesday,
Wednesday, Thursday,
I like to talk about
the news, world events,
and then Friday, it is all
about the conversation.
[Nitasha] I think that it can be
really wonderful, but it's just
always
been baffling to me that
here is the most massive
video platform
influencing billions
around the world,
[horns honking]
and, you know, for years,
Google wouldn't even break out,
like, basic statistics about it
in its quarterly earnings calls.
You know I don't have any
specific metrics to give but...
We definitely...
are seeing impact and
we think we are in early days of
the impact
impact we can see.
[Nitasha] And even just from
a financial standpoint,
how did they just tuck this
under the Google umbrella
as though this is just a,
like a side bet,
when it is, like,
kind of the default portal
to information?
[Steve] Yeah, the making
of YouTube...
Here's your new keys
to your new office.
-[Steve] All right, thanks.
-[Chad] Present you your key.
[Steve] When Chad and I were
pointing some things,
they still gave us
the decision-making power
to utilize the resources
of Google after the acquisition,
but we were
allowed to make the decisions
of where we wanted
to take the product.
[soft music]
I think when it got
acquired by Google,
the general narrowing
of the Internet, where like,
there used to be web surfing
where you would just, like,
bounce around
all these different sites
from all these different
creators,
and really be looking
at different things.
And now we're much
more centralized on
our social media feeds,
and there's a sense
of much more controlled chaos.
Like, everyone's talking
and talking over each other
and contributing,
but within these silos, there's
the central platform
of the whole Internet.
[calm music]
[male narrator] The wheels
of the tech industry never
stop turning.
And in 2007, we saw glimpse
after glimpse of new technology.
[soft music]
That was, like, the pinnacle
of the moment when everyone
was so excited about
the potential of technology
and social media to change
the world for the better.
This is one device...
[crowd cheering]
and we are calling it iPhone.
[Becca] There was this idea,
of, like, organizations,
these old bureaucratic
institutions aren't going
to matter anymore
because people can
kind of come together
just using the Internet,
and can change the world
that way.
[crowd chanting]
[police screaming]
[woman] I like to think
of this as the beginning
of the revolution.
So this is a bunch of people
coming together to create
a new cultural impulse.
[engine buzzing]
[Jillian] States would control
what foreign media can do,
and I think we really saw
this at the beginning
of the Egyptian uprising,
when there were only
a few networks that already had
people in the country.
And those networks,
they didn't have the same kind
of access that people
posting to YouTube did.
I think because
of a lack of trust
that the people had
in foreign media
and reasonably so, because
foreign media had,
whether intentionally or not,
had ignored a lot
of these issues for a long time.
[crowd screaming]
[interpreter] President
Mohamed Hosni Mubarak has
decided
to wave the office
of the president
of the Republic.
[crowd cheering]
[male reporter] A voice which
ushered in a social revolution
in Egypt
where the most powerful
weapon was social networking.
[crowd cheering]
[calm music]
We were able to connect
with people all over the world.
We were able to have
these conversations that would
expose things in our country,
that helped us to understand
the connections between
all of our countries
and the things that
we had in common.
[calm music]
This on?
I'm Fred and...
Mom, I'm not using your camera!
[Steve] I think one major change
that happened
after the acquisition,
what eventually led
to this concept of the YouTuber.
-Hey bitches!
-What's happening, forum?
Hi, we're the Fine brothers.
For those of you who don't
know me, I'm Brookers,
and I'm an Internet celebrity.
[woman] He's an international
superstar, the most downloaded
man on the planet.
You pretty much can stop
anyone here and ask them
how they found out about K-pop,
they found out on YouTube.
[K-pop music]
[Steve] Korean K-Pop, that's
all possible because
all of a sudden
you have content that
was being created
in one part of the world
that just happened to have
viewers from the rest
of the world.
[upbeat music]
Pow!
You became able to see
actual people
living in other parts
of the country,
other parts of the world.
Hi, everyone. [speaking Korean]
...who were experiencing
the same things you were,
and maybe, you know,
we're boldly speaking about them
in a different set of terms
than you'd been able to access
through mainstream media.
I'm back, Black, and ready
to lay the facts.
[Jillian] If you're a person
of color who's not living
in a major city,
who doesn't have connections
to other people
who look like you,
then YouTube enables you
to see people who look like you
who might share your perspective
or who might have a totally
different perspective.
And so it allows you to see
a spectrum of people
who maybe had some
of the same features as you,
or the same proclivities as you,
but who were engaging
in different activities
than the people in your own
community were.
Y'all are gonna be
my best friends.
-Oh, my God!
-Oh, my God.
-Oh, my God!
-Oh, my God.
[Anthony] It was a couple
of years into us
doing what we were doing
that we ever met anyone else
that created content on YouTube.
-I'm Toby.
-Hey everybody.
It's Michael Buck
from the What the Buck Show.
What's up? It's Brittani Taylor.
I'm here at VidCon
with Josh Rhymer.
[Anthony] And it was such
a surreal experience, like,
seeing someone face to face
who also had a following,
who understood this world.
'Cause before that,
it was really, it was a lot
of talk about, like,
how can you use this to get
into mainstream media?
Like, New Media wasn't
a term, it was like,
YouTube was just a place
for trash, is how they saw it.
So get yourself
onto Nickelodeon or MTV.
And like, we literally had
a meeting with MTV
and we pitched
all these ideas and they didn't
know what to do with us.
Hey, everyone.
Welcome back to my channel.
It's your girl Jackie Aina.
If you are new here, welcome.
[Susan] We've really had no
gatekeepers that have said,
you know, "Send me your script
and I'll decide whether or not
we fund you."
We just say, "Hey, post it
on YouTube."
Tonight, my guests are
two transgender YouTubers
from opposite ends
of the political spectrum.
But I've brought them here
together to engage in
a rational, free-thinking debate
about a timeless...
[Natalie Wynn] YouTube is so
individual, right?
Generally, video is something
that is produced with a big team
of people.
[piano music]
It's a very solitary project
for me.
Like, I write the script,
I'm doing the wardrobe,
I'm doing the makeup,
I'm setting the camera up,
I'm positioning the lights,
I designed the set.
I turned the camera on,
I'm sitting in a room alone
talking to the camera,
and then I edited it myself.
I mean,
it's a very solitary project.
[piano music]
[piano music continues]
I don't know, maybe this is
a little bit of vanity,
but it's like, it feels like
a very personal like,
more like writing
a book would feel
than directing
a movie, you know?
[piano music]
I think I first uploaded
to YouTube in 2007.
And it was a video
of me playing the piano.
So YouTube as a website
has been in my life
a long time,
since I was in high school.
What do I look like to you,
some kind of philosopher?
[harp notes]
I majored in philosophy
and psychology,
then I started getting
a PhD in philosophy,
got two years into that
before realizing
"I don't want to be an academic.
This does not suit me at all."
Gentlemen of the Academy,
it is my duty to submit to you
the findings of my latest
mimetic research.
What are traps, and be they gay?
I was using a philosophy
background and making arguments,
but I'm also making
entertainment.
[crowd cheering
and cameras clicking]
[male reporter] YouTube
celebrities are laughing along
with their fans
all the way to the bank.
The last one of you take
your hand off this million
dollar stack of cash, keeps it.
[female interviewer] How do you
go from working in a supermarket
five years ago
to earning more than 12 million
pounds this year?
Part of me is not sure.
I just do something that
I absolutely love and put it
out there for anyone to watch.
I don't need money
I don't need cars
Girl you're my heart
[Steve] You know,
some of the most recognized
musician names out there,
they created their content
when they were 13, 14 years old,
uploading it as sort
of an amateurish piece
of content onto YouTube.
What's up guys, my name
is Shawn Mendes.
[Steve] And then you fast
forward a few years,
and they're performing
in front of hundreds
of thousands of people.
[crowd screaming]
[female newscaster] It's amazing
making money off of YouTube,
and two Sacramento
20-somethings are on that list.
[Anthony] Our channel was one
of the first ten channels
to ever be monetized.
They said we will give you
$10,000 dollars each month
if you have
these ads as part
of this partnership program.
Obviously, that's
a shit ton of money for,
God, we were 19 years old.
[crowd screaming]
[Natalie] In the early days,
no one was doing
this professionally,
or very few people were.
And if they were doing it,
it's because they were making
ad money.
-[man] What the heck?
-I've never seen
a dead person.
- You haven't?
-No, bro.
[Natalie] That really affects
the kind of content that
you can make
Because it incentivizes you
to make as much content
that gets as many clicks
as possible by any means
necessary.
[reporter] For more than a
month, Mona Lisa Perez said
her boyfriend
had been begging her to launch
his YouTube channel
with a bang.
I may fail, but if I fail,
I wanna die trying.
[Boy] He has a microwave stuck
to his head.
I know this sounds like
a prank call, but he was trying
to film a YouTube video.
[Natalie] It doesn't have
to be good.
People don't have
to wanna pay for it.
You just have to get eyes
on the video.
Dear fat people. Ah! Some people
are already really mad
at this video.
What are you going
to do, fat people?
What are you gonna do?
What are you, gonna chase me?
All right,
pull up with the gang.
You know, and look at who
they offer partnerships to.
No channel grows to hundreds
of thousands or millions
of followers incidentally.
Be sure to drop a like,
hit that subscribe button.
15,000 likes on this video.
[Talia] So much of it is about
generating clicks.
Like, it's even rougher than
the news clickbait economy
because you sink or swim
based on your impressions.
And for a lot of people,
it's their sole livelihood,
And so everything is
this capital letter,
everything's sort
of in bold, and italics,
and exclamation points.
We've been getting 50,000 likes
on all of my videos,
so go hit that like button.
[soft music]
[music continues]
Everything on YouTube
changed when
the recommender algorithm
was introduced in 2011.
YouTube started rewarding
the content that had
the highest click-through rates
and the highest watch time.
And like, some kind
of combination of that.
[upbeat music]
[Anthony] The algorithm doesn't
differentiate between
a positive piece of content
that makes you feel good,
walk away with a smile
on your face,
versus something
that pisses you off
and makes you super angry.
In fact, it tends to sway
toward the stuff that makes
you angry,
because you're more likely
to click on other videos
to learn more about this subject
that's upsetting you,
you're more likely to respond
to a negative thing
that gets under your skin
if it's in the comments.
Unless you are sort
of seeking out specific content,
what YouTube serves you
is sort of a stew of the stuff
that's meant to excite
and engage,
you know, whether
positively or negatively.
[Anthony] YouTube also didn't
want clickbait to be the number
one thing
that was dominating
the platform.
I think that they cracked
down by saying,
"Well, we'll reward whoever
sticks around the longest."
So you can't just have that
enticing thumbnail and
trick people into getting there.
You have to also captivate them
and keep them around.
Part of me is scared
about what impacts
these algorithms
will have on the future
because the algorithm
is a beast that really can't
be tamed
once it's been unleashed
and it's already been unleashed.
We have some really big
projects, we got Food Battle,
of course, coming up,
we've got some big music
videos we're working on.
-Yeah.
-We're upping our game.
[Anthony] But it's like,
it's an endless loop.
You step away from YouTube
for a little while as
a content creator
and you feel a little...
pain in your heart.
Like, what if
people stop...
What, are people gonna
forget about me
if I take two or three
weeks off?
or even a month off,
are people...?
Is the algorithm not going
to show my content
to anyone again?
So the algorithm is controlling
the creator in that sense.
It makes you feel obligated
to continue your momentum.
You can't break your momentum
is kind of the feeling
that most people have.
[eerie music]
These platforms are in
the engagement business.
At the end of the day, they're
giving away their products
for free.
And the way they make money is
by keeping you on the platform
for as long as possible,
extracting data from you,
and delivering ads.
Now Facebook, it's very easy
to see the recommendation
algorithms.
Your newsfeed is
100% curated for you
to provide you with articles
that will keep you
on the platform for
as long as possible.
YouTube, it's a little bit less
obvious,
But if you look at the viewing
patterns on YouTube,
a full 70%, 7-0,
of all YouTube videos that
are watched today
are recommended by YouTube
through Watch Next,
so the autoplay that is
by default on,
and then the Recommended
For You panel
and the recommendations
on the right hand side.
How's it choosing those?
So of course,
part of that algorithm is,
"This is something
that is relevant to this."
But some of it is that,
"Look, we know people will click
on this, and if you click on it,
I get more data
and I get more ads."
And that has led to lots
of problems.
The rabbit hole effect,
the echo chamber.
So for example, if you go
on to YouTube today
you click on moon landing,
within a couple of clicks,
you will be in La-la Land
talking about conspiracy
theories
around that the moon
landing is faked.
A full 10% of recommendations
were conspiratorial.
That's insane.
Well, I just got kicked out
of Starbucks for asking
NASA employee questions
because he's lying.
[Hany] So it's not just that
these platforms are neutral.
You know,
it'd be one thing to say,
"Hey, look, we are sitting back,
people can upload videos,
you get to look
at what you want."
That's not how
these platforms work.
They are choosing
to amplify things that engage.
And what we know is that
the more conspiratorial,
the more hateful,
the more divisive,
the more it engages.
Dipshit.
And get that camera out of my...
[engine roaring]
[calm music]
[Caleb Cain] I realized that
the Internet was a place
where a lot of people
describe it as escapism,
and I kind of resent that.
Because it's not just that,
I had video games for escapism.
I had, you know, going out
with my friends for escapism.
But what the Internet had was,
especially in the early days
of YouTube,
you could find
dissident information,
You could find information
on there that was outside
the culture.
And the reason that's
so important is because
you can't see
contradictions in your society
unless you get
outside the culture.
[calm music]
Suddenly, I could start to see
how things in my life
weren't working
because I could see what
was going on in other places.
It didn't always lead you
to the right place.
And sometimes the truth comes
with a lie and a price tag.
But you had freedom
that you never had before,
and that I never experienced
in my day-to-day life.
Hi, everybody. My name is Caleb.
You know, welcome to my channel.
So you can probably tell
by the description
on the video, in the title,
what this is gonna be about.
So I'm just gonna get
right into it.
I fell down
the alt-right rabbit hole.
So it begins.
You're in a death battle,
New World Order. We know.
If I lived in Saudi Arabia,
they'd kill me.
If I lived in China,
they'd kill me.
You still gotta sell your soul.
You gotta write
your name in blood.
[man] Whites are getting fed up,
Whites are getting tired
of being disenfranchised
and neglected.
Hail our people. Hail victory.
[laughs]
[screams]
[Caleb]
Well, at first I was kinda just
depressed skimming the web.
But I think at some point,
I had found this video called
"God is in the Neurons,"
and it was from this YouTuber
that I used to watch in
high school,
And it gave me this concept that
I could rewire my brain
and really self-improve.
[man] Our beliefs have
a profound impact on
our body chemistry.
[Caleb] And that took me
to self-help content.
And then the algorithm just
started feeding up
more self-help content.
There is no such thing
as mental illness.
[Caleb] You start with someone
like Stefan
because you're looking
for self-help.
And so we call that
an on-ramp, right?
In the study of extremism,
we would call that an on-ramp.
And you go from that self-help,
which has now validated
your identity,
it's given you
a direction to go,
it's given you an algorithmic
list, to use the word
a bit ironically,
of things that you have to do.
And I'll tell you what,
people are looking for
that in a complicated world
and especially young
men are looking for that.
It's now time
for the return of men.
[Caleb] It seems that
the algorithm would always
pull you towards
the more hyperbolic content,
towards the content that
was a bit more extreme.
Muslims by the million
have been pouring into Europe.
And you can see
the terrible problems
that have arisen
as a consequence.
[Caleb] And there was
an uncomfortability when
you went a bit deeper
into this specific ideology,
because not only is
it depressing, but it's hateful,
That specific rabbit hole,
I feel like what it was truly
doing was, it wasn't
radicalizing me,
I was already radicalizing
because of how
my life was going.
But it was killing my empathy.
It was turning me
into a sociopath.
[insects buzzing]
YouTube is now like a Walmart in
a town where there's a Walmart
and a Dollar General and a CVS.
Like it's one of the sole
organizing destinations.
And in the case
of the far right,
and like, right-wing
political content,
radicalizing political content,
it's like Walmart having
a big old gun aisle.
[Natalie]
Well, I think a lot of people
come to YouTube to do
commentary, to do politics
because they feel like
their perspective is not
represented elsewhere.
Words like racist, misogynistic,
and transphobic are not insults,
nor are they stereotypes
or generalizations.
Rather, they are facts about
the way we are socialized
in a Western society.
I'll be honest with you,
if I hate anything about
Black culture,
it's that it's such
a victim culture,
almost a victim cult.
[Natalie] I do think in general
that people who feel
marginalized
or who feel like they're
on the fringe, tend to gather
on YouTube.
You don't deserve to get laid.
[Caleb]
Then come along the alt right,
and they really took
that whole dynamic
and just revolutionized it
and innovated it.
And then they took many
of those Internet nihilists
and they took
that radical energy,
and they directed it
well,
towards dominance, and fear,
and hate, to be honest with you.
[sigh]
Well, this is my last video.
It all has to come to this.
You forced me
to suffer all my life,
and now
I'll make you all suffer.
[female TV host] An angry and
psychologically twisted young
man whose public rantings
exploded into sheer terror.
[gunshots]
[policeman]
Shots fired. Shots fired.
[female TV host]
Taking six young lives
and injuring 13.
A virgin vowing revenge
in a twisted video he posted
Friday, addressed at his
perceived enemies,
young women he says
rejected him.
You will finally see that I am,
in truth, the superior one,
the true alpha male.
[woman] Aggression, violence
and unbridled ambition can't be
eliminated
from the male psyche.
They can only be harnessed.
Rape, murder, war.
They all have two things
in common.
Bad men who do the raping,
murdering and warring,
and weak men
who won't stop them.
We need good men who will.
PragerU is an interesting
example because
they pitch themselves
as an educational channel.
They very much mimic
online course stuff,
but what they're feeding you
is very, very, quite radical
right wing ideology.
My message has been simple.
Islam is not
a religion of peace.
[Talia] I think one
of the things to debunk
is this idea that
it's solely an algorithm issue.
One big mechanism
of radicalization,
it's not just algorithmic,
it's very deliberate.
These channels, they build
parasocial relationships
with their viewers
so they feel a deep
connection with you.
I think it's more visceral
with video content.
I think it's more intense
with video content.
My name is Candace Owens,
and you are watching
my vlog series.
[Talia] You know, the sort
of confessional vlog-y, casual,
"This is my living room,
you're looking at me,"
kind of style, really, really
facilitates that kind
of parasocial relationship.
I just wanted to say thank you.
I've seen some of the comments.
Some of you are a marvel.
You send actual letters.
[Talia] And it's just, like,
a Suggested Follows.
And that's its own issue.
Because these are very
well-funded channels,
these are big moneymakers.
[Steven Crowder]
There are two videos completed
here, Louder with Crowder.
This team never would've
been able to do,
if not for this man, Kevin,
lending us his personal plane.
[woman]
My skin's been pretty good,
to be honest.
I think YouTube is
a very intimate format.
If you're watching
a YouTube video,
people tend to watch
it by themselves,
you watch it while you're
eating, you watch it while
you're going to bed.
You're watching
one person talk to you.
There's something very
intimate about that.
You feel like you know
this person.
It's a little bit uncanny,
you know, because they feel
that they know me,
but I don't know them.
We've talked about some heavy
stuff on this channel.
But I was gonna say,
"Oh, it's one of th..."
You know, if someone's
broadcasting from their bedroom,
you feel this real sense
of intimacy with them,
you feel like you know
the intimate details
of their lives.
I remember once I had
a girlfriend who was not
jealous to begin with,
but became kind of jealous,
particularly as I became
more successful
as an entrepreneur.
She became more jealous
and more insecure.
[Becca] And then when they start
kind of telling you
about their beliefs
and views,
that packs a real punch
that other delivery
mechanisms wouldn't.
The right are the producers,
the makers,
and the left are the takers
or those who manage that
taking and thereby gain
political power
by being able to hand out gifts
that they have not earned
but are taking from the makers.
[Becca] And so there's become
this kind of cottage industry of
far-right creators on YouTube
who take advantage of that
and broadcast to their audiences
and are quite effective
at radicalizing them.
I'm a weapon.
I'm made to be thrown at you.
[laughs]
[uplifting music]
Just between you and I,
I know the cameras aren't
on right now, right?
-Say, "What?"
-What?
I know the cameras are not
on right now.
-Keep saying, "What"?
-What?
[laughs]
[Anthony]
Yeah, I mean, it was
really hard for me to,
like, even think about where
YouTube was going because
we're so caught up in it.
But I definitely thought that
it was going to replace
mainstream media.
All right. Love you guys.
Thanks again for supporting.
-Thank you.
-Love you. Bye.
I knew that this was going
to be the way
that the Internet
was going to go.
I've had offers to do a show
for Netflix, or for Amazon
or cable TV, or whatever.
I always declined them.
Because I don't really see why
I would give up
my own creative control,
and this, like, child that
I've raised, basically,
you know, I want to keep it.
[saw buzzing]
[Susan] We're a platform
that enables creators
who are really next
generation media companies.
Welcome back to my channel,
everyone.
Today I wanna show you guys
some life hacks. Now, before...
[Susan] We've seen so many
people take their passions,
whether it's about cooking,
woodworking, music,
and turn that into a business
and become a creator
and become a global media
company on YouTube.
Hello. Hi. How's it going?
I finally got it.
I got the 10 million
subscriber plaque.
But this is special because
it represents, like,
how far we have come
as a community,
and it represents, you know,
how much we have accomplished.
[birds chirping]
[Brianna Wu] What I thought was
so amazing about YouTube
when it first came out
is it really allowed
this hyper-segmentation.
So if you were a gamer
that wanted to understand tricks
from an obscure Japanese game
from 20 years ago,
[upbeat music]
boom, that exists out there.
[upbeat music continues]
And it was just
this massive explosion.
But I think for me,
obviously Gamergate was
when I really realized
something was going off course.
[dramatic music]
I had just finished shipping
my very first game.
And, you know, getting into
a political fight with
the Internet
was the last thing
I wanted to do.
But people on YouTube had
started to come together
in a way
to silence women
in the game industry
that were starting to ask
for more representation,
asking to be hired,
asking to be given
more opportunity.
[gentle music]
It started out with
this kind of Tumblr-style
call-out post about Zoe Quinn.
[Natalie] I guess
her ex-boyfriend accused her
of sleeping with journalists
and cheating on him.
And I started to speak
out against it.
[calm music]
I was sitting
at my home one day,
and I get this really credible
list of death threats that
went mega viral on the Internet,
and they are burned
into my brain.
[gasps]
"Guess what, bitch,
I know where you and
your husband live."
They give my address.
"You're going to die tonight."
"I'm going to rape you
with your husband's
tiny Asian penis
until you bleed."
"If you have any children,
they're going to die, too."
That was the moment that
I decided I wasn't safe
at my home.
And I left, and I got the police
and the FBI involved.
Brianna Wu is a games developer
who says she was forced
to leave her home over
the weekend after receiving
targeted threats.
She joins me now from Boston.
[Zoe] Instantly,
they dived in to,
"Find where she lives.
Find where all these people
live.
What are we going
to do about her?
Can we hack her e-mail?"
Like, instantly?
[mouse clicks]
And then all of these accounts
started being made
to talk about really
disgusting personal details
and, like, make really
disgusting sexual comments.
And the first thing
you lose is all perspective.
[soft tense music]
[Brianna]
There was an organized effort
to basically do SEO warfare,
search engine warfare.
You know this as well as I do.
If I type your name
into the Internet,
a bunch of YouTube videos are
going to be the first thing
listed.
So what Gamergate figured out
was that they could game the SEO
and start putting out
these videos, you know,
Brianna Wu is a terrible person
for A, B, C, D, and E,
and basically malign my name.
YouTube was a major part
of the harassment
that I received.
[seagulls squawking]
[horns honking]
[engine revving]
[horns honking]
[Carrie Goldberg]
Yeah, I hear that a lot
where people are like,
"Listen, the Internet,
it's just a mirror on society.
And so why should
we be focusing on fixing
the Internet instead
of on society?"
And the thing is,
that, in my opinion,
and my experience
as a lawyer for people who've
been fucked over by tech,
is that it is itself the weapon.
The Internet has
created this, like,
really convenient
mechanism to harm,
and the platforms are
just minting money
off of it.
And so I don't agree that
it is just a reflection
of society.
It's changed society.
[dramatic music]
The thing about the algorithm is
what they're recommending
is a harm.
Their lust for hijacking
people's attention
is an additional harm.
The other issue is them
publishing content
that is itself harmful
and not removing it.
[Norah O'Donnell] We begin with
breaking news in Virginia.
A gunman opened fire during
a live TV news interview.
A reporter and photographer
with our CBS affiliate WDBJ
in Roanoke were killed.
[male reporter] Andy Parker
couldn't get video
of his daughter's murder
off of Google's YouTube.
[soft tense music]
[birds chirping]
[Andy] I went to YouTube,
typed in Alison Parker,
and literally there were pages
and pages and pages.
There must have been
25 to 50 pages of, you know,
thankfully it didn't autoplay,
but you knew what it was.
And in the titles of it,
a lot of it was,
"See, the whole thing was fake."
You know, it was, like,
one thing after another.
It's not like you can
just call up Google customer
service and go,
"Hey, you know,
I got a problem here."
[soft tense music]
[Susan] Well, we've invested
a huge amount in the area
of responsibility.
We also have thousands
of people who help us
with the enforcement.
We have machines
that remove, you know,
almost 90% of the videos
before anyone has to see them.
So we have a huge initiative
across the board
to make sure that we're
staying current in terms
of the creation of the policies
as well as the enforcement
of those policies
to find those videos
as quickly as possible,
and make sure they're removed.
If you need me to work
on the op papers,
I'm free and available.
It just makes me feel
like individuals don't stand
a chance.
You've got this, like,
multi-billion dollar company
that's just, like, lording
ownership of your dead
daughter's murder video
over you,
and saying, "We're not
going to take it down because,
you know, like, you aren't
the copyright holder of it.
And even though our terms
of service says that we ban
this kind of violent imagery,
we don't have
the legal requirement
to even enforce
our terms of service.
Like, you can't make us
do anything."
We have kind of a mantra
of don't be evil,
which is to do the best things
that we know how for our users,
for our customers
and for everyone.
And so, I think if
we were known for that,
it would be a wonderful thing.
[Larry] When Google,
the two founders said,
"Our motto is don't be evil."
And for a while they weren't.
But now they are
the personification of evil,
and they don't care.
You can flag this stuff,
and then
you can make that go away,
but then something
else will pop up.
That's why I said we have
to go after the great
white whale, Google.
[soft piano music]
So YouTube likes to say
it's a completely different
company from Google.
What's your opinion about that?
My experience is
the two have always been
treated as synonymous
in every interaction
I've ever had at Google.
I can tell you the impression
I've gotten is that
these products
are kind of all under
one umbrella.
I realize it's more complicated
than that when you get to,
you know, product managers
and who owns what,
but, I personally don't find
that excuse credible.
[tense music]
I think that Google has really
been far ahead of the game
from its competitors
in terms of anticipating
these legislative fights.
[Alan Davidson] We've realized
that we need to have
a larger presence here,
that the Internet is affecting
a lot of how people live today.
And we need, our industry
needs to be here
to help explain that
to members of Congress.
[Nitasha] You know, the company
has spent more than a decade
kind of cultivating politicians
on the right, on the center.
If you look at pretty much
any policy decision where
Google could potentially
be impacted, you know,
sometimes I think about
it as like a big circle,
and you know,
you have the left and the right,
and Google has given money
to just everyone
involved in the debates.
[laughs]
[Eric Schmidt] Google's mission
is to connect the world,
right, to get all that
information out.
We want a free open Internet
for every citizen of the world.
[somber music]
[male]
Leaders in government
and tech want to rewrite
a law that shapes the Internet.
Why we're sitting here today,
it's Section 230.
Because there's only
so much you can do.
We filed a complaint
with the FTC
claiming that Google
violates their own terms
of service,
which they do,
and they don't care.
[somber music]
There's the Communications
Decency Act,
Section 230,
which completely protects
the websites, the search engines
from any responsibility
to users.
When they talk about kind
of maximizing speech
on their platform,
that also means maximizing
monetizable content for them.
With Section 230,
what it means is that
they can do that without
any real legal responsibility
for what happens on
the platform.
Section 230 has
two key components.
So, one of them
is that it gives us
protection from liability
for content that's posted
on our platform.
But the second is
that it enables us also
to remove content
that could be harmful
to our community.
But I think now,
20 plus years on,
we have to look at 230
and say, "Look, guys,
230 made sense when
you thought about
platforms as neutral platforms."
[man] Listen, Operator,
it's a very private call.
Now, you're not going
to be listening, are you?
[operator] Well, I won't even be
on the line. Just...
[man] You're going to get off
the line as soon as I get
the party, right?
- Right. There will be
no one on the line.
- Okay.
[Hany] It's like the wire on
the telephone, right?
I'm just creating a mechanism
for people to communicate.
If they're planning
to commit a crime,
you can't hold AT&T responsible.
Well, sure, that makes sense.
But that's not what
social media is anymore,
and it hasn't been
for a really long time.
They are not a neutral arbiter
of the material that is
uploaded to their sites.
They pick and choose
the winners and the losers
through the recommendation
algorithms.
Yeah, it raises a lot
of questions like
is kicking someone
off Twitter censorship?
Are we okay with that?
Like, I think that, you know,
a kind of obvious
response is to say that,
"No, the First Amendment
protects speech, protects
from the government,
doesn't give corporations
an obligation to host
your content."
If there is no Section 230,
there is no free speech, period.
I think we need to be
tremendously skeptical
and careful in tearing this up,
because it is the foundation
of the Internet.
That said, I do think that
we can look at amending it
and changing it and
updating it in different ways.
[Carrie] They don't see us
as important, we're not
their actual customers.
We're just their moneymakers.
They advertise at us.
They collect our data.
If there's content that
we want down
because we own
the copyright as an individual,
that is not meaningful to them.
This week, I got myself
a death threat.
This is messed up stuff
that people are saying,
[sobs]
Like, people telling
me to hang myself,
people just, like, blatantly
disrespecting the fact that
I'm still a human being
is not okay at all. [sobs]
I'm kind of simplistic in this,
it all goes back to us
having the right
to sue if we're injured.
Because most people,
like, most of the time
on these websites,
are not injured.
When I told him to leave me
alone and asked if he had
a knife,
he didn't respond. And when I...
[Carrie] But the rare occasion
when somebody is,
which on scale is
a lot of people...
You are an ugly, untalented,
fat [bleep] [laughs]
[Carrie] ...we need to be able
to hold them responsible.
The algorithm, it really
fostered this community
of videos that
people would kind of love
to watch because they loved
to hate the thing
that was in there.
They'd be like, "Yeah,
that person fucking sucks."
And I think it's time we take
a look at what canceling
really is.
I am so used to reading mean
things about myself online.
Most of you probably can't even
begin to imagine how used
to it I am.
[gong clangs]
It was just kind of driving
this community of, like,
pent-up rage.
[music building]
[somber music]
[music continues]
[man] So YouTube has
set up spaces
around the world where
YouTube creators can go
and create their own videos.
[Susan]
We're a video-first platform
and everything we do
is about video,
enabling new content
creators to come onto
the platform
and to be successful.
And so I expect us to continue
to really focus on the video.
I think technology will,
of course,
will continue to change
how we communicate
and the content that we see.
But YouTube will stay
focused on video.
[mouse clicks]
[Anthony] They want growth.
Growth. Growth.
Everything's about growth.
There's so much content
on YouTube right now
that it's completely
overwhelming to ever seek
stuff out.
You're not just going into
a search bar looking for things,
you're clicking on what's
being presented to you.
So I think that YouTube is going
to have to get to a point
where they make people feel
safe to be on the platform.
But then also,
they're having to tread
really murky territory where
they can't be kind of, like,
the arbiters of truth in
the sense that they choose
what is to be believed.
[crowd cheering]
I'm going to fight to bring us
all together as Americans.
We're living in
a divided country,
it's not going to be divided.
We're going to love everybody
like we love the people
in this room.
Specifically on YouTube, the
algorithm was really rewarding
the stuff that would get
people up in arms and feeling
like they had to fight
against something for a cause.
America was built by and for
the white Christian people
of this nation.
[crowd screaming]
[Anthony] And politically,
that was a huge drive.
I know that having what felt
like good versus evil,
really gave people a reason
to keep clicking on more
and more of those videos.
So leading up
to the 2016 election.
[male reporter]
Donald Trump will be the 45th
President of the United States.
[man] We have to get better
at listening to each other,
and challenging each other
constructively and generously.
But I worry that the very
architecture
of the social Internet might
make that impossible.
[male reporter] In the week or
so since the election,
there has been mounting
criticism of whether web giants
like Facebook and Google
used enough discretion
and editorial responsibility in
screening out fake news sites.
Comet pizza. Here we go.
[male reporter]
According to police,
Welch said that he had read
online that the Comet Ping
Pong restaurant
was harboring child sex slaves.
Tell me why this pizza place
isn't even open at noon.
[gentle music]
[male reporter] DC police say
Welch fired at least one round
into the restaurant floor
with an AR-15 rifle
like this one.
[emotional music]
[music continues]
[Dave Lauer] There must not be
a tipping point
because there is no doubt
that YouTube bore
a huge responsibility
for the spread of misinformation
in 2016.
[Paul Joseph Watson] One expert
told me that Hillary has
high functioning autism
with attendant sociopathy.
[crowd screaming]
[Dave] Now we're in the sort
of misinformation apocalypse.
[chanting angrily]
I think there's blame
on both sides.
[crowd yelling]
What started off
as this niche field
in the burgeoning digital
revolution, turned into the,
"Crap, everything I read, see,
hear online is now suspect."
Everything we're seeing
in the news right now
is just insane,
fake news garbage.
If the Proud Boys are left alone
to do their thing,
they wave little American flags,
and then go to the bar to have
a drink, and nothing happens.
-You are fake news. Go ahead.
-[Reporter] Sir, can you...
[Brianna]
There were people at YouTube
that were aware that
the tools that they built
were being misused.
So they started to put together
basically task forces,
looking at the problem,
talking to women like me,
getting our experiences
and doing what Google does,
which is talking to experts
and trying to come up
with concrete,
realistic things they can
implement to solve the problem.
Unfortunately,
despite participating
in three of these task forces,
none of them were
successful in getting
Google to change their policy.
My name's Anthony Padilla,
and today I'll be spending
a day with Susan Wojcicki,
the CEO of YouTube,
who began as one of
Google's first employees.
If there's anyone in
the comments right now angrily
claiming that you, Susan,
are the key reason
that YouTube isn't the way
they wish it were,
what would you say to them?
It's much more complicated.
[laughs]
It's not just me.
You're not going
to blame someone else?
[both laugh]
I'll take ultimately
responsibility for everything.
But it has to do with the fact
that these issues are
much more complex
than people understand,
that maybe the changes
that we implemented,
for example, are due
to regulation, or because
these changes we've made
will enable more advertisers
to come and spend more revenue
with YouTube creators.
I haven't been there for
close to a decade now.
But I still think that
the decisions that
they make are often prioritized
by what the end user wants,
not necessarily by,
you know, quarterly earnings
or the quarterly
financial side of things.
If they wanted to,
there's a lot of opportunity
to be able to monetize,
but it would be at the cost
of the end user experience.
They are obligated to maximize
shareholder value.
If they don't do that,
they are exposed
to liability and lawsuits.
When that's your incentive
and you don't have liability
for the content on your system,
it leads to some very
perverse incentives,
such as building machine
learning systems
and algorithms that attempt
to capture people's attention,
addict them to content
and keep them on your site,
so that you can show them
more advertising,
because that's how
you make your money.
Well, I would just disagree
with that point of view.
And the reason
I would is because
when there is any kind
of harmful misinformation,
that is bad for us
and bad for our business
and bad for us financially
and doesn't work
with our business model.
We are an advertising
supported platform,
so no advertiser
is going to want
to be on that type of content.
And we have seen advertisers
pull back their spend
when they see that we're not
managing our platform
to keep our users
and community safe.
That reason, along with
wanting to be on the right
side of history,
wanting to do the right
thing from a brand,
from a PR,
from our employee standpoint,
from just thinking about
what's the right thing to do
for our users
in our communities.
[upbeat music]
[applause]
[Robert Kyncl]
Digital video is exploding.
Already the youngest
millennials are watching
more digital video than TV.
And in fact,
it has overtaken social media
as the top online activity.
If you were to ask
college students,
"What do you wanna bring
when you have a small dormitory?
Do you want to bring
a sort of a laptop
that you can watch YouTube,
or do you want
to bring a plasma TV?"
It's always going
to be the laptop
with how many things
that you can do with it.
[calm music]
[waves crashing]
[Ryan Kaji] It started
when I was three.
I was seeing a lot
of other people on YouTube
and I wanted to be
on YouTube too.
[kids yelling]
So I asked my mom
and she said yes.
[Loan Kaji] Okay, Ryan,
are you ready to find
the egg surprise?
Ready? Go, go, go.
[Jared Reed]
This kid, Ryan, who's on the top
of the Forbes list, he earned
$22 million this year.
-[Loan] Whoa!
-Whoa.
[reporter] He has
more than 12 billion views
on his YouTube page.
If you have children,
you know him.
-[Loan] Ryan?
-What?
-I have a surprise for you.
-What?
-Look, look over here.
-Whoa!
[Loan] And you don't need much
to get started on YouTube.
I started filming
with just my phone.
-Hi Ryan!
-Hi Mommy.
And I still use it to this day.
And, you know, when I started
to do more research,
I didn't even know, like,
what a green screen was.
We couldn't afford
a green screen.
We just happened to have
a tablecloth that was green.
And we're like, "Hey,
we can make this work."
And that's what we use.
[Shion Kaji]
Definitely, four months after
we started the channel,
we started seeing the tipping
point in the viewership.
We started seeing views from
India, East Asia,
all around the world.
Okay, guys, let's open up
the Ryan's World
World Tour Globe.
[Shion] Around the same time,
we started our own channel
and I think that timing
really matched so well.
Which one? Which one?
The purple one?
Are you sure you can do this?
Oh, good job!
[Loan] The most important
thing we tried
is just to make sure that he
still has time being a kid.
And make sure
that even though yes,
he's well known
around the world,
to me personally, I don't think
he notices it that much
because, again, we really try
to keep him grounded
and really try
to make sure that YouTube
is not the essential part
of who he is.
[all yelling]
[child giggling]
[whimsical music]
There was this offering
that YouTube
had called YouTube Kids,
and in 2017 it had 11
million weekly viewers,
which is just a huge number.
Wow!
[Dave] Now, it was really maybe
a couple years later
that it came out
that there was this effort
by content producers to use
YouTube and YouTube Kids
to basically
generate content that would
leverage what they knew
about the algorithm
so that it would be
recommended to children.
There were videos
that were frightening
and traumatizing children,
that depicted abuse.
They were hijacking
Disney characters
or Paw Patrol and showing
them being stabbed
or hit by cars or killed
or committing suicide.
And all of these videos
were being promoted to children.
So it was really a disturbing
phenomenon and really just goes
to show how dangerous this
kind of technology can be.
It really gets into this idea
that when you codify
something like that
into a recommendation algorithm,
results end up being dominated
by particular behaviors and
especially those behaviors
that make YouTube
the most money.
[Shion]
I guess the biggest challenge
for the parents
and for the producers too,
is that the kids influencer business
is still untouched territory.
It's still very new.
You know, the regulation
is still not 100% perfect.
There are many things
that we have to figure out
as we go and then make sure
it's a safe place for kids.
[Dave] There was this
weird thing where, like,
this scary character was just
popping up in kids' videos.
A terrifying video that
targets your kids.
The online challenge encouraging
children to kill themselves.
[Dave] You know, that was
something that happened
after YouTube
had supposedly gone in
and fixed the problem.
When we talk about the YouTube
recommendation algorithm,
that's just a great example
of this idea that if we can
just fix the system
for recommending content,
if we can just figure out
the right piece of technology,
everything will get fixed.
And that's a very Silicon Valley
way of approaching things.
And I don't think
it's the right one
because these
are not technology problems.
These are far broader problems.
Even putting up a stoplight
at an intersection,
it only happens
when there's, like,
a certain number
of deaths or something
or enough people complain
and people are only complaining
when there's
a reason to complain.
And it's kind of like that here.
It's kind of dangerous
how we are just waiting
for there to be enough
of these digital car crashes
in this digital intersection
with no stoplight.
[slow music playing]
[sirens blaring]
[reporter] Injured people
are rushed to hospital.
They were gunned down during
Friday prayers at Al Noor Mosque
in the center of Christchurch.
[reporter 2]
Police confirm the gunman
was livestreaming the killings
at one mosque on social media.
[woman]
It was the most devastating...
[singing "Imagine"]
In December of 2020,
the New Zealand government
released the results
of this research
that they had done
into the shooter
and his radicalization process,
and they found that he
had donated money
to a series of YouTubers.
He basically spoke about
being radicalized on YouTube.
It wasn't in these...
We like to talk about
the deep,
dark corners of the Internet.
You know, it conjures
up these scary images
of the dark web
and all of these things.
But this was actually
right out in the open,
you know, right on YouTube
is where he got radicalized.
We bring our ideas to you.
Our ideas have power.
Our time has come.
There is nothing that
can stop an idea
whose time has come.
And that time is now.
[applause]
[slow music]
[Caleb] That shooting,
it really disturbed me.
And I, you know,
I realized in that moment that
this ideology would literally
kill my friends, you know?
And so, yeah,
I had to make a change.
I had two individuals message me
and they said,
"Hi, I'm not going
to tell you who I am,
but I'm inside
all the far right chats
and they're pissed off at you
and they've got your address."
And then he posted my
address and I was like,
"Why are you
telling me all this?"
And he said, "Because, man,
you're a fucking cuck.
But what they're saying,
what they're talking about
doing to you is fucked up."
I think what YouTube
needs to do is they need
to have very clear terms
of service on their websites
of what's acceptable
and what isn't.
You know,
I am a free speech advocate,
but what I saw on the
platform was people
taking advantage
of this algorithm.
And the algorithm does not care
about what your politics are.
It cares about watch time
and keeping you on platform.
And I think I left it
the same way I went into it,
which is that I'm curious and I was
always thinking,
the world's
really, really screwed up
and how can I help
make it a little bit better?
And so that was always
the motivation going in
and that
was the motivation coming out.
[Destiny]
Basically what you want to do
is you want to find
somebody that feels like
they're fucked in some way.
"Make America Great Again,"
right?
You want to find somebody
that feels like they're fucked
in some way
and you want to speak to that.
And when you home in on that,
you can bring a person onto
your side to believe anything
that you want them to believe.
[Caleb] And once I found,
you know, content debating
with these alt-right figures,
found Destiny,
later on I would
find ContraPoints.
Hi. In this video
I'm going to talk about
how to recognize
a fascist.
[Caleb] Yeah, I got obsessed
with learning about that,
because how are these people
disproving all my gurus,
you know, disproving all these
people who are supposed to be
my warriors
that are out there fighting
to save my civilization?
How are they getting slayed
on the rhetorical battlefield
by a transgender woman?
The strategic fascist knows it's
better to start with realistic,
achievable goals,
and that means focusing first
on stopping non-white
immigration,
something they'll try
to get you, the centrist,
conservative or liberal,
on board with
by emphasizing the danger
and criminality of non-white
immigrants and refugees.
I always viewed
the channel as a kind of
attempted intervention.
I never assumed that YouTube
was going to step in
and save us.
I never assumed
that these people
were going to be banned.
The question was,
is there some part of this
audience that can be kind of
pulled away from the edge
of this extremism, basically?
I started to meet people,
trans people, Muslim people,
gay people, all these people.
And it's not just I met them
and now I'm like,
"Oh, I feel comfortable
around you,"
because I was never...
I never had a
strict phobia of any individual.
But what I, when I
would speak to them,
I would get to hear their
experience and I would actually
listen to their perspective
and I would ask them
challenging questions.
If there's anything
you guys can teach me,
if there's anything
I can teach you,
let's correspond,
let's get together,
and let's figure this thing out,
because it's a problem I believe
that we can figure out.
They do it like Tyler Durden
does in Fight Club.
They take you through your
existential moment
and destroy your old identity,
which is your false identity
that your masters gave to you.
Just let go!
[Caleb] And then they give you
a new false identity,
and now you are
beholden to them.
Without pain, without sacrifice,
you would have nothing.
[Caleb]
And so now you just
run down the pike.
That's what's kind
of going on here.
And of course,
the trick is they lie to you
about a bunch of stuff, too.
They reframe the truth.
[man] There's been an awakening.
Have you felt it?
[male 2] Have you ever wondered
why we go to war?
Or why you never seem to be able
to get out of debt?
Why there is poverty,
division and crime?
What if I told you there
was a reason for it all?
What if I told you it
was done on purpose?
What if I told you that those
who were corrupting the world,
poisoning our food and igniting
conflict were themselves
about to be permanently
eradicated from the Earth?
[birds chirping]
[Tyler] My mom started prepping
for World War 3, apocalypse,
you name it.
If you can imagine
like a person just prepping
for a natural disaster as such,
she would go
ten times beyond that.
She started arming herself
with weapons, guns.
But she just started wearing
the weapon around the house.
[door thuds]
-[Alex Winter] Hey, Tyler.
-Hello.
You can just tell us a bit about
when you first started
experiencing the changes
that were happening at home.
Conspiracy theories,
to me, have always been
just part of the social
media platforms,
since, you could say,
the dawn
of the digital age and such.
Once it started approaching
closer and closer
to the elections itself,
the conspiracy theories
started becoming more
and more wild from things like
microchips and the vaccination.
China having
troops at the borders.
Trump is basically fighting
against the cabal or such.
[ominous music]
She said that QAnon were two
separate entities
named Q and Anon,
run by two quantum computers.
That they weren't even people,
they were AIs that
helped recruit Trump
to fight the good fight.
From there, it just started
getting even worse because
now she started preaching
these conspiracy theories
like they're the Bible
and started getting in my face
about it all the time.
I couldn't say no.
I couldn't say yes.
Either one would
trigger a confrontation
between herself and me.
And especially
since I didn't hold
the same beliefs as she did.
She said things like, "Oh,
I'm just trying to protect you."
And I'm like, "Yes, from what?
The neighbors are not
going to harm us.
I'm afraid sometimes
that you're going to go
out there and start killing
people or something."
She took offense to that
and eventually
the argument
devolved into me saying,
"I'm done.
I don't need you anymore."
And from there I just
called up some friends
and left on
just whatever I can grab.
[reporter]
The World Health Organization
has now confirmed
the coronavirus is a pandemic.
[reporter 2] A quarter
of the world's population
is now living under
some form of lockdown
due to coronavirus.
[Donald Trump]
Now the Democrats are
politicizing the coronavirus.
[doctors]
Let's raise the, raise the bed.
[Donald Trump]
And this is their new hoax.
The coronavirus pandemic
was the perfect storm
for radicalizing people.
There is no evidence that I can
see that a pandemic exists.
[Robert Lewis]
I don't personally know anybody
that's had it.
I don't know anybody that knows
anybody that's had it.
So you either wear the mask...
And I'm not doing it because
I woke up in a free country.
[Talia]
It created a lot of isolation.
It gave people a lot more time
to be on their computers.
You cannot
make people wear a mask.
It's not our laws at all.
This is just made up
by Bill Gates and them.
Go online and look it up.
[Talia] You know,
there were these
massive social disruptions,
like over the nature
of the fabric of reality.
-[woman] Are you laying 5G?
-Yeah.
[woman] You know,
when they turn this on,
it's going to kill everyone.
And that's why they're building
the hospitals.
[crowd chanting]
[Talia] In person, the reopen
rallies, like, were attended by
militias and white nationalists.
Texans know this is a Chi-Com
globalist bio weapon
meant to shut down our economy.
[Talia] The same was true
on the Internet,
with the people pushing
reopen content
or talking about COVID
restrictions as state tyranny.
We're talking about
the COVID lockdown.
What we're really talking about
is the great reset.
Everything that you're
seeing right now
is a giant political conspiracy.
[machine beeping]
We're going to do
everything we can.
That's what
I can promise you, okay?
[Anthony]
When you meet someone that says,
"Oh, I'm somewhere
in the middle,"
which I feel like what
many people were at some point,
now it's like,
"Oh, you're in the middle,
then you're against me,
fuck you."
And the algorithm
really plays into that.
[yelling]
Before we get to the topic
at hand, which is COVID-19,
we have to address the protests
currently going on
and the murder of George Floyd
at the hands of police.
[crowds chanting]
[reporter]
The outrage over the death
of George Floyd is global.
[woman]
We're here to enact change.
We're here to say
that our lives matter.
We need the community
to understand that.
[crowd]
We want change! We want change!
Certainly, all of those factors
were pumped up to 11.
[yelling and screaming]
[reporter] Dozens of American
cities up in flames
after some protests
turned into riots.
[crowd yelling]
I think that the way
that the platforms
and YouTube in particular looked
at these algorithms was
much more
from the perspective of,
"Oh wow, let's find a
way to get people
to see content coming
from like-minded people."
And I don't think it
had occurred to them
that like-minded people might
be people who were angry...
Start a riot?
Black lives matter? [bleep] you!
[Jillian]
...and looking for the kind of
content that would help them
connect with other people
who were angry in that same way
and not oriented
towards justice.
A lot of people turned
to conspiracy
for the first time.
This is experimental
bio warfare on the people.
[Talia]
Some of them became out-and-out
white nationalists
and some of them
became sympathizers
and others became
QAnon supporters.
This is like a global takeover.
They're trying to create
a new world order, right?
They thought they could easily
get their great reset.
Little did they know!
Little did they know!
They thought they
could easily have it.
Pandemic's a hoax!
[Talia] It grew and grew
and grew and grew.
[Caleb] On January 6th,
I went down to the Capitol.
[screaming]
One thing that I've
tried to understand
is Internet subcultures.
And I was at the Capitol
to take pictures
of the different
extremist groups.
[crowd]
Stop the steal! Stop the steal!
They chased me around,
called me antifa.
Tried to get
the normal MAGA people,
because I could tell these were
some Proud Boys types.
We're going to storm
the fucking Capitol.
Fuck you fuckers.
Bellingcat did a great
study of looking
at sort of neo-Nazis on
Telegram who discussed
how they'd been radicalized.
And, like, so many of them,
a majority of them,
cited YouTube specifically
as their
means of radicalization.
This is probably
our last opportunity
to actually organize
against the New World Order.
[reporter]
Yesterday, conspiracy theorists,
the alt right, the far right,
QAnon followers,
and others stormed Capitol Hill
and livestreamed it.
[man]
A lot of the blame for inciting
these riots is being placed
on social media sites.
[reporter] YouTube announced
Tuesday it suspended
U.S. President
Donald Trump's channel
as it violated policies
against inciting violence.
[woman]
He was suspended from both
Twitter and banned from Facebook
and Instagram as well.
It's a sad day when
Big Tech has more power
than big government,
that they can censor
the President of
the United States.
[Harris Faulkner]
YouTube extending its suspension
of former President Trump's
account now indefinitely.
[screaming]
[Jillian] They saw that these
democratic institutions
that they were trying to protect
through freedom of expression
were actually being destroyed
right in front of their eyes
on their platforms
for the whole world to see.
Patriots are inside
Nancy Pelosi's office!
Hey, everybody, Stefan Molyneux
from Freedomain radio.
[woman] YouTube is continuing
to ban the accounts
of white supremacists
in an effort to combat
hate speech on its platform.
We are going under
digital martial law.
[Caleb] Oversaturation of social
communication technology
is going
to cause a lot of conflict.
You know, radicalization getting
played out on YouTube
and everybody
throws their hands up and says,
"Oh, my God,
it's the new satanic panic."
It's the same old thing
that's been happening,
except now it's hyperlinked.
If we don't figure out
this problem,
we're going to lose
what it means to be human.
Your platforms have changed
how people across the planet
communicate, connect,
learn and stay informed.
The power of this technology
is awesome and terrifying,
and each of you has failed to
protect your users and the world
from the worst consequences
of your creations.
This is the first
time the three of you
have appeared before Congress
since the deadly attack
on the Capitol on January 6th.
That event was
not just an attack
on our democracy
and our electoral process,
but an attack on every member
of this committee
and in the Congress.
I want to start
by asking all three of you
if your platform bears
some responsibility
for disseminating disinformation
related to the election
and the Stop the Steal movement
that led
to the attack on the Capitol.
Just a yes or no answer.
We always feel a deep
sense of responsibility.
But I think we worked hard.
This election effort was one
of our most substantive efforts.
[Mike Doyle]
Is that a yes or a no?
Congressman,
it's a complex question.
-We...
-Okay, we'll move on.
[Natalie]
It's an interesting situation
where
what has basically
become the public forum
is run by about three
corporations.
That is a level of power,
a level of political power
and social power
and economic power that is a
little bit terrifying, I think.
[slow music]
[Brianna] Google is one
of the largest companies
in the entire world.
So, you know, one of the reasons
I ran for Congress was I thought
that I could be a force of good.
Hi, I'm Brianna Wu.
And I'm running to be your
congresswoman right here
in Massachusetts District Eight.
I thought that, as
someone who had been affected
by these issues,
I hoped that I could make
a difference directly.
Clearly just tweeting
about these issues,
it's not working.
We need legislators that really
care about this issue
and will put
skin on the line for it.
I want to be very clear that the
prosecution's decision
to abandon my client's claims
does not invalidate
the truth of her claims.
Well, because I am a litigator,
my whole like purpose
is this idea that,
you know, one lawyer
and one client can pay $210
for an index number
to start a lawsuit.
And that team of people can
create law that rules the land.
And that process comes about
by starting a case, appealing
and appealing when you lose,
and then petitioning
to the Supreme Court.
And so that's
why I think litigation
to reform the Internet
is really powerful
and is like
where I put my effort.
So, yeah, I do think it needs
to go to the Supreme Court.
Can you bring it here?
I'd like to think I've
been making a dent in it.
The NRA when I started was,
you know, all powerful.
Now they're,
they haven't gone away,
but they're somewhat in retreat.
With Google,
it's the same thing.
I have started
a Change.org petition.
People are calling in to say,
"Help Andy Parker,
give him the co-copyright.
What good is it to you
when he can at least use this
in his fight against Google?"
And so it's a worthwhile fight
and I'm not giving up.
And whether it comes from
the Supreme Court or Congress,
I think it's more likely
to come from legislation,
congressional legislation.
Google profits massively
off of lack of regulation.
If it cannot properly
protect citizens
from online harassment,
hate speech
and moment of death videos,
I call on Congress to step in
and make sure
that proper protections
are in place
for private citizens like me
who are continually
harassed and exploited.
Well, it's a great saying:
with great power
comes great responsibility.
We did tell him that
you have the influence, right,
to influence kids.
So hopefully
we teach him right
and use that influence for good.
So what we did,
partnered with the YouTube team,
is we interviewed
health experts.
If somebody sneezes or coughs
and they have the coronavirus,
how far can it spread?
That's such a cool question.
These particles are so tiny
that they even happen
when we talk.
They come out
of our mouth when we talk.
I know that sounds really gross.
[laughter]
[Shion] Ryan asks all those
questions for his fans.
And now, until when do you want
to do the YouTube thing?
Until when you're 20-something?
Yeah, 20, I think 6 or 5.
So he wants to continue this
until at least he's 26.
Just for now. [laughs]
And why? He had a reason why.
I don't remember.
I know you've probably already
read the title by now,
but I feel like we should
just come out and say it.
I'm leaving Smosh.
I know a lot of you guys
are probably going to assume
he's leaving because
we got in some sort of big fight
or because we hate each other.
But I can guarantee you guys
it has nothing to do with that.
-That did not happen.
-No.
[Anthony]
I started to really look at
what I put out there
into the ether as...
really having some kind of
influence on the way that people
go about their days after
watching a piece of content.
My name is Anthony Padilla,
and today I'm going to be
sitting down with survivors
of school shootings
to learn what it's really like
to live through such a traumatic
and Earth-shattering event.
-Thank you so much, Shelby.
-Thank you!
I feel like I fully understand
the wondrous
world of asexuality.
-I'm so glad.
-And congratulations on coming
out to the entire world.
-Oh my God, I forgot already.
-[laughter]
[Anthony]
It made me realize
that the world
would be so much
of a better place if people came
into any interaction first
from a place of curiosity
and trying to understand
rather than judgment.
And it kind of made me
feel like there was
a point to what I was doing,
which really helped.
[cork popping]
You know what, America? Things,
things have not gone well.
Well, as long as I can remember
being on YouTube,
there have always been people
who are prognosticating the end.
"Right, oh, this is
the end of the free Internet.
Like the corporations
are going to come in,
they're going to take over."
Like it used to be
the Wild West,
it used to be fun,
it used to be free.
But it's all about to end.
You know, I've been hearing
people say that for 14 years.
[chuckles] As for me,
yeah, I'm not just some person
filming on a webcam
in my bedroom anymore,
there's a budget.
There's, you know,
a million subscribers,
but it's still me.
And because I do commentary
on multiple-year trends,
which is kind of
the time frame I try to work on,
I will respond to those trends
as they happen.
Like, am I going
to be agile enough
that I can stay interesting
and continue
to know what I'm talking about?
I hope I can. I intend to try.
[chuckles]
Ideally, where do you see
YouTube in the future?
And also just personally,
where do you see
the role of big tech
in terms of the fact that
it has had so much power?
What do you think
the ideal scenario is
moving forward as both
someone who is overseeing
YouTube and as a mom,
as someone who's just
a member of this crazy society
we find ourselves in
at this moment?
[chuckles]
So I see, I mean,
I see a lot of the benefits.
I see the benefits
of the educational communities,
of connecting
diverse communities
that otherwise could have never
been connected
and at the same time
be thinking about
what are the risks,
what are the downsides,
how can we manage that?
I know that right now there's
a lot of discussion about it,
but I see that for many reasons
that will get worked out.
I think there's
a demand from society,
from the press,
certainly from governments.
And I do believe that technology
has a tremendous opportunity
for good in the long term.
[upbeat music playing]
[crowd counting down]
[all cheering]
[opening bell ringing]
What I would like to see us do
is to fundamentally
rethink the business model.
I would like to see us to return
to something more sane,
which is say, look,
if there is value to Facebook
and to Twitter and to YouTube,
I should just pay
a subscription for it.
And if I'm paying
a subscription,
my business model
looks really different.
My incentives look different.
At Google, the past year
has given renewed purpose
to our mission
to organize
the world's information
and make it universally
accessible and useful.
Building a more helpful
Google for everyone.
[slow music]
At other times in our history,
we have seen dramatic changes:
breaking up oil
and railroad trusts,
addressing big tobacco,
you know, instituting
automotive safety reforms.
We have stepped up and taken on
corporate behemoths in the past.
But it was
a very different time.
Until there's really
popular momentum
for some of these changes,
they're just
not going to happen.
[dramatic music]
If YouTube,
the company went down tomorrow,
I think there would be
a lot of harmful content
that was no longer
on the Internet.
But there also
would be a ton of creators
just trying to make a living
who'd no longer have
the venue to do that, right?
Days like this is
why I started vlogging,
because otherwise
no one would see this.
It really just kind of depends
on what your outlook in life is.
If you're the kind of person
who's looking for differences,
who's looking
for ways to condemn
other people
and to divide us out,
then you're going to see
YouTube as an enabler
of these
terrible movements, right,
as a way of justifying
the things that you're doing.
Whereas if you're someone
who's seeking those connections,
who's seeking to find
what we have in common
or to expose injustice,
then you're going to use it
for those purposes.
And so really, it's
the great enabler of all
of these different things.
[man] Liftoff of the 25th
space shuttle mission,
and it has cleared the tower.
I really feel like
this is the commons.
YouTube is our public library.
[excited basketball commentary]
[Brianna] It's a critical part
of human civilization.
[Martin Luther King Jr]
We've learned to fly
the air like birds.
We've learned to swim
the seas like fish.
And yet we haven't
learned to walk the Earth
as brothers and sisters.
[Brianna]
But I think that corporations
don't do the right thing
until they're forced
to do the right thing.
My greatest fear is that we all
become citizens of big tech.
And that there comes a point
where they're far more powerful
than our courts or lawmakers
and that they have
more information
than our law enforcers
and that we're all just
kind of at their mercy.
[dramatic rock music]
We have built
a technological grid system
that we essentially
can't destroy
because we're relying on it now.
And they've got
these things buried
at the bottom of the ocean.
Is anybody stopping for a second
and thinking about what it means
to build something
and rely on something like that
and to never question it?
Are we speeding
things up too fast?
And what are we
speeding towards?
[slow music playing]
[dramatic music playing]
[slow music continues]