Last Week Tonight With John Oliver (2014) s12e02 Episode Script

Facebook & Content Moderation

Welcome to "Last Week Tonight!"
I'm John Oliver.
Thank you so much for joining us.
It has been a busy week.
In Brazil, Jair Bolsonaro was charged
in an alleged coup scheme,
and in the U.S.,
Trump told Ukraine
they should never have started
their war with Russia,
and also made time
to speak to the press on Air Force One
to deliver an important message.
We're gonna go into Fort Knox
to make sure the gold is there.
We're gonna go into Fort Knox,
you know about that?
A bad bump!
We're gonna go to Fort Knox,
and make sure the gold is there.
Where would the gold have gone?
If the gold isn't there,
we're gonna be very upset.
Okay, set aside him saying
"That was a bad bump!"
in the voice of the Aflac duck.
He is alluding
to a bonkers right-wing theory
that the gold in Fort Knox
is missing.
Though, for what it's worth,
in Trump's first term,
his Treasury secretary, Steve Mnuchin,
actually did visit there,
and Freedom of Information requests
yielded this photo taken inside,
where you can clearly see
the gold bars.
I know they're easy to miss,
given the certified smokeshow
standing in front of them,
but I promise, they are there.
The week also saw Elon Musk's DOGE
continuing to fire government workers,
something he celebrated
by wielding a chainsaw at CPAC.
This is the chainsaw for bureaucracy!
Chainsaw!
First, is it possible that Elon Musk,
the world's richest man,
thinks the sound a chainsaw
makes is "chainsaw?"
And second, I'm not legally allowed
to say what I want to happen there,
but I can, and am,
thinking it really hard right now.
But a chainsaw might be
a pretty apt metaphor for DOGE,
given that Musk's cutting hastily
and without a lot of precision.
The government's repeatedly
had to scramble to rehire employees
they've suddenly
realized were essential.
On Monday,
we learned they were trying to rehire
"more than 300 employees tasked with
managing America's nuclear weapons".
Then on Tuesday,
the USDA said it accidentally
fired employees who are working
on the federal government's response
to the avian flu outbreak.
There were also cuts to workers at the
National Parks and Forest Services.
And at one protest,
a fired worker gave a warning
about what these layoffs could mean.
When you go on vacation this summer,
I hope there are no forest fires
because I,
and thousands of my coworkers,
other National Park rangers
and National Forest Service members,
won't be there
to help put out the fires.
So, good luck with that.
It turns out park rangers do
a lot more than just pick up litter,
give tours, and,
judging by this ranger, serve looks.
The earrings, the braids,
the cat-eyed sunnies.
She looks like the Man
in the Yellow Hat if he slayed.
And targeting park service workers
wasn't just shortsighted,
it was bullying
America's best dorks.
They're indoor kids
who love the outdoors.
Just watch this video from the rangers
at Badlands National Park
about what visitors
are not allowed to touch.
Can't touch this.
Can't touch this.
Can't touch this.
Can't touch this.
That's perfect.
Protect those dweebs at all costs!
Can you imagine their Christmas parties?
Just a bunch of nature lovers
gifting each other
"Leaf of the Month" calendars
before getting absolutely zooted
on spiked Shirley Temples?
And they are committed to that message
of "don't touch the wildlife",
even posting tweets like
"Don't pet the fluffy cows",
with the follow-up,
"In National Parks you don't pet bison.
Bison pet you, if you get too close".
And let me just say,
don't threaten me with a good time.
I'd let that broad-backed cow
with Patrick Dempsey hair
give me the ride of my life.
There is not a person among us
who wouldn't happily get bucked
and fucked by God's biggest chonk.
Now, after massive public outcry,
they did reinstate
some of the Parks Service workers,
showing two things:
one, this administration can be
swayed by public pressure,
and two, they have no idea
what they're doing.
And nothing has driven
that point home
more than DOGE's website
posting receipts for their cuts
and claiming that their total
estimated savings are 55 billion,
a claim that immediately fell apart,
as reporters quickly found
the website only accounted
for 16.6 billion of savings.
But even that figure
turned out to be an overcount,
as they'd mislabeled a canceled
contract as being worth eight billion,
instead of eight million.
And when NPR
eliminated other errors,
like contracts that hadn't actually
been terminated or closed out,
they found the estimated savings,
based on the receipts DOGE posted,
were closer to two billion.
Which is a lot of numbers away
from 55 billion!
In fact, there hasn't been
such a stark disconnect
between marketing and reality
since it turned out the Glasgow Wonka
experience was actually this.
Right now, we are all her.
And that wasn't the only big claim
that quickly fell apart.
A week ago,
Musk tweeted out this table
showing millions of people
in Social Security's database
over the age of 100, claiming:
"Maybe 'Twilight' is real"
"and there are a lot of vampires
collecting Social Security".
Laughing emojis. Because he's got
an incredible sense of humor.
Trump then amplified those claims
at a press conference.
When I saw
the Social Security numbers, I said:
"Wow, that's really something".
Let's just go above 100 years old.
We have millions and millions
of people over 100 years old.
Everybody knows that's not so.
People from 150 years old
to 159 years old:
1,345,000.
By the way,
these are in the computer files.
This is what they do well.
"Who are these DOGE people?"
"They're super brilliant computer
people and they love the country."
Okay, he is clearly
reading that for the first time,
because no one says
"Wow, that's really something",
when they know
what they're talking about.
You say
"Wow, that's really something"
when your kid hands
you a drawing of your family
and you can't decipher where
you end and your dog begins.
As for those "brilliant computer
people," turns out, not that brilliant.
Because it emerged that that table
came from a Social Security database
which "contains basic information on
every Social Security number issued".
A 2023 audit did find that it contained
nearly 19 million people over 100
without a recorded death.
But crucially,
the SSA itself had found
"almost none of them were receiving
payments from the agency,"
and that it'd cost between $5.5 million
and nearly $10 million
to correct those errors,
likely far more
than you'd get back in any savings.
And that is the thing here,
for all Musk and Trump's bragging,
many of the cuts they're making
are random, disorganized,
and just plain stupid.
As the real and substantial damages
of this become more and more clear,
I'm guessing images like Elon
wielding a chainsaw
while dressed like the Grim Reaper
on a Vegas bender
are gonna spark
a ton of public anger.
And when they find that anger
directed right at them,
if I may borrow the words
Good luck with that!
Exactly. And now, this!
And Now:
CBS 58's Frankie Jupiter
Has A Way With Small Talk.
- Where's Peaches?
- At home, cuddling with her father.
But don't worry,
I'm still the favorite.
- Okay.
- Absolutely.
- Are you sure, though?
- Yes.
You just kind of know
when you're the dog's person.
I don't know how,
but you definitely know.
And I've already bought
her two Halloween outfits.
- And she loves her new outfits.
- Are you sure, though?
- Good morning. How we feeling?
- Good.
- Yeah. How are you?
- Yeah, I'm good. Good, thank you.
- Wonderful. No complaints!
- No.
- It's Friday.
- The 13th.
- I didn't realize that.
- Yeah.
- Where's that umbrella?
- I wish we had a black cat.
- And a mirror to smash.
- Frankie has a mirror.
If you bought that Apple stock,
you wouldn't be here today.
That's true.
You'd be running
your own Fortune 500 company.
That's also true.
- That would be better?
- Yeah.
- But doesn't it feel like Friday?
- No.
- You ever wanted to be an astronaut?
- No.
What do you want today? Maybe you
want to buy yourself something nice.
I'm good.
Who would we be if we were
in the Beatles? Would I be Ringo?
- Who do you want to be?
- I don't care. I like 'em all.
- Who do you want to be? Paul?
- I'll take whoever's left.
- Who do you want to be?
- I'm here so I don't get fired.
Moving on.
Our main story tonight
concerns technology,
that's brought us stone tools,
the catapult, the Tamagotchi,
and one day, God willing,
a fourth thing worth having.
One of the biggest stories
from the last election
is just how much the tech industry
seemed to swing toward Trump.
Elon Musk, of course,
campaigned and jumped for him,
and Jeff Bezos reportedly killed
the Washington Post's endorsement
of Kamala Harris,
and got a prime seat
at the inauguration,
alongside the CEOs
of both Google and Apple.
But one of the most visible swings
came from Mark Zuckerberg.
He famously banned Trump
from Facebook after January 6th.
But last month, he was co-hosting
a party at his inauguration,
and just two weeks before,
he made this striking announcement.
Hey, everyone. I want to talk
about something important today
because it's time to get back
to our roots around free expression
on Facebook and Instagram.
Is that what you wanted to talk about?
Because I'd much rather discuss
why you suddenly look
like Eddie Redmayne
was cast to play Ice Cube.
You look like white Macklemore.
You look like a high schooler
going undercover as a different
high schooler with fewer friends.
But I'm sorry, you were talking
about getting back to your roots?
First, we're going to get rid
of fact-checkers
and replace them
with community notes,
similar to X, starting in the U.S.
Second, we're going to simplify
our content policies
and get rid of a bunch of restrictions
on topics like immigration and gender
that are just out of touch
with mainstream discourse.
Third, we're changing
how we enforce our policies
to reduce the mistakes
that account for the vast majority
of censorship on our platforms.
We used to have filters that scanned
for any policy violation.
We're going to focus those filters
on tackling illegal
and high-severity violations.
And for lower severity violations,
we're going to rely
on someone reporting an issue
before we take action.
That is the CEO of Meta announcing
that he's getting rid of fact-checkers
and saying he doesn't want to be "out
of touch with mainstream discourse"
while wearing a $900,000 watch.
And there is just no way
any watch is worth that much,
unless when you look at it,
it reads:
"It's time to donate
the rest of your money,
you officially
seem to have too much."
The changes Zuckerberg's
making are striking.
A leaked training document
found it's now acceptable to say,
"Immigrants are grubby,
filthy pieces of shit,"
and also specifies
that this slur for trans people
is no longer a designated slur,
and is therefore allowed.
As for replacing fact-checkers
with community notes, like on X,
it's worth noting that hasn't been
a raging success over there,
with one study finding
nearly 3/4 of accurate community
notes on election misinformation
never got shown to users.
Remember that tweet
falsely claiming Haitians
were eating pets in Springfield, Ohio?
That claim was rated PolitiFact's
"Lie of the Year",
but there is still
no community note on the tweet,
despite multiple attempts
to add one.
And all of this is a pretty
notable shift for Zuckerberg.
7 years ago, amid widespread
public outcry around Facebook
stoking misinformation and hatred,
he went before Congress to apologize.
For most of our existence,
we focused on all of the good
that connecting people can do.
But it's clear now
that we didn't do enough
to prevent these tools
from being used for harm, as well.
And that goes for fake news,
for foreign interference in elections,
and hate speech, as well
as developers and data privacy.
We didn't take a broad
enough view of our responsibility,
and that was a big mistake.
And it was my mistake.
And I'm sorry.
Yeah, it seems that Victorian ghost
has made a pretty big turnaround,
both in terms of what he's saying,
and how he looks.
I will admit, new Zuck
does look like he's having more fun.
He's tan, he looks like he's shopping
at stores that only take crypto,
and he's not sitting
in front of Congress
looking like a depressed version
of the guy from the stonks meme.
Zuckerberg is framing all this
as merely responding
to a broader cultural shift,
something that he outlined,
naturally, on Joe Rogan.
What we do is, we try to build
a platform that gives people a voice.
But I think that there's
this wholesale generational shift
in who are the people
who are being listened to.
I think it's just, like,
a wholesale shift in saying,
"We just want different people
who we actually trust".
Who are actually
going to tell us the truth,
and not give us like the bullshit
opinions that you're supposed to say,
but the type of stuff
that I would actually,
like when I'm sitting
in my living room with my friends,
like the stuff that we know is true.
Is there anything more off-putting
than a guy worth hundreds of billions,
trying to be a relatable everyman?
"You know how it is, chilling
in the living room with the bros,
cracking a six-pack
of Ace of Spades Magnums,
kicking back
on your diamond-encrusted sofa,
and turning on the big screen TV,
which in my house, is a hollow box
where I pay the cast of 'The Office'
to reenact my favorite scenes.
Just relatable,
everyday stuff, guys."
I'm not saying
Facebook was doing a perfect job
of moderating content
until now,
we've criticized them multiple times
before on this show.
I'm also not saying they even
could've done it perfectly.
It's been said that "content moderation
at scale is impossible to do well."
But the decision
to both abandon fact-checkers
and turn off systems they'd previously
claimed made the platform safer
does feel like it's about to make
that site a whole lot worse.
And the self-depiction of Zuckerberg,
rap name Lil Broccoli,
as someone embracing his company's
"roots around free expression"
is self-serving bullshit.
Given all of that, let's take a look at
the challenges of content moderation,
how Facebook's faced them
in the past,
and what might've led
to its new approach.
Let's start with the challenges.
Because from the very beginning
of the modern internet,
there were concerns
about what was on it.
In 1995, Senator James Exon
brought a blue binder
to the floor of the Senate.
The most hardcore,
perverse types of pornography.
The images
came from the internet.
Exon wanted his fellow senators
to realize what kids could see.
Come by my desk
and take a look
at this disgusting material.
So, there's a lot to love there,
from the binder that says
"warning" in big letters,
to the invitation to come by his desk
for some "disgusting material".
But my favorite part
has to be the clip of a Playboy JPEG
loading one centimeter at a time.
That is an extremely
accurate portrayal
of what online porn was like
in the '90s.
How do I know?
No reason at all.
Back then, there were battles
in both Congress and the courts
about how the law should treat
websites' hosting of things
like pornography
and defamatory comments.
For a brief time,
there were questions
about whether a site's decision
to moderate content in any way
made it a publisher, therefore liable
for anything that it didn't remove.
That ultimately led to the passage
of what's known as Section 230,
often described as the
"26 words that created the internet".
It stated that companies
could be shielded from liability
for what their users post, because
unlike, say, a print publication,
websites are dealing
with so much material,
they couldn't possibly vet all of it.
It does have some carveouts:
it doesn't give sites a pass
for certain types of illegal content,
like child sexual abuse
and terrorism materials,
but mostly, it allows them
to moderate without fear.
Which is good.
Because, as scholars will tell you,
content moderation is absolutely key
to making the internet bearable.
You can't have a usable platform
if you don't do
some sort of content moderation.
Otherwise, every platform
will just be porn and diet pills.
Right, that would be a problem!
Especially on a website
like, say, LinkedIn.
Although, to be honest, it might
already be just porn and diet pills.
I haven't been on it in years,
I've had this job since 2014
and frankly have no interest
in learning how other people
"rise and grind".
The point is, though,
if you want people to use your site,
and, crucially, have companies
want to pay to advertise on it,
you're gonna have to make choices
about what to remove.
And how you make those choices
is always going to be contentious.
Facebook, over the years,
has learned a lot of these lessons
the hard way.
In its early days, it took an almost
touchingly naive approach,
as this former employee explains.
We had to set up
some ground rules.
Basic decency, no nudity,
and no violent or hateful speech.
And after that,
we felt some reluctance
to interpose our value system
on this worldwide community
that was growing.
Was there not a concern, then,
that it could become
sort of a place
of just utter confusion,
that you have lies that are given
the same weight as truths,
and that it kind
of just becomes a place
where truth
becomes completely obfuscated?
No.
We relied on what we thought
were the public's common sense
and common decency
to police the site.
You did, did you?
I'd say that was adorable,
but frankly, I would be embarrassed
to go on "Frontline" and confess
that everyone working at Facebook
shared a level of wide-eyed naivete
that can only be described
as "full-blown Amelia Bedelia".
Now, obviously,
that initial optimism didn't last.
Over the years, Facebook started
implementing more and more rules
and employing more and more people
to enforce them.
And that could be a grim job.
Here is one moderator
describing what it was like
to screen thousands
of disturbing images a day,
with, I'm going to warn you,
a very distracting disguise.
I think it would be easier
to deal with the images
if you weren't having to think
about them so deeply.
I worked the evening shift,
so I would start at 6:00 PM,
finish at two o'clock
in the morning.
But then you would often wake up
three, four hours later.
You'd suddenly sit up in bed,
remembering a decision
that you've made
and realizing
that you've made a mistake.
Like, I've missed a nipple.
You remember
some image that you'd seen,
and you suddenly realize that
there was a naked girl on one side
or an ISIS flag
in the background,
so now it should have been deleted
under the terrorism policy.
Setting aside that it's coming
from a grown man with a baby head
talking like Darth Vader
while dressed for vacation,
that is a depressing glimpse
into what Facebook was dealing with.
And it is a long way from relying
on the inherent good of humanity
to hiring people
to mainline ISIS porn.
And while some decisions
around blocking content were easy,
others turned out
to be more difficult.
Because think about it,
say you ban nudity.
What about statues?
What about breastfeeding?
What about
a breastfeeding statue?
And before you answer, what if
I told you that I meant this one?
See? It gets tricky quick.
The company
also had to develop policies
around hate speech
and misinformation,
where the boundaries
could be even trickier to define.
For instance, their rules
prohibited attacks on people
because they belong
to "protected categories"
based on things like race,
sex, gender identity,
or religious affiliation,
but it allowed users broader latitude
when they wrote about narrower
subsets of those categories.
And watching moderators try
to apply rules like that in practice
can be bizarre,
as this hidden-camera footage
from a content
moderation center in Ireland shows.
That ticket there
"Fuck off back to your own country",
and it says "Muslim".
- Immigrants.
- Muslim immigrants.
If it just said "Muslims", then, yeah.
You'd take this action.
But it doesn't,
it's actually an ignore.
"He looks after stinking
Muslim immigrants."
- I think that's fine.
- How is that right?
Yeah, because saying that they're
stinking might be physical inferiority.
If it said "Muslim immigrant scum",
for example, that would be a delete.
Yeah, that is a weird place
to draw the line.
But at the same time, anywhere
you draw a line can be weird.
Deciding where speech becomes
harmful is like trying to figure out
which of the horrifying nightmares
on an "Animorphs" cover
you wouldn't run over
with your car.
These freaks?
No question. It's on sight.
I'll bounce them off the front bumper
of my Subaru going 80 and feel nothing.
On the other end, though, whoa!
That's a normal kid.
Slow down,
I'm not trying to do time.
But in the middle?
Yeah, that's a question, isn't it?
That's a question.
And the thing is,
the same goes for fact-checking,
obvious lies are one thing,
but there are plenty of statements
that are factually true
but still technically misleading.
And at one time, Facebook
put a lot of thought into this,
even producing
this video in 2018,
featuring employees wrestling
with the nuances of moderation,
and one even diagramming
out the problem as he saw it.
Imagine on the X axis
that you have the amount of truth
in a piece of content.
Now on the Y axis
you have the intent to mislead.
You can take this chart, and you
can split it into four quadrants.
In the bottom left, you have
the set of things that are low truth,
but nobody
was intending to mislead anyone.
That's just called being wrong
on the internet and it happens.
And in the bottom right, it's the set
of things that have high truth
but again nobody
was trying to mislead anyone,
that's just called
being right on the internet
and I'm sure
it will happen someday.
The top right,
this is things that are high truth,
high intent to mislead
so this is stuff like propaganda.
So, this is stuff
like cherry-picking of statistics.
Now, mind you, we have to be
really careful here, right?
Because of our commitment
to free speech,
everything we do here
has to be incredibly careful.
But we move to this quadrant,
this is the really dangerous quadrant.
Low amount of truth,
high intent to mislead.
These are things that were explicitly
designed and architected to be viral.
These are the hoaxes of the world.
These are things like Pizzagate.
This is just false news.
We have to get this right
if we're gonna regain people's trust.
It is amazing that all this started
when a young man
had a simple dream of ranking
his classmates by fuckability,
and 15 years later,
a company's struggling to stop people
from accusing random pizzerias
of human trafficking.
A butterfly masturbates
in its dorm room,
and it causes a hurricane
for the rest of us.
And to give that man credit:
he is wrestling with the issue there.
But the company wasn't doing that
out of the goodness of its heart.
Facebook had come
under heavy fire
for allowing fake news
and hate speech to proliferate,
not just in the U.S.,
but also abroad.
We've talked before
about how misinformation on Facebook
helped fuel ethnic hatred,
leading to headlines like:
"Facebook Admits It Was
Used to Incite Violence in Myanmar".
It was around this time that
Zuckerberg apologized to Congress,
and the company began deploying
options to handle misinformation,
from partnering
with outside fact-checkers
to appending notes to posts.
It would also delete some posts
or limit the reach of others.
And in doing so,
it found itself constantly
making very hard decisions,
under pressure
from some very powerful people.
Here is one such case,
and a former Facebook employee
explaining the decision
they ended up making.
We want to give this president the
opportunity to do something historic.
This was the video
of then-House Speaker Pelosi
posted to Facebook in 2019,
slowed down to make it seem
that she was slurring her words.
Did it come down?
- It did not.
- Why?
Because it didn't violate
the policies that they had.
So, did she put pressure
on the company to take it down?
She was definitely not pleased.
- Is that a yes?
- Yes.
And it really damaged the relationship
that the company had with her.
Okay, set aside the fact
I don't see why you'd need to slow down
footage to embarrass Nancy Pelosi,
a person who says plenty of
embarrassing things at normal speed,
I do agree with Facebook there
that "does this piss off Nancy Pelosi"
isn't a valid metric for taking
that particular video down.
I'm not saying Facebook made
the right decision 100% of the time.
Again, no company operating
at this scale could.
But it did develop systems
that it claimed worked pretty well.
At one point,
they bragged that when people saw
fact-checkers had labeled
content false, or partially false,
they would not click on it
nearly 95% of the time.
And in a recent report,
they noted that,
when it comes to hate speech
they'd taken action against,
their systems automatically dealt
with 95.3% of it,
meaning users
only had to report the rest.
Of course, some of those tools
are being watered down,
and others
are being turned off completely.
And that brings us
to the question of why.
Why are they suddenly
doing this?
There are a few things that happened
during the past five years
that have helped
bring us to this point.
One has been conservatives
repeatedly painting
normal content moderation
as political persecution.
Big tech's out to get conservatives.
That's not a suspicion.
That's not a hunch. That's a fact.
We've seen these-these-that
big tech have been censoring us.
American people are being censored.
Conservatives are being censored.
The information that's flowing to the
American people is being censored.
It's just the bottom line.
Okay, that is obviously all bullshit,
but to be fair,
Devin Nunes knows a thing or two
about censorship,
given he once filed
a $250 million lawsuit
against Twitter accounts
that made fun of him,
including one that pretended
to be Nunes' cow.
He sued a fake Twitter cow
because it said mean things about him,
prompting the ACLU to issue
this actual statement headlined
"Devin Nunes' Cow
Has a First Amendment Right to Call
Rep. Nunes a 'Treasonous Cowpoke'".
We truly live
in the single stupidest timeline.
The point is, conservatives have been
crying censorship for years.
But the evidence
for that is very weak.
First, to the extent their posts
do get flagged more,
that's probably
because "conservatives tend to be
more likely to spread
political misinformation,
according to numerous
empirical studies."
But even if you think platforms
are trying to suppress conservatives,
they're doing a terrible job of that,
given many of Facebook's top
performers lean right,
and there are three times
as many explicitly conservative
news influencers
as liberal ones on the site.
Republicans conducted an assault
on the idea of content moderation,
often citing one go-to example,
outlined here by Jeanine Pirro.
Who suppressed free speech
in the 2020 election? Facebook.
When they wouldn't allow people
to communicate
and the press to communicate
on Hunter Biden's laptop.
Right. Hunter Biden's fucking laptop,
a story that big tech
successfully censored,
which is why you've never
heard about it.
And I'm afraid
it is worth taking a second
to remind you
of the details in this story.
Because while people's minds might
immediately swing to "Russian hoax"
or "damning evidence
of Biden corruption",
the truth is, it was neither.
Very briefly, back in 2020,
while Trump was president,
social media sites
got a warning from the FBI
to look out for hack-and-leak
operations before the election.
Then, in October, the New York Post
ran a story based on files
from a laptop they claimed
belonged to Hunter Biden,
and which had been given to them
by Steve Bannon and Rudy Giuliani.
Facebook and Twitter
were wary of the story.
Twitter briefly didn't allow
people to post links to it.
And Facebook allowed the story
to be seen and shared
but limited the article's reach,
only to remove that restriction
soon after.
Now, it eventually came out
that files from the laptop were legit,
but also, that nothing on it
revealed illegal
or unethical behavior by Joe Biden.
Was initially suppressing the laptop
story a fuckup by these companies?
In hindsight, yeah.
Was the story itself particularly
revelatory or important?
Not really.
Did Facebook's actions
prevent people from finding
out about it before the election?
Again, not really.
Even during the period
Facebook was limiting its spread,
the story got 54 million views
on its site.
So, if this was an attempt
at censorship,
it was successful
in limiting the audience
to around the same number of people
that watched the "Friends" finale.
But that initial decision
meant Mark Zuckerberg
got yelled at a lot by the right.
And around that same time,
he was also being yelled
at by Biden's White House.
Because as the Covid vaccine
was rolling out,
a lot of misinformation
was circulating on Facebook.
Biden said at one point, of Facebook,
that, "They're killing people".
While he quickly walked that back,
to hear Zuckerberg tell it,
the pressure from the White House
to suppress anything critical of vaccines
back then was overwhelming.
Basically, these people
from the Biden administration
would call up our team
and scream at them and curse.
Biden, when he was,
he gave some statement at some point.
I don't know if it was a press
conference or to some journalist
where he basically was like
"These guys are killing people".
I don't know.
Then, all these different agencies
and branches of government
basically just started investigating,
coming after our company.
It was brutal. It was brutal.
He is clearly pandering
to Joe Rogan and his audience there,
although many seemed too distracted
by his outfit for that to work well,
given comments
under that video include,
"Bro dressed like undercover cop",
"I thought this was Lil Dicky",
"Why is a 40-year-old billionaire
dressed like my 25-year-old
shrooms guy?"
But let's deal
with his implicit claim
that the government
launched investigations
to punish Facebook
for hosting anti-vaccine content.
It is true that the government's
investigated Facebook a lot,
in recent years,
but none of those investigations
fit Zuckerberg's narrative.
Some, like an FTC antitrust lawsuit,
were launched
during the first Trump administration.
Others, like the CFPB's investigation
of big tech payment systems,
involve multiple other companies.
And much of the scrutiny
the company's received in recent years
was actually the result
of a whistleblower
releasing a cache of documents
known as "The Facebook Files".
As for his complaint
the government was cursing
and screaming at Facebook,
they are allowed to do that.
When we call up government agencies
to check a fact,
they can tell us to eat shit.
Because cursing
does not violate your rights.
For more on that,
check out "The Constitution
for Total Fucking Dumbasses".
What they can't do is force you
to do something.
And in that interview,
Zuckerberg describes his response
to government demands back then
as "I was just like, well,
we're not going to do that".
And, exactly, you said no.
As was your right.
And don't take my word for this.
Allegations like these
have been adjudicated in court.
When two Republican state AGs
tried suing the government,
claiming it had pressured platforms,
including Facebook,
to censor their speech, they lost,
in a six-three Supreme Court decision
written by Amy Coney Barrett,
who noted
"the plaintiffs could not demonstrate
that their content was restricted
due to government pressure".
I can understand Zuckerberg's
feelings being hurt
by the president saying
his company's killing people.
And I can understand him
being sick of being yelled at
by Republicans for doing too much,
and by Democrats for doing too little.
On some level, I can even understand
a business wanting to cozy up
to whoever's in the White House.
But there is one other factor here that
does seem relevant to this discussion,
and feels important to mention.
It's a political evolution for Meta.
Four years after Facebook
suspended Mr. Trump's accounts
in the wake of January 6th,
and just months
after the president-elect
accused Zuckerberg
of plotting against him in 2020,
calling for "life in prison"
if Zuckerberg did it again.
But after Mr. Trump's win,
Zuckerberg travelled to Mar-a-Lago,
his company donated a million dollars
to the Trump Inaugural Fund,
and now close Trump ally
and UFC head Dana White
is joining Meta's board.
Meta, Facebook.
I think they've come a long way.
Do you think he's responding to the
threats that you have made to him?
Probably.
Yeah! Yeah, probably!
Trump threatened Mark Zuckerberg
with life in prison,
then Zuckerberg
turned around, gave him money,
hired one of his buddies, and changed
the direction his company was going.
It doesn't take a genius
to draw a conclusion there
and, in fact, it didn't take one.
And it didn't stop there.
Meta also recently paid $25 million
to settle a bullshit lawsuit
that Trump filed
over being kicked off Facebook,
despite many experts agreeing
that was well
within the company's rights.
And at this point, it does begin
to feel like Trump is doing
exactly what Zuckerberg
accused the Biden administration of:
leveraging the power of his office
to pressure social media companies
to bend to his will.
And Zuckerberg
seems to be complying.
And he'll insist these changes
are not a result
of being under political pressure,
but either way,
Facebook sure seems now set
to become an absolute sewer
of hatred and misinformation.
Which I know sounds like a pretty
good description of Facebook already,
but we're about to see what happens
when they really stop trying.
So, what can we do?
There are some bad ideas out there.
Both Democrats and Republicans
have, in recent years,
suggested ways
to amend Section 230
so companies are more liable
for what appears on websites.
But I have yet to see a proposal
that couldn't be easily weaponized
to enable political censorship.
There are definitely options
available to companies
that advertise on Facebook,
and I would argue they might want
to seriously consider
whether they want their ads next
to these actual sample sentences
Facebook says are now acceptable.
Hey Disney, you want Olaf promoting
"Frozen 3" next to that shit?
I don't know, maybe you do!
But for individuals,
the options here are more limited.
You could delete
your Meta accounts.
And you would not be alone
in doing that.
In January, Google searches for how
to cancel and delete Facebook,
Instagram, and Threads accounts
increased by over 5,000%.
And there are alternatives
that don't seem as desperate
to fall in line with Trump.
But I do get that
if Facebook and Instagram
are where your family
and friends are,
you may not be ready
to take that step.
Just remember to take whatever
you read on those platforms
with even more of a grain of salt
than you did before.
But there is one small way
you can actually fuck with Meta,
and that is by making yourselves
a bit less valuable to them.
Remember, advertising makes up
98 percent of Meta's revenue.
And a key component
is them being able to offer companies
the ability to micro-target you.
Meta can do that because
they track massive amounts of data
about not just
what you do on their sites,
but all across the internet.
Which is why they probably
would not want me to tell you
that you can change your settings
so that Facebook and Instagram
cannot profit as much
from your data anymore.
If you'd be interested in a step-by-step
guide on how to do that,
simply visit John-Oliver-
wants-your-raterotica-dot-com.
And if Facebook is gonna continue
to subject us to a steadily rising tide
of slurs, hoaxes,
and misinformation,
the least it can do is tell us
the actual truth in its messaging.
Here at Facebook,
it's time to get back to our roots
around free expression.
We've been 100% committed
to making Facebook a safe
and inclusive place for all people.
Over the years, we've tried
our best to prevent the site
from becoming a frothing toilet
of the worst humanity has to offer.
It felt like our responsibility,
because we made the toilet,
and got super rich from it.
But moderating
billions of users is really hard.
They say, without it, the internet
would just be porn and diet pills.
God, I wish those were the only
things we had to look out for.
It's porn, diet pills, hoaxes,
gambling, crypto scams,
those pictures
of a weirdly jacked grandpa
that say "One trick to getting
turbo-shredded after 50".
It's a lot.
Which is why we're so happy
to announce our 2025 policy
on content moderation: fuck it.
Fuck it.
To be clear,
all our previous issues remain,
but by strategically
pivoting to fuck it,
we found it's now
more of a you problem.
It's so nice not to have to keep
all the nuances straight,
like why "immigrants are shitty"
is acceptable,
but "immigrants are shit" isn't.
- Other way around.
- You sure?
Is "shit" the adjective or the object?
What if they wrote
"Go back to your country, immigrant?"
- That's definitely okay.
- Okay. All right.
I mean, not okay,
but-you know.
You know
how your older relatives would say,
"I got an email from a prince in Africa
who will send me a million dollars
if I give him
my Social Security number,"
and you had to be like
"No, Grandma, that's fake"?
Now you just have to do
that for all your relatives,
for all news, forever.
Fuck it.
Besides, what is the worst
that could happen?
A genocide! We kind of sort
of contributed to a genocide.
In Myanmar. Remember?
But what are the odds
another genocide could happen?
I have no idea. We fired
the team that would know.
I'm just glad my baby's
gonna grow up in a world
where he can use the slurs
of his choice.
Or her choice. Or their-oh, right.
No, that's right, it's two.
Facebook decided it is just
the two now. Hear that?
Fuck it.
Fuck it.
And to those who say this is just us
rolling over for President Trump
in the hopes he won't
throw us all in prison,
let me forcefully say: nuh-uh.
Donald Trump doesn't set the tone here.
Unless he said he did. Did he?
Just know that
whatever's happening out there,
we here at Facebook
are recommitting to our core values.
The same ones
we've definitely always had.
- Freedom of expression.
- Avoidance of responsibility.
And ranking college girls
by hotness.
No? No, that's too far back?
Okay, never mind.
Facebook. It's like a town square,
if your town was also
full of Russian spies and bots,
some teenagers disguised as adults,
some adults disguised as teenagers,
getting together
to say whatever they want,
including conspiracy theories
plus variations on
"but the Nazis also had
some good ideas",
and also now you are the mayor
and police of your town square.
That's our show, thanks for watching.
We'll see you next week, good night!
Fuck it.
Fuck it.
Fuck it.