Follow This (2018) s01e07 Episode Script

The Future of Fakes

A NETFLIX ORIGINAL DOCUMENTARY SERIES
[Charlie] My fellow Americans
this is not President Trump.
This is Charlie Warzel from BuzzFeed News.
And the future, I can assure you
is extremely terrifying.
THE FUTURE OF FAKES
I think technology
is a value-neutral tool.
CHARLIE WARZEL
SENIOR TECH WRITER, BUZZFEED NEWS
It is as good as we want it to be
and as bad as we make it.
I report on very strange things
that happen on the internet usually.
But I also peer down
into the rise of online harassment,
and sort of the darker side
of the internet.
I look at sort of
a lot of this fake news stuff,
um, and see it sort of in the same way
as climate change.
But the difference between climate change
and the misinformation crisis
is that one is very quantitative.
You can track
how hot the planet is getting.
You can monitor all that stuff.
This technology is so new
that there's no way to quantify
how prevalent it is,
and how much it's growing.
People can write fake stories
and they go viral,
and that's sort of lo-fi.
What happens if you can manipulate
the president's face
and create a video where he declares
nuclear war on North Korea?
The future of this
is potentially disastrous.
"FAKE NEWS PSA":
JORDAN PEELE AND BUZZFEED NEWS
We're entering an era in which
our enemies can make it look like
anyone is saying anything
at any point in time,
even if they would never say those things.
[Charlie]
The audio and video manipulations
hitting the internet these days go viral
because they range from entertaining
to downright dystopian.
For instance, they could
have me say things like,
"President Trump is a total
and complete dipshit."
Now you see,
I would never say these things.
At least not in a public address,
but someone else would.
Someone like Jordan Peele.
This is a dangerous time.
[Charlie] I think in the last year
we're seeing
more and more the beginnings
of what this future might look like.
I have a lot of questions about the people
that are actually building this stuff.
So, where are we
with audio and video manipulation?
And who is this technology
really impacting?
INDUSTRIAL LIGHT & MAGIC HEADQUARTERS
[John] I like living in a world
where I can tell the difference
between reality and fiction.
Um, although professionally,
I work in a field
where we are trying to make
compelling imagery
that transports you
to different times and places.
AVATAR, 2009
Damn you,
Jack Sparrow!
PIRATES OF THE CARIBBEAN: DEAD MAN'S CHEST, 2006
You are sort of a special effects legend.
A fascinating thing is
you are also behind Photoshop.
I'm surprised at just how
widespread it became,
or that it would become a verb.
Any regret that you have
with the technology?
I do wish there were fewer
unethical uses,
but I think it's more up to society
to put pressure on people, um,
in applying appropriate ethics
with those tools.
[Charlie] John used more
than just Photoshop
after Star Wars actor
Peter Cushing died in 1994.
Charming to the last.
I wanna talk a little bit about
the digital recreation, Grand Moff Tarkin,
and sort of this idea of bringing somebody
who is no longer with us back to life.
[John] We built a very detailed
digital model of Tarkin's face.
Then went through a process
of continually comparing our model
against that archival footage,
until we felt like we had
a really good match.
The original plans for the station
are kept there, are they not?
How fast is this stuff
moving and evolving?
[tongue clicks] Never fast enough.
- [chuckles] Never fast enough?
- For my taste.
[Charlie] Not everyone can bring people
back from the dead.
NEW YORK, NY
But what technology is available
to the average user?
I've been using
this free piece of software.
And essentially what it does is
it creates a digital copy of your voice.
You use your computer's microphone,
and you just speak a bunch
of pre-set phrases that they tell you.
"The sparrow and the songbird
danced along the fence."
And now I can type into a text box
anything that I want,
and it reads it out in my voice.
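The workflow Charlie describes boils down to two steps: enroll a voice by uploading recordings of the pre-set phrases, then synthesize arbitrary text in that voice. Here is a minimal Python sketch of that flow; the service URL, endpoints, and field names are invented for illustration and do not belong to any real product.
```python
# Hypothetical sketch of a voice-cloning workflow: enroll a voice from
# recorded prompt phrases, then synthesize arbitrary text in that voice.
# The API below is invented for illustration, not a real service.
import requests

API = "https://api.example-voice-clone.com/v1"  # hypothetical endpoint

def enroll_voice(prompt_recordings: list[str]) -> str:
    """Upload recordings of the pre-set phrases; returns a voice ID."""
    files = [("samples", open(path, "rb")) for path in prompt_recordings]
    resp = requests.post(f"{API}/voices", files=files)
    resp.raise_for_status()
    return resp.json()["voice_id"]

def synthesize(voice_id: str, text: str, out_path: str = "speech.wav") -> str:
    """Ask the service to read arbitrary typed text in the cloned voice."""
    resp = requests.post(f"{API}/synthesize",
                         json={"voice_id": voice_id, "text": text})
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)
    return out_path

if __name__ == "__main__":
    phrases = ["phrase_01.wav", "phrase_02.wav"]  # "The sparrow and the songbird..."
    voice = enroll_voice(phrases)
    synthesize(voice, "Just wanted to quickly check in and make sure "
                      "we are still good for dinner tonight.")
```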
So, obviously what I want to do now is
I want to try to use this to fool my mom
into thinking it's me.
[phone line ringing]
- [Robin] Hi.
- [Charlie] Hey, how's it going?
[Robin] Good, good. How are you?
ROBIN WARZEL
CHARLIE'S MOM
[Charlie] I only have a second here.
Super busy.
Just wanted to quickly check in and make
sure we are still good for dinner tonight.
[Robin] Yeah, yeah. 8:15.
[Charlie] Okay, sounds good.
- Text me when you arrive.
- [Robin] Okay.
- [Charlie] Sound good?
- [Robin] Oh, great, Charlie. Thank you.
- [Robin] I'm glad you called.
- I'm losing service right now. Love you.
- Bye.
- [Robin] Love you, too. Bye.
[line disconnects]
Aah!
That worked!
Eww, I'm creeped out!
I really thought
it was gonna totally fail.
Uh, I thought she was just gonna be like,
"What's going on?"
I'm like I'm like very adrenaline-y
right now. Um
It proves that
it doesn't even have to be good,
it just has to be good enough, um
to fool, literally, your mother
into thinking that it's you
when it's a computer.
SANTA MONICA, CA
Training your voice is one thing,
but seeing is believing.
Where do things stand
with fake visuals?
HAO LI
COFOUNDER, PINSCREEN
The question is, how can we create
a, you know, photo-real avatar of you
that is not just 3D, but is also dynamic?
Pinscreen's technology will allow users
to create realistic digital avatars
for virtual reality and video games.
Several tech companies
license their software.
- Let me show you real quick how it works.
- Okay.
So, on the very left you can see
this is a neutral picture of someone.
[Charlie] Hao uses hundreds of thousands
of photos of real people
to train the computer to predict
what an expression would look like
on any given face.
[Hao] Whenever it makes a mistake,
we just tell it, "No,"
and then it gets better, and better,
and better.
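What Hao is describing is ordinary supervised training: the network predicts an expression, the prediction is compared against a real photo, and the error is the "No" that corrects it. A loose PyTorch sketch of that loop, with a placeholder model standing in for Pinscreen's actual system:
```python
# A loose sketch of the correct-the-mistake loop Hao describes: predict
# an expression from a neutral face, compare against a real photo, and
# backpropagate the error. The tiny model here is a stand-in, not
# Pinscreen's actual network.
import torch
import torch.nn as nn

model = nn.Sequential(  # placeholder image-to-image network
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

def training_step(neutral: torch.Tensor, real_expression: torch.Tensor) -> float:
    """One round of 'predict, compare to reality, correct the mistake'."""
    predicted = model(neutral)                  # the network's guess
    loss = loss_fn(predicted, real_expression)  # how wrong the guess was
    opt.zero_grad()
    loss.backward()                             # the "No" signal
    opt.step()                                  # ...and it gets a little better
    return loss.item()
```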
- Now we wanna predict how it works on you.
- Yeah.
- [Hao] So you took your selfie
- Yeah.
and you sent it to us.
- This is one photo I sent you a week ago.
- Right.
The computer's never seen it move.
KOKI NAGANO
PRINCIPAL SCIENTIST, PINSCREEN
- I've never seen it move.
- You've never seen my face move.
And based off of
- a database of knowledge of faces
- Right.
- and facial contortions
- Right.
it was able to predict
This is how
- how I'm going to move.
- Yeah.
And all your key expressions,
and even anything between it.
And it generates wrinkles.
- That's how my nose wrinkles. For sure.
- Right?
- Yeah. Yeah.
- That's wild.
[Charlie] And right now this is like
a pretty heavy lift, right?
No, um, actually the training
Uh, the training
is computationally expensive,
but processing you, in theory,
it can be in real time.
Wow. So, the future is here.
Yeah, this is, uh this is coming.
Damn!
[Charlie] Which brings me back
to puppeting President Trump.
- Here is a video that you recorded.
- Yeah.
[Hao] And we just took like one picture
from Trump's whatever speech.
[Charlie] Yeah.
We created a 3D model,
generating all the teeth, all the
- Wow!
- you know, faces and
you know, it's kind of interesting
because it creates this real 3D motion.
[Charlie] Yeah.
[chuckling]
I'm just going to kind of, you know
[deep sighs]
try to move my face around a whole bunch
and talk to you guys
about nothing in particular,
just like total freakin' nonsense.
Oh, my Lord!
It's Charlie. How's it going?
Wow.
[Charlie laughing]
The system is really thinking
about like how would Trump smile.
[Charlie] Yeah, based on the way
his face looks.
[Hao] Based on how he looks like. Right?
[Charlie] You're so deep in this world.
Would you say you're
more worried than the average person
about, you know, potential future misuse?
Does this keep you up at night?
This is what computer graphics always did,
this is what the visual effects
industry does,
is to create things that look photo-real,
- to create an illusion for storytelling.
- [Charlie] Mm-hmm.
But then what happens if, you know,
- ordinary people have access to it?
- Yes.
I think that is the scary part.
Especially when you think about, like,
"Oh, you just need a picture.
I can do anything I want,
- it's 3D now, it's photo-real."
- Yeah.
[phone line ringing]
- [John] This is John.
- Hey, John, it's Charlie. [laughs]
JOHN PACZKOWSKI
CHARLIE'S BOSS
[John] Oh, hey, Charlie!
[Charlie] So, I just puppeted
Donald Trump's face
with a bunch of artificial intelligence
and it was really disturbing.
It kind of freaked me out, honestly.
It's like a Brave New World.
[John] I humbly suggest
that "brave new world"
is the wrong terminology here.
[laughs]
Okay, yeah. We're all screwed.
How about that?
[John] I mean, obviously,
we need to find someone who's,
- you know, a real human
- [Charlie] Yeah.
[John] who's potentially
been impacted by this.
Yeah, I think so.
- All right.
- [John] Great work.
[Charlie]
Thanks, man. I'll talk to you soon.
PARAMUS, NJ
KATIE KRAUSZ
PHOTOSHOP VICTIM
[Katie] It was, um, April of 2011.
I had just turned 21,
and I started getting
a bunch of Facebook messages
of people telling me that there were
inappropriate pictures of me online.
So I Googled my name and my hometown
and the first picture that came up was
a photoshopped picture
of me where it wasn't
It was my head,
it was the rest of my body,
and the breasts were not mine.
KATIE KRAUSZ
CUTE, FROM PARAMUS NJ
It wasn't just one website,
it was a whole bunch,
and it was the same picture
used on every website.
The only thing the police department
told me to do was they said,
"Just stop Googling yourself and sooner
or later, it'll be further down the list."
Mm-hmm.
There's nothing they can do. It's already
on the internet, they can't take it off.
[Charlie] There is a law
against revenge porn where Katie lives,
but it's never been used
to challenge photoshopped images.
How often did you find yourself
going back and searching for it?
I used to stay up all hours
of the night, just watching it
and going to work or school the next day,
being so exhausted.
It sounds like it kind of
like, stopped your entire life
in its tracks.
Mm-hmm.
[Katie] I've always been more of like
a conservative dresser, as a whole.
I like sleeves, I like high cut things,
but now more so than ever.
[Charlie] Tell me about the effect
that this had on your family.
Having to deal
with my mom crying about it,
it's heartbreaking every time.
There's nothing really
you can do about that,
because as a parent,
I couldn't imagine
watching your child go through this.
I remember asking my dad. I said,
"You know, everyone at the firehouse
is gonna know now, like,
[crying]
and that's embarrassing."
And he said, "I'm proud of you,
no matter what."
[Charlie] Katie thinks she knows
who the perpetrator was.
Somebody she only casually knew
and never dated.
The photos are still online today.
VENICE BEACH, CA
The crudest possible version
of this technology impacted Katie's life.
But imagine how much worse
it would've been for Katie,
if that still image moved.
I'm a technologist. I've worked in
big software companies and small
software companies for my whole life.
When, um deepfakes came on the scene
in December of 2017,
I saw this is taking a lot
of cutting-edge research,
and packaging it up
into this piece of technology
that you can download and experiment with.
[Charlie] The deepfakes he's referring to
are a recent video manipulation phenomenon,
where female celebrities' faces
are seamlessly stitched
onto adult actresses' bodies
in pornographic videos.
So, I wanted to see what its capabilities
and its limitations are.
I, uh decided to pick John Oliver
and Jimmy Fallon to swap their faces.
JIMMY FALLON
LATE-NIGHT HOST
JOHN OLIVER
LATE-NIGHT HOST
So with Oliver and Fallon
the results are quite convincing.
I fed it lots and lots
of YouTube videos of them.
The algorithm what it's doing is
it's taking the video, it's breaking it up
into individual frames,
and then it's converting
each of those frames,
and then stitching it back together
into a video, and that's what you get.
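That frame-by-frame pipeline is easy to sketch. Here is a simplified Python version using OpenCV; swap_face() is a placeholder for the trained face-conversion model, which is the part all that training data actually builds.
```python
# Simplified sketch of the pipeline described above: break the video into
# individual frames, convert each frame, and stitch the result back into
# a video. swap_face() is a placeholder, not a real face-swap model.
import cv2

def swap_face(frame):
    """Placeholder for the learned step: detect the face, run it through
    the trained conversion model, and blend it back into the frame."""
    return frame  # identity stand-in for illustration

def convert_video(src_path: str, dst_path: str) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(dst_path,
                          cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    while True:
        ok, frame = cap.read()          # 1. break it up into frames
        if not ok:
            break
        out.write(swap_face(frame))     # 2. convert each frame, 3. restitch
    cap.release()
    out.release()
```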
ORIGINAL
SWAPPED
The quality is really good
because I have a lot of training data,
from which the algorithm can learn.
Which is why celebrities, or politicians,
or any public figures
are sort of you know, more susceptible.
You know, the same could be said
for anybody who has
a public Instagram account,
and lots and lots of selfies.
[Charlie] As the technology
keeps getting better,
detection becomes more difficult,
and the distribution, more widespread.
Are the major social media platforms
actually prepared for what's coming?
We reached out to some
of the, uh big tech companies,
you know Google, Facebook, Reddit,
to have a conversation about this,
to sit down and talk
out all the different problems
and the gray areas,
and they didn't agree to do that.
Instead they sent some
fairly generic statements.
"WE KNOW THAT ASSESSING VERACITY OF PHOTOS AND VIDEO ARE MORE NUANCED"
- FACEBOOK
"WE HAVE AND WILL CONTINUE TO EVOLVE OUR SITE WIDE POLICIES AS NECESSARY."
- REDDIT
SAN FRANCISCO, CA
So who is taking all this seriously?
A source of mine, Aviv Ovadya,
is an MIT grad
and he's been ringing alarm bells
about how this new technology
is being used to spread misinformation.
[Aviv] Whether you're in engineering
or a research lab,
people are really out there trying
to make a better world in their own way.
It's hard to hear that, "Oh, wait.
That interesting problem you're solving,
which might be really, really beneficial
to so many people,
might also destroy the world order."
[laughs]
If you could sit down with Mark Zuckerberg
or people at YouTube or Google right now,
and you could ask one thing of them
to do in the next year,
what would your top priority for them be?
We need this sort of
responsibility infrastructure
across everything.
So you don't have, one after another,
your tech CEOs, your researchers
being like, "Oh, I had no idea!"
If they're poor academics,
there should be some third-party
that they can go to
that's gonna be doing that work for them.
If they're a Facebook,
they need to have internal teams
that are focused exclusively on this.
You need these feedback loops in place,
and I don't think we have any of that now.
[Charlie] Very soon, it's possible
we won't be able to tell the difference
between real and fake videos.
And so the question becomes,
what happens when we can't trust
what we see and what we hear?
The potential of this technology
is both extraordinary and alarming.
But we aren't helpless in the face
of the coming information apocalypse.
What happens in the future
depends on what we do right now.