The Truth About Killer Robots (2018) Movie Script
Today's Sandy Speaks is going
to focus on my white people.
What I need you to understand
is that being black in America
is very, very hard.
Sandy had been arrested.
COP: I will light you up!
Get out!
SHANTE NEEDHAM:
How do you go from failure
to signal a lane change
to dead in jail
by alleged suicide?
GENEVA REED-VEAL:
I believe she let them know,
"I'll see you guys in court,"
and I believe they silenced her.
PROTESTERS: Sandra Bland!
WOMAN: Say her name!
Say her name!
ALL: Say her name!
Say her name!
(car humming)
(machine whirring)
(hydraulic hissing)
(hydraulic whirring)
(alarm blaring)
(siren wailing)
(beeping)
(in robotic voice):
This is the story
of automation,
and of the people
lost in the process.
Our story begins in
a small town in Germany.
(distant siren wailing)
(whirring)
(men speaking German)
(crackling)
(man speaking German)
(crackles)
(beeping)
Sven Khling:
It was quite a normal
day at my office,
and I heard from an informant
that an accident
had happened,
and I had to call
the spokesman of
the Volkswagen factory.
(ticking)
(beeping)
We have the old sentence,
we journalists,
"Dog bites a man
or a man bites a dog."
What's the news?
So, here is the same.
A man bites a dog:
a robot killed a man.
(ticking continues)
(beeping)
The spokesman said that
the accident happened,
but then he paused for a moment.
So, I...
think he didn't want
to say much more.
(rattling)
(beeps)
(man speaking German)
Khling:
The young worker
installed a robot cage,
and he told his colleague
to start the robot.
(whirring, clanking)
The robot took the man
and pressed him
against a metal wall,
so his chest was crushed.
(whirring, beeping)
(hissing)
(beeping)
(speaking German)
Kodomoroid:
The dead man's identity
was never made public.
The investigation
remained open for years.
Production continued.
(metronome ticking)
(speaking German)
Kodomoroid:
Automation of labor made
humans more robotic.
(ticking continuing)
(man 1 speaking German)
(man 2 speaking German)
(man 1 speaking)
(beeping)
Kodomoroid:
A robot is a machine
that operates automatically,
with human-like skill.
The term derives
from the Czech words
for worker and slave.
(whirring)
(drill whirring)
(drill whirring)
Hey, Annie.
(whirring)
(whirring)
(beeping)
Walter:
Well, we are engineers,
and we are really not
emotional guys.
But sometimes Annie
does something funny,
and that, of course,
invokes some amusement.
Especially when Annie
happens to press
one of the emergency stop
buttons by herself
and is then incapacitated.
We have some memories
of that happening
in situations...
and this was--
we have quite a laugh.
-(whirring)
-Okay.
Kodomoroid:
We became better at
learning by example.
You could simply show
us how to do something.
(Walter speaks German)
Walter:
When you are an engineer
in the field of automation,
you may face problems
with workers.
Sometimes,
they get angry at you
just by seeing you somewhere,
and shout at you,
"You are taking my job away."
"What? I'm not here
to take away your job."
But, yeah, sometimes
you get perceived that way.
But, I think,
in the field of human-robot
collaboration, where...
we actually are
working at the moment,
mostly is...
human-robot
collaboration is a thing
where we don't want
to replace a worker.
We want to support workers,
and we want to, yeah...
to, yeah...
(machinery rumbling)
(latches clinking)
(workers conversing
indistinctly)
(machine hisses)
(clicks)
(hissing)
Kodomoroid:
In order to work
alongside you,
we needed to know which lines
we could not cross.
(whirring)
(typing)
(thudding)
Stop. Stop. Stop.
(thuds)
Hi, I'm Simon Borgen.
We are with Dr. Isaac Asimov,
a biochemist, who may
be the most widely read
of all science fiction writers.
Kodomoroid:
In 1942,
Isaac Asimov created
a set of guidelines
to protect human society.
The first law was that a robot
couldn't hurt a human being,
or, through inaction,
allow a human being
to come to harm.
The second law was that
a robot had to obey
orders given it
by human beings,
provided that didn't
conflict with the first law.
Scientists say that
when robots are built,
that they may be built
according to these laws,
and also that almost all
science fiction writers
have adopted them as
well in their stories.
(whirring)
Sami Haddadin:
It's not just a statement
from a science fiction novel.
My dissertation's
name was actually,
Towards Safe Robots:
Approaching
Asimov's First Law.
How can we make robots
really fundamentally safe,
according to
Isaac Asimov's first law?
There was this accident
where a human worker
got crushed
by an industrial robot.
I was immediately
thinking that
the robot is an industrial,
classical robot,
not able to sense contact,
not able to interact.
Is this a robot
that we, kind of,
want to collaborate with?
No, it's not.
It's inherently forbidden.
We put them behind cages.
We don't want to interact
with them.
We put them behind cages
because they are dangerous,
because they are
inherently unsafe.
(whirring)
(clatters)
More than 10 years ago,
I did the first experiments
in really understanding, uh,
what does it mean
if a robot hits a human.
-(grunts)
-(laughter)
I put myself
as the first guinea pig.
I didn't want to go through
all the legal authorities.
I just wanted to know it,
and that night,
I decided at 6:00 p.m.,
when everybody's gone,
I'm gonna do these experiments.
And I took one of the students
to activate the camera,
and then I just did it.
-(smacks)
-(laughing)
(whirring)
A robot needs to understand
what does it mean to be safe,
what is it that potentially
could harm a human being,
and therefore, prevent that.
So, the next generation
of robots that is
now out there,
is fundamentally
designed for interaction.
(whirring, clicking)
(laughs)
Kodomoroid:
Eventually, it was time
to leave our cages.
(whirring)
(beeping)
(beeping)
Nourbakhsh:
Techno-optimism
is when we decide
to solve our problem
with technology.
Then we turn our attention
to the technology,
and we pay so much
attention to the technology,
we stop caring about
the sociological issue
we were trying to solve
in the first place.
We'll innovate our way
out of the problems.
Like, whether it's
agriculture or climate change
or whatever, terrorism.
If you think about
where robotic automation
and employment
displacement starts,
it basically goes back
to industrial automation
that was grand,
large-scale.
Things like welding
machines for cars,
that can move far faster
than a human arm can move.
So, they're doing a job
that increases the rate
at which the assembly
line can make cars.
It displaces some people,
but it massively increases
the GDP of the country
because productivity goes up
because the machines are
so much higher in productivity
terms than the people.
Narrator:
These giant grasshopper-looking
devices work all by themselves
on an automobile
assembly line.
They never complain
about the heat
or about the tedium
of the job.
Nourbakhsh:
Fast-forward to today,
and it's a different dynamic.
(grinding)
You can buy
milling machine robots
that can do all the things
a person can do,
but the milling machine
robot only costs $30,000.
We're talking about machines
now that are so cheap,
that they do exactly
what a human does,
for less money,
even in six months,
than the human costs.
(Wu Huifen speaking Chinese)
(whirring)
(beeping)
Kodomoroid:
After the first wave
of industrial automation,
the remaining
manufacturing jobs
required fine motor skills.
(bell ringing)
-(indistinct chatter)
-(ringing continuing)
(beeping, chimes)
(beeping, chimes)
Kodomoroid:
We helped factory owners
monitor their workers.
(beeping)
(beeping)
(beeping)
(beeping)
(beeping)
(sizzling)
(Li Zheng speaking Chinese)
(beeping)
(whirring)
(sizzling)
Kodomoroid:
Your advantage in
precision was temporary.
We took over
the complex tasks.
You moved to the end
of the production line.
(beeping)
(beeping)
(indistinct chatter)
(music playing on speaker)
(man speaking in Chinese)
(woman speaking Chinese)
(man speaking on speaker)
(man speaking Chinese)
(Luo Jun speaking Chinese)
(beeping)
(chickens clucking)
(chickens clucking)
(woman speaking Chinese)
(indistinct chatter)
(Wang Chao speaking Chinese)
(beeping)
(buzzing)
Kodomoroid:
Automation of the service sector
required your trust
and cooperation.
Man:
Here we are, stop-and-go
traffic on 271, and--
Ah, geez,
the car's doing it all itself.
What am I gonna do
with my hands down here?
(beeping)
(beeps)
And now,
it's on autosteer.
So, now I've gone
completely hands-free.
In the center area here
is where the big deal is.
This icon up to
the left is my TACC,
the Traffic-Aware
Cruise Control.
It does a great job of
keeping you in the lane,
and driving
down the road,
and keeping you safe,
and all that kind of stuff,
watching all
the other cars.
Autosteer is probably going
to do very, very poorly.
I'm in a turn
that's very sharp.
-(beeping)
-And, yep,
it said take control.
(horn honking)
-(Twitter whistles)
-(indistinct video audio)
(phone ringing)
Operator (on phone):
911, what is the address
of your emergency?
Man (on phone):
There was just a wreck.
A head-on collision right
here-- Oh my God almighty.
Operator:
Okay, sir, you're on 27?
Man:
Yes, sir.
Bobby Vankaveelar:
I had just got to work,
clocked in.
They get a phone call
from my sister,
telling me there was
a horrific accident.
That there was somebody
deceased in the front yard.
(beeping)
The Tesla was coming down
the hill of highway 27.
The sensor didn't read
the object in front of them,
which was the, um,
semi-trailer.
The Tesla went right
through the fence
that borders the highway,
through to the retention pond,
then came through
this side of the fence,
that borders my home.
(police radio chatter)
So, I parked
right near here
before I was asked
not to go any further.
I don't wanna see
what's in the veh-- you know,
what's in the vehicle.
You know,
what had happened to him.
(scanner beeping)
(indistinct chatter)
Donley:
After the police officer
come, they told me,
about 15 minutes
after he was here,
that it was a Tesla.
It was one of
the autonomous cars,
um, and that
they were investigating
why it did not pick up
or register that there
was a semi in front of it,
you know, and start braking,
'cause it didn't even--
You could tell from
the freeway up on top of
the hill it didn't even...
It didn't even recognize
that there was anything
in front of it.
It thought it was open road.
Donley: You might have
an opinion on a Tesla
accident we had out here.
(man speaking)
It was bad,
that's for sure.
I think people just rely
too much on the technology
and don't pay attention
themselves, you know,
to what's going on around them.
Like, since, like him,
he would've known
that there was an issue
if he wasn't relying on the car
to drive while he was
watching a movie.
The trooper had told me
that the driver
had been watching Harry Potter,
you know, at the time
of the accident.
A news crew from
Tampa, Florida, knocked
on the door, said,
"This is where the accident
happened with the Tesla?"
I said, "Yes, sir."
And he goes, "Do you know
what the significance is
in this accident?"
And I said,
"No, I sure don't."
And he said,
"It's the very first death,
ever, in a driverless car."
I said,
"Is it anybody local?"
And he goes, "Nobody
around here drives a Tesla."
Newsman:
...a deadly crash that's
raising safety concerns
for everyone in Florida.
Newswoman:
It comes as
the state is pushing
to become the nation's
testbed for driverless cars.
Newsman:
Tesla releasing a statement
that cars in autopilot
have safely driven
more than
130 million miles.
Paluska:
ABC Action News reporter
Michael Paluska,
in Williston, Florida,
tonight, digging for answers.
(beeping)
Paluska:
Big takeaway for
me at the scene was
it just didn't stop.
It was driving
down the road,
with the entire top
nearly sheared off,
with the driver dead
after he hit
the truck at 74 mph.
Why did the vehicle not
have an automatic shutoff?
That was my big question,
one of the questions
we asked Tesla,
that didn't get answered.
All of the statements
from Tesla were that
they're advancing
the autopilot system,
but everything was couched
with the fact that if one
percent of accidents drop
because that's the way
the autopilot system works,
then that's a win.
They kind of missed the mark,
really honoring
Joshua Brown's life,
and the fact that
he died driving a car
that he thought was
going to keep him safe,
at least safer than
the car that I'm driving,
which is a dumb car.
Vankaveelar:
To be okay with letting
a machine...
take you from
point A to point B,
and then you
actually get used to
getting from point A
to point B okay,
it-- you get, your mind
gets a little bit--
it's just my opinion, okay--
you just, your mind
gets lazier each time.
Kodomoroid:
The accident
was written off
as a case of human error.
(beeping)
Former centers of
manufacturing became
the testing grounds for
the new driverless taxis.
Nourbakhsh:
If you think
about what happens
when an autonomous
car hits somebody,
it gets really complicated.
The car company's
gonna get sued.
The sensor-maker's
gonna get sued
because they made
the sensor on the robot.
The regulatory framework
is always gonna be behind,
because robot invention
happens faster
than lawmakers can think.
Newswoman:
One of Uber's
self-driving vehicles
killed a pedestrian.
The vehicle was
in autonomous mode,
with an operator
behind the wheel
when the woman was hit.
Newswoman 2:
Tonight, Tesla confirming
this car was in autopilot mode
when it crashed
in Northern California,
killing the driver,
going on to blame
that highway barrier
that's meant
to reduce impact.
Kodomoroid:
After the first
self-driving car deaths,
testing of the new
taxis was suspended.
Nourbakhsh:
It's interesting when you
look at driverless cars.
You see the same kinds
of value arguments.
30,000 people die
every year in
run-off-road accidents
in the US alone.
So, don't we wanna
save all those lives?
Let's have cars
drive instead.
Now, you have
to start thinking
about the side effects
on society.
Are we getting rid of every
taxi driver in America?
Our driver partners are
the heart and soul
of this company
and the only reason we've come
this far in just five years.
Nourbakhsh:
If you look at Uber's
first five years,
they're actually
empowering people.
But when the same company
does really hardcore research
to now replace
all those people,
so they don't
need them anymore,
then what you're
seeing is
they're already a highly
profitable company,
but they simply want
to increase that profit.
(beeping)
(beeping)
Kodomoroid:
Eventually,
testing resumed.
Taxi drivers' wages became
increasingly unstable.
Newsman:
Police say a man drove up
to a gate outside city hall
and shot himself
in the head.
Newswoman:
He left a note saying
services such as Uber
had financially
ruined his life.
Newsman:
Uber and other
mobile app services
have made a once
well-paying industry
into a mass
of long hours, low pay,
and economic insecurity.
Kodomoroid:
Drivers were the biggest
part of the service economy.
(beeping)
Brandon Ackerman:
My father, he drove.
My uncle drove.
I kind of grew
up into trucking.
Some of the new
technology that came out
is taking a lot of
the freedom of the job away.
It's more stressful.
Kodomoroid:
Automation of trucking began
with monitoring the drivers
and simplifying their job.
There's a radar system.
There's a camera system.
There's automatic braking
and adaptive cruise.
Everything is controlled--
when you sleep,
how long you break,
where you drive,
where you fuel,
where you shut down.
It even knows if somebody
was in the passenger seat.
When data gets sent through
the broadband to the company,
sometimes,
you're put in a situation,
maybe because the truck
automatically slowed
you down on the hill
that's a perfectly
good straightaway,
slowed your
average speed down,
so you were one mile
shy of making it
to that safe haven,
and you have to...
get a-- take a chance
of shutting down
on the side of the road.
An inch is a mile out here.
Sometimes you just...
say to yourself, "Well,
I violate the clock one minute,
I might as well just
drive another 600 miles."
You know, but then you're...
you might lose your job.
We're concerned that it,
it's gonna reduce
the skill of
a truck driver and the pay.
Because you're not gonna
be really driving a truck.
It's gonna be the computer.
Some of us are worried
about losing our houses,
our cars...
having a place to stay.
Some people...
drive a truck just for
the medical insurance,
and a place to stay
and the ability to travel.
Kodomoroid:
Entire industries
disappeared,
leaving whole regions
in ruins.
Martin Ford:
Huge numbers of people
feel very viscerally
that they are being left
behind by the economy,
and, in fact,
they're right, they are.
People, of course,
would be more inclined
to point at globalization
or at, maybe, immigration
as being the problems,
but, actually, technology has
played a huge role already.
Kodomoroid:
The rise of
personal computers
ushered in an era
of digital automation.
(beeping)
Ford:
In the 1990s, I was running
a small software company.
Software was
a tangible product.
You had to put a CD in a box
and send it to a customer.
So, there was a lot of work
there for average people,
people that didn't necessarily
have lots of education.
But I saw in my own business
how that just evaporated
very, very rapidly.
Historically, people move
from farms to factories,
and then later on, of course,
factories automated,
and they off-shored,
and then people moved
to the service sector,
which is where most people
now work in the United States.
Julia Collins:
I lived on a water buffalo
farm in the south of Italy.
We had 1,000 water buffalo,
and every buffalo
had a different name.
And they were all these
beautiful Italian names
like Tiara, Katerina.
And so, I thought
it would be fun
to do the same thing
with our robots at Zume.
The first two
robots that we have
are named Giorgio and Pepe,
and they dispense sauce.
And then the next robot,
Marta, she's
a FlexPicker robot.
She looks like
a gigantic spider.
And what this robot does
is spread the sauce.
Then we have Bruno.
This is an incredibly
powerful robot,
but he also has
to be very delicate,
so that he can take pizza
off of the assembly line,
and put it into
the 800-degree oven.
And the robot can do this
10,000 times in a day.
Lots of people have
used automation
to create food at scale,
making 10,000 cheese pizzas.
What we're doing is
developing a line
that can respond dynamically
to every single customer
order, in real time.
That hasn't been done before.
So, as you can see right now,
Jose will use the press,
but then he still has to work
the dough with his hands.
So, this is a step that's
not quite optimized yet.
We have a fifth robot
that's getting fabricated
at a shop across the bay.
He's called Vincenzo.
He takes pizza out,
and puts it into
an individual mobile oven
for transport and delivery.
Ford:
Any kind of work that is
fundamentally routine and
repetitive is gonna disappear,
and we're simply not equipped
to deal with that politically,
because maybe
the most toxic word
in our political vocabulary
is redistribution.
There aren't gonna be
any rising new sectors
that are gonna be there
to absorb all these workers
in the way that, for example,
that manufacturing was there
to absorb all those
agricultural workers
because AI is going
to be everywhere.
Kodomoroid:
Artificial intelligence
arrived in small steps.
Profits from AI
concentrated
in the hands of
the technology owners.
Income inequality
reached extreme levels.
-(beeping)
-The touchscreen made
most service work obsolete.
Tim Hwang:
After I graduated college,
I had a friend who had
just gone to law school.
He was like, "Aw, man,
the first year of law school,
it's super depressing."
All we're doing is
really rote, rote stuff.
Reading through documents
and looking for a single word,
or I spent the whole afternoon
replacing this word
with another word.
And as someone with a kind of
computer science background,
I was like,
"There's so much here
that could be automated."
(beeping)
So, I saw law school
as very much going three
years behind enemy lines.
I took the bar exam,
became a licensed lawyer,
and went to a law firm,
doing largely
transactional law.
And there, my project was
how much can
I automate of my own job?
During the day, I would
manually do this task,
and then at night,
I would go home,
take these legal rules and say,
could I create
a computer rule,
a software rule,
that would do what
I did during the day?
In a lawsuit,
you get to see a lot
of the evidence
that the other
side's gonna present.
That amount of
documentation is huge.
And the old way was actually,
you would send an attorney
to go and look through
every single page
that was in that room.
The legal profession works
on an hourly billing system.
So, I ended up in a kind of
interesting conundrum,
where what I was doing
was making me more
and more efficient,
I was doing more
and more work,
but I was expending
less and less time on it.
And I realized that this would
become a problem at some point,
so I decided to go
independent. I quit.
So, there's Apollo Cluster,
who has processed
more than 10 million unique
transactions for clients,
and we have another
partner, Daria,
who focuses on transactions,
and then, and then there's me.
Our systems have generated
tens of thousands
of legal documents.
-(beeping)
-It's signed off by a lawyer,
but largely, kind of, created
and mechanized by our systems.
I'm fairly confident that
compared against human work,
it would be indistinguishable.
(beeping)
(whirring)
(beeping)
(Ishiguro speaking)
(whirring)
(robot speaking in Japanese)
(speaking Japanese)
(indistinct chatter)
(giggles)
(beeping)
(Hideaki speaking in Japanese)
(Robot speaking Japanese)
(Hideaki speaking Japanese)
(woman speaking Japanese)
(whirs, beeps)
(beeping)
(Niigaki speaking Japanese)
(beeping)
(whirring)
(robot speaking Japanese)
(speaking Japanese)
(jazzy piano music playing)
-(beeps)
-(lock clicks)
(piano music continuing)
(automated voice
speaking Japanese)
Kodomoroid:
When we first appeared,
we were a novelty.
(man speaking Japanese)
(automated voice
speaking Japanese)
(automated voice
speaking Japanese)
(buzzing)
Kodomoroid:
While doing
your dirty work,
we gathered data
about your habits
-and preferences.
-(humming)
We got to know you better.
(buzzing)
(beeping)
Savvides:
The core of everything
we're doing in this lab,
with our long-range
iris system
is trying to
develop technology
so that the computer
can identify who we
are in a seamless way.
And up till now,
we always have to make
an effort to be identified.
-(beeping)
-All the systems were very
close-range, Hollywood-style,
where you had to go
close to the camera,
and I always found that
challenging for a user.
If I was a user
interacting with this...
system, with this computer,
with this AI,
I don't wanna be that close.
I feel it's very invasive.
So, what I wanted to
solve with my team here
is how can we capture
and identify who you
are from the iris,
at a bigger distance?
How can we still do that,
and have a pleasant
user experience.
I think there's
a very negative stigma
when people think about
biometrics and facial
recognition,
and any kind of sort
of profiling of users
for marketing purposes to
buy a particular product.
I think the core
science is neutral.
Nourbakhsh:
Companies go to no end
to try and figure out
how to sell stuff.
And the more information
they have on us,
the better they
can sell us stuff.
(beeping)
We've reached a point where,
for the first time,
robots are able to see.
They can recognize faces.
They can recognize
the expressions you make.
They can recognize
the microexpressions you make.
You can develop individualized
models of behavior
for every person on Earth,
attach machine learning to it,
and come out with the perfect
model for how to sell to you.
(door squeaks)
(beeping)
Kodomoroid:
You gave us your
undivided attention.
(whirring)
We offered reliable,
friendly service.
Human capacities
began to deteriorate.
Spatial orientation and memory
were affected first.
The physical world
and the digital world
became one.
(neon sign buzzing)
You were alone
with your desires.
("What You Gonna Do Now?"
by Carla dal Forno playing)
What you gonna do now
That the night's come
and it's around you?
What you gonna do now
That the night's come
and it surrounds you?
What you gonna do now
That the night's come
and it surrounds you?
(buzzing)
Kodomoroid:
Automation brought
the logic of efficiency
to matters of life and death.
Protesters:
Enough is enough!
Enough is enough!
Enough is enough!
-(gunfire)
-(screaming)
Police Radio:
To all SWAT officers
on channel 2, code 3...
Get back! Get back!
-(gunfire)
-Police Radio:
The suspect has a rifle.
-(police radio chatter)
-(sirens)
(gunfire)
(sirens wailing)
Police Radio:
We have got to get
(unintelligible) down here...
... right now!
(chatter continues)
Man:
There's a fucking sniper!
He shot four cops!
(gunfire)
Woman:
I'm not going near him!
He's shooting right now!
-(sirens continue)
-(gunfire)
Police Radio:
Looks like he's inside
the El Centro building.
-Inside the El Centro building.
-(radio beeps)
-(gunfire)
-(helicopter whirring)
(indistinct chatter)
Police Radio:
We may have
a suspect pinned down.
-Northwest corner
of the building.
-(radio beeps)
Chris Webb:
Our armored car was
moving in to El Centro
and so I jumped on the back.
(beeping)
(indistinct chatter)
Came in through the rotunda,
where I found two of our
intelligence officers.
They were calm
and cool and they said,
"Everything's upstairs."
-There's a stairwell right here.
-(door squeaks)
That's how I knew I was
going the right direction
'cause I just kept
following the blood.
Newswoman:
Investigators say
Micah Johnson was
amassing an arsenal
at his home outside Dallas.
Johnson was
an Afghan war veteran.
Every one of these
door handles,
as we worked our way down,
had blood on them,
where he'd been checking them.
Newswoman:
This was a scene of terror
just a couple of hours ago,
and it's not over yet.
(helicopter whirring)
(police radio chatter)
Webb:
He was hiding behind,
like, a server room.
Our ballistic tip rounds
were getting eaten up.
(gunfire)
He was just hanging
the gun out on the corner
and just firing at the guys.
(siren blares)
(gunfire)
And he kept enticing them.
"Hey, come on down!
Come and get me! Let's go.
Let's get this over with."
Brown:
This suspect we're negotiating
with for the last 45 minutes
has been exchanging
gunfire with us
and not being very cooperative
in the negotiations.
Before I came here,
I asked for plans
to end this standoff,
and as soon as
I'm done here,
I'll be presented
with those plans.
(police radio chatter)
Webb:
Our team came up with the plan.
Let's just blow him up.
We had recently got
a hand-me-down robot
from the Dallas ATF office,
and so we were using it a lot.
(whirring)
(beeping)
It was our bomb squad's robot,
but they didn't wanna have
anything to do with what
we were doing with it.
The plan was to
set a charge off right on
top of this guy and kill him.
And some people
just don't wanna...
don't wanna do that.
We saw no other option
but to use our
bomb r-- bomb robot
and place a device
on its... extension.
Webb:
He wanted something
to listen to music on,
and so that was
a way for us to...
to hide the robot
coming down the hall.
"Okay, we'll bring
you some music.
Hang on, let us get
this thing together."
(ticking)
It had a trash bag
over the charge
to kinda hide the fact
that there was,
you know, pound and a quarter
of C4 at the end of it.
The minute the robot
got in position,
the charge was detonated.
(boom)
(high-pitched ringing)
(muted gunfire)
He had gone down with his
finger on the trigger,
and he was kinda hunched over.
A piece of the robot
hand had broken off,
and hit his skull, which
caused a small laceration,
which was what was bleeding.
So, I just squeezed
through the,
the little opening that...
that the charge had
caused in the drywall,
and separated him from the gun,
and then we called up
the bomb squad to come in,
and start their search
to make sure it was safe,
that he wasn't sitting
on any explosives.
Newsman:
The sniper hit
11 police officers,
at least five of
whom are now dead,
making it the deadliest
day in law enforcement
since September 11th.
They blew him up with a bomb
attached to a robot,
that was actually built to
protect people from bombs.
Newsman:
It's a tactic straight from
America's wars in Iraq
and Afghanistan...
Newsman 2:
The question for SWAT teams
nationwide is whether Dallas
marks a watershed moment
in police tactics.
Kodomoroid:
That night in Dallas,
a line was crossed.
A robot must obey
orders given it by
qualified personnel,
unless those orders violate
rule number one. In other words,
a robot can't be ordered
to kill a human being.
Things are moving so quickly,
that it's unsafe to go
forward blindly anymore.
One must try to foresee
where it is that one is
going as much as possible.
Savvides:
We built the system
for the DOD,
(indistinct chatter)
and it was something that
could help the soldiers
try to do iris
recognition in the field.
We have collaborations
with law enforcement
where they can test
their algorithms,
and then give us feedback.
It's always a face
behind a face, partial face.
A face will be masked.
Even if there's
occlusion,
it still finds
the face.
Nourbakhsh:
One of the ways we're
trying to make autonomous,
war-fighting machines now
is by using computer vision
and guns together.
(beeping)
You make a database of
the images of known terrorists,
and you tell the machine
to lurk and look for them,
and when it matches a face
to its database, shoot.
(gunfire)
(gunfire)
Those are robots that are
deciding to harm somebody,
and that goes directly
against Asimov's first law.
A robot may never harm a human.
Every time we make a machine
that's not really as
intelligent as a human,
it's gonna get misused.
And that's exactly where
Asimov's laws get muddy.
This is, sort of,
the best image,
but it's really out
of focus. It's blurry.
There's occlusion due
to facial hair, hat,
he's holding
a cell phone...
So, we took that and we
reconstructed this,
which is what we sent to
law enforcement at 2:42 AM.
To get the eye coordinates,
we crop out
the periocular region,
which is the region
around the eyes.
We reconstruct the whole
face based on this region.
We run the whole face
against a matcher.
And so this is what it
comes up with as a match.
This is a reasonable
face you would expect
that would make sense, right?
Our brain does
a natural hallucination
of what it doesn't see.
It's just, how do we get
the computer to do the same thing?
(police radio chatter)
Man:
Five days ago, the soul
of our city was pierced
when police officers
were ambushed
in a cowardly attack.
Webb:
July 7th for me,
personally, was just,
kinda like, I think
it got all I had left.
I mean I'm like, I just,
I don't have a lot more to give.
It's just not worth it.
-(applause)
-(music playing)
Thank you.
Thank you.
I think our chief of police
did exactly what we
all wanted him to do,
and he said the right things.
These five men
gave their lives
for all of us.
Unfortunately, our chief
told our city council,
"We don't need more officers.
We need more technology."
He specifically said
that to city council.
(police radio chatter)
In this day and age,
success of a police chief
is based on response times
and crime stats.
(beeping)
And so, that becomes the focus
of the chain of command.
So, a form of automation
in law enforcement
is just driving everything
based on statistics and numbers.
What I've lost in all
that number chasing
is the interpersonal
relationship between the officer
and the community that
that officer is serving.
The best times in police work
are when you got to go out
and meet people and get
to know your community,
and go get to know
the businesses on your beat.
And, at least in Dallas,
that's gone.
We become less personal,
and more robotic.
Which is a shame
because it's supposed to
be me interacting with you.
(Zhen Jiajia speaking Chinese)
(beeping)
(typing)
(office chatter)
(beeping)
(automated voice
speaking Chinese)
(beeps)
(beep, music playing)
(sighs)
Kodomoroid:
You worked to improve
our abilities.
Some worried that one day,
we would surpass you,
but the real milestone
was elsewhere.
The dynamic between
robots and humans changed.
You could no longer
tell where you ended,
and we began.
(chatter, laughter)
John Campbell:
It seems to me that what's so
valuable about our society,
what's so valuable about
our lives together,
is something that we
do not want automated.
What most of us value,
probably more than
anything else,
is the idea of authentic
connection with another person.
(beeping)
And then, we say, "No,
but we can automate this."
"We have a robot that...
"it will listen
sympathetically to you.
"It will make eye contact.
"It remembers everything you
ever told it, cross-indexes."
-When you see a robot that
-(beep)
its ingenious creator
has carefully designed
to pull out your
empathetic responses,
it's acting as if it's in pain.
The biggest danger there
is the discrediting
of our empathetic responses.
Where the empathizing-with-pain
reflex is discredited.
If I'm going to override it,
if I'm not going to
take it seriously
in the case
of the robot,
then I have to go back
and think again
as to why I take it
seriously in the case of
helping you when
you are badly hurt.
It undermines the only
thing that matters to us.
("Hikkoshi"
by Maki Asakawa playing)
Kodomoroid:
And so, we lived among you.
Our numbers grew.
Automation continued.
(woman singing in Japanese)
(speaking Japanese)
(beeping)
Kodomoroid:
A robot is a machine
that operates automatically,
with human-like skill.
The term derives
from the Czech words
for worker and slave.
(whirring)
(drill whirring)
(drill whirring)
Hey, Annie.
(whirring)
(whirring)
(beeping)
Walter:
Well, we are engineers,
and we are really not
emotional guys.
But sometimes Annie
does something funny,
and that, of course,
invokes some amusement.
Especially when Annie
happens to press
one of the emergency stop
buttons by herself
and is then incapacitated.
We have some memories
of that happening
in situations...
and this was--
we have quite a laugh.
-(whirring)
-Okay.
Kodomoroid:
We became better at
learning by example.
You could simply show
us how to do something.
(Walter speaks German)
Walter:
When you are an engineer
in the field of automation,
you may face problems
with workers.
Sometimes,
they get angry at you
just by seeing you somewhere,
and shout at you,
"You are taking my job away."
"What? I'm not here
to take away your job."
But, yeah, sometimes
you get perceived that way.
But, I think,
in the field of human-robot
collaboration, where...
we actually are
working at the moment,
mostly is...
human-robot
collaboration is a thing
where we don't want
to replace a worker.
We want to support workers,
and we want to, yeah...
to, yeah...
(machinery rumbling)
(latches clinking)
(workers conversing
indistinctly)
(machine hisses)
(clicks)
(hissing)
Kodomoroid:
In order to work
alongside you,
we needed to know which lines
we could not cross.
(whirring)
(typing)
(thudding)
Stop. Stop. Stop.
(thuds)
Hi, I'm Simon Borgen.
We are with Dr. Isaac Asimov,
a biochemist, who may
be the most widely read
of all science fiction writers.
Kodomoroid:
In 1942,
Isaac Asimov created
a set of guidelines
to protect human society.
The first law was that a robot
couldn't hurt a human being,
or, through inaction,
allow a human being
to come to harm.
The second law was that
a robot had to obey
orders given it
by human beings,
provided that didn't
conflict with the first law.
Scientists say that
when robots are built,
that they may be built
according to these laws,
and also that almost all
science fiction writers
have adopted them as
well in their stories.
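The precedence the narrator describes — obedience always yields to the prohibition on harm — amounts to a simple ordering of rules. A minimal sketch of that ordering (my framing, not Asimov's text):

```python
def permitted(action_harms_human: bool, ordered_by_human: bool) -> bool:
    """First Law outranks Second: an order never licenses harm."""
    if action_harms_human:
        return False              # First Law: a robot may not injure a human being
    return ordered_by_human       # Second Law: otherwise, obey human orders
```

Under this ordering, no order can make harming a human permissible — the constraint the Dallas narration later describes as a line being crossed.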
(whirring)
Sami Haddadin:
It's not just a statement
from a science fiction novel.
My dissertation's
name was actually,
Towards Safe Robots,
Approaching
Asimov's First Law.
How can we make robots
really fundamentally safe,
according to
Isaac Asimov's first law.
There was this accident
where a human worker
got crushed
by an industrial robot.
I was immediately
thinking that
the robot is an industrial,
classical robot,
not able to sense contact,
not able to interact.
Is this a robot
that we, kind of,
want to collaborate with?
No, it's not.
It's inherently forbidden.
We put them behind cages.
We don't want to interact
with them.
We put them behind cages
because they are dangerous,
because they are
inherently unsafe.
(whirring)
(clatters)
More than 10 years ago,
I did the first experiments
in really understanding, uh,
what does it mean
if a robot hits a human.
-(grunts)
-(laughter)
I put myself
as the first guinea pig.
I didn't want to go through
all the legal authorities.
I just wanted to know it,
and that night,
I decided at 6:00 p.m.,
when everybody's gone,
I'm gonna do these experiments.
And I took one of the students
to activate the camera,
and then I just did it.
-(smacks)
-(laughing)
(whirring)
A robot needs to understand
what does it mean to be safe,
what is it that potentially
could harm a human being,
and therefore, prevent that.
So, the next generation
of robots that is
now out there,
is fundamentally
designed for interaction.
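Haddadin's requirement — that a robot understand what level of contact could harm a person and prevent it — comes down to closing a control loop around sensed contact force. A toy sketch; the 140 N threshold is a placeholder for illustration, not a certified safety value:

```python
INJURY_THRESHOLD_N = 140.0   # placeholder contact-force limit, not a standard value

def safe_velocity(sensed_force_n: float, commanded_velocity: float) -> float:
    """Scale commanded speed down as contact force rises; stop at the threshold."""
    if sensed_force_n >= INJURY_THRESHOLD_N:
        return 0.0                                    # emergency stop on hard contact
    remaining = 1.0 - sensed_force_n / INJURY_THRESHOLD_N
    return commanded_velocity * remaining             # gentler as contact grows
```

A classical industrial arm, unable to sense contact at all, has no such loop — which is why it stays behind a cage.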
(whirring, clicking)
(laughs)
Kodomoroid:
Eventually, it was time
to leave our cages.
(whirring)
(beeping)
(beeping)
Nourbakhsh:
Techno-optimism
is when we decide
to solve our problem
with technology.
Then we turn our attention
to the technology,
and we pay so much
attention to the technology,
we stop caring about
the sociological issue
we were trying to solve
in the first place.
We'll innovate our way
out of the problems.
Like, whether it's
agriculture or climate change
or whatever, terrorism.
If you think about
where robotic automation
and employment
displacement starts,
it basically goes back
to industrial automation
that was grand,
large-scale.
Things like welding
machines for cars,
that can move far faster
than a human arm can move.
So, they're doing a job
that increases the rate
at which the assembly
line can make cars.
It displaces some people,
but it massively increases
the GDP of the country
because productivity goes up
because the machines are
so much higher in productivity
terms than the people.
Narrator:
These giant grasshopper-looking
devices work all by themselves
on an automobile
assembly line.
They never complain
about the heat
or about the tedium
of the job.
Nourbakhsh:
Fast-forward to today,
and it's a different dynamic.
(grinding)
You can buy
milling machine robots
that can do all the things
a person can do,
but the milling machine
robot only costs $30,000.
We're talking about machines
now that are so cheap,
that they do exactly
what a human does,
with less money,
even in six months,
than the human costs.
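Nourbakhsh's comparison — a $30,000 milling robot against a worker's cost — is a payback-period calculation. A sketch with an assumed, illustrative monthly labor cost:

```python
def payback_months(machine_cost: float, monthly_labor_cost: float,
                   monthly_upkeep: float = 0.0) -> float:
    """Months until the machine's price is recovered from displaced labor cost."""
    saving = monthly_labor_cost - monthly_upkeep
    if saving <= 0:
        raise ValueError("the machine never pays for itself")
    return machine_cost / saving

# A $30,000 machine against an assumed $5,000/month fully loaded wage
# pays for itself in six months -- the horizon quoted above.
```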
(Wu Huifen speaking Chinese)
(whirring)
(beeping)
Kodomoroid:
After the first wave
of industrial automation,
the remaining
manufacturing jobs
required fine motor skills.
(bell ringing)
-(indistinct chatter)
-(ringing continuing)
(beeping, chimes)
(beeping, chimes)
Kodomoroid:
We helped factory owners
monitor their workers.
(beeping)
(beeping)
(beeping)
(beeping)
(beeping)
(sizzling)
(Li Zheng speaking Chinese)
(beeping)
(whirring)
(sizzling)
Kodomoroid:
Your advantage in
precision was temporary.
We took over
the complex tasks.
You moved to the end
of the production line.
(beeping)
(beeping)
(indistinct chatter)
(music playing on speaker)
(man speaking in Chinese)
(woman speaking Chinese)
(man speaking on speaker)
(man speaking Chinese)
(Luo Jun speaking Chinese)
(beeping)
(chickens clucking)
(chickens clucking)
(woman speaking Chinese)
(indistinct chatter)
(Wang Chao speaking Chinese)
(beeping)
(buzzing)
Automation of the service sector
required your trust
and cooperation.
Man:
Here we are, stop-and-go
traffic on 271, and--
Ah, geez,
the car's doing it all itself.
What am I gonna do
with my hands down here?
(beeping)
(beeps)
And now,
it's on autosteer.
So, now I've gone
completely hands-free.
In the center area here
is where the big deal is.
This icon up to
the left is my TACC,
the Traffic-Aware
Cruise Control.
It does a great job of
keeping you in the lane,
and driving
down the road,
and keeping you safe,
and all that kind of stuff,
watching all
the other cars.
Autosteer is probably going
to do very, very poorly.
I'm in a turn
that's very sharp.
-(beeping)
-And, yep,
it said take control.
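What the driver demonstrates — autosteer holding the lane until a sharp turn exceeds its envelope, then demanding a takeover — is, in outline, a limit check. A toy model; the curvature limit is invented for illustration, not Tesla's actual logic:

```python
MAX_CURVATURE = 0.02  # 1/metres; an invented design-envelope limit

def autosteer_step(lane_curvature: float, hands_on_wheel: bool) -> str:
    """Steer within the design envelope; hand back to the driver beyond it."""
    if lane_curvature > MAX_CURVATURE:
        return "TAKE CONTROL"            # the alert the driver sees in the sharp turn
    if not hands_on_wheel:
        return "steering (hands-free)"
    return "steering"
```

The whole safety argument rests on the human actually taking over when that string appears.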
(horn honking)
-(Twitter whistles)
-(indistinct video audio)
(phone ringing)
Operator (on phone):
911, what is the address
of your emergency?
Man (on phone):
There was just a wreck.
A head-on collision right
here-- Oh my God almighty.
Operator:
Okay, sir, you're on 27?
Man:
Yes, sir.
Bobby Vankaveelar:
I had just got to work,
clocked in.
They get a phone call
from my sister,
telling me there was
a horrific accident.
That there was somebody
deceased in the front yard.
(beeping)
The Tesla was coming down
the hill of highway 27.
The sensor didn't read
the object in front of them,
which was the, um,
semi-trailer.
The Tesla went right
through the fence
that borders the highway,
through to the retention pond,
then came through
this side of the fence,
that borders my home.
(police radio chatter)
So, I parked
right near here
before I was asked
not to go any further.
I don't wanna see
what's in the veh-- you know,
what's in the vehicle.
You know,
what had happened to him.
(scanner beeping)
(indistinct chatter)
Donley:
After the police officer
come, they told me,
about 15 minutes
after he was here,
that it was a Tesla.
It was one of
the autonomous cars,
um, and that
they were investigating
why it did not pick up
or register that there
was a semi in front of it,
you know, and start braking,
'cause it didn't even--
You could tell from
the frameway up on top of
the hill it didn't even...
It didn't even recognize
that there was anything
in front of it.
It thought it was open road.
Donley: You might have
an opinion on a Tesla
accident we had out here.
(man speaking)
It was bad,
that's for sure.
I think people just rely
too much on the technology
and don't pay attention
themselves, you know,
to what's going on around them.
Like, since, like him,
he would've known
that there was an issue
if he wasn't relying on the car
to drive while he was
watching a movie.
The trooper had told me
that the driver
had been watching Harry Potter,
you know, at the time
of the accident.
A news crew from
Tampa, Florida, knocked
on the door, said,
"This is where the accident
happened with the Tesla?"
I said, "Yes, sir."
And he goes, "Do you know
what the significance is
in this accident?"
And I said,
"No, I sure don't."
And he said,
"It's the very first death,
ever, in a driverless car."
I said,
"Is it anybody local?"
And he goes, "Nobody
around here drives a Tesla."
Newsman:
...a deadly crash that's
raising safety concerns
for everyone in Florida.
Newswoman:
It comes as
the state is pushing
to become the nation's
testbed for driverless cars.
Newsman:
Tesla releasing a statement
that cars in autopilot
have safely driven
more than
130 million miles.
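The "130 million miles" statement is implicitly a rate comparison. Worked out below — the US average of roughly 1.1 deaths per 100 million vehicle-miles is an approximate contemporary figure I am assuming, and a rate estimated from a single fatality is statistically fragile:

```python
def deaths_per_100m_miles(deaths: int, miles: float) -> float:
    """Fatality rate normalized to deaths per 100 million miles driven."""
    return deaths / (miles / 100_000_000)

autopilot_rate = deaths_per_100m_miles(1, 130_000_000)  # ~0.77, from Tesla's figure
US_AVERAGE = 1.1  # approx. US deaths per 100M vehicle-miles (assumed round figure)
```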
Paluska:
ABC Action News reporter
Michael Paluska,
in Williston, Florida,
tonight, digging for answers.
(beeping)
Paluska:
Big takeaway for
me at the scene was
it just didn't stop.
It was driving
down the road,
with the entire top
nearly sheared off,
with the driver dead
after he hit
the truck at 74 mph.
Why did the vehicle not
have an automatic shutoff?
That was my big question,
one of the questions
we asked Tesla,
that didn't get answered.
All of the statements
from Tesla were that
they're advancing
the autopilot system,
but everything was couched
with the fact that if one
percent of accidents drop
because that's the way
the autopilot system works,
then that's a win.
They kind of missed the mark,
really honoring
Joshua Brown's life,
and the fact that
he died driving a car
that he thought was
going to keep him safe,
at least safer than
the car that I'm driving,
which is a dumb car.
Vankaveelar:
To be okay with letting
a machine...
take you from
point A to point B,
and then you
actually get used to
getting from point A
to point B okay,
it-- you get, your mind
gets a little bit--
it's just my opinion, okay--
you just, your mind
gets lazier each time.
Kodomoroid:
The accident
was written off
as a case of human error.
(beeping)
Former centers of
manufacturing became
the testing grounds for
the new driverless taxis.
Nourbakhsh:
If you think
about what happens
when an autonomous
car hits somebody,
it gets really complicated.
The car company's
gonna get sued.
The sensor-maker's
gonna get sued
because they made
the sensor on the robot.
The regulatory framework
is always gonna be behind,
because robot invention
happens faster
than lawmakers can think.
Newswoman:
One of Uber's
self-driving vehicles
killed a pedestrian.
The vehicle was
in autonomous mode,
with an operator
behind the wheel
when the woman was hit.
Newswoman 2:
Tonight, Tesla confirming
this car was in autopilot mode
when it crashed
in Northern California,
killing the driver,
going on to blame
that highway barrier
that's meant
to reduce impact.
Kodomoroid:
After the first
self-driving car deaths,
testing of the new
taxis was suspended.
Nourbakhsh:
It's interesting when you
look at driverless cars.
You see the same kinds
of value arguments.
30,000 people die
every year,
run-off-road accidents
in the US alone.
So, don't we wanna
save all those lives?
Let's have cars
drive instead.
Now, you have
to start thinking
about the side effects
on society.
Are we getting rid of every
taxi driver in America?
Our driver partners are
the heart and soul
of this company
and the only reason we've come
this far in just five years.
Nourbakhsh:
If you look at Uber's
first five years,
they're actually
empowering people.
But when the same company
does really hardcore research
to now replace
all those people,
so they don't
need them anymore,
then what you're
seeing is
they're already a highly
profitable company,
but they simply want
to increase that profit.
(beeping)
(beeping)
Kodomoroid:
Eventually,
testing resumed.
Taxi drivers' wages became
increasingly unstable.
Newsman:
Police say a man drove up
to a gate outside city hall
and shot himself
in the head.
Newswoman:
He left a note saying
services such as Uber
had financially
ruined his life.
Newsman:
Uber and other
mobile app services
have made a once
well-paying industry
into a mass
of long hours, low pay,
and economic insecurity.
Kodomoroid:
Drivers were the biggest
part of the service economy.
(beeping)
Brandon Ackerman:
My father, he drove.
My uncle drove.
I kind of grew
up into trucking.
Some of the new
technology that came out
is taking a lot of
the freedom of the job away.
It's more stressful.
Kodomoroid:
Automation of trucking began
with monitoring the drivers
and simplifying their job.
There's a radar system.
There's a camera system.
There's automatic braking
and adaptive cruise.
Everything is controlled--
when you sleep,
how long you break,
where you drive,
where you fuel,
where you shut down.
It even knows if somebody
was in the passenger seat.
When data gets sent through
the broadband to the company,
sometimes,
you're put in a situation,
maybe because the truck
automatically slowed
you down on the hill
that's a perfectly
good straightaway,
slowed your
average speed down,
so you were one mile
shy of making it
to that safe haven,
and you have to...
get a-- take a chance
of shutting down
on the side of the road.
An inch is a mile out here.
Sometimes you just...
say to yourself, "Well,
I violate the clock one minute,
I might as well just
drive another 600 miles."
You know, but then you're...
you might lose your job.
We're concerned that it,
it's gonna reduce
the skill of
a truck driver and the pay.
Because you're not gonna
be really driving a truck.
It's gonna be the computer.
Some of us are worried
about losing our houses,
our cars...
having a place to stay.
Some people...
drive a truck just for
the medical insurance,
and a place to stay
and the ability to travel.
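The "clock" the driver describes violating by one minute is the federal hours-of-service limit that the truck's computer enforces. A simplified sketch of the dilemma — US rules cap driving at 11 hours, and the other provisions (duty windows, required breaks) are omitted for brevity:

```python
DRIVING_LIMIT_H = 11.0  # US hours-of-service driving cap (simplified)

def can_reach_safe_haven(hours_driven: float, miles_to_safe_haven: float,
                         avg_speed_mph: float) -> bool:
    """True if the next stop is reachable before the driving clock runs out.
    A truck that automatically slows on a hill lowers avg_speed_mph -- and can
    flip this from True to False, stranding the driver on the shoulder."""
    hours_left = DRIVING_LIMIT_H - hours_driven
    if hours_left <= 0:
        return False
    return miles_to_safe_haven / avg_speed_mph <= hours_left
```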
Kodomoroid:
Entire industries
disappeared,
leaving whole regions
in ruins.
Martin Ford:
Huge numbers of people
feel very viscerally
that they are being left
behind by the economy,
and, in fact,
they're right, they are.
People, of course,
would be more inclined
to point at globalization
or at, maybe, immigration
as being the problems,
but, actually, technology has
played a huge role already.
Kodomoroid:
The rise of
personal computers
ushered in an era
of digital automation.
(beeping)
Ford:
In the 1990s, I was running
a small software company.
Software was
a tangible product.
You had to put a CD in a box
and send it to a customer.
So, there was a lot of work
there for average people,
people that didn't necessarily
have lots of education.
But I saw in my own business
how that just evaporated
very, very rapidly.
Historically, people move
from farms to factories,
and then later on, of course,
factories automated,
and they off-shored,
and then people moved
to the service sector,
which is where most people
now work in the United States.
Julia Collins:
I lived on a water buffalo
farm in the south of Italy.
We had 1,000 water buffalo,
and every buffalo
had a different name.
And they were all these
beautiful Italian names
like Tiara, Katerina.
And so, I thought
it would be fun
to do the same thing
with our robots at Zume.
The first two
robots that we have
are named Giorgio and Pepe,
and they dispense sauce.
And then the next robot,
Marta, she's
a FlexPicker robot.
She looks like
a gigantic spider.
And what this robot does
is spread the sauce.
Then we have Bruno.
This is an incredibly
powerful robot,
but he also has
to be very delicate,
so that he can take pizza
off of the assembly line,
and put it into
the 800-degree oven.
And the robot can do this
10,000 times in a day.
Lots of people have
used automation
to create food at scale,
making 10,000 cheese pizzas.
What we're doing is
developing a line
that can respond dynamically
to every single customer
order, in real time.
That hasn't been done before.
So, as you can see right now,
Jose will use the press,
but then he still has to work
the dough with his hands.
So, this is a step that's
not quite optimized yet.
We have a fifth robot
that's getting fabricated
at a shop across the bay.
He's called Vincenzo.
He takes pizza out,
and puts it into
an individual mobile oven
for transport and delivery.
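The line Collins describes — named robots each owning one step, with every order flowing through individually rather than in batches — is a station pipeline. A toy sketch using the robots' names from the film; the step functions are stand-ins:

```python
# Each named robot owns one station; an order flows through the whole
# sequence on its own, which is what "responding dynamically to every
# single customer order" amounts to structurally.
STATIONS = [
    ("Giorgio/Pepe", "dispense sauce"),
    ("Marta",        "spread sauce"),
    ("Bruno",        "load into 800-degree oven"),
    ("Vincenzo",     "transfer to mobile delivery oven"),
]

def process_order(order: str) -> list[str]:
    """Run one customer order through every station, in sequence."""
    return [f"{robot}: {step} for {order}" for robot, step in STATIONS]
```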
Ford:
Any kind of work that is
fundamentally routine and
repetitive is gonna disappear,
and we're simply not equipped
to deal with that politically,
because maybe
the most toxic word
in our political vocabulary
is redistribution.
There aren't gonna be
any rising new sectors
that are gonna be there
to absorb all these workers
in the way that, for example,
that manufacturing was there
to absorb all those
agricultural workers
because AI is going
to be everywhere.
Kodomoroid:
Artificial intelligence
arrived in small steps.
Profits from AI
concentrated
in the hands of
the technology owners.
Income inequality
reached extreme levels.
-(beeping)
-The touchscreen made
most service work obsolete.
Tim Hwang:
After I graduated college,
I had a friend who had
just gone to law school.
He was like, "Aw, man,
the first year of law school,
it's super depressing."
All we're doing is
really rote, rote stuff.
Reading through documents
and looking for a single word,
or I spent the whole afternoon
replacing this word
with another word.
And as someone with a kind of
computer science background,
I was like,
"There's so much here
that could be automated."
(beeping)
So, I saw law school
as very much going three
years behind enemy lines.
I took the bar exam,
became a licensed lawyer,
and went to a law firm,
doing largely
transactional law.
And there, my project was
how much can
I automate of my own job?
During the day, I would
manually do this task,
and then at night,
I would go home,
take these legal rules and say,
could I create
a computer rule,
a software rule,
that would do what
I did during the day?
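The "software rule" Hwang describes can be as small as encoding one of those daytime tasks — replacing a defined term consistently across a document set — as a whole-word substitution. A minimal sketch with hypothetical terms:

```python
import re

def apply_rule(documents: list[str], old_term: str, new_term: str) -> list[str]:
    """Replace a defined term across every document, whole words only,
    so 'Vendor' changes but 'Vendors' embedded in another word would not."""
    pattern = re.compile(rf"\b{re.escape(old_term)}\b")
    return [pattern.sub(new_term, doc) for doc in documents]
```

The conundrum he lands on follows directly: the rule runs in seconds, but the firm bills the hours it used to take.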
In a lawsuit,
you get to see a lot
of the evidence
that the other
side's gonna present.
That amount of
documentation is huge.
And the old way was actually,
you would send an attorney
to go and look through
every single page
that was in that room.
The legal profession works
on an hourly billing system.
So, I ended up in a kind of
interesting conundrum,
where what I was doing
was making me more
and more efficient,
I was doing more
and more work,
but I was expending
less and less time on it.
And I realized that this would
become a problem at some point,
so I decided to go
independent. I quit.
So, there's Apollo Cluster,
who has processed
more than 10 million unique
transactions for clients,
and we have another
partner, Daria,
who focuses on transactions,
and then, and then there's me.
Our systems have generated
tens of thousands
of legal documents.
-(beeping)
-It's signed off by a lawyer,
but largely, kind of, created
and mechanized by our systems.
I'm fairly confident that
compared against human work,
it would be indistinguishable.
(beeping)
(whirring)
(beeping)
(Ishiguro speaking)
(whirring)
(robot speaking in Japanese)
(speaking Japanese)
(indistinct chatter)
(giggles)
(beeping)
(Hideaki speaking in Japanese)
(Robot speaking Japanese)
(Hideaki speaking Japanese)
(woman speaking Japanese)
(whirs, beeps)
(beeping)
(Niigaki speaking Japanese)
(beeping)
(whirring)
(robot speaking Japanese)
(speaking Japanese)
(jazzy piano music playing)
-(beeps)
-(lock clicks)
(piano music continuing)
(automated voice
speaking Japanese)
When we first appeared,
we were a novelty.
(man speaking Japanese)
(automated voice
speaking Japanese)
(automated voice
speaking Japanese)
(buzzing)
Kodomoroid:
While doing
your dirty work,
we gathered data
about your habits
-and preferences.
-(humming)
We got to know you better.
(buzzing)
(beeping)
Savvides:
The core of everything
we're doing in this lab,
with our long-range
iris system
is trying to
develop technology
so that the computer
can identify who we
are in a seamless way.
And up till now,
we always have to make
an effort to be identified.
-(beeping)
-All the systems were very
close-range, Hollywood-style,
where you had to go
close to the camera,
and I always found that
challenging for a user.
If I was a user
interacting with this...
system, with this computer,
with this AI,
I don't wanna be that close.
I feel it's very invasive.
So, what I wanted to
solve with my team here
is how can we capture
and identify who you
are from the iris,
at a bigger distance?
How can we still do that,
and have a pleasant
user experience.
I think there's
a very negative stigma
when people think about
biometrics and facial
recognition,
and any kind of sort
of profiling of users
for marketing purposes to
buy a particular product.
I think the core
science is neutral.
Nourbakhsh:
Companies go to no end
to try and figure out
how to sell stuff.
And the more information
they have on us,
the better they
can sell us stuff.
(beeping)
We've reached a point where,
for the first time,
robots are able to see.
They can recognize faces.
They can recognize
the expressions you make.
They can recognize
the microexpressions you make.
You can develop individualized
models of behavior
for every person on Earth,
attach machine learning to it,
and come out with the perfect
model for how to sell to you.
(door squeaks)
(beeping)
Kodomoroid:
You gave us your
undivided attention.
(whirring)
We offered reliable,
friendly service.
Human capacities
began to deteriorate.
Spatial orientation and memory
were affected first.
The physical world
and the digital world
became one.
(neon sign buzzing)
You were alone
with your desires.
("What You Gonna Do Now?"
by Carla dal Forno playing)
What you gonna do now
That the night's come
and it's around you?
What you gonna do now
That the night's come
and it surrounds you?
What you gonna do now
That the night's come
and it surrounds you?
(buzzing)
Automation brought
the logic of efficiency
to matters of life and death.
Protesters:
Enough is enough!
Enough is enough!
Enough is enough!
-(gunfire)
-(screaming)
Police Radio:
To all SWAT officers
on channel 2, code 3...
Get back! Get back!
-(gunfire)
-Police Radio:
The suspect has a rifle.
-(police radio chatter)
-(sirens)
(gunfire)
(sirens wailing)
Police Radio:
We have got to get
(unintelligible) down here...
... right now!
(chatter continues)
Man:
There's a fucking sniper!
He shot four cops!
(gunfire)
Woman:
I'm not going near him!
He's shooting right now!
-(sirens continue)
-(gunfire)
Police Radio:
Looks like he's inside
the El Centro building.
-Inside the El Centro building.
-(radio beeps)
-(gunfire)
-(helicopter whirring)
(indistinct chatter)
Police Radio:
We may have
a suspect pinned down.
-Northwest corner
of the building.
-(radio beeps)
Chris Webb:
Our armored car was
moving in to El Centro
and so I jumped on the back.
(beeping)
(indistinct chatter)
Came in through the rotunda,
where I found two of our
intelligence officers.
They were calm
and cool and they said,
"Everything's upstairs."
-There's a stairwell right here.
-(door squeaks)
That's how I knew I was
going the right direction
'cause I just kept
following the blood.
Newswoman:
Investigators say
Micah Johnson was
amassing an arsenal
at his home outside Dallas.
Johnson was
an Afghan war veteran.
Every one of these
door handles,
as we worked our way down,
had blood on them,
where he'd been checking them.
Newswoman:
This was a scene of terror
just a couple of hours ago,
and it's not over yet.
(helicopter whirring)
(police radio chatter)
Webb:
He was hiding behind,
like, a server room.
Our ballistic tip rounds
were getting eaten up.
(gunfire)
He was just hanging
the gun out on the corner
and just firing at the guys.
(siren blares)
(gunfire)
And he kept enticing them.
"Hey, come on down!
Come and get me! Let's go.
Let's get this over with."
Brown:
This suspect we're negotiating
with for the last 45 minutes
has been exchanging
gunfire with us
and not being very cooperative
in the negotiations.
Before I came here,
I asked for plans
to end this standoff,
and as soon as
I'm done here,
I'll be presented
with those plans.
(police radio chatter)
Webb:
Our team came up with the plan.
Let's just blow him up.
We had recently got
a hand-me-down robot
from the Dallas ATF office,
and so we were using it a lot.
(whirring)
(beeping)
It was our bomb squad's robot,
but they didn't wanna have
anything to do with what
we were doing with it.
The plan was to
set a charge off right on
top of this guy and kill him.
And some people
just don't wanna...
don't wanna do that.
We saw no other option
but to use our
bomb r-- bomb robot
and place a device
on its... extension.
Webb:
He wanted something
to listen to music on,
and so that was
a way for us to...
to hide the robot
coming down the hall.
"Okay, we'll bring
you some music.
Hang on, let us get
this thing together."
(ticking)
It had a trash bag
over the charge
to kinda hide the fact
that there was,
you know, a pound and a quarter
of C4 at the end of it.
The minute the robot
got in position,
the charge was detonated.
(boom)
(high-pitched ringing)
(muted gunfire)
He had gone down with his
finger on the trigger,
and he was kinda hunched over.
It was a piece of the robot
hand had broken off,
and hit his skull, which
caused a small laceration,
which was what was bleeding.
So, I just squeezed
through the,
the little opening that...
that the charge had
caused in the drywall,
and separated him from the gun,
and then we called up
the bomb squad to come in,
and start their search
to make sure it was safe,
that he wasn't sitting
on any explosives.
Newsman:
The sniper hit
11 police officers,
at least five of
whom are now dead,
making it the deadliest
day in law enforcement
since September 11th.
They blew him up with a bomb
attached to a robot,
that was actually built to
protect people from bombs.
Newsman:
It's a tactic straight from
America's wars in Iraq
and Afghanistan...
Newsman 2:
The question for SWAT teams
nationwide is whether Dallas
marks a watershed moment
in police tactics.
Kodomoroid:
That night in Dallas,
a line was crossed.
A robot must obey
orders given it by
qualified personnel,
unless those orders violate
rule number one. In other words,
a robot can't be ordered
to kill a human being.
Things are moving so quickly,
that it's unsafe to go
forward blindly anymore.
One must try to foresee
where it is that one is
going as much as possible.
Savvides:
We built the system
for the DOD,
(indistinct chatter)
and it was something that
could help the soldiers
try to do iris
recognition in the field.
We have collaborations
with law enforcement
where they can test
their algorithms,
and then give us feedback.
It's always a face
behind a face, partial face.
A face will be masked.
Even if there's
occlusion,
it still finds
the face.
Nourbakhsh:
One of the ways we're
trying to make autonomous,
war-fighting machines now
is by using computer vision
and guns together.
(beeping)
You make a database of
the images of known terrorists,
and you tell the machine
to lurk and look for them,
and when it matches a face
to its database, shoot.
(gunfire)
(gunfire)
Those are robots that are
deciding to harm somebody,
and that goes directly
against Asimov's first law.
A robot may never harm a human.
Every time we make a machine
that's not really as
intelligent as a human,
it's gonna get misused.
And that's exactly where
Asimov's laws get muddy.
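Nourbakhsh's warning turns on one step: a similarity score against a watchlist, compared to a threshold. A neutral sketch of that decision point — the scores and threshold are invented, and the weapon is deliberately left out; the point is that any threshold trades missed matches against false matches, and here a false match is a misidentified person:

```python
def best_match(probe_scores: dict[str, float], threshold: float):
    """Return the watchlist identity with the top similarity score,
    or None if even the best score falls below the threshold."""
    name = max(probe_scores, key=probe_scores.get)
    return name if probe_scores[name] >= threshold else None
```

A machine "not really as intelligent as a human" is exactly one that acts on this return value without anyone asking how the score was produced.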
This is, sort of,
the best image,
but it's really out
of focus. It's blurry.
There's occlusion due
to facial hair, hat,
is the idea of authentic
connection with another person.
(beeping)
And then, we say, "No,
but we can automate this."
"We have a robot that...
"it will listen
sympathetically to you.
"It will make eye contact.
"It remembers everything you
ever told it, cross-indexes."
-When you see a robot that
-(beep)
its ingenious creator
has carefully designed
to pull out your
empathetic responses,
it's acting as if it's in pain.
The biggest danger there
is the discrediting
of our empathetic responses.
where our empathizing-with-pain
reflex is discredited.
If I'm going to override it,
if I'm not going to
take it seriously
in the case
of the robot,
then I have to go back
and think again
as to why I take it
seriously in the case of
helping you when
you are badly hurt.
It undermines the only
thing that matters to us.
("Hikkoshi"
by Maki Asakawa playing)
Kodomoroid:
And so, we lived among you.
Our numbers grew.
Automation continued.
(woman singing in Japanese)
(speaking Japanese)