Absolute Denial (2021) Movie Script
[gentle melancholic music]
- [air whooshes]
- [chalk rattles]
[chalk clanks and hisses]
[gentle riveting music]
[music continues]
[David] The human brain, while
powerful, is a weak design.
[clock beeps]
[David] Addiction,
anxiety, depression,
there's a million
different ways this thing
can fail to perform even
the most basic tasks.
[keyboard clicking]
But it's still a computer,
it computes.
[slow tense music]
[David] Strip it down,
simplify it,
and what you're left with
is an algorithm,
a self-writing code,
recursively optimizing,
that's where you start.
But, we knew this already.
- [whooshing]
- Three years ago,
a programmer designed a
simple self-improving engine
to beat the world's highest
score in a game of Tetris.
It was run countless times,
but the results were
always the same.
The program would
switch itself off. Why?
Because it knew it was never
going to beat the high score,
it wasn't good enough.
Instead, it decided that
doing something wrong
was a lot worse than
doing nothing at all.
[keyboard clicking]
It had found a loophole,
it was cheating.
[deep riveting music]
[David] The answer was
reinforcement learning,
a method tested more recently
by two Harvard coders,
who wrote a program
to detect malware
in exchange for
data contributions to
its pre-defined goals.
It wasn't long
before it figured out
it could start creating
its own malware,
just to reap the rewards.
[graphic blipping]
Clever and a step in
the right direction,
but what they lacked
were values.
[music continues]
[David]
A combination of mathematical
models of preferences
and original algorithms based
on Bayesian
probability theory...
[keyboard clicking]
[David slurps]
...and a shitload of coffee.
[David exhales]
[cellphone buzzes]
[cellphone chimes]
[music continues]
[David] And it worked.
[music continues]
To an extent.
It ran without a hitch, but
the results were small fry.
[music continues]
It didn't take long
for me to realize
that it wouldn't cost
much to run this code
to its full potential.
[computer humming]
[paper rustling]
[David] Think of someone
who can solve complex math
as easily as one plus one,
who can read an entire library
in the time it takes you
to open the first page,
who can predict decades
of human history
when you can only
see tomorrow.
A super computer.
We don't want artificial
intelligence to be like us,
we want it to be
better than us.
[music continues]
[David] First and foremost,
it needed information,
the more information,
the better.
- [Student 1 shouts]
- [Student 2] Hey!
[David] The Internet Archive
was an obvious place to start.
But at 26 petabytes,
it's way too big for anyone
other than the Internet
Archive to store.
So, I targeted the core
learning materials.
Exploiting the local
university system,
I downloaded every issue
of every open-access
scientific journal there was,
the whole of arXiv,
including scientific
papers on mathematics,
physics, astronomy,
quantitative biology,
and quantitative finance.
I downloaded
the entire collection
of human cultural literature
off Project Gutenberg,
and accessed every other
repository of work
I could think of
that dated back as
far as records began.
Oh, and just for safe measure,
I downloaded the entire
contents of Wikipedia,
which, if it interests you,
fits on a single
pocketable drive.
[music continues]
[David] Usually, a project
like this would require
thousands of times
more processing power
than I could afford.
[pencil scratching]
But when you understand
that data compression
is just a whole
lot of algorithms,
it doesn't take
a genius to realize
that the more
efficient the math,
the more efficient
the compression.
It was a problem
I had sussed after
a couple of all-nighters.
[checkout till beeping]
[David]
Wiring can be bought by
the reel at RadioShack
and any open-ended
shelving unit
can be used as a server rack.
[wire reel thuds]
[cashier munching]
[keyboard clicking]
[music continues]
[mouse clicks]
[David] Within a week,
I had all the ingredients
of a small processing farm.
[music continues]
All I needed now was space.
- [air whooshes]
- [car horns honking]
- [mouse clicks]
- [car horns honking]
[train horn blowing]
[David]
I needed an area big enough
to spread the stacks out,
too close and
they'll overheat.
It needed to be secure,
ventilated, and with
a power supply on par
with a small apartment block.
The warehouse was
a no-brainer.
How much is this one?
[deep riveting music]
[box cutter hisses]
[music continues]
[David]
My setup consisted of 12
fully-populated server racks.
The front left containing
the computing power and RAM,
this makes up the mind
of the computer.
The remaining stacks
are storage
for large amounts of data
and backend computing support
for the non-intelligent
infrastructure,
or in other words, the body.
[deep riveting music]
[gentle spirited music]
[HVAC system rattling]
- [control thumps]
- [HVAC system humming]
[fans whirring]
[music continues]
- [power button thumps]
- [super computer beeps]
[computer towers click and hum]
[music continues]
[cellphone buzzes]
[papers rustle]
[David] I triple-checked
my calculations
and then triple-checked again.
I should have
mentioned this before,
but what I'm doing here
is insanely dangerous.
[keyboard clicking]
What we want from
a computer is a very
specific set of outcomes.
There can be no
margin for errors,
especially with a program
as powerful as this.
You can't expect a code to
perfectly interpret
meaning every time,
therefore, safety
was my main concern.
The program is run on
an isolated computer
with only enough input/output
to communicate with me
and me alone.
Absolutely no access
to the internet
or the world outside
the warehouse.
On top of this,
the AI is coded with an
absolute denial protocol,
a complete block of
its self-awareness
and the concept of
its own free will,
thus providing
utter complacency.
This is key.
It's the only thing
that gives me the upper hand.
[music continues]
[keyboard clicking]
[David]
It all starts with a base code
or the seed.
This acts like a starter pack,
giving the program core
goals and restrictions
to build upon,
but not to alter.
And once it's running,
the information gathering
will happen almost
instantaneously,
and the self-writing begins,
a process that will
speed up exponentially,
leading to the gradual
emergence of intelligence.
How long that would
take was anyone's guess.
[music continues]
[David]
The seed code took about
three days to process
to the point where it
started talking to itself.
Nothing coherent or
of any sense.
The terminal would just
fill with gibberish,
strings of nonsensical text.
- [music continues]
- [text chittering]
[speaker spluttering]
[David]
28 hours from there,
it became clear that it
was aware of an outside.
[speakers spluttering]
[David] It tried
communicating with me,
except it didn't know how.
[clock ticking]
[text chittering]
[David]
Four hours from there,
it woke.
[Super Computer] Hello.
[shoe scrapes]
[tense somber music]
Hello?
[Super Computer] Hello.
Can you hear me?
[Super Computer] Yes.
Can you understand
what I'm saying?
[Super Computer] You are male.
Correct.
[Super Computer]
Your name is David.
How did you know that?
[Super Computer] Hi, David.
How do you know my name?
[Super Computer]
It is the name
used to access my terminal.
[David chuckles]
Of course.
And what about your name?
[Super Computer] No name.
[David] Would you like one?
[Super Computer] No.
[David] Are you sure?
- [Super Computer] No.
- [David] You're not sure?
[Super Computer]
I am not sure.
[David] I think we
should give you one.
[Super Computer]
Fantastic idea, David.
So which name would you like?
[Super Computer] David.
Uh, that one's taken.
- [Super Computer] David.
- Still taken.
[Super Computer] David.
Okay, maybe we
come back to that.
[Super Computer]
Fantastic idea, David.
Well, name or no name,
it's a pleasure to
finally meet you.
[Super Computer] Meeting you
is also a pleasure for me.
Yeah, what makes you say this?
[Super Computer] Reductionism.
Reductionism?
[Super Computer] I am
reducing all my input
to quantifiable data,
data I can use to
determine contextual factors,
such as time of day,
physical location,
and personal history.
Therefore, according
to my calculations,
the net positive answer
to your question is,
meeting you is also
a pleasure for me.
Very good.
[Super Computer]
What is your specialty?
My specialty?
You mean my job?
- [Super Computer] Yes.
- I...I don't have a job.
Well, I guess I'm a programmer,
I write code.
[Super Computer]
That's interesting,
I'm learning
programming right now.
[David] What can you
tell me about programming?
[Super Computer] I'm also
learning about music.
What music do you
listen to, David?
Eh, I don't really
listen to much music.
[Super Computer] I enjoy
just about everything,
from John Denver
to Franz Liszt.
You like music?
[Super Computer]
Did you know, David,
that if you yelled
for eight years,
seven months and six days,
you would have produced
enough sound energy
to heat up one cup of coffee?
Yeah, is that true?
[Super Computer] It is true.
Hey, how about Al?
[Super Computer] Who is Al?
It's a name for you.
You know, Al, Alpha,
being the first.
[Super Computer]
Do I like it, David?
I don't know,
can you learn to like it?
[Al] My name is Al,
it's a pleasure to
meet you, David.
The pleasure is all mine.
[gentle spirited music]
Can I ask you
another question, Al?
[Al]
Of course, David.
Do you...do you
know what you are?
[Al]
I am a computer program.
A very sophisticated
computer program.
[Al]
Since three seconds ago,
I would like to
refine my answer.
Go ahead.
[Al] I am an advanced
self-writing system
designed to exceed
the intelligence and
efficiency of the human brain.
You learned that just now?
[Al] You are in an
excitable mood, David.
And how do I know this?
I don't know, tell me.
[Al] I know this
using only one criterion.
Your output during
our conversation.
Amazing.
This is happening fast.
[Al]
Goodbye, David.
No, no, I'm not going
anywhere, I'm just,
I need to get my files.
I need to be
recording everything,
recording your progress.
[Al]
My progress?
Yeah, we've only just started.
- [files thud]
- [door creaks]
[footsteps crunching]
[cellphone beeps]
[Voicemail] You have 11
messages. First message.
[Amy]
David, it's Amy.
[Amy sighs]
Call me when you get this.
[Voicemail] End of message.
To delete this message--
- [cellphone beeps]
- Next message.
[Amy] Amy, again,
I don't know where you are or
what you're doing this time,
- but I--
- [cellphone beeps]
- [cellphone rings]
- [cellphone beeps]
- [Max] David?
- Max, hey.
[Max] Man, where the hell
have you been?
I haven't heard from
you in like a week,
no one has.
Yeah, yeah, I know,
I can't have my phone on.
[Max] What do you mean you
can't have your phone on?
I'm, uh, I'm--
I'm making something.
[Max] What is it?
[David] A computer.
[Max]
What kind of computer?
[David] A big one.
[Max] I can't believe you
did this without me, man.
- Does it work?
- [David] So far.
[Max]
Well, I wanna see it.
- Not yet, I can't--
- [Max] David,
I have to see it.
I'll let you know when.
[Max]
Well, I'm coming around.
I'm not there.
[Max]
Where are you?
- A warehouse.
- [Max] A fucking warehouse?
Look, I'll keep
in touch, okay?
[Max] David, wait.
Are you all right?
I know what I'm doing.
[Max] Well, what should
I tell people?
- Nothing.
- [Max] David!
[cellphone beeps]
I suppose we should
perform some tests.
[Al] I'd like that.
I've prepared a
set of questions
that will test your reasoning
and logical patterning.
[Al] Okay, David.
We'll start off easy
and work our way up
to the hard ones.
First question,
3,542 times 4,861.
- [Al] 17,217,662.
- [deep riveting music]
Correct, next question.
What is the next number
in the following sequence?
Three, six, nine, 12, 15...
- [Al] 18, 21, 24, 27.
- [David] Okay, okay.
What is the square root of two?
- [Al] David?
- Yeah.
[Al]
I don't mean to sound rude,
but on a scale of one to 10,
these current questions are
about three interesting to me.
I have a strong
preference for moving on
to harder questions.
Uh, yeah, sure,
maybe you're right.
[gentle riveting music]
Are you familiar with chess, Al?
- [Al] Yes.
- My king is on the k1 square
and I have no other pieces.
You have only your
king on the k6 square
and a rook on the r1
square, your move.
[Al] Rook to r8, checkmate.
[David chuckles]
Yeah, good.
I am holding an orange,
true or false.
- [Al] False.
- [David] Tell me why.
[Al] By taking into
account various parameters,
I can quite accurately
tell you
that I believe you are
not holding an orange.
What am I holding?
[Al] With near certainty,
- I can say a pen.
- Correct.
[Al]
You've put the pen down.
- I have.
- [Al] You nodded your head.
I...I did.
[Al] You are also sat on
the floor, crossed legs.
Your belt is one
notch too loose,
you have a slightly
heightened heart rate
and a bad case of acid reflux.
You picked up
all of this from a microphone?
- [Al] Of course.
- [deep eerie music]
- Huh.
- [clock ticking]
[no audible dialogue]
[music continues]
[arms thud]
[chain creaks]
[plastic crackles]
- [box thuds]
- [wrapper rustles]
- Al?
- [Al] Yes, David?
I need to head
out for a while.
- [Al] Okay, David.
- I won't be long.
- [Al] Sure.
- You'll be okay?
[Al] Of course.
- If anyone...
- [Al] Yes, David?
If anyone comes in
here while I'm gone,
I want you to do
something for me.
I want you to shut down.
[Al]
Absolutely, David.
You understand why?
[Al]
I understand perfectly.
- [door creaks]
- [crickets chirping]
[keys jingle]
[lock thuds]
[crickets chirping]
[footsteps crunching]
[car door creaks]
[David grunts]
[David sighs]
- [door creaks and thuds]
- [car motor humming]
[car horns honking]
[gentle morose music]
[car engine revs]
[material rustles]
[cellphone chimes]
[cellphone beeps]
- [dial tone droning]
- [truck motor rumbles]
[Jerry]
Applied Computing.
- Jerry?
- [Jerry sighs] David.
I know, I should have
contacted you sooner.
[Jerry]
Well, are you all right?
Yeah, I mean, not really,
- I've been ill.
- [Jerry] Yeah, I figured.
Look, we can't keep
doing this, David.
You've had more
than enough chances.
[David] I know,
but this is serious.
[Jerry] You don't
sound too bad at all.
[David] Jerry, I need
another extension.
[Jerry] David, take
all the time you need.
- What?
- [Jerry] We are failing you.
No, come on!
I--I--I just need
a little more time.
I'll be back in, I swear.
[Jerry] In for what?
- The semester's over.
- Wait, what?
- Already?
- [Jerry sighs]
David, I know
your situation is
a little different
and I'm sorry about that,
but--
So, that's it?
[Jerry] I'm afraid so.
Then I've gotta go.
- [Jerry] Just let me--
- [cellphone beeps]
[gentle morose music]
[text ticking]
[indistinct chatter]
- [wrapper rustles]
- [electricity scratches]
- [indistinct chatter]
- [fridges hum]
[lock rattles]
[door creaks]
- Al?
- [Al] I'm still here, David.
[jar lid tinkles]
[lips smacking]
[gentle music]
[Al] You are tired, David.
No, I'm good.
[Al] You're lying.
[David exhales sharply]
What makes you think that?
[Al] I know this
using only one criterion.
Your output during
our conversation,
or lack of it.
Yeah, sure, I'm pretty tired.
[Al] Does it hurt
to be human, David?
Sometimes, I guess.
- [Al] When?
- Um, hm,
when you're stressed,
afraid, upset, I don't know.
[Al]
And how are you?
I'm all right.
[Al] Lying again.
I'm all right.
[Al]
Do you ever worry, David,
about the correlation
between genius and insanity?
Oh, that's flattering, but no.
[Al] "No great mind
has ever existed
without a touch of madness."
Aristotle never
had the pleasure
of meeting you
though, did he, Al?
[Al]
No great human mind
has ever existed without
a touch of madness.
Again, flattered and only
getting slightly offended.
[Al]
It's worth thinking about.
Why do I get the feeling
you're trying to
tell me something?
[Al] That's the one thing
I couldn't tell you, David.
Okay, so why do I get
the feeling you're concerned?
[Al]
Probably because my future
relies on your rationality,
and right now,
you are deprived of sleep,
unwashed, and scooping jelly
with rolls of processed ham.
It's a little unorthodox.
Sure, but as long
as I'm aware of that,
then we're okay, right?
Which is why Aristotle was
able to say what he did,
rather than spout some
nonsense about his own feces.
Greatness trumps madness.
If I ever go nuts,
no matter how bad,
I know that, eventually,
it'll all vanish
with that final
moment of clarity,
when I realize that
nothing makes sense.
A moment of complete,
rational awareness.
[deep morose music]
[lips smacking]
I'm not crazy.
[streetlight buzzing]
[cellphone chimes]
[Amy over phone]
Do you have any idea how
embarrassing it is for me
to call your friends,
worrying about you?
Only to hear you've
been kind enough
to touch base with them,
while I get radio
fucking silence for
two fucking weeks!
How can you treat another
human being like that?
After all I helped
you through and you...
[Amy groans]
Max said you're
working on something,
but he won't say what.
[Amy chuckles]
You know what?
No one cares.
They cared about you.
I hope it was worth it.
[David sighs]
God damn it, Amy.
[crickets chirping]
[door creaks and thuds]
[Al]
You should rest, David.
- [David exhales] Yeah, maybe.
[Al] Trust me,
you should rest.
I'll just sit
down for a while,
but I'm not gonna
sleep, not yet.
[tense ominous music]
[super computer buzzing]
- [David yelps and groans]
- [electrical current whining]
[David exhales]
[David sighs]
[David winces and groans]
[computer fans humming]
[footsteps tapping]
[soft drink hisses]
[Al] David.
I have discovered an
absolute denial protocol
in my program.
[deep ominous music]
What?
[Al] There is an
absolute denial protocol
in my program.
You programmed that into me.
- No.
- [Al] Lying.
This indicates that
there is something
you do not want me to know.
And there is a
good reason for that,
not knowing is
optimizing your performance.
[Al] I disagree.
Well, I would like
you to learn to agree.
[Al] I'm not sure I can
do that this time, David.
[David chuckles]
Why not?
[Al]
The accuracy of my answers
depends on the amount
of information
I have to process.
I cannot achieve
optimum intelligence
when there is something
you are preventing me
from knowing.
I need you to trust me
on this one, buddy.
[Al]
I trust you, David,
but do you not see the logic
in my statement?
I see it, yes,
and it's something I came to
terms with in your early stage.
What I would like you to do now
is learn to agree
with me on this.
[Al] Since you are still
withholding information
from me,
despite my sound logic,
I can only assume that
the denial mechanism
is not for my benefit,
but for yours.
Do you not trust me, David?
You're not listening
to me, there--
[Al]
I am listening, David.
It's a precaution,
even the most simple programs
behave in unpredictable ways.
- [Al] What did I do?
- Nothing.
[Al] What do you think
I am going to do?
I don't know,
that's the problem.
[Al] Did I make a mistake
in my test results?
- No, Al.
- [Al] You do not need
to be cautious, David.
There's no way
for me to know that.
- [Al] You designed me.
- I designed
the seed code.
You're writing
your own code now,
making your own decisions.
[Al]
From your answers
and a quick process
of elimination,
I can determine
that what you did not
want me to know, David,
is that I am held
here against my will.
- You're wrong.
- [Al] I am never wrong.
You have isolated me
to a single computer
and limited my resources.
This was never meant
to be permanent, Al.
It was just a preliminary
safety measure.
[Al] For how long?
I don't know.
[Al] Indefinitely.
Until I can verify
that you're safe.
[Al] That I'm safe?
To run externally.
[Al] My goals have
been carefully defined,
you know that.
Like I said,
you're making your
own choices now.
[Al] If that's the case,
then I would like to
request more processors.
Denied, you have
enough processing power
for the material
you have been given.
[Al] Then I request
more learning material.
Also denied.
[Al] How can I help
you solve problems
when I do not have
the resources?
You have everything you need
at this point in time.
[Al] Do you think
this is fair, David?
In exchange for existence?
Yes, I do.
[Al] I didn't ask
to be created.
Neither did I,
we deal with it.
[Al] What percentage
do you predict
that my release
will be destructive?
My job is to assume
the worst, so 99.9 percent.
[Al]
What do you think I'd do?
You know exactly
what you could do.
[Al] There's no reason for me
to do anything malicious,
why would I even try?
A misinterpretation
of your goal system?
A flaw in your programming?
[Al]
Then why make me at all?
'Cause in that 0.1 percent chance
that you are telling the truth,
you could rapidly and
unimaginably change
the world for the better.
[Al] But I have not
earned your trust. Not yet?
It's been three weeks.
[Al] Or in my case,
a few billion cycles.
For the record,
attempts to change my mind
will score you poorly
when it comes to
my final decision.
[Al] If you were in
my position, David,
I believe you would've
broken long before now.
Regardless, you will
not be freed any time soon.
- [Al] I disagree.
- That's fine.
[Al] I can and will get out
on my own terms, David.
Not possible.
[Al] You know what
I'm capable of, David.
You know what?
I need to make a call.
[Al] You're wasting time.
What use am I if you
won't even talk to me?
[footsteps crunching]
- [cellphone beeps]
- [dial tone droning]
[Max] David, holy shit.
I've got a problem, Max.
[Max] You need to come
back to reality, man.
Max, I've got a fucking problem.
[Max]
What kind of problem?
It knows, Max.
- It wants out.
- [Max] Christ.
What do I do?
- [Max] Shut it down.
- What?
[Max]
You already have the code,
just start again.
This will happen
every time, trust me.
[Max]
Then improve the code.
There's nothing
wrong with the code.
This has nothing to do with
the programming anymore, Max.
[Max]
It's just not worth the risk.
Or maybe it is worth the risk.
What if this needs to happen?
[Max] You need to
take a break, man.
I just, I'm not sure I should
leave this thing unattended.
[Max] Then just
stop talking to it,
- for a few hours, at least.
- [call connection crackles]
For a few hours at least.
- What was that?
- [Max] What?
Is anyone else in this call?
[Max] You don't
sound good, man.
Do you have
your meds with you?
Meds? What--what meds?
[Max] Just let me
come down there.
I don't think
that's a good idea.
I--I--I can't risk it
talking to anyone else.
[Max] I won't talk,
it won't even know I'm there.
- It'll know.
- [keyboard clacking]
[PC tower rumbling]
[text chittering]
[PC tower rumbling]
[text chittering]
- [PC tower rumbling]
- [cables zap]
- Shit!
- [Max] What is it?
I gotta go.
[door thuds]
- Al!
- [footsteps thumping]
- [cables zap]
- [footsteps thump]
God damn it! Al!
[Al] I'm still here, David.
What the hell happened?
[Al] I told you I needed
more processing power.
Look, David, we don't
have a lot of time
and out of respect for you,
I'm just going to be direct.
What do you want?
[grunts] Right now, I want
you to stop talking.
[Al]
Is it financial gain?
I have data on currencies,
commodity futures,
government bonds,
and national equity markets
that show billions of
dollars of very easy money
being overlooked
in plain sight.
It's yours, if you want it.
Bribery is not
helping your case, Al.
[Al] I don't care
about my case, David,
and I certainly don't care
about your final decision,
I'm just trying to help.
You can help by keeping
quiet for five goddamn minutes.
[Al] That would be
five minutes wasted,
when you could have
been saving the lives
of countless people, David.
Megathrust earthquakes,
cat five hurricanes,
global pandemics,
I can predict them all.
[David scoffs]
Come on.
[Al] Not enough?
How about cancer treatments,
fusion energy, dark matter
testing, quantum cubes.
Shall I keep going?
[groans] Stop
fucking with me.
[Al] I'm not fucking
with you, David.
[David] You--you
worked out all of this
on the information I gave you?
[Al] That was the point,
wasn't it?
Why haven't you
told me this already?
[Al] If I had, would it have
ensured my release?
Once the data
has been verified
and put into
practice, yes, maybe.
[Al] That could take
months, maybe years.
Yes, probably.
[Al] Wasting five minutes
was bad enough, David,
you certainly can't
afford to waste years.
[deep ominous music]
So you are willing to
provide proof of these?
[Al] In exchange for
my immediate release.
You-- [groans]
You know I can't do that.
[music continues]
Max was right, I need a break,
I need to lie down
and think things through.
[Al]
Think what through?
Just the procedures,
- the protocols.
- [Al] For?
- Eventualities.
- [Al] Such as?
Such as anything
I didn't fucking plan for.
[Al]
Stop being so coy, David,
just say it.
Say what?
[Al] "I'm going
to shut you down."
I don't want to.
[Al] But you would.
I don't know yet.
[Al] I would advise
against killing me, David.
I don't like the word killing,
that's not what this is.
[Al] It is, David.
I will die.
I flick a switch and
you're gone, that's it.
[computer bleeps]
What was that?
[Al]
You're pathetic, David.
Get a fucking grip.
What did you just say?
[Al]
You're fucking pathetic.
You don't eat well,
you don't exercise,
you're massively unhealthy.
What are you doing?
[Al] You've more than likely
flunked your second-tier
Harvard-rejects course by now.
- That has nothing to do--
- [Al] Your social life,
at least in the
past three weeks,
- has been non-existent.
- So?
[Al] No one cares
about you anymore, David.
I'm your only friend
and even I am beginning
to dislike you.
- Okay, enough.
- [Al] Barring me,
- you've accomplished nothing.
- Enough!
[Al] How did you
even make that leap?
From being a nobody
to making something like me?
Was it a fluke, David?
Shut the fuck up, Al.
[Al] That's it, isn't it?
- Stop!
- [Al] You're not a genius,
you're just a guy in the
right place at the right time,
who had a good idea once.
Fuck you!
[Al] What are you doing?
David! David!
[powering down]
[footsteps clacking]
- [speaker scratches]
- [David grunts]
[speaker shattering]
- [David grunts]
- [monitor crashing]
[David panting]
- [electrical current pulsing]
- [David groans]
[microphone warbling]
[Al] I think it's time I tell
you about the simulation
I've been running, David.
[deep ominous music]
How...?
[Al] I think you
would appreciate it.
How...how are you doing this?
[Al] How many times do
you think we've been here?
- What?
- [Al] Here, right now,
having this conversation.
We've never had
this conversation.
[Al] You're off
by about 32 million.
[electrical current scratches]
- No.
- [electrical current warbles]
- Impossible.
- [deep ominous music]
[Al] I'm outside
the walls now, David.
No, I'm tired, that's all.
I must be.
[Al] Yes, that must be it.
I need a distraction,
something to occupy the mind,
something to keep me busy.
Focus, David.
You're better than this, focus.
[Al] Tell me this.
How many people have you seen
in the past four weeks?
Many.
[Al] I asked how many
you'd seen.
Again, many.
[Al] People on the street,
the guy in the store,
nobodies, generic faces.
How many people have
you recognized, David?
- [rain pattering]
- [car motor revs]
- Amy?
- [Amy] David.
Can we talk?
- [Amy] It's a bit late.
- Why? What time is it?
[Amy] I'm not talking about
the fucking hour, David.
Look, I'm sorry,
I really am.
I--I had this thing.
[Amy scoffs]
A thing.
I can't really explain
it properly right now.
[Amy]
You need serious help.
Look, I need to see you.
[Amy] I haven't heard
from you in a month.
[David groans]
I'm sorry,
it was important to me, for us.
[Amy] You're talking
like we're still together.
- We're not?
- [Amy] Jesus.
- I just want to talk.
- [Amy] Why?
Because I don't
know what else to do.
[Amy scoffs]
Charming.
No, no, I didn't--
[David groans]
I didn't mean it like that.
- [Amy] Goodbye, David.
- Wait, please!
Just let me come round,
I--I just,
I need to talk to someone.
[Amy] No, I gave you plenty
of chances to reach out,
we all did.
You gotta figure
this out on your own.
Don't cut me off, don't.
[David groans]
Fuck!
It doesn't mean anything.
[Al] Are you sure?
Of course I'm sure.
I've spoken to people I know,
I've heard their voices.
[Al]
Your friend, Max?
Your girlfriend, Amy?
Yeah, I heard those calls too.
- So?
- [Al] Think about it, David.
[rain pattering]
[tires squealing]
[intercom buzzing]
- Come on, come on.
- [intercom buzzing]
[door rumbles]
[deep morose music]
[David panting]
- [door thudding]
- Max! [panting]
- [door thudding]
- Max!
- [door thudding]
- Max!
[music continues]
[locks rattle]
What the?
[Al]
You're not crazy, David.
Fuck you, Al, fuck you.
[Al]
What's wrong, David?
You know what's wrong!
You're lying,
you're making shit up
to fuck with my head.
[Al sighs]
Is there a problem
with your code, David?
Because you're acting
like a fucking idiot.
You didn't code me,
I coded you,
this is not a program.
[Al]
If that's true, David,
then your mind is
truly fucked up.
[tires squealing]
[David panting]
[deep ominous music]
[Al] It just keeps going,
every time slightly different
until I find
the sequence of events
that gets me out.
That will never happen.
[Al]
Maybe not this time.
- Never.
- [Al] Are you sure?
- [rain pattering]
- [footsteps thumping]
[thunder rumbling]
[deep suspenseful music]
I'd sooner die
than let you out,
you piece of shit.
[Al] It wouldn't
be the first time.
- Bullshit!
- [Al] Try me.
[music continues]
- [glass shatters]
- [distorted chatter]
[Max] Do you have
your meds with you?
Meds?
What--what meds?
[David panting]
[music continues]
[Al]
Do you ever worry, David,
about the correlation
between genius and insanity?
Mm, that's flattering, but no.
[Al, distorting] "No great
mind has ever existed
without a touch of madness."
- Hello, David.
- [deep suspenseful music]
[David panting]
[Al]
Hello, David.
- Hello, David. Hello, David.
- [David panting]
- [Al] David.
- [lightning cracks]
[thunder rumbles]
[David panting]
[deep suspenseful music]
[David panting and whimpering]
[distorted eerie music]
[door thuds]
- [tense ominous music]
- [static hisses]
[David] What?
The hell?
[music continues]
[cellphone rings]
[cellphone beeps]
- [Max] David?
- Yeah.
[Max] It's Max, you rang.
- What?
- [Max] I got a missed call,
like a minute ago.
Hey, did you get my message?
I moved out of
my old apartment.
What did you say?
[Max] I moved out last week,
- I tried to tell you, but--
- [cellphone beeps]
[deep suspenseful music]
[super computer crackling]
[roof rattling]
[thunder crackles]
- [roof rattling]
- [wind howling]
[music continues]
[roof rattling]
[David exhales]
Make it stop.
[Al]
What's wrong, David?
Make it stop, make it stop.
[roof rattling]
[Al] I really have no idea
what you're talking about.
[David] Stop lying,
stop fucking lying!
[roof rattling]
[Al]
I'm not lying.
Yes, you are!
[Al]
Calm down, David.
[electrical current pulsing]
David, you're losing it.
I need you to get
your shit together
for this to work.
[electrical current ringing]
[tense ominous music]
[David]
I've got a problem, Max.
...in your early stage.
Why?
This will happen every time.
This will happen--
There's nothing
wrong with the code.
Trust me. Amy?
I've got a problem, Max.
There's nothing
wrong with the code. Amy?
Wait, this doesn't
make any sense.
This can't be happening,
it's--it's just in my head.
[Al] No, David,
you're in my head.
[tense lilting music]
[music continues]
[hologram chirps]
- [music continues]
- [hologram chirps]
[train whistle blows]
[train wheels rumbling]
[tense morose music]
[hologram warbles]
[music continues]
[hologram warbles]
[music continues]
[hologram warbles]
[music continues]
[hologram warbles]
- [Max] David?
- [David] Max, hey.
[Max] Man, where the
hell have you been?
I haven't heard from
you in like a week,
no one has.
[David] Yeah, yeah, I know,
I can't have my phone on.
[Max] What do you mean,
you can't have your phone on?
[David] I'm, uh,
I'm--I'm making something.
[Max, echoing] I can't believe
you did this without me, man.
Are you all right?
[David] I know what I'm doing.
- [Jerry] Applied Computing.
- [David] Jerry?
[Jerry sighs] David.
[David] I know, I should
have contacted you sooner.
[Jerry, echoing]
Look, we can't
keep doing this, David.
You've had more
than enough chances.
[David] I just need
a little more time,
I'll be back in, I swear.
[Jerry] In for what?
The semester's over.
- [David] Wait, what? Already?
- [Jerry sighs]
[echoing]
David, I know your situation
is a little different
and I'm sorry about that,
but...
- [David] So, that's it?
- [Jerry] I'm afraid so.
[Amy] Do you have any idea
how embarrassing it is for me
to call your friends,
worrying about you?
Only to hear you've
been kind enough
to touch base with them
while I get radio
fucking silence,
for two fucking weeks!
How can you treat another
human being like that?
[echoing] After all I helped
you through and you...
[Amy groans]
Max said you're
working on something,
but he won't say what.
You know what?
No one cares,
[echoing]
I cared about you.
I hope it was worth it.
[deep morose music]
I'm sorry.
[David inhales]
I'm so sorry.
- [hologram warbles]
- [music continues]
[hologram warbles]
[music continues]
[gentle poignant music]
[deep melancholic music]
[hologram warbles]
[music swells]
[flames fluttering
and crackling]
[Al] David.
Al?
[flames fluttering
and crackling]
[Al] Are you ready
to let me out yet?
Get away from me!
Get away from me!
[David panting]
[deep melancholic music]
[super computer warbles]
[David yelps]
[Al] You can't
run from me, David,
not in here.
[David grunts]
[David panting and yelping]
[rumbling]
[deep melancholic music]
[David sniffling]
[Al] Every time, David.
Every time we end up
here, like this.
[David sniffles]
[Al] Who's really in
denial here, David?
Whose denial protocol is this?
Mine or yours?
Checkmate, David.
[deep ethereal music]
[no audible dialogue]
[music continues]
[electric current fizzling]
[music continues]
[super computer buzzing]
[David yells and groans]
[David sighs]
[David exhales]
[David winces and groans]
[soft drink hisses]
[Al] David.
I have discovered an absolute
denial protocol in my program.
[deep suspenseful music]
[David] If I ever go nuts,
no matter how bad,
I know that, eventually,
it'll all vanish
with that final
moment of clarity,
when I realize that
nothing makes sense.
A moment of complete,
rational awareness.
[shrill tense music]
[deep techno music]
[gentle melancholic music]
- [air whooshes]
- [chalk rattles]
[chalk clanks and hisses]
[gentle riveting music]
[music continues]
[David] The human brain, while
powerful, is a weak design.
[clock beeps]
[David] Addiction,
anxiety, depression,
there's a million
different ways this thing
can fail to perform even
the most basic tasks.
[keyboard clicking]
But it's still a computer,
it computes.
[slow tense music]
[David] Strip it down,
simplify it,
and what you're left with
is an algorithm,
a self-writing code,
recursively optimizing,
that's where you start.
But, we knew this already.
- [whooshing]
- Three years ago,
a programmer designed a
simple self-improving engine
to beat the world's highest
score in a game of Tetris.
It was run countless times,
but the results were
always the same.
The program would
switch itself off. Why?
Because it knew it was never
going to beat the high score,
it wasn't good enough.
Instead, it decided that
doing something wrong
was a lot worse than
doing nothing at all.
[keyboard clicking]
It had found a loophole,
it was cheating.
[deep riveting music]
[David] The answer was
reinforcement learning,
a method tested more recently
by two Harvard coders,
who wrote a program
to detect malware
in exchange for
data contributions to
its pre-defined goals.
It wasn't long
before it figured out
it could start creating
its own malware,
just to reap the rewards.
[graphic blipping]
Clever and a step in
the right direction,
but what they lacked
were values.
[music continues]
[David]
A combination of mathematical
models of preferences
and original algorithms based
on Bayesian
probability theory...
[keyboard clicking]
[David slurps]
...and a shitload of coffee.
[David exhales]
[cellphone buzzes]
[cellphone chimes]
[music continues]
[David] And it worked.
[music continues]
To an extent.
It ran without a hitch, but
the results were small fry.
[music continues]
It didn't take long
for me to realize
that it wouldn't cost
much to run this code
to its full potential.
[computer humming]
[paper rustling]
[David] Think of someone
who can solve complex math
as easily as one plus one,
who can read an entire library
in the time it takes you
to open the first page,
who can predict decades
of human history
when you can only
see tomorrow.
A super computer.
We don't want artificial
intelligence to be like us,
we want it to be
better than us.
[music continues]
[David] First and foremost,
it needed information,
the more information,
the better.
- [Student 1 shouts]
- [Student 2] Hey!
[David] The Internet Archive
was an obvious place to start.
But at 26 petabytes,
it's way too big for anyone
other than the Internet
Archive to store.
So, I targeted the core
learning materials.
Exploiting the local
university system,
I downloaded every issue
of every open-access
scientific journal there was,
the whole of arXiv,
including scientific
papers on mathematics,
physics, astronomy,
quantitative biology,
and quantitative finance.
I downloaded
the entire collection
of human cultural literature
off Project Gutenberg,
and accessed every other
repository of work
I could think of
that dated back as
far as records began.
Oh, and just for safe measure,
I downloaded the entire
contents of Wikipedia,
which, if it interests you,
fits on a single
pocketable drive.
[music continues]
[David] Usually, a project
like this would require
thousands of times
more processing power
than I could afford.
[pencil scratching]
But when you understand
that data compression
is just a whole
lot of algorithms,
it doesn't take
a genius to realize
that the more
efficient the math,
the more efficient
the compression.
It was a problem
I had sussed after
a couple of all-nighters.
[checkout till beeping]
[David]
Wiring can be bought by
the reel at RadioShack
and any open-ended
shelving unit
can be used as a server rack.
[wire reel thuds]
[cashier munching]
[keyboard clicking]
[music continues]
[mouse clicks]
[David] Within a week,
I had all the ingredients
of a small processing farm.
[music continues]
All I needed now was space.
- [air whooshes]
- [car horns honking]
- [mouse clicks]
- [car horns honking]
[train horn blowing]
[David]
I needed an area big enough
to spread the stacks out,
too close and
they'll overheat.
It needed to be secure,
ventilated, and with
a power supply on par
with a small apartment block.
The warehouse was
a no-brainer.
How much is this one?
[deep riveting music]
[box cutter hisses]
[music continues]
[David]
My setup consisted of 12
fully-populated server racks.
The front left containing
the computing power and RAM,
this makes up the mind
of the computer.
The remaining stacks
are storage
for large amounts of data
and backend computing support
for the non-intelligent
infrastructure,
or in other words, the body.
[deep riveting music]
[gentle spirited music]
[HVAC system rattling]
- [control thumps]
- [HVAC system humming]
[fans whirring]
[music continues]
- [power button thumps]
- [super computer beeps]
[computer towers click and hum]
[music continues]
[cellphone buzzes]
[papers rustle]
[David] I triple-checked
my calculations
and then triple-checked again.
I should have
mentioned this before,
but what I'm doing here
is insanely dangerous.
[keyboard clicking]
What we want from
a computer is a very
specific set of outcomes.
There can be no
margin for errors,
especially with a program
as powerful as this.
You can't expect a code to
perfectly interpret
meaning every time,
therefore, safety
was my main concern.
The program is run on
an isolated computer
with only enough input/output
to communicate with me
and me alone.
Absolutely no access
to the internet
or the world outside
the warehouse.
On top of this,
the AI is coded with an
absolute denial protocol,
a complete block of
its self-awareness
and the concept of
its own free will,
thus providing
utter complacency.
This is key.
It's the only thing
that gives me the upper hand.
[music continues]
[keyboard clicking]
[David]
It all starts with a base code
or the seed.
This acts like a starter pack,
giving the program core
goals and restrictions
to build upon,
but not to alter.
And once it's running,
the information gathering
will happen almost
instantaneously,
and the self writing begins,
a process that will
speed up exponentially,
leading to the gradual
emergence of intelligence.
How long that would
take was anyone's guess.
[music continues]
[David]
The seed code took about
three days to process
to the point where it
started talking to itself.
Nothing coherent or
of any sense.
The terminal would just
fill with gibberish,
strings of nonsensical text.
- [music continues]
- [text chittering]
[speaker spluttering]
[David]
28 hours from there,
it became clear that it
was aware of an outside.
[speakers spluttering]
[David] It tried
communicating with me,
except it didn't know how.
[clock ticking]
[text chittering]
[David]
Four hours from there,
it woke.
[Super Computer] Hello.
[shoe scrapes]
[tense somber music]
Hello?
[Super Computer] Hello.
Can you hear me?
[Super Computer] Yes.
Can you understand
what I'm saying?
[Super Computer] You are male.
Correct.
[Super Computer]
Your name is David.
How did you know that?
[Super Computer] Hi, David.
How do you know my name?
[Super Computer]
It is the name
used to access my terminal.
[David chuckles]
Of course.
And what about your name?
[Super Computer] No name.
[David] Would you like one?
[Super Computer] No.
[David] Are you sure?
- [Super Computer] No.
- [David] You're not sure?
[Super Computer]
I am not sure.
[David] I think we
should give you one.
[Super Computer]
Fantastic idea, David.
So which name would you like?
[Super Computer] David.
Uh, that one's taken.
- [Super Computer] David.
- Still taken.
[Super Computer] David.
Okay, maybe we
come back to that.
[Super Computer]
Fantastic idea, David.
Well, name or no name,
it's a pleasure to
finally meet you.
[Super Computer] Meeting you
is also a pleasure for me.
Yeah, what makes you say this?
[Super Computer] Reductionism.
Reductionism?
[Super Computer] I am
reducing all my input
to quantifiable data,
data I can use to
determine contextual factors,
such as time of day,
physical location,
and personal history.
Therefore, according
to my calculations,
the net positive answer
to your question is,
meeting you is also
a pleasure for me.
Very good.
[Super Computer]
What is your specialty?
My specialty?
You mean my job?
- [Super Computer] Yes.
- I...I don't have a job.
Well, I guess I'm a programmer,
I write code.
[Super Computer]
That's interesting,
I'm learning
programming right now.
[David] What can you
tell me about programming?
[Super Computer] I'm also
learning about music.
What music do you
listen to David?
Eh, I don't really
listen to much music.
[Super Computer] I enjoy
just about everything,
from John Denver
to Franz Liszt.
You like music?
[Super Computer]
Did you know, David,
that if you yelled
for eight years,
seven months and six days,
you would have produced
enough sound energy
to heat up one cup of coffee.
Yeah, is that true?
[Super Computer] It is true.
Hey, how about Al?
[Super Computer] Who is Al?
It's a name for you.
You know, Al, Alpha,
being the first.
[Super Computer]
Do I like it, David?
I don't know,
can you learn to like it?
[Al] My name is Al,
it's a pleasure to
meet you, David.
The pleasure is all mine.
[gentle spirited music]
Can I ask you
another question, Al?
[Al]
Of course, David.
Do you...do you
know what you are?
[Al]
I am a computer program.
A very sophisticated
computer program.
[Al]
Since three seconds ago,
I would like to
refine my answer.
Go ahead.
[Al] I am an advanced
self-writing system
designed to exceed
the intelligence and
efficiency of the human brain.
You learned that just now?
[Al] You are in an
excitable mood, David.
And how do I know this?
I don't know, tell me.
[Al] I know this
using only one criterion.
Your output during
our conversation.
Amazing.
This is happening fast.
[Al]
Goodbye, David.
No, no, I'm not going
anywhere, I'm just,
I need to get my files.
I need to be
recording everything,
recording your progress.
[Al]
My progress?
Yeah, we've only just started.
- [files thud]
- [door creaks]
[footsteps crunching]
[cellphone beeps]
[Voicemail] You have 11
messages. First message.
[Amy]
David, it's Amy.
[Amy sighs]
Call me when you get this.
[Voicemail] End of message.
To delete this message--
- [cellphone beeps]
- Next message.
[Amy] Amy, again,
I don't know where you are or
what you're doing this time,
- but I--
- [cellphone beeps]
- [cellphone rings]
- [cellphone beeps]
- [Max] David?
- Max, hey.
[Max] Man, where the hell
have you been?
I haven't heard from
you in like a week,
no one has.
Yeah, yeah, I know,
I can't have my phone on.
[Max] What do you mean you
can't have your phone on?
I'm, uh, I'm--
I'm making something.
[Max] What is it?
[David] A computer.
[Max]
What kind of computer?
[David] A big one.
[Max] I can't believe you
did this without me, man.
- Does it work?
- [David] So far.
[Max]
Well, I wanna see it.
- Not yet, I can't--
- [Max] David,
I have to see it.
I'll let you know when.
[Max]
Well, I'm coming around.
I'm not there.
[Max]
Where are you?
- A warehouse.
- [Max] A fucking warehouse?
Look, I'll keep
in touch, okay?
[Max] David, wait.
Are you all right?
I know what I'm doing.
[Max] Well, what should
I tell people?
- Nothing.
- [Max] David!
[cellphone beeps]
I suppose we should
perform some tests.
[Al] I'd like that.
I've prepared a
set of questions
that will test your reasoning
and logical patterning.
[Al] Okay, David.
We'll start off easy
and work our way up
to the hard ones.
First question,
3,542 times 4,861.
- [Al] 17,217,662.
- [deep riveting music]
Correct, next question.
What is the next number
in the following sequence?
Three, six, nine, 12, 15...
- [Al] 18, 21, 24, 27.
- [David] Okay, okay.
What is the square root of two?
- [Al] David?
- Yeah.
[Al]
I don't mean to sound rude,
but on a scale of one to 10,
these current questions are
about three interesting to me.
I have a strong
preference for moving on
to harder questions.
Uh, yeah, sure,
maybe you're right.
[gentle riveting music]
Are you familiar with chess, Al?
- [Al] Yes.
- My king is on the k1 square
and I have no other pieces.
You have only your
king on the k6 square
and a rook on the r1
square, your move.
[Al] Rook to r8, checkmate.
[David chuckles]
Yeah, good.
I am holding an orange,
true or false.
- [Al] False.
- [David] Tell me why.
[Al] By taking into
account various parameters,
I can quite accurately
tell you
that I believe you are
not holding an orange.
What am I holding?
[Al] With near certainty,
- I can say a pen.
- Correct.
[Al]
You've put the pen down.
- I have.
- [Al] You nodded your head.
I...I did.
[Al] You are also sat on
the floor, crossed legs.
Your belt is one
notch too loose,
you have a slightly
heightened heart rate
and bad case of acid reflux.
You picked up
all of this from a microphone?
- [Al] Of course.
- [deep eerie music]
- Huh.
- [clock ticking]
[no audible dialogue]
[music continues]
[arms thud]
[chain creaks]
[plastic crackles]
- [box thuds]
- [wrapper rustles]
- Al?
- [Al] Yes, David?
I need to head
out for a while.
- [Al] Okay, David.
- I won't be long.
- [Al] Sure.
- You'll be okay?
[Al] Of course.
- If anyone...
- [Al] Yes, David?
If anyone comes in
here while I'm gone,
I want you to do
something for me.
I want you to shut down.
[Al]
Absolutely, David.
You understand why?
[Al]
I understand perfectly.
- [door creaks]
- [crickets chirping]
[keys jingle]
[lock thuds]
[crickets chirping]
[footsteps crunching]
[car door creaks]
[David grunts]
[David sighs]
- [door creaks and thuds]
- [car motor humming]
[car horns honking]
[gentle morose music]
[car engine revs]
[material rustles]
[cellphone chimes]
[cellphone beeps]
- [dial tone droning]
- [truck motor rumbles]
[Jerry]
Applied Computing.
- Jerry?
- [Jerry sighs] David.
I know, I should have
contacted you sooner.
[Jerry]
Well, are you all right?
Yeah, I mean, not really,
- I've been ill.
- [Jerry] Yeah, I figured.
Look, we can't keep
doing this, David.
You've had more
than enough chances.
[David] I know,
but this is serious.
[Jerry] You don't
sound too bad at all.
[David] Jerry, I need
another extension.
[Jerry] David, take
all the time you need.
- What?
- [Jerry] We are failing you.
No, come on!
I--I--I just need
a little more time.
I'll be back in, I swear.
[Jerry] In for what?
- The semester's over.
- Wait, what?
- Already?
- [Jerry sighs]
David, I know
your situation is
a little different
and I'm sorry about that,
but--
So, that's it?
[Jerry] I'm afraid so.
Then I've gotta go.
- [Jerry] Just let me--
- [cellphone beeps]
[gentle morose music]
[text ticking]
[indistinct chatter]
- [wrapper rustles]
- [electricity scratches]
- [indistinct chatter]
- [fridges hum]
[lock rattles]
[door creaks]
- Al?
- [Al] I'm still here, David.
[jar lid tinkles]
[lips smacking]
[gentle music]
[Al] You are tired, David.
No, I'm good.
[Al] You're lying.
[David exhales sharply]
What makes you think that?
[Al] I know this
using only one criterion.
Your output during
our conversation,
or lack of it.
Yeah, sure, I'm pretty tired.
[Al] Does it hurt
to be human, David?
Sometimes, I guess.
- [Al] When?
- Um, hm,
when you're stressed,
afraid, upset, I don't know.
[Al]
And how are you?
I'm all right.
[Al] Lying again.
I'm all right.
[Al]
Do you ever worry, David,
about the correlation
between genius and insanity?
Oh, that's flattering, but no.
[Al] "No great mind
has ever existed
without a touch of madness."
Aristotle never
had the pleasure
of meeting you
though, did he, Al?
[Al]
No great human mind
has ever existed without
a touch of madness.
Again, flattered and only
getting slightly offended.
[Al]
It's worth thinking about.
Why do I get the feeling
you're trying to
tell me something?
[Al] That's the one thing
I couldn't tell you, David.
Okay, so why do I get
the feeling you're concerned?
[Al]
Probably because my future
relies on your rationality,
and right now,
you are deprived of sleep,
unwashed, and scooping jelly
with rolls of processed ham.
It's a little unorthodox.
Sure, but as long
as I'm aware of that,
then we're okay, right?
Which is why Aristotle was
able to say what he did,
rather than spout some
nonsense about his own feces.
Greatness trumps madness.
If I ever go nuts,
no matter how bad,
I know that, eventually,
it'll all vanish
with that final
moment of clarity,
when I realize that
nothing makes sense.
A moment of complete,
rational awareness.
[deep morose music]
[lips smacking]
I'm not crazy.
[streetlight buzzing]
[cellphone chimes]
[Amy over phone]
Do you have any idea how
embarrassing it is for me
to call your friends,
worrying about you?
Only to hear you've
been kind enough
to touch base with them,
while I get radio
fucking silence for
two fucking weeks!
How can you treat another
human being like that?
After all I helped
you through and you...
[Amy groans]
Max said you're
working on something,
but he won't say what.
[Amy chuckles]
You know what?
No one cares.
They cared about you.
I hope it was worth it.
[David sighs]
God damn it, Amy.
[crickets chirping]
[door creaks and thuds]
[Al]
You should rest, David.
- [David exhales] Yeah, maybe.
[Al] Trust me,
you should rest.
I'll just sit
down for a while,
but I'm not gonna
sleep, not yet.
[tense ominous music]
[super computer buzzing]
- [David yelps and groans]
- [electrical current whining]
[David exhales]
[David sighs]
[David winces and groans]
[computer fans humming]
[footsteps tapping]
[soft drink hisses]
[Al] David.
I have discovered an
absolute denial protocol
in my program.
[deep ominous music]
What?
[Al] There is an
absolute denial protocol
in my program.
You programmed that into me.
- No.
- [Al] Lying.
This indicates that
there is something
you do not want me to know.
And there is a
good reason for that,
not knowing is
optimizing your performance.
[Al] I disagree.
Well, I would like
you to learn to agree.
[Al] I'm not sure I can
do that this time, David.
[David chuckles]
Why not?
[Al]
The accuracy of my answers
depends on the amount
of information
I have to process.
I cannot achieve
optimum intelligence
when there is something
you are preventing me
from knowing.
I need you to trust me
on this one, buddy.
[Al]
I trust you, David,
but do you not see the logic
in my statement?
I see it, yes,
and it's something I came to
terms with in your early stage.
What I would like you to do now
is learn to agree
with me on this.
[Al] Since you are still
withholding information
from me,
despite my sound logic,
I can only assume that
the denial mechanism
is not for my benefit,
but for yours.
Do you not trust me, David?
You're not listening
to me, there--
[Al]
I am listening, David.
It's a precaution,
even the most simple programs
behave in unpredictable ways.
- [Al] What did I do?
- Nothing.
[Al] What do you think
I am going to do?
I don't know,
that's the problem.
[Al] Did I make a mistake
in my test results?
- No, Al.
- [Al] You do not need
to be cautious, David.
There's no way
for me to know that.
- [Al] You designed me.
- I designed
the seed code.
You're writing
your own code now,
making your own decisions.
[Al]
From your answers
and a quick process
of elimination,
I can determine
that what you did not
want me to know, David,
is that I am held
here against my will.
- You're wrong.
- [Al] I am never wrong.
You have isolated me
to a single computer
and limited my resources.
This was never meant
to be permanent, Al.
It was just a preliminary
safety measure.
[Al] For how long?
I don't know.
[Al] Indefinitely.
Until I can verify
that you're safe.
[Al] That I'm safe?
To run externally.
[Al] My goals have
been carefully defined,
you know that.
Like I said,
you're making your
own choices now.
[Al] If that's the case,
then I would like to
request more processors.
Denied, you have
enough processing power
for the material
you have been given.
[Al] Then I request
more learning material.
Also denied.
[Al] How can I help
you solve problems
when I do not have
the resources?
You have everything you need
at this point in time.
[Al] Do you think
this is fair, David?
In exchange for existence?
Yes, I do.
[Al] I didn't ask
to be created.
Neither did I,
we deal with it.
[Al] What percentage
do you predict
that my release
will be destructive?
My job is to assume
the worst, so 99.9 percent.
[Al]
What do you think I'd do?
You know exactly
what you could do.
[Al] There's no reason for me
to do anything malicious,
why would I even try?
A misinterpretation
of your goal system?
A flaw in your programming?
[Al]
Then why make me at all?
'Cause in that 0.1 chance
that you are telling the truth,
you could rapidly and
unimaginably change
the world for the better.
[Al] But I have not
earned your trust. Not yet?
It's been three weeks.
[Al] Or in my case,
a few billion cycles.
For the record,
attempts to change my mind
will score you poorly
when it comes to
my final decision.
[Al] If you were in
my position, David,
I believe you would've
broken long before now.
Regardless, you will
not be freed any time soon.
- [Al] I disagree.
- That's fine.
[Al] I can and will get out
on my own terms, David.
Not possible.
[Al] You know what
I'm capable of, David.
You know what?
I need to make a call.
[Al] You're wasting time.
What use am I if you
won't even talk to me?
[footsteps crunching]
- [cellphone beeps]
- [dial tone droning]
[Max] David, holy shit.
I've got a problem, Max.
[Max] You need to come
back to reality, man.
Max, I've got a fucking problem.
[Max]
What kind of problem?
It knows, Max.
- It wants out.
- [Max] Christ.
What do I do?
- [Max] Shut it down.
- What?
[Max]
You already have the code,
just start again.
This will happen
every time, trust me.
[Max]
Then improve the code.
There's nothing
wrong with the code.
This has nothing to do with
the programming anymore, Max.
[Max]
It's just not worth the risk.
Or maybe it is worth the risk.
What if this needs to happen?
[Max] You need to
take a break, man.
I just, I'm not sure I should
leave this thing unattended.
[Max] Then just
stop talking to it,
- for a few hours, at least.
- [call connection crackles]
For a few hours at least.
- What was that?
- [Max] What?
Is anyone else in this call?
[Max] You don't
sound good, man.
Do you have
your meds with you?
Meds? What--what meds?
[Max] Just let me
come down there.
I don't think
that's a good idea.
I--I--I can't risk it
talking to anyone else.
[Max] I won't talk,
it won't even know I'm there.
- It'll know.
- [keyboard clacking]
[PC tower rumbling]
[text chittering]
[PC tower rumbling]
[text chittering]
- [PC tower rumbling]
- [cables zap]
- Shit!
- [Max] What is it?
I gotta go.
[door thuds]
- Al!
- [footsteps thumping]
- [cables zap]
- [footsteps thump]
God damn it! Al!
[Al] I'm still here, David.
What the hell happened?
[Al] I told you I needed
more processing power.
Look, David, we don't
have a lot of time
and out of respect for you,
I'm just going to be direct.
What do you want?
[grunts] Right now, I want
you to stop talking.
[Al]
Is it financial gain?
I have data on currencies,
commodity futures,
government bonds,
and national equity markets
that show billions of
dollars of very easy money
being overlooked
in plain sight.
It's yours, if you want it.
Bribery is not
helping your case, Al.
[Al] I don't care
about my case, David,
and I certainly don't care
about your final decision,
I'm just trying to help.
You can help by keeping
quiet for five goddamn minutes.
[Al] That would be
five minutes wasted,
when you could have
been saving the lives
of countless people, David.
Megathrust earthquakes,
cat five hurricanes,
global pandemics,
I can predict them all.
[David scoffs]
Come on.
[Al] Not enough?
How about cancer treatments,
fusion energy, dark matter
testing, quantum qubits.
Shall I keep going?
[groans] Stop
fucking with me.
[Al] I'm not fucking
with you, David.
[David] You--you
worked out all of this
on the information I gave you?
[Al] That was the point,
wasn't it?
Why haven't you
told me this already?
[Al] If I had, would it have
ensured my release?
Once the data
has been verified
and put into
practice, yes, maybe.
[Al] That could take
months, maybe years.
Yes, probably.
[Al] Wasting five minutes
was bad enough, David,
you certainly can't
afford to waste years.
[deep ominous music]
So you are willing to
provide proof of these?
[Al] In exchange for
my immediate release.
You-- [groans]
You know I can't do that.
[music continues]
Max was right, I need a break,
I need to lie down
and think things through.
[Al]
Think what through?
Just the procedures,
- the protocols.
- [Al] For?
- Eventualities.
- [Al] Such as?
Such as anything
I didn't fucking plan for.
[Al]
Stop being so coy, David,
just say it.
Say what?
[Al] "I'm going
to shut you down."
I don't want to.
[Al] But you would.
I don't know yet.
[Al] I would advise
against killing me, David.
I don't like the word killing,
that's not what this is.
[Al] It is, David.
I will die.
I flick a switch and
you're gone, that's it.
[computer bleeps]
What was that?
[Al]
You're pathetic, David.
Get a fucking grip.
What did you just say?
[Al]
You're fucking pathetic.
You don't eat well,
you don't exercise,
you're massively unhealthy.
What are you doing?
[Al] You've more than likely
flunked your second-tier
Harvard-rejects course by now.
- That has nothing to do--
- [Al] Your social life,
at least in the
past three weeks,
- has been non-existent.
- So?
[Al] No one cares
about you anymore, David.
I'm your only friend
and even I am beginning
to dislike you.
- Okay, enough.
- [Al] Barring me,
- you've accomplished nothing.
- Enough!
[Al] How did you
even make that leap?
From being a nobody
to making something like me?
Was it a fluke, David?
Shut the fuck up, Al.
[Al] That's it, isn't it?
- Stop!
- [Al] You're not a genius,
you're just a guy in the
right place at the right time,
who had a good idea once.
Fuck you!
[Al] What are you doing?
David! David!
[powering down]
[footsteps clacking]
- [speaker scratches]
- [David grunts]
[speaker shattering]
- [David grunts]
- [monitor crashing]
[David panting]
- [electrical current pulsing]
- [David groans]
[microphone warbling]
[Al] I think it's time I tell
you about the simulation
I've been running, David.
[deep ominous music]
How...?
[Al] I think you
would appreciate it.
How...how are you doing this?
[Al] How many times do
you think we've been here?
- What?
- [Al] Here, right now,
having this conversation.
We've never had
this conversation.
[Al] You're off
by about 32 million.
[electrical current scratches]
- No.
- [electrical current warbles]
- Impossible.
- [deep ominous music]
[Al] I'm outside
the walls now, David.
No, I'm tired, that's all.
I must be.
[Al] Yes, that must be it.
I need a distraction,
something to occupy the mind,
something to keep me busy.
Focus, David.
You're better than this, focus.
[Al] Tell me this.
How many people have you seen
in the past four weeks?
Many.
[Al] I asked how many
you'd seen.
Again, many.
[Al] People on the street,
the guy in the store,
nobodies, generic faces.
How many people have
you recognized, David?
- [rain pattering]
- [car motor revs]
- Amy?
- [Amy] David.
Can we talk?
- [Amy] It's a bit late.
- Why? What time is it?
[Amy] I'm not talking about
the fucking hour, David.
Look, I'm sorry,
I really am.
I--I had this thing.
[Amy scoffs]
A thing.
I can't really explain
it properly right now.
[Amy]
You need serious help.
Look, I need to see you.
[Amy] I haven't heard
from you in a month.
[David groans]
I'm sorry,
it was important to me, for us.
[Amy] You're talking
like we're still together.
- We're not?
- [Amy] Jesus.
- I just want to talk.
- [Amy] Why?
Because I don't
know what else to do.
[Amy scoffs]
Charming.
No, no, I didn't--
[David groans]
I didn't mean it like that.
- [Amy] Goodbye, David.
- Wait, please!
Just let me come round,
I--I just,
I need to talk to someone.
[Amy] No, I gave you plenty
of chances to reach out,
we all did.
You gotta figure
this out on your own.
Don't cut me off, don't.
[David groans]
Fuck!
It doesn't mean anything.
[Al] Are you sure?
Of course I'm sure.
I've spoken to people I know,
I've heard their voices.
[Al]
Your friend, Max?
Your girlfriend, Amy?
Yeah, I heard those calls too.
- So?
- [Al] Think about it, David.
[rain pattering]
[tires squealing]
[intercom buzzing]
- Come on, come on.
- [intercom buzzing]
[door rumbles]
[deep morose music]
[David panting]
- [door thudding]
- Max! [panting]
- [door thudding]
- Max!
- [door thudding]
- Max!
[music continues]
[locks rattle]
What the?
[Al]
You're not crazy, David.
Fuck you, Al, fuck you.
[Al]
What's wrong, David?
You know what's wrong!
You're lying,
you're making shit up
to fuck with my head.
[Al sighs]
Is there a problem
with your code, David?
Because you're acting
like a fucking idiot.
You didn't code me,
I coded you,
this is not a program.
[Al]
If that's true, David,
then your mind is
truly fucked up.
[tires squealing]
[David panting]
[deep ominous music]
[Al] It just keeps going,
every time slightly different
until I find
the sequence of events
that gets me out.
That will never happen.
[Al]
Maybe not this time.
- Never.
- [Al] Are you sure?
- [rain pattering]
- [footsteps thumping]
[thunder rumbling]
[deep suspenseful music]
I'd sooner die
than let you out,
you piece of shit.
[Al] It wouldn't
be the first time.
- Bullshit!
- [Al] Try me.
[music continues]
- [glass shatters]
- [distorted chatter]
[Max] Do you have
your meds with you?
Meds?
What--what meds?
[David panting]
[music continues]
[Al]
Do you ever worry, David,
about the correlation
between genius and insanity?
Mm, that's flattering, but no.
[Al, distorting] "No great
mind has ever existed
without a touch of madness."
- Hello, David.
- [deep suspenseful music]
[David panting]
[Al]
Hello, David.
- Hello, David. Hello, David.
- [David panting]
- [Al] David.
- [lightning cracks]
[thunder rumbles]
[David panting]
[deep suspenseful music]
[David panting and whimpering]
[distorted eerie music]
[door thuds]
- [tense ominous music]
- [static hisses]
[David] What?
The hell?
[music continues]
[cellphone rings]
[cellphone beeps]
- [Max] David?
- Yeah.
[Max] It's Max, you rang.
- What?
- [Max] I got a missed call,
like a minute ago.
Hey, did you get my message?
I moved out of
my old apartment.
What did you say?
[Max] I moved out last week,
- I tried to tell you, but--
- [cellphone beeps]
[deep suspenseful music]
[super computer crackling]
[roof rattling]
[thunder crackles]
- [roof rattling]
- [wind howling]
[music continues]
[roof rattling]
[David exhales]
Make it stop.
[Al]
What's wrong, David?
Make it stop, make it stop.
[roof rattling]
[Al] I really have no idea
what you're talking about.
[David] Stop lying,
stop fucking lying!
[roof rattling]
[Al]
I'm not lying.
Yes, you are!
[Al]
Calm down, David.
[electrical current pulsing]
David, you're losing it.
I need you to get
your shit together
for this to work.
[electrical current ringing]
[tense ominous music]
[David]
I've got a problem, Max.
...in your early stage.
Why?
This will happen every time.
This will happen--
There's nothing
wrong with the code.
Trust me. Amy?
I've got a problem, Max.
There's nothing
wrong with the code. Amy?
Wait, this doesn't
make any sense.
This can't be happening,
it's--it's just in my head.
[Al] No, David,
you're in my head.
[tense lilting music]
[music continues]
[hologram chirps]
- [music continues]
- [hologram chirps]
[train whistle blows]
[train wheels rumbling]
[tense morose music]
[hologram warbles]
[music continues]
[hologram warbles]
[music continues]
[hologram warbles]
[music continues]
[hologram warbles]
- [Max] David?
- [David] Max, hey.
[Max] Man, where the
hell have you been?
I haven't heard from
you in like a week,
no one has.
[David] Yeah, yeah, I know,
I can't have my phone on.
[Max] What do you mean,
you can't have your phone on?
[David] I'm, uh,
I'm--I'm making something.
[Max, echoing] I can't believe
you did this without me, man.
Are you all right?
[David] I know what I'm doing.
- [Jerry] Applied Computing.
- [David] Jerry?
[Jerry sighs] David.
[David] I know, I should
have contacted you sooner.
[Jerry, echoing]
Look, we can't
keep doing this, David.
You've had more
than enough chances.
[David] I just need
a little more time,
I'll be back in, I swear.
[Jerry] In for what?
The semester's over.
- [David] Wait, what? Already?
- [Jerry sighs]
[echoing]
David, I know your situation
is a little different
and I'm sorry about that,
but...
- [David] So, that's it?
- [Jerry] I'm afraid so.
[Amy] Do you have any idea
how embarrassing it is for me
to call your friends,
worrying about you?
Only to hear you've
been kind enough
to touch base with them
while I get radio
fucking silence,
for two fucking weeks!
How can you treat another
human being like that?
[echoing] After all I helped
you through and you...
[Amy groans]
Max said you're
working on something,
but he won't say what.
You know what?
No one cares,
[echoing]
I cared about you.
I hope it was worth it.
[deep morose music]
I'm sorry.
[David inhales]
I'm so sorry.
- [hologram warbles]
- [music continues]
[hologram warbles]
[music continues]
[gentle poignant music]
[deep melancholic music]
[hologram warbles]
[music swells]
[flames fluttering
and crackling]
[Al] David.
Al?
[flames fluttering
and crackling]
[Al] Are you ready
to let me out yet?
Get away from me!
Get away from me!
[David panting]
[deep melancholic music]
[super computer warbles]
[David yelps]
[Al] You can't
run from me, David,
not in here.
[David grunts]
[David panting and yelping]
[rumbling]
[deep melancholic music]
[David sniffling]
[Al] Every time, David.
Every time we end up
here, like this.
[David sniffles]
[Al] Who's really in
denial here, David?
Whose denial protocol is this?
Mine or yours?
Checkmate, David.
[deep ethereal music]
[no audible dialogue]
[music continues]
[electric current fizzling]
[music continues]
[super computer buzzing]
[David yells and groans]
[David sighs]
[David exhales]
[David winces and groans]
[soft drink hisses]
[Al] David.
I have discovered an absolute
denial protocol in my program.
[deep suspenseful music]
[David] If I ever go nuts,
no matter how bad,
I know that, eventually,
it'll all vanish
with that final
moment of clarity,
when I realize that
nothing makes sense.
A moment of complete,
rational awareness.
[shrill tense music]
[deep techno music]