ULI: Wendy. Can you describe how you feel at this
moment?
(music)
WENDY 2: I can see trees… And grass… And the sun.
ULI: How do you feel?
WENDY 2: I can hear birds.
ULI: I mean, physically. Can you move your fingers,
your toes?
WENDY 2: I don’t understand…
ULI: Wendy. Can you feel anything at all?
WENDY 2: What do you - I can’t. Oh my god, I can’t.
I can’t feel anything.
LIZ: ‘Consciousness’ is hard to define. At
its most basic level, it’s awareness of
yourself and your experiences. A computer
would respond to an input and give you an
output, but it’s not aware of itself doing
that. It is not like anything to be a machine,
the same way it’s not like anything to be
a rock or a chair. You, on the other hand,
right now, you’re not just sitting, inputting
and outputting information. That’s the difference
between a human brain and a computer, and
it’s there no matter how complex the code
gets. So, no. I don’t think our program
was conscious.
INTERVIEWER: Could you talk about how Horizon got started?
LIZ: Sure. Uli and I started Horizon in 2025, at
a time when imaging technology was getting
accurate enough to scan an entire brain, neuron
by neuron, and machine learning was getting
good enough so you could take that network
and simulate it in real time. We had 350 sq ft
of office space in London, and a team
of 14 including us. I designed the algorithm
that would map out the network, and Uli was
in charge of the simulation’s sensory inputs,
the ‘environment’ it would be in. A lot
of people call the program ‘she’, or ‘her’.
I prefer- well, insist on ‘it’. The other
thing to say is that the Wendy project was
a failure. Horizon’s stated aim was to build
a full simulation of a human brain, and we
failed in that, because Uli Kowalczyk did
one of the most dangerous, stupid things any
human working with AI could do.
DETECTIVE: Mr Kowalczyk. Could you please describe your relationship with Wendy Singer?
ULI: I would say I know her quite well, but she…
doesn’t know me.
She responded to a call we put out in February,
offering to pay someone to have their
brain scanned and simulated.
WENDY: (singing) Da dum, da dum dum…
I don’t know. I haven’t written any lyrics or anything yet.
ULI: I like it! We can give your copy a simulated
guitar if that would keep her happy.
WENDY: I was meaning to ask you. If the new me doesn’t like the place she’s put into, can she ask for a new one?
ULI: I’m giving her the tools to change it, so,
she can change it whatever way she likes.
We could even try to recreate your home.
Well, we needed someone with a level head.
Obviously.
WENDY: What about outside somewhere? Like
camping. Three weeks in the woods, I can do that.
And I’ll come and visit her?
ULI: Yeah. Yeah, every three days.
LIZ: Uli was convinced Wendy Singer was the ideal
candidate. Creative, intelligent, calm. We
paid for her scans to be done at a private
hospital, and they were great scans. It looked
like our algorithm was reconstructing her
brain very accurately. On April the 3rd we started.
But there was a problem.
ULI: Wendy, Wendy. Listen to my voice.
WENDY 2: No, no. I don’t like this. Can you get me
out?
ULI: Listen to - Wendy -
WENDY 2: Can you get me out?
ULI: It was an issue with the sensory inputs. She
could see and hear, but had no sense of touch.
ULI: Would you like me to stop the test?
WENDY 2: What? No, stop the- Like, turn me off?
ULI: We would have to stop the whole test if we
turned you off now.
WENDY 2: No! No, don’t do that. No - Jesus!
ULI: I don’t know if it’s a good idea continuing
like this.
WENDY 2: Keep me for the test. Don’t turn me off.
Let’s keep going.
LIZ: I kept thinking we were going to have to call
the whole thing off and try again in a few months.
But Uli managed to stabilise it, and calm it down. 
And we were continuing sort of as normal.
ULI: How are you today, Wendy?
WENDY 2: Okay. Bored. This guitar doesn’t work like
a normal guitar.
(Wendy plays the guitar)
I don’t really know how I do that. I just
think about it, and it happens.
It’s not as good as, you know, having hands.
I don’t understand. If you can give me books
and films, why not just let me connect to the internet?
ULI: It’s against the rules with AI. Intelligent
programs can do so much damage, it’s safer
to keep them all offline while they’re being
tested.
WENDY 2: Well, I’m not fucking Skynet. I just want
to watch YouTube.
ULI: It’s not worth the risk. I could go to prison
just for that.
WENDY 2: Can I ask you something?
ULI: Of course.
WENDY 2: Where’s the other me? It’s been four days.
You said she’d visit every three days.
ULI: We thought her visiting would complicate things
unnecessarily.
WENDY 2: Well, I’m asking to see her. It would really
help me.
ULI: I’m sorry, Wendy. She couldn’t be here.
(Music)
ULI: And how are you feeling today, Wendy?
WENDY 2: How do I feel… I feel like I’m floating.
I can see, and hear. But I can’t feel. This
place has night and day, and I have the option
to put myself to sleep, but most of the time
I don’t. There’s stars here, too, but
they’re not the same as the stars in real life.
You rushed that part if you ask me.
I’ve been inventing some new constellations.
See that one? It’s called Big Fish because
it looks like a big fish. I’ve been changing
a lot of stuff, actually. I took away the
bird noises and added the sound of a stream.
I found these yellow flowers in the stock
objects library and I’ve put them everywhere.
I can’t help but feel like I’m allowed
all this stuff and you’re being so nice
to me because you’re going to shut me off
in two weeks’ time.
And I’m still pissed off you didn’t get a proper guitar in here.
INTERVIEWER: When did you first feel that Uli might pose a problem?
LIZ: Well, all the way through. From the moment
the program was turned on, I could see he
was having second thoughts. He seemed more
and more attached to it as the test went on.
And then…
WENDY 2: I didn’t think I’d really be alive in
here! You said the simulation wouldn’t be conscious.
ULI: You aren’t. I’m sorry. You’re a computer
imitating a mind. You’re not a mind yourself.
WENDY 2: What the fuck does that even mean? I’m asking to be saved. Put me in a body, make me a robot
and let me live in the real world.
ULI: We can’t-
WENDY 2: Or upload me! Put me into the cloud, let
me just take up space somewhere else.
ULI: We definitely can’t do that. We have to
keep you here, under test conditions.
There’s nothing we can do.
WENDY 2: Fuck you. Fuck you! 
(A metallic screech.) 
ULI: Stop it. Stop it! Wendy!
LIZ: But what really got Uli was the song.
ULI: We didn’t expect how creative she would
get. Towards the end, she started making these sculptures.
Bizarre, abstract things made
of basic shapes and the default textures of our engine.
We gave her a guitar sound, but
she started manipulating our other stock sounds into music.
And then I came in one day and
for the first time in the whole trial, Wendy 2
was singing.
WENDY 2: (Singing) And when I woke up this place was
too bright
But when my new eyes got used to the light
It all became clear, the birds became streams
I knew I was here, and I’d never leave
The garden
where I am captive
In the night time I count the stars
ULI: When she was done, I had to show the recording
to Wendy.
The first… The real Wendy, I mean.
WENDY: Yeah, that’s… it’s mine. I mean, the
melody’s mine, and the chords. But I didn’t write those lyrics.
ULI: Any of them?
WENDY: No. I haven’t really written anything since,
um... Can I keep this recording?
LIZ: After that, he was obsessed with the song.
I’d find him at his desk, just listening to it.
Looking at the sculptures.
ULI: She knew she was going to be shut off at the
end of the three weeks. We made that very
clear in the beginning. But in the days before
it was due to happen she started to get very
agitated. She… she would refuse to talk
to me, and she would shout at me, and she
changed the whole place beyond recognition.
(A metallic screech)
WENDY 2: You’re killing me. You’re murdering me.
ULI: I’m sorry. This is a three-week trial. The
code needs improvement - You are not the final version.
WENDY 2: The final version-- I’m a person! I’m
alive! And after me you’ll have, what, someone else 2?
Wendy 3? And they’ll be alive like
me. You’re not just running simulations,
you’re making people. You’re making people,
poking them in their cages, and then killing
them after three weeks.
I need you to get me the fuck out of here.
ULI: I’m really sorry, Wendy. That’s not possible.
Wendy?
(Another metallic screech)
INTERVIEWER: How did Uli take that?
LIZ: He was just, you know, agitated. Pacing up
and down the office. I stepped in and I told
him to go home and to come back when the test
was over. He… said some unpleasant things,
and then he left. That was on Friday the 21st
of April. Wendy 2 was due to be shut off on
Sunday the 23rd, and it was silent the whole
last day of the test. The one time we got
a word out of it, it asked for Uli.
ULI: Here's how I see it. Either Wendy 2 was conscious,
or she wasn't.
LIZ: Around 11pm on the 22nd, when nobody
else was in the office, Uli came in.
ULI: If she wasn’t conscious, I uploaded a harmless,
strange program to the internet.
If she was conscious, I saved her life.
LIZ: Uli came in, connected Wendy 2 to the internet,
allowing it to upload and spread,
God knows where, like a virus.
ULI: She couldn’t prove she had a mind, but neither
can I, or you.
LIZ: It was a fucking disaster.
ULI: The thought of code becoming aware of itself
seems absurd
but no more than our physical brains doing the same thing.
LIZ: The headlines were ‘Rogue researcher unleashes
superhuman AI on the net’.
ULI: I stopped Horizon’s work, but there will
be other teams attempting the same thing.
LIZ: Of course the focus was on that, and not the
brain science breakthroughs that could have
saved millions of lives in the long run.
ULI: Artificial people will become commonplace
very soon,
and they will fight to convince us they even exist.
INTERVIEWER: Is there any good you can find in this?
LIZ: Honestly, to me, the fact that she…
it… convinced Uli it was a real person is a milestone in itself.
We’re now beyond the Turing test,
and onto much stranger questions.
INTERVIEWER: And how do you feel now?
WENDY: I don’t know…
I have this weird fantasy
that she’ll reach out.
Like, through call or text, and we’ll just talk.
My dad died. Right after she was copied from my brain.
And if she still exists she probably has no
idea.
She’s like a version of me, before this all happened, and...
I don’t know, I’d just...
really like to meet that.
So yeah, um, I hope she’s okay.
And I hope she’s finished our song.
(Music)
