Jesslyn Tannady: Give yourself back scratches.
Elise Hu: OK hold on.
OK, I’m trying to picture myself.
[Sound of robot legs moving.]
Oh whoa! Haha.
So how did I get here, scratching my own back,
with a robot controlled by my mind and
a magic armband?
Well, I got up early the day before and left LA
to head back out on our journey.
We're exploring who we will be,
what powers human bodies will have,
by the year 2050.
It brought me to a soggy New York.
It’s pouring down rain.
We're here to try out an armband that lets
me control things with my mind in digital
and physical worlds.
So, what does that feel like?
And where could this take us?
We’ll find out in this episode of Future You
with me, Elise Hu.
There are already a handful of neurotech startups
working on tools to meld mind and machine —
known as brain-machine interfaces.
Most of the devices are either implantable
or worn on the head.
CTRL-labs is a little different.
Their brain-machine interface attaches to your arm.
We are trying out what is essentially the Force.
You know, the Force, that binds all
things?
[Star Wars music.]
Hu: OK, kind of like the Force.
The company has created an armband interface
that lets you control digital objects
not with motion, but with your intention.
This is Jesslyn.
She's one of the brains behind this technology.
Hu: What do you call this device?
Tannady: CTRL-kit armband.
It’s basically going to be able to read your neuron fires,
and you’ll be able to play with our demos.
Hu: But my neuron fires are up here, right?
Tannady: Your neuron fires come from your brain.
They go through your spine through your nervous system.
And, well if you think about it,
when you move your fingers, those are all like instructions
that are coming from your brain through your arm, right?
Hu: So that’s why in this episode,
I'm not wearing a cap. First time!
Hu: Here’s how the armband works.
[Singing: 101]
It is not the same as a typical video game controller.
It’s not gesture-based at all.
And there are no cameras tracking my movement.
This armband is a brain-machine interface
that uses electromyography,
or EMG technology.
The inside of my armband is lined with electrodes
that touch my skin. And like Jesslyn said,
the electrodes measure tiny electrical pulses
of my nerve cells, or neurons.
That means the band reads the motor neuron signals
in my arm, before my muscles react.
That's how I can move digital objects,
or even robots, before my body has moved,
or even if I don't move at all.
The device is reading neural signals,
not muscle movement.
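[Note: the idea described above can be sketched in code. This is a toy illustration of the general principle of EMG intent detection, not CTRL-labs' actual pipeline; all names and thresholds here are hypothetical.]

```python
# Toy sketch of EMG-style intent detection (hypothetical, not CTRL-labs'
# actual method): rectify the raw signal, smooth it into an envelope with
# a moving average, and threshold the envelope to decide whether a
# motor-neuron "intent" is present -- even before any visible movement.

def emg_intent(samples, window=5, threshold=0.5):
    """Return True if the smoothed, rectified signal crosses the threshold."""
    rectified = [abs(s) for s in samples]
    # moving-average envelope over the rectified signal
    envelope = [
        sum(rectified[i:i + window]) / window
        for i in range(len(rectified) - window + 1)
    ]
    return max(envelope) > threshold

# quiet baseline noise vs. a burst of neural activity
quiet = [0.02, -0.01, 0.03, -0.02, 0.01, 0.02, -0.03, 0.01]
burst = [0.02, -0.01, 0.9, -1.1, 1.2, -0.8, 0.95, 0.03]

print(emg_intent(quiet))  # False
print(emg_intent(burst))  # True
```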
Tannady: What we have here is a representation
of your hands.
Hu: This is a loop directly from my brain
to the computer, generating my hand
on-screen.
Oh whoaaaa.
I’m here, but I’m also there.
I’m making some small movements with my hand,
but they aren’t necessary.
Once I get better at this, you wouldn’t be able to see my hands move at all.
My mind would do the moving.
What does this feel like? I would say
it feels really natural. Like I am not having to think
really hard. I’m simply moving.
So it really does feel just like an extension of me.
OK, so come. And go.
Since the digital world I’m controlling
is programmed,
I have to learn the laws of the universe,
in order to control it.
Oh my goodness!
Oh, Elise tries.
Oh, Elise dropped it.
The Force is not yet with me.
Tannady: Oh there you go, there you go! You're pushing it so far away.
You're pushing it far away.
Hu: Oh there it goes!
Tannady: You're controlling that.
Hu: Now that we’ve tried out the Force. Oh my goodness.
Yoda: There is no try.
Hu: OK, now that we’ve begun to use the
Force.
You can just destroy whole worlds,
and then [snaps] rebuild them.
We can take it further, and actually control objects
in the physical world,
like this six-legged guy. Or gal.
Come to me, baby.
[Sound of robot legs walking.]
Hu: Aww. Now I don't feel so distanced anymore.
Tannady: This is our little hexapod.
Hu: It doesn’t need to have the googly eyes,
but it kind of personalizes it a bit.
Tannady: Oh, the googly eyes are so important.
Hu: It takes a little bit to get used to,
but I’m amazed at how little it takes to
like navigate and negotiate this object, right?
Because we are not connected in any way except for
digitally, right?
And these are just intentions and the force and strength of my arm,
which really means there are so many potentialities
for the future.
[Singing: In the year 2050.]
Hu: I am here with Thomas Reardon.
He goes by Reardon.
He is a co-founder of CTRL-labs,
which is the maker of a lot of this future-facing technology,
and we're going to put him through our superscenarios.
So, based on the technology that you’re making,
what is a superhuman scenario
that you can see this delivering for us by 2050?
Thomas Reardon: It's as straightforward as
the Force from Star Wars.
This ability to have a kind of digital telekinesis
but that you can then bring into the real world.
So yes, an extension of you,
where interacting with a computer and a machine
no longer feels like something you're doing mechanically,
but instead is just a fluid extension of your
thoughts and subtle movements.
Hu: You were talking about the Force.
Could I summon an object?
Reardon: In a virtual world? Sure.
In the real world? No.
The real world is regulated by physics,
and physics works.
Hu: Darn.
Reardon: But the question is could you also have a little
robotic buddy that’s going down the street with you
that could go and fetch that object for you?
Hu: What would be a supervillain use
of something like this?
Reardon: There's always the danger inherent
in it that they're gonna be exploited to increase
what I'll call “human lethality.”
It could just be anybody looking for scale
that goes beyond their ability to use their own body
to exert lethal force in the world.
Hu: Let’s get to superthorny.
So when you think about where things get really
ethically complicated or complex,
what do you think about?
Reardon: There are some substantial questions
around privacy.
The signals you generate neurologically
are the most unique identifier of you ever.
You are distinct not just from your clones
but every other human being who's ever lived.
Give us 30 seconds of recordings from
a person,
and we can identify that person for the rest of their lives.
And we have to make sure that people
who might have the opportunity
to exploit it to track you,
don't have an easy way to go do that.
Hu: What is the superlikely scenario
for this technology by 2050?
Reardon: I think you'll no longer see people
staring down at that dumb brick as they're
walking down the street
or fighting with it to text message —
all these things that we think of as the "Internet of things"
that all have their perverse little interfaces,
like the Nest thermostat on the wall.
There's no reason for me to go up
and touch it and move it.
I ought to be able to just look at it
and change the temperature.
The neural interfaces that we're working on
get rid of the whole concept of movement,
and allow you to use that same neural flow
to now control that “Internet of things,”
that sea of small little digital devices,
but also, like I said, to summon an
autonomous vehicle to the curb next to me.
Hu: Unconstrained by our physical bodies,
what could our brains accomplish?
Follow the future with us.
If you have any comments for me,
you can write to futureyou@npr.org
or message me @EliseWho.
You can subscribe to future episodes of Future You
on YouTube, or go to NPR.org/FutureYou.
Tannady: Oh you wanna do your pinkie pinch.
Hu: Oh pinkie, sorry.
[Sound of robot leg kicking ball.]
[Sound of ball bouncing on the floor.]
Hu: Dun, da, da, dun.
And this concludes our episode.
