- Take a look at this clip.
Just as the Tesla Model 3 starts to go
through the intersection,
bam, right there.
A car runs the red
light, clips another car,
and is sliding directly towards the Tesla,
but what happens?
The Tesla stopped automatically,
without the driver giving any input,
and it did so because of the
automatic emergency braking
in conjunction with the autopilot cameras,
which are on the side of the car.
Now there's some debate
about whether or not
this was actually the Tesla autopilot
and emergency braking system
or the driver themselves.
However, if you pay attention,
most of the other cars kept driving,
even after the accident occurred,
but the Tesla stopped immediately,
almost instinctively as
the accident happened.
Fast forward a little bit.
The car slides through the intersection,
slams into the truck, and flips it.
That truck isn't having a good day,
but this Tesla driver is,
thanks to the combination
of the safety systems
of autopilot and the
automatic emergency braking.
So the question is, is
autopilot really safe?
Thankfully, Tesla's given
us a lot of data here
to look at, and we ought to
be able to tease that out,
looking at the actual data
combined with evidence
like you saw just now.
Let's go.
(upbeat music)
Before we get into the
data about autopilot,
I thought it made sense
just to take a step back
and think about new technology,
because it's always rife with issues
and tons of opposition.
Radio would rot your brain if
you listened to it too much,
TV would ruin your eyes,
the internet would make us all crazy,
and our cell phones would spy
on us 24/7 if we let them.
Okay, those last two are
actually kind of true,
but you get the point.
In order for us to advance as a species,
it means we need to
adopt new technologies,
but we don't need to blindly trust them.
A perfect example of
this is when we switched
from mechanical switches in automobiles
for acceleration and deceleration
to computerized ones back in the mid-'80s.
Do you remember that Toyota scandal
where the cars were
accelerating out of nowhere,
sudden acceleration was
the term that people
were throwing around, and
it was causing accidents?
Well, this problem actually
kinda began in the mid-'80s,
when we switched from
those mechanical switches
for controlling
acceleration and deceleration
in a vehicle to the controller
area network, or CAN,
a built-in computer network
in your car that controls every
little thing it does.
So when you press on the
gas in a gas car today,
a signal is sent to a computer in the CAN,
that controller area
network, that tells it
"hey, I want to go faster."
Then some decisions happen,
sort of an algorithm that runs,
basically checking all the other systems
and what's going on in the vehicle,
and if it all checks out, it will
actually then tell the system
to accelerate, but there are all kinds
of safety checks in there
to make sure you don't
do things that you're not
supposed to do and harm the vehicle.
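That kind of gatekeeping logic can be sketched in a few lines of Python. This is a hypothetical illustration, not real ECU firmware; the sensor names and thresholds here are made up:

```python
def handle_throttle_request(pedal_pct, sensors):
    """Hypothetical sketch of the plausibility checks an engine
    control unit might run on a throttle request from the CAN bus."""
    # Reject out-of-range readings (pedal travel is 0-100%).
    if not 0.0 <= pedal_pct <= 100.0:
        return 0.0  # fail safe: no acceleration
    # Cross-check a redundant pedal sensor; a big disagreement
    # suggests a fault, so fall back to idle.
    if abs(pedal_pct - sensors["pedal_sensor_b"]) > 5.0:
        return 0.0
    # Brake override: never accelerate while braking.
    if sensors["brake_pressed"]:
        return 0.0
    # All checks passed: command the requested throttle.
    return pedal_pct
```

Real ECUs run far more checks than this, but the idea is the same: the computer only passes your request through once it matches what the rest of the vehicle is reporting.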
So in general, this is
seen as a good thing,
but fast forward to 2009
and for some reason,
Toyotas started
accelerating out of control
and causing accidents.
Originally Toyota was blaming the drivers,
since it seemed to happen to
older drivers particularly.
That was until 2009, when
California Highway Patrol
officer Mark Saylor and
three of his family members
were killed in a high-speed
crash that was caused by this issue.
In total, this issue was
linked to 31 accidents
and 12 deaths, but the sudden acceleration
information group online
reports that Toyota isn't alone,
and these issues account for
thousands of deaths worldwide.
Now this was a change to
the car's internal system,
not one that people even
considered when buying their car,
and while computers definitely
help make cars safer
with all the automated safety features,
they are not without some major issues.
So a company like Tesla that
is pushing the boundaries
of technology to make driving safer
is bound to encounter some issues.
It's simply part of the process
to making things better,
hopefully with much less of an error rate
than Toyota had in that previous issue
we were just talking about.
But let's see what the data actually show.
The first report that Tesla released
was back in October of 2018,
and it showed one crash-like event,
which we'll call an accident,
for every 3.34 million miles
when autopilot was engaged.
Now without autopilot, for
the same fleet of cars,
they had one accident for
every 1.92 million miles,
so autopilot was close to
two times as safe.
Now the NHTSA, the National
Highway Traffic Safety
Administration in the United
States, for the same time period
reported that on average,
a driver has a crash every 492,000 miles.
So compared to the 3.34
million miles in autopilot,
that's about seven times better.
And in Q1 of 2019, they
reported one accident
for every 2.87 million miles
with autopilot engaged, and one accident
for every 1.76 million miles without
autopilot being engaged.
So the number of miles driven
per crash-like event
is going down, meaning it's
getting slightly worse,
but it's still nearly six times better
than the average reported by the NHTSA.
But realistically, it's
closer to two times safer,
as you can see in the comparison between
driving with autopilot and
driving without it.
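Those ratios are easy to sanity-check from the figures above; the miles-per-accident numbers come from Tesla's safety reports, and the 492,000-mile figure is the NHTSA average just mentioned:

```python
# Miles driven per accident, from Tesla's vehicle safety reports.
reports = {
    "Oct 2018": {"autopilot": 3_340_000, "no_autopilot": 1_920_000},
    "Q1 2019": {"autopilot": 2_870_000, "no_autopilot": 1_760_000},
}
NHTSA_AVG = 492_000  # NHTSA: one crash per 492,000 miles on average

for report, miles in reports.items():
    vs_manual = miles["autopilot"] / miles["no_autopilot"]
    vs_nhtsa = miles["autopilot"] / NHTSA_AVG
    print(f"{report}: {vs_manual:.1f}x vs. manual driving, "
          f"{vs_nhtsa:.1f}x vs. the NHTSA average")
```

That works out to roughly 1.7x and 6.8x for the October 2018 report, and 1.6x and 5.8x for Q1 2019.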
And before you guys jump all over me
and talk about apples and
oranges and this and that
in the comments, let me
just point out that yes,
this data is not representative.
You can't really compare the NHTSA number
to the Tesla number
because the NHTSA number
accounts for all driving scenarios.
Autopilot is really only
used on the freeways,
it's not used on city streets yet,
so that's why I took the
comparison of autopilot
to non-autopilot miles, which
I know is still not perfect,
but it's a much more realistic scenario
than the NHTSA numbers.
So yes, take it with a grain of salt;
this is the data we have,
but it is a positive
sign, and as these numbers
grow in terms of miles driven
and the scenarios
in which Teslas are driving,
they will become a better
apples-to-apples comparison.
So if you've never seen autopilot before,
I'm gonna give you just a
quick overview of what it does.
Right now I'm in my 2018 Tesla Model 3,
long range rear wheel drive.
This has autopilot version 2.5,
which has eight cameras total,
ultrasonic sensors for nearby objects,
and a forward-facing radar.
So on the freeway is where
it's designed to be used.
I double tap the shift
and now I'm in autopilot.
You can see the blue indicator here,
and the speed is set at 72,
which happened to be what I
was going at the time
that I engaged it, and the visualization
of all the vehicles
coming and going as my car
navigates on its own.
So I need to keep my hand on the wheel
in case I need to take over
in the event of an emergency
or whatever, but other than that,
the car's gonna do it all on its own.
It's gonna adjust the
speed based on vehicles
in front of me slowing down,
or speed up to whatever I have it set to,
as well as keep me in the lane.
Like I'm in a turn here,
my hand is on the wheel,
but it's actually doing the turn for me.
So that's the most
basic form of autopilot.
Beyond that, Tesla's added a lot of new
kind of advanced features.
One of them is called
Navigate on Autopilot,
so if I punch in a location
that I want to navigate to,
the car will actually drive me there
from on-ramp all the way until off-ramp,
going around traffic
that's slower than you,
as well as taking those
off-ramps entirely on its own.
Autopilot is pretty advanced.
It has a lot of different
quirks and features.
In general, I think the lane keep
and the speed adjustment are fantastic,
but some of these other features are still
kind of early days, and in the future,
we'll see them
just get better and better
with more data and more cars on the road,
until the point where it just becomes
incredibly obvious that a
human is not the safest option.
Let's hop back to the studio now
and just kind of wrap it up.
So is Tesla autopilot safe?
Yeah, yeah it definitely is,
in the aggregate, right?
When you look at this data,
it's all aggregated up,
it's all rolled up, and
yeah, it's clearly safer.
Now to what degree I think is
a difficult thing to answer.
As I mentioned before, all the challenges
with the data that we currently have,
but overall, the signs are very positive
that this is a good
thing for driver safety.
But it doesn't mean that it's perfect,
and the thing that I
really want to emphasize
is that when you aggregate data like this,
when you take these big numbers
of all these individual
events and you roll them up
and you see oh look,
it's all rosy and good,
you lose sight of some of the details.
People have died driving in autopilot,
and just like Toyota
wanted to blame the driver,
it's kind of natural
for a company to want to
blame the user here, and maybe
there is some fault there.
I'm not saying that there isn't,
but the point is that
there have been injuries,
there have been issues,
as there are with all new technologies.
We shouldn't be forgiving, we
shouldn't just dismiss those,
but overall when you look at it,
I think we have some
really positive signs here,
and if we can work through these issues,
we should be coming out much better off
than if we hadn't pursued this at all.
So what do you think?
Do you have autopilot,
what version do you have?
What car do you have, how
long have you been driving it?
And what scenarios has it
been really good for you,
and what scenarios has it
been not so good for you?
I've talked about my
experiences kind of extensively
here on the channel and on the podcast
Our Ludicrous Future,
so you can hear about
me and my thoughts and
where I am with that,
but I would love to really hear from you.
Leave me a comment down
below and let me know
what your experience has been like.
As I said before, it's
not a perfect system,
but it clearly is showing good signs
that this is gonna help
us in the long run here.
So I think it's worth pursuing because
that's how progress is made.
So that's it for this one,
I hope you enjoyed it,
and don't forget, when you free the data,
your mind will follow.
I'll see you guys back
here in the next one.
(upbeat music)
Hey thanks for watching the video.
I hope you got something out of it.
If you want to dive a little bit deeper,
become a part of the
Teslanomics community,
consider joining us on Patreon.
So what we have set up
are different ways to engage,
such as a Discord group,
which is like a chat
room just for the folks
that support the channel through Patreon.
I'm on there almost daily
engaging in conversation
about how Tesla and others like them
are changing the world
around us for the better.
So if you'd like to learn more,
go ahead and go to patreon.com/teslanomics
and I hope to see you there soon.
