Hi, everybody.
Welcome to CES 2019.
We are going to showcase and
take you through the demos
that Texas Instruments has put
together with our partner, D3
Engineering.
Scott is here as well.
Scott is going to take
you through all the demos.
What we have running are demos
of front camera and analytics
for ADAS that enable up to
Level 3 automation
to drive the vehicle safely.
We are also going
to showcase parking
assist items as well
as surround view.
And we are going to take you
through the radar and camera
fusion items that are running
on TDA, Jacinto, and AWR devices
as well.
Over to you, Scott.
Thanks a lot, Aish.
I'm Scott Reardon
with D3 Engineering.
I'm excited to be here at CES.
Our demo vehicle is
loaded with TI technology.
As Aish said, there's
a handful of demos.
And we'll walk you around the
car and show you these demos.
So we have a handful of sensors
on the front of the car here.
All are fed back to the compute
bed in the back of the car,
but this is a two
megapixel camera from Sony,
one of four cameras like it
around the car that are stitched
together for surround view.
We have a forward radar, or
a mid-range radar, the AWR1843,
a new device from TI.
It does forward radar.
It does clustering
algorithms all on chip.
And then we have
multiple radars that
are stitched together to
create a 360 degree radar
bubble around the car.
It can be used for parking.
It can be used for
autonomous control and such.
So there are two forward
camera demos
inside the windshield here.
One is a 1.2 megapixel.
One is a 1.7 megapixel.
They're running forward
single-camera algorithms,
such as lane departure,
pedestrian detect, vehicle
detect, traffic
sign recognition.
On the side here
under the mirror
is a mirror placement
camera, low latency camera,
shooting back in blind spot
area with a live display
inside the car.
OK, these two sensors here
are part of the surround radar
system.
Again, they feed in and give
a 360 degree bubble of radar
around the car.
On the back of the car is
the last of the surround view
sensors, again, a two megapixel.
And then inside the back are all
the different processor boxes.
These are rugged ECU
boxes, development kits,
available from D3, reference
designs as starting points
for either automotive,
industrial vehicle,
or robotics.
Anything in the
autonomous or ADAS space.
Some are based on TI's
TDA3 processor.
Other ones, such as this,
are based on the TDA2,
and a recent addition
is the TDA2Plus.
They can take anywhere from
four to eight SerDes inputs,
so cameras and radar.
They also have other
inputs, CAN bus.
They're automotive rated parts.
It's 12 volt automotive input.
So it's safe to
put in a vehicle.
They're weather tight.
They have a debug panel.
They're really good
places to get started
either on A-sample
or R&D projects
on automotive, robotics,
or industrial vehicles.
This is an AWR1843 demo.
This is a brand new
millimeter wave radar
part from Texas Instruments in
the 77 to 81 gigahertz range.
It has three transmitters,
four receivers.
And what you're looking at is a
ruggedized development
kit from D3 that allows
you to go on vehicle,
plugs into 12 volts, and
can run out of the box.
Actually, all of the
algorithms run on chip.
So this is a radar device that
has the RF section, the FFT
generator, and a DSP
processor on chip.
This gives you three
dimensional data.
So the plot on the lower right
is actually showing a z-axis,
so how high the cars are,
or maybe radar reflections
off lights or signs
that you would see.
The upper right hand
plot is Doppler.
It tells you the speed
of approaching objects.
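The on-chip processing described here, a 2D FFT that turns raw chirp samples into a range-Doppler map, can be sketched as follows. This is a minimal illustration with synthetic data; the chirp count, sample count, and target bins are made up, not the AWR1843's actual configuration.

```python
import numpy as np

# Synthetic FMCW radar frame: chirps x samples from one receive antenna.
# Parameters are illustrative only.
n_chirps, n_samples = 64, 256

# Simulate a single target: a beat-frequency tone within each chirp
# (range) plus a slow phase rotation across chirps (Doppler).
t = np.arange(n_samples)
c = np.arange(n_chirps)
range_bin, doppler_bin = 40, 10
frame = np.exp(2j * np.pi * (range_bin * t[None, :] / n_samples
                             + doppler_bin * c[:, None] / n_chirps))

# First FFT along samples gives range; second FFT across chirps gives
# Doppler. fftshift centers zero velocity in the middle of the plot.
range_fft = np.fft.fft(frame, axis=1)
range_doppler = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)

# The strongest cell lands at the simulated (Doppler, range) bin.
peak = np.unravel_index(np.argmax(np.abs(range_doppler)),
                        range_doppler.shape)
print(peak)
```

On the real device this cascade runs in hardware, and a further FFT across the receive antennas adds the angle (and hence height) dimension shown in the 3D plot.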
All right, so this is
a forward camera demo.
Running on a rugged ECU from
D3 with ruggedized cameras.
And the algorithm is provided
by one of our algorithm
partners, Hella.
This is a forward camera,
single camera algorithm.
It does, as you can
see, vehicle detection.
It'll do sign recognition.
It'll do lane markings.
It'll do standard forward
camera algorithms.
It all runs on a TDA2
processor and is available
from Hella Aglaia.
This is a forward camera
algorithm from one
of our partners, Stradvision.
Again, this runs
on a D3 rugged ECU
using some of our
ruggedized cameras.
Our algorithm
partner, Stradvision,
again, provides forward
camera algorithms.
They're doing lane detection.
They're doing object
detection of vehicles,
people, classification.
As you can see on the
graphics, the boxes
tell you the size of the
object, distance to object.
You can see the lane
markers come up live.
And this is running
on a TDA 2 processor
in a rugged on vehicle
platform available from D3.
So this is a
multiple radar demo.
We call it occupancy grid.
What you're seeing on
the screen and outside is
we have six radars, all
connected over CAN bus
to a centralized processor.
And we're essentially
mapping out
where everything is
in the parking lot.
So you'll see as
we drive around,
that fronts and backs
of cars get filled in.
You'll see empty parking
spots become obvious.
You'll actually even see
the side of the building
that we're near fill in.
And this is from, again,
six millimeter wave
devices around the vehicle.
AWR1843s, a new part from TI.
They actually do the
processing out on the edge.
This is a part that has a
DSP as well as an FFT engine
as well as the RF engine
all on a single chip.
Then over CAN bus, we
send the solution back
to a centralized processor.
It uses an IMU and a GPS.
And we mux all this
data together in a system
that produces this output.
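The fusion step described here, radar detections arriving over CAN plus an IMU/GPS vehicle pose, is essentially an occupancy grid update. A minimal sketch, assuming detections come in as (x, y) points in the vehicle frame; the grid size, resolution, and evidence weight are illustrative values, not the demo's:

```python
import numpy as np

# Log-odds occupancy grid: 200x200 cells at 0.25 m per cell (50 m x 50 m),
# centered on the world origin. 0 means unknown.
GRID, RES = 200, 0.25
grid = np.zeros((GRID, GRID))

def update(grid, detections_vehicle, pose):
    """Fold one radar scan into the grid using the current vehicle pose."""
    px, py, heading = pose
    cos_h, sin_h = np.cos(heading), np.sin(heading)
    for dx, dy in detections_vehicle:
        # vehicle frame -> world frame
        wx = px + dx * cos_h - dy * sin_h
        wy = py + dx * sin_h + dy * cos_h
        # world frame -> grid indices
        i = int(wx / RES) + GRID // 2
        j = int(wy / RES) + GRID // 2
        if 0 <= i < GRID and 0 <= j < GRID:
            grid[i, j] += 0.9   # evidence the cell is occupied

# Two scans of the same wall while the car drives forward 1 m:
# the detections land on the same world cells, so evidence accumulates.
update(grid, [(10.0, 2.0), (10.0, 2.5)], pose=(0.0, 0.0, 0.0))
update(grid, [(9.0, 2.0), (9.0, 2.5)], pose=(1.0, 0.0, 0.0))
occupied = np.argwhere(grid > 1.0)   # cells confirmed by both scans
print(len(occupied))
```

This is why the fronts and backs of parked cars "fill in" as the vehicle moves: each new scan adds evidence to the same world-frame cells.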
So you're seeing
now surround view.
This is actually two
megapixel surround view,
so we have four two megapixel
cameras around the car,
one in the front, two under
the mirrors, one in the rear.
And they're wide angle 190
degree field of view cameras.
It gives you a 360 degree
bubble around the car,
situational awareness
of the video.
And then what we can do is we
can output that and combine it
together in any way we want.
The main mathematics behind how
this works is photogrammetry.
It allows you to stitch all
the different cameras together
and then move the user around
within the frame of the video.
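Under a flat-ground assumption, that photogrammetry reduces to one homography per camera: a 3x3 matrix mapping ground-plane coordinates to image pixels. A minimal sketch with a made-up calibration matrix, not one from the demo vehicle:

```python
import numpy as np

# Hypothetical ground-plane-to-image homography for one camera.
# A real system would derive this from calibration of each fisheye
# camera after lens undistortion.
H = np.array([[120.0,  0.00, 640.0],
              [  0.0, 120.0, 960.0],
              [  0.0,  0.02,   1.0]])

def ground_to_pixel(H, x, y):
    """Project a ground point (meters) to a pixel via homography H."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# To render the top-down mosaic, walk the output pixels, map each back
# to the ground plane, and sample whichever camera sees that point,
# blending in the overlap regions between adjacent cameras.
print(ground_to_pixel(H, 2.0, 5.0))
```

Moving the virtual viewpoint around the car is then just re-rendering the same stitched ground-plane texture from a different virtual camera.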
The surround view
algorithm is provided
by one of our algorithm
partners, Cogent Embedded.
They make a production
quality surround view.
It also has features on it, such
as pedestrian detect, parking
assist.
And you can see that the
steering lines and other things
are hooked up to the CAN bus.
All that overlay
information is provided
in that graphics library.
The demo you're seeing now is
a CMS or mirror placement demo.
This is new technology
that will either augment
what you see in your mirrors
or replace your mirrors,
so both side mirrors
and rear view mirror.
It has to be low latency.
It has to be safe.
You're seeing an OV10640
camera on the side
of our car running
in a low latency mode
using a TDA processor.
Thanks a lot for looking
at the car today.
These demos are available on
our website from D3 Engineering.
They're reference kits you could
buy today and put on a vehicle.
And lots of good technology
from TI here at CES.
Thank you, Scott.
Thanks for taking us
through all the demos
that are running on the car.
The demos enable totally
scalable safety solutions
on the road using TDA ADAS processors.
So please have fun, leave us
some feedback, and use TDA.
Thank you.
Thanks.
