WAYNE PIEKARSKI: Hi everyone.
Welcome to this session
on Android Auto Media
and Messaging Support.
So you would have heard Daniel
[INAUDIBLE] in the keynote today
talking about Android Auto
and all the features it offers.
So today what we're
going to do is
we're going to
talk about how you
can go and take
those same features
and put them into your app.
And it's really easy.
Typically, in most cases, it's
a couple extra lines of code,
and then you're good to go.
So that's the nice thing
about Auto: you don't need
to rewrite your app.
So it takes advantage of your
existing Android knowledge,
but you're just adding small
extensions to your code.
So let's quickly talk about
how Android Auto works.
Remember, Android Auto runs
completely on your phone.
So in this case here you
can see I've got a phone,
and then I've got a Pioneer head
unit which is in a, sort of,
fake car simulator here,
and there's a USB cable
that connects the two together.
Now it's very
important to realize
that the user plugs the
phone into their car,
but everything
runs on the phone.
So the graphics
rendering happens
on the phone, all the code
that you run, everything
is on the phone itself.
The head unit is just a
display that shows pixels on it
and plays audio.
So all the smarts
are in the phone.
Because if you think about it,
Android's been out since 2008,
and that's still less
than 10 years ago.
Now, think how much stuff
has changed in 10 years.
People typically keep
their cars around
for 10, 15, 20 years-- we want
this stuff to have a long road
map, and we can't keep
pushing updates out
and things like that.
So this way the head unit can
have a fixed, simple protocol.
Everything runs in the phone
and as the user upgrades
their phone, adds more
CPU processing and things
like that, Android Auto
can move along with that.
So that's kind of how it works.
So with that in
mind, now we're going
to talk about what it
means for your app.
So the big win for
Android Auto is
that it uses a templated
user interface.
So you can see here
we've got Maps running,
but the main thing we're
going to talk about today
is music and media.
And we're going
to talk about how
your app can fit within those.
Now, the thing with
putting displays in a car
is there's a lot of
legal issues around it.
There's a lot of safety
issues, driving distraction--
you want to make sure the user
is focused on the road and not
on their screens.
And in order to get
an app published
many years ago in
other automakers' cars,
you had to spend
a lot of money--
and we're talking tens
of thousands of dollars.
You had to spend a lot of time.
You had to get them
certified and tested.
It's really expensive.
It can take years, or maybe
never, to get your app out.
And so what we did
with Auto is we
built a set of
templates for music
and messaging that we
can test, and then you're
allowed to plug text into it and
plug images in, and so forth.
But the templates have been
pre-tested and pre-approved,
which means that you can publish
your app on the Play Store,
and literally within hours the
app is now in someone's phone,
ready to use in a car.
So we've taken all of these
long deadlines and huge costs,
and we've basically made them
nothing so that you can get
your apps out really quickly.
Now, there are some restrictions
because they're templates.
They're limited.
You can't play video
in the background,
you can't do things like that.
That would be dangerous.
And so we've given you this
platform that really gives you
a quick path to getting your
apps into people's automobiles.
So remember, the
UI of Android Auto
is based on speech
recognition, and we're
going to show that
in a sec, so that way
you can keep your hands on the
wheel and the eyes on the road.
So with that in mind, what
does this all look like?
Let's talk about
messaging first.
So there are countless
accidents every year
caused by people
being distracted.
They're texting in their cars,
they're doing things like that.
So with Android Auto, we're
taking that and we're making it
speech enabled,
and we're providing
an API that allows you to hook
in and enable the same auto
support in your messaging app.
So Android Auto right now
supports SMS and Hangouts,
and now we're going to show you
how to extend your app, too.
So you can see in
this screenshot here,
when a message comes in you can
see it pops down on the screen,
and then it gets
played back as audio.
So now what we'll do is I'll
show you a quick little demo.
So you can see here I've
got my Pioneer unit,
and on the screen here it's
the generic UI of whatever I'm
doing.
But I can flip between
Maps or whatever I want.
But now, if I use
my little testing
app here to generate a fake
message-- so on the phone
here we have a messaging
sample, and this sample's
available in Android Studio.
So what I'm going to
do is I'm going to send
one fake message to myself.
So it appears.
Click on it.
Here's the message--
"can you give me a call?"
You can press the voice
button and say, "reply."
So then you can press it.
So you can see there that the
text message was generated
on the phone, and then it
appeared on the Auto head unit.
And so everything's
done for you,
you don't have to
worry about it.
It's all based on notifications.
So now I'm going to
show you how to go
about adding that support
to your messaging app that
does notifications.
So the first step is that you
have to change your Android
Manifest file so that it
has a section in there
for Android Auto.
So you can see we've
got a section there
that says we're going to
support com.google.android.gms.car.
We add this
automotive_app_desc meta-data
entry, and then we create
an XML file called
automotive_app_desc that defines
the type of app we have.
And we're doing a
notification-style app
so you got to put that in.
And once you put that in
your app is now enabled,
and the Android Auto app
that runs on the phone
now can see your app.
It goes, OK, I know to deal
with your notifications.
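As a sketch, the manifest hookup described here looks something like this. The meta-data name and descriptor format follow the Android Auto documentation; the rest of the manifest is elided:

```xml
<!-- AndroidManifest.xml: inside <application>, point Android Auto
     at the descriptor file. -->
<meta-data
    android:name="com.google.android.gms.car.application"
    android:resource="@xml/automotive_app_desc" />

<!-- res/xml/automotive_app_desc.xml: declare a notification-style app.
     A media app would declare <uses name="media" /> instead. -->
<automotiveApp>
    <uses name="notification" />
</automotiveApp>
```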
So the next step.
If you're writing a
messaging app right now,
you're already
generating notifications
whenever a message comes in.
So a message comes in, you use
NotificationCompat.Builder,
and you create a notification with
an icon, some text, a title,
and so forth.
So this is the code
that you already have,
and every sample for messaging
and every notification sample
we have has this
basic structure.
The trick to adding Auto
support is you call .extend,
and then you create
a CarExtender object,
and you then add a few
extra method calls on that.
And the key thing here is you
call setUnreadConversation,
and you pass an
unreadConvBuilder in,
which we're going to
do on the next slide.
So that's the first step.
So we've added this line called
extend with the CarExtender,
and then finally, here is
the unreadConvBuilder object.
So you can see here, we call
new UnreadConversation.Builder
and we pass in a
series of fields.
So these are fields that are
specific to Android Auto,
and you're going
to put in things
like the names of the person
who sent the message, the time
stamp for it, and that is
used to show it on the screen
and so it can keep track of
when the message arrived.
And then also we have two
intents that we declare here.
The readPendingIntent is
called by Android Auto
when your message has been
received on the head unit.
So when the person hears
the audio and goes,
oh OK, I've heard
it, that intent
is called to tell your app
hey, the message was read.
And that's useful if
you're in a messaging app
because you can then
change an indicator to say,
yeah, the message
was read correctly.
The next thing is you
set the reply action,
and this intent is called
when we've spoken a text
reply into the head unit-- hi,
I'm running 5 minutes late--
and it then sends the text.
That intent is called,
and it actually
gives you the string of
the text that was spoken.
So your app doesn't need
to think about any speech
recognition, it doesn't need
to focus on speech playback.
All it needs to do is
provide two intents,
and they're called, and
you get some strings
that you can then
pass on to your code
or whatever you're doing.
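Put together, the two snippets look roughly like this. This is a sketch based on the support-library APIs; participantName, timestamp, messageText, msgHeardIntent, msgReplyIntent, and remoteInput are placeholders you would build in your own app:

```java
// Describe the unread conversation for Android Auto.
UnreadConversation.Builder unreadConvBuilder =
    new UnreadConversation.Builder(participantName)
        .setLatestTimestamp(timestamp)
        .addMessage(messageText)
        // Fired once Auto has read the message aloud to the driver.
        .setReadPendingIntent(msgHeardIntent)
        // Fired with the spoken reply; the text arrives via the RemoteInput.
        .setReplyAction(msgReplyIntent, remoteInput);

// Your existing notification code, extended with the CarExtender.
Notification notification = new NotificationCompat.Builder(context)
    .setSmallIcon(R.drawable.ic_message)
    .setContentTitle(participantName)
    .setContentText(messageText)
    .extend(new NotificationCompat.CarExtender()
        .setUnreadConversation(unreadConvBuilder.build()))
    .build();
```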
So it's really simple.
With just these two
little snippets of code
you can add messaging support
to your app pretty quickly.
And so in Android
Studio, there is a sample
called messaging sample
that is the one I just
used on the phone.
You can use that to test out
this kind of functionality,
and it shows you how
to do it, and then
you can see how to retrofit
it into your own app.
So that's it.
So the next important part of
Android Auto is media playback.
And so here we've got the
Google Play Music app,
and you can see here
on the little demo
we can go to the
media selector and we
can pick Google Play Music.
So this is an app that's
running on my phone.
Now, what I'm going
to do is I'm going
to show you how to build
your own music sample,
and there is a sample called
Universal Music Player, which
is also available
in Android Studio.
This is a really nice
sample because it shows you
how to build a
complete media app.
So it shows you how to stream
music from a server somewhere,
and it has a presentation
display for Android Auto,
but it also works on Android TV.
It also does Chromecast,
and it does a whole bunch
of different stuff
and it shows you
how to fit all these
pieces in together.
I actually used it to make my
own media player for my music
that I keep at home.
It's a really great sample
to get started with,
and it's sort of a best
practice of how to make
a really nice music player.
So that's definitely
a great sample to do.
So now what I'll do is I'll just
show you what it looks like.
So you've selected Universal
Music Player from here.
You then go pick from genres.
We'll just pick something
at the top-- "Awakening."
It streams it off the web,
and the music starts playing.
You can pick next
track, next song plays.
So this is really easy,
and if we're in a real car,
we'd have steering
wheel controls,
it would also work on
these buttons, as well.
But the point is I'm not
touching the phone here.
I'm just touching the head
unit in my car, or the steering
wheel controls.
We're not touching
the phone at all,
which makes it really
good for driving,
and it's a really nice
user interface for that.
So that's the music app.
Now we're going to go through
the nuts and bolts of how
one goes about getting your
music app ready for talking
to Android Auto.
So here's how the music app kind
of fits into the whole thing.
So you've got Android Auto,
the display visible in the car,
but everything in
Android Auto is
done by the Android Auto app.
So this app is
provided by Google.
It runs on your phone.
This is what draws the graphics
and has the templates in it
and things like that.
Now, if you look in
the UI of the car,
there is this display
that we just showed you
where we picked the song from.
So that's drawn by
the Android Auto app,
but it needs to
get the information
to show to the user.
So it needs to somehow
talk to your app
to query it and get the
music library out of it.
So what it does is
it talks to your app.
So this is my audio app here.
It talks to a media
browser service
that you define in your app.
And you have a tree structure
of music that you have,
so you can have folders and
you can have music files in it.
And there is an API which I'll
talk about in a sec which
allows you to communicate
this tree structure back
to the Android Auto app.
And once it's queried
your database of music,
it can then present this UI.
And then once that's done,
the user can pick a song,
and then that
information is then
sent back and forth between
Android Auto app and your app
here.
So there's this media session
API which the Android Auto
app can make calls into your app
that says, hey, the user just
picked song number 52.
Start playing it.
And then you start
playing the song.
And then also, when you press
controls on the steering
wheel-- next track,
previous track--
those are all sent via
these callbacks, as well.
So we're going to go
through the details of that.
And so this display here, all
those buttons, the events,
are sent to your app.
And also, your app
has the ability
to send information
back to say things like,
hey, I know this
song's 4 minutes long,
and we're currently
50% of the way through.
You can update the
little progress bar.
You have the ability to
set the title of the song,
and also set the background
image using metadata.
So there's this back and
forth between the Android Auto
app and your app,
but the point is
your app just has to
provide that information, and
Android Auto takes care
of everything else.
So when we introduced
Android Auto,
we also introduced it during
the time of Lollipop, as well.
So in Lollipop there was a
whole new media framework
that was added that was built
for supporting Android Auto,
but also for supporting other
media interfaces, as well.
And so some of the classes were
MediaSession, MediaController,
that's what it's
built up around.
Now, before Lollipop,
everything was
based on RemoteControlClient.
Now, RemoteControlClient is
what presents the media controls
on the lock screen of your phone,
giving you the ability to change
back and forth between songs.
So what you need
to do is you need
to migrate your app
from RemoteControlClient
over to this new
framework, and then you
get Android Auto support,
the lock screen still works,
but you also get other
platforms, as well, for free.
So for example Android
Wear has the ability
to understand media
callbacks, as well.
If you're playing a
song on your phone,
the actual media information
will appear on your watch,
as well, and you can pick
next track, previous track,
browse through the library.
So we're not just
adding this framework
to our code to support Android
Auto, we're adding it for Wear,
and any other platforms that
Google might add in the future.
So it's a generic
media framework
that's really useful for a
whole bunch of different things,
and so this is the
future way going forward.
So how does the media API work?
So let's talk about how the
MediaSession support works.
So what happens is that when
we start to play a song,
the MediaSession communicates
metadata to the head unit.
So it says, hey, we've
got the song title,
we've got the current
playback state-- are
we playing, are we paused, how far
are we through the song, and so forth.
And we then have controls
on the unit itself.
Whenever someone
presses those buttons,
they're sent back to the
media app on this end.
And the Android framework
is all under the hood
here, so it's responsible
for handling all the IPC.
You don't have to deal with
any networking code or anything
like that.
It's all kind of done
magically in the back end,
and it allows us to change
the implementation of things
without you having
to rewrite your code.
So how do we go about
creating a MediaSession?
So in your music
app you're going
to create a new MediaSession
object, you make it active,
and then you have to grab
a session token from this
and then store it.
And we're going to
use that later on.
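In code, that setup is just a few lines. A minimal sketch; the tag string and service name are arbitrary:

```java
// Create the session in your music service, activate it,
// and keep the token for the MediaBrowserService later.
MediaSession mediaSession = new MediaSession(this, "MyMusicService");
mediaSession.setActive(true);
MediaSession.Token sessionToken = mediaSession.getSessionToken();
```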
So the next thing we need to
do is get the metadata over
to the Android Auto app.
So you create a metadata
object with this builder.
You then put some strings
in-- title, artist, duration.
There's a whole bunch
of different tags
that are available
for different things.
So you fill out the
information that you have.
As I said before, you can
also do background images.
So it allows you to make
nice, beautiful artwork
for the background, as well, so
it's not just limited to text.
And then we take that
builder, we call build on it,
and then we pass it to set
metadata, which then puts it
into the session object.
And then pretty
much after that we
don't have to worry about it.
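A minimal sketch of that metadata step; the title, artist, duration, and artwork here are made up:

```java
// Fill in whatever fields you have; Auto shows the title and artist,
// and uses the album art as the background image.
MediaMetadata metadata = new MediaMetadata.Builder()
    .putString(MediaMetadata.METADATA_KEY_TITLE, "Awakening")
    .putString(MediaMetadata.METADATA_KEY_ARTIST, "Some Artist")
    .putLong(MediaMetadata.METADATA_KEY_DURATION, 240000) // milliseconds
    .putBitmap(MediaMetadata.METADATA_KEY_ALBUM_ART, albumArtBitmap)
    .build();
mediaSession.setMetadata(metadata);
```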
The next part is the
playback state-- play, pause,
things like that.
Once again, you use
a builder object.
We're currently playing.
We're setting a speed here.
And then we're also able to
specify what kind of controls
are available right now.
So we're saying OK, we want
to support play and pause,
but I don't want to
support other things.
You can control what
controls appear.
We'll show you another
screenshot in a little bit.
Now once again, build the
state, put it in here.
Now it's in the session.
So the stuff's pretty easy.
The API's very simple,
there's nothing
too complicated about it.
You don't have to write
tons of boilerplate code.
Makes it quite nice.
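That playback-state step, sketched out; the position variable and speed value are illustrative:

```java
// We're playing at normal speed, and we only advertise
// play and pause as the available controls.
PlaybackState state = new PlaybackState.Builder()
    .setState(PlaybackState.STATE_PLAYING, currentPositionMs, 1.0f)
    .setActions(PlaybackState.ACTION_PLAY | PlaybackState.ACTION_PAUSE)
    .build();
mediaSession.setPlaybackState(state);
```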
Now finally, the
controls that come back.
So we created a
MediaSession.Callback,
and inside we fill out methods
for all the different events
that we're interested in--
on play, pause, skip track,
and so forth.
So these methods will be called
when the user presses pause,
play, whatever.
And you can put whatever
code you want in here.
And so also if a new song is
selected, things like that.
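A sketch of that callback wiring; the method bodies are up to your player code:

```java
mediaSession.setCallback(new MediaSession.Callback() {
    @Override public void onPlay() { /* resume playback */ }
    @Override public void onPause() { /* pause playback */ }
    @Override public void onSkipToNext() { /* next track */ }
    @Override public void onPlayFromMediaId(String mediaId, Bundle extras) {
        // The user picked a song (e.g. "song number 52") in the browse UI.
    }
});
```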
So now let's talk about
the browser implementation.
So those are the controls.
We've worked out how to
send messages back and forth
that change tracks,
and so forth,
but now we need to actually
tell the Auto unit what
our music library looks like.
And this is where
it gets interesting,
because with Android Auto, you
can ask it with voice queries.
You can say, hey, play me a
song with jazz music in it.
And it will go through your
tree and find something
that's got jazz in
it as its genre.
And so it has the ability to
search your database of music
in your app.
So that's done with the
MediaBrowserService.
So for the MediaBrowserService,
we provide a class
that you extend, and then
you have to take the
SessionToken that we created
earlier when we set
up the previous code,
and we store that in here.
And that then allows
everything to be connected up.
So the SessionToken is the glue
that brings it all together.
And now the next method
that we need to implement
is called onGetRoot.
So when the Android
Auto app connects
to your app for the first
time, it calls onGetRoot,
and your app needs to decide
what to do with this request.
Now it's an opportunity to
validate if whoever's calling
you is allowed to
talk to your app.
So you might have a
protected music library
that you don't want to share
with every app on the phone,
because with this mechanism,
any app could request, hey,
grab the music library and
start playing it or whatever.
But you might only
want to restrict it
to your app, plus the
Android Auto and Wear apps.
And you have the
ability to do that here.
So in the samples that we
provide we provide some sample
code for a
PackageValidator class
that takes in the package name,
the UID, and the current class,
and it verifies that
the signature matches
what's expected so that
some rogue app isn't
talking to your code.
Now you don't have to put that
in if you don't want to, but
it's recommended that you do.
And it's very important
that you do this correctly,
because if you mess up
the package verification,
it might be that you haven't
secured your app correctly.
So make sure you really
do look at the samples
to get this right.
Finally, the next
step is you return
the root of your media tree.
So your media is
stored in a hierarchy,
you're returning this browser
root node that's the top.
And later on we're going
to return this back,
and then it's going to
come back to our code,
and we're going to have to start
returning the children of it.
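As a sketch, onGetRoot with the validation step looks like this. PackageValidator and its isCallerAllowed method are sample code from the Universal Music Player, not framework APIs, and the root id string is arbitrary:

```java
@Override
public BrowserRoot onGetRoot(String clientPackageName, int clientUid,
        Bundle rootHints) {
    // Refuse unknown callers so a rogue app can't browse the library.
    if (!packageValidator.isCallerAllowed(this, clientPackageName, clientUid)) {
        return null;
    }
    // Return the id of the top node of the media tree.
    return new BrowserRoot("__ROOT__", null);
}
```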
So with that in mind,
once onGetRoot's returned,
the onLoadChildren
method is what
is called when someone's
trying to query our database.
The root is initially
passed in with this ID here,
and then we have to
return a result back.
So you can see here that
we're doing result.sendResult,
and we're passing
in some media data
of the first node of our tree.
You have the ability when
you specify your media
items to specify what's called a
playable or a browsable object.
A browsable object
means that this object
has further children.
A playable object
is a piece of media
that can be played right now.
So it's just a way of
differentiating the nodes
in the tree.
So you do that.
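Sketching the two kinds of items; the media ids and titles here are invented:

```java
// A browsable node: has further children the user can drill into.
MediaBrowser.MediaItem genre = new MediaBrowser.MediaItem(
    new MediaDescription.Builder()
        .setMediaId("genre/jazz").setTitle("Jazz").build(),
    MediaBrowser.MediaItem.FLAG_BROWSABLE);

// A playable node: a piece of media that can be played right now.
MediaBrowser.MediaItem song = new MediaBrowser.MediaItem(
    new MediaDescription.Builder()
        .setMediaId("song/42").setTitle("Awakening").build(),
    MediaBrowser.MediaItem.FLAG_PLAYABLE);
```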
Now, you might have
an app that is getting
its media from the
internet, so it
might be a slow process to go
and fetch this information.
Now, this onLoadChildren
method is designed
to return back very quickly.
If you can't return
a result instantly, what
you should do is detach, load
the children in the background,
and then send that result
later when the load completes.
So it returns asynchronously.
So if media's in local storage,
do the first method here.
If you're on the network,
do the second one
because you don't want to
have a laggy experience.
Because otherwise,
it'll slow down
the experience in the head
unit-- bad for the user.
And especially when
they're driving.
They're going to be super
frustrated if they're
trying to press a button
and it's not working
while they're driving a car.
So super important that
you get this right.
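The two paths might be sketched like this. isLocal, buildChildren, fetchFromNetwork, and FetchCallback are hypothetical helpers standing in for your own data layer:

```java
@Override
public void onLoadChildren(final String parentId,
        final Result<List<MediaBrowser.MediaItem>> result) {
    if (isLocal(parentId)) {
        // Media is in local storage: answer immediately.
        result.sendResult(buildChildren(parentId));
    } else {
        // Network fetch: detach now, send the result when it arrives.
        result.detach();
        fetchFromNetwork(parentId, new FetchCallback() {
            @Override
            public void onLoaded(List<MediaBrowser.MediaItem> children) {
                result.sendResult(children);
            }
        });
    }
}
```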
So here is the final chart that
summarizes the back and forth
that happens with Android Auto.
So we pick the music app.
It then calls onGetRoot in
our media browser service.
The SessionTokens
go back and forth.
The Android Auto app then
calls onLoadChildren to get
the music.
And then once that's done,
someone selects music
from the display
on the head unit,
it goes through to
one of your callbacks.
And then once your
media starts playing,
you then send the metadata
and playback-state changes
back to the head unit.
So that's all the back
and forth that happens.
You fill out those
methods, and then you're
good to go, basically.
So, finally, a few little
quick customizations here.
So, obviously, as
an app developer
you want to be able to
differentiate your app
from other apps on the system.
So you have the ability to
change the background artwork,
so this image here is coming
from the music in this case.
You also have the
ability to change
the colors of everything.
So there's a primary color and a
secondary color you can adjust,
and you can see here that
this app is using orange, which
controls the color of the
progress bars and the icons
and so forth.
You can change that to
whatever you want, as well.
And finally, you
have the ability
to customize the controls.
So in this case up here,
we showed this a little bit
earlier with setActions.
We're saying OK, our app
supports play, pause, and skip
to next, but that's it.
So you can see that
this icon is not there.
Now you might not have
a traditional music app.
So some music apps don't have
the ability to fast forward,
and it might be
that it's a radio
station so there's no concept
of skipping to the next track.
So if this is the
case, as in this example
here, you can pass in a list,
and you can create custom icons
like go back 10 seconds,
go forward 30 seconds,
whatever it is that
you want to do.
You can create these icons
and then you can put them in
and it'll take care of it.
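Custom actions are added on the same PlaybackState builder shown earlier; the action strings, labels, and icon resources here are invented:

```java
PlaybackState state = new PlaybackState.Builder()
    .setState(PlaybackState.STATE_PLAYING, currentPositionMs, 1.0f)
    .setActions(PlaybackState.ACTION_PLAY | PlaybackState.ACTION_PAUSE)
    // Radio-style jumps instead of skip-to-next/previous.
    .addCustomAction("com.example.action.BACK_10",
        "Back 10 seconds", R.drawable.ic_back_10)
    .addCustomAction("com.example.action.FORWARD_30",
        "Forward 30 seconds", R.drawable.ic_forward_30)
    .build();
mediaSession.setPlaybackState(state);
```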
So the apps are reviewed as
part of the submission process,
so don't try to do things
like playing movies
or anything crazy in the icons.
You're not allowed to
play videos to the user,
you're not allowed to play
videos in the background
or anything like that.
If you try to do it, your
app will get rejected
and then you won't be able to
get it out to the Play Store.
So don't do things like that.
So that's kind of a
quick little summary
of how to go about putting
media support into your app.
Now that's basically it.
So we've shown how to take a
media app, a messaging app,
and we have samples
for these things.
So if you look in
Android Studio,
we have the Universal
Music Player, which
if you go to Android
Studio, new, sample,
you can select that
and just hit build,
go straight to your device,
and you're good to go.
We have the messaging service
sample which shows you
how to do the messaging.
And I've got this head unit here
which is like a car simulator,
but obviously you don't
want to have to carry
this around everywhere.
So we have what's called
the desktop head unit, which
is a simulator that
runs on your laptop
that you connect
up to your phone,
and it allows you to
have an Android Auto
experience on your desktop
or your laptop screen,
and it allows you to
test your app completely.
And you can do the speech
and all that kind of stuff.
It's quite nice.
So give that a try if you
want to test your app out.
And then once that's ready, you
submit it to the Play Store,
and then you're ready to go.
So if you want more
help with this,
we have a Udacity class
that I created a while back.
It has a really
nice summary of how
to get started with Android
Auto and some of the things
that we talked
about in this class.
And it actually goes
through step-by-step
of how to take a messaging
sample and add the code to it,
so you can give that a try.
And we also have an Android
Auto Developer's Forum on G+.
So I run this forum--
g.co/androidautodev--
and that's a great place to
go if you've got questions
or if you want to
discuss problems,
and me or other people on the
team are available to answer
your questions.
And then finally, follow
me on Google+, as well.
I give updates
about Android Auto,
but I also do a lot of work on
Brillo and Weave and wearables
and other things like that,
so follow me on Google+ if you
want to know more about
what's going on in this area.
So that's the end
of the talk today.
So thank you very
much, and then we
can have some time
for questions.
Thank you.
Yeah, question.
AUDIENCE: Yeah, so especially
with SUVs and stuff
like that, if you wanted
to make an app so that it's
easy to be able to put
on movies for your kids,
they don't have a [INAUDIBLE]
for anything like that?
WAYNE PIEKARSKI: Yeah, so
I'll repeat the question.
So the question was, if
you've got an SUV with screens
around the car, how can you
play movies for the kids?
And so right now we don't
support anything like that.
That's a whole separate problem.
The current implementation
of what we showed
is for the head unit of the car.
It would be very different, so
it's not currently supported.
Question.
AUDIENCE: So under the cover,
what exactly is that thing?
Is it just a touch screen?
Is it [INAUDIBLE]
audio [INAUDIBLE]?
WAYNE PIEKARSKI: So this
here, Pioneer made this unit.
So this is designed to drop
into pretty much any car.
So most cars have what's called
a DIN slot or whatever
they call it, and so
there's usually a plastic panel
or an existing CD player.
So you pop the dash off,
unscrew the old CD player,
you throw it out.
This thing slides in
there, and it's a box.
This one has a CD player
behind it and stuff like that.
It drops in, you connect
up your existing speakers
and everything, and it's
designed to replace the CD
player in your car.
So I use one of these.
This around here is a fake
car simulator, basically.
So we have a 12-volt power
supply, we have an amp,
we have some
speakers, and we have
everything necessary
to basically provide
a fake automobile.
It probably even has a
parking-- oh, here you go.
This is a parking
brake switch so you
can pretend you're pulling the
parking brake lever in the car.
So we built this as
a way of demonstrating
how to use one of these
units without having
to go outside and sit in a car.
For most developers, you
want to use the desktop head
unit because you can
just sit at your desk
and test your app out.
You don't want to have
to go out to a car
every time you want
to try an app out.
But you can buy one
of these for your car.
OEMs are introducing
cars that are gradually
coming out which have
Android Auto support built
into the car.
But I bought a car, for example,
that had an old school CD
player in it.
I just ripped it out,
put one of these in,
and now my car is Auto-enabled.
So that's a great way of adding
Auto support to an existing
vehicle.
AUDIENCE: What's the dimension
[INAUDIBLE] desktop head unit?
WAYNE PIEKARSKI: Well,
that's limited to your desk.
It's pretty much identical.
There it does the
speech, it does
everything basically the same.
So from a debugging
perspective, if your app
works in a desktop
head unit, it's
probably going to work
fine in one of these, too.
Yeah.
AUDIENCE: So what kind
of access does the app
allow you [INAUDIBLE] on
the car, like at a speed.
Can you read and write,
change the speed or something?
WAYNE PIEKARSKI: All
right, so the question
was how do you get access to the
information about the vehicle?
So right now the
Android Auto app
does not provide lots of
information like that.
However, that doesn't matter
so much because your app is
running on the phone.
You have the ability to fire up
the GPS or any of the sensors
just like you would, because
it's just a phone app.
So if you want to know
the current speed,
you can just ask the GPS or
the location library for that.
AUDIENCE: Actually
what I mean was
can you do diagnostic
or something like that?
WAYNE PIEKARSKI: No, so if you
wanted to do a diagnostic app,
that's a whole separate thing.
You can buy these little
OBD2 scanner plug-ins
that you plug into your car.
It has Bluetooth on it.
You can then pair
your phone up to it.
But that would be an app
that runs on your phone.
It wouldn't fit within the
Android Auto experience
because you can't draw gauges
or dials on this screen.
It's a templated UI, so it's
just media and just messaging.
You're not legally allowed
to put a random movie
or graphic display on here
so we don't allow that.
And that's what allows you
to get your app to the Play
Store in a couple hours.
It's a whole separate--
so in exchange
for using the templated UI,
you get fast access to the Play
Store and to your
customer base for free.
Yeah, question.
AUDIENCE: So on
Chromecast you can now
cast to multi rooms, et cetera.
Could you imagine
this as at some point
this might be kind of like a
head unit and then a rear unit,
or [INAUDIBLE].
Is anything going to happen
because people take a rear unit
and still put it up in
the front of their car
and it would kind of bypass it?
WAYNE PIEKARSKI:
Yeah, I don't know
what-- I can't talk about the
plans for what they're planning
on doing in the future.
AUDIENCE: [INAUDIBLE] on a
templated kind of presentation
and [INAUDIBLE]
graphics [INAUDIBLE].
WAYNE PIEKARSKI: Yeah,
it's currently templated.
So the nice thing,
though, is let's
say you're making a media app.
So now, as long as you
implement the API correctly,
hypothetically in the future,
if someone else builds
a plugin that plugs in to
this API, your app just works.
And that's actually the cool
thing: the Wear people,
for example, they
implemented this API.
It just works.
So if, hypothetically,
the second display
was to come out, which
I'm not promising but,
hypothetically, if it
happened, if it used this API,
you don't have to do anything.
So that's the cool part
is you write the API
and whatever functionality
is available,
it will try to use it.
AUDIENCE: OK, did
I catch you right
that, in theory,
if your app is written right,
when it presents here
on the head unit that there's
a song, it also will show as a
notification on my Wear watch?
WAYNE PIEKARSKI: Yes.
AUDIENCE: At the same time?
WAYNE PIEKARSKI: Yeah.
AUDIENCE: And if I update
there then it literally will--
WAYNE PIEKARSKI: Oh yeah,
they're all kept in sync.
So your watch, the car,
and your phone are in sync.
So if you disconnect
the phone and then go
for a walk out of
a car, your watch
will show the current song and
your phone will still play it.
So you can be
listening to something
before you get into the car.
You plug it in, it's sort of all
kept in sync with each other.
And your watch will show it.
So you can use whatever is
convenient for you at the time,
so it's kind of nice.
Yeah.
AUDIENCE: Why can't
the Android Auto app
work without a head unit?
WAYNE PIEKARSKI: Why
can't the Android Auto
app work without a head unit?
OK, the thing is a lot of it is
to do with safety regulations,
as well.
I take it you want
to, like, Velcro
a Nexus tablet to your
dash or something, right?
AUDIENCE: Well, I've got a mount
that I use my phone as a GPS.
I want this interface without
paying $700 for a [INAUDIBLE].
WAYNE PIEKARSKI:
Yes, a lot of it
is to do with safety
regulations, as well.
And the thing is is
car manufacturers,
they put their DIN port
and all this stuff.
It's all tested and whatever.
And it's just dangerous
to have people sticking
a phone on their steering
wheel, or wherever they might
want to do that kind of stuff.
So yeah, these things
are limited due to a lot
of safety and legal reasons.
Question.
AUDIENCE: You were talking about
the approvals for the media
player apps.
Is there a similar approval
for the notification?
WAYNE PIEKARSKI:
Yeah, so the question
was is there approval for
messaging as well as media?
They all have to be approved.
So what happens is when
you do the automotive_app_desc
XML file, you declare
notifications or media.
Those are the two current
types that you declare.
Any of those apps that want to
go through the Play Store, both
of them have to be approved.
And your app won't work
unless it has been approved
through the Play Store.
So you can't side load them.
You can't hand
out apps to people
and expect them to
work in their cars.
It doesn't work like that.
So the app has to go
through the approval.
So it's done for both.
And in most cases
there's not too much
crazy stuff you can do.
If you stick to the specs,
your app will be approved.
It's if you're
doing crazy things
like trying to keep
the screen alive,
or trying to spam too many
messages or things like that.
That's when you'll
have a problem.
AUDIENCE: Are there
guidelines [INAUDIBLE]?
WAYNE PIEKARSKI: Yeah, so
the Android Auto website--
developers.android.com/auto--
has a whole bunch of guidelines
on what we look for
when you submit
to the Play Store, so make sure
you think about these concepts.
There's a bunch
more information.
So today I covered mainly
the technical aspects,
but there's a lot
more information there
that really talks
about the motivations
and how to go about doing it.
One more question.
Yeah.
AUDIENCE: What about video ads?
WAYNE PIEKARSKI:
What about video ads?
Well, like I said
before, you can't
play any video of any kind
because video requires the user
to look at the screen.
So any video is bad.
All you've got is
the background image.
You can't bring up a
full-screen video so no,
it's definitely not allowed.
OK, we've run out of
time for questions,
and we actually have Android
Auto office hours over
in the code lab room.
So we're going to have Daniel
[INAUDIBLE] over there,
I'm going to be there.
And so please come over
there and have a chat with us
if you have any more questions.
But otherwise thanks
for coming, and I hope
you have a great conference.
Thank you.
