Hi, everyone.
Welcome to the State
of Media Playback.
We'll be talking
today about how to do
media playback the right way.
So things have
changed quite a bit
over time, from the earliest
Android days to now.
But we've come up with
a pretty good solution
on a lot of the
things you actually
do need to do to really get the
best experience out of media
playback.
And so we'll be talking
through a lot of the best
practices around all of that.
A little bit about me-- I've
been doing Android development
for about five years now.
I started doing it
in my spare time
and then moved to doing it
full time, first at Funware
and then at Facebook.
And now I work at Google
as a developer advocate.
So I'm working on both the
advanced Android Udacity
course, which is online
learning to get your learn on,
as well as a new Android
development pattern
series of kind of going into
all of these best practices.
My focus has primarily been on
the framework and the Support
Library.
And I actually
work really closely
with the Media
Framework Team as well
to write a lot of these things
I'm going to be talking about.
Now, media playback can mean
a lot of different things
to a lot of different people.
So today, we're
going to be focusing
on just one part of that--
specifically, audio playback.
Believe me, there's enough
in the land of media
that we could cover
a whole presentation,
a whole conference, about it.
But for this, we'll be focusing
solely on audio playback.
For video, you might
consider looking
at the CastCompanionLibrary.
It's up on GitHub.
It's actually a
really good example
of a lot of the best practices
around video playback,
especially with interacting
with Google Cast devices.
So when we talk about
any app, we really
want to start with the user.
They are the most
important part of the app.
It's really our
job as developers
to give them the best
experience that we can.
And that really starts
with what they're
expecting from your app.
That starts with
the description,
and screenshots, and goes
into the actual functionality
of the app.
And what we're trying to
do is rather than just meet
those expectations,
we're really trying
to exceed those expectations.
The worst thing that
we can do to users
is have them expect that, oh,
this is supposed to do this.
But then all of a sudden, your
app, unlike every other app,
doesn't do it the right way.
And for media apps,
especially for audio playback,
there are a lot
of things that do
come as kind of pre-conditions
and expectations from the user.
And of course for audio apps,
the first user expectation
is really background playback.
Or if you've ever
had a media app--
like you hit the Home button
and it just stops playing, even
for video apps, like
YouTube for example,
now they also support
that background playback.
So for audio apps
particularly, this
is one of those
base requirements.
And when we think about
background playback,
we should really already
be thinking about services.
These are the top-level
Android component specifically
for doing work in
the background that
isn't tied to an activity or a
short-lived broadcast receiver.
So it really is
the natural choice
for handling all
of our media state.
So what actually goes into
media playback at just
the 10,000-foot level?
A lot of this comes down to
this super-simplified version
of the events in media playback.
So for every service
that's running, eventually
it's going to be created.
This may be the
start of playback
or when your app
first gets created.
But we have some
one-time initialization
in the created step.
And then eventually,
when the user actually
hits that Play
button, that's where
we're going to go into
the Playing state.
And Playing state
here really means
any time when we are
actually outputting audio.
So this is slightly different
from the Pause state, which
would be the reverse of any
time that we're ready to play,
or getting ready to play, but
not actually playing any music.
So from a Play/Pause
perspective,
obviously for any
audio app, we're
going to want to be
able to transition
back and forth
between those states.
But at some point
the user is probably
going to stop playback.
This may be something like
swiping away the notification,
or closing your
app, or rebooting
their device, or what have you.
In that case, we're going
to move to the Stopped state.
And this case is really the case
where we're explicitly closed.
And we can generally remove
all of our information
and say, hey, we're
done playing back.
And then we'll transition
to the Destroyed state,
which is where we do that
one-time cleanup of releasing
system resources,
stopping our service,
that type of thing.
There may be some difference
between Stopped and Destroyed,
and we'll get into that.
So it wouldn't be
much of an audio app
if it didn't
actually play audio.
That's kind of a big part of it.
So we'll be assuming that
you're using something
like MediaPlayer, which
is built into Android.
But the same talk actually
works extremely well
if you're using something
like ExoPlayer
or any other third-party
playback system.
All this is kind of agnostic
to that, and really building
on top of that to say,
OK, well, beyond just
let's play some audio, can
we do a little bit more?
But for MediaPlayer,
we can see it
has a pretty easy,
straightforward flow
for our different events states.
You'll create it--
we'll create a new MediaPlayer.
And then, of course,
we'll prepare it and play.
When we're playing, we can pause.
When we're paused, we can stop.
And when we stop-- this
should all be fairly
straightforward from the
media playback perspective.
And, well, we're done.
Media plays, pause, and
we have lots of time
back for getting
ready for lunch.
But maybe we should go into
a little bit more detail.
So we're really trying
to do it the right way.
In fact, just playing
audio in the background
with no controls, no
information, and no idea
of what the user is doing
is probably actually
a net negative for the user.
And we want to get
onto the positive side.
So what can we get to really
bump this up to at least an
acceptable experience rather
than something that is just
blasting music from your
phone with no controls
in the background that
they can control and have
to find your app to kill it?
Probably not the
best place to be.
So the first thing we
want to do is audio focus.
Now it really is one
of those keys of being
a great citizen on Android.
So it's the way of getting
that heads up to other apps
and to the system that you
want to play something,
that you are ready to play.
And at the same
point, it also means
that someone can't necessarily
take your audio focus.
What we don't want is we don't
want multiple apps playing back
on top of one another and
causing issues where you can't
understand either one of them.
Now, this is slightly distinct
from actually playing audio.
It's the intent to play audio.
So we'll want to
continue to have
audio focus any time we're
ready to play, or going to play,
or really taking on that
role of the main media
playback in an app.
So we can look at some code like
this to request audio focus.
We'll use Audio
Manager for this.
And then actually going through
and requesting audio focus--
in this case, requesting
the STREAM_MUSIC stream,
the main playback stream
for audio playback--
and hoping to gain audio focus.
Now one thing to note is
that almost all the time,
you'll be granted audio focus.
But there are a few exceptions
where you'll actually not
be granted audio focus.
One example is maybe when
the user is in a phone call.
And that probably isn't the
time to start blasting music
because they're actually talking
with another human, which
is great.
But it means that our media
playback should probably just
abort and say, you know what?
We didn't get audio focus,
so we shouldn't continue.
But most of the time, you'll
be able to proceed and play
your glorious music.
And then, of course, we
need to abandon audio focus
when we're stopping
playback, when we get
to that final Stopped state.
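Put together as code, the request and abandon might look something like this minimal sketch (it assumes we're inside our playback service, and audioFocusListener is the focus change listener we're about to talk about):

```java
// Grab the system AudioManager from our service context.
AudioManager audioManager =
        (AudioManager) getSystemService(Context.AUDIO_SERVICE);

// Ask for long-running focus on the main music stream.
int result = audioManager.requestAudioFocus(audioFocusListener,
        AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);

if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    // e.g., the user is on a phone call: abort playback.
    return;
}
// Otherwise we're clear to start playing.

// And later, when we reach the Stopped state:
audioManager.abandonAudioFocus(audioFocusListener);
```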
But what is that audio focus
change listener thing I just
put in here for no reason?
Well, it's actually how you
learn from other apps what's
going on in the system.
It's your link to other apps
and to the system saying, well,
someone else is
requesting audio focus.
And how does your app
actually react to that?
It's the callback system.
So you may be getting
an audio focus loss.
Now, this is pretty serious.
This means that the
other app is taking
over permanent control
of audio focus--
at least while it's the
last one to request it.
And your audio focus
isn't coming back.
You're done.
It's not your turn yet.
So in this case, we'll want
to move to the Stopped state.
You're done playing back audio.
They've moved on
to a different app.
Now, one thing to
consider is that maybe you
don't want to immediately
remove notifications, and delete
your service, and everything
immediately upon audio loss.
You could decide to wait
around for 30 seconds just
in case they accidentally
hit the wrong app.
But this is really one of
those user experience things
where you should
test with your users
on what they expect when they
accidentally hit another app.
For most Google apps, when
they do lose audio focus,
you'll see that they immediately
stop playback rather than
waiting around.
Now, there's another one
called loss transient.
Now in this case, it's
not a permanent loss.
It's just a short-term loss.
So this may be a case where
you're using something
like Google Maps, and it's
announcing like, oh, you're
going to be on time.
A temporary loss, but
you can expect it back.
So in this case, loss
transient means you should just
pause your media playback.
Now, there's the other one
you may be very familiar with,
which is the
LOSS_TRANSIENT_CAN_DUCK.
Now, this is the
basic-- like when
a notification sound comes in.
And in this case,
the expectation
is that you're going
to lower the volume.
You don't necessarily
need to pause completely.
But you just lower the volume
so that the other sound
can be heard clearly.
And then you'll regain
audio focus with AUDIOFOCUS_GAIN.
Now, one thing to keep
in mind with the CAN_DUCK
is that you don't have
to lower your volume.
You can actually pause.
For example, if you're
a podcast app playing
spoken word, it's
very important
that the user hears
everything that's going on.
And maybe they don't want
Google Maps talking over them.
You can actually pause
for any of these events
and then resume when
you regain audio focus.
So there aren't necessarily
requirements to do anything.
But in any case, you should be
respecting these audio focus
changes.
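As a sketch, a listener that respects each of these changes might look like this (the playback methods are hypothetical names on our service):

```java
AudioManager.OnAudioFocusChangeListener audioFocusListener =
        new AudioManager.OnAudioFocusChangeListener() {
    @Override
    public void onAudioFocusChange(int focusChange) {
        switch (focusChange) {
            case AudioManager.AUDIOFOCUS_LOSS:
                stopPlayback();   // permanent loss: move to Stopped
                break;
            case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
                pausePlayback();  // short-term loss: just pause
                break;
            case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
                duckVolume();     // lower volume (or pause for spoken word)
                break;
            case AudioManager.AUDIOFOCUS_GAIN:
                resumePlayback(); // focus is back: restore volume or resume
                break;
        }
    }
};
```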
So what is our
updated life cycle?
So we can see here
now, we're requesting
focus when we start playback.
And we're removing
focus, abandoning focus,
when we're actually stopped.
Now, note this isn't tied
to actually outputting audio--
we keep focus even while paused.
And even with just
this, we already
are a lot better
citizen and working well
with the system and
other media apps.
But are we done?
We're done?
No.
All right.
So more to go, and
that's fine because these
are important things that
users are going to expect.
Now, one of the things--
and probably my favorite
named broadcast-- is the
becoming noisy broadcast, which
is exactly how it sounds.
It's actually when you're
listening to something
on headphones or a
Bluetooth headset,
and the Bluetooth headset
runs out of batteries,
or the headphones
get yanked out.
And all of a sudden, it's
blaring to the whole crowd,
to the whole audience, of
wow, that's really what
you're listening to right now.
Probably should have
used becoming noisy
to be able to pause
your playback.
So it's a really
nice way of saying,
OK, well, the user is expecting
that they are not blaring out
their music to everyone.
And we can register a becoming
noisy broadcast very simply
with register receiver, and
then, of course, unregistering.
Now in this case,
because this is
tied to not wanting to blare
music out to the world,
these events are going to be
tied into the actual Playing
and Pause state.
So only when we're
outputting audio
are we going to want to
register for the becoming noisy,
and then unregister when
we actually pause playback.
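A sketch of that wiring, with pausePlayback() as a hypothetical method on our service:

```java
IntentFilter noisyFilter =
        new IntentFilter(AudioManager.ACTION_AUDIO_BECOMING_NOISY);

BroadcastReceiver noisyReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        // Headphones unplugged or Bluetooth died: stop blaring music.
        pausePlayback();
    }
};

// When we start outputting audio:
registerReceiver(noisyReceiver, noisyFilter);
// When we pause:
unregisterReceiver(noisyReceiver);
```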
So slightly different here.
And actually at this point,
we're at a fairly decent state.
We're never going to be playing
audio when the user is not
expecting it.
And we're going to play well
with other apps that are doing
notifications or other things.
But we're not quite to the
best media playback experience.
In fact, we haven't talked
about controls at all.
And it's certainly one of
those very frustrating things
when you're trying to find
the Pause button, or the Play
button, or the Next Track
button because that track isn't
appropriate for everyone
who's in the car,
and you want that control
available as many places as
possible.
You don't want to
dive into an app
and find it in your
recents, or what have you.
So thankfully, Android
offers a lot of ways
to actually get
controls everywhere.
And one of the ones that's
most frustrating in our work
are headphones and
Bluetooth controls--
the times when you're
really listening to music,
and you don't necessarily
want to pull off your phone.
And they have
buttons on them now.
So we should probably
get those working.
And collectively, these
are called media buttons.
In fact, they're just
like any other button
that is hit on the system.
They're key events that
are sent to the system
and onto your apps.
And by default, the
system is actually
going to capture all of
these and then send them out
as a media button
broadcast, which, of course,
your app can then
receive and handle.
And you can build just a
simple broadcast receiver
that extracts the key event
from the intent and then
does work on it.
And we found that basically
all of these receivers
were doing the exact same thing.
They need to look
at a key event,
and then extract from the
intent what the key event was,
and then somehow transfer
that to our service.
So we built it all for you.
We built a media button receiver
in the Support Library, version
23.1 and higher, so
that it kind of handles
a lot of the
boilerplate for you.
How does it work?
You have no code whatsoever.
It's just a few
manifest entries.
You'll add it to your manifest.
You'll note our media
button intent filter here.
And then, instead of writing
all the code yourself,
what it's going to
do is it's going
to look for a service that's
also in your manifest that has
that same media button intent.
And what it'll do is every time
the broadcast receiver receives
a media button intent,
it will forward
that on to your service,
which makes it really nice
because in your service,
which has your media player
and that type of thing, you can
actually act on those events.
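Those manifest entries might look roughly like this (PlaybackService is a placeholder name for your own service):

```xml
<!-- The Support Library's receiver: no receiver code of your own. -->
<receiver android:name="android.support.v4.media.session.MediaButtonReceiver">
    <intent-filter>
        <action android:name="android.intent.action.MEDIA_BUTTON" />
    </intent-filter>
</receiver>

<!-- Your service declares the same intent filter so the receiver
     can find it and forward media button intents to it. -->
<service android:name=".PlaybackService">
    <intent-filter>
        <action android:name="android.intent.action.MEDIA_BUTTON" />
    </intent-filter>
</service>
```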
The big problem we found
was that in so many times
you want the broadcast
receiver to say, oh,
you hit the Play/Pause button,
or you hit the Next Track
button.
Now all of a sudden I need to
somehow get that to my service.
And this really helps
in that common case.
So one problem though,
if you have this code,
it won't actually work.
Thanks, Ian.
But it turns out that this
is an important thing for all
of Android in that there's
what's called a preferred media
button receiver.
In fact, it's probably
not a very good idea
if every media app
on your phone all
received every media button.
First, we'd have many,
many processes starting
at the same time, as
well as, of course,
there's usually
only one app that
has audio focus and
wants to play back--
and that app is the one
that should be preferred.
So it's very similar to
audio focus in that it's
a "last wins" kind of model.
If you're the last one to say,
I want to be the preferred media
button receiver, then you'll
become the preferred button
receiver.
So if you've ever had,
say, Play Music take over
when you hit the
Play/Pause button,
or when you get into
your car, that's
usually because the app you're
expecting isn't actually
handling being the preferred
media button receiver.
So if all apps do
this correctly,
then when you do get in your
car and your Bluetooth auto
connects, and it
starts playback,
it's actually going to start
the app that you actually last
used rather than the one that
was just last registered.
So that's really where
MediaSessionCompat,
another class, comes in.
It's really that
consolidated connection
between your app and the system.
And it's actually doing a lot
more than you might imagine.
So we built the MediaSession
APIs in LOLLIPOP.
And MediaSession brings
that back to every app,
even down to API 4, if
you're that person who's
still supporting API 4.
Please, no API 4s.
Maybe API 8?
Everything cool in media
starts at about API 8,
so we're covering everyone
with MediaSessionCompat.
And it's actually doing just
about everything for you
with just a simple few methods.
But of course, we do
want to create it.
And there's one thing
you want to do especially
is make sure you're
setting flags.
You want to set both media
buttons and the transport
controls flag.
This is what actually allows
some of those connections
to your app.
Now, if you're doing a very
temporary thing, like showing
an advertisement that probably
shouldn't have these actions,
those flags are actually
a really good way
of turning those off temporarily
and then turning them back on.
But for the most part,
this is for media
playback.
You'll want to just always
set those two flags.
And then we have this
concept of callbacks.
And these callbacks are
really the onPlay, onPause,
basically all of the events that
your app wants to respond to.
And what we'll use
our callbacks for is the main
way of interacting
with MediaPlayer.
So everything that's coming into
your service, everything that's
coming into your app, then goes
through one of these callbacks
to actually then trigger
the media player.
The nice part
about this solution
is that it works
really well if you
do want to enable
Cast in your media app
because you can just
switch out your callbacks,
and switch from local
callbacks to remote callbacks,
and not have to actually touch
any of the rest of your code.
Just have onPlay and
onPause do a different thing
based on whatever the
current callback is.
So the one thing we actually
need to really do then
to become the preferred
media button playback
is called setActive.
And we'll setActive to true
basically at the same time
we're requesting
audio focus, and we've
been granted audio focus,
and then setActive to false
when we actually stop.
So this is the
important part that's
actually going to get us to
the preferred media button
receiver.
And you'll note that once
we call these lines,
then magically, all of
our broadcast receivers
start working.
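Pulling the session pieces together, a minimal sketch of the setup (the callback bodies are placeholders for your own playback logic):

```java
// One-time setup, typically in the service's onCreate().
MediaSessionCompat mediaSession =
        new MediaSessionCompat(this, "PlaybackService");
mediaSession.setFlags(
        MediaSessionCompat.FLAG_HANDLES_MEDIA_BUTTONS |
        MediaSessionCompat.FLAG_HANDLES_TRANSPORT_CONTROLS);
mediaSession.setCallback(new MediaSessionCompat.Callback() {
    @Override
    public void onPlay() {
        // Request audio focus, start the MediaPlayer, go to Playing.
    }

    @Override
    public void onPause() {
        // Pause the MediaPlayer, go to Paused.
    }

    @Override
    public void onStop() {
        // Abandon audio focus, go to Stopped.
    }
});

// Alongside requesting (and being granted) audio focus:
mediaSession.setActive(true);
// And when we stop:
mediaSession.setActive(false);
```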
This is also the
exact time where
we can use MediaButtonReceiver's
other handy method,
which is handleIntent.
So handleIntent takes
in your MediaSessionCompat,
extracts the key event,
and then hands it
off to your callbacks,
all without having
to write any more code.
It's just one line in
your onStartCommand.
And all of a sudden
your callbacks
are then receiving
media button events
without you having to
write or decode key events
to get through all that.
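That one line, in context (a sketch, assuming mediaSession is the field holding our MediaSessionCompat):

```java
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    // Extracts any media button KeyEvent from the intent and hands it
    // to our MediaSessionCompat callbacks.
    MediaButtonReceiver.handleIntent(mediaSession, intent);
    return super.onStartCommand(intent, flags, startId);
}
```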
Now, there's one wrinkle
though because how does it
know that when you
hit the Play/Pause
button on your Bluetooth
remote that you want to play
or you want to pause?
We haven't actually told
you what's going on.
We haven't told the
system anything.
So that's what
PlaybackStateCompat is for.
It's actually how you
tell the system what's
currently going on.
And there's actually
two parts to it.
One is setState.
So this is what's
currently going on.
So this is like StatePlaying,
StatePaused, Buffering, as well
as kind of your position.
So if you're 30
seconds into a track,
you'll set the position
accordingly (it's in
milliseconds, so 30,000).
The other part is setActions.
Now, setActions are what
controls we support.
So you'll definitely want
to support, say, Play/Pause
and Stop.
But of course, if you
support SkipToNext or not,
those actions are
actually going to be
really important to set here.
If you don't set
those actions here,
you won't get media
buttons for them,
and you won't get
controls on Android Wear
and on Android Auto as well.
They all rely on those
actions to be set.
So if you want to
support Rewind,
again, one of the
actions you need to add.
So these are actually grouped
together because many times
they could change
at the same time.
If you are buffering,
you probably
don't have a FastForward button.
If you're paused, you probably
don't have a Pause button.
You'll want to switch the
actions at the same time you're
switching your state.
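A sketch of updating both pieces together when we start playing:

```java
PlaybackStateCompat state = new PlaybackStateCompat.Builder()
        // What's happening right now; position is in milliseconds
        // (30,000 = 30 seconds in), and 1.0f is normal playback speed.
        .setState(PlaybackStateCompat.STATE_PLAYING, 30000, 1.0f)
        // Which controls we support in this state.
        .setActions(PlaybackStateCompat.ACTION_PLAY_PAUSE
                | PlaybackStateCompat.ACTION_STOP
                | PlaybackStateCompat.ACTION_SKIP_TO_NEXT)
        .build();
mediaSession.setPlaybackState(state);
```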
What about those cool
lock screen controls?
We added them in
Ice Cream Sandwich.
And you could Play/Pause
without unlocking your phone.
It's probably the coolest thing,
except for the whole combining
phones and tablets
together, but that's not
important for this talk.
And it's actually pretty easy.
It requires a little
bit of metadata.
In fact, a picture would be
required for this information.
Now, this is actually used
for a lot more than just
lock screen controls.
Android Wear, for example,
will take the background image
from your metadata.
So what kinds of metadata?
Well there's actually like 27
different kinds of metadata
you can add.
These are the most
important ones,
the ones you'd expect--
Title, Album, Artist,
AlbumArtist, if that's a
different thing-- as well
as the duration.
It goes really well
with that position.
We added it in playback
state, as well as
the actual images themselves.
You can store them
as bitmaps or provide
URIs to content URIs, which
then the app can read.
Now, really, really
don't store 4,000-by-4,000-pixel
bitmaps in here.
These are sent to other apps.
So you probably
want to set a smaller
image as the bitmap
and then provide a URI for the
full-size image, in case apps
really do want that
level of detail.
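A sketch of setting that metadata (durationMs, smallBitmap, and albumArtUri are placeholders for your own data):

```java
MediaMetadataCompat metadata = new MediaMetadataCompat.Builder()
        .putString(MediaMetadataCompat.METADATA_KEY_TITLE, "Track title")
        .putString(MediaMetadataCompat.METADATA_KEY_ARTIST, "Artist name")
        .putString(MediaMetadataCompat.METADATA_KEY_ALBUM, "Album name")
        .putLong(MediaMetadataCompat.METADATA_KEY_DURATION, durationMs)
        // A reasonably sized bitmap for lock screens and Wear...
        .putBitmap(MediaMetadataCompat.METADATA_KEY_ALBUM_ART, smallBitmap)
        // ...plus a content URI apps can use for the full-size image.
        .putString(MediaMetadataCompat.METADATA_KEY_ALBUM_ART_URI,
                albumArtUri.toString())
        .build();
mediaSession.setMetadata(metadata);
```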
So how does this actually
work with our playback,
with our life cycle?
Well, we'll want to create
setFlag, setCallback
in onCreate, and then
setActive to make
sure our metadata and state are
updated when we start playing.
And then, of course,
when we pause,
we'll want to update our state.
Stopped will setActive
to false similarly,
again, along with
our audio focus.
And then we'll release
when we're all done.
So lock screen controls,
like how old are those?
We actually removed all of them.
So in fact, notifications
are the new hotness
for the lock screen.
And in fact, you probably should
have been using notifications
all along.
They're kind of a big deal.
It doesn't really make sense
to lock your phone just
to get to media controls.
So having notifications actually
turns out to be really useful.
But writing a
custom notification
that does media
controls and have it
work well on every device--
it's actually really hard.
So we built it for you.
We backported the
MediaStyle notification,
which was added in LOLLIPOP,
so that now you can use it
on all platform versions.
But that did come
with a few caveats.
First of all, prior
to API 14, you
couldn't actually have
buttons in your notification.
It was just one click target.
Can you imagine?
It must have been horrible.
But we've moved on.
But thankfully, MediaStyle
will just continue to work.
It will give you the best
effort it can at that level.
At API 14, we actually
can add actions.
So you'll note there it
could be up to three actions
in the collapsed view,
that single line view,
of the notification.
API 16, Jelly Bean, added
expanded notifications
where now we can have
up to five actions.
And on 5.0 and
higher, we'll just
use the framework MediaStyle.
So as we change
things, and things
get even better, or
the styling changes,
you'll always know that you
are in sync with the framework
at all API 21 and higher.
So I don't like
writing boilerplate.
I'm sure many of you
love writing boilerplate.
Boilerplate?
No?
Wow.
You all hate boilerplate, too.
So it's OK.
I wrote it for you because
I want you guys to save
as much time as possible.
And so I built this
helper that actually takes
a MediaSessionCompat and
builds a notification for you.
And it all relies on
MediaMetadata's getDescription.
So getDescription actually looks
at all of those metadata fields
that you've added and extracts
just the most important
information from it.
It turns out that it extracts
the same information,
the same fields, that
I already talked about.
It's like I planned it that way.
But there's actually
display-specific metadata items
if you want to specifically
override this case because this
is actually what Android
Wear is also going to use.
But once we have
that description,
we can actually build
most of our notification
directly from that
description-- getting
the title, the text, and any
subtext that's available,
as well as a large icon
in the actual icon itself.
We can also actually
fill out other things,
like the Click Intent
for going through.
As long as we call
setSessionActivity
on our MediaSessionCompat,
we can then pull it out
for our notification.
Additionally in
LOLLIPOP, there's
a concept of hidden
notifications.
You may not want all
of your notifications
on the lock screen all the time.
So for these notifications
for media controls,
you probably want to
set them to Public.
That way people
can interact even
if they've chosen to hide
their other notifications.
And then, of course, when
the notification is actually
removed, swiped away,
we'll want to stopPlayback.
We don't necessarily
want the user
to continue to hear music when
they specifically swiped away
our notification.
So you'll note we actually
use a different one called
getActionIntent.
And that's actually fairly easy.
It's basically me faking
what you'd receive as a media
button-- building our
own key event intent
and then sending a broadcast.
That's just going to trigger
that same media button
receiver, going to your
service, going to your callback.
So again, you don't need
to write anything more
of all this.
So I actually have a gist
available of all this code,
so I'm sure you're
furiously writing.
There must be someone
furiously writing.
No?
You're just working
on your laptops.
That's fine.
That's fine.
I understand.
So let's actually
build our notification.
You'll still need a small
icon for the status bar.
And there's one
thing that's slightly
different about MediaStyle
notifications in that
the color, rather than just
affecting the small icon,
it's going to fill the whole
background of the image.
So bright orange--
not a good idea.
Generally, you want
to use something
branded for your color.
The primary dark is actually
a really good example
of something to use.
But you could also use
a more neutral color.
By default, it will
default to a gray color.
So it wouldn't be much
of a media notification
if it didn't have any action.
So I'll add a Play/Pause
action, and then
actually use our getActionIntent
to help us build the pending
intent that we need.
And then we'll actually
call creating a media style.
So this media style actually
requires that we then
choose which actions,
zero indexed,
you want to display
in that compact view,
that single line view.
In this case, we'll show
our Play/Pause button
because that's kind of useful.
But you may want to show,
say, a Next Track button--
that's a really good example of
something you may also want
to show in the compact view.
We want to set
the media session.
So this SessionToken is
actually really critical
to get things like
Android Wear working.
So Android Wear
is actually going
to use that SessionToken to
pass callbacks to you on Android
5.0 and higher devices.
So if you forget this line,
you get a great notification
that appears on your wrist.
And you'll hit the
Play/Pause button,
and it won't do anything.
And your users will
be so frustrated.
So thankfully,
it's just one line.
It's super easy to add.
Just make sure you add it.
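Here's a condensed sketch of the whole notification, using the v7 NotificationCompat and the getActionIntent helper from the gist (the icon resources, primaryDarkColor, and the cancel-button calls covered shortly are illustrative):

```java
// Pull the display fields straight from the session's metadata.
MediaDescriptionCompat description =
        mediaSession.getController().getMetadata().getDescription();

NotificationCompat.Builder builder = new NotificationCompat.Builder(this)
        .setSmallIcon(R.drawable.ic_notification)     // status bar icon
        .setColor(primaryDarkColor)                   // fills the background
        .setVisibility(NotificationCompat.VISIBILITY_PUBLIC)
        .setContentTitle(description.getTitle())
        .setContentText(description.getSubtitle())
        .setLargeIcon(description.getIconBitmap())
        // Stop playback when the notification is swiped away.
        .setDeleteIntent(getActionIntent(this, KeyEvent.KEYCODE_MEDIA_STOP))
        .addAction(new NotificationCompat.Action(R.drawable.ic_pause, "Pause",
                getActionIntent(this, KeyEvent.KEYCODE_MEDIA_PLAY_PAUSE)))
        .setStyle(new NotificationCompat.MediaStyle()
                .setShowActionsInCompactView(0)       // index of Play/Pause
                .setMediaSession(mediaSession.getSessionToken())
                // Pre-LOLLIPOP cancel button, no-op on 5.0+:
                .setShowCancelButton(true)
                .setCancelButtonIntent(
                        getActionIntent(this, KeyEvent.KEYCODE_MEDIA_STOP)));
```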
So Lifecycle-wise, here we have
we're showing our notification.
When we start playing,
we'll pause it.
We'll update it when we're
paused, any time we're
changing the state.
And then, of course, when
we're stopped or destroyed,
we'll want to clear
the notification.
Now, one thing you
might consider here
is sometimes it makes sense to
actually keep your notification
around just a little bit
longer before you're actually
stopped in case the
user wants to restart.
Say they reach the
end of their playlist.
So don't consider it
necessarily a hard requirement
to remove your notification.
But in all cases, you should
consider maybe a timeout kind
of a system.
So media playback
is actually one
of the things that's
actually going
to be a really good candidate
for foreground services.
Foreground services raise the
priority of your background
service such that it isn't
killed in only the most extreme
memory conditions.
So this is great because it's
super noticeable when the media
playback app is killed
in the background,
because media playback stops.
So foreground services
have a requirement
that you have a notification.
And thankfully, we just
built a notification.
So it should be fairly easy
to get this actually started
as a foreground service.
Now, there's one caveat, though.
Prior to LOLLIPOP, if you
call stopForeground(false),
you actually can't swipe away
(dismiss) the notification.
And this is actually a really
common case in media playback
apps because you
don't necessarily
need to be a foreground service
when you're not playing audio.
Even in the most
memory-constrained situations,
you probably don't want to
be a foreground service when
you've stopped playback.
So how do we work
around this bug?
We fixed it in LOLLIPOP.
But that doesn't help people
who are not in SDK 21.
Anyone?
No one lucky here.
So MediaStyle kind of
built something for it--
the Cancel button.
A simple X in the corner
that allows users, even prior
to LOLLIPOP, to remove
your notification,
even though they can't
ever swipe it away.
So it's actually extremely
easy to add: call
setShowCancelButton(true),
and then add the intent
to send to stop playback.
Now, the nice part is because
we fixed it in LOLLIPOP,
these calls won't actually do
anything on LOLLIPOP and above.
They're just only for backward
compatibility reasons.
On LOLLIPOP and above,
as soon as you hit Pause,
and you set your state to Pause,
and you stop being a foreground
service, you can swipe away the
notification without a problem.
So we've updated
our notification.
Now we're going to
startForeground, stopForeground
false to keep the notification
around, but not be a foreground
service.
And then stopForeground true
to remove our notification
when we've actually
stopped entirely.
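In code, that foreground dance is just three calls (NOTIFICATION_ID and notification are placeholders for your own values):

```java
// Playing: become a foreground service so we aren't killed,
// which also posts our notification.
startForeground(NOTIFICATION_ID, notification);

// Paused: give up foreground priority but keep the notification around.
stopForeground(false);

// Stopped: give up foreground priority and remove the notification.
stopForeground(true);
```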
And that's our service--
this is about everything.
It's a lot, but we've actually
handled all the things
that our service can handle.
There's just one thing we
still need to do-- build a UI.
That's a big deal for
most media playback apps.
Maybe if you had a
single radio station,
the UI would be really boring,
and you could get away
without one.
So we have to figure out
some way of connecting
our service to our UI.
And ideally, we'd like to
reuse all of those callbacks--
the onPlay, onPause,
onStop-- that we already
have in our service for
hooking up our buttons to it.
So we build one of those.
It's called
MediaControllerCompat.
And it's actually the
way of once you connect,
you can actually get all
of the current metadata.
So you can update your UI to
say what's currently playing.
You can actually get
the playback state.
So you'll know,
hey, are we playing?
Are we paused?
Are we buffering?
So you'll be able to update
the UI based on those actions.
And then there's also a
transport controls callback.
And here, it actually
gives you that one to one.
So it has methods like
play, pause, skipToNext,
that directly correspond with
the callback we registered
in our MediaSessionCompat.
So you can very easily hook
up, then, your Play button
to transport controls
play, and you're done.
There's no additional
communication steps
you need as soon as you have
a MediaControllerCompat.
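A sketch of the UI side, assuming we already have the service's SessionToken (playButton is a placeholder for your own view):

```java
try {
    // sessionToken came from the service.
    final MediaControllerCompat controller =
            new MediaControllerCompat(this, sessionToken);

    // Update the UI from the current metadata and state.
    MediaMetadataCompat metadata = controller.getMetadata();
    PlaybackStateCompat state = controller.getPlaybackState();

    // Hook buttons straight through to the service's callbacks.
    playButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            controller.getTransportControls().play();
        }
    });
} catch (RemoteException e) {
    // The session went away; fall back gracefully.
}
```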
One thing, though--
you actually need
to get a MediaControllerCompat
instance.
And to do that,
we're actually going
to use that same
SessionToken that we
added to our notification.
We just somehow need to get
that from our service to our UI.
So there's a really nice
class that does that, too.
We thought of everything.
It's called MediaBrowserService
and the MediaBrowser.
And when you create a
MediaBrowserService, rather
than just a service,
it actually gives you
mechanisms to connect to
your service from your UI,
and then retrieve the token as
well as get new APIs, which--
you guessed it-- allows you to
browse media on your device.
So this may be able to easily
build out your UI based on,
say, a list of tracks,
or a list of albums,
or any other kind
of things you want.
It is actually required for
Android Auto integration.
If you think about
Android Auto, you
don't have control over the UI
itself because of car safety.
But you do have control over
what audio tracks appear
in your media playback.
And that's all done through
MediaBrowserService.
It also adds a browse
action on Android Wear,
which if you scroll all the way
over from a playback, actually
you'll see a browse button.
You'll actually be able to
select the next track directly
from there.
So it's really simple
to actually use
MediaBrowserService.
Instead of extending service,
you extend MediaBrowserService.
And then there's just
one method to call:
setSessionToken in your onCreate.
This is what ties
in that SessionToken
so that your MediaBrowserService
knows, all right, well,
what is that token I need?
Of course, there are the
new methods onGetRoot,
which is kind of the root
of your whole application,
as well as onLoadChildren,
onLoadItem,
which you could expect
loads a list of tracks
or a single track.
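Sketched with the support library class (MediaBrowserServiceCompat, per the backport discussed in a moment), the skeleton might look like this; the class name, root ID, and empty result are placeholders:

```java
public class PlaybackService extends MediaBrowserServiceCompat {
    private MediaSessionCompat mediaSession;

    @Override
    public void onCreate() {
        super.onCreate();
        mediaSession = new MediaSessionCompat(this, "PlaybackService");
        // Ties the session to the browser so UIs can retrieve the token.
        setSessionToken(mediaSession.getSessionToken());
    }

    @Override
    public BrowserRoot onGetRoot(String clientPackageName, int clientUid,
            Bundle rootHints) {
        // The root of your browse tree (albums, playlists, etc.).
        return new BrowserRoot("root", null);
    }

    @Override
    public void onLoadChildren(String parentId,
            Result<List<MediaBrowserCompat.MediaItem>> result) {
        // Look up the items under parentId and send them back.
        result.sendResult(new ArrayList<MediaBrowserCompat.MediaItem>());
    }
}
```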
And we actually have an example,
the Universal Android Music
Player, UAMP, which goes
through this entire flow
and has a lot more code.
It goes through
basically everything
we've talked about here today.
Now, one downside is
that MediaBrowserService
is API 21 and above.
So we're actually working
on backporting it right now.
We have a version that works.
It's going to be in the next
version of Support Library.
But if you can't wait,
there's a few things
you can do in the
meantime because remember,
all we need is just
a token in our UI.
So we could actually
just use a static method,
called getSessionToken,
which then retrieves a token.
If you have your favorite event
bus application of choice,
there's certainly very many
ways of getting a random item
from one to the other.
Do note that the
SessionToken is parcelable.
So you can send it over
broadcasts or between processes
without a problem.
The other choice is to build
a MediaBrowserService light.
Basically, do your own
binding to the service,
and then actually
just have the getToken
as part of that binder API.
So it's a little bit
more complicated to get
through that, but we have a nice
article about bound services
if you're interested in
going through that approach.
And so that's about everything.
This is the slide.
I could have just given
this to you at the beginning.
But it would have felt a
little anti-climactic.
But this is actually
everything that we
need to get our service to be
doing literally everything it
possibly can to give the best
user experience for our users.
That means playing well
with audio playback,
having controls
everywhere possible,
using notifications, and using
a foreground service to make
sure we're not killed
in the background,
as well as doing Android Wear
and Android Auto integrations
so that your media playback
works perfectly everywhere.
So if you have any
questions for me,
feel free to reach out to
me on Google+ or Twitter.
Happy to answer your questions.
Thanks again.
