DAN GALPIN: If you're
just joining us,
we're live in
Sunnyvale, California
at the 2019 Android Dev Summit
and this is #AskAndroid.
LYLA FUJIWARA: OK, so
joining us right now
are Vinit Modi and
Trevor McGuire,
who are going to be fielding
questions about CameraX.
Vinit is the product
manager for CameraX
and Trevor is part of the
CameraX engineering team.
DAN GALPIN: All right.
So Kalanidhi_M asks,
when will you release the stable
version of the CameraX API?
Everyone wants to know this.
VINIT MODI: So, we're pretty
excited to share that CameraX
will be in beta by the end of this year,
and then with the beta release
the developers will be able
to use CameraX in production.
DAN GALPIN: That's great.
LYLA FUJIWARA: Yeah,
that's a good thing
to point out because sometimes
with beta versus stable,
there's questions about that.
VINIT MODI: Exactly.
Exactly.
LYLA FUJIWARA: Anything
you can point to that
could encourage developers
to use it in production,
or I guess some
evidence of that?
VINIT MODI: So, as you said,
the milestone can be fluid.
LYLA FUJIWARA: Yeah.
VINIT MODI: One of the
reasons we can declare beta
as ready for production is
actually the amount of testing
that we do.
LYLA FUJIWARA: OK.
VINIT MODI: So one
of the things is
we have a device
lab with 52 devices.
They represent 200 million
active devices in the market.
And so, everything from taking
a picture, rotation, et cetera,
is all tested in real time
on those devices.
In addition, we
test across hundreds
more devices for things like
stability, crash testing, all
that sort of stuff.
LYLA FUJIWARA: Awesome.
DAN GALPIN: Yeah, the
test lab is really cool.
If you ever get a chance
to see it, it's amazing.
OK, let's see what else.
You were going to ask
the next question, actually.
LYLA FUJIWARA: Sure, I
will ask the next question.
DAN GALPIN: Sorry.
LYLA FUJIWARA: So, I
think a question that
might be on everybody's
mind is on Camera 1,
should they be
migrating to Camera 2?
Should they be
migrating to CameraX?
What are the things that you
should be thinking about when
you make this decision?
TREVOR MCGUIRE: Well, I think
if you're using Camera 1 today,
then CameraX is going to
be much easier for you
to use because it's not
quite as steep a learning
curve as Camera 2 is.
If you want your app to be
a more fully-featured camera
app, like you want to make use
of a lot of camera controls,
like you expect in, like, a
DSLR or something like that,
Camera 2 is the more
flexible API that's
going to give you that control.
VINIT MODI: Yeah, I
mean, in addition,
one thing is if you have users
that are on API level 20 or below--
I mean, 19 or below, right--
then they should
stay on Camera 1.
For the majority of your users,
you can move to CameraX.
So, CameraX will work
for 90% of devices
that are in market today.
LYLA FUJIWARA: OK.
Awesome.
DAN GALPIN: Excellent.
All right, so let's see--
so a question from
Pranaypatel asks,
does CameraX have,
or will it have,
a built-in way for developers
to easily switch between
the front and back cameras?
TREVOR MCGUIRE: Yeah.
So today, you can actually
do this with use cases.
You configure them with a
specific camera in mind.
So, you say I want to
use the back camera,
and then you bind
it to a lifecycle.
Now, lifecycles
don't always line up
with when you want
to switch cameras.
So you might be in the
same activity and want
to switch from the back camera
to the front camera.
And so what you can
do is you can actually
unbind the use cases
from that lifecycle
and then bind new use
cases with the front camera
as the configured camera.
So that will help you switch
cameras easier than you could
do with Camera 2 or Camera 1.
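The unbind-and-rebind flow described there might look roughly like this. This is a hedged sketch against the CameraX alpha API available around the time of this talk; the class and method names (`PreviewConfig`, `CameraX.LensFacing`, `CameraX.bindToLifecycle`) are from that era and changed in later releases, so treat it as an illustration rather than a definitive snippet.

```java
// Illustrative sketch only: switching lenses by rebinding use cases,
// based on the 2019-era CameraX alpha API. Names may differ in later
// releases of the library.
void switchToFrontCamera(LifecycleOwner owner) {
    // Release the use cases currently bound to the lifecycle.
    CameraX.unbindAll();

    // Rebuild the preview use case, now configured for the front lens.
    PreviewConfig config = new PreviewConfig.Builder()
            .setLensFacing(CameraX.LensFacing.FRONT)
            .build();
    Preview preview = new Preview(config);

    // Binding again ties the new use case to the same lifecycle, so
    // the camera opens and closes with the activity as before.
    CameraX.bindToLifecycle(owner, preview);
}
```

The key point is that the camera choice lives in the use-case configuration, so "switching cameras" is just tearing down one set of use cases and binding another.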
In addition to that, we also
do have the CameraView module.
This is something
that you can include
into your build.gradle file.
And it provides a drop-in
view called CameraView, which
would allow you
to show a preview
on-screen easily and take
pictures and save those
to disk.
And that has an
API where you just
set which direction you want--
which camera direction you
want to be shown in the preview
and for the pictures.
So it's a simple
API, setLensFacing.
LYLA FUJIWARA: Cool.
DAN GALPIN: That's
what I've always
wanted, just a drop-in view
so I can put my camera in.
LYLA FUJIWARA: So yeah,
we've got a lot of questions
from developers.
rahulraj and hossainkhan
both were asking about this.
So basically, the intersection
between CameraX and MLKit,
or other machine learning stuff.
So what is, I guess, the plan
involving MLKit and CameraX?
VINIT MODI: Absolutely.
So we're super excited for this.
In general, a lot of
developers are asking for this.
And what we want is CameraX
to be a seamless stack
where we take care of all
the hard stuff of configuring
the camera.
And then at that point, you can
easily slot in things
like ML Kit--
LYLA FUJIWARA: Plug and play.
VINIT MODI: --plug
and play on top.
That's now coming
more into next year.
But we are building
up towards that,
so I think that will be a
super exciting direction
for developers.
LYLA FUJIWARA: Awesome.
DAN GALPIN: So one
of the questions
that rahulraj asked was, we
are experimenting with CameraX
machine learning and our
TensorFlow Lite model
needs a 3D array of RGB values,
but the CameraX analyzer
returns YUV.
Is there an easy way to
convert that YUV output to an RGB array?
TREVOR MCGUIRE: So it's
not as easy as it could be,
but using the image analysis use
case, you can get YUV images.
And there are ways that you can
convert that to a RGB image.
You can use the YuvImage class
to actually encode that image
and then decode it
with BitmapFactory.
That gets you an RGB bitmap.
Or if you're a
more advanced user,
you can actually use
RenderScript there.
DAN GALPIN: I was going to say--
TREVOR MCGUIRE: There's
a ScriptIntrinsic for YUV to RGB--
DAN GALPIN: I would say
RenderScript would've been
my thought, yeah, on that one.
It'll be fast.
That's the important
thing on that one.
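Whichever Android helper ends up doing the work (YuvImage, RenderScript, or a hand-rolled loop), the per-pixel math underneath is the standard BT.601-style transform. Here is a minimal, plain-Java sketch of that math; the exact coefficients vary slightly between YUV conventions, so this is illustrative rather than a drop-in replacement for the platform helpers.

```java
public class YuvToRgb {
    // Clamp a channel value into the 0..255 byte range.
    static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }

    // Convert one full-range YUV pixel (as in Android's YUV_420_888
    // planes) to a packed 0xRRGGBB int using BT.601 coefficients.
    static int yuvPixelToRgb(int y, int u, int v) {
        int d = u - 128; // chroma offsets are centered on 128
        int e = v - 128;
        int r = clamp((int) Math.round(y + 1.402 * e));
        int g = clamp((int) Math.round(y - 0.344136 * d - 0.714136 * e));
        int b = clamp((int) Math.round(y + 1.772 * d));
        return (r << 16) | (g << 8) | b;
    }
}
```

In a real analyzer you would run this over the Y, U, and V planes of each frame, or lean on the platform helpers discussed above, since per-pixel Java loops are slow for live video.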
LYLA FUJIWARA: OK,
so from Twitter,
tellvic asks basically, can we
use CameraX without Jetpack?
And I'm interpreting
that as, like,
without the other Jetpack
libraries, like, can it be--
DAN GALPIN: Or we're not
migrating to AndroidX,
I'm not sure.
LYLA FUJIWARA: Or without
migrating to AndroidX, true.
VINIT MODI: Yeah,
I mean, I think
AndroidX and Jetpack are, like,
core components for CameraX.
I think one of the
things is
the lifecycle management, right?
So we rely on AndroidX for that.
LYLA FUJIWARA: OK.
VINIT MODI: So, it's really
tightly coupled with it.
So I think our
recommendation to developers
is to really consider migrating
to AndroidX as part of adopting CameraX.
LYLA FUJIWARA: OK, so
you migrate to AndroidX--
do you need to be using
ViewModel and LiveData
or are those
separated from that?
TREVOR MCGUIRE: So, uh--
no.
LYLA FUJIWARA: OK.
TREVOR MCGUIRE: We
do want to expose
some of these new features,
or these new, sort of,
programming paradigms
in CameraX if we can,
but you will need to use
lifecycles right now.
And LiveData is something
that's optionally going
to be available as well, but--
yeah.
LYLA FUJIWARA: Yeah, so
it's, like, very compatible
with this latest
and greatest stuff,
but you can use it without it.
TREVOR MCGUIRE: Yes.
DAN GALPIN: So, magicaction
on the YouTube livestream
asked whether, and how, CameraX is
going to handle multiple camera
streams at once.
VINIT MODI: Yeah,
so I think today,
a majority of the use cases
require just one camera
being active at a time.
Overall on the platform,
we are exploring how to
enable multiple camera streams,
especially for front and back.
In addition, though,
starting with Android P,
we have a new API called
the logical camera API.
That logical camera
API, what it does
is it combines all
the physical cameras
into one easy API for
developers to access
all the different
physical sub-cameras.
CameraX will be able to
work with all the framework
APIs that exist.
DAN GALPIN: I presume
that's supported on Pixel 2,
like, for the two rear
cameras, like [INAUDIBLE].
VINIT MODI: Uh, yeah.
So the Pixel is using
a logical camera API
for both the front and the back.
DAN GALPIN: Oh, right.
LYLA FUJIWARA: Excellent.
Cool.
OK, so we talked
about front and back.
We have Cyprien asking, is the
handling of screen rotation
fixed in CameraX?
Maybe I can phrase that
a little bit better.
Basically, is screen rotation
going to be easy in CameraX?
TREVOR MCGUIRE: Yes.
Yeah.
So, screen rotation
is a very hard problem
because developers have
to worry about what's
the orientation of the
sensor on their phone, what's
the natural orientation
of the device,
and what's the current
display orientation?
So, we have a sample,
CameraXBasic, that they
can take a look at, which
does show how to handle rotation
with CameraX, and we have a
few new things coming up soon,
so keep that in mind.
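To make those three factors concrete: the usual way to combine the sensor's mounting orientation with the current display rotation follows the formula from the Camera2 documentation. Here is a plain-Java sketch of it; this is a simplified illustration of the math CameraX handles for you, not the library's internal code.

```java
public class RotationMath {
    // Combine the camera sensor's mounting orientation with the current
    // display rotation (both in degrees: 0, 90, 180, or 270) to get how
    // much the captured image must be rotated to appear upright.
    // Front-facing cameras are mirrored, so the sign of the display
    // rotation flips, as in the Camera2 documentation's formula.
    static int relativeRotation(int sensorOrientation,
                                int displayRotation,
                                boolean frontFacing) {
        int sign = frontFacing ? 1 : -1;
        return (sensorOrientation + sign * displayRotation + 360) % 360;
    }
}
```

For example, a typical back camera mounted at 90 degrees on a device held in its natural portrait orientation (display rotation 0) needs its output rotated 90 degrees to look upright.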
LYLA FUJIWARA: OK.
I think calling out the sample
is great because we're actually
almost out of time.
Is there anything
else for getting
started with CameraX that
developers should know or go
to?
VINIT MODI: So there's
a CameraX website
on developer.android.com,
and I think
that's the best way to
get started with CameraX.
LYLA FUJIWARA: OK, great.
DAN GALPIN: Excellent.
Well, Trevor, Vinit,
thank you so much.
If you want to learn more
about CameraX, of course,
in addition to the website,
we also have the CameraX codelab.
And again, we'll be back soon.
So, thank you very much.
[MUSIC PLAYING]
