- So, thanks for the introduction,
my name is Michael Nebeling,
and this is Katy Madier,
and Katy decided not to
go to her own graduation,
instead she decided to come to CHI,
and so therefore we're
gonna co-present this paper,
it's our work on 360proto.
And I'm gonna start it,
and then Katy's gonna
take over in a few minutes.
So, last year at CHI, I worked
on a project called ProtoAR,
I'll show a quick summary here.
The idea was to start prototyping
mobile AR applications on paper,
and then provide a set
of tools to designers
so that they can quickly capture paper
and also Play-Doh as placeholders for 3-D models.
And that way, rapidly generate
mobile AR applications.
At that stage, they were marker-based.
And so, this year we're
gonna present 360proto,
and it is the next tool in this line of work.
I strongly believe that, like Hiroshi Ishii,
who fights the pixel empire,
I fight the Unity empire.
I strongly believe that we need new tools
to support AR/VR prototyping.
And I'm really impressed by the
power and flexibility that we have
with low-fi prototyping, in
particular paper prototyping.
Now, if you teach interaction
design like I do,
you may know Marc Rettig's paper
on Prototyping for Tiny Fingers;
I often refer to it,
and this is the illustration
that was used there
to show how a paper
prototype could be tested.
And so we expand this vision
to 360 paper prototyping,
modernizing it a little bit to support
prototyping with both VR and AR users
in a similar fashion.
So the main contribution of this paper,
the way I see it,
is the system that we produced here,
which consists of a suite of tools,
that's how I think of 360proto,
for generating interactive
AR/VR prototypes from paper.
So both the interactive part and the fact
that we generate something
from paper are key here.
And so the paper actually
contributes a study
on the use of emerging paper templates in
the practitioner community,
specifically for AR/VR.
We developed three novel tools in 360proto,
which we're gonna demonstrate here as well.
And we show support for
Wizard of Oz prototyping,
where we simulate, rather
than implement, everything
to produce an AR/VR interface.
And so I'd like to hand
over to my colleague Katy.
- So initially we
gathered design templates
from practitioners from places like
Facebook and Medium, and
here, as shown in this slide,
we found a variety of templates
for designing environments, controllers,
and faces for AR/VR experiences.
And initially we conducted
a set of design jams
with 36 master's students
to gather requirements
and better understand how
users prototype for AR and VR.
The students completed
specific design challenges
by recreating three interactions from
existing and imagined applications
using plain paper and an
equirectangular environment template.
During the first design
jams we also tested
our initial prototype of the tool.
And so, with these design jams
we uncovered three
requirements for the tool.
The first is support
for layered prototypes.
So we found that designers needed to
prototype all the parts of the AR/VR app,
from the AR marker overlays to
the VR controller menus and HUDs,
so our tool needed to be able to
support multiple design layers.
And the second requirement we found
was integration with digital tools.
So AR/VR prototypes
require a big jump from
traditional paper prototyping,
which is just 2-D.
Students found it was difficult to imagine
what a design may look
like once it was in AR or VR,
so we found that our
tool should incorporate
realistic previews of the
designs on an AR or VR device.
And our third requirement is
for Wizard of Oz capabilities.
So in the design jams,
designers mentioned that
it was great to show the
interactivity on paper,
and so we wanna be able
to provide that dynamic
real-time preview of
interaction within the tool.
And so we derived our
inspiration for this tool
from 360 photos in their
equirectangular format.
So we gathered information
and created a 360 paper template
based on a VR grid created
by design practitioners.
And as shown in this image,
the equirectangular grid
can be morphed into a sphere
to create the VR environment.
And to explain our 360 paper template,
imagine that you can wrap this grid
around your head to form a circle.
And so you can see that the
very center of the grid is where
you would put content that
is directly in front of you.
And at the top you would put content
that is directly above you,
and at the edges you would
show content that's behind you.
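To make that mapping concrete, here is a minimal sketch, assuming an A-Frame page (the web framework 360proto builds on) and a hypothetical photo of a filled-in template; A-Frame's a-sky primitive wraps an equirectangular image around the viewer in exactly this way.

```typescript
// Minimal sketch, assuming the A-Frame library is loaded on the page
// and 'captured-template.jpg' (a hypothetical file name) is a photo
// of a filled-in equirectangular paper template.
const scene = document.createElement('a-scene');

// <a-sky> maps an equirectangular image onto the inside of a large
// sphere: the center of the template ends up directly in front of
// the viewer, the top directly above, and the left/right edges behind.
const sky = document.createElement('a-sky');
sky.setAttribute('src', 'captured-template.jpg');
scene.appendChild(sky);

document.body.appendChild(scene);
```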
And based on our initial
feedback from design jams,
we added a field of view
and a range of motion
based on the work of Mike Alger
and other design practitioners.
These additions were made to help designers
draw to scale and understand
object movement that happens
in front of the viewer.
So to test our requirements,
we created this example
of a mountain view with butterflies.
This example is composed
of three spherical layers,
a background, a mid-ground,
and a foreground,
and each is hand-drawn
on a 360 paper template.
These layers are then placed at
different distances from the user
to simulate depth within the scene.
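As a rough sketch of how such a layered scene could be assembled (the asset names and radii below are made up, and A-Frame is assumed), each hand-drawn template becomes an inward-facing sphere, with smaller radii sitting closer to the viewer:

```typescript
// Sketch only: three 360 layers rendered as nested, inward-facing
// spheres. Assumes A-Frame is loaded and that the paper background of
// the closer layers has been keyed out so deeper layers show through.
const layerScene = document.querySelector('a-scene');
if (!layerScene) throw new Error('no <a-scene> on this page');

const layers = [
  { src: 'background.png', radius: 100 }, // mountains
  { src: 'midground.png',  radius: 50 },  // trees
  { src: 'foreground.png', radius: 20 },  // butterflies
];

for (const layer of layers) {
  const sphere = document.createElement('a-sphere');
  sphere.setAttribute('radius', String(layer.radius));
  // Draw the texture on the inside of the sphere and keep the
  // keyed-out regions transparent.
  sphere.setAttribute(
    'material',
    `src: url(${layer.src}); side: back; transparent: true`
  );
  layerScene.appendChild(sphere);
}
```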
And now Michael will tell you more
about the 360proto Studio.
- Thanks Katy.
We have 360proto Studio.
I was hoping to give a live demo,
but we rely on wifi and this room
is particularly bad for wifi,
so what I'll do instead
is I have lots of videos,
and I hope to convey how this works.
So the way this tool works, if it works,
the way this presentation works,
hang on, I didn't get any of this.
All right, so, what I show here
with these labels is actually
a fairly complex tool.
What I want to illustrate with this,
if you look at it: we actually simulate
what this would look
like on a mobile phone,
and that mobile phone could
be in a Cardboard
or in a Daydream, and
you see that view here.
And what I'll also illustrate here,
on the right where it says 360 layers,
is how we composed this environment
from the different 360 layers,
from background to mid-ground
to foreground.
And we have various features in the tool,
including chroma keying
features to remove backgrounds,
like the paper background
and other things that you want to remove,
and we have a live feed,
which means we can actually stream back
to an AR or VR user,
which is a capability that
we use for Wizard of Oz.
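As a rough illustration of what a chroma keying step like that does (this is a sketch, not the actual 360proto implementation), the following makes near-white paper pixels in a captured frame transparent:

```typescript
// Minimal sketch: make near-white "blank paper" pixels in a captured
// frame transparent so that only the ink survives and deeper 360
// layers show through. The threshold is a guess, not a value from
// the paper.
function keyOutPaperBackground(canvas: HTMLCanvasElement, threshold = 230): void {
  const ctx = canvas.getContext('2d');
  if (!ctx) return;
  const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const px = frame.data;
  for (let i = 0; i < px.length; i += 4) {
    const r = px[i], g = px[i + 1], b = px[i + 2];
    // Bright values in all three channels are treated as blank paper.
    if (r > threshold && g > threshold && b > threshold) {
      px[i + 3] = 0; // alpha = 0, i.e. fully transparent
    }
  }
  ctx.putImageData(frame, 0, 0);
}
```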
So, I want you to understand a little bit
the different components of this tool.
So one part of 360proto
actually runs on a mobile phone.
We developed our own
camera app, if you will,
that has a variety of features,
it's also multi-user, and
it can actually be used
in a classroom environment,
and I've conducted
initial usage of the tool
in my own AR/VR course.
On this app, you can actually capture
the paper template that
Katy was talking about,
and so this would be in
a three-step approach.
The camera itself, through
the 360proto camera tool,
has the initial support for capturing the
environment from paper and
providing a preview in Cardboard.
So the way this looks, for example,
if you have a forest scene you composed,
not sketched, but composed using
paper printouts and
cutouts, you would take,
I'm sorry, this is a bit weird.
Okay, you would take a
picture and you would
immediately see a preview,
and I'm gonna show you a video where
this actually then happens live.
Okay.
So what just happened
is they took a picture,
and so this video wasn't edited,
it is an immediate preview on a smartphone,
in stereoscopic rendering on Cardboard.
Sorry about the technology hiccups here.
All right, so the next step
is actually putting this tool
in the larger workflow.
So this was just what's
happening on a mobile phone,
which allows rapid previewing
in VR or in 360 with a handheld,
so just like using the magic mirror mode,
but then we have a more
sophisticated part,
which is the studio, and the studio is
usually run on a laptop.
So that's the second step,
where you can compose
multiple layers that
you may have captured.
And I wanna show you one feature
that I think is very powerful:
there is a third component to this project,
which is the final AR or VR application
that would support things
like a moving butterfly.
So I wanna show you
one of the examples that we
created that I'm particularly proud of.
It is possible with this tool to create
a live preview of a butterfly
essentially flying in front of the user.
And we do this in AR, and
this is really something
I would've liked to demonstrate live here.
It is possible to spawn
a 360 sphere around me,
and project a butterfly
right in front of me,
if I walk around with my mobile phone.
You'll see a video of this in a second.
The way we do this is actually
on top of the paper prototyping template,
and we use a marker to
move the digital butterfly,
and what I'll show you in the
second part of this video is when
this butterfly is then
no longer live animated,
but actually kind of in the room
in AR and spatially anchored.
So the first part is
here, our live streaming,
including capturing, and
it's really difficult,
you know, because the other user
would receive this live stream,
so it's really difficult to illustrate
on stage how this would feel.
So we capture, and now we're
moving over to the end user,
who sees the butterfly basically
appearing live in the room.
All right.
So if you're interested in
this, this is web-based,
and so it works in a variety of setups.
We have the camera
tool, we have the studio
that actually supports
all the different layers,
and we have the app that is running on,
again, a smartphone,
and then we stream (mumbles) media streams
in various directions to actually
capture from paper and then live stream
and compose AR/VR interfaces.
One feature that I'm
particularly proud of is that
current mobile phones still overheat
when you run this kind of AR for 20 minutes,
and they actually go down
in terms of performance,
so we had to pay a lot of
attention to where we actually
compose the AR experience.
So we capture the live video in the app,
we stream it to the studio,
the studio does the AR composition
based on information
from the mobile phone,
and streams it back to the mobile phone,
and that seemed to be the most
performant way to do this.
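A sketch of that round trip from the phone's side, assuming the streams are carried over WebRTC (the talk does not spell out the transport), with the signaling helpers below left hypothetical:

```typescript
// Hypothetical signaling helpers; the actual transport and signaling
// used by 360proto are not spelled out in the talk.
declare function sendOffer(offer: RTCSessionDescriptionInit): void;
declare function onAnswer(cb: (answer: RTCSessionDescriptionInit) => void): void;

// Phone side of the round trip: send the live camera feed to the
// studio and display whatever composed view the studio streams back,
// keeping the heavy AR compositing off the easily overheated phone.
async function connectPhoneToStudio(view: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection();

  // Outgoing: the phone's live camera capture.
  const camera = await navigator.mediaDevices.getUserMedia({ video: true });
  camera.getTracks().forEach(track => pc.addTrack(track, camera));

  // Incoming: the AR composition the studio sends back.
  pc.ontrack = event => {
    view.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendOffer(offer);
  onAnswer(answer => pc.setRemoteDescription(answer));
}
```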
So I'll show you one of the examples
that Katy has worked on,
Katy is a designer, and I wanted Katy
to create the Star Trek for me,
the Next Generation Star Trek,
and I wanna illustrate a
little bit how we do this.
So what I'm doing here is
we just landed in my house.
We are Star Trek, I am Captain Picard,
and I am going into my, well
we just landed in my house,
so now we're looking at
my kitchen/dining room.
And then I fast forward, we're
walking into space right now,
and I'm showing you what this
Star Trek looks like from the outside.
And you also see our screen,
which looks into my living room.
So, what I just illustrated
here is actually a
very flexible use of the tool.
We are constructing this experience,
if you're familiar with Star Trek,
so that the captain's chair in the middle is
essentially a smaller 360 sphere,
there's another layer around it,
which is essentially the bridge,
and then we have the observation deck,
which is a room that is
just next to the bridge.
And we built this by intersecting
the spheres on the right.
And using ARCore, it is then possible to
essentially walk inside
your paper prototype.
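A hypothetical sketch of that construction, in the same A-Frame style as before (radii, positions, and file names are illustrative only): the captain's chair, the bridge, and the observation deck are just inward-facing spheres, with the observation deck offset to the side so it sits next to the bridge.

```typescript
// Sketch only, assuming A-Frame is loaded; radii, positions, and file
// names are illustrative, not values from the paper.
const bridgeScene = document.querySelector('a-scene');
if (!bridgeScene) throw new Error('no <a-scene> on this page');

const rooms = [
  { src: 'captains-chair.png',   radius: 2, position: '0 1.6 0'  },
  { src: 'bridge.png',           radius: 8, position: '0 1.6 0'  },
  { src: 'observation-deck.png', radius: 6, position: '12 1.6 0' }, // adjacent room
];

for (const room of rooms) {
  const sphere = document.createElement('a-sphere');
  sphere.setAttribute('radius', String(room.radius));
  sphere.setAttribute('position', room.position);
  sphere.setAttribute(
    'material',
    `src: url(${room.src}); side: back; transparent: true`
  );
  bridgeScene.appendChild(sphere);
}
```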
So we explored the use of 360proto.
We ourselves tried to create
a variety of experiences.
We asked students to recreate
relatively complex AR/VR projects
they had previously created
with A-Frame and Unity.
We asked students in a design jam
to create multi-layer 360 prototypes.
We really wanted to explore
how complex it can get.
And we also explored live streaming,
and Wizard of Oz capabilities.
So, we have a variety of results,
and basically here are some of the things
that you can do with this template.
In a three-hour class we actually
ran with my AR/VR course,
we had students work individually,
and we just show you a sample here.
Using the template they could quickly
prototype a VR YouTube experience,
including the menu; there
are specific templates
for VR controllers, as
you can see on the right.
And then an AR news
interface with news content
appearing in front of the user.
Or also, if you are more
interested in face-based AR,
this is also supported by our tool.
We have a variety of features for that.
So I'm gonna show you two more examples
that I think are interesting,
and then we're gonna wrap up.
So if you know the Amazon
AR live shopping view,
which I'm sure Amazon
spent some resources prototyping,
we can recreate that.
We can also recreate a highly
interactive racquetball game.
This is one of Katy's and
my own favorite examples.
In this case, Katy was
the racquet player in VR,
and I was a tennis ball on a toothpick
above the paper template.
What did we learn from trying this
together with students
and our own experiences?
So, the overall feedback that we received,
apart from the fact, I guess,
one of the key things is that
you don't have to implement anything.
You do not have to be a programmer.
You do not have to know any APIs,
whereas, well, the state of the art is
you just have to know Unity,
or you have to know native ARCore or ARKit,
or the HoloLens Mixed Reality Toolkit.
So, our students, who are
increasingly interested in UX,
find the tools useful for
design space exploration,
so this is a powerful tool to quickly
eliminate AR as an option.
You do not have to
implement all the things to
realize that AR is not the right approach.
You can just prototype this very quickly.
The templates guide design,
but sketching is actually still difficult;
Katy spent several weeks really exploring
and optimizing the template
so that we could create
a Star Trek, for example.
For our students in particular,
but I think also for Katy when she (mumbles),
it was difficult to imagine
the final user experience,
so the live streaming
support is very important,
so that you can actually try it out;
essentially we need all three tools
in 360proto to really
prototype an AR/VR experience.
The tools did facilitate creation of
realistic AR/VR interfaces.
The fact that they are low-fi and paper
was actually never really mentioned
as a negative.
It was sufficient to experience it, at least,
and we think this is
actually the way to go.
This is the low-fi prototyping
version of AR and VR.
And the Wizard of Oz actually enabled
a lot of these interactive experiences,
and our students actually stated that they
consider the AR/VR prototypes created
with 360proto as medium fidelity,
because obviously high fidelity would
then probably be that
thing in Unity again.
So again, we're not
trying to replace Unity,
we're just trying to fill a gap
in the AR/VR workflow so that we can
have a nice prototyping workflow
starting with low-fidelity prototyping.
And that is actually the paper,
360proto, thanks for your attention.
(audience applause)
- [Moderator] Thank you.
Any questions?
- [Audience Member]
Michael, you had reasons for
going with low-fi in almost everything,
you know, with phones and everything
it's a very inexpensive operation.
Any plans for working with higher-end
equipment or something like that,
rather than doing it
with the web-based model?
- So, we chose A-Frame
for a variety of reasons,
that way I can actually run it in a class,
and it works relatively okay.
Your question actually
targets multiple aspects.
My attempt is not to cre--
I don't think I wanna go any more
high-fidelity than this.
I think this is fine.
The next step is actually
going into Unity,
or actually just stopping the project,
which means you just saved
thousands of dollars on asking
a developer to create a prototype for you,
only to realize it's not the right thing.
And I think, the way I
think about all these
different prototyping tools,
I put A-Frame here, put Unity here,
and I put Unreal, I'm not tall enough,
but higher than that.
And Cinema 4D, I don't
know where to put it.
The other thing is that
this is actually a tool
in this emerging space of lots
of digital prototyping tools;
we just really target
the low-fi prototyping gap,
and I think that's just where I wanna be.
But there are transitions, actually,
and hopefully upcoming work in the future
on these prototyping tools,
and one of the limitations is that
the prototype dies with the tool,
because we do not actually generate code.
We do not actually produce anything,
though we could export assets to A-Frame.
But that's something that
I'm increasingly exploring,
this bridge into tools
like A-Frame or Unity,
but not replacing them.
- [Audience Member] Hello, (mumbles) here.
From National Taiwan University.
It's really interesting
work and I really like it.
I especially like the final part,
the final video where you used a toothpick
to animate the ball in the virtual world.
But what I'm particularly
wondering is, is it possible
for the facilitator, the operators,
to actually see the VR
scene, or the AR scene?
How does the facilitator actually know
what the user is looking at?
- So that's actually a very good question.
So we have the support
to stream these streams
and replicate those streams,
it's a multi-streaming interface.
In our experiments, most of the time
we oriented ourselves using
the studio that I showed you,
which actually has the live preview,
though it would be hard
to find that slide right now.
The tool that's running on the laptop has
a live preview of what's
happening for the phone user,
and that is the orientation
point for the wizard.
So the wizard is just using that,
however, we could replicate that stream
and actually build other interesting tools
by using this multi-streaming
solution as well.
- [Audience Member] So congratulations,
this looks really
interesting and fun to use.
I wonder, you already
mentioned there must be
a difficulty in drawing
on the template and
predicting what kind of
distortion you need to draw
on the paper,
and whether you have any thoughts
on how to make this easier.
- Yes, so in the process
of drawing on this,
if you understand anything
about perspective,
it might be good to forget about that,
because it's really difficult to
conventionally provide perspective in
something that curves
so much at the poles.
The way that I found to be easiest
is to have that preview
of what it will look like
once it's been wrapped in a sphere,
so using the tool while drawing,
taking quick snapshots of the experience
just to make sure that you're
thinking in the right
way about perspective
was the way that I found
to help most in practice.
- [Moderator] Great, thank you everyone.
Thank you, Michael and Katy.
And thanks to all the speakers,
they will be here for any
questions that you might have.
