oh my god, what, they're dancing and
crawling, oh my god this is so crazy. hey
everybody, it's Norm from Tested, and
welcome back to Projections, where we're
still covering developments in the VR
and AR community even though we're
working from home and sheltering in
place and this week I want to talk to
you about a new experience being released
today that's an experiment in hand
tracking yes one of the big
announcements at Oculus Connect last year was that the Oculus Quest would support built-in hand tracking, controller-free recognition of your hands and fingers just using the
built-in cameras on the front of the
headset using computer vision
essentially and while we got to try a
few demos at oculus connect as well as
see some early work from developers it
really left us with a lot more questions
than answers about the potential and
usability of this type of hand tracking
system as well as hand tracking as a VR
interaction mechanism as a whole and
since then Oculus has, to their credit, put out hand tracking on the Quest as an experimental beta feature, so you can
turn that on right now it will detect
when you're not holding controllers in
your hands and you can use it to not
only see a representation of your hands
and see what the latency is like, see how the occlusion works (my hands fade away), but also what Oculus is thinking of in terms of interacting with at least menu UI elements. And right now that's one of the big things: we have no haptic feedback, you can't intuitively grab things and point at all sorts of buttons, so what they do is cast a ray, a laser beam essentially, from an arm that's not represented but is calculated, and then you can essentially laser-point at objects and pinch to activate them. There's a high degree of confidence for that kind of discrete action.
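To make that ray-and-pinch pattern concrete, here is a rough sketch of what the selection loop tends to look like in code. It's a generic illustration: the pointer pose, pinch strength, and UI hit-test helpers are hypothetical stand-ins, not the actual Oculus API.

```python
# Minimal sketch of ray-plus-pinch selection, the pattern described above.
# The tracking inputs (pointer_pose, index_pinch_strength) and the ui object
# are hypothetical stand-ins for whatever the hand tracking SDK provides.

from dataclasses import dataclass

@dataclass
class Ray:
    origin: tuple      # (x, y, z) where the pointer "laser" starts
    direction: tuple   # unit vector it travels along

PINCH_DOWN = 0.7   # hysteresis thresholds so the "click" doesn't flicker
PINCH_UP = 0.3

class PinchPointer:
    def __init__(self):
        self.pinching = False

    def update(self, pointer_pose: Ray, index_pinch_strength: float, ui):
        """Cast the pointer ray at UI elements; treat a firm pinch as a click."""
        target = ui.raycast(pointer_pose)   # hypothetical UI hit test
        if target:
            target.highlight()
        # Rising edge of the pinch acts like a mouse-button press.
        if not self.pinching and index_pinch_strength > PINCH_DOWN:
            self.pinching = True
            if target:
                target.activate()
        # Falling edge, with a lower threshold, releases the "button".
        elif self.pinching and index_pinch_strength < PINCH_UP:
            self.pinching = False
```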
But there's so much more you can potentially do with hand tracking, and while the internal team at Oculus is figuring that out and experimenting, it's also going to be on third-party developers, as well as users like me and you, to kind of give feedback and figure out this new frontier of hand tracking and interaction. And one of those developers is Dennys Kuhnert over at the development team Holonautic; they're the VR studio that created the game Holoception, out on Steam right now, which is a
really interesting experiment in
disembodiment in VR it's an action and
physics game where you can be
first-person in a character but also
zoom out and then control that avatar
almost puppeting it from a third-person perspective, a really kind of mind-bending way of doing
a third-person action game in VR. And Holonautic is putting out today, on SideQuest, their Hand Physics Lab. You may have seen this shared on social media by Dennys, but it's a bunch of experiments in hand tracking and interaction, whether it's picking up objects, the precision of a virtual keyboard, or all sorts of weird experiments. But rather than just talk about it, let's see what those experiments are. So join me as I go through these different experiments and also have a chat with Dennys and Roger from Holonautic about the making of this demo. All right, so here we are in the
hand physics lab you can already see
here is a representation of my hands
it's not a flesh model of my hands, but it basically looks like all the points that the tracking system can see, and I'm just pressing these buttons to start Hand Physics Lab. Now there are some conditions here: make sure I have enough light in the room. Okay, already I can see, ooh, flesh and bone. Now you can see how those axes map to my fingers. Cool, now I'm
hitting this button to close it
excellent
this is of course in a pre-alpha state. I'm already noticing, yeah, there's a little bit of latency; this is exactly as I experienced in the other Oculus Quest demos. And I'll give you a sense of what happens when occlusion happens: yeah, so putting hands behind each other, in front of each other, blocking the cameras, the hands start to, well, forget; it prevents the tracking of the fingers.
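When the cameras lose sight of the fingers like this, the tracking data comes back flagged as low confidence, and the app has to decide what to do with it. A loose sketch of one common coping strategy (hold the last good pose and fade the hand out) is below; the per-frame structure is invented for illustration, not a specific SDK type.

```python
# Rough sketch: keep the last confident pose and fade the hand when tracking
# degrades. HandFrame stands in for whatever per-frame data the SDK reports.

from dataclasses import dataclass, field

@dataclass
class HandFrame:
    joints: list                 # joint poses for this frame
    hand_confidence: str         # "high" or "low"
    tracked: bool = True

@dataclass
class HandFilter:
    last_good: list = field(default_factory=list)
    opacity: float = 1.0

    def update(self, frame: HandFrame, dt: float):
        if frame.tracked and frame.hand_confidence == "high":
            self.last_good = frame.joints                       # accept fresh data
            self.opacity = min(1.0, self.opacity + 4.0 * dt)    # fade back in
        else:
            # Keep rendering the last confident pose, but fade it out so the
            # user sees the hand "give up" instead of glitching around.
            self.opacity = max(0.0, self.opacity - 4.0 * dt)
        return self.last_good, self.opacity
```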
Okay, first demo: let's open this door. Oh, okay, ooh, interesting,
nothing to see here now this is
interesting this is an interaction with
you know it's a very natural interaction
I knew exactly what I wanted to do here
grab the handle turn it and while I'm
not getting haptic feedback my brain
like instinctively thinks that I'm
holding on to something and that feels
quite natural to pull that lever down
the handle down and open the door and
even push it yeah okay here is a menu of
options a bunch of options on the side
here I can have Skeletor hands, maybe one, maybe the other. Now notice I'm using switches
here to toggle these things and it
requires a little bit of precise
movement. Arms, yeah, I definitely want to see my arms. My elbows are not being... are they being visually tracked? I actually don't know, or is it just IK? I think it's just IK; they were definitely out of the field of view. Yeah, it's just IK: oh, if I turn my wrist the elbows turn out even though my actual elbows are not turning, so it's a traditional IK model for that.
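Since only the head and hands are actually tracked, a virtual elbow like that is usually placed with a bit of two-bone inverse kinematics: given a shoulder position and a wrist position, solve the triangle for where the elbow has to bend, and use a hint direction to pick which way it points. A minimal sketch of that idea (plain trigonometry, nothing Quest-specific) might look like this:

```python
# A minimal two-bone IK sketch: place an elbow given shoulder and wrist
# positions plus a "pole" hint that says which way the elbow should point.
# This is generic trigonometry, not the Oculus SDK.

import numpy as np

def solve_elbow(shoulder, wrist, upper_len, fore_len, pole_hint):
    shoulder, wrist, pole_hint = map(np.asarray, (shoulder, wrist, pole_hint))
    to_wrist = wrist - shoulder
    d = np.linalg.norm(to_wrist)
    axis = to_wrist / d
    # Clamp so the triangle is always solvable (arm fully bent or fully straight).
    d = np.clip(d, abs(upper_len - fore_len) + 1e-6, upper_len + fore_len - 1e-6)

    # Law of cosines: angle between the upper arm and the shoulder-to-wrist line.
    cos_a = (upper_len**2 + d**2 - fore_len**2) / (2 * upper_len * d)
    cos_a = np.clip(cos_a, -1.0, 1.0)
    along = upper_len * cos_a                  # distance along shoulder-to-wrist
    out = upper_len * np.sqrt(1.0 - cos_a**2)  # distance out of that line

    # Bend the elbow toward the pole hint, projected off the shoulder-to-wrist axis.
    side = pole_hint - np.dot(pole_hint, axis) * axis
    side = side / (np.linalg.norm(side) + 1e-9)
    return shoulder + axis * along + side * out

# Twisting the wrist is typically handled by feeding in a different pole_hint,
# which is why the virtual elbow can swing even when the real one doesn't.
elbow = solve_elbow([0, 1.4, 0], [0.4, 1.1, 0.3], 0.3, 0.28, [0, -1, 0])
```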
Telekinesis, here goes. So I can
close a fist on here
it's a beam. Turn gravity on and off; I'll leave it on for now. The axes, that's
what you saw earlier so these are all
the positions of the joints of your
fingers, and if you count there are, I believe, 18 of those here including the wrist, and the wrist is the big one. You can see you have X, Y, Z in space. Interesting. And then status is also interesting, because not only do you have your position, your XYZ of the wrist for example, as I was just mentioning, but you also see hand confidence: high, finger confidence: high, and pinching. So there goes middle finger pinching, ah, index finger, middle finger, ring finger, pinky finger. So you have pinching across all of your fingers as potential different inputs, and that's what the system is acknowledging, with a high confidence level of saying, okay, I am actually activating something by pinching these fingers. And if I pinch (which one is it, I think it's my index finger) that's how I would access the main menu. Again, grasping, true or false, is another thing. True, true, yep, I'm grasping. This is all built into the hand tracking SDK that Oculus provides, and which is what Holonautic is tapping into.
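That status panel maps pretty directly onto the per-frame data the hand tracking exposes: a pose per joint plus confidence flags and a pinch state per finger. Here is a rough sketch of consuming that kind of data; the structures are invented for illustration and are not the literal SDK types.

```python
# Sketch of the per-frame hand data the status panel is visualizing:
# ~18 joint poses plus per-hand / per-finger confidence and pinch flags.
# The dataclasses are illustrative stand-ins, not the actual SDK structs.

from dataclasses import dataclass

@dataclass
class JointPose:
    name: str                       # e.g. "wrist", "index_tip"
    position: tuple                 # (x, y, z) in tracking space
    rotation: tuple                 # quaternion (x, y, z, w)

@dataclass
class TrackedHand:
    joints: list                    # list of JointPose
    hand_confidence: str            # "high" / "low"
    finger_confidence: dict         # finger name -> "high" / "low"
    pinching: dict                  # finger name -> bool (thumb + that finger)
    grasping: bool

def read_inputs(hand: TrackedHand):
    """Turn raw tracking state into discrete 'inputs', the way the demo does."""
    events = []
    if hand.hand_confidence != "high":
        return events                         # don't trust anything this frame
    for finger, is_pinching in hand.pinching.items():
        if is_pinching and hand.finger_confidence.get(finger) == "high":
            events.append(f"pinch:{finger}")  # each finger is its own input
    if hand.grasping:
        events.append("grasp")
    return events
```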
Let's turn off the status and go into a grabbing demo. Here we go, it's a bunch of
objects in front of me if I want to grab
that apple, yeah, sure. Oh, dropped it. Grab that... chucked it. I'm using my whole palm to grab it, so not using the fingers. But I want to
grab let's say just between my fingers
ooh okay it's a little slippery that's
the physics part of it but I can go whoa
bounce it up. How about this rock, I want to cup that in my palm. Slipping out; so easy to hold it face up, but not so easy to wrap my entire hand around it.
see if I can drop it in my hand and if
it's modeled well enough I can actually
grab that around my palm ah not so easy
how about objects like a hammer here
okay
also slippery ah
yeah stuff I want to push that coffee
mug Oh
smashed it I feel like a baby learning
for the first time how to hold things
yep that was satisfying. If I can grab the outside of the mug... a little slippery,
okay all right got to be very gentle
just like I tell my baby be gentle with
the objects in the world Hey okay there
it goes, haha. Let's move on to building; I can spawn things. Okay, here are some
building blocks different shapes okay
it's a little bit tough to hit these
switches have to be very gentle and
precise and make very deliberate
movements I find that almost opening my
hand up so I know the cameras can see
all my fingers makes it easier very
mindful of the occlusion yeah this is
interesting because you know obviously I
don't feel the object between my fingers
so I don't know how hard I need to pinch
this, but I am stopping my hands, and it's weird: in my head I almost feel a tingling, like as if I'm pinching a very light, airy object between my
fingers even though there's you know
nothing but air between my actual
fingers finger paint
okay cool that's it my beers in there
all right
oh yeah so if I try to push my hands
through the table
they obviously won't do it, but in the real world I would be able to; like, my real hands just pass through. Now these are all physics objects, but it's such a strange sensation as I push my hand against this. Again, I'm not actually feeling a flat surface, but my brain almost gives the tips of my fingers a little bit of a tingling sensation, as if I'm pressed against something. It's so strange.
I wonder if it's akin to like VR legs if
it's something that over time I won't
feel anymore or feel stronger but as I'm
doing this, I swear I feel a tingling sensation as I push my virtual hands against this completely virtual surface. Oh, so strange.
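The reason the virtual hands stop at the table while the real ones keep going is that what you see is usually not the raw tracked pose: it's a physics hand that gets pushed toward the tracked pose every frame, so colliders like the table top can hold it back. A heavily reduced one-dimensional sketch of that spring-damper drive is below; it's illustrative only, not Holonautic's actual implementation.

```python
# Reduced 1-D sketch: a physics-driven hand chases the tracked hand position
# with a spring-damper force, but a collider (the table top) can stop it.
# Illustrative only; the real thing runs in a 3-D physics engine.

STIFFNESS = 400.0   # how hard the physics hand is pulled toward the tracked pose
DAMPING = 30.0
MASS = 0.5
TABLE_TOP = 1.0     # height of the table surface

def step_hand(physics_y, velocity, tracked_y, dt):
    """Advance the physics hand one frame toward the tracked hand."""
    force = STIFFNESS * (tracked_y - physics_y) - DAMPING * velocity
    velocity += (force / MASS) * dt
    physics_y += velocity * dt
    # Collision: the visible hand can't sink below the table, even though the
    # real (tracked) hand keeps moving down through empty air.
    if physics_y < TABLE_TOP:
        physics_y, velocity = TABLE_TOP, 0.0
    return physics_y, velocity

# Tracked hand pushes well below the table; the rendered hand stops on top.
y, v = 1.3, 0.0
for frame in range(90):
    y, v = step_hand(y, v, tracked_y=0.7, dt=1 / 72)   # Quest runs at 72 Hz
print(round(y, 3))   # ~1.0: resting on the table
```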
All right, egg painting: picking up eggs. Oh, this
looks like it's gonna require very
precise control I don't think I'm gonna
be able to do this grab an egg drop it
in the paint. Nope. Yeah, that's for someone with more time and patience. Okay, this is
interesting typing enter text here now
can you use a virtual keyboard all right
nope. Let's see, uh-huh, I got 'the quick'... no... 'brown fox'... yeah, this is not gonna work.
backspace yeah enter enter okay let me
close my eyes and see if I can actually
type anything
yeah that's that's not gonna work but if
I do individual finger typing
clearly virtual keyboards and hand
tracking or finger tracking have a long
way to go before they can be paired well
and used in the same way that we would
use a traditional keyboard how about
fighting oh okay
hello there punching punching yeah this
is not bad I can just a little bit of
latency but I want to grab these knives
stab Oh see if I can grab this and throw
or how about a dumbbell alright this is
again the physics simulation, so the latency is a little longer here because it's simulating weight. It's quite
satisfying
Oh terrifying shooting all right
finger guns? no. Hmm, I'll grab one of these. Oh, aha, there you go, the Force powers, wow,
okay
that is very satisfying
not super precise but again there's a
lot of potential in hand tracking figuring out the gestures, trying to determine my intention of what I'm trying to do with this force push. There
goes
useless machine here goes
oh, it's a classic useless machine: hit the switch and it turns itself off. Neat. Two-handed
object Oh
again slippery doesn't snap on to these
handles, but I can lift it up and drop it.
now this is interesting hand slicer turn
on the laser do not touch it turn on
this laser and this I've seen on Twitter
where I can oh my god what that's so
strange my hands are cut off obviously
they're still on me in the real world
but the hand tracking is working and I
can still manipulate them and they're
dancing and crawling oh my god this is
so crazy
No what's going on oh it's back here
okay the hands are back bringing that
hand back uh no what all right let's try
cutting one hand off and then grabbing
it with the other hand oh my god this is
so strange
grabbing my hand and I'm still ha what
this is so surreal I'm holding my left
hand with my right hand, and that kind of solves the occlusion problem, right,
because my wrist is here and it's
tracking the movements of this hand it's
like a face hugger oh that is trippy
what if I put it facing me let's cut
this off this way oh my god weird okay
it's flipped now so I'm moving my thumb
it's reversed and I'm holding it down oh
my god whoa that's nuts
and then finally puppy so this is a
character from Holoception and I can
pick him up oh this is trippy oh well I
don't like this I mean I really like it
this is so weird
where'd he go okay there goes so I'm
grabbing him with one hand see if I can
get that to work again see if I grab his
head and then he's like wiggling around
he's crawling and I swear to you
like I feel his little hands touching my
thumb I feel like it's like a real
physical sensation but like my brain
thinks there is definitely something
brushing against my thumb or some weird
tingly feeling oh my god this is so
trippy an animated character that is
resisting against me grabbing it now why
doesn't this feel like this when I'm
using like a touch controller
maybe it's cuz I'm not holding anything
at all but this is this is bizarre Wow I
could imagine him being a spider or something, or a snake, wrapping around my arm or crawling on my
virtual hand hey there hey there what
are you doing? Grabbing him by his leg, this
is so weird I love it
but it's also really disturbing if you
have an Oculus Quest you can try this out via SideQuest; it's the Hand Physics Lab, in pre-alpha right now. And stay tuned: I'm gonna jump into an interview with Holonautic's Dennys and Roger about the making of this, about their thoughts on hand tracking, how it works, and the potential for it in video games. So let's jump to that. When Oculus
announced that there would be hand
tracking support, you know, on the Quest, using computer vision, the cameras on here tracking your hands, you know, a lot of the questions from both the users and, I'm sure, the developer community were: what does that mean for VR interaction? We've all been very familiar with, you know, touch controllers and having hand presence that way, which is an abstraction that approximates the positions of your fingers based on where they rest on the controller. But here now you have more precise finger tracking. So as a
developer what are you able to tap into
using the cameras and I guess what
information does the camera give you to
show your hands in the game and in an
experience well the hand tracking
there are so many beautiful things about it, and also many limitations, because it's not perfectly precise; the tracking depends on so many other aspects, like the quality of the light you have in your room, whether you obstruct one hand with the other. Those are all limitations you have to take into account when developing for it. But what it can bring, really giving you full information about where your fingers are in real time and what you can do with that, it's really one step forward. As we like to call it, in some way it's really a new way to be immersed in any kind of application, that with your eyes you can see what your fingers are doing in real time. And we really like that, because we explored it a lot in our research, we experimented a bit with it: the brain really likes to interact, and the main thing we interact with are our fingers. So being able to see them in real time, and having what you expect to do with your fingers actually happen, and seeing that in VR, is really a huge step
forward. Absolutely; previously, when we just had tracked controllers, they could approximate whether your index finger (they let you point with it) was basically extended or closed, so you could push buttons, but you don't get precise, analog detection of it. In the application, in the experience, you allow the user to reveal the kind of joint axes, these angles, that the system can see. Can you shed some light on how you're tapping into that? Like, on one hand I noticed almost eighteen different points; is that the
limit of what the system gives you
access to? That's exactly what the system provides us. Basically, up to now we had, like, six different points with six degrees of freedom each, but now with the hand tracking you basically know the position and rotation of each bone, so it's like 30 times more information you can get from reality, or get from the system. That's provided by the headset, and then you have the hand tracking and you can basically get that information. That's what we did: we basically applied physics rules to the information you get and tried to make it interact physically with the world, but still tried to respect, as much as it can, the information provided by the system. Right, so it tries to map the virtual world to the information provided, to basically adapt, to constrain, and to make it look realistic in a physics-based manner.
That's right. So every one of those axes is its own six-DOF tracked point, yes, and then combined you get a skeletal model, which is a visual representation, and it works really well, you know, with minor latency. But you're then applying design for interaction on top of that. So let's talk about the interaction: you're grabbing things, holding things, throwing things; obviously people want to do typing on a keyboard, and you have an experiment for that. Can you talk about the different types of interactions you're providing and what you've learned from working on that? Well,
yeah, so you know that in many of your experiences with the controllers, when you grab something it's usually snapped to a point, especially like Half-Life: Alyx, which is probably one of the most well known; they have predefined points for how you grab something. In this experience we wanted to explore: what if there are absolutely no limitations, and it's all just driven by mathematics, or physics, in the background? So that, for example, if you take the crowbar, right, you can take it any way you want, and there is no fixed position where we define how you have to grab it; it's all based on friction, the physics-based colliders, collisions, etc. And that gives a lot of freedom, but it's also a lot harder to actually do; it's less convenient, maybe, in the beginning, because the tracking is not perfect yet. But we also wanted to explore that range of flexibility which those hand tracking features actually provide. And of course, when you take a coffee cup, you have unlimited ways to grab it, and although it's very hard in the beginning to get those interactions to feel good, and we're still working very hard on making that feel better, with friction, with how the hands are tracked, etc., still, once you have such a system defined, you can then just add more and more objects into it, and you can grab nearly everything without defining a lot of grab points, etc. The additional thing it adds, compared to the very initial grabbing system we've had in many VR apps, is this: you could just grab an object, but as soon as you grabbed it and it was going through stuff, you really had a feeling of, oh, I'm just a ghost in that virtual environment; I can grab stuff, it snaps to my hand, but if I put my hand inside an object I just go through, because it's really not physics based. It's more like, okay, I can do stuff, I can interact if I press the button, but there's not really that immersive feeling of being able to really touch stuff.
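To put the difference he's describing in concrete terms: a snap grab just parents the object to an authored pose, while a fully physics-driven grab only holds the object when opposing contacts and friction say it should. The check below is a heavily simplified sketch of that second idea (an antipodal-contact test plus a friction budget), using invented data structures rather than Holonautic's code.

```python
# Simplified sketch of a friction-based grab test: the object is held only if
# at least two finger contacts press on it from roughly opposite sides and the
# available friction can carry its weight. Data structures are illustrative.

from dataclasses import dataclass

@dataclass
class Contact:
    normal: tuple          # unit normal pointing from the object toward the finger
    pressing_force: float  # how hard the finger pushes along that normal (N)

FRICTION_COEFF = 0.8       # roughly, skin on a dry object

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_grabbed(contacts, object_weight):
    """Very rough antipodal-grasp test standing in for the physics engine."""
    if len(contacts) < 2:
        return False
    # Need at least one pair of contacts squeezing from roughly opposite sides.
    opposed = any(dot(a.normal, b.normal) < -0.5
                  for i, a in enumerate(contacts) for b in contacts[i + 1:])
    if not opposed:
        return False
    # Total tangential (friction) force available must carry the object's weight.
    available_friction = sum(FRICTION_COEFF * c.pressing_force for c in contacts)
    return available_friction >= object_weight

# Pinching an apple (~1 N) between thumb and index finger:
pinch = [Contact((1, 0, 0), 2.0), Contact((-1, 0, 0), 2.0)]
print(is_grabbed(pinch, object_weight=1.0))   # True: it stays in the hand
```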
right and here you have these rigid
bodies not just a flat palm but every
part of your finger. You're asking the user (this is a choice you made) to somewhat relearn how they
interact with objects because like you
said the coffee cup I'm not holding it I
want to hold it the way I hold a real
coffee cup but I'm finding I'm holding
it differently to support the system
that's being provided mmm and of course
there's still a lot of things we want to
explore and get feedback, also from everyone who tests the app: what feels natural, what doesn't, and how we can improve it further. That's why it's more of a research project for us, where we want to explore this realm of, okay, fully physics based, no clearly defined snapping points. Certain things are harder, and we know they're not always working the way you expect, because it's all physics based, but it also provides a lot of richness and novelty, we think, in that space to explore. And that's why we also want to share it at a maybe a bit early stage, in pre-alpha,
where you know things need still quite a
bit of work but we we had so many
requests of sharing that experience
because many could not imagine how it is
with hand tracking to go beyond just the
oculus menu that we wanted to make it
available relatively early to get
feedback learn from it improve on it and
see where it goes do you imagine
yourself in the future as you're
designing these interactions to be
implemented in a game environment: so, starting to make lists of the poses, the grab poses, that are the most natural for people, you know, whether you're holding something like a cup or a cylinder or a cube, that then you can... maybe it's like auto-aim for grabbing, assistance to complement the physics-based grabbing? Yeah, I think
really, for the best possible feeling, it's about finding the right balance between having full freedom to grab and touch an object the way you want, where you really have full freedom, or, for specific objects like a pencil for example, you need to snap it, because if you really bet on the limitations of the physics engine to be able to precisely grab a pencil and draw with control, that will really not feel great at the current stage. Of course, maybe in a few years it will get a lot better, but at the current moment, for some specific objects, like a gun of course, there is only one real, ideal way to grab a gun and be able to use it; some tools also we can ideally snap. And yeah, it's all about the right balance. We also have to determine what the purpose is of having a specific interaction or component or tool in that environment, what the user will be willing to do with it, and either have full freedom, or have a few snapping points, or both.
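That balance lends itself to a simple per-object decision: objects that demand a precise orientation (a pencil, a gun) carry authored snap poses, and everything else falls back to the free physics grab. A minimal sketch of that dispatch, with the grab strategies themselves left as placeholders:

```python
# Sketch of the hybrid approach: snap to an authored pose when the object has
# one, otherwise fall back to the free, friction-based grab. Placeholder types.

from dataclasses import dataclass, field

@dataclass
class GrabbableObject:
    name: str
    snap_poses: list = field(default_factory=list)   # authored hand poses, if any

def closest_snap_pose(obj, hand_pose):
    # Placeholder: a real version would compare hand_pose against each
    # authored pose and pick the nearest one.
    return obj.snap_poses[0]

def begin_grab(obj, hand_pose):
    if obj.snap_poses:
        # Precision objects (pencil, gun): jump to a known-good pose.
        return ("snap", closest_snap_pose(obj, hand_pose))
    # Everything else: let friction and collisions decide, as in the demo.
    return ("physics", hand_pose)

print(begin_grab(GrabbableObject("apple"), hand_pose="palm_down"))     # physics grab
print(begin_grab(GrabbableObject("pencil", ["tripod_grip"]), "any"))   # snap grab
```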
and how do you imagine hand tracking
being used for interaction with menus
and in UI, without buttons? In the Oculus menu right now they have ray casting and a gesture-based system to basically have a binary on and off. Have you guys thought about that, and what do you see as the opportunity for gestures, for simulating, you know, button
presses and activating menu options
mm-hmm, so the SDK, at least at the current point, um, has very limited gesture detection: there's the pinch gesture, you know, and the scrolling; those things you can easily do. But there's not a lot of range we get directly from the SDK to define different poses; you have to do a lot of work to basically extrapolate that from the joint positions you get.
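Rolling your own pose detection on top of the raw joints usually ends up looking something like this: compute a curl value per finger from joint positions, then match the curl pattern against a few hand-authored poses. The sketch below is a generic illustration with invented joint data and thresholds, not the Oculus SDK's gesture API.

```python
# Sketch of hand-rolled pose detection on top of raw joint positions:
# measure how curled each finger is, then match the pattern against a few
# authored poses. Joint data and thresholds are invented for illustration.

import math

def finger_curl(knuckle, tip, finger_length):
    """0.0 = fully extended, 1.0 = fully curled toward the knuckle."""
    return 1.0 - min(math.dist(knuckle, tip) / finger_length, 1.0)

POSES = {
    # name: expected curl per finger (thumb, index, middle, ring, pinky)
    "fist":       (0.8, 0.9, 0.9, 0.9, 0.9),
    "finger_gun": (0.1, 0.1, 0.9, 0.9, 0.9),
    "open_palm":  (0.1, 0.1, 0.1, 0.1, 0.1),
}

def detect_pose(curls, tolerance=0.25):
    for name, template in POSES.items():
        if all(abs(c - t) <= tolerance for c, t in zip(curls, template)):
            return name
    return None

# An index finger whose tip sits far from the knuckle reads as extended:
print(round(finger_curl((0, 0, 0), (0.0, 0.0, 0.07), finger_length=0.08), 2))  # 0.12

# Example frame: thumb and index extended, other fingers curled in.
curls = (0.15, 0.05, 0.95, 0.9, 0.92)
print(detect_pose(curls))   # "finger_gun"
```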
And for us, what we really like to compare it with: the best user interface, for us, is one you don't have to explain to the user, one that's basically intuitive. Like, he sees a button and he wants to press it, he sees an object and he wants to grab it, without having to go to a settings panel to say what he wants and having to, like, click on a button by pinching with the laser pointer. That works in some way, but it's not really intuitive for most users the first time they jump into VR. If they can just see their hands, and they see a button and know they can press it, we don't need to explain it to them in any way. And for us that's really important: to make everything really intuitive for the user, to be able to interact with their hands. It's also a way of, like, when
you have for example an action which you
don't want the user to do by accident
instead of, like, having a long press or something, you can make the interaction just more physics based and harder, like a handle or a lever which you have to pull down hard until it activates.
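That lever-instead-of-long-press idea boils down to requiring a sustained physical motion past a threshold before a risky action fires. A tiny sketch of that gate, with the angles made up for illustration:

```python
# Sketch of a physics-style confirmation: a destructive action only fires once
# the lever has been pulled past a threshold angle, so stray touches do nothing.
# Angles and thresholds are made up for illustration.

ACTIVATE_ANGLE = 70.0   # degrees the lever must be pulled down
RESET_ANGLE = 20.0      # must return here before it can fire again

class ConfirmLever:
    def __init__(self, on_activate):
        self.on_activate = on_activate
        self.armed = True

    def update(self, lever_angle: float):
        if self.armed and lever_angle >= ACTIVATE_ANGLE:
            self.armed = False
            self.on_activate()              # e.g. "delete save file"
        elif not self.armed and lever_angle <= RESET_ANGLE:
            self.armed = True               # re-arm only after a full release

lever = ConfirmLever(lambda: print("confirmed!"))
for angle in (5, 30, 55, 72, 40, 10, 75):   # a brushed touch, then two real pulls
    lever.update(angle)                      # fires at 72, re-arms at 10, fires at 75
```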
You basically have more options there that we want to explore, also going away a little from the 2D menu design and building more 3D components, and seeing how natural that feels. It doesn't work for everything; certain things need to be fast and precise, and some 2D menus still work very well. But for other things, especially when you see how people use the hand tracking, they want to touch things, they want to interact with things, and having the menu, so to speak, spread out into the full 360 degrees around them, where they can more contextually interact with things, is something that we're really excited about, to see how well that actually works on many occasions. Right, it's a
balance between the confidence and
precision of the poses and information
you're getting from the system and then
also the potential fatigue even though
you might get more confidence with
bigger actions. Because for some actions it's way better to keep your hands still and just do some precise movement, and really do limited interactions or limited movements; if you have to really do this every time you want to go to settings, it's really annoying and tiring for the user. That's why it needs to be limited. As soon as you can do some playful interaction it's fine to have to move your arms a lot, but if it's to do some, like, productive stuff, or more like really setting up stuff, it needs to be
really fast and precise yeah speaking of
playfulness something that was extremely
interesting in the experience and
something that went viral when you
shared it on social media was this disassociation, this disembodiment of your hands: there's a tool where you can slice off your wrist, and basically your hand becomes a physics object that you can move around, but it's still tracking the movements. It's almost like puppeteering, like what you have in Holoception, where you're controlling a tiny avatar of yourself. How did you come up with that, and what do you see as the opportunity of being able to use your hand but not see it connected to your wrist?
well, that's always what we've liked to experiment with. Since the beginning we started, like, making applications or games pushing the brain, the capacity of the brain, to interact with stuff while still keeping the immersion in some way, so that you can easily keep control of what you're doing. That's what we did with Holoception: you can have that out-of-body perspective and still be able to, like, intuitively move around, grab objects, fight. And we wanted to experiment with that here as well: if you just cut your hand off, you see your hand from another angle and you try to control it. Is it still intuitive? Does it feel weird? How does your brain take to that? And there was a lot of research done, also decades ago, about that, like with the rubber hand experiment and all that, and we wanted to experiment
with that yeah I think that it's just
like such a weird moment when that
happens and your hand is suddenly in
front there but you still know it's
tracked perfectly and we just definitely
want to have that in there because it's
like one of those magical moments where
it's it's so confusing to the brain but
also so exciting at the same time and
it's so weird to be able to detach your
hand you grab your hand you deactivate
gravity and then you see your hand
floating you can just push it around
there is so much stuff that can be
played around with that we didn't even
expect; it just happened that we had the capacity to, say, deactivate gravity and detach your hand, and it just, like, came naturally that you could do that, and it was just fun. And we mentally have the capacity also to kind of
reconcile that with a little bit of
learning curve it's yes it's so
interesting that we can even though it's
very strange you can still understand
what's going on and manipulate even
though your hand isn't where it's been
its entire life. Yeah, we always make the comparison with driving a car, right: the first
time you drive a car it's very
complicated you have to think about
every single movement and after a while
the car basically just becomes an extension of your body, and
you just don't need to think it's just
automatic and those kind of mechanism of
the brain how to adapt to unfamiliar
situations are the things which we really like to explore. Holoception did that as well, where you have something in front of you which is actually you, but not really, and it's those kinds of tricks of the mind which we really just enjoy: to see how people adapt over time, become better at it, and manage to disassociate themselves by getting more used to it being something in front of them rather than
themselves and there's no better way to
discover those than to actually create
experiments like this which is what you
guys are doing and thank you for letting
the users out there experience this as
well what's next what are the plans for
this after you release the alpha? Is it more experiments, is it implementing these interaction models in entertainment, in games, what do you want to do with this?
well we mainly want to get first a lot
of feedback from the people trying it
from the community, and see what is best: do they like the feeling they have when they interact with stuff, what kind of experiences do they want us to look more into, to try new
stuff actually the main reason why we
put it out there so early is because we
really want to get a lot of feedback
because there is so much stuff that can
be experimented with we want to we want
to explore further but we also want to
know what the people want to experience
and whether this is something we're doing right. Yeah,
we had, like... we had experiments where, maybe you saw it on Dennys' Twitter as well, we had a portal where you go through a wormhole and the hand comes out on the other side and you're still controlling it. We still want to add more different types of experiments, to see, like, what can potentially be gamified and what can be the core of it. But for that, for us, first of all, Oculus has not opened up the application process for hand tracking experiences; in any case, what we want is to get it out there, to learn as much as we can, to build something which is exciting and interesting for us, while not breaking the immersion or making it too complicated and hard, because
sometimes it's very hard to be sure that
everyone has the right mindset or those
proprioceptive capabilities work in every user the same way. So that's why we're going to get it out there fairly early, to be sure that we build something which the majority of people can enjoy and associate with. Absolutely. Well,
congratulations on the launch it's super
fun I think everyone should at least try
it out and I can't wait to see what you
guys work on next and what other
experiments you put out there thank you
both so much for your time and again
congratulations on the launch
thank you very much thank you
