So far, we've seen augmented reality demos controlled by gestures performed in front of head-mounted cameras. Such interaction may seem intuitive, but in fact it comes with many serious limitations. When there's no surface to touch, the user gets no haptic feedback and can't intuitively predict when exactly the desired action will happen. Nor does the computer feel the gesture. The user has to wait after carefully performing each movement to give the computer time to recognize what has happened.
Move too soon and you interrupt your own gestures! Our fingers often hide one another, so the user has to make sure the camera can see them. All interaction is limited to your field of view, which means you have to keep looking at your hands all the time. And of course it's tiring to hold your arm up in front of the camera for long periods; since as far back as the 1960s, this has been known as Gorilla Arm Syndrome. All these issues make interaction with AR a slow, clumsy, and frustrating experience.
Phantom Touch introduces a new interaction model based on inertial tracking. It monitors subtle finger and hand movements no matter where the user's hands are. It also uniquely registers the touch of two bare fingers, creating a natural feedback surface.
There are a few carefully placed buttons
for specific tasks. The controller doesn't take
away your own sense of touch
and can be comfortably worn all day.
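To make the idea of registering touches from inertial data concrete, here is a minimal sketch of how a finger tap might be detected as a short acceleration spike in an IMU sample stream. The threshold and refractory window are illustrative assumptions, not Phantom Touch's actual parameters or API.

```python
# Sketch: detect a finger tap as a brief acceleration spike in an
# inertial (IMU) sample stream. Threshold and refractory values are
# illustrative assumptions only.

def detect_taps(samples, threshold=2.5, refractory=3):
    """Return indices of samples where a tap-like spike occurs.

    samples:    acceleration magnitudes (in g), one per IMU reading.
    threshold:  spike level that counts as a tap (assumed value).
    refractory: samples to ignore after a tap, so one touch
                isn't counted twice.
    """
    taps = []
    skip_until = -1
    for i, a in enumerate(samples):
        if i <= skip_until:
            continue
        if a > threshold:
            taps.append(i)
            skip_until = i + refractory
    return taps

stream = [1.0, 1.1, 3.2, 2.8, 1.0, 1.0, 1.0, 3.5, 1.2]
print(detect_taps(stream))  # → [2, 7]
```

A real implementation would work on filtered multi-axis data rather than raw magnitudes, but the principle is the same: a tap is a sharp, short-lived spike against a quiet baseline.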
The first interaction area to focus on is the flat interface on a virtual surface around the user. Pointing the index finger forward calls up the cursor. Tapping the middle finger performs a selection or confirmation. Opening the palm calls up a context menu or the icons on the home screen. Swiping left and right with two fingers navigates back and forward. A fingertip acts as a tiny, sensitive touchpad used for scrolling. When we move to 3D, similar principles apply: a cursor points at objects around the user, and a click selects or holds them. Then you can rotate, resize, modify, or build from the objects around you as intuitively as with a mouse.
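The gesture vocabulary described above amounts to a small lookup from recognized gestures to UI actions. The sketch below paraphrases the narration; none of these identifiers come from an actual Phantom Touch API.

```python
# Hypothetical mapping of the gestures described in the narration to
# UI actions. Names are assumptions made for illustration.

GESTURE_ACTIONS = {
    "point_index": "show_cursor",          # index finger forward
    "tap_middle": "select",                # selection / confirmation
    "open_palm": "context_menu",           # context menu / home icons
    "two_finger_swipe_left": "navigate_back",
    "two_finger_swipe_right": "navigate_forward",
    "fingertip_scroll": "scroll",          # fingertip as touchpad
}

def dispatch(gesture):
    """Translate a recognized gesture into a UI action, or None."""
    return GESTURE_ACTIONS.get(gesture)

print(dispatch("open_palm"))  # → context_menu
```

Keeping the mapping in one table like this also makes the same gestures reusable between the flat interface and the 3D mode, which is exactly the "similar principles apply" point.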
The last important area is typing on the go, because we don't always want to speak aloud when surrounded by others. Your hands form the two halves of a keyboard: the buttons and three bare fingertips represent groups of characters, similar to those on old mobile phones.
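The grouped-character scheme can be disambiguated the same way old phone keypads did with T9: encode each word as its key sequence, then look up dictionary words that share that sequence. The letter grouping and tiny lexicon below are assumptions for illustration, not the controller's actual layout.

```python
# Sketch of T9-style disambiguation for grouped keys. The grouping of
# letters onto "keys" (buttons/fingertips) and the lexicon are
# illustrative assumptions.

GROUPS = {
    "1": "abc", "2": "def", "3": "ghi", "4": "jkl",
    "5": "mno", "6": "pqrs", "7": "tuv", "8": "wxyz",
}

LEXICON = ["hello", "help", "hold", "world"]

def key_for(ch):
    """The key whose character group contains `ch`."""
    for key, letters in GROUPS.items():
        if ch in letters:
            return key
    raise ValueError(f"no key for {ch!r}")

def encode(word):
    """Key sequence a user would press to enter `word`."""
    return "".join(key_for(c) for c in word)

def candidates(keys):
    """Dictionary words matching a key sequence (T9-style lookup)."""
    return [w for w in LEXICON if encode(w) == keys]

print(candidates(encode("hello")))  # → ['hello']
```

One press per letter plus a dictionary lookup is what makes grouped keys "quite an effective way to type": ambiguity is resolved by the lexicon instead of by multiple taps.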
This turns out to be quite an effective way to type, especially once T9-style prediction is added. All this, combined with a specially designed user interface, finally makes it possible to fully enter the era of augmented reality and wearable computing.
