In this video we'll be presenting Embodied Axes: tangible, actuated interaction for 3D augmented reality data spaces. Modern immersive environments such as virtual and augmented reality offer true stereoscopic environments and the ability to engage directly with 3D visualizations. Most AR and VR environments allow us to walk around 3D virtual objects, peer around them, and grab them with our hands or with tracked controllers. While this input style allows engaging in more natural interaction, there are inherent issues with using our hands or tracked controllers for data visualization purposes: for example, the lack of affordances, fatiguing interaction, and imprecision. Our arms and hands deter or prevent detailed interaction, for example selecting a point or a precise value along a visualization axis. There
is momentum for medical practitioners to put their data into virtual or augmented reality. The operations that radiologists or general practitioners perform with 3D imaging require precise interactions, such as slicing and isolating 3D regions of interest. This has been studied in previous work, such as that of Sousa et al., using 2D surfaces and input to support 3D medical imaging. In this work we introduce a more natural 3D interactive mapping to support these operations in immersive environments. We present Embodied Axes, a device for making precise selections in three-dimensional visualizations. The device is designed to be used with an augmented reality head-mounted display, such as the Meta 2 pictured here. The AR display provides the immersive visualization of the data, while the user input is performed with the Embodied Axes.
When designing the Embodied Axes, our aim was to provide a coherent environment in which you can still naturally interact with your hands while supported by the tangibility of physical affordances for precise selection. To achieve this, we had a set of design goals and principles that drove development. First, we wanted to support a set of common data tasks that occur in 3D space, such as filtering and selection. Second, we wanted to support precise selection, as we found that this is what our experts expected, based on the interviews we had. We also wanted to provide a physical frame of reference for these tasks, which is enabled through tangibility. And finally, we realized that multimodal input was essential to supporting both precision and the fluid interaction available through our hands.
The Embodied Axes physically embodies a three-dimensional data visualization space. It provides this physical 3D frame of reference with three axes for the X, Y, and Z dimensions. Each axis has a range slider composed of two physical sliders, and these sliders are actuated, so they can actually move by themselves. Each axis also features a rotary button. The sliders afford direct, tangible, and precise input. The actuation supports coordinating the user's hand position with the configuration of the visualization if needed, so there can be a feedback loop between the input and the output. The actuation also supports haptic encodings, potentially allowing the user to feel the data they are examining. The result is coordinated gestures supported by physical affordances to explore and examine data; this is known as spatial data coordination. The user of the device can independently set, for each axis: single values, using the position of a single slider; range values, using the positions of both sliders on the same axis; fixed-range values, using the actuation of a single slider that follows another slider; and rotation values, using the rotary button, which can represent deltas or continuous values.
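These per-axis selection modes can be summarized in a small sketch. This is an illustrative model only, assuming slider positions normalized to [0, 1]; the class and method names are hypothetical, not taken from the actual device firmware.

```python
from dataclasses import dataclass

# Hypothetical model of one axis of the device; slider positions are
# normalized to the track [0, 1]. Names are illustrative only.
@dataclass
class AxisState:
    slider_a: float  # first physical slider
    slider_b: float  # second physical slider
    rotary: float    # accumulated rotary-button value (deltas)

    def single_value(self) -> float:
        """Single-value selection: the position of one slider."""
        return self.slider_a

    def value_range(self) -> tuple:
        """Range selection: the span between both sliders on this axis."""
        lo, hi = sorted((self.slider_a, self.slider_b))
        return (lo, hi)

    def follow(self, width: float) -> None:
        """Fixed-range selection: actuate slider_b to trail slider_a
        at a fixed distance, clamped to the physical travel."""
        self.slider_b = min(1.0, max(0.0, self.slider_a - width))

# Example: the X axis selects the range [0.25, 0.75]; actuating a
# fixed 0.25-wide follow then moves slider_b to 0.5.
x = AxisState(slider_a=0.75, slider_b=0.25, rotary=0.0)
print(x.value_range())  # (0.25, 0.75)
x.follow(0.25)
print(x.value_range())  # (0.5, 0.75)
```

The `follow` method mirrors the actuated fixed-range behavior: driving one slider moves the motorized second slider so the selection width stays constant.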
The user can slice data along a single dimension with one slider, or select a slice range using both sliders on an axis. The actuation can set the sliders to follow each other at a fixed distance, creating a fixed-range selection. To explore remote collaboration between two users, we replicated the Embodied Axes prototype device and created a network layer. We tested this setup between two Australian cities, and we observed that the device can potentially provide a valuable cue for increasing the sense of presence of the remote collaborator.
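The video does not detail the network layer's protocol; as a minimal sketch, assuming a simple JSON message format (an assumption for illustration, not the actual implementation), each device could serialize its slider state and the remote device would actuate its motorized sliders toward the decoded targets:

```python
import json

# Hypothetical state-mirroring messages for two replicated devices.
# `axes` maps an axis name to its two slider positions in [0, 1].
def encode_state(axes: dict) -> bytes:
    return json.dumps(axes).encode("utf-8")

def decode_state(payload: bytes) -> dict:
    # The receiving device would drive its motorized sliders
    # toward these decoded target positions.
    return json.loads(payload.decode("utf-8"))

local = {"x": [0.2, 0.6], "y": [0.0, 1.0], "z": [0.4, 0.4]}
print(decode_state(encode_state(local)) == local)  # True
```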
We also explored integrating the Embodied Axes with several input modalities. Here we illustrate the natural hand-tracking integration with the Leap Motion controller. With this configuration, the user can use a rubber-band interaction between their tracked fingers to select a volume. The actuated sliders reflect the volume and can be used to fine-tune the selection along each axis.
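One way to picture this mapping, as a hypothetical sketch assuming fingertip coordinates already normalized to the device's [0, 1] frame: the two tracked fingertips span an axis-aligned volume whose per-axis ranges become actuation targets for the sliders.

```python
def volume_from_fingers(p1, p2):
    """p1, p2: (x, y, z) fingertip positions in the device's frame,
    normalized to [0, 1] per axis. Returns the per-axis (lo, hi)
    ranges the sliders can be actuated to reflect."""
    return tuple((min(a, b), max(a, b)) for a, b in zip(p1, p2))

# A rubber-band gesture spanning two corners of a selection volume:
print(volume_from_fingers((0.1, 0.8, 0.3), (0.5, 0.2, 0.9)))
# ((0.1, 0.5), (0.2, 0.8), (0.3, 0.9))
```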
Embodied Axes can also be used in the context of abstract data visualization: range selection, value filtering, and volume selection are all supported in this context. The rotary dial can also be used to map data dimensions to specific axes. We ran a formative study with three medical practitioners: a radiologist, a forensics expert, and a 3D medical imagery engineer. In this study we explored their common tasks involving 3D visualization and interaction, and we explored how Embodied Axes can support these tasks. We received positive feedback on its ease of use, its intuitiveness, and its tangibility for precise selection. The experts suggested that Embodied Axes could be used to measure regions in the 3D data, or that the actuated sliders could even track a needle in the 3D volume. They also identified the need to be able to rotate the 3D object while keeping the 3D frame of reference. This formative study informed the tasks that we ran in the follow-up controlled user study.
To assess the performance that users can achieve with the Embodied Axes, we conducted a controlled user study. We compared the Embodied Axes to state-of-the-art six-degree-of-freedom (6DoF) tracked controllers in an augmented reality environment. In this study we tested three tasks: target selection, volume slicing, and 3D bounding-box selection. These tasks were informed by the formative study with the domain experts. For the target task, the user had to match the value and position of the gray target with the red target. For the slice-finding task, they had to find the center of the biggest sphere hidden within a 3D volume. And for the bounding-box task, they had to define the smallest bounding box around the red dots in a 3D scatter plot. With the tracked controllers, participants manipulated the targets and the volume directly with the motion controller; in the Embodied Axes condition, they had to move each slider knob. We found that for precise selection, participants were 56% faster with the Embodied Axes and were more precise. For the bounding-box task, we found a trade-off between time and accuracy: people made more casual selections with the tracked controllers. For the volume-browsing, or slicing, task we found no difference between the Embodied Axes and the tracked controllers, but we did find that the Embodied Axes may be preferable for extended use due to less fatigue, based on our post-study survey. More information about these results can be found in the paper.
So, to wrap up: we presented Embodied Axes as a new controller for interacting with 3D immersive visualizations, and we demonstrated the application of this controller in the medical practitioner domain, which inspired a set of novel interactions that use the motorized sliders that are part of the design of the Embodied Axes. We compared it to state-of-the-art 6DoF controllers and found clear advantages for using the Embodied Axes for particular selection tasks. For future work, we need to support more interactions, such as scaling and rotating: essentially everything that breaks the frame of reference that we've established. We also need to explore collaborative use cases, as the domain experts found this to be a particularly compelling use case for the technology. And again, for more details, questions, and follow-ups, please look into our paper.
