We explore the idea of augmenting a wearable AR display with an actuated spatial augmented reality projector.
We call this AAR, short for Augmented Augmented Reality.
Our prototype is a first-generation HoloLens with a custom bracket that mounts a small pico projector.
The projector can be actuated in two axes using pan and tilt servo motors.
To prototype AAR experiences, we created a toolkit with abstractions to work with the
hardware and virtual environment.
This includes:
an HMD-Projector Controller to provide a high-level API for projector and servo movement;
a Spatial Awareness Manager to process HoloLens mesh data for plane finding and semantic categorization;
and a Rendering Manager to control where virtual content is displayed,
on the projector, the HoloLens, or a blend of both;
to create projected view-dependent renderings for the HoloLens user or a bystander;
and to project content that appears as an external display.
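The three abstractions above could be sketched roughly as follows. This is a minimal Python sketch with hypothetical class and method names, not the toolkit's actual API (which is not shown here); the pan-tilt math simply converts a target point in the projector's base frame into clamped servo angles, and the rendering manager routes content by a blend factor.

```python
import math

# Hypothetical names for the toolkit's abstractions (assumptions, not the real API).

class HMDProjectorController:
    """High-level pan-tilt control: aim the projector at a world point."""

    def __init__(self, pan_range=(-90.0, 90.0), tilt_range=(-45.0, 45.0)):
        self.pan_range, self.tilt_range = pan_range, tilt_range
        self.pan = self.tilt = 0.0

    def aim_at(self, x, y, z):
        # Convert a target point in the projector's base frame to pan/tilt
        # angles, then clamp to the servos' mechanical range.
        pan = math.degrees(math.atan2(x, z))
        tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
        self.pan = max(self.pan_range[0], min(self.pan_range[1], pan))
        self.tilt = max(self.tilt_range[0], min(self.tilt_range[1], tilt))
        return self.pan, self.tilt


class RenderingManager:
    """Route content to the HMD, the projector, or a blend of both."""

    def __init__(self):
        self.blend = 0.0  # 0 = HMD only, 1 = projector only

    def route(self, blend):
        # Clamp the blend factor and report per-display opacities.
        self.blend = max(0.0, min(1.0, blend))
        return {"hmd_alpha": 1.0 - self.blend, "projector_alpha": self.blend}
```

For example, `aim_at(1.0, 0.0, 1.0)` yields a 45-degree pan with zero tilt, and `route(0.3)` keeps content mostly on the HMD while fading it onto the projector.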
With our toolkit, we created a series of use cases that explore the design considerations
laid out in the paper for enhanced and shared AR experiences, and for ambient and ad hoc displays.
For enhanced AR, the combined projector and AR displays can expand the user's field of view.
Here you see a large city model that the user explores through the projector.
The projector can also provide peripheral notifications or a GUI.
For example, an arrow can point to an off-screen 3D model in the scene,
or a GUI can be shown outside the field of view of the HoloLens.
The projector can also be used to render a new perspective on a 3D model. Here you see an
orthographic projection of an airplane engine onto a table, producing a CAD-like overview.
Or by varying the intensity of the projection, it can indicate the height of the model to observers,
creating an inverted shadow.
The projector could even be used as a flash
to augment the lighting during a photo.
To expand the AR user's ability to work,
the projector can render a GUI onto
a table, keeping them focused on the 3D model.
If the AR user is prototyping a physical product, like a cereal box, they can use physical props to aid in the process.
For shared use cases with an external user,
the projector can be used for a live presentation,
where slides are surface-mapped onto a nearby wall,
allowing the AR user to view the slide
notes in private.
Third-person views of the virtual
scene can be captured as well, giving external
observers a window into the virtual world
the AR user occupies.
Or, 3D content can be corrected for an external user through view-dependent rendering.
Here you see a user examining a 3D model the AR user is working on.
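View-dependent rendering of this kind reduces to a simple geometric question: where on the projection surface must a point be drawn so that a viewer at a known eye position sees it at the intended virtual 3D location? A minimal NumPy sketch, assuming a planar surface and illustrative names (this is not the toolkit's actual rendering code):

```python
import numpy as np

def view_dependent_point(eye, p_virtual, plane_point, plane_normal):
    """Intersect the ray from the viewer's eye through the virtual point
    with the projection surface's plane. The returned point is where the
    projector must draw the content so the viewer perceives it at
    `p_virtual`. Returns None if the ray is parallel to the plane."""
    d = p_virtual - eye
    n = plane_normal / np.linalg.norm(plane_normal)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None  # ray never reaches the surface
    t = ((plane_point - eye) @ n) / denom
    return eye + t * d
```

For instance, with the viewer at the origin, a virtual point one meter ahead, and a wall two meters ahead, the content is drawn where the eye-to-point ray pierces the wall. Applying this per vertex (or per pixel, via a projective texture) yields the corrected rendering for the external user.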
There are also several use cases where exact content fidelity does not matter for a task,
and the projector can instead be used as an
ambient display,
like a spotlight to direct external users.
Or even using it as a dynamic light source.
Here you see a user listening to some music with an ambient light display.
Or, the mobility of the device can be utilized to create an ad hoc SAR environment.
Here you see a user controlling the HMD to project a video onto a wall.
To enable all these AAR experiences, the projector and pan-tilt geometry are calibrated using
a modified structure-from-motion pipeline.
Projected Gray and sinusoidal codes are used to construct dense putative correspondences
between the HoloLens view and multiple projector poses.
These are used to optimize over a Denavit-Hartenberg parameterization of the pan-tilt axes
and the geometric structure of the projector mount.
The result is a calibrated virtual representation of the actuated projector relative to the HoloLens,
enabling real-world control of the projected image in and around the HoloLens field of view.
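The Denavit-Hartenberg parameterization can be sketched as a forward-kinematic chain: each joint contributes one standard link transform, and calibration optimizes the link parameters so the predicted projector pose agrees with the structured-light correspondences. A minimal NumPy sketch with an assumed parameter layout (not the paper's actual implementation):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform as a 4x4 homogeneous matrix."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ ct, -st * ca,  st * sa, a * ct],
        [ st,  ct * ca, -ct * sa, a * st],
        [0.0,       sa,       ca,      d],
        [0.0,      0.0,      0.0,    1.0],
    ])

def projector_pose(pan, tilt, dh_params):
    """Chain the pan and tilt joints to get the projector pose in the
    HMD frame. `dh_params` holds one (d, a, alpha) triple per joint;
    these are the link constants that calibration would recover by
    minimizing reprojection error over the dense correspondences."""
    T = np.eye(4)
    for joint_angle, (d, a, alpha) in zip((pan, tilt), dh_params):
        T = T @ dh_transform(joint_angle, d, a, alpha)
    return T
```

Given calibrated link constants, evaluating `projector_pose` at the commanded servo angles predicts exactly where the projected image lands, which is what enables the closed-loop placement described above.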
To validate our toolkit and our concept of Augmented Augmented Reality,
we conducted a study with professional XR developers.
All developers were positive about AAR, and
even developed some interesting applications
that you can see here.
For additional details, see the accompanying
paper.
