Hello, my name is Xiyao Wang.
I am currently a PhD student at Inria, France, and Université Paris-Saclay.
It is a pleasure for me to share our joint work with
Lonni Besançon, David Rousseau, Mickaël Sereno, Mehdi Ammi,
and Tobias Isenberg,
"Towards an Understanding of Augmented
Reality Extensions for Existing 3D Data Analysis Tools."
The main motivation of this project is that
previous studies have shown that immersive visualization
with stereoscopic views helps people understand 3D datasets,
and many immersive systems have been created over the past years,
ranging from large environments, like CAVE systems,
to smaller ones like head-mounted VR displays or 3D glasses.
However, even though many studies have demonstrated
the benefits of visual immersion
(for example, controlled experiments have shown that
users can understand biological structures more easily
in immersive systems than on a normal screen),
few of these systems have been practically integrated
into scientists' daily work.
We thus want to investigate a practical way to bring
immersive visualization to their daily workflows:
how to choose the appropriate devices,
and how to design the interaction techniques.
For this project,
we are working with domain experts in particle physics.
This is an example of a particle collision event.
In general, one event is very large
and contains more than ten thousand particle trajectories.
With traditional visualization on a screen,
these traces overlap each other and lose their spatial direction,
so it is hard to understand how the particles travel in space.
Physicists are thus interested in using immersive
environments to explore their data.
According to our discussions with physicists,
they do not rely only on visualization to explore the data.
A common approach is to first use
statistical tools to analyze the data
and find their regions of interest, and then,
with visualization software,
try to understand the event.
Very often, they need to constantly switch between
the analysis software and the visualization tools
to accomplish their data exploration tasks.
There are several important elements in this process.
First, traditional analysis tools rely heavily
on script typing and precise parameter adjustment,
for which mouse and keyboard are still required.
Second, 2D plots such as histograms, as well as text output, remain important.
Third, they need efficient 3D visualization.
Based on these considerations, we investigate an
AR extension to existing data analysis tools.
In this project, we use the Microsoft HoloLens,
and we target the scenario where users sit in their office
and work on the data.
The general design comes from the scenario where experts use two screens:
they use the mouse to interact with both screens,
and the cursor can move from one to the other.
We then replace one of the screens with the HoloLens.
In this study, we keep the same interaction on the PC and on the HoloLens:
we still use the mouse to control both spaces,
and the cursor can move from one to the other.
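As an aside, this cursor handoff can be thought of as the same edge-crossing rule a dual-monitor setup uses: when the cursor crosses the edge of the PC screen, ownership transfers to the AR space. The sketch below is a hypothetical simplification of that idea, not our actual implementation; all names and the screen width are illustrative assumptions.

```python
# Hypothetical sketch of cursor handoff between the PC screen and the
# AR space (illustrative only; not the actual implementation).

PC_WIDTH = 1920  # assumed PC screen width in pixels

def route_cursor(x, owner):
    """Return (owner, local_x) after moving the cursor to position x.

    `owner` is "pc" or "ar"; crossing the right edge of the PC screen
    transfers the cursor into the AR space, and crossing back returns it.
    """
    if owner == "pc" and x > PC_WIDTH:
        return "ar", x - PC_WIDTH      # entered the AR space
    if owner == "ar" and x < 0:
        return "pc", PC_WIDTH + x      # returned to the PC screen
    return owner, x

# Example: moving past the right edge of the PC screen enters the AR space.
print(route_cursor(2000, "pc"))   # -> ("ar", 80)
```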
Although previous work has illustrated different
techniques for bringing immersion to existing workflows,
with different design choices,
we focus on different aspects.
For example, we want users to be able to see both views at the same time
rather than forcing them to use only one.
We also want to provide a unified interaction experience
to avoid frequent changes of input devices.
We detail our choices and the differences in our paper.
We show part of our implementation here.
We use similar user interfaces on both sides:
the left side shows the interface on the PC, and the right side the interface on the HoloLens.
Users can perform any function on either side;
everything can be done with the PC or with the HoloLens.
Based on our discussions with physicists,
we implemented some basic functions for exploring their datasets.
We only show a very short demonstration here.
This is an example where we highlight or filter particles using a histogram.
In this study, the two spaces can be synchronized,
but they can also be treated separately
and synchronized on request.
As mentioned, both sides have similar interfaces,
and any function can be performed on the PC or on the HoloLens with the mouse cursor.
This is another example, where users select specific particles using a lasso tool in the AR space.
This function can also be performed on the PC, as mentioned above.
The results can be synchronized directly,
or kept on one side and synchronized on request.
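This linked-versus-on-request behavior can be described as a small state model: each space holds its own selection, and changes either mirror automatically or are pushed only when the user asks. The following is a hypothetical sketch of that idea; the class and method names are invented for illustration and do not come from our implementation.

```python
# Hypothetical sketch of the "synchronized vs. on-request" linking of the
# PC and AR views (names invented for illustration).

class LinkedViews:
    def __init__(self, linked=True):
        self.linked = linked                       # mirror changes automatically?
        self.state = {"pc": set(), "ar": set()}    # selected particle IDs per side

    def select(self, side, particle_ids):
        """Apply a selection on one side; mirror it if the views are linked."""
        self.state[side] = set(particle_ids)
        if self.linked:
            other = "ar" if side == "pc" else "pc"
            self.state[other] = set(particle_ids)

    def sync(self, source):
        """Explicitly push one side's state to the other (sync on request)."""
        other = "ar" if source == "pc" else "pc"
        self.state[other] = set(self.state[source])

views = LinkedViews(linked=False)
views.select("ar", [3, 7, 9])   # lasso selection stays in the AR space
views.sync("ar")                # later, push it to the PC on request
```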
Then, to understand how experts would use such a linked-view design,
to investigate whether desktop interaction can be well supported in AR,
and to gather any other comments for future designs,
we conducted an observational study with seven physicists from CERN.
The study was preregistered to follow best practices in experiment design,
and the preregistration and data are available online, following open science principles.
The experiment consisted of three main parts.
First, a tutorial and explanation, in which
the experimenter introduced the system and its supported functions.
Then a free exploration phase:
we asked participants to freely explore a simulated dataset with all available functions.
During this process, the experimenter stayed with
the participant, took notes, and helped them when needed.
We also encouraged the participants to ask questions and to think aloud.
Finally, we conducted a semi-guided interview to collect their comments.
During this last part,
we asked participants many open questions and several Likert-scale ratings
to get their general feedback and to envision a future design.
We report and discuss the complete results in our paper;
here we only briefly mention some insights.
For example, all participants liked the hybrid setting,
and 5 out of 7 preferred a balanced interface.
In general, the experts had no difficulty using such a hybrid setting;
they agreed that it is feasible and useful.
For example,
experts are familiar with the PC and can easily understand
their results from traditional analysis tools.
 
Both sides have advantages that can benefit data exploration.
The PC facilitates precise input using mouse and keyboard,
while the AR part largely favors spatial understanding:
the additional depth cue makes the 3D trajectories very evident,
and the spatial arrangements between different elements
are also much clearer than on a normal screen.
In addition to the familiar view on the PC,
the large canvas of the AR space facilitates the multi-view analysis
that is often involved in their workflow.
It would then be useful to be able to flexibly adjust the views in space.
However, beyond the technical issues of synchronizing the mouse movement between the PC and the HoloLens,
how to use a mouse in 3D space still needs further investigation.
Apart from that, being able to walk around is seen as a great feature of AR,
compared to VR systems that are tethered to a PC.
One participant particularly mentioned that he noticed the advantages
of stereoscopy once he stood up.
This is also seen as a main difference between AR and a normal screen.
It would then be a good idea to let users define the link between the two spaces,
depending on whether they want to explore the data or to compare different states.
In conclusion, we presented a study investigating how particle physicists
want to work with a hybrid setting that extends their current analysis tools
with immersive visualization.
In general, they agree that such a setting is feasible and
can greatly help data understanding.
We gathered their comments and discussed some guidelines for possible future improvements.
While we focused on particle physics,
our results are not limited to this area:
other settings and other domains dealing with 3D data can also benefit from them.
There are many features worth exploring further in the future.
In this project, we mainly plan to continue working on two things,
and if you are interested, please do not hesitate to contact us.
The first is to investigate interaction that is both intuitive and precise:
how to appropriately extend the mouse into AR,
or how to seamlessly combine other input devices into the hybrid setting.
The second is to apply this setting to more realistic scientific scenarios.
In one current project, we want to use our setting to compare different algorithms for reconstructing particle trajectories from measured data.
This is the end of my talk.
Thank you for listening.
