Technology today is advancing rapidly across the globe,
providing endless promising
possibilities to support people's lives.
be it texting someone...
making a video call...
or taking a picture...
Imagine your phone could do something
more, something that bridges
the digital domain and the real-world space.
Augmented Reality is a technology that superimposes
the domains of human-real world interaction
and computer-real world interaction,
eliminating the need to switch focus between the two.
At the same time, it brings imagination to reality.
To put it in simple terms, it offers a mixture of indirect and direct views of the real-world
environment,
simply through the display of our phone.
The rapid growth in computational power,
the ubiquity of consumer mobile devices,
and the huge constraint imposed
by the increasing number of students
on the availability and storage capacity
of lab equipment
have led to a growing interest in using AR in education.
We have collaborated with a number of stakeholders,
including the mechanical engineering faculty,
to develop a number of augmented reality
applications
and to evaluate the efficacy of AR in enhancing the students' learning experience.
In this video we will first present the significant contributions of our project,
followed by the design implementation, covering how augmentation works with Vuforia
and content creation with Unity 3D.
After that, we will present the projects
we have completed throughout the
year:
Geometric Dimensioning and Tolerancing (GD&T),
colorimetric titration, and oxygen generation with bleach and hydrogen peroxide.
Lastly, we will close the presentation with the
conclusion and future work.
As for the contributions of this project,
we have developed prototype AR applications for
engineering education,
in collaboration with a number of stakeholders.
At the same time, we have evaluated the efficacy of AR in teaching,
and laid the groundwork for further development in the future.
In addition to that, we have also
submitted and published journal papers
based on the AR applications that we have developed.
Augmentation with Vuforia.
Vuforia is a Software Development Kit (SDK) that enables the creation of AR applications.
It uses computer vision technology to
track and recognize planar images,
3D objects, text, and more.
In our project, a 2D image marker and a 3D object target are used.
For 2D image target tracking in Vuforia,
the image just has to be
uploaded to the Vuforia developer portal,
where the feature
points on the marker are extracted
and then exported to the library for the
tracking mechanism.
As for 3D object target tracking,
the object needs to be placed on the specific target marker,
where the feature points of the object
are extracted with the Vuforia Object Scanner
installed on the mobile device.
The object should be scanned under moderately bright, diffused lighting.
To the extent possible, the surface of the object should be evenly lit,
free of shadows cast by other objects or people.
As for the content creation of the AR applications,
we mainly used Unity 3D and Blender, which
are freely available on the market.
Unity 3D is a cross-platform game engine that provides the functions and tools to
create, edit, and integrate information
onto the target markers in the scene.
With Unity 3D, we can overlay different
elements such as objects, videos, and tags.
Fundamentally, Unity is used to assign
properties to the augmented objects,
along with the user interface that creates
the interaction between the user and the
virtual objects. On the other hand,
Blender is a 3D renderer, which was used
to render the more complex models; it
also provides features such as
animation, compositing, and texturing of objects.
Moving on to the first part of the
project, GD&T. The requirement was to
develop an augmented reality application
in which the augmented part can be
interfaced virtually with a physical
base. The interfaced assembly can then be
analyzed through better visualization.
The developed application should also
include UI functions for user
interaction, and should be
designed around the learning outcomes in order to enhance the students' learning experience
in the GD&T context. As
for the design implementation of GD&T,
the application involves two
separate parts to be interfaced together:
the augmented clamp to be
generated, and the 3D-printed base to be
interfaced with. There are three user-input
tolerance settings that can be
selected by the students: the connection
hole radius, the pivot hole radius, and the
distance between the two.
Mechanical drawings of the parts are
provided to the students for reference.
Students are expected to gain a better
understanding of what tolerance is after
using the application. They are also
required to analyze the effects of
tolerance on the end functional
operation of the assembly, all through
better visualization and learning with AR.
Moving on, we will discuss
the overall implementation of GD&T,
which can be summarized in the following flowchart. The overall implementation of
GD&T can be separated into three
individual stages. In stage one,
students analyze the assembly
drawings of the 3D-printed base
and the augmented clamp to be generated.
In the next stage, the students are
required to read and understand the
instructions provided in the tutorial,
and then input the tolerance settings
for the three different
dimensions. The augmented clamp can then
be generated and dragged towards
the 3D base to be interfaced.
In the subsequent stage, stage three,
the students are required to read and
understand the instructions on how to
interact with and manipulate the augmented
clamp. Students are then required to
examine the clamping tool through
observation, with the help of the UI
functions, to check whether the end
operational function of the assembly has
been met. Feedback, in the form of a
message or device vibration, is
output to the students according
to the tolerance settings they
set previously. Students can then
redo the exercise with a different
set of tolerance settings, and
further analyze how different sets
of tolerances affect the end
operational function of a product.
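The stage-three pass/fail feedback can be illustrated with a minimal sketch. The nominal dimensions, tolerance limits, and the simple "every deviation within its limit" fit rule below are illustrative assumptions, not the values or logic of the actual application.

```python
# Hypothetical sketch of the stage-three tolerance check.
# NOMINAL dimensions and LIMITS are assumed values for illustration.
LIMITS = {
    "connection_hole_r": 0.5,   # max allowed |deviation|, mm
    "pivot_hole_r": 0.5,        # max allowed |deviation|, mm
    "hole_distance": 1.0,       # max allowed |deviation|, mm
}

def assembly_fits(tolerances):
    """Return (fits, feedback): the clamp assembles with the base only
    if every user-input tolerance stays within its allowed limit."""
    for dim, tol in tolerances.items():
        if abs(tol) > LIMITS[dim]:
            return False, f"Assembly failed: {dim} deviates by {tol:+.2f} mm"
    return True, "Assembly fits: end operational function met"

fits, msg = assembly_fits(
    {"connection_hole_r": 0.2, "pivot_hole_r": 0.1, "hole_distance": 0.5}
)
print(fits, msg)  # True Assembly fits: end operational function met
```

A different set of inputs, say a hole distance deviating by 2.0 mm, would return the failure message instead, mirroring the redo-and-compare exercise described above.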
As for the colorimetric titration experiment, it involves
the chemical reaction between an acid and a
base to form salt and water.
The setup for the experiment is shown on the screen.
A burette is filled with a basic solution, while
the beaker is filled with an
acidic solution and a few drops of universal indicator.
The function of a universal indicator is to indicate the
pH value of the mixture through its colour.
The universal indicator colour chart
is shown on the screen;
it gives the colour of the indicator at
pH values from 1 to 14. At first,
the mixture appears red. Upon adding
the basic solution into the mixture,
the pH of the mixture increases,
changing the colour of the
universal indicator. The experimenter
is then required to record the amount of
basic solution added to neutralize the
acid. In the following section, I will
explain the requirements for this
colorimetric titration application.
Firstly, the application must be able to
augment the liquid in the beaker and the
burette based on the image marker.
Next, a simple user interface is required
to allow the user to add the basic
solution into the mixture.
Results, such as the amount of basic solution
added into the mixture
and the colour change of the mixture, should be
shown to the user. In addition to that,
the application also has to visually show the
interaction between the two superimposed liquid bodies.
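The colour change described above can be sketched as a simple strong acid/strong base neutralization model: track the leftover moles of acid or base, compute the pH of the mixture, and map it to a coarse indicator colour band. The concentrations, volumes, and colour bands below are illustrative assumptions, not the app's actual data.

```python
import math

# Illustrative titration model: 0.1 M acid (25 mL) titrated with
# 0.1 M base. All values are assumptions for demonstration.
ACID_M, BASE_M = 0.1, 0.1   # mol/L
ACID_VOL_L = 0.025          # 25 mL of acid in the beaker

def mixture_ph(base_vol_l):
    """pH after adding base_vol_l litres of base to the beaker."""
    h = ACID_M * ACID_VOL_L - BASE_M * base_vol_l   # leftover mol H+
    total = ACID_VOL_L + base_vol_l
    if h > 0:
        return -math.log10(h / total)               # still acidic
    if h < 0:
        return 14 + math.log10(-h / total)          # excess OH-
    return 7.0                                      # equivalence point

def indicator_colour(ph):
    """Map pH to a coarse universal-indicator colour band (assumed bands)."""
    bands = [(3, "red"), (6, "orange"), (8, "green"), (11, "blue")]
    for upper, colour in bands:
        if ph < upper:
            return colour
    return "purple"

print(indicator_colour(mixture_ph(0.0)))     # red (acidic start)
print(indicator_colour(mixture_ph(0.025)))   # green (neutralized)
```

Each UI "add base" action would simply increase `base_vol_l` and re-render the augmented liquid with the returned colour.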
When the application first starts,
it prompts with the data of the
chemical substances. Additional
features, such as selecting different
concentrations, adding the base by continuous flow
or by droplets, and changing the mixture
colour, are integrated into
the application. To simulate the actual
environment of conducting the experiment,
when the beaker is not placed directly
under the burette, there will be a
chemical spillage, after which the user will
not be able to continue adding the chemical.
As the prototype for the colorimetric
experiment was completed at the end of
semester 1, we decided to continue by
developing another chemical experiment
AR application: oxygen
generation by the chemical reaction
between hydrogen peroxide and bleach. In this experiment, the setup arrangement is as shown.
The motivation for choosing this topic is
that it can simulate a
controlled environment for the students to
experience conducting an experiment
involving hazardous chemical substances,
such as bleach. The student is required
to add hydrogen peroxide to the bleach
and then measure the amount of
oxygen gas produced.
However, when one of the
reactants is used up, it becomes
the limiting reactant that stops the generation
of oxygen gas.
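The limiting-reactant behaviour can be sketched with the 1:1 stoichiometry of the underlying reaction, NaOCl + H2O2 -> NaCl + H2O + O2. The starting amounts below are illustrative assumptions, not values from the application.

```python
# Sketch of the limiting-reactant calculation for
# NaOCl + H2O2 -> NaCl + H2O + O2 (1:1:1 molar ratio).
# Starting amounts are illustrative assumptions.
def oxygen_produced(mol_bleach, mol_peroxide):
    """Moles of O2 generated; the scarcer reactant limits the reaction."""
    return min(mol_bleach, mol_peroxide)

def remaining(mol_bleach, mol_peroxide):
    """Leftover reactants after the reaction stops."""
    reacted = oxygen_produced(mol_bleach, mol_peroxide)
    return mol_bleach - reacted, mol_peroxide - reacted

print(oxygen_produced(1.0, 2.5))  # bleach limits: 1.0 mol O2
print(remaining(1.0, 2.5))        # (0.0, 1.5) mol left over
```

Once the smaller amount is consumed, `oxygen_produced` stops growing no matter how much more of the other reactant the student adds, which is exactly the plateau the student is meant to observe.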
"Hydrogen Peroxide and Bleach Application"
"Student Testing on Colorimetric Experiment"
"Student Testing on GD&T"
Based on the feedback from the survey we collected, a majority of students
think that AR can help them
improve their learning experience. As for
future work on the project,
more interactive tools can be
incorporated into the augmented reality
applications to heighten the students'
learning experience. This
concludes our video presentation as well
as our final year project. We would like to
once again thank our supervisors
for giving us this opportunity to
work on this project. They are Jonathan,
Tuck, Andy, and Veronica. Once again, thank you. Thank you.
"Our Final Year Project Poster"
