- So traditionally 3D art for games
has been created manually.
Artists have been using
specialised 3D modelling software
to create everything from scratch.
These programmes allow you to model,
sculpt and paint props and environments
that you can then import
into a game engine,
such as Unity.
But creating realistic-looking
models and textures
is actually a really complicated process,
and it requires a
combination of artistic skill
and technical know-how
that is pretty hard to come by.
Plus, making something that looks real
is just incredibly time
consuming, because, as it turns out,
real life is pretty detailed
and adding all this detail by hand...
That's a challenge.
So over the last few years,
a new trend has been taking
over the game dev industry.
I'm of course talking about 3D scanning.
This is a process where you
go out into the real world
with some specialised camera equipment
and record geometry and material data
that you can then take back
into your virtual world.
And at the forefront of this technology
has been a company called Quixel.
They've built a huge library
of assets and materials
by going out and painstakingly
making thousands of scans
around the world.
But so far, this process
has been pretty complicated
and it requires a lot
of special equipment.
So not much luck if you
want to do it yourself,
but recently Ubiquity6 reached out to us
about a service they've been
working on called Display.land
and how they're trying to
make the power of 3D scanning
available to everyone.
Now Display.land is a free
app that you download,
and it allows you to create 3D scans,
just using the built in
camera of your phone.
So just to be 100% transparent,
this video is sponsored by Display.land,
but I will say that we
were completely blown away
with how well this
technology actually works
and even crazier just
how easy it is to use.
So for the first scan,
we tried creating a scan of our office
and Dris walked around the room
while filming everything he saw.
Then once he felt like
he'd captured everything,
in a good amount of detail,
we hit upload and the scan went
to the cloud for processing.
And after a little while, it
was ready to view on the phone.
In fact, Display.land
is a social app as well.
And from here, it allows
you to share scans
as well as follow other creators.
But for this test, we just
sent it to the computer
and opened it up in a browser.
And getting it into
Unity was just as easy.
We simply exported it as
an OBJ and dragged it in.
I created a material, assigned
a texture and there it was,
of course the scan itself
does have the lighting
of our office built into the texture.
But I found that even so,
it matched up pretty well
with the lighting conditions
of our scene.
So I added a few point lights
to where there were lights
in the office, as well as
dragged in a few example assets.
And it's pretty amazing
how well they fit the scene
out of the box.
They instantly looked like
they were part of the room.
And this is when I realised if
the scan has actual geometry
and it actually looks pretty clean,
we can probably have physics
objects interact with it.
And sure enough,
after adding a Mesh Collider
and a sphere with a rigid body,
we had physics.
In fact, I got so excited,
I created a quick script
that would spawn spheres over time,
which ended up looking pretty cool.
It definitely answered the question
of how our office would look
if it got flooded by red balls.
In a weird way, it's kind of beautiful.
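The "spawn spheres over time" script described above might look roughly like this (a minimal sketch, not the original script; the sphere prefab and the parameter values are assumptions, and the scanned mesh is assumed to already have its MeshCollider attached):

```csharp
using UnityEngine;

// Spawns a rigidbody sphere at a fixed interval, roughly like the
// "office flooded by red balls" experiment described above.
public class SphereSpawner : MonoBehaviour
{
    public GameObject spherePrefab;   // a sphere with a Rigidbody (assumption)
    public float interval = 0.1f;     // seconds between spawns
    public float spawnHeight = 3f;    // drop height above the spawner
    public float spread = 1.5f;       // horizontal randomisation

    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= interval)
        {
            timer = 0f;
            Vector3 offset = new Vector3(
                Random.Range(-spread, spread),
                spawnHeight,
                Random.Range(-spread, spread));
            Instantiate(spherePrefab,
                transform.position + offset, Quaternion.identity);
        }
    }
}
```

Attached to an empty GameObject above the scan, this rains physics spheres onto the scanned geometry via its Mesh Collider.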
So with this first test under our belt,
we were ready to do some more scanning.
But how does this process actually work?
Well, Display.land essentially
creates 3D captures
using a process called photogrammetry.
It uses many images of the same geometry,
but taken from different perspectives
to create a 3D
reconstruction of the scene.
It's the same reason why
animals have two eyes.
You need two different perspectives
in order to determine depth,
which is also why if you close one eye,
it gets pretty hard to tell
how far away things are.
So after taking a video from
some different perspectives
of the thing you want to scan,
the video, as well as some sensor data,
gets sent to the cloud,
where it gets turned into a 3D model.
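As a back-of-envelope illustration of why two viewpoints give you depth: for an idealised pair of side-by-side pinhole cameras, depth is inversely proportional to disparity, the pixel shift of the same point between the two images. All the numbers below are made up purely for illustration:

```csharp
using System;

class StereoDepthDemo
{
    // Idealised pinhole stereo: depth = focal * baseline / disparity.
    // The further away a point is, the less it shifts between views.
    static float Depth(float focalPixels, float baselineMetres,
                       float disparityPixels)
        => focalPixels * baselineMetres / disparityPixels;

    static void Main()
    {
        // Made-up numbers: 1000 px focal length, cameras 10 cm apart,
        // a feature that shifts 25 px between the two images.
        Console.WriteLine(Depth(1000f, 0.1f, 25f)); // 4 metres away
    }
}
```

Photogrammetry generalises this idea to many views and many thousands of matched features, which is why videos with lots of varied perspectives reconstruct better.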
Some scans take just a
few minutes to process
while larger scans can
take up to a few hours.
And we actually slowly
started to figure out
what makes a good scan.
In fact, it's something
you really get better at
with practice.
Generally, you want to shoot
in fairly even lighting conditions
and move the camera at
a slow and steady pace.
It's also a good idea to scan
from as many different angles
as possible and to avoid
any reflective surfaces.
And scanning objects with lots of texture
definitely helps the tracker a lot.
For example, take a look at these props.
We scanned these outside
on a fairly cloudy day
and all of them have a nicely textured,
but non-reflective surface.
Of course, sometimes the sun
peeks through the clouds,
but the tracker actually managed
to handle that quite well.
In fact, we thought
they looked so realistic
that we might be able to use them
together with regular 3D models.
In other words, could we take these scans
and put them into an
otherwise virtual environment
without them standing out?
So to answer that question,
I went to the Unity Asset Store
and found this free
industrial environment.
I imported everything into Unity,
played around with the
lighting and materials a bit
and added a first person controller.
And voila, we had a tiny game
where we could move around
and interact with the environment.
So far so good, now for the scans.
And this was actually really easy.
I downloaded each scan and
opened it up in Blender
to cut off any unnecessary geometry.
I then re-exported, dragged the
model and texture into Unity
and assigned a simple material.
And right away, it looked amazing.
I'd expected the scans to
stand out like a sore thumb,
but as soon as Unity did
its rendering magic,
the concrete table blended in perfectly.
Of course, the scans aren't optimised
to use as simple geometry as possible.
There's inevitably going to
be a lot of extra vertices
on these models.
So if you're looking to use
them in a commercial game,
I would recommend doing
some work in order to try
and reduce the number of polygons.
But the cool thing is that
you can utilise the fact
that you have a very detailed
version of the model,
by simply baking all this
detail into a normal map.
So next up I tried importing these statues
and they were even more impressive.
In fact, you could actually read
some of the writing on them.
So I created a tiny gallery
for the player to explore.
Finally I thought the tree
stump was just too good a scan
not to include,
but I didn't really have
anywhere to put it in the level,
since it didn't have any greenery.
So I turned the player into a tree,
which ended up looking more
disturbing than expected,
but of course the real
test was still to come.
Could we make a game with a level
that was entirely 3D scanned?
So we went out into our courtyard to see
if we could scan a larger environment.
And while we had good
success on the buildings,
it wasn't really a fair
fight for the scanner,
considering the amount
of clutter and bicycles
reflecting light everywhere.
Plus it was raining that day,
which of course makes everything shiny.
And while just scanning
some buildings is cool,
we thought, wouldn't it be even cooler
if we took some commonplace items,
lying around the office
and put them together
to make a level in real life,
that we could then load
up into Unity and play on.
So that's exactly what we did.
We immediately got started
setting up a nice playing surface
and placing objects that
our player could walk on.
And after a quick scan of the table,
we hopped back on the computer.
I cut off some geometry,
loaded it into Unity,
created a ground cube for it to stand on
and the level was ready.
Of course, we still had
to build the actual game.
Our idea was to create a Top-Down Shooter,
where the player has to fight
through a series of enemies.
So I set up a quick prototype scene
and immediately got to work on
creating the basic mechanics.
I started with movement
and making the player aim
towards the mouse.
I then made him shoot, added
some enemies to shoot at,
as well as some particles
for bullets and explosions.
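The mouse-aiming step is the only slightly fiddly part of that setup. A common approach for a top-down shooter, and a sketch of how it might have been done here (not the original script; all names and values are assumptions), is to raycast from the camera through the mouse position onto a ground plane and face the hit point:

```csharp
using UnityEngine;

// Top-down movement plus aiming towards the mouse cursor.
public class PlayerController : MonoBehaviour
{
    public float moveSpeed = 5f;

    void Update()
    {
        // WASD / arrow-key movement on the ground plane.
        Vector3 move = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                   Input.GetAxis("Vertical"));
        transform.position += move * moveSpeed * Time.deltaTime;

        // Aim: project the mouse onto a horizontal plane at player
        // height and rotate to face the resulting point.
        Plane ground = new Plane(Vector3.up, transform.position);
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (ground.Raycast(ray, out float enter))
        {
            Vector3 target = ray.GetPoint(enter);
            target.y = transform.position.y; // avoid tilting the player
            transform.LookAt(target);
        }
    }
}
```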
And at this point I was ready
to put everything inside of our level.
So I imported the player,
added a rotating camera
and placed around a few enemies.
And baking the NavMesh for
this was extremely easy.
I just played around with the
slope and step height a bit,
and the enemies were moving
appropriately around the level.
Of course, they currently did
nothing to damage the player.
So I made them shoot and stop
when they got close enough.
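The "stop and shoot when close enough" behaviour maps almost directly onto a NavMeshAgent's stopping distance. A sketch of one way to wire it up (the Shoot method is a hypothetical stand-in for whatever projectile and particle logic the game uses):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Chase the player; the agent stops on its own once it is within
// stoppingDistance, at which point we fire at a fixed rate.
public class EnemyAI : MonoBehaviour
{
    public Transform player;
    public float attackRange = 6f;
    public float fireInterval = 1f;

    NavMeshAgent agent;
    float cooldown;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        agent.stoppingDistance = attackRange;
    }

    void Update()
    {
        agent.SetDestination(player.position);

        cooldown -= Time.deltaTime;
        if (Vector3.Distance(transform.position, player.position)
                <= attackRange && cooldown <= 0f)
        {
            cooldown = fireInterval;
            Shoot();
        }
    }

    void Shoot()
    {
        // Hypothetical: spawn a bullet prefab, trigger particles, etc.
    }
}
```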
I also implemented respawning
in case the player died
and made the enemies spawn
randomly throughout the level
at a slowly increasing rate.
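An enemy spawner whose rate slowly ramps up can be as simple as shrinking the interval after every spawn. A possible sketch (the spawn points array, enemy prefab and tuning values are all assumptions):

```csharp
using UnityEngine;

// Spawns enemies at random points around the level, with the delay
// between spawns slowly shrinking so the difficulty ramps up.
public class EnemySpawner : MonoBehaviour
{
    public GameObject enemyPrefab;
    public Transform[] spawnPoints;   // scattered around the level
    public float startInterval = 4f;
    public float minInterval = 1f;
    public float rampPerSpawn = 0.1f; // how much faster each spawn gets

    float interval;
    float timer;

    void Start() => interval = startInterval;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= interval)
        {
            timer = 0f;
            Transform point =
                spawnPoints[Random.Range(0, spawnPoints.Length)];
            Instantiate(enemyPrefab, point.position, point.rotation);
            interval = Mathf.Max(minInterval, interval - rampPerSpawn);
        }
    }
}
```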
Finally, I made some
boundaries around the level
so that the player wouldn't fall off.
And with that, we had a game.
It's nothing fancy, but it
actually plays pretty well
considering how quickly
we put it together.
And everyone at the office
had a blast playing it.
So in conclusion,
while we can never get
the highly optimised
AAA quality scans that you can
with specialised equipment,
Display.land actually makes it possible
to start bringing the
real world into your game.
Personally, I just had
a blast playing around
with this technology.
If you want to check out
Display.land for yourself
and make some cool looking scans,
it's completely free and
available on the App Store,
as well as Google Play.
We'll have links to
both in the description.
And that's pretty much it for this video.
If you enjoyed it, make sure to subscribe
and ring that notification bell,
so you don't miss the next one.
And I just want to say a
huge thanks to everyone
who participated in the Brackeys Game Jam.
It was a blast to see so
many people joining the event
and we'll of course have a video,
showcasing some of the games, out soon.
On that, thanks for watching.
And I will see you in the next video.
- [Announcer] Thanks for the
awesome Patreon supporters
who donated in February
and a special thanks to faisal marafie,
Lost to Violence,
Loved Forever,
Leo Lesetre,
Nubby Ninja,
Danjiel Dusanic,
Dante_Sam,
Jacob Sanford,
Marc-Antoine Girard,
Naoki Iwasaki,
Gregory Pierce,
Michail Korobov,
TheMightyZeus,
Owen Cooper,
Alison the Fierce,
Yigit Kaya and Erasmus.
You guys rock.
