Okay.
Shall we start? Yeah, let's go. Okay,
so welcome everyone to this webinar from
KDAB about KUESA 3D Studio and what's
new in version 1.2.
Welcome also to Bernie Kim, who runs TQCS,
which
represents KDAB in the Asia Pacific.
There'll be some time at the end for
your questions and Bernie can help with
any Korean
translations, if needed. If you have any questions, please add them at any time in the Q&A box you'll see at the foot of your screen.
For those of you who don't know KDAB, we
do consulting, training,
workshops and much more. We know all
about Qt, C++,
and 3D...and particularly
relevant for this workshop
Qt 3D, which KDAB created and
maintains.
KUESA 3D Studio, which sits on top of
Qt 3D,
was developed by KDAB as a solution to workflow pain points we came across again and again with our customers in automotive, in medical, and in industrial, wherever
designers and developers are working
together
at the front line of product development
using 3D.
It's a solution that removes the pain
without taking away agency from
designers
or developers. Timo Buske
and Paul Lemire can tell you how it
works.
Yeah, thanks Frances, and welcome again to
the KUESA 3D release 1.2 webinar.
My name is Timo. I formerly worked in the mobile games sector, developing 3D tools and file formats for several in-house 3D engines,
and now I work for KDAB as Product
Manager of KUESA.
Today, I want to show you a bit about the
workflow
and the design side of KUESA, especially
for release 1.2. And, with me, is
Paul. Can you introduce yourself?
Yeah, sure. Thank you, Timo. Good morning, everyone! My name is Paul Lemire.
I've been involved at KDAB since the
very beginning with Qt 3D.
I'm currently one of the co-maintainers of
that project,
and I'm also dealing with technical
aspects of the KUESA development. And,
today,
while Timo will be showing you the designer's workflow, I'll be showing you the developer's workflow, and hopefully we can get something on screen.
Yeah. Thanks Paul. I'm sure we can.
So, we have an agenda today.
First, I want to go very briefly
over what actually KUESA 3D is, so we are
all on the same page,
and what's new in release 1.2.
This will take about, maybe, maximum
10 minutes or so, and then we'll go to
the main part of this presentation and
we prepared a little demo for you.
We want to show you the design part.
Paul will go over the API a bit and then
we want to show you how easy it is to
make changes to the 3D scene and
incorporate those into the application
and to give you an idea about the
overall
workflow. So this will
take --
everything all together should take --
about 45 minutes,
and then we can go over questions, if
there are any,
for about 15 minutes.
So what is KUESA 3D?
So, first of all, it's a 3D engine based
on Qt 3D,
and what that means is you can use all
features of Qt.
You can use all features of Qt 3D.
Together with the KUESA
features, you can integrate
3D into your user interface. And
the thing is that this is not just
showing a 3D viewport somewhere in your
user interface,
this is a bi-directional communication.
So, for example, I could trigger animations or parameter changes in the 3D scene from 2D, and vice versa: I can also trigger events in the 2D scene from 3D, for example, showing a button or a text from an animation in 3D.
So KUESA is glTF 2 compliant. glTF is the industry standard for open 3D formats, and it's often seen as the JPEG of 3D.
So you can use all applications
that export to glTF with KUESA,
and this is basically all applications, because Blender comes with a glTF exporter on board. And for other 3D applications, like 3ds Max or Maya, you can get a third-party tool which exports to glTF.
So you basically can use all 3D
applications.
It supports a lot of features. The glTF format itself supports describing an entire scene in one glTF file, or even multiple scenes, with all the nodes, all the meshes, all the textures, all the animations, even special features like skinning and bone animations.
Plus, it is extendable, so we added some specific features, like layer management and a special set of materials. Those special KUESA features are only available if you use the KUESA exporter, which is just a glTF exporter supporting KUESA extensions.
KUESA is not a black box. On the one hand, you have a high-level API that makes it super easy, with just a very few lines of code, to display something in 3D in your user interface; on the other hand, it's possible with KUESA to access all the items in the scene, so all the nodes or the parameters of the materials, for example. You can even change the frame graph.
So you have the scene graph on the one hand, which describes the 3D scene, and on the other hand, in Qt 3D, you have the frame graph, which describes in which order the objects are rendered. If you want to change this, if you want to apply post effects, or even if you want to change the render method, this is what you can do with the frame graph.
The paradigm of KUESA is that we have an
optimized workflow,
especially for larger teams. We
maximize the separation of
the designer's and the developer's
workflow.
We optimized the performance; for example, we came up with the Iro materials, which we will show you later in the demo. Those are highly performance-optimized materials, and this makes it much easier to develop for low-end hardware, like embedded devices, and so on.
KUESA is split into two license models. First of all, we have KUESA 3D Runtime. This is the library you need to build apps with. And then we have KUESA 3D Studio, which is actually the Runtime plus a couple of other things: DCC integration (for example, for Blender), additional tools like a glTF editor, an asset processor, a graph exporter and other smaller tools, plus the distribution rights.
So what is new in release
1.2?
We added the ability to load custom glTF materials. We can convert Blender node networks (Eevee materials) to a graph that works with KUESA, and we can automatically generate all the classes for that, so we can provide everything we need on both sides: on the developer side and on the artist's side.
Eevee is a real-time renderer, so for all those materials you will have a real-time preview in your 3D application.
But the question here is: do we want designers to create their own materials? Theoretically, it's possible; we have all the tools we need for that. But we would recommend using this for internal or developer use only, to ensure the best performance, because while it's super easy to put all kinds of materials together in a graph in Blender, the material easily gets too big, and then you lose performance. Okay. So we use that internally to create new materials, for example.
We created a material library called the Iro materials, and those are fully integrated. Fully integrated means they are not just exported to glTF; we go one step further, since it does not give you much if you can export the material from Blender or Maya but it does not look exactly the same in your target application as it does in Maya or Blender. So, since we can export the material graph, we have exact what-you-see-is-what-you-get during asset creation. The designer can examine and adjust the final visual result already during modeling.
So this is a big plus: designers can use their tools; you don't need extra tools like a scene builder in order to import and re-export to a proprietary format. And a designer doesn't need to build the application in order to see how it looks later, and does not need to ask a developer, like, "Hey, Mr. Developer, can you please build this for me so I can see how it looks on the target device?" And this is what we mean by separating the workflows. We will show you that in the demo as well.
Apart from that, those materials provide a more intuitive and artistic way of working, but of course, if those materials are not sufficient, you can still use the PBR material.
It is now easier to extend the animation
system.
We can pass custom animation properties,
for example.
We can animate colors of materials and
so on.
We improved and simplified the API, for example, for binding property changes of asset nodes in QML.
As part of Qt 3D
from Qt 5.15 upwards, we added profiling
capabilities.
Paul will show you that later in the
demonstration as well.
And we added installers so the
designers do not need to
build KUESA themselves anymore.
Okay, so let's do the fun part. I need to share my other screen. Okay. Can you see that? Paul, can you give me... okay, perfect. So I opened up a Blender scene, and I want to show you the workflow of the designer and the developer: how they work together, and how to actually save time with KUESA.
So you can see a car here with the new Iro materials already applied, and when I go through the timeline here, you can see that I already created some animations. The Iro materials are one of the newest features, and they are fully integrated into Blender now as part of KUESA 3D Studio.
So we have full what-you-see-is-what-you-get. For example, if I go here and change a color for the car paint material: usually, as I mentioned earlier, we would need to test on the device how it looks, but now we have full what-you-see-is-what-you-get directly in Blender. And since we are using the Eevee rendering engine, this is real time. So even while we are animating stuff in Blender, and even while we are modeling (so I could go to this object and grab a vertex, for example), we can investigate how it would look later in the target application, because it uses the exact correct material here.
The Iro materials are not just integrated and highly performance-optimized; we also made it much easier for artists to tweak them. So let's say we want to have a highlight here, at the top of the car. The usual thing we would need to do is either place some lights in the scene so it looks correct, or change the environment map.
And this could be a huge effort, because you need to find the right spot in the environment map and see how this looks later in Blender. You could use special tools for that, that's right, but the easiest approach would be if you could just paint the final result, and this is what the Iro materials do.
So when I had the black car here some minutes ago, I used this texture for it. The Iro materials consume this texture and adapt their look to it, and later I can change the colors and so on. But if I now want, for example, to have a highlight on the top of the car, all I would need to do is go to Photoshop or GIMP or whatever and paint a highlight here. So I prepared that, and when I now go back to Blender and open the new texture, you can see that the highlight appears here, at the same place where I painted the highlight on the sphere. This makes things super easy, because you don't need to change environment maps or do any special mapping shenanigans.
So, we had a use case where we needed to create or adapt 15 looks of car paint shaders, and we could just use the rendering from the customer, the customer's 3D setup, render out the spheres, and directly use that with the Iro materials.
Okay, let's go over the animations. I have some simple node animations in here, for example, to rotate the doors and the hood, and I also have the animations for one of the cameras, which is actually also just a node animation, so I can preview the camera animation without building the app, as I said. And we also use the animation extension for the scene, so we can animate material properties, as you can see here.
I made the motor block blink in red, so when I go back and forth you can maybe see it.
So let's pretend we have another
requirement:
we want to show a text
in 2D whenever the engine block is
visible.
So when I scrub through the timeline, you will see that this is not a trivial task, because, currently, when the hood is closed, the engine block is hidden. So I have to wait for the right point in time and the right angle: when the hood opens and we have the right angle, the motor block is visible, and then we want to show something. Then, if we go into the car, it's hidden again, and so on, and we need to mark all these points in time correctly, which could be super annoying if you do it by hand.
You would either do this in an external application that combines your 2D scene and 3D scene, like a scene builder or something, or you would have to do it in QML by hand, and both consume a lot of time. So the easiest thing, again, would be if we could just do this in Blender.
So, for this, I created this dummy object here. It is not visible later in the application. If I move it up, I indicate, like now, that the engine block is visible, and then, if I move it down, I indicate that it's going to be hidden; same if I leave the car on the other side. Now I move the cone up, and when it starts to get hidden by the hood, I move it down again. So the idea is that Paul can later bind whatever he wants in QML to the properties of this cone, and we are triggering, so to speak, stuff in 2D from the 3D timeline in Blender, so the designer can put everything together and sync everything in Blender.
So I will hand over to Paul
now so he can show you the API
things.
Yeah, thank you, Timo. So first I'll share my screen with you guys, and I'll probably try to find a way to stop my video so that I've got a bit more upload bandwidth. And we will start with me showing you a tool we provide as part of KUESA 3D Studio called the glTF editor. That tool is really useful for developers, because it allows me to load glTF files that have been exported from Blender. So, in this case, I'll be loading the file that Timo exported for me, where we've got our car. The nice thing about that tool is that it allows me to introspect the content of the glTF file and have all the names of the various parts.
I can select them and view what they are called, so that later on, when I'm working with my application, I can retrieve the node, for example, if I want to add animations and so on.
So that's one of the nice things of
that tool. The other thing is I can
select the different cameras that have
been exported
and maybe one of the most interesting
ones
is that I can gather the names of the various animations, and I can play them so I can check that everything seems to be working correctly.
One last thing that might be of interest: I can get previews of the materials and so on. But what we might care about is finding the name of that small object Timo created for us on the engine, the trigger dummy object. And I can see that it's being exported as "triggers" or "trigger motor info", one of those two, so we'll have to try later on.
But once I have checked that the file is being loaded correctly, and that it looks okay, then I can actually move on to creating a proper application. And what we will do is look at a very simple application that just loads the glTF model and tries to play back the animations it contains, and we'll do that as a 4-step process. So our first step really will be about just loading the glTF file, not playing any animation or anything, just checking that we can load the glTF file.
For that, we will simply start by creating a main function, our entry point into the application. We will set the OpenGL requirements we want to use, basically telling what we need to render the artwork on the screen, and we will create something called a Qt Quick view; that's essentially a window that can load QML content. And I'll show you briefly what QML is in a second. One last thing that's worth noting is that I'm injecting a property called assetPath into the QML scene. That assetPath contains the directory path to where my glTF files are stored. But, from the C++ part, that's really all I need to do: set the OpenGL requirements, create the Qt Quick view, give it a size, display it, and tell it to load a QML file, which in our case is called main.qml.
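A minimal sketch of what such an entry point could look like, assuming KUESA 3D Runtime and Qt Quick are available; the asset directory, OpenGL version and window size are illustrative choices, not taken from the demo sources:

```cpp
#include <QGuiApplication>
#include <QQmlContext>
#include <QQuickView>
#include <QSurfaceFormat>

int main(int argc, char *argv[])
{
    // Set the OpenGL requirements before any window is created
    QSurfaceFormat format = QSurfaceFormat::defaultFormat();
    format.setVersion(4, 1);
    format.setProfile(QSurfaceFormat::CoreProfile);
    format.setDepthBufferSize(24);
    QSurfaceFormat::setDefaultFormat(format);

    QGuiApplication app(argc, argv);

    // A window that can load QML content
    QQuickView view;

    // Inject the directory holding the glTF files into the QML scene
    view.rootContext()->setContextProperty(
        QStringLiteral("assetPath"), QStringLiteral("assets/"));

    view.setResizeMode(QQuickView::SizeRootObjectToView);
    view.resize(1280, 720);
    view.setSource(QUrl(QStringLiteral("qrc:/main.qml")));
    view.show();

    return app.exec();
}
```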
So let's have a look at that main.qml. There's nothing really complex. We're simply importing the QML modules we might need, so Qt Quick and Scene3D. We're creating a root element for our scene, and the really important part, if we want to display 3D, is to create a Scene3D element, give it a size, and optionally set some of the options you can set on that object; and inside that object we will create our KUESA scene. So that will be our 3D part. And right underneath that Scene3D, we've got a Text element. That Text element is drawn in 2D on the screen, and it will be of use later, when we want to apply some 2D text over a 3D object. So we'll get back to that in a couple of minutes.
So that's really the skeleton of the
application.
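As a rough sketch, that skeleton could look like this; the KuesaScene type name and the sizes are illustrative, not taken from the actual demo sources:

```qml
import QtQuick 2.15
import QtQuick.Scene3D 2.0

Item {
    id: root
    width: 1280
    height: 720

    // The 3D part: a Scene3D viewport hosting our KUESA scene
    Scene3D {
        anchors.fill: parent
        focus: true
        aspects: ["input", "animation", "logic"]
        multisample: true

        KuesaScene {   // defined in KuesaScene.qml (illustrative name)
            id: sceneContent
        }
    }

    // The 2D part: a Text element drawn on top of the 3D viewport,
    // used later to overlay 2D text over a 3D object
    Text {
        id: motorLabel
        text: "Engine"
        color: "red"
        visible: false
    }
}
```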
Where are we doing the artwork? The artwork has been done in the KUESA scene since step one. But before we look at that, I will simply show you what we want to get to, which is simply loading our glTF file and making sure it is displayed correctly. So, if everything's working correctly, you should be seeing a yellow car on the screen, and that's the yellow car we've just loaded from glTF.
So now that we've seen that it's working, we can go into the details of what we're doing in the KUESA scene, step one, and that's where all the work is done. We're importing the various Qt 3D modules we might need, and we're importing the Kuesa module. The root element of the scene is a KUESA SceneEntity; that is just an empty object that contains what we call collections, and collections are essentially containers for various types of assets. So we've got a collection for cameras, we've got a collection for materials, we've got a collection for entities: basically, for any type of asset that we can extract from a glTF file. So our SceneEntity is the root element. It doesn't do anything itself; it contains a few properties that might be of use later, but that's all it does.
The really important part, if I want to display glTF, is that element called the GLTF2Importer. The GLTF2Importer expects two properties to be set. One is the scene entity: it wants to know where it can store the assets that are going to be loaded, which is the SceneEntity we've created. And it expects the source property to be set to a path that contains a valid glTF file. Once I've got that, if the path is valid, the GLTF2Importer will simply load the file and create the assets, and if I want to view something on screen, I need to do two things.
One thing would be to retrieve the camera I want to use to view the scene. So, remember, in the glTF editor we were able to select various cameras; on top, I just selected one of the cameras by name. I've specified from which collection in the scene entity we should retrieve the camera, and that's all. Once I've retrieved my camera, I can go down to the very bottom of the file, into our components section, and that section is where we're telling Qt 3D we want to render the scene. So, earlier, Timo was talking about the frame graph, which describes in which order we want to render objects, which rendering methods to use, and so on. With KUESA, we provide a pre-made frame graph called the ForwardRenderer, and all it expects from you is a camera. So once we've loaded the glTF file, extracted the camera, and specified the camera on the frame graph, we've got all that's required to get something on the screen. So, that's our first step.
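Put together, step one could be sketched roughly like this; the glTF file name and camera name are illustrative placeholders:

```qml
import Qt3D.Core 2.15
import Qt3D.Render 2.15
import Qt3D.Input 2.15
import Kuesa 1.2 as Kuesa

Kuesa.SceneEntity {
    id: scene

    // Load the glTF file and store the created assets in `scene`
    Kuesa.GLTF2Importer {
        sceneEntity: scene
        source: assetPath + "car.gltf"
    }

    // Retrieve the exported camera by name from the cameras collection
    Kuesa.Asset {
        id: cameraAsset
        collection: scene.cameras
        name: "CameraOrbit"
    }

    components: [
        RenderSettings {
            // KUESA's pre-made frame graph: all it expects is a camera
            activeFrameGraph: Kuesa.ForwardRenderer {
                camera: cameraAsset.node
            }
        },
        InputSettings {}
    ]
}
```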
The second step could be about loading animations and maybe displaying some profiling information. So I'll show you briefly what we want to get to, and then we'll go over the code once again.
Shouldn't be long, hopefully. Alright. Hopefully, you should be able to see the car moving along, maybe not that fast because of the uploading, but we should be able to see that the car animations are being played back correctly, just the way Timo intended them to be. So that's one thing we have on that second step. The other thing is we've enabled the Qt 3D profiling feature. That's simply a property that needs to be set to true, but once you do that, you get access to various profiling information about Qt 3D. You get access to the frame rate, which might be lower in my case, because of the uploading, than what it would be without. I get access to the number of jobs Qt 3D needs to execute to render the frame, 86 in that case. I get access to the number of draw calls (what we call commands) that I need to render the various parts of the car, and I get more information, such as the number of vertices, which is close to a million.
So we're trying to draw close to a million different points in 3D, and the car overall is composed of around 300,000 triangles. We get all this information, and it can be useful when you're trying to make sure that the application is running as fast as it should be, or, if you start having performance issues, you can check what the culprits might be: if you've got too many commands or too many vertices, maybe those are the areas we should be looking at. But apart from that, that second step is really about playing the animations that we have with the car glTF file. And we can take a look at that second step. That second step is pretty much what we were doing earlier, so we are loading the glTF and extracting the camera.
All we're adding now is that we're storing in an array the names of the various animations we want to play back; for that, I've used the glTF editor to gather those names. And then, for each of those names, I'm repeating something several times, essentially creating an AnimationPlayer for each animation. The AnimationPlayer expects the name of the animation, so that's one of those entries, and the number of times we should repeat the animation. If I were to set loops to one, we would repeat the animation only once, but in our case I want them to play forever, so I've specified the AnimationPlayer.Infinite value. And the last thing: I'm binding the running property to something called animated, which by default is true, so that when I press the space key I can change that value and therefore stop the animations or make them resume. But that's step two, so that's about loading and playing the animations. As you can see, it doesn't take that much work.
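A sketch of that repetition, assuming illustrative clip names gathered from the glTF editor:

```qml
import Qt3D.Core 2.15
import Kuesa 1.2 as Kuesa

Kuesa.SceneEntity {
    id: scene
    property bool animated: true   // toggled elsewhere with the space key
    readonly property var animationNames: [
        "DoorLeftAction", "DoorRightAction", "HoodAction"
    ]

    // One AnimationPlayer per named clip in the glTF file
    NodeInstantiator {
        model: scene.animationNames
        delegate: Kuesa.AnimationPlayer {
            sceneEntity: scene
            clip: modelData
            loops: Kuesa.AnimationPlayer.Infinite  // play forever
            running: scene.animated
        }
    }
}
```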
And let's have a look at the next step. Like previously, I'll show you what we want to get to. That step now is about showing some 2D text on top of the engine block, so let's go ahead.
Right, so you should be seeing exactly the same thing as step two: we're loading the car and playing the animation, and if we wait a bit, hopefully at some point the engine bonnet should be open, and we should be able to see some 2D text, in red, right above the engine valve covers. And as you can see, and as Timo was explaining earlier, sometimes you might want to only make that text visible when you're animating the engine, or when part of the engine is visible on screen. For that, he created that dummy 3D object for me, that trigger, and that's what I'm using to actually decide when I should or should not be displaying the text. So you can see that while we are in the car, the text is not displayed; it's not displayed here anymore. But once we move outside, it's displayed again, and the engine animation is playing. And as we move towards the front and the hood closes again, the text is not displayed anymore.
So let's see how we could do that; it's not that hard. Just like previously, we're resuming from the previous step. And what do we need to actually display the text at a given position in space? Well, we use something called the Asset. We used the Asset previously to retrieve a camera; this time we're using the Asset to retrieve a transformation in space, and that transformation is the one associated with the dummy object Timo created for me earlier, the dummy object called trigger motor info. And one of the nice features of the KUESA 1.2 release is that if I want to access properties from the asset that will be created, so properties named translation or worldMatrix, I just need to define them right there, and they will be automatically bound to the properties on the trigger motor info object, if that object actually has properties with those names.
So I've created that part. If we move back to the top of our scene, I'm creating two intermediate properties which I'm binding to the translation: one property bound to the y axis of the translation of our dummy object. And right underneath, we've got another object called the projection alpha; that's a new helper I've created which allows me, given a camera and a transformation matrix, to transform a 3D position into a 2D position on screen. And I'm binding that 2D screen position to another intermediate property called the motorLabelScreenPosition. Those properties I can then use in our main.qml. Remember, in the main.qml we added a 2D Text; well, for that Text I'm using the motorLabelScreenPosition and the motorLabelOpacity to place the text on my window and to control, with its opacity, when it should or should not be
visible. So that's an example of binding 3D content information to some 2D content, using the transformation properties from the dummy object Timo created for me to actually control the opacity of that text.
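The trigger retrieval and the property auto-binding described above might be sketched like this; the exported object name and the opacity mapping are illustrative assumptions:

```qml
import QtQuick 2.15
import Kuesa 1.2 as Kuesa

Kuesa.SceneEntity {
    id: scene

    // Intermediate property the 2D text can bind to: here the y axis of
    // the trigger's translation directly drives the label opacity
    readonly property real motorLabelOpacity: trigger.translation.y

    // Retrieve the dummy trigger transform exported from Blender by name
    Kuesa.Asset {
        id: trigger
        collection: scene.transforms
        name: "trigger_motor_info"

        // New in KUESA 1.2: properties declared here are automatically
        // bound to the matching properties on the retrieved node
        property vector3d translation
        property matrix4x4 worldMatrix
    }
}
```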
And what we can do now is something a bit more involved, in KUESA step four. I'll show you what we want to do, and that one is fun to look at.
Alright, so just like previously, we're loading the car and playing the animations; nothing is really that different from step three. We will check in a second that the 2D text is still being displayed... and it is, yes. But what step 4 is really about is integrating some 2D Qt Quick content into our 3D scene. Before, we had a simple texture with a KDAB logo; what I did in that step is replace that texture with a custom texture on which I'm running a Qt Quick application. That application is purely QML code that was actually used in a real desktop application, and we've embedded it into the car's display. And I'll show you how we can do that.
So what is really involved in creating that? We started again from step three. The first thing for me was retrieving the material, essentially the shader, that is in charge of displaying the center console screen in the car. At the same time, I've created a 2D texture that by default is not yet visible, but that texture will be a way to render something outside of the screen and store that visual result. And if we look down below, that's essentially something from Qt 3D: a Scene2D allows us to specify that we want to render a QML scene into the off-screen texture I've created right above. And in that Scene2D, I'm telling it that I want to display a QML application called Cinematic Experience. Cinematic Experience is actually a Qt 5 demo for Qt Quick that was made quite a few years ago by QUIt Coding.
And that's actually a larger QML application, with loads of code and elements, and we won't actually dig into it, because I don't know the code and that's not really what's interesting. What we really care about is that we can take an existing 2D Qt Quick application, bundle it in a Scene2D element that gets rendered into the off-screen texture, and then actually use that off-screen texture on the material element we used for the center console display, replacing the original texture with the off-screen texture, which is animated and contains our application.
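Sketching that setup, with the material name, texture size and map property name as loud assumptions (the actual texture property depends on the material type used in the demo):

```qml
import Qt3D.Core 2.15
import Qt3D.Render 2.15
import QtQuick.Scene2D 2.9
import QtQuick 2.15
import Kuesa 1.2 as Kuesa

Kuesa.SceneEntity {
    id: scene

    // Retrieve the material driving the center console screen
    Kuesa.Asset {
        id: consoleMaterial
        collection: scene.materials
        name: "ConsoleScreenMaterial"
    }

    // Off-screen texture that will receive the rendered QML scene
    Texture2D {
        id: offscreenTexture
        width: 512
        height: 256
        format: Texture.RGBA8_UNorm
        generateMipMaps: false
        magnificationFilter: Texture.Linear
        minificationFilter: Texture.Linear
    }

    // Render a full Qt Quick application into that texture
    Scene2D {
        output: RenderTargetOutput {
            attachmentPoint: RenderTargetOutput.Color0
            texture: offscreenTexture
        }
        CinematicExperience {  // the embedded QML application (illustrative)
            width: 512
            height: 256
        }
    }

    // Swap the material's static texture for the dynamic one;
    // "diffuseMap" is an assumed property name
    Binding {
        target: consoleMaterial.node
        property: "diffuseMap"
        value: offscreenTexture
        when: consoleMaterial.node !== null
    }
}
```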
So what we've covered with those four steps is loading the glTF, playing the animations, connecting trigger translations to 2D text, and finally replacing one of the materials' static textures with a dynamic texture, which we generate using our Qt Quick scene. So that's it on my side. Maybe, Timo, you've got some other things you want to show us?
Yeah, thanks Paul. Let me share my screen again. I'm not sure how fast this was over the Zoom webinar, but after the webinar we will provide you with the code, so you can play around with it and see how fast and real-time it actually is.
So what I'm now going to do is show you how easy it is to actually make changes to the 3D scene. Let's say we want to change the timing of the doors, so they open exactly when we enter the car, like when I'm at this point in time here, for example. All I need to do is move those keyframes around, and sure, this is nothing new to you guys. So let's close the doors when I'm at this point; same for the other door. So the doors are opening directly before I approach the car, and they are going to close again when I'm leaving the car.
And let's change something more obvious: I'm going to change the color of the car again. As I said, this is nothing new, because I just changed some stuff in Blender. But the point is, when I export this now, I export it to a glTF file Paul can directly use. We don't need any tool in between to re-import and re-export into a proprietary format; we just use the actual, direct glTF output of Blender. And Paul won't need to do any code changes. I can make whatever complex changes I want to right now; as long as I keep all the identifiers of the objects, I can change all the animations, all the colors, whatever, and Paul can directly use my glTF output in the app. You just need to make sure to use the right assets.
So I will hand over to Paul again.
Right, so let's get back to our small example application. And if Timo is correct, and he usually is, all I need to do is change the path to my glTF file, and hopefully we should be able to see a blue car with slightly different animations, but without me needing to change anything at all. And that seems to be working really fine, at least on my PC. I hope you can still see the car and the animations on your end. But let's see if we've got the change in the animation; I seem to remember that the door was already open at that point in the past. Yeah, it seems that the animations are correct.
We still have our Qt Quick content embedded in the center screen console, which is great, and as we leave the car, the doors should close immediately after. So that's really perfect: I didn't need to make any change in the code, and at the same time Timo was able to modify the car, improve it, and reiterate on that process without requiring me to be really involved or keeping in touch with me to get all the things in sync, which is really perfect. And I think that's it for the demo part of the application.
Thank you, Paul and Timo. Before Paul and Timo answer your questions, this is to remind you that you can find out more about KUESA 3D at www.kuesa.com, and don't forget to check your mail to find out how to get the code for this demo, so you can try it out for yourself. Here's Timo and Paul again.
Okay, so I'm just checking if we have any questions in the Q&A, but I don't see anything. Bernie wants to say something? Yes. Hello. Hi.
Timo, well, thank you very much. It was a beautiful collaboration between the designer and the developer; it was so seamless I didn't know who was doing what. So it was a great thing. I just have a couple of questions, actually, but you know, I just want to limit myself, as we have a large audience here other than ourselves. So let me just, if that is okay, ask you a few questions.

Sure.

Yeah, so you have demonstrated KUESA 1.2 today. Is there any particular combination with a Qt release that you would like to recommend? As far as I know, 1.2 works not only with Qt 5.12 but also with 5.15, which has just been released.
Right, that's correct. We are backward compatible with 5.12, but if I were to recommend a release, then to make sure you get access to all the latest features, such as the profiling, you would be better off going with 5.15. We're also planning support for Qt 6, which should be available in November or December of this year, so KUESA should be ready for Qt 6 when Qt 6 is released as well. We're working on that.
I guess everybody here probably knows this already, but about KUESA: you actually used Blender for today's demonstration, but I understand it also works well with other 3D tools?
As I said, you can use any 3D application, as long as that application can export to glTF, and since glTF exporters are available for almost all 3D applications, you can use them. The only point is: if you want to use specific KUESA features, like the integration of the special materials into the 3D application, or the KUESA layers, and you want to export those special features, then you need to use a glTF exporter that supports the KUESA extensions.
Nice, I see. One of the reasons I asked this question is that I'm sure a lot of the designers here in the eastern part of the world are more familiar with tools like Maya or 3ds Max, and they often send us materials and shaders created with their experience of ray-traced rendering. For example, at the beginning of the webinar you showed an example with the Iro materials, which are probably very new to most of them. So would you have any recommendations for those people, whose knowledge is limited to ray tracing, to get more familiar with the materials being used here in KUESA?
Actually, the materials are super easy. We have documentation for how they work, and basically you have a reflection channel and you have a diffuse channel. It's no magic; it's not rocket science. For a designer, they are much easier to grasp than a PBR material. And if they want to use them in other 3D applications, that will also be possible. For the next release, for example, we will be working on the Maya integration, since Maya is maybe one of the most commonly used 3D applications. So I would say we will be working on full integration of the materials for Maya for the next release as well; this is definitely on our roadmap.
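For a rough idea of how simple those two channels look from the developer side, here is a hypothetical QML sketch. The type and property names (IroDiffuseMaterial with reflection and diffuse maps) are assumptions for illustration and should be checked against the KUESA material documentation; the texture paths are placeholders:

```qml
import Qt3D.Render 2.15
import Kuesa 1.2 as Kuesa

// A body-panel material with the two channels Timo mentions:
// a pre-baked reflection map plus a diffuse map. Property names
// here are illustrative assumptions, not a verified API listing.
Kuesa.IroDiffuseMaterial {
    materialProperties: Kuesa.IroDiffuseProperties {
        reflectionMap: TextureLoader { source: "qrc:/textures/reflection.png" }
        diffuseMap: TextureLoader { source: "qrc:/textures/body.png" }
    }
}
```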
Perfect, perfect. And I suppose, for people who are more interested in adding interactivity with KUESA, or even more fashionable effects using a 3D particle engine, those are the kinds of features we can expect in the next release to come?
Yeah, sure. We have a planning session at the beginning of next month for release 1.3, where we will decide what we are going to work on for that release. I think we will give everybody an update on that afterwards.
Right. But so far, and correct me if I'm wrong, Timo, what we're planning for the 1.3 release is particles, shadows, and maybe extending the Iro materials with different types of effects. That's what we already know; in June we will fine-tune what else we add.
Right, we will add new things while supporting the existing features, because that is also always a huge effort. We have a long feature list, and if we want to add just one tool like Maya, we need to make sure everything runs seamlessly in that tool. So we will need to maintain that, and also add some new features like the particle system and so on.
All right, okay. Just one last question, then, for Paul. You have been incredibly quiet for the last few minutes, so I'm just curious, as a personal question, I guess: how was your life as a developer before you knew about KUESA, and how is it now, firsthand? A lot of the developers here are limited in their knowledge of the design side, so they're kind of lost when faced with these 3D tools and workflows. So, as a developer, does using KUESA actually make your life easier?
Yeah, well, the interesting thing is that nowadays, with KUESA, I just receive a single glTF file; we import it and extract the parts we want. In the past, what we would have had to do is ask the designer to ship each part in a separate file, so that we could load all the parts manually and assemble the car by hand ourselves. Maybe, in the grand scheme of things, that gave us a bit more flexibility, but it was a lot more work to retrieve all the parts and reassemble the car than to just load everything and select the sub-part we want to modify after the fact. So that's really what has changed for me.
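The "load everything, then select the sub-part" workflow Paul describes can be sketched in QML roughly as follows, assuming SceneEntity's name-based lookup; "DoorLeft" is an assumed node name that the designer would have assigned in Blender:

```qml
import Kuesa 1.2 as Kuesa

Kuesa.GLTF2Importer {
    sceneEntity: scene                      // the surrounding Kuesa.SceneEntity
    source: "file:///path/to/car.gltf"      // placeholder path
    onStatusChanged: {
        if (status === Kuesa.GLTF2Importer.Ready) {
            // Pick one part of the fully assembled car out by the name
            // the designer gave it — no manual reassembly of parts needed.
            var door = scene.entity("DoorLeft")
            // ...tweak, animate or attach content to the door here
        }
    }
}
```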
It means that the road to getting a demo, or a full application, is a lot faster now than it used to be, and it also makes the iteration process a lot easier: in the past, if a part was changed, we had to re-export it, modify things again and check that it still worked with all the other parts, whereas now we just re-import the new model, and that's it.
Yeah. So, what do you do with all the free time?
We work on other KUESA features, I guess.
Great, thank you very much. That's all for me, gentlemen.
Okay, cool. Thanks for the questions, Bernie; glad you were here. I think there are no questions in the Q & A section, so we can call it a day.
Right. Okay, well, thank you again, Paul, Timo and Bernie, and to all of you for attending. Please visit us at www.kdab.com, and write to us at info@kdab.com to find out more. And don't forget to download the code; you'll see that in your email. So take care and stay safe, all.
Right, thanks. Bye-bye.
