Dear Fellow Scholars, this is Two Minute Papers
with Károly Zsolnai-Fehér.
Earlier, we talked quite a bit about
a fantastic new technique called artistic
style transfer.
This means that we have an input photograph
that we'd like to modify, and another image
from which we'd like to extract the artistic
style.
This way, we can, for instance, repaint
our photo in the style of famous artists.
Now, artists in the visual effects industry
spend a lot of time designing the lighting
and the illumination of their scenes, which
is a long and arduous process.
This is typically done in some kind of light
simulation program, and if anyone thinks this
is an easy and straightforward thing to do,
I would definitely recommend trying it.
After this lighting and illumination step
is done, we can apply some kind of artistic
style transfer, but we shall quickly see that
there is an insidious side effect to this
process: it disregards, or even worse, destroys
our illumination setup, leading to results
that look physically incorrect.
Today, we're going to talk about a flamboyant
little technique that is able to perform artistic
style transfer in a way that preserves the
illumination of the scene.
These kinds of works are super important,
because they enable us to take the wheel from
the hands of the neural networks that perform
these operations, and force our will on them.
This way, we can have a greater control over
what these neural networks do.
Previous techniques mostly take color and
normal information into consideration.
Normals basically encode the shape of an object.
However, these techniques don't really have
a notion of illumination.
They don't know that a reflection on an object
should remain intact, and they have no idea
about the existence of shadows either.
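As a quick aside, here is a toy sketch of these two quantities - this is illustrative code of my own, not from the paper: for a sphere, the surface normal at a point is simply the normalized direction from the center to that point, and a mirror reflection of an incoming direction d about a unit normal n is d - 2(d·n)n.

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def sphere_normal(center, point):
    # Unit normal of a sphere at a surface point:
    # the normalized direction from the center to that point.
    return normalize([p - c for p, c in zip(point, center)])

def reflect(d, n):
    # Mirror-reflect the incoming direction d about the unit normal n:
    # d - 2 (d . n) n
    dn = sum(a * b for a, b in zip(d, n))
    return [a - 2.0 * dn * b for a, b in zip(d, n)]

n = sphere_normal((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))  # top of the unit sphere
print(n)                             # [0.0, 0.0, 1.0]
print(reflect([0.0, 0.0, -1.0], n))  # a straight-down ray bounces straight up
```

A plain style transfer network sees none of this geometry - it only sees pixel colors, which is exactly why it can smear a reflection away.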
For instance, we have recently talked about
diffuse and specular material models, and
setting up this kind of illumination is something
that artists in the industry are quite familiar
with.
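For a bit of context on what "diffuse and specular" means, here is a minimal textbook-style sketch - a Lambert diffuse term plus a Blinn-Phong specular highlight, which is an assumption of mine for illustration and not the paper's renderer:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def shade(normal, light_dir, view_dir, kd=0.8, ks=0.4, shininess=32):
    # Lambert diffuse term: brightness falls off with the angle to the light.
    diffuse = kd * max(0.0, dot(normal, light_dir))
    # Blinn-Phong specular term: a sharp highlight around the half vector.
    half = normalize([l + v for l, v in zip(light_dir, view_dir)])
    specular = ks * max(0.0, dot(normal, half)) ** shininess
    return diffuse + specular

n = [0.0, 0.0, 1.0]                    # surface facing the camera
light = normalize([0.0, 1.0, 1.0])     # light from above and behind us
view = [0.0, 0.0, 1.0]                 # camera straight ahead
print(shade(n, light, view))
```

The point of the paper is that features like this highlight should survive stylization, instead of being treated as just another patch of bright pixels.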
The goal is to retain these features
throughout the style transfer process.
In this work, the artist is given a printed
image of a simple object, like a sphere.
This is no ordinary printed image, because
this image comes from a photorealistic rendering
program, which is augmented by additional
information, like what part of the image is
a shadowed region, and where the reflections
are.
And then, when the artist starts to add her
own style to it, we know exactly what has
been changed and how.
This leads to a much more elaborate style
transfer pipeline where the illumination stays
intact.
And the results are phenomenal.
Even more importantly, the usability
of the solution is also beyond amazing.
For instance, here, the artist can do the
stylization on a simple sphere and get the
artistic style to carry over to a complicated
piece of geometry almost immediately.
Temporal coherence is still to be improved,
which means that if we try this on an animated
sequence, it will be contaminated with flickering
noise.
We have talked about a work that does something
similar for the classical kind of style transfer;
I've put a link to it in the video description
box.
I am sure that this kink will be worked out
in no time.
It's also interesting to note that the first
style transfer paper was published only a
few months ago, and we're already reveling
in excellent follow-up papers.
I think this demonstrates the excitement of
research quite aptly - the rate of progress
in technology and algorithms is completely
unmatched.
Fresh, new ideas pop up every day, and we
can only marvel at their ingenuity.
As usual, please let me know in the comments
section whether you have found this episode
interesting and understandable.
If you felt that everything is fine here,
that is also valuable feedback.
Thank you!
And by the way, if you wish to express your
scholarly wisdom, our store is open with some
amazing quality merch.
Have a look!
We also have a huge influx of people who became
Patrons recently, welcome, thank you so much
for supporting Two Minute Papers.
We love you too.
Thanks for watching, and for your generous
support, and I'll see you next time!
