One of the best features of Adobe Character
Animator is the ability to livestream your
animated creations.
Last month, I made a snowman character named
Bluster, and for four days I answered questions
live on YouTube, Facebook, Twitch, and Instagram.
In this tutorial, I’m going to walk through
what I did, so you can create your own animated
livestreams.
We’ll start with some tips on making a dynamic
and easily controllable livestream-worthy
character.
Then we’ll take a look at how to customize
and fine-tune streaming software to broadcast
your character to the world.
Let’s start off by digging into Bluster
— he’s a free downloadable puppet and
background, so you can open him up into Character
Animator and see how everything works behind
the scenes.
There are two components that I think make a
successful live animated character rig: 1)
simple controls, and 2) a diverse set of
expressions.
I’ll go to File > Import, select the Bluster
and SnowBackground.puppet files, and bring
them in, which makes them appear in my Project
panel as new puppets.
With Snow Background selected, I’ll click
the Add to New Scene button below to open
it up in Record mode.
You might need to adjust the scene properties
— for fluid livestreaming, I tend to stick
with 1280 x 720 at 24fps.
If the dimensions are too large or the frame
rate is too high, you can run into choppy
streams depending on your computer specs and
bandwidth, so I tend to start with these optimized
settings.
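To get a feel for why these numbers matter, here's a rough back-of-the-envelope sketch in Python. The raw pixel counts below are purely illustrative; your actual stream bandwidth depends on the encoder, not on these numbers.

```python
# Rough comparison of how much raw image data two common scene settings
# ask your machine to render each second. Actual stream bandwidth is set
# by the encoder, so treat this purely as an illustration of scale.

def pixels_per_second(width, height, fps):
    """Raw pixels rendered per second for a given scene setting."""
    return width * height * fps

p720 = pixels_per_second(1280, 720, 24)    # the suggested starting point
p1080 = pixels_per_second(1920, 1080, 60)  # a much heavier setting

print(p720)           # 22118400
print(p1080)          # 124416000
print(p1080 / p720)   # 5.625 — over five times the rendering work
```

Starting at 1280 x 720 and 24fps and only scaling up once your stream stays smooth is the safer order of operations.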
Then I’ll drag the Bluster puppet into the
scene, and use the Transform behavior to move
and resize him as need be.
When you make a scene, by default you end
up in the Record workspace, but I’m going
to switch over to the Stream workspace instead.
This workspace hides most of the timeline,
so all you really see is the selectable
tracks on the left, and it makes the Controls
panel more prominent.
So when I want Bluster to be sad, or raise
his arms, or show a heart animation when he
really likes something, those are clear buttons
I can press to trigger his various animations.
Making a Controls panel is a pretty easy and
fun process — when you first come here you’ll
see a Generate Controls button.
Click it, and any triggers you’ve created
will show up here as buttons.
Bluster has 23 triggers listed in his Triggers
panel, so that’s what shows up here.
If you switch to Layout mode, you can edit
and move things around to organize the controls
to your liking.
Personally, I delete any swap set default
triggers to keep things simple.
So for example, the Lids swap set has four states:
squint, lower, upper, and the default
wide-eyed look.
But since the wide eye is the default, it
automatically shows up when the other triggers
are turned off, so it’s one less button
I need to worry about.
I can simply select and delete it to clean
up my controls a little.
If there is any artwork associated with the
triggered layers, it will automatically show
up.
If there are multiple artwork layers in the
trigger, you can right-click the button and
select from the different available layers
to customize it.
But Replays don’t automatically get icons,
and sometimes you might want to make a custom
icon.
If I go over to Rig mode, I can see the Controls
panel is not open by default, but I can go
to Window > Controls to change that.
Then in Layout mode, I can drag any layer
on top of any button to change its icon.
That’s why Bluster has a group called “Hidden
Stuff” inside his body group — I made
a bunch of custom icons, hid them inside his
body, and dragged them into the buttons to
make a nice, clear custom set.
Note that while in Layout mode you can also
drag any Properties parameter with dotted
lines around it into your Controls panel,
giving you the ability to add things like
Position X sliders and rotation dials.
When you’re going live, the last thing you
want to worry about is remembering which key
to press or scanning through a cluttered panel
to find a relevant trigger.
Experiment with different setups and see what
works best for you and your character.
If I’m streaming for 30 minutes straight,
and all viewers see is a character with the
same expression glued to his face, it will
feel stale, and a lot of the time it won’t match
the emotion of what’s being talked about.
For example, when I was livestreaming as Bluster,
people often asked what it felt like to melt.
That seems like a pretty traumatic experience
— not something he should say with a smile
on his face.
So one of the triggers I used the most here
was his Sad mouth set, which changes his happy
set of mouths to a sad set.
Same with the eyes — I have happy eyes for
when he’s really excited about something,
and impressed eyes for when he’s bragging.
Mixing it up between these while talking makes
the character really start to come to life,
and for every question someone asked, I’d
try to insert at least one emotion-based trigger
to help convey whatever I felt Bluster was
feeling.
For the arms, I incorporated several Replays
to give Bluster a variety of arm positions.
I separated these into two categories: poses
and actions.
For poses, I simply made a short blended Dragger
Replay that holds one pose.
So for example, if I wanted Bluster to raise
his right arm, I’d arm the Dragger behavior, position
the arms how I want, record for a few seconds,
blend the edges, and right-click to create
a Trigger and Replay.
Then I’d set the Replay to Stop/Sustain,
make sure the trigger was latched, and add
a key to it.
By making a bunch of these and putting them
in a swap set, Bluster now can cycle between
several different arm poses.
While I was streaming I would often tap these
at random intervals to show emphasis for certain
points and keep things feeling alive.
Some Replays were longer with more complicated
sequences, like a wave with head movement
and eye triggers, or a rhythmic arm gesture.
Since viewers enter and leave from livestreams
often, being able to wave hello or goodbye
seems like a must-have trigger to me — I
probably used that one more than any other.
Finally, I added a few Cycle Layers animations
outside the body with question marks and a
heart icon.
Little animations like these can help keep
things fresh and engaging, and punctuate certain
moments with a fun cartoon element.
If you haven’t watched the Triggers and
Replays tutorials on this channel, I highly
recommend checking them out — they go into
a lot more detail about how to do all of this
stuff, and are some of the most powerful tools
at your disposal in Character Animator.
Okay, so now you’ve got an amazing puppet
that’s ready for the spotlight.
How do you get them from Character Animator
to places like YouTube, Twitch, and Facebook?
Well, it requires four parts: Character Animator,
a plugin to send your scene to other software,
streaming software that can take that scene
and broadcast it out to the world, and finally
your destination like YouTube or Twitch.
I’ll start by downloading NewTek’s Network
Device Interface, or NDI, plugin — the link
is on the screen and in the video description
below.
This will download a package of tools, but
the two I really want are the NDI for Adobe
CC Plugin and the NDI Video Monitor (which
is called Studio Monitor for PC), so I’ll
install both of those.
Let’s test things out and make sure I’m
getting an NDI signal.
In Character Animator’s Stream workspace,
notice the little Stream Live icon in the
bottom-right of the Scene panel.
When this is illuminated blue, it means it’s
on, sending out a livestream-ready signal.
If I click it while pressing Command on Mac
or Control on Windows, I’ll go to my Live
Output preferences.
Here I want to make sure Enable Mercury Transmit
is checked up top, and my Video Device is
NewTek’s NDI output.
I also want to make sure the background disable
option below is not checked.
When I click OK, my scene should now be ready
to broadcast.
Now I’ll open up that NDI Video Monitor (or Studio
Monitor) application that I also installed
and go to File > (my computer’s name) > Adobe
Character Animator.
On Windows, this menu is accessed by the menu icon
in the upper-left corner of the window.
And my live scene should now show up in a
monitor window, without any extra UI or cursors
or anything.
So this confirms to me that everything is
working as expected.
If you’re trying this and it’s not working,
try clicking the “Learn more about live
streaming” link in the Character Animator
Live Output preferences for some helpful troubleshooting
tips.
Next I’ll move onto streaming software.
There are a ton of these out there — OBS,
Wirecast, Vmix, Xsplit, and more.
You can try different ones and see what works
best for your setup, but for the purposes
of this tutorial I’m going to use OBS Studio.
It’s a popular free and open source product
that works well with Character Animator, and
you can download it at obsproject.com.
There’s just one hitch: out of the box,
OBS currently doesn’t support NDI, so you’ll
have to download and install an extra plugin
for that.
Once again, the link is here on the screen
and in the video description below, and luckily
this is the last thing we have to download.
All right, once I’m in OBS I’m going to
click the + under Sources and find NDI Source.
I’ll click OK and then click the Source
Name dropdown.
If Character Animator and NDI are set up correctly,
I should see it listed as a source in the
dropdown here.
Click OK and your Character Animator scene
should now show up in the window above.
You can drag it to move it around, or resize
it with the circles in the corners.
The size of your scene is found under OBS
> Preferences in the Video category — I
have mine set to 1280 x 720, just like my
original Character Animator scene.
Ideally the Mixer is showing your microphone
audio as well, but if it isn’t, just click
the gear icon, go to Properties, and select
the correct input device.
But when you’re broadcasting there can often
be an audio delay, and for cartoon lip sync,
it can look really bad when the mouths are
running a few frames behind everything else.
What I do is click the gear icon, go into
Advanced Audio Properties, and add some sync
offset, essentially adding a manual delay
to the audio to line everything up correctly.
For my setup I’ve found 300ms seems to look
pretty good, but you can test this by setting
a value and pressing the Start Recording button.
This will record and save a local video file
so you can see exactly how everything is lining
up, and you can change the format and destination
of that file by going to the Output section
of your OBS Preferences.
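If you want to sanity-check an offset value before recording a test, the math is simple: milliseconds of offset times frames per second gives the frames of drift being compensated. A quick sketch (the 300ms value is just what worked on my setup, not a universal number):

```python
# Convert an audio sync offset in milliseconds into animation frames,
# to see how much lip-sync drift a given offset compensates for.

def offset_in_frames(offset_ms, fps=24):
    """Frames of delay corresponding to an audio offset at a given fps."""
    return offset_ms * fps / 1000

print(offset_in_frames(300))       # 7.2 frames at 24fps
print(offset_in_frames(300, 30))   # 9.0 frames at 30fps
```

A few frames either way is usually where lip sync starts to look noticeably off, so recording a short local test after each adjustment is still the real check.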
If everything looks and sounds good, you’re
ready to move on to the last step — streaming.
In the OBS Preferences you’ll find a Stream
section that has a wide variety of services.
Each service works a little differently but
the main thing you care about is a stream
key — this is a unique secret code that
a service like Twitch, YouTube, or Facebook
will provide to allow you to stream.
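Under the hood, the stream key is just the secret part of the ingest URL the streaming software pushes video to. As an illustration only: the server address below follows Twitch's commonly documented RTMP ingest format, but treat it as an assumption, and always copy the exact server and key from your service's own settings page.

```python
# Illustration of where a stream key fits: it is appended to the service's
# RTMP ingest URL. The server address here is an assumption for the example;
# copy the real one from your streaming service's settings page.

def ingest_url(server, stream_key):
    """Combine an RTMP ingest server with a secret stream key."""
    return f"{server}/{stream_key}"

# "live_12345_EXAMPLE" is a made-up placeholder key, not a real one.
print(ingest_url("rtmp://live.twitch.tv/app", "live_12345_EXAMPLE"))
# rtmp://live.twitch.tv/app/live_12345_EXAMPLE
```

Because the key is effectively a password for your channel, never show it on stream or commit it anywhere public.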
On Twitch you can find this in your Dashboard
under Channel Settings, on YouTube it’s
at the bottom of your Live Dashboard, and
on Facebook it’s in the Connect tab when
you try to start a live video.
Copying and pasting this into OBS is the last
step.
Click OK, take a deep breath, and click Start
Streaming.
Your desktop will probably be a mess with
Character Animator, OBS, and the destination
website all open simultaneously, so it’s
incredibly helpful to have a multi-monitor
setup to see everything clearly.
Now that we’ve covered the basics of livestreaming,
here are a few tips and tricks that might
help on your path to animated superstardom.
One huge benefit of NDI is that it understands
alpha channels, or the transparent parts of
your video content.
So I could turn Bluster’s background off
in Character Animator and add my webcam as
another source in OBS, and now Bluster will
show up overlaid on top of the camera source.
So you could create some kind of setup where
you have a live person talking to a cartoon,
similar to what you might have seen with Cartoon
Donald Trump on the Late Show With Stephen
Colbert.
This is also how Twitch video game streamers
like Scribbleh overlay their characters above
the game they’re playing.
You can composite different sources together
to create interesting live scenes.
If you had multiple computers running Character
Animator, you could connect them through NDI
to have two characters talking simultaneously,
like the team at Critically Awkward does.
Think about ways to make your scene continually
serve up new visual content.
You’ll notice Bluster’s background has
a Cycle Layers group that runs through several
messages every 350 frames.
This makes for an interesting dynamic element
that lets you run through different calls
to action or promotions.
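If you're tuning a similar cycle for your own background, converting the frame count into wall-clock time helps you pick a comfortable pace. A quick sketch, using Bluster's 350-frame cycle at the 24fps scene rate from earlier:

```python
# How often a cycle-layers group repeats, in wall-clock seconds.

def cycle_seconds(cycle_frames, fps=24):
    """Seconds between repeats of a cycle that is cycle_frames long."""
    return cycle_frames / fps

print(round(cycle_seconds(350), 1))  # 14.6 — a new message roughly every 15s
```

Fast enough that viewers see several messages per minute, slow enough that each one is readable.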
In a previous livestream I did with Red Monster,
I used a similar technique to change the camera
every several hundred frames, cycling between
closeup, medium, and wide shots to make things
more visually diverse.
For the Bluster livestreams, I got a free
Google Voice phone number and asked people
to call and leave messages in the voicemail.
This was a big hit.
Not only did it help break up the monotony
of answering chatroom questions, but they
were questions I could prescreen and think
of entertaining responses to ahead of time.
I listen to some podcasts that do live call-in
shows through services like Discord, so I
think there’s a lot of potential here for
taking live audience interactions outside
of the normal chat questions.
You’ll notice when I was running through
OBS destinations earlier I left out Instagram.
That’s because currently Instagram only
lets you go live from your phone’s camera
and doesn’t allow connections from OBS or any other
streaming software.
I hope this will change in the future, but
for the Bluster livestream I had to put my
phone on a tripod and point it at the computer
screen as a makeshift solution.
The lip sync was off and the screen wasn’t
sharp, but it was fun nevertheless.
And finally, practice makes perfect.
It can be hard to control a character, talk
coherently, and read chat messages all at once,
and it’s even harder if you’re layering your character
on top of video games or other activities.
Before going live, try rehearsing offline,
talking to yourself, and answering imaginary
questions.
The more you practice with your puppet, the
more familiar and comfortable you’ll get
with its controls panel.
So that’s an overview of livestreaming with
Character Animator.
If you’re streaming, we’d love to tune
in — please use #CharacterAnimator when
sharing on social media so we can check it
out.
And if you’re running into trouble setting
up your stream, the best place to get help
is the official Character Animator forums.
Thanks for watching, and have fun livestreaming!
