Hi, I'm Zenith. No, not that Zenith; it's edgy Zenith! Get it? Ha ha, sword, Terraria.
So basically, there's this bot on Twitter called Deep Leffen. Using machine
learning, it generates text that imitates a guy named Leffen, a pretty
interesting character in the Smash Melee scene. And so I thought: who in
the furry community is also a pretty interesting character?
Just to preface this: none of the opinions expressed by the bot are my own.
Anything controversial this bot says was learned from Kothorix and/or furry
Twitter... yeah, it's that bad.
In order to make an AI like Deep Leffen,
I will be using something called a long
short-term memory network, or LSTM.
Now, I don't fully understand the math behind it, but here's a quick introduction to
what an LSTM is. Remember in my last video
how I used an ANN? Well, you see, that
architecture is pretty much the exact
opposite of what you want when
processing language.
your dog your dog
your stupid your bad
[Incoherent talking about babies or something]
English is hard
Whenever you talk, pretty much every sentence is a different length, unless
you're some iambic-pentameter-speaking psychopath. Now, a simple neural
network like an ANN can only take fixed-size inputs and outputs, so you
can't generate meaningful English text with only an Artificial Neural
Network. Instead, we use a network that is specifically designed for
sequences, like a Recurrent Neural Network, or RNN.
What an RNN does is basically predict what happens next: it takes the
previous output as an input in order to generate a new output. Complicated,
I know. For the sake of a text generator, it should predict the next
letter. In the text "Hello World!", when we input H into the RNN, it should
predict E; then when we feed it E, it should predict L. It does this over
and over, eventually picking up a basic understanding of words and,
eventually, language.
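If you want to see that idea in code, here's a tiny plain-Python sketch. It's not an actual RNN, just an illustration of the training pairs a next-character model learns from:

```python
# Toy illustration: a next-character model trains on pairs where the
# target is the input shifted by one character. At generation time,
# each predicted character gets fed back in as the next input.
text = "Hello World!"
pairs = list(zip(text[:-1], text[1:]))
print(pairs)  # [('H', 'e'), ('e', 'l'), ('l', 'l'), ('l', 'o'), ...]
```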
An LSTM is basically an RNN, but on steroids. This absolute HUNKER of an
architecture uses fancy gating math in order to have some sort of long-term
memory: instead of being a virgin RNN and taking only the previous output,
it's able to pop off and carry a cell state across previous iterations.
What a Chad. Now, I'm going to be honest, I don't 100% understand it
myself, so if you want to learn more, I'll leave a few articles in the
description on how LSTMs actually work. Anyways, that's all my AI does: it
predicts the next character in a line of text. It's kind of Poggers, it's
kind of witchcraft, but more importantly, I can exploit it to make a
questionably controversial video.
HELL YEAH.
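For the nerds: here's roughly what a character-level LSTM model looks like in Keras. I based my code on TensorFlow's text generation example, so this is a sketch in that spirit; the layer sizes below are illustrative, not my exact settings:

```python
import tensorflow as tf

vocab_size = 256      # illustrative; really the number of distinct characters
embedding_dim = 256   # size of the learned character embeddings
rnn_units = 1024      # size of the LSTM's hidden state

# Each character id goes through an embedding, the LSTM carries its cell
# state across time steps, and a dense layer scores every possible next
# character.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.LSTM(rnn_units, return_sequences=True),
    tf.keras.layers.Dense(vocab_size),  # raw logits, one per character
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```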
I know, I know, you're antsy to see the results, but I really want to show
the AI learning how to write, because it's absolutely fascinating to me.
The AI goes through training cycles called "epochs"; an epoch is
essentially one pass through the entire data set. My AI only took about 12
[actually 15] epochs to train, and I'll explain later why I stopped there.
Hey, it's future me, I forgot to explain: basically, past that point it
just gets very unstable. Also, 15 epochs, not 12. The data set I used for
training the AI was quite large. It contained many tweets from furry
Twitter, somewhere around 50 megabytes of text, or about... 25,000 pages of
text. Holy sh- The data set also contains a couple more megabytes of text
covering Kothorix's tweets as well as all the subtitles from his YouTube
videos. Using this boatload of data helps ensure that Deep Koth doesn't
overfit: it actually learns English rather than repeating the data set
verbatim.
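If you're wondering what "a pass through the entire data set" means in code, here's a rough sketch of the kind of tf.data pipeline TensorFlow's example builds: chop the character stream into fixed-length chunks, split each chunk into an input and a shifted-by-one target, and hand it to fit with the epoch count. The file name, sequence length, and batch size are made up for illustration; `model` is the one sketched above:

```python
import tensorflow as tf

# Hypothetical corpus file name, just for the sketch.
text = open("furry_corpus.txt", encoding="utf-8").read()

# Map each distinct character to an integer id.
vocab = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(vocab)}
ids = tf.constant([char_to_id[c] for c in text])

seq_length = 100  # characters per training example (illustrative)

# Chop the id stream into chunks one longer than seq_length, then split
# each chunk into (input, target): the target is the input shifted by one.
dataset = (
    tf.data.Dataset.from_tensor_slices(ids)
    .batch(seq_length + 1, drop_remainder=True)
    .map(lambda chunk: (chunk[:-1], chunk[1:]))
    .shuffle(10_000)
    .batch(64, drop_remainder=True)
)

# One epoch = one full pass over this dataset; I stopped at 15.
model.fit(dataset, epochs=15)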
So, without rambling any longer, let's do a training montage!
no no no no no no no
I take back the music. Susan Wojcicki, please don't demonetize me :(
Yeah, on second thought, that montage is boring. Instead, I'm going to talk
about temperature while I bust a few nuts. I neglected to mention it
earlier, but temperature essentially dictates how creative the AI's word
choices are: it scales the randomness of the sampling, so low temperatures
play it safe with likely characters while high temperatures gamble on
unlikely ones. When you look at a 0.4 temperature, you see a lot of
overfitting: the AI has memorized Twitter handles, and there are even loops
where it gets stuck on the same phrase. Conversely, 1.2 temperature is what
I would call thirteen-year-old sparkledog Kothorix, because absolutely
nothing makes sense. But interestingly enough, that's where you can see the
most learning take place: in epoch one it's near gibberish, but by the end
it definitely produces more English at 1.2. For the final result, I'll be
using a 0.8 temperature, because I think it strikes the best balance
between cohesion and randomness.
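Under the hood, temperature is dead simple: you divide the model's output logits by it before sampling, so values below 1 sharpen the distribution (safe, repetitive picks) and values above 1 flatten it (wild picks). Here's a sketch of the sampling step plus the generate-and-feed-back loop, reusing the `model`, `vocab`, and `char_to_id` from the sketches above; the exact shape of this loop is my illustration, not my verbatim script:

```python
import tensorflow as tf

def sample_next_id(logits, temperature=0.8):
    # Dividing the logits by the temperature rescales the distribution:
    # < 1.0 sharpens it (safe, loop-prone text), > 1.0 flattens it
    # (creative, often incoherent text).
    scaled = logits / temperature
    return int(tf.random.categorical(scaled, num_samples=1)[0, 0])

def generate(model, seed, length=280, temperature=0.8):
    ids = [char_to_id[c] for c in seed]
    out = list(seed)
    for _ in range(length):
        # Feed everything generated so far; take the logits at the final
        # position, i.e. the model's guess at the next character.
        logits = model(tf.constant([ids]))[:, -1, :]
        next_id = sample_next_id(logits, temperature)
        ids.append(next_id)
        out.append(vocab[next_id])
    return "".join(out)

print(generate(model, seed="Hello, I am Deep Koth. ", temperature=0.8))
```

With temperature=0.4, this loop is where you'd see the handle-memorizing and stuck phrases; crank it to 1.2 and you get the sparkledog gibberish.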
Hello, I am Deep Koth.
not really
some carefully on the front line
Ate Feb. 9 Dollars to Australia
Come chat and hang and have a copyard digitigrade
Yeah, I'm not concerned about the AI uprising anytime soon.
Honestly though, Deep Koth made some BANGER text. Not in the sense that
it's good, but more like I could see Kothorix saying this if he was, like,
blackout drunk on Twitter. I'm now going to read out some text I thought
was particularly effed up or hilarious...
RT @elonmusk: I'm going I fricking
love it
let them suspend me in desk. @WeirdScience9 I have cut the paws...
#FursuitFriday. @carrionette
it's a brilliant pissing but a warm up
was the green pattern. RT @pawgazer: I made a regular Fox Dragon I got
stuck in my stomach... that's sooo stupid!!
I really love
my channel I've been so happy with how
the Godzilla cat Big Taleu's Angel are auto 
touching 7% the intro is based on
striking. People want to see it back to
N(ot writing that)azi memes and that we can't really
say anything of a group who is not a
such. @S2LIFTAG designs I did in
the next week was going to be a friend
of mine. Have fun having a fricking
therapy! Another wallpaper with a Mario
cancer cookie for @eccooloo Absolutely
fell in love with the trash!! Always go to
Catholic store and she's just a warm
jagged corpse a few mins and see it as a
sexual attraction to are pedo goals
This is... this is not my... these are not my opinions, please, please! I'm
just saying these out loud, the TTS was slow, please, please, a-aeug, oh no.
If you want to see more examples for yourself, I'll leave a pastebin in the
description with around 60,000 characters of randomly generated nonsense
from Deep Koth. Have fun searching through it, and tweet at me on Twitter
when you find something good!
Unlike my last video, I'm not gonna make this section half the video...
unless (OwO). That's mainly because I think I've already explained decently
enough how the AI works. There are just a few things I want to point out,
though.
For starters, in my opinion the AI did quite well at picking up furry
lingo, even when it didn't make sense. You often see furry hashtags and
furry-specific sentiments: you know, loving paws, talking about art, and
even a ridiculous amount of "no u"s. Now, another thing I noticed was
that... well... the crazy attracted the crazy. I'm not gonna name names,
but I quite often saw some pretty controversial furries show up in the
generated text. That's all I'm gonna say; everything else is up to viewer
discretion. I'm not touching that. Honestly though, it was
hilarious seeing the AI in action. It's
as if the void took a look at furry
Twitter and it was like "hmm yes, I
understand." ^w^
and then spewed utter garbage. It's actually exactly what I expected from
Deep Koth, not gonna lie. And of course, it only makes sense over short
stretches, and that's just due to the limitations of the LSTM architecture:
my model, at best, can really only generate coherent thoughts of about 70
characters. Now, OpenAI's GPT-2, which is the architecture Deep Leffen
uses, is much, much more advanced than mine. It can pretty much write an
entire coherent essay using something called a transformer architecture,
which relies on attention, and honestly my brain melts every time I think
about how it works, so that's fun.
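I won't pretend to explain transformers here, but for reference, the attention part boils down to one small formula: softmax(QKᵀ/√d)·V. Here's the textbook scaled dot-product attention, just to show the shape of it; this is the generic version, not GPT-2's actual code:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    # Each query scores every key, softmax turns the scores into weights,
    # and the output is a weighted mix of the values.
    d = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d)
    weights = tf.nn.softmax(scores, axis=-1)
    return tf.matmul(weights, v)
```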
Anyways, that's pretty much the extent of the analysis I have. Okay, so
this video is mainly a joke; I don't intend to antagonize Kothorix toooo
much, it's mainly just to poke fun at his past. That being said, if you did
enjoy this video, do all the YouTube thingies: like, comment, subsc... you
know the drill, just do all that. If you want the source code of this
project, there's a link in the description.
It's not my own; I actually just played around with TensorFlow's text
generation example from their website. Now, after my last video, people
were asking if they could get the application and/or source code of my
Furry Detector. Welllll, I'm working on that: for the coders out there, I'm
just cleaning up my repository before releasing the source code, and for
everyone else, at some point in the future I may get it up and running as a
website or an executable. Alright, I have nothing else.
thank you so much for watching! :3
