Hello everyone,
So I left the simulator running for several
more hours to see where the
creatures would end up and it turns out that
they managed to hack the physics
engine.
You may have noticed in the previous video
that the mushrooms were clustering
under trees since the creatures were unable
to get there to eat them.
Each tile also has a maximum mushroom count,
so I thought that once the tiles under the
trees were full, the forests would effectively
become barren and we might see the creatures
evolve some more to favour the meadows,
perhaps becoming big and slow.
So here are some screenshots I took at regular
intervals. The creatures originally got smaller
but have started getting bigger again.
We can see that basically they're all darting
around as fast as they can
manage, so I guess being big doesn't hurt
them any more.
So this is how they've broken the collision
detection to allow them to sneak inside trees.
The game engine checks where each creature
will end up after a tick, and if that position
is inside a tree, it stops the creature and
kills its speed.
We could see in the first video that a bunch
of creatures would often get spawned inside
trees, and then they were stuck.
That barely ever happens any more; I think
that's because the creatures are able to
accelerate out of the tree and gain enough
distance between ticks.
They've basically evolved teleportation.
So I clearly need to fix that.
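To make the bug concrete, here's a minimal sketch of an endpoint-only collision check like the one described above. All the names and numbers here are made up for illustration, not taken from the actual engine: a creature is only stopped if its position *after* the tick lands inside a tree, so a fast enough creature can step right over one.

```python
import math

TREE_RADIUS = 1.0  # hypothetical tree size, purely for illustration

def inside_tree(pos, tree):
    """True if a point lies within a tree's trunk circle."""
    return math.hypot(pos[0] - tree[0], pos[1] - tree[1]) < TREE_RADIUS

def step(pos, vel, tree, dt=1.0):
    """One tick: only the endpoint is checked, so fast creatures can tunnel."""
    new_pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    if inside_tree(new_pos, tree):
        return pos, (0.0, 0.0)  # stop the creature and kill its speed
    return new_pos, vel         # nothing in between was ever tested

tree = (2.0, 0.0)
# Slow creature: endpoint (2.5, 0) is inside the tree, so it gets stopped.
print(step((1.5, 0.0), (1.0, 0.0), tree))
# Fast creature: endpoint (4.5, 0) is clear, so it "teleports" through.
print(step((1.5, 0.0), (3.0, 0.0), tree))
```

A proper fix would sweep the whole path segment against the tree, rather than only testing where the creature ends up.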
This video also shows how bad the simulation's
framerate was, so after filming it I did some
work to improve that. Using Cython, which
compiles Python code to C and is awesome,
helps a lot, but there were other
optimisations to be done too.
You can check out everything that's changed
in the git history; link in the
description. I'm now just about able to get
60 frames per second.
I also wanted to show you the genome and neural
network in more detail, since I skimmed
over them a little in the previous video.
The genome is stored as an object containing
the genes that dictate various attributes
of the creature.
I dumped the current physical state of a creature
from the previous video into JSON format, which
is what we're looking at now. This part of
the dump shows the genome, which has a radius
gene, a gene stating the number of hidden
neurons, and then two more genes containing
the input and output weights of the hidden
layer.
These are in a flat array, so the first three
values are the input weights of the first
hidden neuron, the next three values are
the input weights of the second hidden neuron,
and so on.
This format means that if the number of hidden
neurons gene is changed,
we can easily crop or grow the array. If it
grows we generate random values for the new
neuron; I guess this is like the gene grabbing
whatever proteins are lying around at the
time.
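As a sketch of how that crop-or-grow step might look, assuming three inputs so each hidden neuron owns three consecutive slots in the flat array. The field names, values and weight range here are guesses for illustration, not the actual dump format:

```python
import random

NUM_INPUTS = 3  # assumed from the "first three values" layout described above

genome = {
    "radius": 0.8,          # illustrative values, not taken from the real dump
    "num_hidden": 2,
    "input_weights": [1.6, -0.7, 1.5,   # hidden neuron 0
                      0.3, 0.9, -1.1],  # hidden neuron 1
}

def resize_hidden_layer(genome, new_num_hidden):
    """Crop or grow the flat weight array when the hidden-neuron gene mutates."""
    weights = genome["input_weights"][:new_num_hidden * NUM_INPUTS]
    # New neurons get random weights, like grabbing whatever proteins are around.
    while len(weights) < new_num_hidden * NUM_INPUTS:
        weights.append(random.uniform(-2.0, 2.0))
    genome["num_hidden"] = new_num_hidden
    genome["input_weights"] = weights
    return genome
```

The output-weights gene would presumably be cropped or grown the same way, keyed off the hidden-neuron count.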
I've also created a neural network inspector
which we can load this creature into.
Here we can manually change the input neurons'
values to see how the outputs change.
This is the creature which changed angle when
it was touching something. It also looks like
it now won't spawn if it's touching anything,
which is what has prevented creatures from
being born inside trees.
So what's going on inside each neuron? If
we look at this hidden layer neuron, we can
see that its three inputs have a weight of
1.6, minus 0.7 and 1.5 respectively.
We calculate the value of this neuron by multiplying
each input's value by its weight and summing
the results. In this case all three inputs are
set to one, so 1.6 minus 0.7 plus 1.5 gives us 2.4.
Now with large weights or lots of input neurons,
this value could get arbitrarily high, so
we also put it through what's called an activation
function, in this case the sigmoid function,
which converts it into a value between 0 and
1.
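The calculation above, written out with the same numbers and the standard sigmoid:

```python
import math

def sigmoid(x):
    """Squash an unbounded weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# All three inputs set to 1, with weights 1.6, -0.7 and 1.5:
weighted_sum = 1 * 1.6 + 1 * -0.7 + 1 * 1.5   # = 2.4
print(round(sigmoid(weighted_sum), 3))        # about 0.917
```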
So if we change the haptic input to 0, we
remove the influence that input has on the
rest of the network.
The same thing happens to the output neurons,
using the values and weights from the hidden
neurons. The only difference here is that
we don't use the sigmoid function because
we want the outputs to be unbounded.
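Putting the two layers together, a full forward pass might look something like this, reusing the flat weight layout from the genome. This is my sketch of the idea, not the simulator's actual code; note that the output layer skips the sigmoid so the outputs stay unbounded.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, input_weights, output_weights):
    """Hidden neurons: weighted sum then sigmoid. Output neurons: sum only."""
    n = len(inputs)
    num_hidden = len(input_weights) // n
    # Each hidden neuron reads n consecutive weights from the flat array.
    hidden = [
        sigmoid(sum(x * w for x, w in zip(inputs, input_weights[k*n:(k+1)*n])))
        for k in range(num_hidden)
    ]
    # Output neurons are left unbounded: no activation function here.
    return [
        sum(h * w
            for h, w in zip(hidden, output_weights[k*num_hidden:(k+1)*num_hidden]))
        for k in range(len(output_weights) // num_hidden)
    ]
```

With one hidden neuron weighted 1.6, -0.7, 1.5 and a single output weight of 2.0, an all-ones input gives roughly 2 x 0.917, about 1.83.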
So I hope this video was informative and
interesting. I promise I'm going to be
implementing vision soon, so check back when
the next video arrives.
Thanks for watching!
