Hello everyone, I'm David, I work at Selleo, and today I'm going to talk about how to improve your GitHub repository: basically, how to put your GitHub on steroids. On today's agenda: why and how you should use templates; how to track coverage in your project with GitHub; how to take care of updates in your projects, basically packages; how to improve the speed of your build (that's more related to CI, but still); how you should take care of changes in your project; and how you can document your project better. Okay, so basically, in
the past we had a tool in our company that allowed us to give people feedback on the code they wrote. Most of that feedback was later replaced by RuboCop or other tools that could basically automate this part. Of course, that doesn't mean we could automate everything, but still, it turned out this tool was redundant in most cases back then, and that's why I created this presentation: to show you that some recurring tasks we do every day can be easily automated, especially on GitHub.
The first one is PR templates. How many of you are actively using PR templates on GitHub? Raise your hands. Okay, seems like half of the people here. What does it give you? Basically, if you use templates for PR creation on GitHub, you are not only documenting your work and what your code is all about; you're also telling the people looking at your pull request that your understanding of the feature is aligned with the customer's requirements. It's also much easier to gather all the important information required for quality assurance for the people reviewing your code, because they have it in a single place. Testing gets easier too if you, for instance, provide a testing scenario where you explicitly describe the steps needed to find out whether it's working well or not. Also, thanks to templates, you can even attach a video preview, which takes minutes to produce, and if your customer is, say, a CTO, he can just click on this video, which points to YouTube, and see in seconds that it works as expected, at least on your local environment. Of course, if you have automated tests, he can be even more confident that it's okay, but still, those few minutes are well worth spending, because if you have tons of pull requests, you may not have time to review every one of them locally. Maybe you'll just check it on staging later on, but still, you can save your own time and the time of your customer, or whoever reviews your feature.
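As a sketch, a PR template is just a markdown file that GitHub picks up from a well-known path (the repository root, `docs/`, or `.github/`); the sections below are only an example of what such a template might contain:

```markdown
<!-- .github/PULL_REQUEST_TEMPLATE.md -->
## What does this PR do?

## Related ticket

## Testing scenario
<!-- Explicit steps a reviewer can follow to verify the change -->
1. ...
2. ...

## Video preview
<!-- Optional: link to a short screen recording, e.g. on YouTube -->
```

Once this file is committed, GitHub pre-fills the description of every new pull request with it.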
This is one of the newest features (I don't know, maybe it's been on GitHub for a month or two), and it's pretty useful, especially for big repos: you can create a separate template for each type of issue that someone can create in your repository. It's especially useful for open source, where maintainers have to have some template or framework for issues so they don't have to keep asking follow-up questions like "Can you tell me how to reproduce that?" or "Can you give me some context for this issue?" and stuff like that. You can put everything inside your issue template, and then, when someone wants to create an issue, he just has to fill in the template, answer a few questions, bam bam bam, and that's it. So it really simplifies things and also brings some order to your repository. And this is the place where you can basically find it in your repo.
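For issue templates, GitHub looks in the `.github/ISSUE_TEMPLATE/` directory, one markdown file per issue type; a minimal bug-report template might look like this (the sections are just an example):

```markdown
<!-- .github/ISSUE_TEMPLATE/bug_report.md -->
---
name: Bug report
about: Report something that is broken
---

## Steps to reproduce

## Expected behavior

## Actual behavior

## Context
<!-- OS, browser, library version, anything that helps -->
```

The `name` and `about` front matter is what shows up in the template chooser when someone clicks "New issue".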
Coverage. Do any of you check the spec coverage of your builds regularly? I mean, is it an important factor for you? Does a drop in coverage, for instance, make the build red, cause a build failure, for any of you? Is there anyone who does that? Yeah. So until now we were using the Cane gem, which basically does it automatically for you: together with SimpleCov, at least in Rails or Ruby projects, it automatically checks that your spec coverage is not lower than a threshold, which in our case was 91%. So you don't have to worry that coverage silently drops with each iteration of your code. It was okay, but still, it was there more for the stats; it didn't bring that much value. There is a better option for you, though. You can use one of the apps available on the GitHub Marketplace, where you get a much better report, and you can even get a report for just the new changes that came into your project. So you can even fail your build if the changes that were introduced weren't, for instance, 100% covered. With each iteration you can see exactly which files weren't covered, because these tools produce that report for you as well. Okay.
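For reference, the SimpleCov side of that setup is a couple of lines in your spec helper; the 91% threshold below is just the figure from our project:

```ruby
# spec/spec_helper.rb: SimpleCov must be started before loading application code
require 'simplecov'

SimpleCov.start 'rails' do
  # Fail the test run if total coverage drops below the threshold
  minimum_coverage 91
end
```

With that in place, a coverage drop makes the whole spec run, and therefore the CI build, fail.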
How about updates? Quite often we have to worry about keeping our project up to date, and usually we just take the Gemfile or package.json, or whatever file is responsible for our dependencies, and try to bump all those dependencies, usually by hand. But we don't have to do that, because there are tools dedicated to it, for instance Greenkeeper. Once it's attached to your GitHub repository, it will automatically create a pull request on each update of the packages you are using. What's cool about it is that if you have pretty good test coverage, it will basically run the checks for you. So once the version of your packages is automatically bumped, and you trust your specs, you get a ready-to-merge pull request that in theory should be fully mergeable without any issues. What about
speed? Quite recently we had a problem on one of our projects where the build took about an hour, and it became a real pain, to be honest, because we had to wait too long for the full build to finish. I talked a little with people in our company about how they handle that and whether there was some quick win to get, and in fact there was one: the Knapsack gem, which is used on one of our projects. What it basically does is split your build into two or more separate builds, and in the case of two nodes, if you run your build concurrently, you get almost 50% lower build time. You can even keep your spec coverage tools in place, because you can use, for instance, after_success at the end of your Travis configuration file to add some extra checks, which have access to a directory shared by all those builds. The other cool thing is that if you have more nodes to work with, you can split your builds based on the type of specs you have, so you can see, for instance, that only policies are failing. Also, you don't have to wait for the full spec suite to fail, because you can fail the whole build right after one failing spec, so you utilize your Travis better that way. Also, on one of the projects,
I heard that code changes were handled by hand: they had to rebuild their changelog every time they did a merge or a release. Of course, we don't have to do that manually, but we do have to stick strictly to some regime to make it useful, because if someone writes very misleading pull request titles, it's not going to work.
But basically, if you stick to some regime, those changelogs can be fully automated. There is a gem called github-changelog-generator, which basically regenerates your changelog based on the pull requests you merged, plus issues and releases. So we just ran one command and the complete changelog was rebuilt. And
there is a pretty cool alternative, lerna-changelog, which, based on the labels you attach to your pull requests, even gives you fancy icons, fancy grouping, and some stats at the end about who the committers are. It's especially useful if you have to maintain open source libraries, so you don't want to do that stuff over and over; but also, if your project is getting bigger and bigger, it's worth considering such a tool: it will automate that for you, and you'll save some time for cooler stuff to do.
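For the record, regenerating the changelog with github-changelog-generator really is a single command; `your-user/your-repo` below is a placeholder for your own repository:

```shell
# Install the gem once, then rebuild CHANGELOG.md from merged PRs,
# closed issues, and tags/releases of the given repository
gem install github_changelog_generator
github_changelog_generator --user your-user --project your-repo
```

It needs a GitHub API token for anything beyond small repos, so check the gem's README for the token option before wiring it into CI.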
Okay, I know documenting your code is not the coolest thing we can do, but for libraries it's very much expected, I would say. We quite often end up reading the source code when the documentation sucks, and thanks to inch-ci.org you can basically get a fancy badge that tells you what your library's coverage is in terms of documentation. You can also get some pretty nice suggestions, so it's like a linter for your documentation, especially if you have some more complex part of the code that you'd like to comment. I would highly recommend it, especially for libraries. So the whole point,
my point, is that your repository is like a product, and you should treat the people who visit your repo and the contributors to your repo as customers. You want to get as many customers as you can, so you should focus on delivering cool stuff, cool features, and not on tasks that are repetitive. Basically, don't waste your time: automate whatever you can, especially for contributors, who are not always our teammates and don't have to be in our organization. So templates, for me at least, are a must.
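As for the inch-ci.org badge I mentioned, it's a one-liner in your README; the URL pattern below is from memory, so double-check it against the inch-ci docs:

```markdown
[![Inline docs](https://inch-ci.org/github/your-user/your-repo.svg?branch=master)](https://inch-ci.org/github/your-user/your-repo)
```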
Okay, and do you have any questions? For open source they are all free, but as for pricing: Knapsack, for instance, is a tool created by a guy from Poland, from Krakow, DonorLogic, and it's pretty cheap, like a dollar or something per repo. SimpleCov is also like $5 per repo, maybe, so it's not extremely expensive stuff. I don't remember the exact pricing, some dollars; I believe all the tools I've shown here come to about ten dollars in total monthly per private repo. Okay, and that's it, thank you.
