Developer friendly cryptography is what
I'm gonna get started on this morning.
How many of you would call yourselves
developers in some capacity?
All right. How many of you work with
developers? A couple more. OK, very good.
So because we have the small screen, I
was kind of moving things around last
night to make it larger. But certainly
feel free to come forward. This link at
the top here will get you a copy of the
slide deck in PDF format, so if you want
to follow along on your laptop or
something, feel free.
My name is Bryce Williams. I work for a
consulting firm called SysLogic, based
here in Milwaukee, and we are a managed
security services team. My group
provides application security guidance
on a variety of topics for
organizations. We review lots of code:
we do source code assessments, security
assessments, on a lot of different
systems, from large systems to small
embedded devices. We've trained
thousands of developers. This takes me
all over the place, talking to
developers specifically about
application security concerns, which
includes cryptography.
And personally, I've been in the field
for 20 years as an application developer
and architect, and then transitioned
into the application security space.
So everything I do is around application
security and working with the developer
community. I want to start with this
statement, this quote from Dr. Neumann.
This may be something you've seen
before; there are several different
variations of it. This is a statement
that Dr. Neumann gave in 2001 for a
New York Times article. The article was
discussing an announcement by a Harvard
professor about a new type of
unbreakable encryption technique that he
and his team had put together, based on
the use of a key derived from a stream
of random numbers. So it's a pretty
novel technique from an academic and
theoretical standpoint. It was
actually pretty cool. But the point from
Dr. Neumann, and from some of the others
who commented on it, was: this is really
great from that perspective, but we
often see weaknesses when it comes to
the implementation of truly great
cryptographic techniques. So I want you
to keep this in mind as I go through
this discussion: you can have great
cryptography, but putting the pieces
together, actually practically putting
it into place, is where we see
weaknesses.
So to get started, I want to look at
cryptographic best practices for 2018.
And don't worry if these are unfamiliar
to you or if you can't read them.
Now that we are in 2018, we have
some updated cryptographic best
practices. If you were in Zack Grace's
talk yesterday, there's a little
bit of overlap between what I'm talking
about today and what he covered. The best
practices, for example, are similar,
derived from the same sources: from
cryptographic professionals who
know this topic inside and out. So
we have things like random data from
kernel-based CSPRNGs, your cryptographic
random number generators. Use of
authenticated symmetric encryption.
Symmetric signatures: using HMAC with
your hash functions, and making sure
you're using hash functions that avoid
length extension attacks; that's an
important one. Some of this I'll come
back to, but just keep in mind, at a
high level, these are the best practices
that cryptographic professionals promote
and encourage folks to use. So: password
storage, using password-based KDFs.
Asymmetric encryption, preferring the
use of elliptic curve cryptography, or
ECC, over RSA. For asymmetric
signatures, preferring the use of EdDSA
or RFC 6979. And then secure
communications: TLS everywhere is great,
something along those lines; a good
end-to-end encrypted communication
stream, TLS 1.2 or now 1.3.
So I want to get us started with my
field observations. These
are examples I'm going to go through of
things that my team and I have uncovered
in our assessments. Let's start with
some common mistakes first, then we'll
get into the actual examples. So here,
some of the things we have observed
include these different categories.
Things like weak password storage: use
of reversible encryption, which is all
too common; AES or RSA used in the
storage of passwords, not ideal; use of
hash functions without salts, a little
better, but again not ideal.
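The fix for those items is well established: salt every hash and apply a deliberate work factor via a password-based KDF. Here's a minimal sketch using only the Python standard library (the PBKDF2 parameter choices here are my own illustrative assumptions, not a universal recommendation):

```python
# Minimal sketch of salted, work-factored password storage (stdlib only).
import hashlib
import hmac
import secrets

def hash_password(password, iterations=310_000):
    """Derive a verifier with a unique random salt and an explicit work factor."""
    salt = secrets.token_bytes(16)  # kernel CSPRNG; never reused across users
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return {"salt": salt, "iterations": iterations, "digest": digest}

def verify_password(password, record):
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), record["salt"], record["iterations"]
    )
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, record["digest"])
```

The salt comes from the kernel CSPRNG via `secrets`, and the stored record carries its own iteration count, so the work factor can be raised later without breaking existing verifiers.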
Insufficient work factor to prevent
brute force. Often there might be some
work factor utilized, or what we call a
KDF, a key derivation function, but
there hasn't been sufficient thought
put into the work factor. We also see
poor key management: keys not stored
securely, hard-coded, or placed in
source control. It's a pretty common
issue. Loose access control or
over-shared keys. Lack of granular key
usage and periodic rotation. All sorts
of related key management issues. We
also see a general lack of
authenticated encryption.
And if you're not familiar with that
term, authenticated encryption, don't
worry; most developers aren't either.
They've never heard of authenticated
encryption, and they don't realize it's
something they need to be concerned
about. In their minds, they've heard of
AES, and that's good enough: let's just
use AES because we've heard that's
military-grade. It's a good start; AES
isn't necessarily bad, but you
need to know a little bit more than that.
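One concrete thing to know: without authentication, ciphertext is malleable. This toy Python demo (a fake stream cipher built from SHA-256, purely for illustration; not a real cipher) shows an attacker flipping plaintext bits without ever touching the key:

```python
# Toy demonstration (NOT a real cipher) of ciphertext malleability:
# stream-cipher output can be bit-flipped by an attacker who has no key.
import hashlib

def keystream(key, n):
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_encrypt(key, data):
    # XOR with the keystream; the same function decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"secret key"
ct = xor_encrypt(key, b"PAY $0010 TO MALLORY")
# Attacker flips bits in the ciphertext without knowing the key:
tampered = bytearray(ct)
tampered[5] ^= ord("0") ^ ord("9")   # change the amount's first digit
print(xor_encrypt(key, bytes(tampered)))  # → b'PAY $9010 TO MALLORY'
```

An authenticated mode, or an HMAC checked before decryption, would reject the tampered ciphertext outright.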
Or they may know they need to use
authenticated encryption, so they go
down the path of developing an
authenticated encryption construction
based on something, but it's not the
preferred encrypt-then-MAC approach. We
also see use of keys, initialization
vectors, and nonces in ways where
there's misuse: reuse of keys, keys that
are hard-coded, some sort of strange
obfuscation used to protect them. Use of
passwords as encryption keys is another
big issue. A password is not designed to
be an encryption key. You can turn a
password into an encryption key, but a
password itself should never be used
directly as an encryption key. Also,
weak or non-random values for these
keys, IVs, and nonces, or just reuse of
values; there are certain constructions
where reuse of a nonce is a big no-no. So
here's my first example: use of
not-encryption, is what I call it. If
you're in the back you might not be able
to read all of this, but essentially
this is C code; in fact, all of my code
examples are in C. This is from an
actual assessment that was performed. We
ran across this function, I think I ran
across this one, called encrypt
password. Just skimming through it,
right away I knew we had a problem,
because there's no mention of any
cryptographic algorithms or ciphers, and
there's no encryption key. There's just
this interesting line here that's
essentially shifting some bits around.
So that's not encryption; that's
obfuscation, essentially. And you might
think, well, this is a pretty crazy
example, this wouldn't exist that often
in the field, or maybe it's super old
code, but I think you might be surprised
at how often you run across examples of
encryption that look something like this.
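To make that concrete, here's a hypothetical Python sketch of that kind of routine; this is not the client's actual code, just the same idea of keyless bit-shuffling. Because there's no key, anyone who can read the binary can write the inverse:

```python
# Hypothetical reconstruction of an "encrypt_password" that merely rotates
# bits. No key, no cipher -- obfuscation, not encryption.
def fake_encrypt(data):
    # Rotate each byte left by 3 bits.
    return bytes(((b << 3) | (b >> 5)) & 0xFF for b in data)

def recover(data):
    # The attacker's "decryptor": rotate each byte right by 3 bits.
    return bytes(((b >> 3) | (b << 5)) & 0xFF for b in data)

print(recover(fake_encrypt(b"hunter2")))  # → b'hunter2'
```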
In this next example, we have a random
number generator that's being seeded, or
initialized. Here the comment says it
all: "Need to seed the RNG with a hash
of the MAC address." If you can kind of
read the code here, it's computing a
Murmur2 hash over some pieces of a MAC
address, and this is just messed up.
A MAC address is a static value for a
device. A random number generator seed,
on the other hand, can't be
deterministic; it needs to be unique and
unpredictable. So the use of a MAC
address to seed a random number
generator should never happen; it
shouldn't be any part of it. We also see
a lot of what I call overly complex
encryption: just extra layers of things.
I remember running across this one for
the first time, right away going, what
is going on here? And then breaking it
down, reverse engineering the logic to
understand what was happening. And it
kind of looked like this, where it's
generating a random value; that random
value gets stored in a database; the
random value gets split in two; you
combine the second half with the
plaintext; reverse the combined value;
hash it with SHA-256; and on and on. One
of my favorite sections is line two
here, where it creates a UUID, takes the
five parts, reorders the parts, and then
puts it back together. And it's like,
what? Who dreamed this up, and why?
Essentially this second step here is the
cause of the concern. There's actually a
weakness in this approach, regardless of
the fact that it's just weird.
The random value should never have been
stored in the database alongside the
encrypted data. I forgot to mention,
this was for storing data in a database
using reversible encryption; they wanted
to be able to recover the plaintext
later, so they used this approach. Which
on its face is crazy, but the reason I
think the designers put this in place is
because they recognized that storing the
random value in the database, maybe it
felt wrong? They knew there was some
sort of weakness, so they added
additional layers of obfuscation,
essentially to make it more difficult to
reverse engineer. Now, obviously we
figured it out, and an attacker with
enough information or access could do
the same thing and ultimately reverse
engineer and recover the plaintext
values stored in the database. So if
they could take, or have access to, the
entire database, they have all the
detail they need to recover the
encrypted information. So any time I see
extra layers of complexity involved in
encryption techniques, it usually clues
me in that it could warrant some study,
because there's probably a reason those
extra layers are there: they're hiding
something. Here's another example. It's
not entirely cryptography based, but
this is looking at a cloud-hosted,
typical web system with a database back
end, an API front end, and a JavaScript
UI. There are three different issues we
highlighted in this particular
implementation. We've got missing
authentication on the API endpoint:
there was this function called get two
random security questions by user, which
allowed an anonymous user, so no
authentication, to ask based on a user
ID and get back two of that user's three
secret questions and answers. And
then, interestingly enough, when I saw
this I realized we had a big problem.
The validation of the answers given by
the end user was performed on the client
side. So the JavaScript was what was
actually saying: did they input the
correct answer for this question or not,
yes or no? Obviously an attacker could
bypass that step. And this was a
brand-new system, I think; we looked at
this like two months ago or something. A
brand-new feature that they added. So
obviously the team wasn't really
thinking about the security aspects of
this. And my third point here: even on
the back end there were issues as well.
The answers themselves to the secret
questions were stored in the database
using symmetric, reversible encryption.
Ideally you want to store your secret
question answers the same way you store
passwords, using a key derivation
function, because there's really no
reason to recover the actual plaintext
of that answer; you just need it for
comparison purposes.
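Storing answers like passwords can look something like this sketch (Python standard library; the normalization rule and scrypt parameters are my own illustrative assumptions):

```python
# Sketch: store secret-question answers via a password KDF so the plaintext
# never needs to be recovered; only comparison is supported.
import hashlib
import hmac
import secrets

def protect_answer(answer):
    normalized = " ".join(answer.lower().split())  # tolerate case/whitespace
    salt = secrets.token_bytes(16)
    digest = hashlib.scrypt(normalized.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def check_answer(answer, salt, digest):
    normalized = " ".join(answer.lower().split())
    candidate = hashlib.scrypt(normalized.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```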
So we looked at a few issues that my
team and I have uncovered, and
highlighted some basic issues. If you
compare those to the cryptographic best
practices, we're way off the mark. We're
looking at basic issues that we seem to
uncover over and over. Before developers
are able to address the more advanced,
modern topics, they're not there, in
general. So this slide looks at recent
cryptography issues, and in this case
these are some items highlighted by
Matthew Green, who's a professor at
Johns Hopkins, pretty well known in the
cryptography space. So what do we
highlight here? DUHK: Fortigate
hardcodes a key that makes every VPN
session crackable; you might remember
that scenario. These issues came out
recently, in the past two or three
years, in the press; a pretty big deal.
All related to cryptography-based
mistakes.
Next one: Juniper hardcodes a similar
key and then gets hacked by the Chinese,
who changed that key to one of their own
choosing. That was obviously a big deal.
Every major browser manufacturer and a
number of websites make TLS vulnerable
to practical decryption attacks; that
was the FREAK attack. And similarly the
next one: browsers and websites make TLS
vulnerable to practical decryption
attacks yet again, with the DROWN
attack. Apple uses crypto wrong in their
iMessage encryption for a billion users;
that was a pretty big deal. And
remember, all of these are big
manufacturers. These are the folks that
have big, dedicated security teams. They
have FIPS certification. So what kinds
of issues is everyone else running into?
Obviously, my team and I have seen some
of those in the examples I highlighted.
A lot of these are more complicated
concerns, more advanced, subtle mistakes.
The subtle mistakes are certainly bound
to happen; it's the basic mistakes that
we often see that concern me. So why are
mistakes so common?
Three different metrics here, from
different studies, just to back up the
statement that cryptographic issues are
a concern. 61.5% of applications scanned
had one or more cryptographic issues;
that's from Veracode's report last year,
and they scan a lot of code, obviously.
66% of the most popular cryptocurrency
mobile apps: so there was a study that
looked at cryptocurrency mobile apps,
and 66% of those contained hard-coded
sensitive data, including passwords or
API keys. That's really unfortunate. And
then look at this last one here, which
highlights the fact that 17% of the bugs
they looked at were in the cryptographic
libraries themselves, whereas the
remaining 83% were misuses of
cryptographic libraries. And that last
point is one I want to emphasize:
cryptographic libraries certainly can
have mistakes, and you need to be
mindful of those libraries, make sure
you're using the latest ones; and I'll
talk more about libraries. But the
implementation of cryptography is more
often where we see problems being
introduced.
So, if we look at just a few developer
challenges, again: why are mistakes
made? What is it that developers have to
fight with? I think for those of us that
do development, this will ring true;
things that you have to be mindful of as
you're working on a system. You've got
to make it work; it needs to work first.
You have to ensure the product actually
works before you make it secure. I would
argue that you need to do both at the
same time, but clearly you have to make
the product work; that's important. You
also have to meet delivery dates, and
often delivery dates can be a priority
over adding in a certain level of
security, or getting extra expertise or
a pair of eyes to review it. You might
have performance or usability concerns
as a result of certain security choices,
and so maybe they win out, performance
or usability, that is. You might have
inadequate security testing; maybe
security controls don't get tested with
sufficient expertise. Lack of crypto
knowledge: developers might not get the
training they need, or access to
knowledge sources that give them
accurate information about cryptography.
It might also be a problem with poor
library or API support, either in the
crypto libraries being utilized or in
your programming languages'
cryptographic APIs. I want to look at
those last two items, training and API
use, just to highlight a few things. So
first off, I looked at the top 10
cryptography courses on Pluralsight.
This is not a dig at Pluralsight;
they're a beverage sponsor here at
CypherCon, and you'd find this issue
with others as well. If you're not
familiar with Pluralsight, they're an
excellent online video-based training
vendor; thousands of different technical
training topics that you can get. The
interesting thing with cryptography: I
looked at the top ten. If you search for
cryptography, these are the ten courses
that are focused on cryptography. Some
interesting highlights. They generally
provide good history on cryptography and
basic concepts, but generally lack
practical engineering guidance. There's
no discussion of authenticated
encryption; remember, I mentioned
authenticated encryption, AE. That's a
pretty big deal, that's important, and
there's no discussion at all. Also, no
discussion of secure key management or
key storage options. No mention of
kernel-based cryptographic random number
generators versus user-space RNGs. Two
of the courses, two out of ten, made a
kind of passing mention of ECC, but they
didn't really get into any details about
why you would want to use ECC versus
RSA, what some preferred curve choices
are, that sort of thing. And I think
some of it is just because the content's
a little dated in some of them. Not to
say this couldn't be addressed, but this
is just indicative of the cryptographic
training material that's out there.
Whether it's Stack Overflow posts or
training materials like that, you have
to be careful, because knowledge is
good, but when you give people wrong
knowledge or dated knowledge, it can
actually work against you. And so on and
so forth: nearly all the hashing
examples use MD5, which we know is
broken. And quite a bit of odd advice.
Things like: you should double your
PBKDF2 iterations every year, which I've
never seen anywhere else; I don't know
where that came from. Or: maybe you
should use this obscure Tiger hash
function instead of SHA-256, because SHA
has been shown to have weaknesses in the
past, SHA-1 for example. That's the kind
of stuff I don't want developers to
focus on: oh, maybe I should double my
PBKDF2 iterations every year, so now
I've got a gazillion iterations and it
doesn't work anymore.
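A more defensible rule than "double it every year" is to benchmark PBKDF2 on the hardware that will actually run it and pick the largest iteration count that fits your latency budget. A rough Python sketch (the target time, starting count, and cap are illustrative assumptions):

```python
# Calibrate PBKDF2 iterations against a latency budget on this machine,
# instead of following folk rules about doubling every year.
import hashlib
import time

def calibrate_pbkdf2(target_ms=100.0):
    iterations = 10_000
    while True:
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"benchmark", b"16-byte-salt!!!!", iterations)
        elapsed_ms = (time.perf_counter() - start) * 1000
        # Stop once a single derivation costs at least the budget (with a cap).
        if elapsed_ms >= target_ms or iterations >= 10_000_000:
            return iterations
        iterations *= 2

print(calibrate_pbkdf2())  # machine-dependent; commonly in the 100k-1M range
```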
So, what are some popular cryptographic
libraries? These are general-purpose
cryptographic libraries that, in my
opinion, provide too many options to the
average developer: Botan, Bouncy Castle,
Crypto++, Libgcrypt, OpenSSL, wolfCrypt;
this is just a few of them, of course.
You may be familiar with some of these;
you may use some of these. It's not that
they're bad; they just provide a lot of
different options, things like
encryption algorithms and hash
functions. I may not have these numbers
completely accurate; I went through the
documentation and tried to summarize
things based on interfaces and API
endpoints. So you as a developer go in
and have to choose a hash function:
you've got 27 different choices in
Libgcrypt. Signature schemes: for
whatever reason, Bouncy Castle goes
crazy on the signature schemes at your
disposal. As a developer, it increases
your chances of choosing something that
may not be in your best interest.
Next: land mines in cryptographic
libraries. In addition to having lots of
options, they also have some bad or
insecure options that you need to be
aware of. Things like RSA with insecure
padding; that's an area where
cryptographers have a lot to say. Not
only should you maybe not use RSA at
all, preferring ECC; but if you need to
use RSA, make sure you're using the
preferred padding. There are a lot of
implementations where insecure padding
is the default. All of these libraries
provide at least the option of using RSA
with insecure padding. AES in ECB mode:
we know that's faulty. Broken ciphers,
things like RC4 and RC2 for example;
just some really outdated, clearly
broken ciphers, and they're there in
every single one. A lot of it is to
provide backwards compatibility or
interoperability with things, but if
that's an option, a well-meaning
developer, maybe they've copied
something off the internet? I was
looking at a blog article just the other
day where someone was talking about old
algorithms that were basically defunct
in the mid-90s. Same with hash
functions: old, broken hash functions.
Then user-space RNGs: they all have,
other than wolfCrypt I guess, a
user-space RNG concept that can get you
into trouble without you even realizing
it, because it's kind of the default
behavior. Another thing is with
implementation challenges; here's a
specific example around the
encrypt-then-MAC approach. So you have
some knowledge; you know you need to use
authenticated encryption, but maybe
you're working with a library or a
programming language that doesn't have
authenticated encryption. But you can
build it yourself with the built-in
primitives, so you put together this
encrypt-then-MAC construction. In order
to do it correctly, though, you have to
keep all of this in mind. You've got to
use a different key for your encryption
than for your authentication, and
preferably you derive those keys from a
single master key using a KDF. You have
to make sure that all your string
comparisons use constant time; you can't
use standard string comparison
functions. You definitely have to use an
HMAC, and ensure it includes the
ciphertext, your additional
authenticated data, the initialization
vector, and the encryption method. All
of those have to be packaged inside the
HMAC. And the fields passed into the
HMAC must use a format that
unambiguously delineates them. If you
leave out one of those fields, or you
don't use a correct format, you are
essentially introducing a weakness.
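Those packaging rules can be sketched with standard-library pieces. The cipher itself is out of scope here (Python's stdlib has none), so the ciphertext is treated as opaque bytes; the key labels and the length-prefixed field layout are my own illustrative choices:

```python
# Sketch of the encrypt-then-MAC packaging rules: separate derived keys,
# an HMAC over every field, unambiguous encoding, constant-time comparison.
import hashlib
import hmac
import secrets

def derive_keys(master_key):
    # Separate keys for encryption and authentication, from one master key.
    k_enc = hmac.new(master_key, b"encryption", hashlib.sha256).digest()
    k_mac = hmac.new(master_key, b"authentication", hashlib.sha256).digest()
    return k_enc, k_mac

def encode_fields(*fields):
    # Length-prefix every field so the packaging is unambiguous.
    return b"".join(len(f).to_bytes(4, "big") + f for f in fields)

def tag(k_mac, method, iv, aad, ciphertext):
    # The MAC covers the method, IV, additional data, AND ciphertext.
    return hmac.new(k_mac, encode_fields(method, iv, aad, ciphertext),
                    hashlib.sha256).digest()

def verify(k_mac, method, iv, aad, ciphertext, received_tag):
    # Constant-time comparison; never use == on MACs.
    return hmac.compare_digest(tag(k_mac, method, iv, aad, ciphertext),
                               received_tag)
```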
And you don't necessarily realize it,
because this isn't your bread and butter,
creating these encrypt-then-MAC
constructions. I've seen so many
examples of these where well-meaning
developers know they need to go down
this path of authenticated encryption,
but they try to put all this together
and, admittedly, it's complicated. We
also have weaknesses in standards and
protocols. Many of you may have worked
with the JavaScript Object Signing and
Encryption standard suite, JOSE; it's a
suite of standards, things like JSON Web
Signatures, JWS. Even if you have a
compliant implementation, meaning it
meets the standard correctly, it is
vulnerable to passing in the "none" or
HS256 algorithms.
As an attacker, you can pass those in
when in reality the server is expecting
an RSA signature, and it will work with
them; essentially it's a fairly
well-known attack vector by now. So you
want to make sure not only that you're
using a standards-compliant
implementation, but that it's also aware
of these weaknesses and has put in extra
safeguards to protect against them. Same
thing with JSON Web Encryption: JWE
allows that insecure RSA padding choice,
which is unfortunate. It also allows
ECDH with NIST curves, which introduces
the risk of invalid curve attacks. A
little more rare, but something to be
concerned about.
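For the JWS issue, the practical safeguard is to pin the expected algorithm server-side instead of trusting the token's header. A minimal Python sketch for HS256 (token layout per the JWS spec; the key and claims in the usage below are made up):

```python
# Verify an HS256 JWS while explicitly rejecting "none" or any unexpected
# algorithm; the server, not the token header, decides the algorithm.
import base64
import hashlib
import hmac
import json

def b64url_decode(s):
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def verify_jws(token, key):
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))
    # Pin the algorithm: never let the attacker-controlled header choose it.
    if header.get("alg") != "HS256":
        raise ValueError("unexpected algorithm: %r" % header.get("alg"))
    signing_input = (header_b64 + "." + payload_b64).encode()
    expected = hmac.new(key, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))
```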
It's unfortunate that these choices
exist in the standard. OAuth 2: you may
be familiar with OAuth 2. It's a great
standard; technically it's an
authorization protocol, but it's used in
the authentication and authorization
space. It provides several weak
workflows that ideally should never be
used in modern systems, but I see this
all the time: use of the client
credentials and password grant types
where they shouldn't be used; that's not
recommended. There's also this optional
state parameter; in the standard it's
optional, but it really should always be
used. There's also no explicit access
token specification in OAuth 2, so
you'll see some interesting
implementations as a result of that;
there's no specific guidance. So
obviously, good implementations of
OAuth 2 do a great job, but they're
going above and beyond the standard,
because they have additional knowledge
and experience associated with it.
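That state parameter is cheap to do properly: generate an unguessable value, bind it to the user's session, and check it on the callback. A Python sketch, with a plain dict standing in for real server-side session storage:

```python
# Sketch of the OAuth 2 `state` parameter: one-time, unguessable, bound to
# the session, and checked in constant time on the callback.
import hmac
import secrets

session = {}  # stand-in for real server-side session storage

def start_authorization():
    state = secrets.token_urlsafe(32)
    session["oauth_state"] = state
    return state  # include as &state=... in the authorization redirect

def handle_callback(returned_state):
    expected = session.pop("oauth_state", None)  # single use
    return expected is not None and hmac.compare_digest(expected, returned_state)
```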
I always recommend that anyone working
in the OAuth 2 space read the threat
model and security considerations RFC;
if you search for "OAuth 2 threat
model", you'll get this RFC. It goes
into a lot of detail about particular
threats and countermeasures that you
should be aware of. But most developers
just aren't aware of it, and as a result
they don't think about certain types of
attacks that can occur in this space.
So, I ask myself this all the time:
should developers stop using
cryptography? Because we do a lot of
training, and at one point I had asked a
client: we've done all these
assessments, and we see all these
consistent issues with cryptography.
Maybe it makes sense to start telling
your developers to stop using
cryptography? Because clearly, at least
in everything I'm looking at, no one's
doing it right. So does that make sense?
Do we tell developers to stop using
cryptography? We know one pro of that
approach: you'd have fewer security
implementation flaws. The con, of
course, is that your systems will lack
even basic protections; they're not
going to use any kind of cryptography,
and so you've got an obvious lack of
security protection. So what if security
pros do all the cryptography work
instead? In certain organizations that
might make sense; you might be able to
do that. But generally, security pros
don't scale well. You can't have one or
two people writing all of the code
involving cryptography, which,
admittedly, is an increasing amount of
features and system aspects these days
that need to take advantage of crypto.
That's just going to slow down progress
if you have only the folks with specific
expertise take care of it.
So, assuming we can't do that, what are
some other options? What is this idea of
developer friendly cryptography? Has
anyone ever heard of that concept
before, developer friendly cryptography?
I don't think I dreamed it up. So how do
we define developer friendly? Here are
four different things that, to me, mean
it's developer friendly, that take it to
that next level. There's no need to
select ciphers or key sizes. There is
automatic generation of encryption key,
initialization vector, salt, and nonce
values. There are simple, clear APIs
that provide high-level, outcome-based
functionality; so rather than using
cryptographic primitives, where you have
to choose ciphers and algorithms, you
instead work at a high level. I'll show
you some examples of what I mean by
this. And, also important: there are no
insecure or low-security options. So as
much as possible, you avoid the scenario
where developers shoot themselves in the
foot. So let's start by looking at some
developer friendly libraries, those that
I feel are candidates for this concept
of developer friendly. We've got
libsodium, and that's an important one
to remember; if you remember anything
from this talk, remember libsodium. It
is a cross-platform compatible module
based on the NaCl package. It's got
language bindings for most languages out
there. It's very popular in the security
professional and cryptographer space.
It's generally easy to work with from a
developer's standpoint. And it supports
things like authenticated encryption,
digital signatures, and hashing, and
it's performance optimized. And it
doesn't give you so many options; I
mean, it gives you enough options as a
developer to get most things done. If
you need something more advanced or more
unique, you may have to move to a more
general-purpose library. But this is the
type of library that I feel comfortable
recommending to any developer, in any
sort of environment
that they're working in. Libhydrogen is
kind of like the younger brother of
libsodium, designed for the embedded
system space. It's written in C, C99, so
it's fairly versatile. It has the same
general API, the same interfaces, as
libsodium, but it's implemented with
essentially just two cryptographic
primitives, so it stays very small, very
lightweight. It's not as tried and
tested as libsodium at this point, but
it's definitely something you should
keep your eye on if you're working in
the embedded system space. Monocypher is
another one. Pretty nice, though not
something I would necessarily recommend
over the other two. But at the same
time, if I was performing an assessment
against a system and I saw use of
Monocypher, or Tink, or ASP.NET Core, I
would feel more comfortable than with a
general-purpose library. Tink is another
library, put together by some Google
engineers, that is designed, again, to
provide that high-level, outcome-based
functionality. It doesn't have password
hashing, it's not as performance
optimized, and it only works with Java
and C++, but it's still a decent option.
And then I also want to highlight
ASP.NET Core, which is an interesting
one in this list, in that it is only a
subset of Microsoft's new .NET Core
framework, and it only provides
authenticated encryption and password
hashing. It does provide some key
management functionality as well. But
it's kind of nice in that it's built in,
so developers can leverage it; I like
the direction Microsoft is going with
this cryptographic API. So here's an
encryption example from libsodium. This
is C code again. You don't really have
to read the detail; you probably can't
in the back. But you'll notice it is
fairly simple. If you view this later,
for example: there's no mention of any
cryptographic algorithms here, no choice
of cipher, no key size selected, no
hand-rolled generation of the
initialization vector or nonce. I mean,
you still have to create the key and
create the nonce for the encryption, but
it provides functions to do this in a
very simple manner, where a developer is
generally not going to choose a wrong
selection. Here's libhydrogen; this is a
public key signature example, where
again it's creating a public/private key
pair and using that to generate a
signature to sign some data and then
later verify the signature. So again,
very straightforward, very simple for a
developer to leverage.
So, that's all great, but what about
FIPS 140? Anyone familiar with FIPS
140-2? A couple of folks. So FIPS stands
for the Federal Information Processing
Standards, and publication 140 has to do
with this particular space; the 2 means
it's version two, the current version of
FIPS 140. Cryptographic libraries can be
FIPS 140-2 Level 1 validated; the
software library itself can be Level 1
validated. Level 2 validation is
reserved for hardware devices. But that
validation requires some time and money.
You actually have to ship your library
off to a testing organization under the
CMVP. They run it through its paces,
look at a whole bunch of things with it.
Obviously it takes a lot of time, and
you have to pay them. But
then you get a certificate that says OK,
we tested this particular library on
this specific set of hardware and
actually we'll list out you know, if
whether it's 2 or 4 different hardware
based environments you know. This
particular, like it's Windows 2008 with
this particular version on it. So they
certify that will work properly
according to FIPS approves you know,
standards with that particular
environment. Use of FIPS     validated
modules is mandatory by US and Canadian
government agencies. Others may use it as
well, but definitely if you're working in
the federal government space, this is
most likely going to come up. For them
that's important that you have a FIPS
validated library that you're utilizing.
Doesn't mean you have to create one and
get it validated, it just means that any
of your cryptographic choices you know,
cryptographic code is utilizing a library that's
gotten this sort of validation. Keep in
mind that FIPS validation doesn't
necessarily mean that it's more secure.
That library is more secure, it just
means it's gone through this validation
process, it's been reviewed essentially
by the CVMP against FIPS standards, So
if you are working this space you do
need a FIPS validated library. Clearly,
you can't use any of the, I shouldn't say
clearly, because I didn't say it but, any
of those libraries we looked at before
libsodium libhydrogen, none of those
are FIPS validated libraries. There's
reasons for that. They you know, they're
really not in a position where that they
want, they need to pursue you know, as an
open source library, to pursue this
validation. Generally you're looking at a
commercial library that's gone through
this process in this case so. They're out
there. There's some good libraries like
wolfCrypt for example. Nanocrypt by
mo Cana. Even open SSL has a FIPS
validated version of the library that
you could look at. I do recommend that if
you need one of these, that you take the
time to consider writing a developer
friendly wrapper around it. Similar to
the you know, providing interface's
similar to libsodium or others. Or
maybe get some, some expertise some
outside help for example to put this
together this wrapper for your developer
group. That way they're not having to
worry about the specific cryptographic
choices and so on.
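As an illustration of what such a wrapper might look like, here's a minimal Python sketch. The facade name and methods are invented for this example, and the "backend" is just the standard library's SHA-256 and HMAC (both FIPS-approved algorithms) standing in for a validated module; the point is only that callers never make algorithm choices.

```python
# Hypothetical developer-friendly wrapper over a validated crypto module.
# The idea: expose a few safe verbs and hide every algorithm choice inside.
# Stand-in backend here is Python's stdlib hashlib/hmac (SHA-256, HMAC-SHA-256).
import hashlib
import hmac
import secrets

class SafeCrypto:
    """Tiny illustrative facade: no algorithm names leak to callers."""

    @staticmethod
    def new_key() -> bytes:
        return secrets.token_bytes(32)           # key size decided here, once

    @staticmethod
    def fingerprint(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()  # hash choice decided here

    @staticmethod
    def tag(key: bytes, data: bytes) -> bytes:
        return hmac.new(key, data, hashlib.sha256).digest()

    @staticmethod
    def verify_tag(key: bytes, data: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(SafeCrypto.tag(key, data), tag)

# Callers never see "SHA-256" or a key length; swapping the backend for a
# FIPS-validated library changes this one file only.
key = SafeCrypto.new_key()
t = SafeCrypto.tag(key, b"invoice-42")
assert SafeCrypto.verify_tag(key, b"invoice-42", t)
assert not SafeCrypto.verify_tag(key, b"invoice-43", t)
```

Because the wrapper is the only place that names algorithms, a later move to a different validated module is a one-file change rather than a codebase-wide hunt.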
Also, key management best practices. We
talked a little bit, obviously, about general encryption and
cryptographic function best practices. Key management is another area
that I mentioned is a fairly big weakness. In code that I've reviewed,
and my team has reviewed, developers can at times do a very good job
of putting together cryptographic functions, but then the keys are
stored in source control, or even hard-coded in the system itself. So,
some best practices around key management. An ideal system would have
all of these: No key or secret should be stored in clear text. Keys
should have a defined, limited lifetime based on usage. Keys should be
refreshed automatically where possible. There should be a method to
manually revoke keys; you always want a way, once you generate a key,
to revoke it, even if it's kind of a manual process that you have to
go through. Keys should never have an unlimited lifetime. Access to
clear text keys should be limited through authorization and
permissions, so being able to place some access control around them is
important. And all key lifecycle and access events should be audited.
So ideally, anytime there's access to a key, you pull it out of a
vault for example, there's a log event that occurs recording that it
was this particular application, or this particular user. You see this
a lot with code signing. If you've got a code signing process set up
within your organization, you generally want more robust handling of
the key material around code signing, so that only certain individuals
have access to it, that usage is fully audited, and so on.
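To make those lifecycle rules a bit more concrete, here's a toy standard-library Python sketch (the record shape and names are invented for illustration): a key with a finite lifetime, a revocation switch, and an audit entry on every access.

```python
# Illustrative key record implementing the lifecycle rules above:
# finite lifetime, manual revocation, and an audit event on every access.
import time
from dataclasses import dataclass, field

@dataclass
class ManagedKey:
    material: bytes
    created_at: float
    lifetime_seconds: float            # never unlimited
    revoked: bool = False
    audit_log: list = field(default_factory=list)

    def access(self, who, now=None):
        now = time.time() if now is None else now
        if self.revoked:
            raise PermissionError("key has been revoked")
        if now > self.created_at + self.lifetime_seconds:
            raise PermissionError("key lifetime expired")
        self.audit_log.append((now, who))   # every access is recorded
        return self.material

key = ManagedKey(material=b"\x01" * 32, created_at=1000.0,
                 lifetime_seconds=3600.0)
key.access("billing-service", now=1010.0)   # allowed, and audited
key.revoked = True                          # manual revocation
try:
    key.access("billing-service", now=1020.0)
except PermissionError:
    pass                                    # revoked key is unusable
assert len(key.audit_log) == 1
```

A real vault adds storage encryption and access control on top, but the lifetime, revocation, and audit ideas are the same.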
Kinda' sounds like we're in an elevator.
So, some key management solutions. You might not be able to read this
slide, but I'll highlight a few pre-made solutions, rather than having
to roll your own, which is generally not ideal. You've got HashiCorp
Vault or Keywhiz. These are good for distributed systems, client-server
environments, anywhere you've got more components involved in your
particular system environment. Vault is considered the gold standard in
this space. It has a very comprehensive key management strategy and
allows a variety of topologies in how you deploy your key management
infrastructure, and so on. So it's a great tool in this space. You may
also leverage a hardware security module, and hardware security modules
can often be used with a lot of these solutions as a hardware backing,
essentially. You can get one in chipset form if you're working in an
embedded environment, for example, or as a USB device, or even as a
rack-mount appliance that you could have in your own data center. An
HSM provides a very protected kind of vault, an actual hardware vault
to store keys, and does so in a way that's tamper resistant; it's very
difficult for someone to pull keys out of there. Keys actually get
generated on that device, and those keys, at least in the case of
private keys for example, never leave the device; they always stay
there. If you're working in the cloud, you've got things like Amazon
KMS, Azure Key Vault, Google Cloud KMS, and OpenStack's Barbican. These
are all essentially HSM-as-a-service, which is great, especially if you
only need an HSM on a more periodic basis. If you don't have the funds
for a full HSM appliance, you can just leverage one of these systems,
and do so in a fairly inexpensive manner; on the back end they're
essentially using those HSM appliances themselves. If you're working
with either Ansible or Chef in your automation processes, there's an
Ansible Vault feature, and Chef has a vault feature as well, above and
beyond just Chef data bags; Chef vault is more robust for use with
secrets and key management. They do lack the more advanced features;
for example, with Ansible Vault you're not going to get the full
feature set that you would with HashiCorp Vault. But because it's built
in, it can obviously be very convenient; it's a good place to start. If
you want integrated secret and workflow management, you might want to
consider Docker with SwarmKit, or DC/OS. There are commercial offerings
of these with even more advanced features, but you will get a basic
feature set in the free versions. Again, not as robust as a Vault
implementation, but that integration, of course, is nice. The other
ones I'll mention are Knox, Tink, and ASP.NET Core. Those are all
examples of integrated application-level key management. You're not
necessarily going to get the more advanced features, and they have more
limited language bindings: obviously ASP.NET Core only works with
ASP.NET Core, and Tink only Java and C++. I forget now what Knox works
with, probably Go. But because you can implement them at your
application level, you've got more flexibility in how those keys are
managed. Often they will also work with an HSM back end, so if one is
present, you can leverage the HSM hardware.
So, high-level developer-friendly recommendations. Transport layer
security: if we're reviewing code, or someone comes to me and asks what
kind of algorithm they should use for secure communications, I say
let's not talk algorithms; we're going to go nowhere with that.
Someone's already done that work for you. Just use TLS, essentially a
robust communications protocol that is already going to take into
consideration all the things you don't need to worry about, and do so
in the right way, in a secure way.
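As a quick illustration of the "just use TLS" advice, Python's standard library, for example, already does the safe thing when you ask it for a default client context: certificate validation and hostname checking are on by default.

```python
# "Just use TLS": a default-configured client context from the stdlib
# already enables certificate verification and hostname checking.
import ssl

ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED   # peer cert must validate
assert ctx.check_hostname is True             # and match the hostname

# Hand this context to http.client, urllib, smtplib, etc. Resist the
# temptation to disable verification to "make it work" in test setups.
```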
On the storage side, you want to consider built-in storage encryption
solutions where possible; certainly take advantage of these if you have
the opportunity. Things like Microsoft's Transparent Data Encryption.
Again, you're just getting disk-level encryption at this point, but
it's so easy to use there's almost no reason not to, and it gives you
the ability to protect your data at rest and your database backups in
SQL Server, and so on. Same thing with Amazon's EBS encryption, or full
disk encryption: those are just good options to leverage, where you as
a developer don't have to do anything; you don't even have to know it
exists, it just kind of works for you in those specific
scenarios. We mentioned developer-friendly cryptographic libraries. You
definitely want to check out those libraries, even above and beyond
your built-in programming APIs. For example, .NET has a pretty decent
cryptographic API that's included with the framework, but I've seen so
many misuses of it. My general recommendation to development teams as
we go through training is: wherever possible, use libsodium. If you get
to the point where you need to use some encryption, where you can't
leverage TLS and it's not covered by any of those kind of built-in
solutions, take advantage of libsodium. libsodium-net is the .NET language
binding of libsodium. And then, key management solutions. You always
want to consider how the keys are being managed. That includes not only
your production environment, but also your test environments and your
local development. Anytime I see a key checked into source control that
is not specific to a development environment, you know, kind of a small
test environment, we always want to flag that and say, hey, this is not
an ideal solution. Over-sharing of keys is all too common, and it's
very easy for things to leak out. We've probably all seen examples of
things that get into GitHub, or get into areas where they're just more
accessible. Keys also get logged a lot of the time, and those logs get
aggregated into log management systems, sucked in via syslog, and then
you've got more people with access to your logs; and if they get keys
or credentials that way, then you've opened up a door, and so on.
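One cheap, partial mitigation for keys leaking into logs is scrubbing them before log lines leave the process. A toy Python standard-library sketch (the pattern and filter name are invented for illustration):

```python
# Toy logging filter that masks anything that looks like "api_key=..."
# before log records leave the process and get aggregated via syslog.
import io
import logging
import re

SECRET_PATTERN = re.compile(r"(api_key|secret|token)=\S+", re.IGNORECASE)

class RedactSecrets(logging.Filter):
    def filter(self, record):
        record.msg = SECRET_PATTERN.sub(r"\1=[REDACTED]", str(record.msg))
        return True

stream = io.StringIO()
logger = logging.getLogger("demo")
logger.addHandler(logging.StreamHandler(stream))
logger.addFilter(RedactSecrets())
logger.warning("request failed, api_key=AKIA123SECRET retrying")

assert "AKIA123SECRET" not in stream.getvalue()
assert "api_key=[REDACTED]" in stream.getvalue()
```

Pattern-based scrubbing will never catch everything, so it complements, rather than replaces, keeping keys out of log statements in the first place.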
Questions? I left a little time here, so feel free to ask anything
specific. These links are, again, linked in the slide deck, and I
specifically put in three links because I consider these three tools,
or libraries if you will, the most important. They're general purpose
in the sense that they have broad applicability and broad usage; in the
case of libsodium and libhydrogen, they have fairly decent language
bindings, so they'll work in most scenarios. So if you're working in
PHP, in fact PHP, in its latest version, has libsodium built in as the
default cryptographic library now, which is pretty exciting; they're
kind of ahead of the game, of all things, PHP. And if you're working in
Ruby, Go, Rust, .NET, Java, you name it, there's probably a libsodium
binding for you.
Questions? Yes? -Are there other good code review options? -Are there other good code review options, for cryptography specifically?
Yeah. My team, obviously, enjoys looking at cryptography, and there are
other firms, I think, that kind of specialize in looking at
cryptography. But from an automated standpoint: I have a foot in the
static analysis space as well, and I spend time looking at what tools
are capable of in determining cryptographic issues. It's an interesting
space. Clearly they can identify usages of old, outdated functions. But
being able to determine whether you're using even the newer functions
in a proper manner can be complicated. You've got, say, an
encrypt-then-MAC solution; it might be great, or it might be that
you've missed something, and there's just not enough context for an
automated tool to figure that out. So I can't point to one specific
example. I think the big-name commercial static analysis tools all
attempt to do something in this space. Lighter-weight tools,
open-source tools: it's mixed, I think, based on the language. Java has
some very interesting default cryptographic choices. Things like, if
you encrypt with AES, you get ECB mode by default, and that's obviously
not cool, so a lot of tools will pick that up.
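To see why an ECB default is "not cool" without pulling in a crypto library, here's a toy Python sketch. HMAC-SHA-256 stands in for a block cipher's encryption function (both are deterministic under a fixed key); it is not real encryption, but it shows how ECB-style processing maps identical plaintext blocks to identical ciphertext blocks, leaking structure to any observer.

```python
# Toy demonstration of why ECB mode leaks patterns. HMAC-SHA-256 stands in
# for a block cipher here (deterministic under a fixed key, like a block
# cipher's encryption function); this is NOT real encryption.
import hashlib
import hmac
import secrets

def toy_ecb(key, plaintext, block=16):
    """Split into blocks and 'encrypt' each one independently, ECB-style."""
    blocks = [plaintext[i:i + block] for i in range(0, len(plaintext), block)]
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

key = secrets.token_bytes(32)
ct = toy_ecb(key, b"ATTACK AT DAWN!!" * 3)   # three identical 16-byte blocks

# Identical plaintext blocks yield identical "ciphertext" blocks: an
# eavesdropper sees the repetition even without knowing the key.
assert ct[0] == ct[1] == ct[2]
```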
Some Java-based concerns. So yeah, cool.
All right, I think we're good. Thanks, everyone.
