YUFENG GUO: Code reuse
is a central tenet
of software development.
Machine learning
should be no different.
Stay tuned to find out how to
use TensorFlow Hub to easily
load up and customize
state-of-the-art models
in your TensorFlow code.
Welcome to "AI
Adventures," where
we explore the art, science,
and tools of machine learning.
My name is Yufeng Guo.
And on this episode, we're going
to check out TensorFlow Hub.
Developing state-of-the-art
machine learning models is no
easy feat.
Large, state-of-the-art models
require lots of carefully
curated, balanced data to
train, and they take a long time
to train as well.
Now, TensorFlow Hub is a library
for the publication, discovery,
and consumption of reusable
parts of machine learning
models.
The primary use case
for the models in TF Hub
is for transfer learning.
Transfer learning is a technique
to base your machine learning
models on other large models
pre-trained on even larger data
sets.
This enables you to train your
particular customized model
on a smaller data set, improve
generalization, and speed up
the training process
since you're not
starting from scratch.
Because TF Hub is
integrated with TensorFlow,
you end up pulling in sections
of a TensorFlow graph.
And so the library makes
it super easy to load up
these models to be fine tuned.
The resources in TF Hub
are known as modules.
A module is imported
into a TensorFlow program
by creating a module object
from a string containing
its URL or file system path.
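As a rough sketch of what that looks like in code (this assumes the TF1-era `tensorflow_hub` API the episode is describing; the module URL is just one example of a module published on tfhub.dev):

```python
import tensorflow as tf
import tensorflow_hub as hub

# Create a module object from its tfhub.dev URL. A local file
# system path would work the same way. The module's piece of
# graph is downloaded, cached, and spliced into your own graph.
module = hub.Module("https://tfhub.dev/google/nnlm-en-dim128/1")

# Calling the module applies it to your input tensors.
embeddings = module(["Code reuse is a central tenet of software development."])

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    vectors = sess.run(embeddings)  # one embedding vector per input sentence
```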
Now, there are
already many models
available on TF Hub
with more on the way.
The two broad categories
of model types
available today are
images and text.
The image models are
trained extensively
to classify objects and images.
And by reusing their feature
detection capabilities,
you can create models that
recognize your own custom
classes using much less
training data and time than it
took to make those
more advanced models.
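A minimal sketch of that reuse, assuming the TF1-era `tensorflow_hub` API, an illustrative feature-vector module URL from tfhub.dev, and a made-up count of five custom classes:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Load a pre-trained image module as a feature extractor.
features = hub.Module(
    "https://tfhub.dev/google/imagenet/inception_v3/feature_vector/1")

# Ask the module what input size it expects.
height, width = hub.get_expected_image_size(features)

# Your own batch of images, resized to the module's expected shape.
images = tf.placeholder(tf.float32, shape=[None, height, width, 3])
feature_vectors = features(images)

# Only this small classification head is trained from scratch,
# on your own (much smaller) dataset of custom classes.
logits = tf.layers.dense(feature_vectors, units=5)
```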
Some of the image
models available today
include Inception V1,
V2, and V3; MobileNet
in all sorts of configurations;
NASNet and PNASNet;
as well as ResNet.
Most of these models
were, at one point,
state-of-the-art image
classification models,
and they serve as great starting
points for many image-based
machine learning applications.
Now, the text-based models,
they're no slouch either.
Currently, there's the Universal
Sentence Encoder; ELMo,
which is a model
trained on the One Billion
Word Benchmark; and NNLM, which
is a neural network language
model trained on the
Google News dataset.
So yeah, there's some pretty
cool models on TensorFlow Hub.
OK.
Two final aspects that I want
to cover about TensorFlow Hub
are the guides to using
the modules, which
are very important,
and how you can publish
your own modules to the hub.
First, the TF Hub website
has a number of great guides
that show you how to load up
and fine-tune models for your use
case.
For example, this
guide here shows
how to do text classification
of movie reviews
with the NNLM model.
You can take this
and easily swap out
the dataset, the model,
or both to your liking.
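As a sketch of how little that swap costs, assuming the TF1-era `tensorflow_hub` and `tf.estimator` APIs from that guide (the module URL and feature key are illustrative):

```python
import tensorflow as tf
import tensorflow_hub as hub

# Swapping the model is a one-line change of module URL; any
# text-embedding module with the same output size drops in
# without touching the rest of the code.
embedded_text = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-en-dim128/1")

# The embedding column plugs straight into a canned estimator,
# here a binary classifier (e.g. positive vs. negative reviews).
estimator = tf.estimator.DNNClassifier(
    hidden_units=[64, 16],
    feature_columns=[embedded_text],
    n_classes=2)
```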
So TF Hub-- it can't be a hub
without your contribution.
As you develop new
and interesting models
with your own data, consider
creating a TF Hub module
that others can then reuse.
You might find collaborators
who can improve upon your models
even further or discover
use cases that you
hadn't thought of.
TF Hub is not just the place to
find machine learning models.
It's an entire framework for
publishing and consuming models
all with an easy-to-use
and consistent interface.
This, I believe,
is critical, as I
think it will enable
us to think in a higher
level of abstraction.
This leads to more innovative
ways of remixing and reusing
existing models, expanding
what is possible
with machine learning.
Thanks for watching this episode
of "Cloud AI Adventures."
And if you enjoyed
it, please like it
and subscribe to get all
the latest episodes right
when they come in.
For now, check
out TensorFlow Hub
and see how you can build
on top of some of the most
advanced models around.
