Welcome.
My name's Pete Frisella.
I'm a developer advocate on
the Google Analytics team.
Well, today I want to talk to
you a little bit about the
Google Analytics superProxy, and
what that actually means.
From a high level, it really means you can make your data public, to drive a lot of different things like widgets, testing, and dashboards.
So we'll just get
right into it.
We'll talk a little bit about
the agenda, and then we'll
dive right into a demo
and some examples.
So we're going to talk a little
bit about what it is,
so what is a proxy?
Some example use cases, and then a demo, and how to actually get started and do this yourself.
So we'll go from start
to finish type thing.
So what is Google Analytics
superProxy?
If you're familiar with some of
our other projects from the
developer relations team here
at Google Analytics, we've
done stuff like the
Google Analytics
easy dashboard library.
And we have the report
automation, which some of you
know as the magic script.
So these are products that we
developed to help users, and
developers also, to understand
things you
can do with the platform.
And also tools that solve some
more complex problems that
people are trying to solve.
So the superProxy's just another
one of these open
source projects that can be used
as a tool to make your
data public.
It's open source.
It runs on App Engine, so it's
an actual web application.
There's a few key objectives we
wanted to meet with this.
One was to make it really
easy, of course.
To make it scalable, so it suits
implementations where
you need that kind
of scalability.
Or you have a lot of visitors,
things like that.
And also we wanted to
make it extensible.
So we'll get a little bit into
what that means in a second.
But we wanted to make sure that
people could take it, and
transform it, and do their
own thing with it.
And provide different formats and stuff like that.
So with that, let's just get
into some examples and use
cases, and think about
maybe how you could
use this new tool.
So when you think about making
your data public, one of these
cases might be-- and we hear
this sometimes-- is that
people just want to take some
report from Google Analytics,
some data, and make that
publicly available.
Maybe on their website.
Or maybe you want to do it from an internal perspective: create a dashboard that you can share with everybody in your account, without having to worry about authentication and authorization.
So with Google Analytics, of
course you need an account and
you need to authenticate.
But this can be a challenge or
complex when you just want to
share data with a whole
bunch of people.
So if you have a website you
might want to show, for
example, like this is browser
share for the last seven days
for your site.
Or some other demographics,
maybe of your visitors.
Maybe you want to create this
page for advertisers that they
can come and see
it, who knows.
But the point is, you're trying
to make some public
data available.
There's a lot of different use
cases and scenarios where this
makes sense.
So if you think about this
case, where you have a
website, the pie chart, how
would you actually go about
doing this today?
Let's look at that scenario
and the process that you'd
have to go through.
So with any website, you have a web page at the minimum, obviously.
And that's usually served from
a web server, some server.
To do this implementation for making data public, there are a few steps you'd have to go through to actually accomplish this.
The first thing is obviously,
we have to get past this
complex issue of
authentication.
You need to make requests
to Google
Analytics for this data.
And you can use the Core
Reporting API and other Google
Analytics APIs to make
requests for data
programmatically.
But you need to do
authentication.
And a lot of the time, OAuth 2.0 is the recommended approach for authentication. And you need to actually interact with Google Accounts, as for all the Google APIs.
And once you have that token,
you need to save it.
And you need to also manage
the whole process of
refreshing tokens, making the
requests using the token.
So this is a whole process
in itself.
And this is outside of Google
Analytics necessarily, but
it's part of any APIs that
you work with at Google.
And once you've got that token,
now someone comes and
visits your website.
And you want to serve this page
to them with this chart
that has this browser
share for example.
So you'd have to make the
request to Google Analytics
through the API.
And you'd have to use that
token that you stored for
authentication.
And it would come back with a
response, with the data that
you've requested.
Now, once you have that data, you have to parse it.
Because you need to pull up
the information that you
actually want to use
and display.
And you have to transform that
in some format that would work
with-- for this example, we're
using the Google Charts API.
So you'd have to make sure the
data is in a certain format.
In this case, let's say data
table format, to work with the
Charts API.
So you'd have to do that
yourself and write the script
on the server side.
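To make that manual server-side step concrete, a parse-and-reshape script might look roughly like this. This is a minimal Python sketch; the response shape follows the Core Reporting API's JSON output, but the helper names are illustrative:

```python
# Minimal sketch of the manual server-side step: parse a Core
# Reporting API JSON response and reshape it for a chart.
# (Helper names and structure are illustrative, not superProxy code.)

def to_chart_rows(response):
    """Return [[header...], [row...], ...] from a Core Reporting response."""
    headers = [h["name"] for h in response.get("columnHeaders", [])]
    rows = []
    for row in response.get("rows", []):
        # Metric values come back as strings; coerce numeric columns.
        rows.append([_coerce(v, h) for v, h in zip(row, response["columnHeaders"])])
    return [headers] + rows

def _coerce(value, header):
    return int(value) if header.get("columnType") == "METRIC" else value

fake = {
    "columnHeaders": [
        {"name": "ga:browser", "columnType": "DIMENSION", "dataType": "STRING"},
        {"name": "ga:visits", "columnType": "METRIC", "dataType": "INTEGER"},
    ],
    "rows": [["Chrome", "120"], ["Firefox", "45"]],
}
print(to_chart_rows(fake))
# [['ga:browser', 'ga:visits'], ['Chrome', 120], ['Firefox', 45]]
```

You'd still have to wrap this in request handling, caching, and refresh logic yourself, which is the point of the next few steps.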
And at that point, you could show the user the chart and they could see this.
And this would work for anybody
that visited your site
because you're doing everything
server side.
And it would be public and
everybody could look at it.
And you would accomplish what
you're trying to do.
But it's probably not
what you want to do.
So for every visitor to your
site, you don't want to have
to go to Google Analytics to
say, give me the data.
And then come back, parse it.
You actually want to save it. So you probably want to put the response in some kind of data store or database on the server side.
All right, so now you
have it saved.
But now that it's saved, you want to make sure it gets refreshed or updated on a regular basis.
You don't want to be serving
data that's two weeks old, and
you don't want to have
to manually do this.
So you actually want to have something like a refresh that's done automatically, at a set interval.
Maybe every hour, or every two
days, or one day, depending on
the data in your account
and what you
think might make sense.
So in that case, the
whole system looks
something like this.
Where you have the web server,
all these different components
that are each doing
their own part to
serve this data publicly.
This is really complex, right?
I mean, you're doing
things server side.
You'd have to write all this code and these scripts to get all this to work.
And really you just want to show
a pie chart that has some
data about your browser share.
And you want to do this
in a scalable way
that's kind of easy.
The other thing is, when you
think about saving data and
caching it, again it's a little
bit more complex.
But ultimately, if you got this system in place, more visitors would come.
And you would actually be
serving the data from the data
store or the database.
Which is more efficient, would
save you on quota,
and things like that.
The other thing is that maybe
you can't control what's on
the web server.
And maybe you don't even have
the option to write code or
write scripts.
In that case, the only thing you can usually do is provide a JavaScript snippet on the HTML page. And that's about as extensive as the changes you can make to a site. And in that case, you wouldn't even be able to do this implementation, because you wouldn't have access to the server itself.
So it's complex.
Obviously, we understand this.
And that's where the Google
Analytics superProxy can come
in and take away a lot
of this complexity.
So let's look at an implementation where you've deployed your own instance of the Google Analytics superProxy. If you look at where those components would fit now, it looks something more along the lines of this.
You do all that authentication
with the Google Analytics
superProxy.
So you've given access to that
web application to access your
Google Analytics data.
And it does this all through
a web interface.
It's all a web flow, so you
don't have to worry about
writing code for that.
Once it's got that token,
it'll manage all of the
refreshing and getting
valid tokens.
And it'll communicate directly
for you with the Google
Analytics servers and APIs.
And it'll then handle the
responses from Google Analytics.
And once it has a response for
you, it'll save it for you in
a data store automatically.
It will also refresh it for you, of course, and manage all of that. And it will handle the time interval. So you can say, I want to refresh it every hour, and it will take care of that for you.
You don't have to worry
about manually going
and refreshing this.
And then finally, and most
importantly, one of the big
things about this is that
it will do some
transformations for you.
It'll take the response format from the Google Analytics API-- JSON is the default format-- and transform that to different things like CSV, Data Table, TSV.
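A sketch of that kind of transform, assuming the Core Reporting API's JSON response shape (the function name is illustrative, not the superProxy's own code):

```python
import csv
import io

# Sketch of the JSON-to-CSV/TSV transform the superProxy performs.
# (Function name is illustrative; this is not superProxy's own code.)

def response_to_csv(response, delimiter=","):
    """Render a Core Reporting style response as CSV (or TSV) text."""
    out = io.StringIO()
    writer = csv.writer(out, delimiter=delimiter)
    writer.writerow(h["name"] for h in response["columnHeaders"])
    writer.writerows(response["rows"])
    return out.getvalue()

fake = {
    "columnHeaders": [{"name": "ga:sourceMedium"}, {"name": "ga:visits"}],
    "rows": [["google / organic", "500"], ["(direct) / (none)", "210"]],
}
print(response_to_csv(fake))
print(response_to_csv(fake, delimiter="\t"))  # TSV variant
```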
And again, like I mentioned earlier, one of the things we wanted to make sure was that this was extensible. Since it's open source, it's obviously possible for other people to write different formats that it can transform to.
So now we've removed
all that complexity
from the web server.
And now it's taken care of for you by the Google Analytics superProxy that you deploy.
And then when you create
queries, you actually get
these public endpoint URLs.
And I'll explain that
in a second.
But if you think about that
scenario now, you
have the web server.
Which just might be an HTML page
being served somewhere.
It doesn't really matter
at this point now.
And in that page, you're going
to actually make the request--
the client is going to make the
request, or the visitor is
making the request directly
to the superProxy.
In this case, it's pulling the
data directly from superProxy.
It's cached, it's fast,
and it's scalable.
And now you're going to have this pie chart, or whatever visualization you're doing, being driven through this proxy.
And it's a public URL.
So anybody can visit that
URL and get this data.
And because it's cached and going through the superProxy, you're going to save on quota.
It doesn't matter how many visitors come in.
It's going to scale up.
App Engine is great for
that, obviously.
So we've [INAUDIBLE] all
this complexity.
It's a little bit nicer now.
We'll explain in a second how
you actually get this and
deploy this thing.
But let's take a step back for
a second and think about what
this actually means when you
say public versus private.
What does that mean?
So it's an App Engine
web application.
It runs on App Engine,
it's a superProxy.
And you're the admin of it.
So you deploy it
and you run it.
And you manage it.
Once you've authenticated,
that's all taken care of for
you, the tokens and things.
And what you do then is
you create a query.
And you say I want to
make a new query
public, new data public.
So you go in there and
you create a query.
And you specify what the
query should be.
What dimensions and metrics.
So kind of the standard stuff that you would do for any Core Reporting API query. And then it's going to create the query for you.
And it'll assign an
ID to the query.
So for example, we have a query here for country and visits, and it's got an ID of 12345.
It gets saved to the data store
in the Google Analytics
superProxy.
And only you have access to this
application, because you
are the administrator of it.
And then what happens is there's
a public endpoint or
URL that's pointing to your
instance of this application.
And you can give out this URL.
And you can give people the IDs
for the queries that you'd
like to make public.
And these can be
used anywhere.
So requests can be made to them directly. Or you can use them as part of a web dashboard.
The point is that these
URLs that you
provide will be public.
So for example, in this case you have a public URL-- it's hosted on App Engine, so you have an appspot.com domain.
But you can use your own domain
on App Engine, that's
definitely possible.
And you would provide this
URL with the query IDs.
So query equals 12345.
And you give that URL out.
And that URL will then query the superProxy, which will retrieve the publicly stored version of that response and return it back to the user.
So you're enabling these certain
queries that you want
to make public by using the
superProxy, and using these
URLs that become what you can
consider public endpoints.
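Putting that together, constructing one of these public endpoint URLs might look like this. The host and the query parameter name follow the example above; confirm the exact path and parameters against the URLs your deployed admin interface actually shows you:

```python
from urllib.parse import urlencode

# Illustrative construction of a superProxy public endpoint URL.
# The appspot host is a placeholder, and the query id is the one
# assigned when you create the query in the admin interface.

def public_endpoint(host, query_id, fmt=None):
    params = {"query": query_id}
    if fmt:
        params["format"] = fmt  # e.g. csv or data-table (names illustrative)
    return f"https://{host}/query?{urlencode(params)}"

print(public_endpoint("myproxyapp.appspot.com", "12345"))
# https://myproxyapp.appspot.com/query?query=12345
```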
So let's do a little
bit of a demo.
For example, I have something
here running.
So this is a pie chart.
This is actually coming from an
instance of superProxy that
I've deployed.
These are public URLs.
And you'll see that this file
here is just an HTML file
sitting on my desktop.
It's not hosted anywhere.
It's just a simple HTML file.
If you look at the source-- so
I have two charts on here.
This is actually coming directly
from the superProxy.
If you look at the source,
you'll see all this JavaScript
here is just standard Charts.
So this is Google Charts,
which is a
visualization library.
It used to be called gviz.
This is all standard
JavaScript.
There's nothing really here
customized, other than the
stuff that's like directed
towards superProxy.
So for example--
[AUDIO DROPS]
for the data source,
use this endpoint.
And this endpoint is
the superProxy.
And then we're just doing a
couple things like we're
setting the refresh interval.
And we have a couple
configuration options like the
title and stuff like that.
But none of this is customized
JavaScript or
anything like that.
The only thing we're really doing is pasting in these certain values, like the URL from the superProxy.
So I could give this
file to anybody.
[AUDIO DROPS]
So let's take a look, a little
bit about how you would
actually go ahead and do this.
So it's kind of three steps
to deploy the app.
That's the first thing
you need to do.
This is a one-time
configuration.
You need to deploy the app and
run your own instance of this
on App Engine.
App Engine provides some free
quota, which is probably good
for most use cases.
And you probably wouldn't need
to go beyond the free quota.
So it's free to set
up and create.
And all the source files are
available on GitHub.
So the first thing to do is
get the app from GitHub.
There's a link there.
We'll provide these
resources later.
The second thing is you need
to set up and configure the
application.
And I'll show you how to
do that in a second.
And then you want to deploy
this thing to App Engine.
So this will give you your
actual appspot.com hosted
application.
So start off with GitHub.
This is the actual repository
for the superProxy.
So you can come in here
and download it.
And there's a whole bunch of
instructions here, and more
information on how to
actually do this.
So if you want more detailed
instructions,
please visit this site.
But once you've downloaded it to your machine, you've got the source.
And I have it actually sitting
over here in this folder,
Google Analytics superProxy.
So we'll show that
in a second.
The other thing you need to do
is you want to create an
actual application
on App Engine.
So you can sign up
for an account.
And you create an identifier.
So for example, it could be
like My Proxy App, or
something like that.
You want to check if it's
available and create the app.
But this application identifier is important, because we need to use it.
So in this case, I've
got My Proxy App.
So we'll remember that
in a second.
So you create the application.
And you also need to create an API project. So we'll go to the APIs console, and we'll create a new project. And we'll call it Proxy Project.
[AUDIO DROPS]
integration options.
What's important here is that
you change this here to point
to your new App Engine instance
that you just created
a second ago.
So in our example, ours
was My Proxy App.
And it would be .appspot.com.
And this is all in the
instructions also.
But the callback URL for OAuth is /admin/auth.
So that's the one configuration
that you have to change.
And then we don't need
to worry about this.
And you create the client ID.
[AUDIO DROPS]
download it also.
There's a source folder. And within that, there are a few things you need to change. One is the app.yaml file.
So if we open that, we'll see
that at the top of that
there's a first line.
And it says application.
And this is where you specify
the ID that you just created
in App Engine that you want to
use for your instance of this.
So it would be My Proxy App.
Save that.
So then we're done
with that file.
It's configured.
The other thing is we also want
to configure the client
ID for the OAuth 2.0.
So there's a config.py file here.
We'll open that one and you'll
see the same thing.
There's a few fields that
we have to fill out.
And it tells you which
ones to replace.
But we need a client ID.
So in this case, we would go
back to the APIs console and
we would copy the client ID.
Copy that and replace this.
We'd also copy the
client secret.
And again, we'd put that into
the OAuth client secret.
And we're going to deploy this
right to App Engine.
But you could also run a local
environment if you wanted to.
And there's instructions
on how to do this.
But for this point, we're just
going to do a redirect URI.
We only need the host
name and domain for
this particular instance.
We don't need to worry
about the URL part of
it, or the page path.
So in here, for the OAuth
redirect URI, I would just put
proxyapp.appspot.com.
And it automatically will take care of the /admin/auth part for us.
And then there's this secret
phrase down here, which is
used for cross-site stuff.
So you can just put some kind
of unique thing down there
that you don't share
and keep secret.
So all this should be kept
secret, and you're the only
one that has access to it.
But we save that, and
we close this.
And then you want to launch and
deploy this on App Engine.
So the way to do that now is--
well, I guess there's one more
thing you might want
to configure.
And this is optional-- in the controllers util folder, there's a co.py file.
And this is the constants file
that you can-- there's a
couple things you can
configure in here.
It's up to you if you
want to do this.
But it really depends on how
you want to deploy this.
But there's two things.
One is anonymized responses.
And this is set to
false by default.
But here's what it allows you to do if you set this to true.
Any response you get back from
Google Analytics usually
contains information like the
profile ID, the account
information, things like this
that are part of the request
that you usually make.
So in this case, if you set this to true, those identifying keys will be removed from the responses you get back.
So we'll remove stuff like the
query itself, which contains
profile information.
And the account ID
and web property.
Now this isn't a truly private
thing, but these are values
that you may not just want to
make available and share.
So if you set that to true,
that's what will
happen in that case.
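The anonymizing step described above amounts to stripping identifying keys from the stored response before serving it publicly; a rough sketch, with an illustrative key list:

```python
# Sketch of what the anonymize option does: strip account-identifying
# keys from the stored response before serving it publicly.
# (The key list here is illustrative, not the superProxy's own constant.)

KEYS_TO_REMOVE = ("query", "profileInfo", "id", "selfLink")

def anonymize(response):
    return {k: v for k, v in response.items() if k not in KEYS_TO_REMOVE}

fake = {
    "query": {"ids": "ga:111111"},
    "profileInfo": {"accountId": "22222"},
    "columnHeaders": [{"name": "ga:visits"}],
    "rows": [["500"]],
}
print(sorted(anonymize(fake)))
# ['columnHeaders', 'rows']
```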
And the other thing is that we
provide the functionality for
relative dates.
So you don't have to specify
in your query start date is
July 23rd and end date is July
29th, or something like that.
You can actually specify
relative dates.
And those will automatically
get resolved for you every
time a query is made.
So in that case, we need to determine what time zone we should resolve these dates in.
And by default, it's Pacific.
But you can come in here
and change this to--
so right now it supports North
American time zones and UTC.
So if you're in the Eastern time zone and you want to make sure that your queries are resolved to Eastern time when they execute, you can do that.
But for now, it's Pacific.
So we'll close this.
So those three files we edited were the app.yaml, which was just putting in your ID; the config.py, which is for your OAuth 2.0 client details; and the co.py, which is in the utility folder and allows you to configure a few different options.
But this is the only time you need to do this.
Once you've deployed the app,
it's going to be running and
you won't have to
do this again.
But let's look at how
you would deploy it.
So if you have the Google App
Engine Launcher installed--
and again, there's detailed
instructions on the GitHub
site for this--
you would actually just go to
Add Existing Application.
Browse for it.
And in this case, so I
have the superProxy
folder sitting here.
And here's the source.
And you just want to go to the
folder that has the app.yaml
in it at the root.
And you choose that folder.
And you say Add.
And you'll see, now it's
added to the launcher.
And then you can just
right-click it or go to
Control and go to Deploy.
And it'll ask you for your
Google account for App Engine,
and it will deploy
to App Engine.
And once it's deployed,
it's ready to be used.
And as long as you have the same
account, Gmail account or
Google account, for what you've
deployed and what App
Engine instance you created,
then you'll be an admin of the
actual application.
OK, so we won't actually
deploy that.
So let's take a look at--
so this is the one time
thing you've created.
It's deployed, and now you can
actually use superProxy.
Let's see what that entails and
how you'd actually create
the query and get
this working.
So there are three things to do here.
So if you want to go from a
blank page, to this, where we
have pie charts and things that
are getting data from
public endpoints, I have
a file sitting here.
So this is just a
how-to example.
If you look at the source for
this file, you'll see that
it's got just the standard
JavaScript here, which is
nothing customized.
You could copy and paste this
yourself and use this.
And I could provide
this later.
There's a couple things
we need to get.
So let's create a query, and
let's see if we can get our
chart to show up on this.
So let's start by going to
the superProxy itself.
So once you've deployed it, you
can visit the superProxy
by going to the host
name /admin.
And that'll get you to
this particular page.
And the first thing you need to
do that initial time is to
authorize the application
to access your
Google Analytics data.
So you'll see that there's an
Authorize Access button.
And you click that.
And it'll go through
the OAuth 2.0 flow.
And you'll accept it, and we've
successfully connected.
So now that we've connected--
I'll just change this--
we can actually start creating
queries now.
And you'll see something like
this when you come in after
you've authenticated.
And you only need to
authenticate the first time.
And after that, it'll
stay authenticated
until you revoke access.
So it is only a one-time
thing you need to do.
So what we want to do is create a query.
So we want to make
something public.
Let's do something like
we'll share the
source medium and visits.
And we'll do a pie chart
to show this.
So we'll go to Create Query.
And you have this interface
here to
actually create the query.
So right now, we're using the Core Reporting API, obviously, to make these requests. And what it's asking for here, in the admin interface, is the actual Core Reporting API query that you want to make.
So I suggest you can use
something like the query
explorer to actually do
this query and to get
the data you want.
And then just copy the API
query and put it into the
superProxy to use it.
So for example, we'll do something like ga:sourceMedium, which is a new dimension we actually just released recently. And we'll do ga:visits for the actual data that we want to share. And we'll sort it by ga:visits descending.
And the date doesn't really
matter, so we'll
just get the data.
So this looks good.
And actually, we'll
just do top five.
So I'll do max results five.
And this looks like the
data I want to share.
So this looks fine.
The dates don't really matter
because we're going to use
relative dates.
We don't want it to be
a static report.
We want it to be moving
over time.
So I'll grab the URL here. We'll just copy this URL, which is the actual Core Reporting API request.
And we'll go back to
the superProxy.
So we'll name this, let's say,
Top Five Source Mediums.
And we're going to do it for
the last seven days.
We'll refresh this, let's just say, once a day or maybe twice a day. And this is in seconds. So we'll say around 43,200 seconds-- you have to do the calculation.
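That calculation is simple: refresh intervals are given in seconds, so for example:

```python
# Refresh intervals are specified in seconds, so convert from the
# schedule you have in mind.
HOUR = 60 * 60

intervals = {
    "every hour": HOUR,        # 3,600
    "twice a day": 12 * HOUR,  # 43,200
    "once a day": 24 * HOUR,   # 86,400
}
for label, seconds in intervals.items():
    print(label, seconds)
```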
And then we'll paste in that
URL, which is the query URI
for the particular data
that we want.
And then we're going to
replace the dates with
relative dates.
So you can see here, there's
some supported date
parameters.
So one of them is today, which
will resolve to today's date
when the query is executed.
And then we have n days ago.
So with these two things, you
can pretty much create any
relative date query
that you'd like.
So we're going to remove
the end date here and
change it to today.
And we'll change the start
date to six days ago.
So six days ago and today will
give us seven days of data.
So we'll change this
to six days ago.
So any time this query gets
executed, it's going to, at
that time, resolve the
dates properly.
So tomorrow it will be today,
will be tomorrow, and it will
be continuing to going on.
So you're always going to have
the last seven days of data.
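The relative-date resolution described above can be sketched like this. The token spellings here are illustrative, following the "today" and "n days ago" parameters mentioned; check the supported parameters listed in the interface:

```python
import datetime

# Sketch of how relative dates resolve at query time: "today" and
# "Ndaysago" become concrete YYYY-MM-DD values each time the query runs.
# (Token spellings are illustrative; check your deployment's docs.)

def resolve(token, today=None):
    today = today or datetime.date.today()
    if token == "today":
        return today.isoformat()
    if token.endswith("daysago"):
        n = int(token[: -len("daysago")])
        return (today - datetime.timedelta(days=n)).isoformat()
    return token  # already a literal date

fixed = datetime.date(2013, 7, 29)
print(resolve("6daysago", fixed), resolve("today", fixed))
# 2013-07-23 2013-07-29
```

So six days ago plus today gives exactly the seven-day window, and the window slides forward each time the query executes.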
So we have the option to test
the query, which we'll do.
And oops, we made a mistake here. Every 43,200 seconds.
So the initial one should be scheduled and running. Probably in the next few seconds or so, we should be getting a response back.
But you'll see this interface
now for each
query that you have.
And it has all this information
about the query itself.
So we have the name that we
just gave it, obviously.
And we have this URL.
And this URL here is
the public URL.
So you can give this URL to
anybody, and they will get the
response back from the query.
So this gets them around authentication. It's a public URL that works.
Then we have formats.
So we can do stuff like
CSV, Data Table.
And those will provide the
same response, but in the
specific format.
And then we can see what the
original query was that we're
actually using to drive this.
And then we have other things
like scheduling.
Like, is it running right now?
It's currently scheduled to run every 43,200 seconds.
When was it last refreshed?
And then we also have stuff like, what was the last saved request we had?
What does the response
look like?
And what's the request count?
How many times have people
actually requested this URL
externally?
And then what was the last
time they requested it?
So if we click this
URL, if it's run
already, we'll see a response.
So this is the public URL, at proxytest.appspot.com.
And this URL is now public.
Anybody can visit this URL
and get this data.
You'll see the response here is
from directly what it would
look like from the core
reporting API response.
It's just the same response.
But because we enabled the
anonymized responses, there's
actually no profile
data in this.
It just has the raw
reporting data.
And no account information
and profile data.
So this is now published.
It's cached, it's available
for us to use.
If you click on the Data Table response format, we'll see that it comes back with the same query, but in Data Table format.
And we can actually use this
directly with the Google
Charts API, which is great.
We can drive our Charts API pie charts and all these things with this particular query.
If we refresh this, we'll
actually see some updated
information now for our query.
So for example, we know it was
last refreshed 59 seconds ago,
this particular data.
And it was last requested 19 seconds ago.
And it's been requested
two times.
And then we can see here the
actual response that we have
saved in the data store.
So this gives you some
information about
what's going on.
You can also pause scheduling.
You can refresh the query now.
If you don't want to wait another 43,200 seconds, you can tell it to refresh now.
And you could also disable the
endpoint, which will disable
it from being available
publicly.
So if you do that, then anybody
who tries to visit the
URL will get an error message
saying it's not available.
So now that we see the interface
and we have this
query created, let's actually
use it and try to drive this
pie chart that we have with
this data that we have.
So I'm going to go to
the data response.
This is the one we want.
We want to use Data Table.
And we'll copy this URL.
And now this is our
current case.
So if I refresh it, we'll
see we get nothing here.
And let's go to our source
for that page.
Again, all this JavaScript
here is currently just
standard JavaScript.
There's no special
customization or
anything like that.
We have a div down here that's
going to hold the chart.
And then we have this Google
Charts JavaScript that
interacts with the servers
to create these charts.
So there's a data source URL
that we want to replace with
our public endpoint.
And it doesn't need
to be HTTPS.
We can just make it HTTP.
So this is our data that points
to our source medium,
top five sources.
And we're going to just replace
the title with our own
little title.
Top five source mediums
last seven days.
And we can replace also--
there's a refresh interval.
Since we're updating this data twice a day, we can just change this to around the same interval or less.
It doesn't matter.
And this is just going to be
if someone leaves the page
open, this is how often it's
going to refresh that page.
But most times, you might
not need that.
So let's save this.
We have a response there.
And let's refresh the page.
And there it is.
So this has actually made a
request directly to the
superProxy and requested the
data that we just created.
And again, I could send this
HTML file to anybody or post
this anywhere.
And anybody who visits
it would be
able to see this chart.
And again, it's cached, so it's
fast [INAUDIBLE] quota.
And it's all being directed
through the superProxy.
So we did it.
Let's go back and talk a little
bit more about some
other stuff that we do.
But this is one example of what
you could do with it.
There are obviously a lot of other use cases where you could definitely implement this.
You could use other
Charts APIs.
What's important is that you
have to make sure the data
that you're using, the data
table response, is going to
work with certain
visualizations.
Because some visualizations
require
certain columns and data.
So as long as you have that
configured properly, you can
definitely just pop in the URL
to the data source URL
attribute of a Charts API and
automatically use the
superProxy as a source for all
your different charts.
So yeah, pat yourself
on the back.
If you just did that, you just opened up some data to the world through the superProxy.
So let's continue a little bit
and see a few other things
that I want to touch on to make
you guys excited about
using this thing.
There are more features coming, and things that you can do with this that aren't really apparent-- that you didn't see there, but are actually happening in the background.
So things like you can
do multiple users.
So although you've deployed this
yourself and you're an
admin for the application,
you can add
users through the interface.
You'll see there's a Manage Users option, and you can add other users who can come in, authorize their own Google Analytics account, and create their own queries that become public.
Because of the caching--
this is a huge thing-- you're
going to save a lot of quota.
If you have a lot of visitors,
it's going to
scale nicely for you.
Because you're not going to
have to hit the Google
Analytics API service
for each request.
You can rely on the proxy to
take care of that for you.
Auto scheduling. This is another
feature that happens
in the background.
So for example, just say you
create a query and you say, I
want to refresh this
every hour.
And you make the URL public.
But nobody visits the URL.
And it's been two hours and
nobody's visiting the URL.
What will actually happen is the
superProxy will look and say,
OK, this hasn't been requested
for a couple of hours.
We'll pause the scheduling
for now.
We won't make any more updates
to Google Analytics to get
more data, because nobody's
using it anyways.
And this is going to save
you on quota again.
Because now you're not
requesting data that isn't
necessarily being used.
Now subsequently, if someone
does visit and use that URL,
it will go and fetch the latest
data for that visitor,
return the latest
response to them, and
then automatically start
scheduling again every hour.
And it will continue to
do that, and does
that for any queries.
So again, it's saving
you on quota.
And it's a little bit just a
nice feature to have, to not
use up quota that doesn't
need to be used.
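To make the pause-and-resume behavior concrete, here's an illustrative sketch of that logic. This is not the actual superProxy implementation; the names and the two-hour threshold are just assumptions for illustration.

```javascript
// Illustrative sketch of auto-scheduling: pause refreshes when a
// query hasn't been requested recently, resume on the next request.
var STALE_AFTER_MS = 2 * 60 * 60 * 1000; // e.g. two hours, illustrative

function shouldRefresh(query, nowMs) {
  // Skip the scheduled refresh if nobody has requested the URL recently.
  return (nowMs - query.lastRequestedMs) <= STALE_AFTER_MS;
}

function handleRequest(query, nowMs) {
  // A visitor hit the URL: record it so scheduling resumes, and note
  // whether the cached response had gone stale in the meantime.
  var wasPaused = !shouldRefresh(query, nowMs);
  query.lastRequestedMs = nowMs;
  return {servedFromCache: !wasPaused, resumedScheduling: wasPaused};
}
```

The design point is the same as described above: no visitors means no refresh requests against the Google Analytics API, and the first visitor after a quiet period transparently restarts the schedule.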
The other thing is, we
also handle error
logging and error responses.
So it does happen every
once in a while that
you might get an error response
back from the service.
Maybe a token's
expired, or something's
happened that's unexpected.
In that case, when it does the
refresh to get the latest data
from Google Analytics, if it was
an error, it will log the
error and retry again.
But it won't return an
error response to the user.
So it will always return
the best [INAUDIBLE],
the most recent
successful response.
So if you continue to get
errors, we're always going to
return data that
actually came from a
200 successful response.
And this is great, so
users aren't going
to have broken charts.
You're not going to have error
messages showing up on pages.
And also, after you've hit
about 10 errors, it will
automatically pause the
scheduling for that query.
And it'll require the admin
to come in and
clear the error,
and at least address the
situation to see what the
problem is.
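Here's an illustrative sketch of that error-handling behavior, again not the actual implementation; the field names and the threshold of roughly ten errors are assumptions based on what's described above.

```javascript
// Illustrative sketch of refresh error handling: on a failed refresh,
// count the error and keep serving the last successful response;
// after roughly ten errors, pause scheduling until an admin clears it.
var MAX_ERRORS = 10; // illustrative threshold

function applyRefreshResult(query, result) {
  if (result.status === 200) {
    query.cachedResponse = result.body; // store the good response
    query.errorCount = 0;               // a success clears the streak
  } else {
    query.errorCount += 1;              // log the error, retry later
    if (query.errorCount >= MAX_ERRORS) {
      query.paused = true;              // admin must clear the error
    }
  }
  // Visitors always get the most recent successful response.
  return query.cachedResponse;
}
```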
And then also we support
callbacks.
So if you want to request maybe
a JSON response with a
callback because you want to
handle it and do some
client-side parsing, you can
definitely use a callback.
And this is described on
the GitHub readme.
So definitely it's another
feature that people wanted and
is part of this.
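As a sketch, and assuming the query URL accepts `format` and `callback` parameters the way the GitHub readme describes (the exact parameter names are worth verifying there), building the callback URL might look like this:

```javascript
// Sketch: append format and callback parameters to a superProxy query
// URL for a JSONP-style response. Parameter names are assumptions --
// check the superProxy readme for the exact ones.
function withCallback(queryUrl, callbackName) {
  var sep = queryUrl.indexOf('?') === -1 ? '?' : '&';
  return queryUrl + sep + 'format=json&callback=' +
      encodeURIComponent(callbackName);
}

// In the page you would define a handler and load the URL in a
// script tag, e.g.:
//   function handleResponse(data) { /* client-side parsing here */ }
//   <script src="[withCallback(queryUrl, 'handleResponse')]"></script>
```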
So for the future--
and we talked a little bit about
extensibility and what
was important for this thing--
in the future, you can imagine
that we have the [INAUDIBLE]
response, which works really
well with Google Charts.
But you can imagine, you might
want to provide formats that
work really well with other
visualization libraries.
And this is definitely possible,
because you can take
the response.
We've made it very easy to
transform it into a new format
that you can return.
So definitely, this is something
for the community,
and as much as we can, we'll try
to provide more formats
that are of interest to people.
Also, we can maybe do stuff
in the future like date
comparison, or being able to handle
calculated-metric kinds
of responses for certain
use cases.
But it's early on.
So we'll see where there's
demand and what people maybe
are looking for.
And then also, you could use
this as a testing platform.
Create some queries, and test
against the proxy instead of
the actual Google Analytics
service.
Which should save you
quota.
And if you're looking for a
specific, stable response, you
definitely can use this as
a platform for that.
And there's a whole bunch
of other use cases.
I'm sure people will figure
out what they can
do with this thing.
But it's freely available.
It's open source,
on GitHub, so I'd recommend
you check it out.
And we'll provide some
resources here.
Definitely go to the GitHub,
to the repo.
There's also a link on
developers.google.com, which
gives you a little more detail
about the Manage Users and
kind of other information
around the proxy that we
didn't cover today.
And then the chart wrapper,
which is that JavaScript I was
using as part of the example.
You can go to the Charts
developer's site.
And there's a link here.
And that'll show you
the JavaScript snippet.
And I can provide that
as a sample also.
But it's really just as easy
as providing the
URL as a data source.
And then of course, App Engine
and APIs Console for more
information on how to
deploy and use this.
So I want to thank you very
much for joining me today.
I hope this was useful
for you.
And I hope some of you will be
able to take advantage of some
of the features that
the proxy provides.
And definitely, download Google
Analytics superProxy
and let me know how things go.
Thanks, bye.
