PRIYANKA VERGADIA: Welcome
to Deconstructing Chatbots.
I'm Priyanka Vergadia,
and in this episode,
we will learn how to integrate a
Dialogflow agent with BigQuery.
If you have not checked
out our previous episode
on fulfillment, I
highly encourage
you to watch that before
you continue here.
We are going to use
the same appointment scheduler
chatbot that creates
appointments
in Google Calendar.
Today, let's enhance it to
send appointment information
to BigQuery to gain some
insights from this appointment
data.
So let's jump into
our Dialogflow console
and see how to set
up the fulfillment
to send data to BigQuery.
Open the Appointment Scheduler
agent in the Dialogflow console
and navigate to Fulfillment.
To integrate
with BigQuery,
the first thing we
need to do
is open package.json and
add the BigQuery dependency.
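As a rough sketch, the dependencies section of package.json would gain one line for the BigQuery client; the version numbers shown here are placeholders, so check npm for the current releases:

```json
{
  "dependencies": {
    "dialogflow-fulfillment": "^0.5.0",
    "@google-cloud/bigquery": "^1.3.0"
  }
}
```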
Now, before we make
edits to index.js,
let's first set up our BigQuery
data set in the GCP console.
Under Resources on the left
pane, click on the project ID.
Once selected, you will see
Create Data Set on the right.
Click on Create Data
Set and name it.
Once the data set is
created, click on it
in the left panel.
You will see Create
Table on the right.
Provide a table name
and create the table.
Click on the table
and edit the schema.
Add date, time, and type as the
three fields to hold the appointment
date, time, and type.
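For reference, that table schema can also be written in BigQuery's JSON schema format. The field names here are the ones this episode assumes; you can name yours differently as long as the fulfillment code matches:

```json
[
  { "name": "date", "type": "STRING", "mode": "NULLABLE" },
  { "name": "time", "type": "STRING", "mode": "NULLABLE" },
  { "name": "type", "type": "STRING", "mode": "NULLABLE" }
]
```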
Take note of your
project ID, data
set ID, table ID, and
the schema fields.
We will need all those
in our fulfillment.
index.js code is linked
in the description below.
First, create a
BigQuery constant.
We will scroll through the
Google Calendar setup code
for creating calendar invite.
Please watch the
previous episode
to learn how to integrate
Google Calendar with Dialogflow
Fulfillment.
In our Make Appointment function,
after the calendar event
is set, we call
our BigQuery function,
Add to BigQuery, which will
add the same appointment
information into the BigQuery
data set we just created.
Let's check out the
function, Add to BigQuery.
We are passing in the agent
and the appointment type.
Then we are modifying
the date and time
into a readable format.
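As a sketch of that formatting step, assuming Dialogflow passes the date and time parameters as ISO 8601 strings (for example, "2019-04-10T20:00:00-07:00"), a small helper could split them into plain date and time strings. The helper name is hypothetical, not from the actual index.js:

```javascript
// Hypothetical helper: split Dialogflow's ISO 8601 date and time
// parameters into plain strings for the BigQuery row.
function formatDateTime(isoDate, isoTime) {
  const date = isoDate.split('T')[0];             // e.g. "2019-04-10"
  const time = isoTime.split('T')[1].slice(0, 8); // e.g. "20:00:00"
  return { date, time };
}
```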
Uncomment the code and add
the project ID, data set ID,
and table ID to make
that BigQuery connection.
We are using our schema fields
for date, time, and appointment
type to create a row
entry into BigQuery.
At this point, we have all
the information we need.
We call our BigQuery
object to add the row in.
We finally catch if
there are any errors.
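Putting those steps together, a minimal sketch of the function might look like the following. The client usage follows recent versions of the @google-cloud/bigquery Node.js library, and the project, data set, and table IDs are placeholders to replace with your own:

```javascript
// Hypothetical sketch of the fulfillment's BigQuery step: build a row
// from the schema fields (date, time, type) and insert it.
function buildRow(date, time, type) {
  return { date: date, time: time, type: type };
}

async function addToBigQuery(date, time, appointmentType) {
  // Requiring inside the function keeps the rest of the file loadable
  // even where the dependency is not installed.
  const { BigQuery } = require('@google-cloud/bigquery');
  const bigquery = new BigQuery({ projectId: 'your-project-id' }); // placeholder

  const rows = [buildRow(date, time, appointmentType)];
  // Data set and table names are placeholders for the ones you created.
  await bigquery
    .dataset('appointments')
    .table('details')
    .insert(rows);
}
```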
Once you've made the required
edits to the index.js,
copy and paste it into the
fulfillment Cloud Function
within Dialogflow
and click Deploy.
Now we are ready to test.
In the simulator on the right,
query your Dialogflow agent
with something like "Set an
appointment at 8:00 PM on the 10th,"
and respond to the
follow-up questions
to provide the appointment
type.
I'm using license
as an example here.
You will see an
appointment setup response
in the simulator, and it
will show up on the calendar
as well.
Now navigate to BigQuery
and run a select star
query on the table to
verify that a row was
added for your recent test
with the appointment type.
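Assuming a data set named appointments and a table named details (placeholders for whatever you named yours), that verification query would look something like:

```sql
SELECT date, time, type
FROM `your-project-id.appointments.details`;
```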
From within BigQuery,
we can explore this data
to find interesting insights.
Click on the Explore
in Data Studio
button, which takes
us to Data Studio.
Here, say we want to know
how many appointments
are for license versus
registration versus others.
It is as easy as
dragging in a pie chart
and editing the
values we want to see.
I'm also interested in knowing
how many total appointments are
set up, and in another bar
graph that shows the
appointment breakdown per day.
Well, that is pretty amazing
in less than a minute.
Let's summarize what
we learned today.
We enhanced our appointment
scheduler chatbot
by creating an
integration with BigQuery.
We created a data set
and a table in BigQuery
and connected it
to our fulfillment.
Then we tested our agent
to confirm that the data is
being sent to BigQuery.
Finally, we were able
to explore the data
in Data Studio through
the one-click Explore feature
within BigQuery.
Comment below to
let us know how you
are trying this integration
in your specific use case.
I will see you in
the next episode
of Deconstructing Chatbots.
Until then, please like and
subscribe to our channel
for more such content.
