[MUSIC PLAYING]
VANESSA JULY: Hi, everyone.
Thank you so much for
joining us this afternoon.
My name is Vanessa July.
And I'm a Cloud
Customer Engineer
for Higher Education at Google.
That means that I work with
customers like yourselves
on applications that
touch the student
body, the staff
at an institution,
and even, in some use cases,
distance learning and online learning.
So today, you'll hear
a little bit from me
on what Google's vision
for an AI-first institution
looks like.
Most of the time,
actually, you'll
be hearing from people just
like yourselves and institutions
who have implemented
some of this technology
and what that was like--
what that experience was
like for them and
some of the use cases
that came out of that
and, potentially,
even some of the roadmap
opportunities that
exist for those institutions.
So I think a lot of us here know
that the path for a student,
from prospective all the way
to employed professional,
is not one step.
And so Google really
took this to heart.
And we think of it and translate
it into a student lifecycle.
And what that means
is, there are so
many places along
a student journey
that we can interject
with technology.
And we think about how
we can use technology
to enrich that experience.
This is much more than what
you see on the screen here.
But we're going to focus on
step two and a little bit more
on step four.
When you have a student
that is at your university,
how do you help them?
And how do you deploy
an agent and not
just use ML as a buzzword?
How do you actually interject
that into the student lifecycle
to help that student as they
work through your university?
And so what you'll see
is Dialogflow is really
the agent that enables this.
Very simply, Dialogflow is a
conversation-building tool.
And it does so in
a very powerful way
by using natural
language processing
to take the way that a student
or a professor, a teacher,
or someone would speak
naturally, and translate that
into something
that you might have
heard before called intents.
And you can think
of that by imagining
all the different
ways that someone
might ask about the weather.
So you can say, how
hot is it today?
Will it rain today?
What's the temperature
going to be like this week?
And traditionally,
it may have been
very rigid and
difficult for people
to interact with
an agent because it
required a very specific input.
And it's also very
difficult for developers
to create that agent by
hard-coding very rigid language
and contexts into that agent.
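To make concrete what that rigid, hard-coded style looks like, here is a toy sketch -- not Dialogflow itself, just an illustration of the approach it replaces. The agent only recognizes exact phrasings, so any wording it wasn't given falls through:

```python
# Toy illustration of the rigid, hard-coded approach described above:
# the agent only recognizes exact phrasings, so anything new falls
# through. This is the style that Dialogflow's NLP removes.
WEATHER_PHRASES = {
    "how hot is it today",
    "will it rain today",
    "what's the temperature going to be like this week",
}

def match_intent_rigid(utterance):
    """Exact-match 'intent detection' -- brittle by design."""
    text = utterance.lower().strip("?!. ")
    if text in WEATHER_PHRASES:
        return "weather"
    return "fallback"  # any unanticipated phrasing fails
```

Dialogflow, by contrast, trains a model on a handful of example phrasings and generalizes to wordings it has never seen.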
And so what Dialogflow
does is remove that,
meaning that your user, your
student, your staff member,
can interact the way that
we talk very naturally.
And all of the natural
language processing
that is built into Dialogflow
translates that for you
into something
actionable in your bot.
And so today,
you're going to hear
from three different
institutions on how they've
implemented a lot
of this technology
and how they were able to
build faster to iterate,
to bring in student
input and focus
group input to deploy
something very quickly.
You'll hear about how the
integrations that Dialogflow
offers allow you to meet
your student and your staff
member where they are.
So this means they can
interact on their phones.
They can interact in
the LMS, in the LRS.
They can interact in
the student portal
how they're used to interacting.
And you'll hear about
how, when you're
able to meet your students
and your staff where they are
and how they want to interact,
how that really helps you
to maximize those investments,
not just in technology,
but also in the information that
you're able to collect and then
give back to your user base.
And so with that, I'm going
to bring Greg Reynolds up
to the stage to tell
you about how they've
done that at Case Western.
GREG REYNOLDS:
Thank you, Vanessa.
Hello.
My name is Gregory Reynolds.
I'm an application developer
at Case Western Reserve
University.
And for those of
you that don't know,
Case Western Reserve University
is a mid-sized research
institution with lots of
undergrad and grad students
that all intertwine.
And like all
universities, we have
a vast amount of information
for our students to consume.
And our job, as
technologists, is
to bring that information
closer to the students.
This saves them time so they
can focus on their learning.
One of the ways that
we're doing that
is through our chat
bot, Spartan Answers.
Spartan Answers is built on the
technology of Google Assistant,
using Dialogflow, infused
with our information.
So the project to
create Spartan Answers
was a very unique
project for us.
It wasn't a top-down
approach, like most projects.
Right before the
second quarter of 2018,
our CIO asked those of us
in university technology
who had an interest in
chat bot technologies
to just self-group.
So we did, and we had a really
good group of people that
decided, let's talk about this.
Let's figure it out.
And we had our own roadmap.
We figured out our own
audience and direction to move.
And we're really excited
about what happened.
So that all started around the
first part of second quarter
2018.
And within about a month,
one of our developers
found some code on GitHub,
wrote a web application
using AWS and Amazon Lex, which
worked well as a prototype.
But it didn't allow us to
integrate as quickly as we
wanted to with our services.
And it also didn't work
really well on mobile devices.
So we knew we had
some work to do.
So fast forward to
July, last Cloud Next.
I met Vanessa.
And we started
talking about how
Case Western could benefit from
Google and Google technologies.
And then, in August, Google
released Google Assistant
to G Suite for Education,
which opened the doors for us
to use Google Assistant
for our chat bot.
So I started exploring
how to use that.
And within a couple of
days, I had feature parity
with our original prototype.
And then, a couple
more days after that,
I was able to integrate with
our location information
and our campus events
information really easily.
This also allowed us to have
easy access on mobile devices
because Google Assistant already
has an Android and an iOS app.
So we were very
excited about that.
And we decided to
switch our platforms
to Google Assistant
and Dialogflow because
of that prototype.
So in quarter four
of 2018, we started
demoing that project to
the different faculty
and staff and our students.
And the student focus groups
gave us a really great idea
that we were on the right track.
But they also helped us build
our roadmap for the future.
And then, in February, we
released our product to pilot.
And it's been very
well-received.
And we're very excited about it.
So I want to talk
briefly about where
we're at, how we built it,
and where we're going next.
So to explain where
we're at, I want
you to imagine that you
are a first-time student.
You're stepping foot on
Case Western's campus
for the first time.
It's August.
The weather's really nice in
Cleveland, Ohio at that time.
So what is the first question
you as a first-time student
might ask?
Well, I know most
first-time students would
ask, how do I connect to Wi-Fi?
So you pull up Spartan
Answers, and you ask it,
how do I connect to Wi-Fi?
And as you see
on the screen, it
brings up a nice chat interface.
In a chat bubble, it
tells you that it found
the following information.
And then it uses the
material design card view,
which also allows for a link
that you can click through
for more information.
So you know how to get on Wi-Fi.
So you turn on Wi-Fi.
And you're, like, OK,
I need to buy my books.
So you ask, where's
the bookstore?
So it shows location, a link
to Google Maps for navigation.
It shows the building
title and picture
so you know exactly
where you're going.
And third, you might want
to know what's happening
this weekend, so you ask that.
And sure enough, it shows a
list of the events happening
this weekend so you can
enjoy time with your friends
before classes start.
So with that, I want to
move into how we built it.
And I know this
might be a little
technical for some people.
But don't worry.
There are a lot
of pieces that you
don't need to be a
developer to understand.
And I'll walk through those.
As Vanessa mentioned,
Dialogflow uses
a term called intents, which is
just a question and answer set,
so Dialogflow, using
machine learning,
can understand how to
respond to a certain question.
But to get it to respond
with the machine learning,
you have to train it.
And the way you
train it is by giving
various types of questions--
various types of ways
to ask the question--
so it can run its machine
learning training on that.
And so you see on
the screen, it says,
how do I connect to Wi-Fi?
Is there Wi-Fi on campus?
And the third one,
does CWRU have Wi-Fi?
And I wanted to point out
the highlight in yellow,
that CWRU is one
of the many ways
that you can say Case
Western Reserve University.
And so it is tagged as
a university intent--
or university entity,
which allows for synonyms.
So this is how we
got around that, so
we didn't have to train
for each type of way
you could ask, how do you get
to Case Western or whatever.
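An entity with synonyms can be sketched in the JSON shape the Dialogflow v2 API uses; the display name and synonym list here follow the talk's example, and the lookup helper is just an illustration of what the tagging buys you:

```python
# Sketch of a Dialogflow entity type with synonyms, in roughly the JSON
# shape the Dialogflow v2 API uses. Tagging "CWRU" as a university entity
# means one training phrase covers every way of naming the school.
university_entity = {
    "displayName": "university",
    "kind": "KIND_MAP",
    "entities": [
        {
            "value": "Case Western Reserve University",
            "synonyms": [
                "Case Western Reserve University",
                "Case Western",
                "CWRU",
            ],
        }
    ],
}

def resolve_synonym(entity_type, spoken):
    """Map whatever the student actually said to the canonical value."""
    for entry in entity_type["entities"]:
        if spoken.lower() in (s.lower() for s in entry["synonyms"]):
            return entry["value"]
    return None
```

Dialogflow does this resolution for you at match time; the helper above only shows the effect.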
So once you have it trained, you
have to tell it how to respond.
And if you remember, back
to the Wi-Fi question,
we had a little chat
bubble that said,
I found the following
information.
So in the simple
response on the left,
that's where you input
that information.
And then, in the simple
response on the right,
this is the audio response
that Google Assistant gives.
And then, in the
Basic Card, this
is where it shows the material
design card layout with a link.
And that's all it takes to do
one question and answer intent.
It's super easy
and fast, which is
why we really like Dialogflow.
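The simple responses and the Basic Card Greg describes are configured in the Dialogflow console, but the same response can be sketched in roughly the Dialogflow v2 message format for Google Assistant; the text and URL below are placeholders, not the production content:

```python
# Sketch of the Wi-Fi answer in (roughly) the Dialogflow v2 message
# format for Google Assistant: a spoken/displayed simple response plus
# a Basic Card with a click-through link. Text and URL are placeholders.
wifi_response = {
    "fulfillmentMessages": [
        {
            "platform": "ACTIONS_ON_GOOGLE",
            "simpleResponses": {
                "simpleResponses": [
                    {
                        "textToSpeech": "I found the following information.",
                        "displayText": "I found the following information.",
                    }
                ]
            },
        },
        {
            "platform": "ACTIONS_ON_GOOGLE",
            "basicCard": {
                "title": "Connecting to Wi-Fi",
                "formattedText": "Join the campus wireless network and sign in.",
                "buttons": [
                    {
                        "title": "More information",
                        "openUriAction": {"uri": "https://example.edu/wifi"},
                    }
                ],
            },
        },
    ]
}
```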
Now, I have one more
example for you.
And that's the
location question,
where is the bookstore?
So you'll see that it's
the same type of training.
And there's a lot of words
highlighted in orange here.
Now, instead of using the
synonym approach with entities,
we're using multiple
locations as an entity,
so we can train one
question, one intent,
with all of our locations.
And we didn't have to input
every building into Dialogflow.
So it makes it really easy.
Once we have a new building
come online-- which we will,
here in the next few months,
with our health education
campus-- once it's entered in
the campus maps database,
it will be live right in the
chat bot. We're really
excited about that integration.
Now, to do the response
for a question like this,
it's much different.
It's a programmatic way.
You start with
Dialogflow, which is
what we've been talking about.
But the blue box in the
middle is our webhook.
And a webhook is
simply a program
that you write that sits
in between Dialogflow
and other services.
And in the red box,
the service API
for the location information
question-- that's
the campus maps database API.
But we also use it
for Google Calendar.
It can be any type of service.
So once Dialogflow
gets the question,
it sends that question to the
webhook with the intent name
and with the parameters.
So in the case of,
where's the bookstore,
it sends the map intent with
the bookstore as a location.
So the webhook knows to talk
to the campus maps database.
And the campus maps
database looks up
information about the
bookstore and passes it
on back to the webhook, which
then packages it up in a way
that Dialogflow can understand.
So that is how easy it is
to do both of the types
of questions-- one
that is just simple--
you give it the answer--
and the other that it looks
up in different applications.
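The webhook flow just described can be sketched as a small handler: Dialogflow sends the matched intent name and parameters, the webhook queries a backing service, and the reply goes back in a form Dialogflow understands. The dict below is a stand-in for the real campus maps database API, and the building name and URL are made up:

```python
# Sketch of the webhook pattern: Dialogflow POSTs the matched intent and
# parameters; the webhook looks the location up in a backing service and
# packages a reply. CAMPUS_MAPS is a stand-in for the real campus maps
# database API; the building and URL are placeholders.
CAMPUS_MAPS = {
    "bookstore": {
        "building": "Example Hall",
        "maps_url": "https://maps.example.edu/example-hall",
    },
}

def handle_webhook(body):
    """Handle a Dialogflow v2 webhook request for the 'map' intent."""
    result = body["queryResult"]
    intent = result["intent"]["displayName"]
    params = result.get("parameters", {})
    if intent == "map":
        place = CAMPUS_MAPS.get(str(params.get("location", "")).lower())
        if place:
            text = (f"The {params['location']} is in {place['building']}: "
                    f"{place['maps_url']}")
            return {"fulfillmentText": text}
    # Anything we can't answer falls through to the fallback message.
    return {"fulfillmentText": "Sorry, I couldn't find that location."}
```

Swapping the dict for a real API call (campus maps, Google Calendar, or any other service) is the only change the pattern needs.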
So with that, where
are we going next?
We know we have a
really good start.
But we know there's so
much more room to grow.
We're definitely going to
be talking to our students
for more ideas.
But the few that we're starting
with here in the next month
is our question and
answer input tool.
So even though it's
easy to input questions
into Dialogflow,
it's not as quick
if you have hundreds
of questions
to input at the same time.
So, using Dialogflow APIs,
we're writing a tool that
will allow our departments
to input hundreds of questions.
And then our university
marketing and communications
will verify and approve each
question and each answer.
So it can all be streamlined.
And we can have thousands
of questions in our chat bot
up and running very quickly
here in the next few months.
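A bulk loader like the one described might turn approved question/answer rows into Dialogflow v2 intent payloads, which a script could then push through the Intents API (for example, intents create or batch update calls). The rows, display names, and answer text here are invented for illustration:

```python
# Sketch of a bulk Q&A loader: turn approved question/answer rows into
# Dialogflow v2 intent payloads. The rows and names are made up.
def build_intent(display_name, training_questions, answer):
    return {
        "displayName": display_name,
        "trainingPhrases": [
            {"type": "EXAMPLE", "parts": [{"text": q}]}
            for q in training_questions
        ],
        "messages": [{"text": {"text": [answer]}}],
    }

approved_rows = [
    ("wifi.connect",
     ["How do I connect to Wi-Fi?", "Is there Wi-Fi on campus?"],
     "Join the campus wireless network and sign in with your university ID."),
]

intents = [build_intent(name, qs, answer) for name, qs, answer in approved_rows]
```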
So the next two editions
are all using the webhook.
Course information-- just simply
any type of quick information
about a course.
So where is it?
What day and time?
Who's teaching?
This is something our
students asked for,
so we're excited
to put it in there.
And finally, the shuttle routes.
We don't want our students
sitting at a shuttle stop
waiting and waiting
for a shuttle.
So we're going to be putting
in, where is my shuttle?
When will the shuttle
be at my stop?
That will save them a lot of time.
So with all of these examples,
and with all these additions,
we are totally
bringing information
closer to the students.
That way, they can
focus on things
that are more important than
just finding information.
They can focus on
their learning.
With that, thank you.
And I'd like to turn it
over to Daniel McCarthy.
He's going to tell
you what they're
doing at Strayer University.
[APPLAUSE]
DANIEL MCCARTHY: Thanks, man.
[APPLAUSE]
Good afternoon.
The company I represent,
Strategic Education, we
actually have two universities.
We have Strayer University
and Capella University.
And our student population
is primarily working adults.
They have jobs during the day.
They come and do their
coursework online
in our learning management
system after hours.
So I'm here to talk to you
today about conversational
self-service and what we've
done at Strayer University.
At Strayer, one of the problems
we were trying to solve--
or the problem we
were trying to solve--
was volume, variance,
and availability.
So we have working adults.
They aren't typically able to
make it into the university
or call the university
during working hours.
When they do make it in, or
when they are able to call in,
they may get bounced
from department
to department and, again--
hold times, wait times.
And then finally, we're
getting more service requests
than we have staff to support.
So a chat bot's kind of
a near-perfect solution
to a problem like that.
A chat bot can be
available at all times
of the day, 24 hours a day.
It can handle an
infinite amount of scale.
And trained properly, it
can cross domain boundaries.
If you think about this outside
of the education industry,
for example--
just think about
HR for a second.
Within HR, you might have
benefits management and then
payroll.
Those are two different
domains with staff
that are trained specifically
for those domains.
And you might have to be
bounced from department
to department.
Our timeline for the
initial release of our chat
bot, Irving, which was named
after the founder of Strayer
University, Irving Strayer,
was very compressed.
You'll see here, roughly six
months from idea to pilot
to production and then scaling.
What enabled this was a
little bit of a perfect storm.
So those developing
this solution,
myself included, had the domain
experience for the domain
that we were going to hit
first, the technical experience,
and then combining that
with Dialogflow and GCP
allowed us to move very quickly
from an idea to a prototype
to production.
Now, we'll give you a
little bit of a demo.
Now, I apologize.
I'm going to start and
stop this a few times
so that I can talk through
it just a little bit.
So I've already stopped it.
Isn't that disappointing?
All right.
So Irving is available
in our student portal
where almost all of
our learners come.
AUDIENCE: It's
[? right ?] there.
DANIEL MCCARTHY: Oh, thank you.
All right.
So still going.
AUDIENCE: [INAUDIBLE]
DANIEL MCCARTHY: All right.
We're going to go back
here and do this again.
All right.
I'll talk a little
bit before we play it.
Irving is available in the
student portal online, the web
application.
We're piloting Google
Assistant right now.
And we're also piloting SMS.
So meeting the students when
they are and where they are.
When they are--
24 hours a day.
Where they are-- whatever
platform that they
want to interact with.
Some of those a little
bit easier than others.
And now, with that,
we will play the demo.
As students interact
with Irving,
it recognizes whether
they've used Irving before.
So you'll see here Holly,
who is a learner that
is interested in her academic
progress and completion
or progress towards graduation,
has used Irving before.
She wants to know what
classes she has remaining.
Irving goes into our
back-office SIS system
and pulls up her
program, how she's
progressed through that program,
and what she has outstanding.
Now, you'll see
here, as Holly asks
about the syllabus for a
course that she's already
registered for, that
she's using the title.
And this ties into a
little bit of what Greg was
talking about with entities.
So we can take the
title of a course,
we can take the course ID, and
we can use those as synonyms.
And you'll see, in the response,
that Dialogflow and Irving have
recognized that
critical thinking is
synonymous with PHI 210.
And in this case, Irving is
giving the student the link out
to the course syllabus.
Because would you ever
deliver a course syllabus
in a conversation
in its entirety?
No.
You want to keep things
brief and conversational
as you go through that.
Now, Holly's concerned
with her GPA.
So again, we're going to go
into our back-office system
and pull up that GPA
and see what she's got.
This is a little verbose, and
may need a little bit of work
regarding the response here.
But Holly's doing all
right with a 3.6 GPA.
And she's in
satisfactory standing
both academically
and financially.
Now, Holly's asking
about her grades.
You'll notice that she
didn't give a term.
Schools are term-based, right?
It doesn't matter-- semesters,
trimesters, quarters, what
have you-- they're term-based.
Irving has contextual awareness.
We have set it up such
that Irving assumes
the most relevant
term based on the time
that the question is asked.
We're going to go
through it again here.
And again, we have entities.
Holly asks about
grades for fall.
But again, that's a
little bit nebulous.
Is it fall '18 or fall '19?
Which one makes most sense?
Again, contextual
awareness allows
us to know that we're
talking about fall '18.
And great job, Holly,
for getting A's
in both your courses.
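The contextual term resolution described above can be sketched as a small rule: a bare term name like "fall" resolves to the most recent instance that has already started as of the date the question is asked. The quarter start months here are assumptions for illustration, not Strayer's actual calendar:

```python
# Sketch of contextual term resolution: a bare term name resolves to the
# most recent instance that has already started. Start months are
# illustrative assumptions, not Strayer's real academic calendar.
TERM_START_MONTH = {"winter": 1, "spring": 4, "summer": 7, "fall": 10}

def resolve_term(term, asked_year, asked_month):
    start = TERM_START_MONTH[term.lower()]
    # If this year's instance hasn't started yet, the student almost
    # certainly means last year's.
    year = asked_year if asked_month >= start else asked_year - 1
    return f"{term.lower()} {year}"
```

Asked in April 2019, "fall" resolves to fall 2018, matching the demo.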
Now, when am I
going to graduate?
It's a hard projection, right?
It's really based
on student input.
What are you going to
put into it as a student?
So Irving goes through and
calculates the average courses
completed per term, and then
asks Holly how many she's
going to continue completing.
And then, we're a
quarter-based school,
so how many quarters
throughout the year
are you planning to attend?
And from that, Irving
will then calculate
an anticipated
graduation date and give
the projection and a
little bit of legalese
for those that require it.
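The projection arithmetic can be sketched in a few lines: average completed courses per term, plus the student's stated pace, gives a rough time to graduation. This illustrates the idea, not Strayer's exact formula:

```python
# Sketch of the graduation projection: courses remaining divided by the
# student's average pace gives terms needed; terms per year converts
# that to years. Illustrative only, not Strayer's exact calculation.
import math

def terms_to_graduate(courses_remaining, avg_courses_per_term):
    return math.ceil(courses_remaining / avg_courses_per_term)

def years_to_graduate(courses_remaining, avg_courses_per_term, terms_per_year):
    return terms_to_graduate(courses_remaining, avg_courses_per_term) / terms_per_year
```

A student with 8 courses left, averaging 2 per term and attending 3 quarters a year, is ceil(8 / 2) = 4 terms, or about 1.3 years, out.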
So now, we'll talk a
little bit about surveying.
And you'll see here that
Holly is saying thanks.
And that's a little
bit of an indication
that the conversation is over,
that the student, the user,
has gotten what they want
from the conversation.
And that's rare.
How many times have you
guys been in a conversation
within any chat and just left
it when you got what you wanted?
I have.
And that's the
more frequent case.
You just leave the conversation.
Like, I'm done.
I got what I needed--
no, thank you, or
anything like that.
So we survey at the time
of conversation conclusion--
when we recognize a goodbye,
a thank you, or
something along those lines--
which is a rare occasion.
The other time that
we really survey
is when we know
that we have failed
to understand the student.
And that happens,
within Strayer,
when we have failed
sequentially three times.
Because we really try
to understand as best
we can-- give them
as many opportunities
to let us help them
with automation.
But when we fail
them three times,
we'll give them a survey.
And we will transition
if we have availability
to a human agent so
that the human agent can
continue that conversation.
And the human agent will
receive the full transcript.
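The sequential-failure rule just described can be sketched as a small state machine: any understood turn resets the streak, and three fallbacks in a row trigger a survey and a handoff to a human agent with the full transcript. The threshold and return labels are illustrative assumptions:

```python
# Sketch of the escalation rule: a success resets the failure streak;
# three sequential fallbacks trigger a survey and a human handoff with
# the transcript attached. Labels and threshold are illustrative.
FAILURE_THRESHOLD = 3

class Conversation:
    def __init__(self):
        self.transcript = []
        self.sequential_failures = 0

    def record(self, utterance, understood):
        self.transcript.append(utterance)
        if understood:
            self.sequential_failures = 0  # a success resets the streak
            return "answer"
        self.sequential_failures += 1
        if self.sequential_failures >= FAILURE_THRESHOLD:
            # Survey the student, then escalate with the transcript attached.
            return "survey_and_handoff"
        return "retry"
```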
But we're talking about surveys.
So we survey more
often when we know
we've failed than when we
know we've been successful.
And as we get into
that, we'll talk
a little bit about those
numbers and how that
plays into the survey results.
So we've had over 400,000
conversations with Irving.
Over 80% of all chat requests
are handled within Irving
without transitioning
to a live agent.
This is where the
surveys come in.
Of those that have been
surveyed by Irving,
87% agree that
Irving helped them
answer their questions easily.
Now, we've gotten some
really good feedback,
and we've gotten some negative
feedback in there as well--
all constructive-- and will
help us considerably to improve
the Irving experience.
So Greg talked a little
bit about it.
Vanessa talked a
little bit about it--
intents.
Intents are-- you can think of
those kind of as topical areas.
We have 1,400 intents
in production right now.
Now, this chart is
a bit of an eyesore.
But I want you to think
in terms of intents.
This chart shows you,
over the past 30 days,
how our intents have been used
by our student population.
And look at that-- you
got 20 or 30 intents that
are receiving the most volume.
But you look on the
right, the long tail--
look at that.
All other intents
represent the majority
of the questions
that are answered.
What you should
take away from that
is that the superb
service with a chat bot
is coming from the right.
For all of those
questions, students didn't
have to go somewhere
else to get an answer.
They were able to get it
within the bot, even though the 20
to 30 on the left are the most
frequently used.
I'm going to stop
going backwards.
I promise.
So how do we do this?
So this is going to be very
similar to a typical software
development lifecycle.
You got to understand
your use case.
When you define your use
case, be very granular.
You want to
be as specific, when
you start any
conversational experience,
as you possibly can.
And then you need to gather
data on that conversation.
How do your students,
how do your users,
ask about this topic?
What parameters are
they providing to it?
What context are they giving?
And then you need to
script your conversations.
And when you script
your conversations,
you start with the happy path--
the happy path being
how you want them
to go through the conversation.
But then you need to think
about gracefully handling
the fallout, the fallbacks, the
branches of the conversation,
and allowing for
that contextually.
Then, you're going to start
developing your integrations,
your SIS systems,
your financial systems,
your human resource
systems, any system.
And you're going to test those
integrations with Dialogflow.
Five-- that's your
most important step.
You must pilot with
your user population.
If you do not pilot
with your user population,
you will get some very
unpleasant surprises
when you take this into any
kind of production environment.
We do this regularly
at Strayer University.
And every time, we
learn something new--
a new way that they ask about it
that we haven't trained the bot
on or a new use case that we
can then take back to number one
and begin developing again.
All right.
So we've gone one through five.
And now we're in production.
We can all go home.
We're done.
It's not quite that simple.
So when you're in production,
you need to monitor your bot.
You need to look for errors
within your integrations,
and more importantly, fallbacks
within your conversations.
And fallbacks are when the
bot can't understand what
the users are sending to it.
Because within those
fallbacks, that's
where you're going to
find that long tail.
What do they want to talk
to your school about?
That is where you find this out.
And you're going to
keep repeating this cycle
until you maybe run out
of conversations to have.
Our experience is that you're
likely not going to run out
of conversations to have.
How do we do this within GCP?
Well, Dialogflow is
the system that we
use for the intent detection
and entity extraction.
Then we use Cloud Functions for
fulfillment and conversation
routing.
We use Cloud SQL to
record every interaction
that we have within Irving.
On a nightly basis, we
put that into BigQuery
so that we can do some
additional analysis.
And we use Data Studio
for the reporting there.
We use Compute Engine currently,
although there was
an announcement today
that may change that.
We use Compute
Engine to host Redis
so that we can do caching.
And then we have a
custom AutoML model
that allows us to do fallback
classification to better
the experience.
And since I am
already out of time,
anyone who wants to hear about
that, we can follow up after.
All of those components
are put together here.
And even this is a
little bit high-level
on how we combined all
of those components
into an architecture
that allows us to serve
1,400 different intents.
And again, I'll be
available for a little bit
afterwards if you have
questions on this.
So where are we
going to go next?
Well, this week,
at our sister
university, Capella,
we're launching Ella in a
pilot phase-- very excited
about that.
We're also going to be
increasing the personalization
both within the integrations
that we're developing
and the ability to interject
within conversations
and maybe not even wait for
a conversation to start.
Next, we're going to go into
our learning management system,
where our students spend
the majority of their time,
to assist them in the
classroom experience--
completing assignments, writing
assistance, maybe even quizzes,
and then, finally, phone
system integration.
So the, press one for
the first circle of hell,
press two for the
second circle of hell--
we're going to replace that
with an integration with Irving
so that you can use
natural language
to better the experience.
And with that, I'll hand
it over to Neil Gomes.
And he can talk to you
about their experience
at Thomas Jefferson.
[APPLAUSE]
NEIL GOMES: I'm
going to be talking
to you about the value of chat
bots in the medical space, both
for medical and education,
as well as, actually, maybe
at some point in the
future, seeing a chat
bot as your doctor.
I'm sure everybody
will like that.
No?
No?
I see some faces saying no.
OK.
Just you wait.
[LAUGHS]
Maybe at some Google Next,
we'll be laughing at this
and saying, hey, I said that
you'd see a chat bot as a doc.
OK.
All right.
Well, so what I wanted
to impress upon you today
is really the power
of conversation.
So I'm going to ask
you a quick question.
OK, and the answer is not iOS.
It's not Android.
It's not Windows.
It's not Chrome.
What is the best operating
system that we currently
understand really well?
What's that?
AUDIENCE: The brain?
NEIL GOMES: The brain.
Not really.
I can't tell what's happening
in your brain right now.
But what else?
Yeah.
AUDIENCE: [INAUDIBLE]
NEIL GOMES: Language.
Conversation.
The cue is right there.
OK.
There isn't much of a
learning curve, right?
And you don't need to teach
somebody how to speak.
Usually, after the age of--
in my kids' case, 0.5--
they know how to converse.
And so I think we've been
sitting on this for a long time
and haven't really
started leveraging it.
It's really, really powerful.
And being able to
get machines to start
engaging in conversation is
actually a huge leap for us
in terms of human development.
So I think that's
really important.
We need to understand that.
And I founded this group called
the DICE Group, or the Digital
Innovation and Consumer
Experience Group
at Thomas Jefferson University
and Jefferson Health "where
we bring digital to life."
That's our motto.
And so we've been doing
a lot of work in
the chat bot space
that I'll talk to
you about in a bit.
But to give you some
context, this is who we are.
Thomas Jefferson University
is the third- or fourth-oldest
medical school in the nation--
almost a 200-year-old
institution.
You've heard of terms
like "gross anatomy"--
Samuel Gross was the
Jefferson physician
who started gross anatomy.
The heart-lung machine
was invented at Jefferson.
And we have about
15 hospitals now.
We've been in a rapid
growth kind of mode--
with revenue of about
$5.5 billion at this point.
But the most important
figures to us
are really those patients.
We see about 3.2
million patients a year
in our outpatient clinics,
about 150,000 in our hospitals,
and about 550,000 in our ERs.
And we have about
8,000 students--
so a lot of impact through
some of the solutions
that we create as a group.
Our team is about 220 people.
We've grown from about 2 when we
started 4 and a half years ago.
These are people that
are software developers.
Rob, who's sitting
in the front here,
runs our software
development team.
And we have about 70 or
so software developers.
We have designers,
human-centered designers,
about 40 of them, that take
up solutions that come to us,
and then find what
the real problem is.
And only then do we
develop a solution.
OK?
And we develop several different
types of solutions like this,
about 130 a year,
and return about 10x,
which is something I learned
about at Google last year,
in terms of returns on
most of our projects
because we do things
very frugally--
very frugal innovation.
And we return a lot of
value to the organization.
Just on our digital
solutions, we
save the organization over
$40 million or so per year
and have done a bunch
of other things.
And sometimes you'd be surprised
at what generates value.
It's amazing.
Our donation app generated about
$1.46 million just last month.
So we never thought
that would happen.
What we do with digital
is, we improve access,
make it convenient to get to
us, or for us to get to somebody
else where they are--
that's always better--
and then provide closed-loop
digital experiences,
which is very rare in
the health care space.
So we try to do that.
And chat bots and
such types of tools--
voice chat bots also
enable you to do that.
And I'll give you
a quick example.
So voice bots, right?
Powered by a Google Dialogflow.
We'll be talking about
this on Thursday.
Rob's going to do
the presentation.
We felt that there
was tremendous value
to our patients if we
were able to answer
any of their
questions at any time
while they are in
the hospital room,
even when there is no nurse
available or no physician
available.
So we made a hospital
room chat bot.
And it's voice--
we got a speaker
system that really worked.
We experimented with a lot of
different types of speakers.
People stole a lot of
our speakers. [LAUGHS]
Speaker cables disappeared.
And then we came down to
this particular form factor
with the company
that we work with.
And it tells you
a lot of things.
It tells you when
your meal might
be coming, who your
doctor is, whether they're
board certified or not.
It will tell you about
your disease or condition.
It'll give you your
discharge instructions,
if you're leaving, and narrate
them out to you from the EMR.
It will control your TV.
We have mostly
Samsung Smart TVs.
And so it controls-- we've
found a channel to do that.
Controls the TV, changes
the volume, all that--
lighting.
We can also have it control HVAC
through our building automation
systems--
all just through voice.
So that's our voice bots.
We found that people that
agree to do clinical research,
they hate the surveys.
I mean, they are long things.
They lie to you at
the beginning and say
this will be over
in five minutes,
but it doesn't get
over in five minutes.
But they'd rather engage
in conversation, sometimes
even with a chat bot.
So we made some research chat
bots for our researchers.
And we have the service
available to them
to use anytime they want.
We are now experimenting with
a differential diagnosis chat
bot.
Differential diagnosis
is, whenever you go to the ER
or to primary care
and the doc says,
I'll be right back--
they're going to check their
differential diagnosis tool.
Sometimes that's called Google.
But there are actual tools that
they check that give them--
they put in your
presenting symptoms,
what it seems like
you might have.
And then it gives them potential
conditions with probabilities
next to that.
And it's very, very important.
Because sometimes you
might miss something,
like Lyme disease
for example, which
presents in a very simple
way in an ER many times.
And you might just forget to
ask that other question like,
were you in the woods recently?
Did you get a tick bite?
Those types of
things-- so it prompts
docs to ask these
types of questions.
And it's an
extremely useful tool
for learning for
residents and students.
Because sometimes
they just don't
realize they've got
to ask that question.
So they go and do this.
What if we could present that
as a chat bot, almost in simple,
quick, and easy ways?
Especially because chat
bots have the capability
to represent canned responses
that they just tap on,
versus having to type
something in, right?
So that's what we are--
we're experimenting
with a company
called Isabel that
creates a great differential
diagnosis tool that's
named after the CEO's
daughter, who passed away
because a good differential
diagnosis tool was not used.
So one of our students,
along with us, once created
a compendium
of all medical school notes.
OK?
He came up with this brilliant
idea-- got permission
from the faculty
and the teachers.
And they said it's
fine to do it.
So they made a Wikipedia of
all medical school notes.
And so now we're building
chat bots on top of that
because we have a huge
data source to mine.
Also-- it
is very inspiring
to see both examples of
student experience chat bots.
So this is a space we really,
really want to get into.
We've done some of these things
for patients, but not yet
in the student space.
So that's something
we want to build.
We've also just
deployed, recently,
appointment chat bots.
The Rothman Institute,
which is part of Jefferson
and one of the best orthopedic
hospitals in the nation,
now only takes
online appointments
through a chat bot, 24/7.
They do have a phone
number you can call.
But any online appointments--
there's no form.
You just do it via chat bot.
And they're seeing great
engagement and very fast
information collection.
And finally, a topic that, in
the future we're probably going
to have to deal with-- and
we are doing this right now--
is a lot of people, especially
institutions like ours,
are investing in Robotic
Process Automation, right, RPA--
in the financial
side, sometimes even
in deep clinical areas, things
that happen on the back end.
And we like to
call these autobots
because this is pointed at
automation and automation
platforms.
And we think there's
great scope here
because a lot of people that
are in the operational spaces
don't understand how
these things work.
And the bots could
provide that insight.
The bots could also
initiate the next action.
If you build the right
type of platform,
you can handle some of these
transitionary elements really,
really effectively
using bots, we think.
Because when we go into clinical
areas with our design teams,
and we try to make
them more efficient,
we find that most latency
and most issues with service
happen when something
transitions from one person
to another.
So now we're starting
to automate some
of those transitions, right?
But some of them
are still human.
So if you want
somebody to respond,
you have a bot in between
that initiates that next step,
either via text message
or via a platform that you
might have across the system--
so things happen
a lot faster, which,
in medicine, is very important.
Time is life many times.
[MUSIC PLAYING]
