As with any of these presentations, it
makes a lot more sense if we first of
all give you a little bit of a flavor of
who we are as a University and why
we're approaching it the way that we're
approaching it. I was gonna play a little
game to see who knew where Darwin was, but I won't embarrass you or us.
It's easier to just show you so
we're at the very very top of Australia.
We're a tropical place. We can fly to
Bali in an hour and a half, but it takes
four and a half hours to fly to Sydney.
So that kind of gives you an idea of how
big Australia is and where we're
positioned. You can see that we've got
campuses all over the country, although
our main presence is in Darwin. Whenever
we come to America we love telling
people this: the Northern Territory -- that
area in white, that's our state -- it is
the size of Texas, Montana, and New Mexico
combined. It's a pretty big area. Total
population: 240,000. So for us to be
viable as the University we need to
reach out. We need to get out of town a
little bit, which is partly why we've set
up campuses all over the place. But also
we're online. We don't like to say that
we have students who are online and
students who are on campus. All of our
students are online. All of them are
using the same technology. So we have a
suite with the LMS at the heart. The LMS
for now at the heart. We'd like to move
to a system where in fact the student
identity will be the thing that the
student logs into at the start of each
day, but for now it's the LMS. Most of our
classes are using Collaborate sessions
and then we have mobile and we have
Blackboard student services. We're a
managed hosting institution, and of
course today in particular we use
Analytics,
and we'd like to talk through that
story. It's sometimes necessary to point
this out... maybe not in America, but
last week I was in China giving a
very similar kind of address, and in
China they still don't officially
approve online degree education. So it's
quite useful sometimes to point out that
we are a world-ranked university.
Within Australia we have the
highest success rates for our graduates:
not only do they get jobs at a higher rate,
they also tend to start on a higher
salary. And if you think that
that's because a lot of them are mature
aged, part-time, online students that
would be a reasonable assumption. It's
also not true. Because when you discount
that, when you look at our school leaver
cohort, we still ranked number one in
Australia.
Sometimes number two to a university that
we don't mention. It alternates from year
to year. So the point is quality online
education leading to quality student
outcomes is a possibility that does
happen. But we have challenges. We have a
retention rate that in Australia is
appalling.
So we retain about 65% of our students
through to completion. In some countries
I'm told that's quite good. But in
Australia that's not terribly
good. We have staff who want to better
know their students. They
often won't ever meet their online
students unless it's through a
Collaborate session, so they need to
understand them better, they need to know
more about them. We need students and
staff to be more present. To be visible
and a lot of that is data driven, not
just human to human. We need our students
to be engaged all the time. Now what we
mean by that is there's certain times of
the day when they'll be able to get
synchronous interaction with the staff,
other times where they can get
synchronous interaction with peers, and
we're increasingly building our
resources to be interactive resources so
no more PDFs or static videos. And we're
trying to make the actual learning
materials engaging interactive
materials.
But beyond that we're looking to
analytics to provide real-time dynamic
personalized services. And I've let one
of the cats out of the bag about what
happens next.
All with a view to improving learning
outcomes. So that gives you a little idea
of the type of institution we are spread
all over the place. We're technical and
higher ed and we are multi modality. So
with that as the background I'm going to
give my voice the rest and Bill will
tell you about the meaningful stuff. 
I'll come back at the end. Thank you,
Martin.
So that provided us with a number of
opportunities because, like Martin was
saying, we have a huge digital footprint
because of our online presence. So we
decided that we really needed to look at
learning analytics and implementation of
learning analytics. So the project was
really semi-structured. There
were no real predetermined outcomes. We
wanted to dive in. We
weren't able to pilot this. You couldn't
bolt it together without actually doing
an implementation. It's not something you
could trial. And we wanted to utilize the
Blackboard off-the-shelf reports as our
early starter. And then we wanted to also
get that user buy-in from the academic
community. So the project was really
designed around project management
principles. We had project sponsorship
and ownership, which was Martin as the
PVC. We had a university-wide retention
project that was running in parallel to
our learning analytics implementation, so
we were able to use information from
that project to inform what we were
going to do next with our analytics
implementation. We also contracted
Blackboard Consulting Services to
work in partnership with
CDU to help drive the implementation.
We had a whole range of on-site
visits, consultations, awareness sessions,
workshops. And they also set up a
technical infrastructure for us to
actually install and move to the
learning analytics platform. They
assisted us at our report development
and more significantly in our
customizations which we'll talk about
shortly. The launch provided us the
opportunity to really pitch learning
analytics to the academic community.
We had a bit of a big launch event: we
invited a whole bunch of staff to an
auditorium. We launched the product, we
made those reports available. It was
really to brainstorm and get feedback
from the academic community about the
initial rollout. We also made it clear
there was no mandate or obligation to
use. It was really around exploring.
Sort of trying to garner some
insight as to what the uses were, and what
the possibilities were in the future. So
there was no obligation to actually use
it. Now what we found and I'm sure you
probably agree, is that the consultancy
investment costs recur over time.
We sort of thought that maybe
it would be the initial curve and it
might drop off. But actually what happens
is it goes up and down. And it goes up
and down because there are several
considerations: the extent of the
customization, upgrades or changes to the
technology, the evolution or the adoption
of the use throughout the institution,
changes to data sources, which are
constantly happening. And then there are also
those changes to organizational
structure that influence it as well. So
what we found is our institutional costs
initially started to go up and then
plateaued out. But then the question is
this: will they go down or will they continue
to rise? So as the chart shows, over time
as adoption grows it can go either one
way or the other and it's subject to
those considerations that I was talking
about earlier: changes to technology,
changes to data sources. There's a whole
range of considerations there. We sort of
themed this: there are many paths that you
can go by when you're looking at
learning analytics. You could go for
Blackboard Analytics. You could use other
BI systems. You could do it manually. You
could set up a data warehouse. But we
decided that we wanted to use Blackboard
Analytics and there were some value
propositions for us. We're a small
university with limited resources. We
have a limited recruitment pool with a
high turnover of staff so that expertise
wasn't really in Darwin. As Martin said,
we're not a big place; we've only got
240,000 people in the entire Northern
Territory. There's a very
transient population. So Blackboard
offered us some quick wins. Some runs on
the board. We could get there in
the least amount of time possible.
Blackboard also brought expertise and
knowledge that we didn't have and we
were able to capitalize on that
knowledge as well. And there was the
integration with the LMS,
which was a critical component for us. We
didn't want staff to have to log into
another system or to another BI platform.
It was sitting integrated in the LMS,
so that was another key value
proposition. There was also professional
development training support to the
academic and professional staff that
Blackboard Consulting Services
provided us. We are also developing
our own in-house data warehouse, and I'm sure
a lot of other institutions are looking
at that as well, because we believe that
intelligence is too limited until it
draws from multiple data sources and the
biggest digital footprint possible. So
this is a bit of an overview of our
timeline from 2013 to where we are now. So we
started our journey in 2013 with no real
fixed agenda. We paid attention to
the stakeholder engagement and all the
consultation required to sort of
pull the project together. In 2013
we did all the technical installation
and all the infrastructure to support
the platform. In 2014 we went live
with our first reports. We then scoped
out requirements around bespoke reports
that we wanted to create ourselves and
also scoped out requirements for
customization to context. You'll see that
context goes through the whole period
from 2013 to where we are now. Because
context is a really important
consideration. We swapped out some
reports and replaced them with some
other bespoke reports because we wanted
to reflect our context, and I can talk
about that in a bit. We focused on
staff development and training. And then
at the same time the university led a
national project that was funded by the
federal government called Let's Talk
Learning Analytics for Student Retention. In
2015 we started to think about the
business intelligence.
We had more on-site visits. We started to
dive further into our customizations. We
launched our first round of Pyramid
dashboard reports for schools. And we
also launched our first SQL reports. In
2016 we started to build use cases. We
did further customization. We developed a
report specifically for vocational
education. And the reason we had to do
that was the vocational education
doesn't have a fixed start or fixed end
date in terms of the semester and
analytics sort of had a dependency on
that. So we had to build a bespoke report
in SQL Reporting Services in order
for us to meet that requirement.
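The core of that VET problem can be sketched in a few lines: instead of bucketing activity against fixed semester dates, the report has to measure each student's activity relative to their own enrollment start. This is a minimal illustration of the idea only, not CDU's actual report logic; the function name and the four-week window are assumptions.

```python
from datetime import date, timedelta

def activity_window(enrol_start, accesses, weeks=4):
    """Count LMS accesses in the first `weeks` weeks of *this student's*
    enrollment, rather than against a fixed semester calendar."""
    window_end = enrol_start + timedelta(weeks=weeks)
    return sum(1 for d in accesses if enrol_start <= d < window_end)

# Two VET students who enrolled months apart are still compared on the
# same relative window, despite having no common semester dates.
early = activity_window(date(2016, 2, 1),
                        [date(2016, 2, 3), date(2016, 2, 20), date(2016, 5, 1)])
late = activity_window(date(2016, 7, 11), [date(2016, 7, 12)])
print(early, late)  # → 2 1
```

In the real report the same idea would be expressed as a date calculation in the SQL layer rather than in application code.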
We've also rolled out our SQL Reporting
Services reports in 2016 to, I suppose, what you'd
call the professional staff, those people
that are in student support, Student
Central, academic liaison, so that they
can also have access to data. This year
it's really been taking a bit of a
deep breath to reflect on how far we've
gone, and to start to think about a
strategy. We didn't start with the
strategy but now we are thinking about
the strategy. We're thinking about
those policy and process issues around
data stewardship, data governance: who has
access? How long do you retain the data?
What are the ethical considerations? We're also
starting to have more conversations with
the academic community, to try and
socialize the use of learning analytics
across the community. We wanted to
understand, is it being used? And did we
get it right?
What do academics want? What's possible?
And we're now starting to publish case
studies. There'll be some case studies
published from CDU at a date to be
announced; the launch was unfortunately
just cancelled because the minister
wasn't available. So we'll have three
case studies published very
shortly. The next steps are really about
building a praxis community, a community
that also allows people to engage with
each other but also reflect on their
practice and also bring in some research
aspects as well. We're working on a
project around student-facing
learning analytics: student dashboards.
We made a conscious decision not to
enable those when we first went live, and
throughout the implementation. So we're
working with a consortium of Australian
universities called the Innovative
Research Universities network, or the IRU
for short, on an 18-month project looking
at what student-facing analytics might
look like. We're looking at what are the
actions and interventions? And how can we
facilitate those actions and
interventions out of learning analytics?
Looking at drawing in multiple
data sets as we spoke about at the
beginning. And a renewed focus on
professional development with the
academic community. So one of the things
we ask ourselves is, is it being used?
And how do academics get access? So we
have set up a Blackboard
community site. So on that community site
it's integrated with the LMS and we have
a range of information there for staff.
The one on this side
is all around the Blackboard Learn integrated
reports. So there's a whole bunch of
resources, information, guides, data
dictionaries, use case scenarios, and a
whole bunch of support resources for the
academic. The nice thing is it's
available in Learn. They don't have to go
anywhere else because it's in the
context of the LMS. And on the
other side is our school dashboards. So
we made a conscious decision to roll out
Pyramid dashboards across the school
context. So every full-time teacher has
access to a dashboard that is specific
to their school. So it's actually picking
up on the conversation this morning that
was happening on the main stage
about, I suppose, the democratization of
learning analytics and making it
available more broadly to the academic
community. I want to talk a little bit
about our customization and our bespoke
reports. So we had probably four core
areas that we wanted to focus on: grade
performance, time and date, who's in my
class, and measurements of counts.
So grade performance. We've got a
customization there that allows us to
look at grade performance by enrollment
type. We have external students and we
have internal students. And we're able to
actually look at that grade performance
by those two parameters. You're also able
to then apply those two parameters to
the number of course accesses, the amount
of time spent on the LMS, and the
engagement and interactions that occur
on the LMS. We're really interested in
the time and date and one of the core
bits of feedback we got from the academic
community was, "I want to know when those
items were accessed and I want to know
how long they're spending on those items."
So we made a decision to record
the first access date and the last
access date and a time and date stamp
measurement against that. So academics
are now starting to sort of correlate
that to, is the date of first access related
to engagement and activity across the
teaching period? And is there any sort of
causation there as well? We're still yet
to unpack that, but one of our staff
members has done a case study around
that. As Martin said at the
very beginning, because we are an online
institution -- 75% of our students study
wholly online -- academic staff
had very limited visibility of who those
students were. Who was in the class? It
was just a student ID and a name in the LMS.
So we commissioned a bespoke report
called the Student Snapshot Report.
That report is available to the
academic before the commencement of
teaching. So as the student enrolls into
the SIS, their enrollments are automatically
pushed into the LMS and all the data
surrounding their enrollment is available
to the academic. So we highlight some key
demographics. We highlight male-female,
international-domestic, age range,
non-English speaking background,
hours worked,
carer hours... So a whole bunch of
attributes around the student. We then
pull in attributes from the student
information management system around the
GPA and the percentage of course
completion or program
completion. We also bring in information
about what degree they are taking. So an
academic can see that they're enrolled in a
Bachelor of Arts and they're taking
these units, or these courses as you
would call them. We pull in the basis of
admission,
so an academic can get some insight into
how the student gained entry into the
institution. And we also pull in the load
for the teaching period, so if I was taking
one, two, three, four courses in the
semester, it would actually calculate
that out. So academics gain some insight into what
the student load is and what their
commitments are, because we're pulling in
carer hours, we're pulling in hours worked,
we're pulling in how many units or
courses they're taking, and a whole bunch
of attributes around the student itself.
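To give a feel for the shape of that data, here's a hedged sketch of the kind of record the Student Snapshot Report assembles. Every field name here is illustrative, not the report's actual schema, and the overload rule is an invented example of how an academic might act on those attributes.

```python
from dataclasses import dataclass

@dataclass
class StudentSnapshot:
    # Illustrative fields only; not the report's published schema.
    student_id: str
    gender: str
    residency: str             # international / domestic
    age_range: str
    nesb: bool                 # non-English speaking background
    hours_worked: int
    carer_hours: int
    gpa: float                 # pulled from the student information system
    program: str               # e.g. "Bachelor of Arts"
    program_completion_pct: float
    basis_of_admission: str
    teaching_period_load: int  # number of units/courses this teaching period

def heavy_commitments(s):
    """Invented example of acting on the snapshot: flag students whose
    combined work, carer, and study hours suggest they may be stretched.
    The 60-hour threshold is an assumption for illustration only."""
    return s.hours_worked + s.carer_hours + 10 * s.teaching_period_load > 60

s = StudentSnapshot("s123", "F", "domestic", "25-34", False, 38, 10, 5.5,
                    "Bachelor of Arts", 42.0, "Prior higher education", 3)
print(heavy_commitments(s))  # → True
```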
Academics are using this information to
set up retention rules via the Blackboard
Retention Center, for example. And
setting up conditions on those rules. So
we have a suite of reports. Within the
Blackboard Learn Analytics integrated
reports we have some of the out-of-the-box
ones: Student at a Glance,
Unit at a Glance, the Activity and
Grade Scatterplot, and the Activity Matrix.
The two custom reports there are our VET
Site at a Glance and the Student
Snapshot Report that I spoke about
earlier. Within the SQL layer we have a
whole bunch of exception reports around
grade performance, around submission
exceptions, etc. Login exceptions. But
they're rolled up to the school and
faculty level so your course advisors,
your people that are in an academic liaison
role, can actually see across the unit
set or the course set any issues
relating to the student. And then we've
rolled out a whole series of scorecards
and dashboards in Pyramid. And 
like I said they're available to
full-time teachers. They get an overview
of the teaching period compared to
the previous teaching period. So they
have a whole bunch of information
around school metrics: activity across
the semester by interactions, by accesses,
by time spent; the items in the units and
the percentage of items accessed;
the tool utilization across
the teaching period, so whether it's a
content item, a tool item, or an
assessment item. We bring in the
demographics that I spoke about in
Student Snapshot, but that's rolled up to
the school level so you could actually
get a picture from the school and from the
faculty level as well. And then
importantly academics really wanted to
know what activity the student was
taking across the degree program. So we
have a report there that allows us to
look at the student by program and
what courses they're taking, and a whole
range of data about those individual
courses. So we asked ourselves,
is it actually being used? Like
most clients we wanted to know if
learning analytics was
being used by the academic community, so
we developed our own report via SQL
Reporting Services -- and Rachel, you and
Mike actually spoke about that an hour
or two ago; that was a fantastic
announcement -- so we were
figuring out how we could do that.
We spoke to Blackboard and worked
out that we could get access to this
data via the log files. So we've
got some insight into the actual usage
of it by the academic community. This
gives you a bit of a snapshot of our
usage over the last 12 months, from
July 2016 to July 2017, because
unfortunately the log files only retain
365 days of data. So the Student Snapshot
Report is our most commonly accessed
report, and I think that's partly to
do with answering the question the
academics wanted us to answer: who's
in my class, so I can actually be
prepared for the teaching cycle? The
Student at a Glance report is our next
most commonly accessed report,
and then you've got Unit at a Glance,
the Matrix, and the Activity and Grade
Scatterplot. But what we're finding now
is that more staff are actually looking
at the Activity Matrix and the Grade
Scatterplot because they want to gain a
bit of insight into grade performance
and the interactions
and the engagement that happens within
the LMS. So that's a bit of a shift that's
started to happen now. The next bit is we
were involved in co-leading a project
with the Malaysian Research University
Network and the Innovative Research
Universities network in Australia.
It was a joint project. It has only just
concluded. These are some preliminary
findings. We did a series of
focus groups across a range of
institutions, where academic staff were
invited to come in and we got them to
rate a series of reports. There was a
whole bunch of reports that those
institutions had and we focused on the
Student at a Glance and the Unit at a
Glance report. So this is really around
usefulness, or usefulness of
purpose. So these are the preliminary
findings from the discussion groups
around Student at a Glance and Unit
at a Glance.
The first chart shows the mean
scores on a scale of one to five, with
one being not useful and five being very
useful across the three universities.
Student at a Glance was ranked the
highest by all the participants. The
second chart shows the number of
comments. After asking them to rank the
reports from one to five, we asked them
for comments about each one. So the
second chart shows the number of
comments made against each report,
grouped by theme: the impact on
learning and teaching style in
particular; the data analysis training
requirements, or the perception of
training requirements; and usefulness of
purpose. And again you can see that
Student at a Glance sort of outranks the
Unit at a Glance as well. I'll touch on
the data analysis comment a bit further
down the track. One of the other things
we wanted to do was to sort of link
learning analytics with the curriculum
lifecycle to encourage the broader
adoption of learning analytics across
the academic community. So following the
cycle of learn and teach, review and plan,
we've sort of come up with this little
idea. So the first report would be around
preparation for teaching, where you want
to gain an overview of a class.
The second report would be the Student
at a Glance report, used during the teaching
period to understand the accesses and
interactions. The third report we are
identifying is the Activity and Grades
Scatterplot, which is really around
student performance, grade performance,
but also very important around that time
of grading and moderation. The fourth
report sits in the review cycle, which is
trying to gain some insight into the
unit design or instructional design and
using analytics to do that.
Number five is one of the reports that
we have in the pyramid dashboard which
is looking at comparisons across the
courses over time. Number six
is again around preparing or planning
for the unit. Again looking at how over
time the unit or course may have changed.
What tools and utilities have been used?
And what are the patterns
of access? But we've also realized there
are more cycles for us to consider.
So not only have we now tried to align it to
the curriculum lifecycle, we've got
more work to do. We want to align it to
the student life cycle, and we also want
to align it to the staff life cycle as
well. This one is to call out some
of the project limitations that we've
discovered along the way. What we've
realized is that we
believe there's an overestimation of 
the initial interest by the academic
community which has sort of slowed the
project down. I mean as project
sponsors, as a project owner, and working
with the key stakeholders we were all
really excited about this but it's taken
us time to build that excitement
throughout the academic community. We've
found again from feedback that some of
the data comparisons in some reports were
not seen as relevant. An example of that
is Unit at a Glance. It might be the
way in which we've set that up, because the
comparison is the course compared to the
school in the same mode, whereas
academics have told us, "I would rather
see the comparison against the previous
offering of the unit to see how it's
tracking." The program
level reporting options from the LMS have
frustrated staff because they are
required to log in to another platform
to do that. One of the constraints
there is that unfortunately at the
moment there are separate login
credentials for the Pyramid BI
platform, which has limited people's
ability to take that up, and
that's something that we want to work on.
There's no integration of analytics with
any intervention or communication
systems as yet that we're aware of, so
that inhibits some sort of more
personalized services as well. And
it's got very US-centric language, so
one of the things that we would love to
see is, I suppose, the language used
being more internationalized for
context and purpose as well. But we've
had some great successes.
It's focusing the academic on the
endeavor of empirical evidence. We've
been able to link exception reports with
our SMS services or the CRM so we've
been able to pull data out of SQL using
those exception reports to do some reach
out to the student community. We've been
able to customize to reflect our context
which has been really critically
important for us. I think again
establishing school dashboards using
Pyramid has been a win for us because
we've really made that data available to
the teaching community. It's not just in
the hands of leadership. It's not just in
the hands of executive. But it's actually
available to the academic community. And
I think one of our other big wins is
really the use of analytics to
understand who's in the class and shape
the learning and teaching approach by
the academic community. I'm going to hand
over to Martin.
We're going to do some crystal ball gazing, looking
at what the possibilities might be in
the future. Thank you very much.
So it'll come as no surprise to many of
you to hear me say some of these things.
We've been very impressed at the way
that Blackboard has been doing deep
dives into the Learn dataset and the
Collaborate dataset to come up with a
predictive analysis. But our institution
is fundamentally of the view that you
can ask the most sophisticated question
you want of a limited data set and
you'll get the best answer that that
limited data set can provide. So if that data
didn't include the data that was
most germane to your question you've got
a problem. So we've been grateful that
Blackboard's been doing that because that
kind of querying is going to really come
to the fore we believe when it's done
across multiple linked-up data sets,
which is of course what is happening.
And we're a firm believer that in any
sort of broader predictive work with
analytics that we do, we'll wait until
we've got those data sets linked up. Now
having said that we're working on that
right now.
Wasn't it interesting that the Student
Snapshot was the most important one to
them. There was nothing predictive about
it. It was descriptive only. But they had
faith in it because they felt that
they could trust the data. They don't yet
feel that they can trust the predictive
data, at least until they can see it is
drawn from the maximum digital footprint
possible. Secondly, it's not about more
reports. Half the reports we produce
aren't being accessed as you saw, and
even if they are being accessed that
doesn't necessarily mean that they lead
to action. It doesn't have to be done that
way.
The real beauty of analytics is the
ability to use the analysis to trigger
some kind of automated response. It might
be an automatic email. It might be
serving up some kind of other
personalized service. We're quite
convinced that that's the future of it.
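As a hedged sketch of that idea of analytics triggering an automated response: none of this is an existing Blackboard API, and the function, the 14-day threshold, and the message wording are all hypothetical.

```python
from datetime import date, timedelta

def inactivity_nudge(student_email, last_access, today, threshold_days=14):
    """Return an automated nudge message if the student has been inactive
    longer than the threshold, otherwise None. Rule and wording invented."""
    if last_access is None or (today - last_access) > timedelta(days=threshold_days):
        return (f"To: {student_email}\n"
                "We noticed you haven't been active in your unit lately. "
                "Is there anything we can help with?")
    return None

msg = inactivity_nudge("student@example.edu", date(2017, 6, 1), date(2017, 7, 1))
print(msg is not None)  # → True
```

The point is that the analysis feeds an action directly, rather than producing yet another report for a human to read.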
If we went back to our academic staff
and said, "wow we've got a hundred more
great reports for you!"
I'm pretty sure there would be a short
conversation with them. Okay, it starts to
get exciting here. For us the future is
in students not just studying with us.
We're already linking students across
multiple campuses, why can't we link them
across multiple institutions and
multiple countries? A student in
Australia is going to be far more
excited by, I don't know, an engineering
degree ... no, that's a bad example, actually
(because engineering is
already accredited according to the
Washington Accord and EUR-ACE, so
a graduate already has global mobility).
Law's a good one, though. Law
tends to be very jurisdictional but if
they could study law at an American
University and at our university and get
a dual badge degree -- it might be a
slightly longer degree to cover all the
issues -- that gives them greater mobility.
But to run analytics across that means
that the analytics dataset is going to
have to draw from all the contributing
universities. So we've got to start
linking that up. And then lastly ...
if you were at the panel this morning, I
snuck this one in right at the end when
there was really no chance to talk about
it ... but let's just touch on ethics as we
merge all these datasets together.
Now let's just talk about what we're
doing on campus, let alone what might be
happening at Blackboard or elsewhere. On
campus we're doing what most of you are
doing which is a data warehouse and
we're pulling data from our student
management system, from Learn, from
Collaborate, from library utilization,
from
access to third-party online
tutorial services, etc. We pull all that
data together. To link it up you use the
student ID. That's the unique
key that's common to them all. And you
can anonymize that if you like, and you
can strip it out afterwards, for sure. But has
that made it anonymous? I know where a
student was on Thursday night. I know
where they were on the campus; I might even
know what they spent. I know what library
books they looked up. You put all
that data together over a period of time,
and I don't need your student ID to know
everything about you. I know you as well
as I would if I had your name and
your student ID in front of me. So
anonymity raises a set of ethical issues
that we're not yet equipped to deal with
because the usual methods that our
ethics committee apply of, you know,
take out name and address and student ID,
that's just not going to cut it any more.
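A hedged sketch of why stripping or hashing the ID doesn't cut it: hashing is deterministic, so the pseudonymous records still link together, and the accumulated behavioural profile does the identifying. The data, field names, and helper here are all made up for illustration.

```python
import hashlib

def pseudonymize(student_id):
    # Deterministic hash: the "anonymous" key still links every record
    # belonging to the same student across data sets.
    return hashlib.sha256(student_id.encode()).hexdigest()[:12]

events = [
    {"sid": "s123", "source": "library", "detail": "borrowed ethics text"},
    {"sid": "s123", "source": "campus",  "detail": "cafe purchase Thu 21:40"},
    {"sid": "s456", "source": "library", "detail": "borrowed law text"},
]

# After pseudonymization the linkage survives intact...
linked = {}
for e in events:
    linked.setdefault(pseudonymize(e["sid"]), []).append(e["detail"])

# ...so one pseudonym still accumulates a full behavioural profile.
profile_sizes = sorted(len(v) for v in linked.values())
print(profile_sizes)  # → [1, 2]
```

This is pseudonymization, not anonymization, which is exactly the distinction the usual "remove name and address" ethics checklist misses.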
But even more than that, if we use this
data to provide personalized services
like the fantastic Jill Watson case
study at Georgia Tech (wasn't it Jill
Watson?), you know AI can deliver
academic tutoring, AI can deliver student
advisory services. But it will do so
based on a rational response to the
issue at hand and the data that it can
interrogate. And that's fine if the
 human to human
communication that it's replacing is: I'm
asking a question that requires a
factual answer and I get a factual
answer. I can replace that; it's
relatively value-free and it frees up
time for the academic staff. But the AI
by definition learns every time it has a
transaction; it learns every time more
data is added, so the quality of
rational responses it can provide is
going to get more and more sophisticated.
And it will start to replace some of
what we now regard as being in the
domain of the academics. You know, helping
someone interrogate whatever the issues
are. Here's my point:
Rational responses in that educational
context will only ever get us so far.
It's a little bit like the theory of
evolution.
It took the mutant deviations
that made no sense at the time
to kick into play when conditions
changed in ways that weren't predicted,
to allow species to evolve and develop.
Human social interaction is the same. We
need that random input. We need those
mistakes. We need the imagination that AI
might not be able to provide for quite
some time to come. God willing, it'll
never provide it. So let's just start
projecting now what happens when we're
using analytics to drive complex AI
services direct to students that are
going to replace academic tutoring, that
are going to replace advisory services.
We've all seen the movies where sooner
or later it ends up with the AI
deciding that humans are irrational and
have to go. Now I'm not saying that we're
anywhere near having to worry about that
but I am saying that we should start
thinking now about at what point is the
human intervention still important and
why. And how does that get weighed up
against the incredibly exciting and
potentially time-saving and
cost-effective things that we can do
with AI. Our university isn't doing that
just yet, by the way.
But the ethical conversations are
starting now, particularly as we look to
start investing in the use of AI
personalized services. That's our story,
so far.
 
 
