Hello everyone, my name is Ramesh and I'm the Project Director for the AIMS squared program at California State University, Northridge. Along with Professor Nathan Durdella, who is the Project Assessment Coordinator, we put together this brief presentation to describe to you how we moved from formative to summative evaluation with the AIMS squared program at Cal State Northridge. So first and
foremost,
the AIMS squared program is funded by the U.S. Department of Education under a Title III, Part F HSI STEM and Articulation five-year grant. The contents of this presentation do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by
the way, I want to describe to you
what our goals were in framing this particular project. Then I'll speak about the logic model that we developed as we continued along the path to build the program. Then we'll talk about the implementation of the program, particularly the mixed methods approach that we use to report performance on the various project measures. And finally we'll explore the impact of the program by looking at processes, with some suggestions for insight.
So to give you a sense of history: the AIMS squared program, which stands for Attract, Inspire, Mentor, and Support students, was originally funded in 2011 with a grant from the U.S. Department of Education. At that time it was a collaborative grant with College of the Canyons and Glendale Community College, and it primarily focused on transfer students: students transferring into any major in the College of Engineering and Computer Science at CSUN, Cal State Northridge, particularly Latinx students as well as economically disadvantaged students.
Now as the grant progressed (this is a five-year grant), when we got to the 2016 competition, we expanded the scope of the grant: we added a couple more community colleges, namely Moorpark College and Los Angeles Pierce College, and in particular we expanded the scope of the program to include both freshmen as well as transfer students. We'll talk about that in the next few slides.
Within the Cal State University, Northridge campus, a number of colleges are participating in various aspects of the project, including faculty and staff from the College of Science and Mathematics, from the College of Education, and of course all the departments within the College of Engineering and Computer Science.
So one of the first things that I mentioned to you was that we started in 2011 focusing on transfer student success. We noticed that the graduation rates of transfer students, particularly students from underrepresented minorities, were showing a gap of almost 20 percent compared to better-served peers in the college. So naturally we started asking questions as to why this was happening and how we could go about addressing these issues. That led to the first grant, where we built a cohort-based model. Attract, Inspire, Mentor, and Support students is really based on students coming into a cohort, working closely with faculty and with one another, enabling and empowering them to be successful in their programs.
This model was very successful. We graduated over 200 students from the initial program with high graduation rates, almost double the graduation rate of their peers from the same communities in the college. But as we moved into 2016, we decided to expand not only the number of partners in our initial program but also the scope of the program itself
to include freshman students. So now we were looking at students who were challenged to balance their course load with work outside of campus, along with preparation in terms of prerequisites, study skills, time management, and, more generally, adapting to the campus environment. So these were the initial problems that we were looking at.
Clearly the resources included financial support; intellectual resources in the form of faculty and staff mentors; and physical resources, including access to computing tools, equipment, project-based experiences, and so on. Then, outside of the grant, we brought in different partners, industry partners in the case of engineering and computer science, to truly help these students. So the activities were wide and diverse, and through these activities we were able to demonstrate success in terms of mentoring, transfer student success, and student success with regard to research and academic achievement, reflected in graduation rates and so forth. So overall, the logic model
has short-term as well as long-term
outcomes. The long-term outcomes of
course are
transfer completion and postgraduate
success whether they go on to
take a job in industry or whether they
continue in graduate school.
In the short term, we were very focused, in a formative way, on how we could help these students adapt and learn by monitoring various short-term outcomes: attitudinal and behavioral changes; working with peer mentors to ensure that students received the support they needed; for transfer students, allowing them to be socialized into the Cal State Northridge environment; enhancing their research skills by participating in peer-mentored as well as faculty-mentored undergraduate research; and finally career preparation skills, because our external advisory board is made up of a number of members in leadership positions in industry who are able to come in and help these students understand what it takes to be successful in industry.
So as you can see, this logic model, which we call Bridging the Gap, is truly multi-fold, and it focuses on both short-term as well as long-term outcomes to improve student success for Latinx as well as underserved students.
So what are the goals of the program? There are six primary goals. The first goal is to enhance the academic achievement of Latinx as well as underserved students by engaging them in a variety of activities: hands-on activities, learning activities, cohort-based activities, and so on. The second objective is to enhance faculty and peer environments so that they are in a position to support underserved minorities in their programs and their classes.
The third objective, which goes back to the original grant, is to ensure transfer success. When students transfer to a four-year institution, oftentimes one of the barriers is gaps in articulation agreements, so we wanted not only to address those gaps but to do so in a very collaborative way, with faculty at the four-year college actually working hand in hand with faculty at the two-year college to jointly develop the courses. In other words, when the students show up at the university to undertake their junior and senior years, they're well prepared to be in the upper division. It's a very, very collaborative effort.
The
fourth objective is career success.
One of the goals for students in the College of Engineering and Computer Science, all of whom graduate from accredited programs, is to be prepared for professional practice. So a major goal of the AIMS squared program is to really prepare them, whether it's resume writing, mock interviews, connecting them with employers, or getting them internships: giving them the hands-on experience so that they're actually prepared for their careers as graduates.
The fifth objective is undergraduate research, and this we found to be a very high-impact practice. As a matter of fact, a good way to measure the success of this program is that it has now been institutionalized across the university. CSUN now has something called a CSUN symposium, where all majors and all disciplines participate and present their research. The model emanated from the work that we did in AIMS squared.
And of course the ultimate goal is the retention and graduation of our students, and none of this, ladies and gentlemen, would be possible without the collaboration that exists between the faculty and staff across multiple partner institutions to serve students and enable success. So
now that you've learned about the goals of this program, you should know that the program has been recognized with a number of national as well as international awards, including the White House Initiative for Educational Excellence for Hispanics and, more recently, the 2019 Example of Excelencia, Baccalaureate Degree Program of the Year, by Excelencia in Education. So
using the logic model to guide our early program development meant that we focused on six program objectives, which I've already talked to you about: transfer success, academic achievement, faculty and peer interaction, career preparation, research skills development, and bachelor's degree completion. Now if you dig a little deeper, underneath each of these program objectives we have both performance measures
and outcomes measures. So for example, if you look at transfer success, performance measure 1a is the percentage of project participants who successfully completed gateway courses. Across our partner institutions and across CSUN, in the College of Engineering and Computer Science, we look at student performance in typical gateway courses that allow them to advance to upper-division coursework. This then is a measure of transfer success.
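A measure like this reduces to a simple aggregate over participant records. As a rough sketch, assuming a hypothetical record layout (the field names and data here are illustrative, not the project's actual schema):

```python
# Sketch: computing a gateway-course completion rate (performance measure 1a).
# Field names and records are hypothetical illustrations, not AIMS2 data.

def gateway_completion_rate(participants):
    """Percent of participants who passed every required gateway course."""
    completed = sum(
        1 for p in participants
        if all(grade in {"A", "B", "C"} for grade in p["gateway_grades"])
    )
    return 100.0 * completed / len(participants) if participants else 0.0

participants = [
    {"id": 1, "gateway_grades": ["A", "B"]},  # completed both gateway courses
    {"id": 2, "gateway_grades": ["C", "D"]},  # did not pass one course
    {"id": 3, "gateway_grades": ["B", "C"]},  # completed
]
print(gateway_completion_rate(participants))
```

The same pattern extends to the other performance measures: a filter on participant records and a percentage against the cohort size.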
In the same way we look at academic achievement, which is based on the number of project participants and the improvement in the performance they demonstrate as they go through the program. Then we look at faculty and peer interaction, and here we're looking at both the change in enrollment of Hispanic and low-income students and the number of students who are retained, so we look at year-over-year retention as well as graduation statistics. Career
preparation is measured through measures of self-perception, attitudes, and skills, and of knowledge, skills, and behaviors related to career. And then finally, research skills development: I'll mention the instruments that we use; we look at self-perceptions, attitudes, and skills, because ultimately we are able to show a good correlation between success in these metrics and career success and graduate school success. And the final outcome, the long-term outcome if you will, is bachelor's degree completion.
As a matter of fact, one of the reasons why our program was selected by Excelencia last year is its tremendous degree completion rates. But number six would not be possible without one through five, as you'll find out in the next few minutes.
So we have a matrix of these objectives and the performance measures, and we use both institutional data and data from our direct measurements and our partners. This is all available on our website, as you saw in the previous slide: www.ecs.csun.edu/aims2.
So if you look at the assessment matrix, you'll find that for each of the objectives we've identified the parties that are responsible. For example: the percent of Hispanic and low-income students who participated in grant-supported services and programs who successfully completed gateway courses. Now this is measured at all of our institutions, at CSUN as well as our partner institutions, and the data we use is both institutional and program data. Then some data is very specific to CSUN, for example baccalaureate degree completion and the percent of students who persist at CSUN; these are all very specific to CSUN.
So all of these are done in a very, very tangible, measurable way. And let me just say something about assessment: there is a saying that you measure what you value, and you value what you measure, and this is something that we've really learned and really practice in the AIMS squared program. So
what is it about the AIMS squared program that makes it so successful? In one word, AIMS squared is all about community. When you talk to our students, they'll tell you that they belong to a small group of students in a cohort; they work very closely with a faculty mentor; and they have peer mentors to support them. Peer mentors are not there just for academic support; they're there to give them holistic and moral support, to get them motivated, to improve their study skills, to help them with time management, and generally to help them navigate classes. We found this to be particularly helpful for freshmen. Student tutors: again, we've identified a variety of courses where students need help, and we've hired tutors to help them in all of these courses.
Students are expected to meet weekly with their faculty mentors (some of them in fact have bi-weekly meetings), and there's a series of documented journals that keep track of student performance and how they're being advised. One of the important requirements in the AIMS squared program is maintaining minimum requirements for scholarship. We expect all students to complete a minimum of 24 units during the academic year. On average they take about 12 units a semester, but we give them the latitude to catch up during the summer, because some of them might be working and may take fewer than 12 units during the semester, as long as they complete 24 units during the academic year (which includes fall, spring, and summer) with a 2.0 or higher grade point average and passing all their courses with a C or better. So these are the requirements from the perspective of the students.
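Those scholarship rules translate directly into a checkable predicate. A minimal sketch, assuming a simple per-term record of units, grades, and GPA (the data layout and numbers are illustrative, not the program's actual tracking system):

```python
# Sketch: checking AIMS2-style minimum scholarship requirements.
# The record layout and values below are hypothetical illustrations.

PASSING = {"A", "B", "C"}  # a C or better counts as passing

def meets_requirements(year):
    """year maps term name -> (units completed, letter grades, term GPA)."""
    total_units = sum(units for units, _, _ in year.values())
    all_passed = all(g in PASSING for _, grades, _ in year.values() for g in grades)
    # Unit-weighted GPA across fall, spring, and summer
    gpa = sum(units * term_gpa for units, _, term_gpa in year.values()) / total_units
    return total_units >= 24 and all_passed and gpa >= 2.0

year = {
    "fall":   (12, ["A", "B", "C", "B"], 3.0),
    "spring": (9,  ["B", "C", "C"], 2.3),
    "summer": (3,  ["C"], 2.0),  # summer units count toward the 24
}
print(meets_requirements(year))
```

The key design point the transcript describes is that the 24-unit floor is checked over the whole academic year, fall through summer, so a light semester alone doesn't disqualify a working student.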
Some of the other things that happen with the program are the services that we provide. Through our external advisory board we have a number of workshops and industry panels: workshops on resume preparation and interviewing skills, and research presentations as part of our undergraduate research symposium. Every year there's an annual symposium where the students make a presentation in a very conference-like format, so you find students presenting posters. It is all run very professionally, and the students actually get the experience of presenting in a professional setting. Okay,
so now let's look at the implementation of the program from a reporting perspective. As many of you who are part of the HSI STEM grant program are aware, we all submit an annual performance report, or APR, and then periodically the Department of Education requires submissions of reports with project status updates. As a matter of fact, up until last October we were submitting monthly reports, focusing again on the objectives (in our case, our six objectives), demonstrating what was being accomplished during that month and what the future plans were. We did this diligently across all of our partner institutions, so as PI I would collect this information on a monthly basis and submit it to the Department of Education. What the APR
framework does is inform the decision making from a formative standpoint so that we are able to improve the program. For example, the whole program around peer mentoring came as a result of our formative project assessment. Then from an outcomes perspective, we are looking at summative outcomes, namely graduation rates, retention rates, etc. It also gives us a chance to look introspectively at how the program components work and which program components are the most effective.
So the implementation of the mixed methods approach really has three foundations. The first foundation is based on social science: a systematic, rigorous, and empirical investigation of human social interaction. We have students, both freshmen and transfer students; we have faculty; we have staff; we have multiple institutions, community colleges, and different disciplines within the college; and the interaction is very different depending on where you are in the program and who you're speaking with. So we've developed instruments and protocols to assess this on a regular basis. The second foundation is applying and adapting these behavioral and social science research standards to improve educational practice, because ultimately we want instructors and faculty in the future to be able to use these techniques to serve students effectively.
So our mixed design is a
quasi-experimental
and observational design protocol which
uses both quantitative and qualitative
data collection
and analysis methods to inform the
research project.
So we've already talked about the APR, and the APR consists of baseline data as well as actual performance measure data. There are two primary data sources. One is institutional data: typically in a university you have the office of institutional research, which collects information that projects like ours can use. The second is survey data: for example, in the undergraduate research component we use a student research self-assessment survey, and for the engineering majors we use a survey called EMS. So we not only have these qualitative survey data, we also have quantitative frequency data on project performance measures, and this can be reported both in terms of raw counts
as well as percentages. In addition to this, going back to the social and behavioral sciences approach, we have a number of roundtable discussions and focus groups at our monthly team meetings to inform our annual APR reporting. For example, we get a group of students together and do focused interviews, asking them about their experience with research, or about their experience with peer mentoring and tutoring. The team as a whole, the faculty and staff from the community colleges and from CSUN, meets on a monthly basis. We've been doing this from the inception of the grant, so if you go to the website you'll find every single meeting archived and cataloged, with the discussions available for you to review, going back to 2011.
So what do you do when things don't go as planned? This is a good question to think about as you're thinking about formative and summative evaluation. In the first grant we only had transfer students, and transfer students were coming in with a level of maturity where they knew what they wanted and they were focused; all we had to do was provide them with the support and the resources, and they were able to be very successful. When we started the second grant in 2016, we decided, very ambitiously, to add freshmen to the program. Now this came with its own set of challenges. While many of the freshmen were academically brilliant, we found that they were struggling to adapt to a new environment in college and to figure out how to be efficient in terms of time management and study skills. So the need for peer mentoring became very acute, and while we'd always talked about peer mentoring and had a few anecdotal experiences, we had rarely engaged in peer mentoring as a solid practice: matching mentees with mentors, having a structured way to document those conversations, and having a structured way for interventions with the help of the peer mentors.
The monitoring of the project participants is another critical aspect of our program. Every month when we meet, we literally go through every single group with each faculty member, finding out where the students are in terms of their academic, career, and personal development. Faculty can speak very candidly in this environment and tell us about challenges the students are facing, and we don't wait till the end of the semester to solve these; we immediately have a team in place of staff as well as peer mentors to address the challenges that they have.
Now the reason we were able to do this is that our early analytical work allowed us to switch gears in evaluation. Not only did we have the data, we were able to run statistical analyses to decide which data sources to use for further analysis as we pivoted towards summative evaluation. This also meant in some cases that we needed to move away from survey data to institutional data, particularly if the survey responses were low in number and we were not able to draw meaningful conclusions from them.
So as we transition to summative evaluation, the key point that Nathan and I would probably share with you is that it's very important to plan ahead and work with all of your project stakeholders, because you not only need to reframe the metrics but also to understand how this has shaped the experience for your students. And it's very important to be looking at the long term, and this means not just looking at year-over-year or semester-over-semester changes but at longitudinal patterns, trends, and outcomes from baseline to the end of the program.
Another aspect that I'd like to share with you is that while the APR has formatted fields that ask for specific data in specific formats, we have found that adding graphics really enables the faculty, and all the stakeholders, to visualize what is happening in a very practical, actionable way. And then finally, use this to do something new, because as you transition to summative evaluation and look towards the end of the grant, you can really design and execute a study that tells you what the impact of the program was. For example, one of our goals is to not only create change across campus but to create system-wide change, and on a broader level to create change for Latinx and underserved students around the country.
Plan ahead; I can't tell you this enough. From the start, I think it's very important that you anticipate what you'll want to know at the end of the program and then start building your summative evaluation into a formative design. To use a civil engineering analogy: the logic model is your foundation, and your foundation has to be strong; then you have to build the structure, with scaffolding, in order to arrive at a very successful summative evaluation. You can certainly use the program performance measures and objectives to guide what you want to know, so that you can look back at the end of the study. And every year, at every stage of the project, you need to have timelines to design and implement each phase of this evaluation work; that's the only way it's going to get done.
So let's look at how we do some of this. In our long-term scenario, when we look at the big picture, as we call it, Section 3 talks about institutional measures. Academic quality outcomes, for example, is one focus area; student support services and outcomes is another. If you look at academic quality outcomes, the question we are trying to answer is: has the enrollment of minority students increased? And here we report on this year over year as well as compared to the baseline. Similarly, on student support services outcomes, we look at continuation rates both year over year and over the life of the grant. So the switch here is from annualized, term-by-term or year-over-year reporting to multi-year trends: you start developing longitudinal trend lines, and then you need research stories that describe what these longitudinal patterns mean over time. This is a great graph that shows you how visuals can demonstrate the outcomes from the project.
In this particular set of visuals, you're looking at research and the impact that research has on our students. So for example, this graph right here shows the progression of our students in terms of wanting to do research. You see year one, year two, and year three: in year one, about 71 percent of students wanted to do research; from year one to year two it dropped to almost 50 percent. Why? Because we had a number of freshmen join in the second year of the grant, roughly double the number of transfer students, and they perhaps did not understand what research was all about. Then if you look at another graph right here, you have attributes like "research confirmed my interest in the field of study," "my resume has been enhanced by my research experience," "research prepared me for graduate school," and "research prepared me for a job"; again you can see year-over-year trends and how things are changing. And then finally we look at how many hours per week students spent talking with their faculty research mentor. Again we look at year-to-year trends, and we're able to address this in a way that supports both the current students and allows us to plan for future students coming into the program. So graphs, charts, and tables are really very, very helpful in exploring these long-term trends.
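Turning year-by-year rates into a longitudinal trend line is mostly a matter of fitting a slope through the annual values rather than comparing adjacent years. A minimal sketch with made-up rates (the numbers are illustrative, not AIMS squared data):

```python
# Sketch: fitting a longitudinal trend line to annual program rates.
# The yearly values below are made up for illustration.

def trend_slope(values):
    """Least-squares slope of values against year index 0, 1, 2, ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical cohort persistence rates over five grant years
rates = [78.0, 80.5, 83.0, 84.5, 86.0]
print(f"{trend_slope(rates):+.2f} points per year")
```

The slope is the multi-year trend the transcript describes; the "research story" is then the narrative explaining why that trend moved the way it did.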
The last thing that I want to share with you is: always be looking to do something new. Take a look at the data you've already collected. Maybe you just want to look at graduation rates and retention rates, but perhaps there is something in the data that you didn't anticipate. For example, use the textual data from a qualitative survey or from focus groups conducted during the formative evaluation so that you can in turn help future students become better learners. Again, you can return to the institutional survey data to plan a new study, because it opens up new avenues for exploration. You can design a study of program impact; this could be part of your competitive preference priority from the funding agency, or it may be of interest to your stakeholders. For example, we have an external advisory committee, made up of industry leaders, which meets annually to look at this independently and tell us how well we are doing and the areas where we need to improve. So this year during our advisory committee meeting, one of our advisory committee members wanted to know the gender distribution. We had the distribution and student performance for Latinx students, but she was more interested in the gender distribution of the students, to see if any practices could be implemented particularly to help women in engineering. These are the kinds of insights that you can get by looking at the data with a new pair of eyes.
So I want to close with a couple of examples that have really worked well for our program. Clearly, from the remarks that I've made so far, you understand that research and mentoring, we believe, are big keys to our success. If you look at the research program on the right, you see a graph that shows the growth in the number of projects (the bar in blue) and the growth in the number of students (the bar in orange). We actually started this even before the new grant in 2016. You can see that in the first year we only had about 10 projects, and by the third year that almost doubled to 16 or 17 projects; the number of students started out at about 30 and is now almost 50. And these students, as you can see from the visuals, are expected to present a poster, a poster that is formatted according to professional conference guidelines. We literally put them on stage, give them five minutes to make a research presentation and three minutes for moderated Q&A, just like a professional conference, and the students prepare for this with their faculty mentors.
At the bottom you see the 2019 cohort of research students with the president, Dr. Diane Harrison, who loves to visit this program and support the students. You can see how impactful and impressive this particular program is. What is even more important to me is that the students invite their families, their parents, their grandparents, in some cases their spouses, their siblings, their children, to come to this event, because they're so proud to demonstrate what they've accomplished as a result of studying in the College of Engineering and Computer Science. Oftentimes it's the first time that anybody in their family has known or seen the type of work that they've done. If you want to look at numbers, you can easily see right here that the number of faculty involved over the last five years is 57, and these students have conducted almost 37,000 hours of paid research. Incidentally, the students are paid $15 an hour for their work, so we invest a significant amount of time and money in grooming these students and helping them be more successful.
Here's another data point, and this was one of the reasons why AIMS squared was selected as the 2019 Example of Excelencia: increased persistence and graduation. We had a three-year transfer graduation rate of around 70 percent, a cohort persistence rate of 86 percent, increased completion of gateway courses, and triple the number of graduates in five years, from 57 to 171. And the last one is really very enabling to me, because the program fostered a positive career outlook: 100 percent of our cohort students felt ready and prepared to go out and take on a professional career. A really magnificent outcome, and I am very, very humbled by, and deeply proud of, the accomplishments of the faculty and staff in enabling this to happen.
Now a lot of things have changed in the last six months. On the 12th of March we were all in our classrooms, our labs, and our lectures; on the 19th of March we pivoted to going completely virtual as a result of COVID-19. But although the times have changed, the AIMS squared program still remains very strong. We continue to meet via Zoom; this is our monthly faculty meeting. We continue to meet with our research team, again on a monthly basis. This summer we actually ran a virtual online research program with 10 students and four projects, very successfully, and we plan to expand on that in the coming fall semester to include community college students as well. So none of the program services have dropped off: we still continue to do peer mentoring and tutoring; we in fact recruited cohort five, our last and final cohort; and we have programs in place to help the faculty guide these students. We hope that we will continue along this path to make systemic change. So in closing, for further insight, here are some questions to guide formative and summative evaluation.
I think it's really important for you to be measuring what you value and valuing what you measure; that's a big message. So do interim and immediate formative evaluation: this gives you an idea of which program components need immediate and intermediate-term attention, and based on that you're able to adapt your program to serve the students. Contextualize the information: what have you learned from the project performance data, besides the fact that you're submitting the annual performance report? Look at the data deeply to see what the students are telling you, what the data is telling you, and what the discussion is telling you, and use that to improve the program. And then use the early intervention evaluation as an alert to update your approach; in other words, what do we know from the descriptive or inferential statistical analysis about both the impact of the program and its implementation, including the evaluation design and methods? So I think this is a great
program: it's a great opportunity for all of you to move from formative to summative just by doing the things that you're doing, but with a laser focus on student success. So I want to thank you again for listening to this webinar. If you have any questions, please feel free to visit our website, which is up on the screen: www.ecs.csun.edu/aims2. My email address is s.ramesh@csun.edu. Again, on behalf of our project team, on behalf of our evaluator Professor Nathan Durdella, and myself, I thank you very much for attending and hope you enjoyed this webinar. I look forward to hearing from you. Thank you.
