- [Instructor] For the webinar today
here are some objectives.
Whether you're a pro or a
newcomer to federal awards,
you have to submit performance
reports a few times a year.
The measures were outlined
in the solicitation
so collecting and managing
the work becomes real once
the budget's approved.
We often hear questions from
grantees at the PMT helpdesk
such as, "How do I define that?"
or "Where do I report on this?"
The objectives for this
webinar are to start with
making sure we're all on the same page
with some common terms.
Then look at partnerships
and TTA question banks
and shared measures.
And then end with questions
and your feedback.
So before we get started,
let's do a Webex chat
as a quick introduction.
So, to get us started again
can you please share in the chat box
the name of your organization
and the name of your award.
We'll give everyone a minute
to type in their answers,
and then you can read your responses
from your virtual peers.
Oh, wow.
Well, welcome everyone.
We've got quite a diversity
of folks coming in
from all over the country.
Quite a diversity of
programs as well.
It's definitely good to
see you all here today,
and I'm glad you're able
to find the chat pod.
Excellent!
Please keep typing in your answers
and I'm going to move forward.
So let's start at the top.
We often use the phrase
"Performance Management"
or "Performance Measure"
or "The Measures."
And in the next few slides
we'll create some common ground
for this webinar anyway
on a few key terms.
As part of your grant application
you were asked to create a
plan for Performance Management
for collecting the required data points.
So, mostly you're probably familiar
with the performance measures as the tool
or the place that you log into
and report
on
the different data points.
And that's different than the strategy
of how you gather data
and assess your goals.
So officially, according to OJP,
"performance measurement"
is the regular collection
of data and indicators
to assess whether the correct processes
are being performed
and desired results are being achieved.
So this is higher-level
terminology, and I mean
to draw attention
to ongoing analysis
that continues this process improvement.
And next we're going to review
a bit about performance measures,
which I kind of like to think of
as a guidepost or mile markers
to measure progress toward goals.
So next, let's look at a logic model.
The indicators or measures,
as we often refer to them,
are something we talk about a lot
and are part of the
regular collection of data.
What are measures?
These are the things you report.
And again, as we often hear from grantees,
they're sometimes not sure
exactly where in the PMT
to highlight
different types of work.
So for example,
how the number of TTA activities
reported shows any impact.
The items on this slide,
inputs, activities, outputs, outcomes,
are part of a logic model to think about
for performance measures.
Inputs
being the program resources,
many of which have been
used to create baselines.
Examples include grant funding,
cross-sector partnerships,
and community participation.
Activities are the actions
that convert your inputs to outputs
that eventually result
in measurable progress
toward a program's goal.
These can include things like
trainings, meetings,
partnership development,
data analysis, and strategic planning.
Outputs are the countable
products or services.
For example,
how many individuals
or organizations you provided
technical assistance to
via in person
calls or through emails.
For trainings, registration systems
or sign-in sheets often help
you to collect this data.
But you should be reporting
the number of individuals
who actually participated
or attended the training.
Outcomes are a bit tricky
because they speak to the
longer term program goals.
Outcomes demonstrate the
impact of the training
and technical assistance on
partnerships for the program.
So what is Performance Management
and how do you leverage
OVC performance data
and make reporting easier
and use the data
to paint that data-driven
picture of your work?
Well, here's the plan.
Here's what we hope grantees have embraced
or will grow into: Performance Management,
which is systematically using
the items on the slide here
to improve the results of programs
and the effectiveness and efficiency
of organizational operations.
So why talk about this now
when most grantees are well into the work?
Part of the reason is to set common ground
and to highlight a few points
such as analysis, strategic
planning, and reporting.
The plan is important
because many of you have
multiple OJP awards.
And, possibly there are
folks on the webinar today
doing the data collection,
and then there are also those of you
who are doing the reporting,
and those may be two separate
roles in your program.
These plans are not static,
but they should be
considered living documents.
They should be a touchstone
throughout the course of your program.
But what does this all
mean in the real world?
So, Performance Management
Plans often include
identifying a primary and
backup point of contact,
or POC, as we like to say,
responsible for collection,
documenting the processes, data entry,
and training others at your organization.
The main contact is hopefully
someone who knows your internal systems
and when they should be updated,
and someone who can create consistency
for your assigned performance measures.
And in the "create consistency"
bullet, for example,
we're talking about documenting processes,
creating SOPs, or standard
operating procedures,
so that if there's staff turnover
or someone wins the lottery
and leaves the program,
it's easier to transition the work
and the responsibilities to someone else.
And a new person, for example,
has the OVC helpdesk point
of contact information.
Currently, as defined
in the PMT system training
that many of you took last year,
there are two different core reports
that you're responsible for.
First, is the quarterly
performance measures data report,
and that collects information
on your direct activities.
The second report is your semiannual,
where you will respond to
additional narrative questions,
your qualitative, in addition
to your quantitative data.
The report is eventually
made into a PDF document
that the PMT automatically generates
that brings together two
quarters of data entry.
So within each report
you see question banks
and shared measures,
which were assigned to you by OVC,
depending on the goals of the award.
So, let's take a second to look at
question banks and shared
measures, just generally.
So depending on which question
banks OVC has assigned,
you may also report
on a set of corresponding shared measures.
OVC listed out all the
performance measures
and organized them so that grantees
only answer each measure once.
Each measure connects
the inputs and activities
to the outputs and outcomes,
which help tell your program's story,
and ultimately help determine
whether you are successful.
Looking at the full list
of your assigned question
banks and shared measures
is a good place to start.
For example, while waiting
for budget approval
it's a good place to begin
to prep the qualitative
and quantitative answers,
in particular for the narrative section
of the semiannual report.
Under the Need Help tab
currently in the PMT,
you can find a map that
shows each award name
and the assigned measures.
So there's a lot to discuss
on the slide, I know.
So let's just focus on
two particular question banks
and shared measures today.
So as promised, we're going to
look at partnerships, first.
So, this slide may look familiar.
It's question bank V,
collaborative partnerships,
something you all report on currently.
This is a measure where actions
convert baselines to outputs.
Most often, collaborative
partnerships that last
and are productive are
not created overnight.
They don't just drop into your lap.
They take planning
and often funding.
These measures help OVC learn
on whom their funding is focused
in relation to the larger collaboratives.
So for example,
multidisciplinary teams or MDTs.
On this slide, you will see
what you report quarterly
and what you'll have a
chance to kind of expand on
in the semiannual narrative.
As part of Performance Management planning
ask yourself, "What's the partnership goal
for the next 6 months?"
This kind of gets to
those story questions.
For example, is the goal
to increase partnerships?
To increase the number of partners
using evidence-based practices?
Or increase the diversity
of your partnership?
In other words,
increase the diversity
of who's at the table.
In terms of evidence-based practices
organizations will need to self-identify
if they use evidence-based programs,
and this is an important
point not to overlook
as we consider more
data-driven approaches.
This question allows OVC
to demonstrate the value
and specific benefits of the
program to government agencies,
and the victim services
field, the general public,
and other stakeholders who
often come to OVC for data.
So this is the question bank,
but what does that
really look like for you?
Previously, we defined
performance measurement
as information
or data
showing achievement of
desired goals or results.
Performance measures being the parameters
against which progress
toward goals is assessed.
We get a lot of questions at
the helpdesk about things like
"Where do I report on this
partner organization?"
So I want to highlight
questions two and three
because of the different
levels of formality
and the types of
partnerships for the award.
For many of you, question
two may be answered
with a one or a zero if you only have one
or maybe no grantees for
the life of the award,
and that's okay.
As a grantee myself,
I often created a
Performance Management Plan
that asked me to document measures monthly
in my agency's tracking sheets.
Things often happened so fast
and partners are out there
and doing the work daily,
so I found I needed a Word document
where I could kind of summarize
the monthly activities,
particularly related to
partnership organizations.
And I just did this in a
couple of bullet points,
two to three sentences.
I would ask myself "Why and how?"
so I could remember it while
it was fresh in my head.
This also lent itself to that
consistency bullet point
that we spoke about before,
making sure that I was
applying the same definitions
for partners from quarter to quarter.
And this may also help you
in working with subgrantees
to set a framework
for why performance
measures are important.
Then quarterly, you can step
back and look at the
performance measures, as
those quarterly numbers
ultimately flow into
your semiannual report.
And consider documenting activities then
to further provide context
for your program objectives.
For example, we said before,
what is the goal of your partnerships
for the next 6 months?
And then be able to say how we
increased the numbers
and why we were able to do that.
For example, we started
doing this new activity
to get our partners more engaged with us.
Or we ran into these challenges
and some partners had to withdraw, why?
Capturing these stories makes
shared measure questions
later in the report easier to answer.
That's related to the shared measure.
Before we go on to the shared measure,
we have one more topic to
cover in this question bank.
So evidence-based practices.
Question four asks about those practices.
They're self-reported by your program
and your program partners.
If you were to bring this
up with a subgrantee,
for example, how do you
know what is evidence-based?
What programs are out there
and what programs are actually being used?
How do you create a common
definition for your program?
There is, as many of you know,
much research that goes
into the government
designating something
as evidence-based
and not just a promising practice.
So I know that reporting on
this one can feel tricky.
So I'd like to open it up to the group
with a quick poll question.
I will post that here
and bear with me one moment
as I get the poll
question up on the screen.
Okay, so you can see the
poll question on the screen,
asking you if you've ever
researched evidence-based programs
or practices on CrimeSolutions.gov.
That is a popular site to use
and one that's often
referenced in solicitations.
If you have,
I was wondering if you could
share which program or practice
you've used in the chat pod
with your virtual peers.
Give each other an idea of
what evidence-based programs
or practices might be
popular with our OVC grantees.
All right.
So it looks like the
majority of folks
have not really ever researched this
on CrimeSolutions.gov,
but we can always talk about that more
in the feedback session today.
Or I would encourage you
to give that one a look.
So, moving on to
our shared measure B.
How does the information in
question bank V differ from
shared measure B partnerships?
This particular measure up on the screen
allows you to provide feedback
and assess if partners are
substantively contributing
to the implementation of a project
or development of deliverables.
For those of you who have ever completed
a community assessment or needs analysis,
did you ever use this list of partners
to shape who you asked
for a formal agreement?
Could this be partially a reported outcome
of that particular assessment?
This is a great list of questions
to consider things such as
are you noticing a change in time,
over time, in the quality
of partner engagements?
Are there groups with which
you don't have a strong relationship
and you could implement strategies
to improve that relationship?
I've used this list periodically
to look at what groups are
maybe informally involved
(there's no MOU
or formalized collaboration agreement)
that we really should
have a more formalized agreement with.
It's also a list that could help
you with a gap analysis.
So, here on this slide,
we have the table that's
pretty comprehensive.
For each category, you'll
respond to the statement,
"This partner is actively
involved in the program,"
by agreeing or disagreeing on a scale.
You may not have partners
in all those categories,
and that's fine.
You can indicate NA for those categories.
The definition, of course,
for active partnerships
will be up to your program.
But again, as I mentioned before,
it's often good to use this
table for a gap analysis.
Perhaps you're working with
several different types of
law enforcement agencies;
are there others listed here
that you're not currently engaging with
that might be able to support
your program in some way?
Also we hear at the helpdesk
about the great work being done
by collaborative partnerships,
and that's coupled with a concern
of where to report that work.
I wish I could give you a
simple and clear answer,
and I would ask that you keep an open mind
and think strategically
about who is involved,
how you're rating them,
and again, considering
that semiannual report.
So, also think strategically
when you're talking about partnerships
with our next question
bank and shared measure.
Technical assistance,
or TA,
or TTA.
We seem to have a variety
of terms and acronyms
that combine two distinct activities.
So, before we move forward,
I just want to stop for a second
and acknowledge the
definitions that OVC has
in the Performance Management User Guide,
and these can be a touchstone
or kind of a beginning
way to map your definition
for technical assistance,
and create that consistency
from quarter to quarter.
So in terms of definitions,
for example, using what's
on the screen to say,
"Based on the mission
of our organization and the
definitions in the user guide,
our program defines TA as
the following,"
and list out what that could be.
Then in the semiannual narrative
or in future grant applications,
quote the definition and state,
"Therefore, based on that
definition in quarter one,
we saw technical assistance email requests
for information after training
or after an outreach event
as the main reasons that our
numbers were increasing."
Training of course, on the other hand,
aims to increase knowledge
and build skills,
using a specific curriculum.
Training is really more time bound.
Technical assistance, of course,
is going to be more that
delivery of knowledge
to a group of individuals
to help address a specific inquiry, need,
or emerging issue.
So I highlight this because
we often hear at the helpdesk,
"Is this training or
technical assistance?"
or "When does training
become technical assistance?"
And that is definitely
something that we're here
to help you figure out
because that definition
is going to be different
based on the program.
So, often the people you train
will continue to contact your agency
and report back on what they've learned
and how they're applying
that in the community.
And if that's the case
and you provide additional guidance
or additional resources on
whatever they spoke to you about,
your training has now crossed the line
into technical assistance.
So I'm kind of curious
how everyone on the call works with TA.
I have got a poll question here
that
is on the screen,
and if you just bear with me one moment,
I am going to get this poll question open
for you to actually answer.
So I'll give this one a couple minutes.
We kind of want to know,
on a scale of one to five,
how well does your organization
collect data on TA?
Ranging from
folks who are going to
give us feedback at the end
on their best practices,
all the way down to,
I think there's TA out there,
but I'm not quite sure what it is.
All right, well thank
you for those responses.
It's kind of interesting.
We've got a bunch of different responses
kind of all over the board.
So, definitely feel free
to put your questions
and comments about
this into the chat pod,
and we'll hopefully have time
to discuss this a bit more
toward the end.
All right, so technical assistance,
question bank II.
All right, time for my soapbox lecture.
So TA is a measure and an
output often underestimated,
but to me, it highlights
that opportunity again
for continuous quality improvement
because sometimes it really can allow you
to ask yourself some tough questions.
So for example,
in creating your
Performance Management Plan,
is there a definition
for what completed or
closed TA actually means?
What does it look like?
I mean, as a grantee myself,
I found that to be a sticking point.
TA can also be a jumping off point
for strategic planning with partners
to look for potential
avenues toward sustainability
of other funding sources
based on trends in TA
that maybe you didn't
even know were out there.
There can often be
surprises from the community
that are shown through
different TA inquiries.
This question bank gives us an opportunity
to really listen to the community
involved with your program.
In fact, it might even
start a data-driven review
on who and where the
requests are coming from.
Plus, how many were actually completed?
And being able to look back
at those open TA requests
and say, well, why are they still there?
Are they open because
resources don't exist?
Maybe there's a lack of a
partnership or a partner
to refer someone to.
Or maybe just your staff isn't quite sure
how to best answer that question.
TA is often tied to training
by many organizations.
So for example, did you train,
or did a subgrantee
train, community members
who then came back to
you again after training
to ask for that help?
And did you provide any referrals?
You know what?
Maybe you changed your case intake form
based on the way TA is coming in?
How can you document for yourself
those lessons learned each quarter?
What stories are you telling yourself
and the organization?
How are you starting to
create that qualitative data
in a tracking system, quarterly?
For example, why there
was a fourfold increase
in TA requests.
How can increases in TA
maybe help you plan for staffing needs?
Maybe you've got interns
that need something to do in the summer.
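As a rough illustration of that kind of data-driven review, a few lines of Python could flag a quarter where TA requests jumped. This is only a sketch; the quarterly counts below are made-up example data, not real PMT figures.

```python
# Made-up quarterly TA request counts for illustration only.
ta_by_quarter = {"Q1": 12, "Q2": 11, "Q3": 48, "Q4": 50}

def flag_jumps(counts, factor=2.0):
    """Return quarters whose count grew by at least `factor`
    compared with the immediately preceding quarter."""
    quarters = list(counts)
    return [q for prev, q in zip(quarters, quarters[1:])
            if counts[prev] and counts[q] / counts[prev] >= factor]

print(flag_jumps(ta_by_quarter))  # ['Q3'], roughly a fourfold jump over Q2
```

A flagged quarter is exactly the kind of thing to explain in the semiannual narrative: what drove the jump, and whether staffing should change because of it.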
So pretty soon you'll enter
your April through June data,
and then you'll
answer your narrative questions
in your
report that's due in July.
So, if this is your TA data on the screen,
how can you use a narrative to talk about
the increase in completed requests?
So OVC loves to know that stuff,
and I'm sure if you do follow up,
there are going to be long-term
impacts from the TA provided.
So following up on these requests
to find those long term impacts
can definitely take time.
So to help complete requests,
a colleague of mine used
to always have a link
in their email signature block that says,
"Have you ever received TA from me?
Tell me how I'm doing."
And that honestly just
links to SurveyMonkey,
a free survey tool that's out there.
You can use things like JotForm and Wufoo.
These are all free applications,
and they're also a place
that can give you a reminder
to maybe send out a TA follow-up survey
after 6 months
to ask if what was provided
helped in any significant way
or made any long-term difference.
So I've personally often used
interns when I was a grantee
to make those follow-up calls,
to export the surveys from SurveyMonkey,
and to kind of pull some
of that data together.
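If you track closed TA requests in a simple list or spreadsheet export, a short script can tell you who is due that 6-month follow-up survey. This is a minimal sketch; the field names and records are assumptions for illustration, not an actual PMT or SurveyMonkey export format.

```python
from datetime import date, timedelta

# Hypothetical TA tracking records; field names are illustrative.
ta_requests = [
    {"requester": "County Victim Services",
     "closed_on": date(2023, 1, 10), "followed_up": False},
    {"requester": "Tribal Coalition",
     "closed_on": date(2023, 6, 2), "followed_up": False},
    {"requester": "School District",
     "closed_on": date(2023, 2, 20), "followed_up": True},
]

def due_for_followup(records, today, months=6):
    """List requesters whose TA closed at least `months` ago
    and who haven't yet received a follow-up survey."""
    cutoff = today - timedelta(days=months * 30)  # approximate months
    return [r["requester"] for r in records
            if not r["followed_up"] and r["closed_on"] <= cutoff]

print(due_for_followup(ta_requests, today=date(2023, 8, 1)))
# ['County Victim Services']
```

Something this simple, run once a month, is one way to make sure follow-up surveys actually go out instead of relying on memory.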
So, as with collaborative partnerships,
this TA question bank has
a related shared measure.
So we have this on the screen again,
and many of you may be
familiar with this view.
This measure is also
reported by your subgrantees.
So for me, this is a good example
of why you need that
consistent definition and POC
if you're working on reporting.
Similar to partnerships,
do you want or need
to increase the number
of TA materials created
to reach a different
portion of your community?
This is another opportunity
to tell yourself,
and then eventually OVC, a
story from the past quarter
in the semiannual report.
So maybe consider instead of
rehashing program objectives
and the quantitative data
that you've already posted quarterly,
use the narrative to relate
why the materials, for example,
were translated into two languages.
Or how are you now defining
that term,
"other customized TA
toolkits and materials."
Does that relate to a social media plan
maybe you haven't thought of before?
Consider adding a reference like,
"The TA materials shared
through shared measure A
were created and called..."
whatever the names are,
"and we created these because..."
Or due to a trend in
this type of TA request,
we improved our intake form
or improved materials,
which will be disseminated
at teacher trainings
or by our partner organizations
during their TA efforts.
In other words, lead into
talking about your subgrantee
or your collaborative
partner's TA efforts.
So document those stories
and highlight them in the semiannual.
Technical assistance offers a place
where you can create stories
you can use when talking up
your program in the community
or pitching to a future funder.
For example,
you provide
the national level data,
you provide your local level data.
You want to make that real for them.
Technical assistance can
be a place to do that.
You can say, for example,
this one technical assistance
request came in where...
And that highlights a need for...
I'm sure everyone on this
call can fill in those blanks
with their different stories.
And then follow up with the surveys.
Are you using surveys to
collect feedback on TA?
What are the results you're seeing?
So this question bank and shared measure
are an example of what was noted previous-
on a previous slide-
finding definition
consistency by creating a map.
So for example, based on
our organization's mission,
in the definition in the OVC User Guide,
our program defines TA as...
In the semiannual narrative, say,
"In shared measure question one,
we trained this many people
from these types of agencies,
and after the training,
this many people came back
to ask for more guidance."
One example is to provide
one really good story.
I know you've all got them.
So I told you a little of what I've done
and what some of my colleagues have done.
So now again, I'm curious to hear
from everyone on the call today
with another poll question.
So our poll question is up
on the screen for you to read
and I will open the poll now.
So I want to know who
actually uses "post" surveys.
And when we say "post," that
could mean post-training.
Hopefully, everyone out
there is doing that.
It could also mean post-TA.
Or, are you actually doing both?
Okay.
Quite a few did not answer.
It looks like post-training
was definitely our most
popular way to follow up,
which I think is probably pretty accurate
for most organizations.
All right, so moving on here.
The performance measures
we talked about today
are not changing anytime soon.
And again, as we know in July
there's a semiannual report due.
So ask yourself,
how can I maximize these
performance measures
to
show my work?
How can you tell OVC a
story of your community,
your example pulled from
TA and partnerships?
As we have heard,
there are activities
related to your grant work
that sometimes don't feel
like they fit in a performance measure.
You're not sure how to connect
those outputs to outcomes.
The semiannual again is a
great place to step back
and look at your two
quarters worth of data
and ask yourself, is this reasonable?
And maybe, kind of, start
to come up with some of your
qualitative data from there.
And as we've covered, again,
this report allows you to take advantage
of the narrative sections,
further explain the data
from previous quarters,
use examples, and focus on increases
in referrals for victims
so they can get the services they need.
Review your report to see how
you accomplished those goals.
For example, we increased partnerships
over the previous 6 months.
Or maybe we didn't increase
as much as we wanted to,
and we're still working on that.
Once you have
this very manageable data
set in place, what's next?
Well let's consider some options.
So on the screen here,
we have a couple of different
options to consider.
So after you've tracked your inputs,
reported your baselines,
and started actively engaging
in performance measurement,
I know a lot of people have ideas
for showcasing programs
to increase and solidify partnerships.
And I'd love to hear
those in the chat pod,
but I also have some others
up on the screen here
for you to consider,
such as downloading the
quarterly Excel reports
and aggregating total TA,
or providing the feedback on training
and sharing that with board members,
or within the community.
For example, maybe creating a snapshot,
one page, or a fact sheet
that focuses just on the partnerships,
or just on the
training and TA question banks
and shared measures.
Fill that in with your narratives.
Maybe build an Excel-based dashboard
to share at staff meetings internally.
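Aggregating those quarterly downloads doesn't require anything fancy. As a minimal sketch, here quarterly exports are treated as CSV text and summed per measure; the column names and measures are illustrative assumptions, not the real PMT export schema.

```python
import csv
import io

# Hypothetical quarterly exports saved as CSV text (illustrative only).
q1 = "measure,count\nTA requests received,40\nTA requests completed,31\n"
q2 = "measure,count\nTA requests received,58\nTA requests completed,44\n"

def aggregate(*csv_texts):
    """Sum each measure's count across any number of quarterly exports."""
    totals = {}
    for text in csv_texts:
        for row in csv.DictReader(io.StringIO(text)):
            totals[row["measure"]] = totals.get(row["measure"], 0) + int(row["count"])
    return totals

print(aggregate(q1, q2))
# {'TA requests received': 98, 'TA requests completed': 75}
```

The same totals could feed a one-page fact sheet or an internal dashboard without re-keying any numbers.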
So, have you ever been invited
to speak at your local police
department at a roll call,
or at a county board meeting?
These are all potential
avenues to pitch your program
in the hope of attracting
future funders and partners.
And, kind of like in the movie
and documentary field,
they'll often create what's called a pitch
to do the very same thing.
So what is a good pitch?
Some key aspects of it are
telling your story.
Pull a story from your TA,
paring it down to the essentials,
outline your strategies,
and how they're supported by real numbers.
Full program evaluations take time,
and you have updated
and specific PMT data, quarterly,
to show percentages and changes.
And, a pitch could be short,
it could be 3 minutes,
it could be 7 minutes.
And it's a short time,
but it's a good time to talk
about yourself, your program,
the work that it's done.
Give the local community-level data
to bring home what those
national numbers actually mean.
Well, hopefully I've given you
some new things to think about,
and my team and I are
definitely here to help
with
those options.
So, how can we help?
I've mentioned our wonderful helpdesk team
in this presentation a couple of times.
So, now I'd like to bring
on my colleague, Louis,
to tell you a little more
about how we can help
and wrap up the presentation.
- [Louis] Appreciate that.
Thank you, Erin, I really appreciate that.
We think it's important
for you all to know
that you have the creativity to find ways
to use the OVC performance measures
to be more data driven
and to tell your unique story.
The OVC Performance Management
Team is here to help
with a suite of resources.
Contact your helpdesk to receive
assistance with data entry
or any specific questions that
you have about your measures.
We can accommodate TA calls
outside normal business hours
for our grantees
in various time zones.
If you're experiencing
any staff turnover,
particularly with your
data collection POCs,
then ask about scheduling
a "Welcome to OVC
Performance Management" session.
I can help answer any of these questions
or even schedule an individual
technical assistance session
via a conference call or Webex.
And we can do a screen share
to review your account.
Sessions may also involve a system expert
or a data analyst, if necessary.
For example, if you were looking
to create a program pitch
for volunteers to use in your community,
or to create consistency on messaging,
we can be your sounding board.
Please contact the helpdesk
to schedule your call.
And with that, I will post
our contact information
on the next slide.
On the screen now,
is the Performance Management
Helpdesk contact information.
We will send this information out
and points from the last
slide on how we can help
in a thank you email after this webinar
to all of our attendees today.
We hope to have this webinar
posted in the near future.
We want to thank you so much for your time
and attention today,
and I would now like to open the floor up
to questions and feedback.
Please unmute yourself to speak
or type in your question in the chat pod.
