Hello everyone. I'm Gina Gomez with the
SNAP-Ed Connection. Welcome to part 3
of our series on adapting SNAP-Ed to the COVID-19 era. Today is our final session of the
series. So for those of you who've joined
us for the first 2, welcome back. Like
usual, I'm gonna get started by going over
a couple of quick housekeeping items. Yes, these are being recorded, as the last two were as well. We will have them available
on the SNAP-Ed Connection website shortly, as soon as we're able.
Copies of the speaker
slides are available to download now from
this webinar. If you find the Handouts tab, there are 4 files in there, 1 for each of the speaker slide sets, and then also a certificate of completion, if you would like to download that. These
webinars have been pre-approved for 1.5 CEU hours from the Commission on Dietetic Registration.
So, if that's of interest to you and something
that you're tracking, you can download that
certificate during the webinar. Because
it's available for download, we are no longer
doing an automatic e-mail after the session
to send you copies of the slides and the certificate.
So, if you want those, download them now.
Today we have speakers from 3
different organizations within the SNAP-Ed
community, and we're going to be doing Q&A
at the very end. So, if you have questions
for any of the speakers throughout, please
take a minute now to look for the Questions panel in your webinar menu. Once you open
that up, you'll have a text field where you
can send your question, and then also the
drop-down to select the recipient. In the recipient box,
please select "organizers," regardless of who the question is for: organizers. And then
just write your question. It does help us
a little bit to keep things organized. If
you could, in the body of your question, if
it's for a specific speaker or organization,
mentioned that. Just say "a question for Missouri"
or "question about PEARS", that way it
will help us to queue up the questions a little
bit more efficiently. So that would be helpful.
Let's see. Also, if you need help or technical assistance (I saw a couple of questions already coming in), just use that method, and we'll work with you one-on-one
to help you troubleshoot any problems that
you might be having. Okay, a quick, series
overview. So, we put together these webinars
as a joint project between the SNAP-Ed Connection and the SNAP-Ed Toolkit, inspired by many of the conversations and questions that have come up during ASNNA open calls, questions we've received at the SNAP-Ed Connection, and questions that have come in to the regional and national staff at the Food and Nutrition Service.
We focused on bringing together
organizations that have had perspectives based
on experience working with remote platforms.
So, folks that have been doing remote delivery
of their initiatives pre-COVID, so we can
learn from their expertise. And then also, bringing in folks that are adapting in-person curricula to these times, so that we can learn from their experiences in making
sometimes dramatic changes, to the way that they're delivering their services across the country.
Mostly, we wanted to make a space where people could share, brainstorm, and talk through their challenges and successes. So, we put the content of this series into 3 separate
sections. The first 2 have already
taken place, and those covered adapting SNAP-Ed
to remote delivery methods. So, we won't
be really focusing on that today. And then,
last week, we talked about measuring and evaluating
impact. So, those two topics we've already
done, if you've joined us, great. If you didn't,
recorded sessions will be available for
both of those shortly. Today we're really
going to zoom in on collecting and reporting data.
So, we will hear stories from people
that are using tested methods, emerging ideas,
and we'll go from there. There are three different
presenters today, from 3 separate implementing
agencies who have some great experiences to
share in this area. We're gonna start by hearing
from Doctor Kimberly Keller, the State Director
of the Missouri Family Nutrition Education
Programs. We'll hear from Aaron Schroder, Associate
Director and Technology Officer at the Office
of Educational Innovation and Evaluation, Kansas State University. And finally, we'll hear from Doctor Lila Gutuskey,
evaluation specialist and director of public
health programs at the Michigan Fitness Foundation. So a lot of really good information today,
get ready for that. So we're going to transition
to the next slide deck and we'll hear from
Kimberly Keller. And while that is happening,
I'll do a quick introduction.
Doctor Kimberly Keller has
been working in areas of program evaluation
for almost two decades, including leading
evaluation efforts for the University of Missouri
Extension, SNAP-Ed and EFNEP program since 2006. She's also served as a member of the
Leadership Team for ASNNA from 2016 to 2018 and was co-chair, excuse me, of
the evaluation committee from 2016 to 2019. Doctor Keller is currently co-PI of the University of Missouri Extension statewide SNAP-Ed program. I will let you take it away!
[KK] Great! Thank you, Gina, for your introduction, and
thank you to everyone joining us today. I'm
going to talk about the SNAP-Ed Data Collection
and Reporting Procedures we use at the University
of Missouri Extension and modifications we've
made because of COVID-19. And... there we go
First, I'm going to talk about our
program, in general, to give you some context
on the data collection infrastructure we had in place. SNAP-Ed is a statewide program in Missouri, and there are 4 organizations associated with SNAP-Ed in Missouri: the Missouri Department of Social Services is our state agency, University of Missouri Extension is our state implementing agency, and we subcontract
with Operation Food Search. And through that
partnership, we're able to extend the reach
of Cooking Matters throughout more areas of the state.
MOCAN, or the Missouri Council for Activity and Nutrition, is our state's partnership organization and serves as our state's SNAC. And you see the map of Missouri here on the right. It's largely
a rural state with urban areas on the Eastern
and Western borders in Kansas City and St.
Louis. Being a very rural state, rural broadband
access is a challenge, and it's a challenge during COVID for sure, with remote delivery.
In Missouri, our SNAP-Ed projects are
aligned with the Federal EARS Reporting Template. Specifically,
we're focused on the 6 setting domains
that are specified in the EARS reporting form for item 5.
Some of that form is shown
on the right side of your screen.
The current fiscal year, FY 20, is the 2nd year of this project structure, and we made this change in order to be able to talk about our programming efforts within specific domains in a more integrated manner, so that more information can be lifted up easily for specific stakeholder groups. An important caveat to keep in mind is that a site-type approach
such as this does have some limitations. There are some locations like churches that can
fall in multiple places in this. It's
a place of worship obviously, which would
fall in the "live" domain. Many churches have
childcare which fall into the "learn" domain.
And they sometimes have food pantries,
which would be in the "shop" domain.
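A rough sketch of that kind of flexible site typing, with field and domain names that are illustrative rather than Missouri's actual data entry schema, might look like this: a site record carries a primary domain plus secondary tags, so a single church can be reported under "live," "learn," or "shop" as appropriate.

```python
from dataclasses import dataclass, field

@dataclass
class SiteRecord:
    """Hypothetical site record allowing one primary domain plus extra tags."""
    name: str
    primary_domain: str                    # e.g. "live" for a place of worship
    secondary_domains: set = field(default_factory=set)

    def domains(self) -> set:
        """All domains this site can be reported under."""
        return {self.primary_domain} | self.secondary_domains

# A church: place of worship ("live"), with childcare ("learn")
# and a food pantry ("shop").
church = SiteRecord(
    name="First Community Church",
    primary_domain="live",
    secondary_domains={"learn", "shop"},
)
print(sorted(church.domains()))            # ['learn', 'live', 'shop']
```

At reporting time, the person entering data can then pick whichever of the site's domains fits the activity being recorded.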
So those are some nuances in the data, and we have been able to be flexible in our data entry system to capture those nuances at the time of reporting. So our data collection
environment as a whole has 4 main components, which we'll go into in a little bit more detail in the subsequent slides. The 1st component is an MU Extension web application that's specific to our SNAP-Ed program, and that's our main reporting system. We also have
a PSE project tracker to track policy systems and environmental change efforts.
We have some GIS applications, and these  geographic information systems applications help to spatially
orient some of our efforts on maps. And
we also use Qualtrics as an online survey tool.
So, a little bit deeper dive here.
Our MU Extension web application we use to collect group enrollment of SNAP-Ed participants, whether they're in direct education classes or being served through indirect intervention channels; groups of participants, regardless of either of those two delivery types, are recorded at the site level.
The image you see here on the right is an example of what one of our fully reported events looks
like in our system. This particular example is a direct education event. Reporting of an indirect event would not have the demographic information that you see here, but it would still have the site name, the site type, dates, and numbers of youth and adults reached. Our
PSE project tracker is on this slide. Or
parts of it anyway. And this is an Excel
template. Credit for this tool, I give to
our PSE specialist, Jollyn Tyryfter. She is the mastermind behind this. It has some basic information
about our local project description, including reach, when the effort first started, where it's located, and whether it's still ongoing or has been completed.
We collect information about community partners,
such as contact information, what type of
organization they're from, and what kind of contributions they're making. You'll also see, in the third little screenshot box down, an action plan template, listing the tasks that the effort has identified to achieve, who's responsible, target dates, and whether they've been completed, as well as a
project narrative that kind of gives more
of a descriptive overall view of progress
on the project. What's not shown, because that would be an utterly cluttered slide: on subsequent tabs of this Excel template, we
have checklists that document any nutrition and physical activity supports that have been
implemented as part of that PSE effort. A
table documenting the support received from
partners towards that PSE effort. As
well as additional funding or other assets
that have been leveraged in support of that
PSE effort. And Jollyn put a lot of effort into making this a tool that would
be very useful to those who are helping to
coach this PSE activity to completion. And
that would also document the information that
we need to be able to report on these PSE
efforts. Our GIS applications: we had
these two GIS applications in place at the
start of 2020. On the left is our
SNAPEd.engagementnetwork.org site. And, I would encourage you all to look at it.
It contains secondary data sources that are specified in the SNAP-Ed Evaluation Framework, specifically,
many sources that are in the sectors of influence and the population results sections of that framework.
When possible, those data have been reweighted to be specific to the SNAP-Ed-eligible audience, that segment of the population that's below 185% of the federal poverty level. And we've included data for all states, not just Missouri, so you don't have to be looking for data in Missouri to be able to use this site.
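As a loose illustration of what such reweighting can involve (the function, numbers, and approach here are hypothetical, not the Engagement Network's actual methodology), an indicator can be restricted to the below-185%-FPL segment by weighting subgroup estimates by each subgroup's eligible share:

```python
def eligible_rate(subgroup_rates, subgroup_counts, eligible_shares):
    """Indicator rate among the eligible population, from subgroup estimates.

    subgroup_rates  - indicator rate within each subgroup
    subgroup_counts - population of each subgroup
    eligible_shares - fraction of each subgroup below 185% FPL
    """
    eligible_pop = sum(n * s for n, s in zip(subgroup_counts, eligible_shares))
    weighted = sum(r * n * s for r, n, s
                   in zip(subgroup_rates, subgroup_counts, eligible_shares))
    return weighted / eligible_pop

# Two illustrative subgroups: 1,000 people (50% eligible, rate 0.30)
# and 4,000 people (20% eligible, rate 0.10).
rate = eligible_rate(
    subgroup_rates=[0.30, 0.10],
    subgroup_counts=[1000, 4000],
    eligible_shares=[0.50, 0.20],
)
print(round(rate, 3))   # 0.177
```

The point is simply that the eligible-segment figure can differ meaningfully from the whole-population figure, which is why the reweighted data are more useful for SNAP-Ed planning.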
On the right, we have something that was new for this year: our garden map.
We have been tracking information about the locations and activities in our gardens, just
on simple Excel spreadsheets, but we've had the opportunity to move this to GIS platform
for this fiscal year. So, here you'll
see a snapshot that is focusing on one particular
garden on this map. We're collecting information about the location, how many gardens, and what's being done with the produce at that site. So whether they're incorporating that produce in taste tests, donating it to food pantries, or sending it home with participants, those are the kinds of activities that they're documenting with
this particular tool. And then the 4th
thing that we had in place, at the start of
2020 was our Qualtrics, which we use to collect both outcome data and some process data.
Now, if you're looking at the image on the
left-hand side, these participant surveys: those are screenshots of paper surveys. At the beginning of this year we used Qualtrics for participant surveys more as a data entry portal, but we were still using paper-based surveys. But that's a very
useful way of using that particular tool.
On the right hand side is a success stories
survey. A simple success stories survey that we have available to our staff so they can document success stories they've had in delivering SNAP-Ed activities. And we ask staff to submit at least 4 times a year.
You can see that there
are some check-boxes for particular topics
that a success story may involve. And
you can't see it in this screenshot, but there's
also the ability for them to upload a photo
and photo release forms, and that can all be housed
within that Qualtrics survey umbrella. So,
we're pretty excited about having all that
information together. So, then March
happened, right? And you know, good plans can
be in place, but plans have to change, right?
So, we all know that 2020 had some unanticipated
challenges for our daily lives, the needs
of our participants, and how we are able to
carry out our SNAP-Ed activities. And, in
this next section, I'm going to talk about
some of the data collection and reporting
strategies we're using to reflect our current
SNAP-Ed work. So, I'm guessing many on
this call can relate to these photos of one
of our educators. You know, the distance
Zoom kind of life that we're living in, and
our delivery mode changed in many instances. We were fortunate that many of our partners
allowed us the opportunity to continue meeting
with their groups online, and recently, we've
also had several areas of the state that have
opened up, and our educators have been able
to meet with groups directly. For instance, our garden programs are really effective when delivered outside, so that makes it easier to deliver those kinds of programs. When
we had situations when things were first starting
to close down, especially where we had to
truncate a series, our educators were
often able to continue to deliver educational
materials, even when they were not able to
continue meeting directly with those groups.
Schools and agencies were kind of scrambling, trying to figure out their next steps. Yeah, we were all in that situation. So these
shifts have created some reporting challenges,
and I'm sure that's why many of you are tuned in to this call: to figure out, OK, what do we do?
So it can be difficult to collect
demographic information when you're not physically
in the same room with people, or you can't
do visual estimates because everybody's Zoom screen is blank. If you're in the
middle of a series and the local ordinances
say, OK, it's time to shut down, we gotta go remote, that disruption is really difficult to deal with and report. There
was a question that was submitted online from
one of the earlier webinars: "How do we
know that our audiences qualify?" In our
program, because of the way we recruit, we
work with agencies, and the agencies often
have that information about their clients
that they can share with us. So we don't do
a lot of that screening at the individual
participant level. We've really been
working a lot with our partnering sites to
assist in collecting information that we need.
And looking forward, as we consider recruiting participants directly, that's an implication we have to deal with. If we don't work directly with those agencies, you know, we have to find alternative means of collecting that demographic information, that qualifying information, making it part of a registration process perhaps. So there are some real
challenges to work out and there's not gonna be the same answer working in every situation.
What we have done to deal with the shift-in-delivery-mode challenge is, we added some language to some of our reporting
forms. We did not come up with this language.
I'm not going to claim credit for it.
I believe that this was language that
was adopted by a number of states who are using the PEARS reporting system, which I think you may hear a bit more about in just a few minutes. But these categories are trying to cover the range of possibilities for how this pandemic, social distancing, and these challenges have affected a particular series of programming. PSE efforts: in terms of our PSE reporting,
because we have a very action-plan-and-narrative-driven structure for reporting those activities,
those periodic narrative reports and
action plans seem to be sufficient for collecting
the broad brushstrokes of those PSE efforts.
Our staff have been adding,
you know, what progress they've been able
to make, or if those efforts have shifted focus, things like that. But we've also had some other programming opportunities surface that have been in direct response to COVID. So, this Missouri
Food Finder Application, which you see a screenshot on the right, is one example.
So this was developed in response to markets, producers, and caterers in some cases, that really had to shift their business models because of local ordinances. Local stay-at-home orders, you know, closed institutional outlets for their product. So they really had to change their business model, because they didn't have the same ways of doing business available to them. Our Local Foods Specialist developed this (it was their brainchild) to help those producers identify different ways of distributing their product to those who needed it, or for those who needed to find local food sources. They could find what was local to them using this particular application.
In terms of things that we've changed in Qualtrics: we are modifying our surveys to allow direct online survey-taking instead of just paper surveys. You know, lots of agencies want to really reduce the amount of paper, or the amount
of things that people are touching and passing to each other. So, we're trying to be responsive to that.
We're also including some demographic
questions to help collect that data for our
reporting purposes. In terms of success stories, you may have noticed on the screenshot that I showed earlier, we do have a tag that we've added for disaster response, e.g., COVID. So, you know, I don't know why we didn't come up with this sooner.
It's not like we don't have floods and tornadoes in Missouri, so I think we'll be
able to use that for other purposes once COVID hopefully resolves. Hopefully very, very soon.
And we are considering adding some
additional surveys to do some needs assessment
work with partners and our target audience,
as well. Just to kinda make sure
that we're on the right track and meeting the changing needs of our audiences.
So addressing data collection challenges: we talked a little bit about challenges of collecting
data from participants regarding their demographic information and evaluation responses.
You know, partner agencies can help if they have it available, or they can help in distributing
links to clients to fill out. There are also some data structure limitations to consider.
So, for example, our data structure
for reporting, our direct contact work is
based on a series of lessons, which, you know,
we found is not quite as convenient when the 1st part of your series is delivered face-to-face, and then suddenly your school is closed and
you have to continue it online. So you don't
have the same delivery for the first 3
lessons that you do for the second 3.
So we know that that is no longer a safe
assumption that we're going to have the same delivery mode for a whole series. So are we
going to have all the data that we would like to be able to report? Unfortunately, not.
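One way to handle that mixed-mode problem, sketched here with illustrative field names rather than the MU Extension system's actual structure, is to record delivery mode per lesson and derive a series-level label from those records:

```python
def summarize_series(lesson_modes):
    """Collapse per-lesson delivery modes into a series-level label."""
    modes = set(lesson_modes)
    if modes == {"in-person"}:
        return "in-person"
    if modes == {"online"}:
        return "online"
    return "mixed (shifted mid-series)"

# First 3 lessons face-to-face, then the school closes
# and the last 3 lessons move online:
series = ["in-person"] * 3 + ["online"] * 3
print(summarize_series(series))   # mixed (shifted mid-series)
```

Tagging at the lesson level keeps the record honest: the series label no longer has to pretend one delivery mode covered the whole series.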
But we do make an effort to document
what has happened, and those decision rules
for handling data, tagging data, and categorizing data. That's going to be very, very helpful in reporting, just to describe what it is we know and what it is that we don't know, and what decisions went behind that. There are also some workflow challenges.
And this last issue is pure logistics.
If you are relying on a paper-based data collection system, you know, what happens when your staff does not have access to printers and mail services and things like that? So we've learned to be very friendly with our scanners: to scan what we have available, make it electronic, and, you know, change our workflows when we have to.
But bottom line, I want to re-iterate: document, document, document!
Because nothing's changed since mid-March, right? Yeah. Things have changed, like, every five minutes some days, it seems.
And having those documentation trails,
that audit trail, really helps to put that
decision in the context of what was going
on at the time. So I cannot emphasize that
point enough.
In terms of reporting strategies, we have, basically two different kinds of
reports in SNAP-Ed.
We have our EARS form, and I think that the EARS Frequently Asked Questions
document that you see a little screenshot of here,
is really, really important to look over. It answers questions about
reporting in general on the EARS form,
and also directions for, for reporting interventions
that were developed or delivered specifically
in response to COVID-19. So it's very important
for you to take a look at that. And, you know, I know our colleagues have worked very hard
to provide clarification on how to be consistent in the ways our program efforts are being
a good job of addressing a lot of those questions.
We also have narrative reporting. SNAP-Ed
has an official Annual Report template,
that's a narrative form, and many of us also
make other reports or elevator speeches to
our constituents and stakeholders.
So, there's a lot of different narrative reporting, a lot of talking, that we do about our programs.
And I would, first of all, encourage
you to consider framing your work in principles-based approaches.
So, what is SNAP-Ed, or your
program trying to achieve? What principles
do you follow? And how are you doing that? So, that kind of helps provide a common language
about what it is you're trying to do. Providing contextual information about our SNAP-Ed efforts is also important.
We know COVID has hit many of our most vulnerable populations very hard.
And many of those vulnerable populations
are our SNAP-Ed audience. So, how is SNAP-Ed helping?
Also, high unemployment rates
often turn into increases in our SNAP rolls.
So, we're seeing some new audiences in many cases. And, the specific stories behind
these dynamics will be different for every
state and every program, but you have the
opportunity to put your program's efforts
in that context of what's happening nationally and in your communities. For this third point, you can boil it down to
my advice being, not to say something like
"FY20 sure was a disappointing year for SNAP-Ed."
No one wants to say that about this
program. So consider how it
is that you're talking about your program
and your efforts. So these two options, you
know, do we want to focus on the fact that
our enrollment went down some percent?
That's not a very positive way of looking
at this but, you know,  even when social
distancing happened, we were able to maintain connections with our partners, with our participants.
So focus on what we can do, what we have been able to do. Don't focus on lost hopes and dreams, would be my advice, because that doesn't really help anybody, and all of us have had lost hopes and dreams. But what is it that we can do to overcome those and help to meet the challenges the situation has presented? And finally, there have been
a ton of interventions... innovations (I think I made up a new word there), innovations and new partnerships formed as a result of
the common challenges we faced due to the pandemic.
SNAP-Ed is a dynamic and responsive
program and we should be able to highlight
and celebrate what it is able to achieve.
And this is a great opportunity for us
and our program in being able to describe
and celebrate what it is that we've been able
to accomplish together. That is all I have today. So, thank
you and you have my contact information here. And I'll pass it back to you, Gina.
[GG] Thank you so much. I enjoyed your presentation. I appreciate you responding to so many of those
questions that were submitted by attendees in advance. Thank you. Next, we're going
to be hearing from Aaron Schroder, and he's going to be loading up his slides while I
do a quick introduction. Aaron is the Associate Director and Technology Officer with Kansas
State University's Office of Educational Innovation and Evaluation. In this role, Aaron leads a technology
team that works with evaluators and clients to streamline data collection and reporting
efforts by developing custom databases, web-based data collection systems, and other solutions
integrated within evaluation plans. He was
instrumental in spearheading the Program Evaluation and Reporting System, or PEARS, which we're going to be hearing more about today. So Aaron, take it away!
[AS] All right, thank you very
much. So before I get started, I wanted to highlight one of the things Kimberly mentioned: the engagement network Missouri has created. That is an excellent tool, so if you haven't already checked it out, you really should. It's really cool! And that's coming from a tech guy.
So, here's what we're gonna talk
about, today, quick overview. Just a
few words on kind of the relevance of tracking activities, and especially reach, at this time. A little bit more detail on the EARS guidance. I know Kimberly touched on this a little
bit, and I'm just going to not get too deep
into the weeds, but a little bit of a high
level on what's changed, and what that might mean for you for reporting. And then, the
last half, we have about 20 minutes or so.
We're going to dive into PEARS. And I know
not everybody on this webinar uses PEARS, so I've kind of structured it in a way that hopefully will give you some useful information, even if you use other systems. It's kinda more about
the approach to collecting reach and demographics
and the new ways that people are working.
All right. So, just briefly on relevance.
And I think this kind of first point goes
without saying that, you know, especially
now, the efforts of SNAP-Ed, and by extension SNAP,
are just more critical than ever. And
there are a lot of people hurting and, you know, losing jobs and things like that. And so, the work
you all are doing is extremely important.
And I think one of the things that has stood out, at least for us, as we've sort of been supporting a lot of states through a data management system, is that one of the strengths of SNAP-Ed is just how flexible and adaptive it is, especially with the program being kind of tailored to all the different local needs. And, I think you've seen this from
the previous webinars, how different groups have really just done a great job, adapting
programming, you know, going from doing a lot of in-person direct education activities and interventions to that getting almost completely shut down overnight. So I think there's a story to tell there. And I think that's
one of the strengths of SNAP-Ed. And,
that's why it is even more important that
we have the reliable and valid data to tell that story. And, we all know that, you know,
tracking what you're doing, you know, doing
evaluations, and surveys, and tracking reach,
it can be cumbersome and time-consuming. But it's the data we need to talk to stakeholders, to talk to members of Congress, both nationally and at the state level, and to the program leaders at USDA and FNS, to show that, you know, SNAP-Ed is a good and necessary program, and that especially during this time it's very well suited to be innovative and adapt to the needs. So, just kind of as
a reminder that, yes, collecting data is hard,
and it's time consuming, but is very critical
and the data are being used. So with that,
one of the tools that FNS has to aggregate data up at a national level is that EARS form. And again, EARS stands for the Education and Administrative Reporting System. This is an online system managed by FNS. It's one of the tools, not
all the tools, but one of the tools they have, to just get at what's going on: who we're reaching, what settings we're working in, and to do it in a consistent way that all the states, all the SNAP-Ed programs, can report up against. And,
one of the things they did, and this guidance
was just released, I think a couple weeks
ago, was, without being able to like drastically
change the system, they found a way to add some additional guidance that would allow
kind of telling that story of how we're changing and doing things differently because of COVID
and how we've been able to adapt. And
so, what you're seeing here on this slide
is just a quick outline of that. There's
a link here. And for those that are responsible
for filling out EARS reporting, I definitely
recommend reading the FAQ, and especially
question 25. It's a few pages long, but
it gives you all the details. I'm not gonna
go into the details, but just give a quick
overview because I think it's helpful for
everyone to kind of know what we're trying
to get at, what FNS is looking for and how
we can tell that story. So just generally
what they're looking at doing is splitting
the main intervention reporting into 2 categories. The first being new interventions or activities
that were either developed or adapted in response to COVID-19. So that would be something
where maybe you didn't plan to do a PSE or
a certain activity, but then when that hit you changed your plans and went in a different direction. Or maybe you
had planned to do an on-site, in-person training and you had to kind of rework it to be online
or to be even just to record a video that
was shared more as an indirect activity. So
they're looking to split that out from just
basically everything else: everything that we were planning to do and did anyway, and even things that were kind of interrupted midway. So if you started a direct education series in person with the curriculum, but then you just had to finish it online, that
would not actually be counted as the newly
adapted. Because we're trying to be very precise
about the data as much as possible, and not
report things that aren't specifically impacted by COVID-19. One of those kind of
caveats there, that note at the bottom, is
that we can now report just standalone indirect
activities as part of this. Because in a lot of cases, what states are doing is more indirect, like a post on social media, or videos on YouTube, or things like that, where there may not be an interactive element. So it's not strictly considered direct education, but it's still important in reaching a lot of people. And that can now be counted in what's called Item 5, one of the big sections of the EARS form.
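The splitting described above could be sketched like this; the category and field names are illustrative, not the actual EARS form fields, so consult the FAQ for the authoritative rules:

```python
def ears_category(intervention):
    """Classify an intervention for the split COVID-19 reporting."""
    if not intervention["covid_adapted"]:
        # Planned work counts as regular, even if it was interrupted midway.
        return "regular"
    if intervention["mode"] == "indirect":
        # e.g. standalone social media posts or YouTube videos
        return "covid_adapted_indirect"
    return "covid_adapted"

interventions = [
    {"name": "in-person series finished online", "covid_adapted": False, "mode": "direct"},
    {"name": "new YouTube lesson videos",        "covid_adapted": True,  "mode": "indirect"},
    {"name": "training reworked as a webinar",   "covid_adapted": True,  "mode": "direct"},
]
print([ears_category(i) for i in interventions])
# ['regular', 'covid_adapted_indirect', 'covid_adapted']
```

Note the first example: a planned series that merely shifted modes mid-stream stays in the regular category, matching the precision the guidance asks for.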
Then one more slide on this, and then
we'll dive into PEARS. So, a note on reach and settings with the new guidance for EARS: when we're splitting this out, when we're reporting COVID-19-impacted or adapted versus non-adapted, we really only want to count the individuals that were reached entirely by that adaptation or by that new program. So if it was kind of
a mix, where some of the people were reached
by the existing method that was already planned and in-process, then don't count any of those
people in this new category. So, really trying to just split out how has the program changed
to adapt to just this new environment, where we can't do a lot of in-person. And a lot
of the direct education is taking an entirely different approach. Then finally, just
on settings: I think you saw that list from Kimberly's slide, with all the settings we're used to coding. A lot of those have kind of temporarily been suspended, because we aren't going into schools. We're not going into community centers and things like that. So for these programs that are delivered to people tuning in from home or from their place of work, there's a way to code that as well. So instead of saying it's a school setting, that COVID-impacted activity would have "other," and then it would be coded as "COVID-19, home." The details are in the FAQ document, if you're interested, for those that have to report; for the rest of you, hopefully this is just a good overview of what we're looking at.  Alright,
so PEARS, and how does this all fit in? And
I know, like I said, not everybody is using
this system. PEARS stands for the Program Evaluation and Reporting System. We have about 32 states that are using it to track their SNAP-Ed activities. And this is what we live and breathe every day- we built this several years ago and have just kept building on it ever since.
Before I dive into some suggestions and ways of collecting data, I do want to highlight this note
at the bottom: that for those using PEARS,
please, please refer to your organization's
specific guidance. Sometimes what we've put out there then needs to be adapted a little
bit for how your state operates, or how your implementing agency does. So, check with your
program coordinators and program leaders before implementing the ideas that we're gonna share.
All right. So, the first thing, and Kimberly shared this briefly and
she was right: when all this started back in March, we got together with our advisory group and got some advice on how we could code activities to show that they're being affected by the pandemic. And so, the 4 categories we came up with are
listed here, and there's some definitions,
kind of official definitions over on the right.
So, you're welcome to use those. The link is available to the public, so whether you use PEARS or not, you're welcome to look at that and see if this would be a useful way to categorize activities that were impacted. And, the next
piece is just how to code settings. And so,
what I'm going to do here --I hope it's not
too distracting-- is I'm gonna switch back and
forth a little bit; the slides are kind of a backup.
But I want to show you, just in practice,
how this works, for those that use PEARS.
And then if you don't, you know, similar data coding
could apply to either your system,
or how you fill out the EARS form. So, we're
gonna go into Program Activities, and we're
just going to open up, one I have here that's
a course that we decided to deliver via Zoom.
And you'll notice that in the general information section, we have that drop-down. So now, within PEARS, all the different types of activities- whether it's direct ed, indirect ed, PSEs, or social marketing- have this drop-down. So you're able to split out the data in different ways. We don't actually need all of these for EARS, but we'll use a couple of them when we auto-calculate that.
So, that's available, and that's
how we're coding everything. And then,
in the custom data section, one of the changes is: typically, when you're reporting what you're doing, you're picking a setting, like schools or restaurants- that's the big one that's been shut down to various levels. Instead, what you can do is pick the "other" setting where people live, or the "other" setting where people work, and then type "home" or "work," depending on where people are connecting from. And again, check with your states on this guidance. But this is how we're going to code things in EARS, according to the EARS FAQ.
All right. So we're going to switch back to the presentation here.
We're gonna talk a little bit about how to collect reach. So we've got some great examples of
transitioning programming to more of a virtual online format, but how do we get the demographics,
and what's the most efficient way to do that? So, we've listed a few options here, and I'm
going to just show you each real quick. Probably one of the best is just to use a very short
survey- whether that's in PEARS, Qualtrics, SurveyMonkey, or whatever tool you have access to- and just post it in your webinar. Encourage people to fill it out, explain why it's important for us to track demographic data, and assure them that this is anonymous and not connecting them to anything. So, in PEARS, what you would do for that is,
you would go to the evaluation section of
your program activity. Then you would attach
a survey-- I've already done it here-- but you
would attach one that was specifically created
to track demographics. So, I've attached this webinar survey for demographics, and when
I go into enter response data, you can see
what it looks like. It's just four simple questions: age bracket, sex, race, and ethnicity. And, if I go back here,
in PEARS, you can click this little link and show it on your screen, or you can copy and paste the link into the chat window. Then, when participants go to it, they'll be able to fill this out and click "Submit." And now we've got demographic information. No, it's not 100% perfect, because maybe not everybody will fill it out, but it's a starting point. And if I refresh this page, you'll notice we have a few extra records, because I've filled out a couple more. You can then use that for demographics.
So, if I want to know how many people
attended this webinar and what their demographic
makeup is, I can just use auto-calculate.
It'll pull the values from that survey. And
then basically, I'm done. So that's probably
one of the more accurate ways, if you can
convince everyone to complete the survey.
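If you're working outside PEARS, that auto-calculate step is really just a tally of the survey responses. Here's a minimal sketch in Python- the field names and answer values are illustrative only, not PEARS's actual export format:

```python
from collections import Counter

# Hypothetical demographic survey responses, e.g. exported from any survey tool.
# Field names and answer values here are illustrative only.
responses = [
    {"age_bracket": "18-59", "sex": "Female"},
    {"age_bracket": "18-59", "sex": "Male"},
    {"age_bracket": "60+",   "sex": "Female"},
]

def tally(responses, field):
    """Count how many respondents gave each answer for one demographic question."""
    return Counter(r[field] for r in responses)

print(tally(responses, "sex"))          # Counter({'Female': 2, 'Male': 1})
print(tally(responses, "age_bracket"))  # Counter({'18-59': 2, '60+': 1})
```

The same tally works on a Qualtrics or SurveyMonkey export once you've loaded the rows as dictionaries.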
Another option- it's a little bit more of an estimate, but we do have some states using it- is if your administrators, behind the scenes, have prepopulated site demographics. To show you this, I'm going to switch to a different program. So, if you're reaching a group
that, say, you were going to deliver
a program to a grade at a specific school,
and that's still your target audience. But
instead of doing it in person, you're now
having an online, interactive session, so it's
still direct education. It's still with a
specific group. But you're just not there
in person. You can't hand out the survey, and you can't necessarily do a visual estimate, because their cameras may not even be on. So, another way you could do that is with this other example here: a Canvas course, so an online course. In this case, we picked Manhattan Catholic Schools, because that was our target audience. And behind the scenes, an administrator has already populated the overall demographic makeup of those schools. And so, when you
go to enter your demographics, you can again use auto-calculate. This time, you just say how many participants there were- so if there were 40, for example, you click yes and it'll put 40. And it'll note that this was an estimate based on the site demographics. Then it'll populate all the demographics based on the overall site breakdown. Obviously,
this isn't going to be 100% accurate, which
is why it was coded as an estimate. But it's
still better than just not having anything.
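Under the hood, that kind of site-based estimate is just proportional allocation: apply the site's known demographic percentages to your participant count. A rough sketch of the arithmetic (this is not PEARS's actual code; largest-remainder rounding is one common way to keep the integer counts summing to the total):

```python
def allocate(total, shares):
    """Split `total` participants across categories in proportion to
    `shares` (fractions summing to 1), using largest-remainder rounding
    so the integer counts still add up to `total`."""
    raw = {k: total * v for k, v in shares.items()}
    counts = {k: int(r) for k, r in raw.items()}
    leftover = total - sum(counts.values())
    # Hand any remaining participants to the categories with the largest remainders.
    for k in sorted(raw, key=lambda k: raw[k] - counts[k], reverse=True)[:leftover]:
        counts[k] += 1
    return counts

# E.g. 40 participants at a site whose enrollment is 55% female / 45% male:
print(allocate(40, {"Female": 0.55, "Male": 0.45}))  # {'Female': 22, 'Male': 18}
```

The rounding detail matters because naive rounding of each category can leave the total off by one or two participants.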
So, those are kind of the 2 primary ways you can still get at those numbers, even when you're not in person and it's a little more difficult.
And then a couple others, I'll just mention,
are what are called platform analytics. So,
if you're using Facebook Live or other social media type platforms, you can often get some of this data from their administrative side of things. I'm not gonna go over all the details of how to do that, but we'd be happy to help people offline get at some of it. It varies depending on the platform, and the demographics you get may not align perfectly, but that is a good place to go if other options aren't available.
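One caution when pulling those numbers: each platform names its metrics differently, and view counts aren't people. A tiny, hypothetical sketch of preferring the most person-like metric available (the field names are made up; check your own platform's definitions):

```python
# Hypothetical analytics summaries pulled from different platforms' admin pages.
# Field names are made up for illustration; real platforms name these differently.
def estimated_reach(metrics):
    """Prefer a unique-people metric when one exists; impressions count
    views, not viewers, so they're only an upper bound on reach."""
    for key in ("unique_viewers", "reach", "impressions"):
        if key in metrics:
            return metrics[key], key
    return 0, None

print(estimated_reach({"impressions": 1200, "reach": 450}))  # (450, 'reach')
print(estimated_reach({"impressions": 900}))                 # (900, 'impressions')
```

Recording which metric you used (the second return value) makes it easier to explain the estimate later in your reporting.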
And then, of course, finally, you can
always do potentially a visual estimate if
all the cameras are on. Again, not ideal, but it's better than nothing, and it gives us some estimates, which are needed for these reach numbers and for making sure we're reaching the appropriate demographics. So, a couple more slides here.
Indirect activities are another area where a lot of people have been coding data because of the change in programming. Some of the things that used to be direct ed- where you had interaction, a class, people in person- are getting moved to maybe just a video that can be consumed at the individual's leisure. So you're just posting it to YouTube, and it's no longer interactive. In that case, it would be considered
more of an indirect type activity. And I have
a screenshot here. I don't think I'll switch
over for that. But essentially, you can go in and code this- in this case, it was a video posted to YouTube. And again, you could use the platform analytics to see how many individuals watched that video. We don't track the detailed demographic breakdown for indirect activities- that's maybe something we'll need to consider as this continues- but for right now, we just look for a total number reached. We also ask for an estimate of how many of those people were newly reached, meaning people you haven't already reached through other methods. That can be a bit difficult to estimate; you kind of have to do your best, and some of the states have worked out guidance for it. So that's indirect activities. And then, finally- let's look at the time here- okay, we're doing good.
So, all of this- and again, coming back to the purpose of tracking this data- is really to show what the SNAP-Ed program is doing, and ultimately to get continued funding and support for the program. If we can't show results, and we don't have the data to back them up, then it becomes a much harder conversation about, well, should we continue funding? Should we increase funding? So again, we encourage you
to use these systems, like PEARS, to get the data out there, and then use them to pull
it back out and to help tell the story. So
we have some ways of doing that in PEARS.
And I'll flip back again. At a minimum, one of the things our team is working on in the next few weeks is updating the automated EARS form. So for those that
use the system, we have a way that you can go through, change some settings, and run
the EARS report, and then it gives you, basically, the report that follows the template. What
we'll be doing is-- we haven't finalized this, but-- we'll be adding probably an option here, where
you'll be able to run it, with the new COVID
breakdown. And it'll do most of that work
for you, behind the scenes. So, I think everybody in PEARS has access to that.
Another thing I'll just mention that can be really helpful is, under "analyze," we have these impact
dashboards. And right now, we just have one called "SNAP-Ed Highlights," I've got one pulled
up here with some filters applied. But, you
might, if you haven't already, and you use
PEARS, you might take a look at these, because they can be a great way to share the successes
of SNAP-Ed with some of your stakeholders. There's a lot here; I'm not going to spend a lot of time going over it. But based on whatever filters you pick up here, it'll give you a very visual overview of what's going on- reach, what indirect activities are happening, PSEs, and things like that. And then, one
of the things you can do, in this case, is actually share these reports outside
of PEARS. So, maybe you have someone in your state legislature, or somebody just in your
organizational leadership that you want to
share this with. You can set that up. You
can make it publicly accessible, and send
them a link, and they can see what's going on.
So, we hope to do more of this. Really, the goal with all this data- which, again, we know is cumbersome and time-consuming to track- is to help tell the SNAP-Ed story, make sure we continue to get funding, and make sure all our stakeholders know about the great work that's happening, especially now, when the program is doing an amazing job of adapting to get it done, doing a complete 180 in a lot of ways of delivering programs.
All right, so... it locked up for a minute. I guess I skipped this number 2: a final thing you can do, in PEARS at least, is export your data to Excel, and it will include all those new fields, like the COVID-19 impact.
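As one example of that kind of analysis, once you have the export as a CSV, a tally by the impact field takes only a few lines. The column name `covid_19_impact` here is hypothetical- check the headers in your actual export:

```python
import csv
from collections import Counter

def impact_counts(path):
    """Tally exported program activities by their COVID-19 impact code.
    The column name "covid_19_impact" is illustrative, not a guaranteed header."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row["covid_19_impact"] for row in csv.DictReader(f))

# Example (assuming you've saved your export as program_activities.csv):
# print(impact_counts("program_activities.csv").most_common())
```

The same idea works in Excel itself with a pivot table; this is just the scripted version.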
So if you go in here and go to export, you can get your program activity data or your indirect activities and do a little more advanced analysis, for those that applies to. So, that's really
the gist of it.  And I would just say that,
if you have questions about how to do this
in PEARS, feel free to reach out to us, we
may direct you to your state administrators. If you don't use the system, that's fine too.
But if you just have some ideas, or would like to see how we're doing some of the more detailed things, please feel free to reach out, and we'd be happy to walk you through it.
And so with that, I will say thank you.
You guys are the ones doing the work that is having an impact. We're just
here in a supportive role, so we appreciate
you, and I'll turn it back over to Gina. [GG]Thank
you, Aaron. That was a very informative presentation, and I enjoyed your little whimsical pear at the end! So finally, for our last speaker today, we're going to transition over to Michigan, where we'll be hearing from Doctor Lila Gutuskey. Doctor Gutuskey is an evaluator and director of the Public Health Fellow Program at the Michigan Fitness Foundation. Along with the Michigan Fitness Foundation evaluation team, Lila supports
the design and execution of evaluations for SNAP-Ed and AmeriCorps projects, and provides evaluation and grant-writing services for client organizations. She is a self-proclaimed data nerd and enjoys translating data into meaningful insights for stakeholders. So I
look forward to hearing from her. [LG] Thanks, Gina. So, just as I get started, a little
bit about the Michigan Fitness Foundation: in addition to being one of the 2 SNAP-Ed implementing agencies in Michigan, we also have an AmeriCorps program, operate the Safe Routes to School program in Michigan, and have a variety of other health and physical
activity initiatives and partnerships throughout the state.
So the way that we operationalize SNAP-Ed across Michigan is through a collaborative of local,
and regional organizations that we partner
with. And each one of those organizations
carries out their own holistic direct education and policy, systems, and environmental change programming for whatever priority populations they've identified. So, as the data and
evaluation team that I'm representing here
today, we really look to try to design evaluations
that can provide meaningful insights for a
variety of stakeholders. So, we do have that programmer-lens stakeholder: what information can we collect, synthesize, and send back to our local partners so that they can continue refining, revising, and deciding what programming they offer?
As well as: how can we aggregate that data
up and tell a collective story across programs? So I'm going to organize my presentation a little bit differently. I was hoping I would be able to come in at the end and show some examples of the actual data we've collected at different stages throughout addressing COVID- continuing to have innovative SNAP-Ed programming during stay-at-home orders, school closures, things like that.
So this is just sort of an
outline, in a timeline progression, of
the types of data we've collected and how
we've used that, both to help inform our current
programming, help inform our final report
that's coming up, and also to help inform
all of those program adjustments and adaptations, that will continue on at least through the next
fiscal year. So I also wanted to note that
I had actually signed up, and I was really
excited for the webinar series before I was
asked to present. So I am excited, and took
notes on what Kimberly and Aaron said, and we have not figured out everything, right?
So the best way that we can all sort of tell
our collective story is learning from each
other. You know, with best practices, what works for one may not work for another. So, these are our practices; hopefully there'll be something helpful to you, and other things probably won't be, and that's OK. So,
the first thing that we did in early April
was, we knew that schools had closed a couple of weeks before. We had no idea how long this was going to last, just like any of you, but we wanted to connect with our local programmers and see where they were relative to their original plan from back in the fall. So, we sent just a quick e-mail with a little template that they each filled out, and asked them to tell us the number of direct education series within each intervention that they had planned at the beginning of the year: what percent had been completed, what percent was paused- which we heard about too- and also what percent had not started.
So you can see here it was sort of a
third, a third, a third, with what was
completed and what wasn't. This gave us a moment in time and how many people we had potentially reached. We also collected that information compared to what we had planned, and what kinds of adaptations and quick recoveries we needed to make. I will also note that of the 68% either paused or not started, 91% of those series were planned to take place in schools- so they were mostly scheduled for that mid-April-through-end-of-school-year window, or were summer-based programs within a school.
So the next thing we did was take advantage of a quick opportunity. I was getting ready to field a survey anyway for our statewide social marketing campaigns, so we added some questions so that we could hear from our priority population. These were invitations mailed out to households that met SNAP-eligibility income in 18 of our
counties across Michigan, where more than three quarters of our SNAP-eligible Michiganders live.
And you can see on there, we've got over 1,300 responses. We did tend to have participants that are a little bit younger and more female than the general population, because that's who aligns with our priority population for our social marketing campaign. When we asked questions about behaviors that had changed, we framed them as "since the stay-at-home order and social distancing." The stay-at-home order in Michigan was on March 24th. We fielded this beginning in June, and it ran through about mid-July, so we collected this over about a month. We thought this information would give us, again, a status- a point in time- of things that had changed, which could be helpful for understanding future program priorities, and for understanding some of that quicker, more flexible indirect
reach through online platforms. And also,
we're thinking about our narrative report
for our final report, right? So how can we
tell the story of the impact of COVID in Michigan
for our priority population? So, you
can see here that, not surprisingly, food
shopping frequency went down for almost half of our participants. And about a quarter of
those participants saw a decrease in their
food budget. What was increasing- and it might be the same for all of you; I know it was for me- was cooking and preparing food at home. So people are cooking more at home; they might have tighter budgets, and they might be doing a little bit more bulk shopping, right? And when self-reporting whether they were eating the same, more, or fewer fruits and vegetables, 21% were eating fewer fruits and vegetables than before the stay-at-home order and social distancing went into place.
Looking at physical activity: this might also make some common sense, and it gives us some numbers to understand what percent of our priority population were
having different barriers. So, about
half said that they had decreased access
to the places where they liked to be physically active, 40% had decreased availability of the equipment that they'd like to be physically active with, and about a third had less time
to devote to physical activity. And as a result, 35% self reported that they were getting less
physical activity than they were before. So,
in that same survey, we also took the opportunity to collect some more needs assessment data, right? Needs and readiness around what our participants might be experiencing, and what we might be able to offer them that they would be most interested in. And while we did this through an online survey mailed out to our priority population, think about other ways you may be able to capture the same information. It could be through your indirect channel- you know, a pop-up survey in Facebook, or something like that. Or, if you are programming with people, you could ask them in their survey. So you can see
here, with "what kind of nutrition or physical activity information would you be interested in?" There's not too much separation- between 36% and 31%- among the options that we gave them. Slightly at the top was recorded online nutrition education, so this would be more of that asynchronous learning that we heard about in a previous webinar; then online physical activity, going down to a podcast. So, when we asked them about
what topics they would be most interested
in, you can see too that there's not much separation between those first 4- physical activity exercises, tips for healthy eating, and so on. We also asked people, if they were to get a nutrition education class, what their preference for frequency and duration would be. About half of people said weekly was the right frequency, and they were split on duration: about a quarter said 10 to 20 minutes and a quarter said 20 to 30 minutes, so 10 to 30 minutes covered about half of what people said. So,
with that, we had where we stopped with our direct ed at the time of COVID, and then we had some new information around changes in behaviors and interests from our priority population. So, we also then connected with our partners at the same time- sort of end of June, early July- and asked them to look at the interventions, settings, and audiences they were currently serving, and let us know where they saw an opportunity to make adjustments. Now we're looking at: where can we change delivery method? Where can we add an intervention that we didn't have programmed before, but might fit now? That type of information. So, just
to give context to these numbers, across
those 50 or so organizations, 83 changes
in delivery method were noted for interventions.
So, this doesn't mean 83 separate interventions; it means 83 times they noted an intervention that could have a changed delivery method. So one partner might have three different programs and note that a delivery method could be changed for all three of those. Offering new interventions: our top new intervention was the Farmers' Market Food Navigator program. With where we were in the fiscal year and the seasons in Michigan, our Farmers' Market Food Navigator program was just getting off the ground in late May, and people wanted to add it in so they could meet people where they were in the summer and do some programming with them at farmers' markets. New settings- and new interventions too:
retail- anything grocery store, corner store, convenience store- and also pantries and farmers' markets. These were new interventions and/or new settings where people were looking at doing more PSE work, in places where we know that when people are leaving their house, it's to go get food. So we want to meet them there and try to make those PSE initiatives an easier process for people. OK, so we've got
how we were planning to respond to needs: we've identified new interventions, new delivery methods, things like that. So this slide is really just to situate how you track what did happen, and this comes up with that narrative aspect of our final report. EARS will tell us the who and how many, and the narrative report will tell us what was done. This is something the ASNNA evaluation committee actually talked about, and it situated the way those 2 different reports each provide a different part of the collective story. And we're
sharing those practices amongst ourselves, as well. So I wanted to share that University
of South Carolina plans to do interviews with educators to help understand how their work
changed, and how they adapted what they were doing. The University of New England in Maine is looking at doing a survey with educators- the same types of content, with multiple choice, open-ended, Likert scale questions, things like that. So if you're looking for resources like that, please feel free to reach out.
And this is something that we're trying to
share around, to make sure that we're all
supporting each other. So, MFF plans to do a survey with program leads, and then we'll theme and aggregate the ways that the individual partners were meeting the needs of their populations.
So, we do a collection of information for
our final report from all of our partners each year.
We have a portal where they put similar information- GIS information- so we have site-level detail there. Then we have a form that collects all of the EARS information, and a separate form that captures more of that narrative storytelling of adaptations and of meeting the needs of the priority population. So, we're working through what
those new questions will be. I put
some examples here, depending on if you are
collecting information from a program lead,
an educator, or a participant, so feel free to make this flexible to what fits with your program and what you're looking to understand and share.
So, one of my sort of timeline
pieces was, we have new sources of data. I
will just take a moment to reflect that we
also lost sources of data that we were really
expecting to have, right? So, previously,
folks had talked about, you know, those paper surveys, those programs that didn't finish. We have less of that traditional outcome information that we would normally have. So that's sad, as an evaluator.
But we also have new sources of data. And we also have new questions that we want to answer.
And so this slide just goes through some of
those new sources of data, specifically, that
we're collecting with participants and also with educators. So you can see
now that my slides have sort of switched from having numbers on them and data that I'm sharing
to now more words. And that's because I've
sort of caught up with where we are right now.
So, these are sources of data that we
have out in the field. And we're not yet to
the point where we've got enough data to be able to tell a collective story. So, an
of just going to a pre/post online version
of our traditional outcome survey, we wanted to take a few moments to sort of step back
into more of a formative and process lens.
And so, now we're still collecting some of
that same process information we normally would: Did you find this interesting?
Did you learn something new? Those kinds of questions. We've got outcomes at the short term now instead of the medium term, so we're asking about intentions to increase fruit and vegetable consumption based on their participation in the program. Then we have a new
set of questions, similar to the ones that
we asked in that online survey that we mailed
invitations out for. So, we're asking about,
you know, what platform do they use to interact?
And what kind of distractions do they have? What are things that we could do to meet their
needs better in our virtual interaction that might not have come up in person?
And also future preferences for learning topics, the length of lessons, things like that, so that we're continuing to collect
information, that we can share both out with
our partners, and also share up to give an
understanding of what's going on in Michigan.
We also have a new project that we thought this would be a nice moment to step back and reflect on. We're calling it our Qualitative Impact Project.
We're having partners recruit past
participants. So adults that had been in nutrition
direct education in the past year or two.
And we're setting up our phone interviews
with them right now so that we can ask them questions about what really, you know, resonated?
What did they remember? What might they be using out of those direct education classes?
And then sort of making that leap to what
has changed for them during this time of COVID?
And were there any ways that what they learned in that direct education helped or supported them in the time of COVID? That way, we can understand what
we might be able to hone in on and focus more of our effort doing in the future, based on
what we hear from past participants. The
Michigan Fitness Foundation has also set up
a YouTube channel where all of our partners can put the videos they have developed to continue interacting with participants. A lot of this is the indirect channel connected with an ongoing direct education intervention. So, think about all of that school programming that got paused or never started, and how we can keep the conversation going with those participants. At the end of all the videos is this slide here, which will take them- through a text that they opt in to- to a short, you know, 4- or 5-question survey, where they can provide us a little bit of information.
And so, again, this is very formative for
us. We've never had this sort of depth and
breadth of our online indirect videos. And
this will be a nice way to package not only who we reached and what they thought about it for our final report, but also to continue feeding that information back to programmers. And our educator log has been adapted to online- not only in the way it's completed, but also by asking some questions about what it was like to recruit and interact with participants through an online format, which, again, will be nice to share out. The other type of new data that many
of us are probably experiencing are analytics, sort of generally. This bottom of this slide
is actually a screenshot from our YouTube
channel analytics. And so, I just wanted it
to be a, sort of an example of lots going
on. And so, as you're, if you're new to sort
of sitting with this analytic data, I have
just a couple of tips. These are real
numbers that I've provided here for some some online articles that we have promoting different
different pieces of PSE work going on, and
things like that. Our Facebook page for our,
our statewide social marketing campaign. Then that YouTube channel, where all of our indirect
videos are sitting for all of our partners.
And what I really wanted to point out here
was use of words like reach and engagement, and impressions, and things like that.
You really have to see how that platform is defining those words, so that you can
dig in and understand, based on our interpretation of SNAP-Ed, what counts
as actual unique people reached, and things like
that. It may be different across your
analytic platforms. The other thing
I wanted to point out here is that, as an evaluator,
again, one of my stakeholder groups is my
internal program
partners. And so, when I give them updates
of what's going on, what pieces of information
would be helpful for them to understand what they need to do more of, or less of? Or
how they're recruiting people, or
promoting the different videos or articles,
or things like that. So, in this YouTube
Analytics, the last data point I have
here is that all 3 of the top-performing
videos for this 3-month period were guided
book readings. And so, those guided book readings were an indirect reach for that school-based
programming. And so, that's the way
that people were engaging with their participants
during this time of distance. So,
this is my last slide, and it's the most words,
and the least numbers. Luckily, Kim and Aaron also already covered this. So, this is just
sort of a brainstorm that we've come up with.
We do not provide specific guidance to our
partners on the way that they must collect
their demographics or reach. We understand
that, again, depending on the format of your programming, depending on how you are recruiting
people, and the way that you're engaging with people, you're really the expert in understanding
what works best. So, just a few notes
off this page. If possible, match the estimation
method that you already came up with for your in-person programming. That way, you're keeping some
consistency within your numbers. Aaron
mentioned your recruitment sites. So yes,
with elementary-based programs, we were using demographics for that school level before
COVID, and it's still appropriate to keep
that in the time of COVID. You might be doing
online surveys, online post surveys. You can certainly add your demographics on that.
You can do, maybe, you know, every third lesson of a series. You pop up a little survey for
demographics, and in the fourth lesson, you pop up a couple of questions about process.
And then that fifth and final lesson, you
pop up some questions around outcome.
So feel free to get creative now that we're all learning all of these new platforms. And,
you know, the thing that's really important
is make sure you're including your methodology
in your EARS piece of what you did if you
estimated. One thing I wanted to point
out that I thought was really cool from last
week's Minnesota presentation is that they
talked about using web pages, embedding things in web pages so that you get that deeper
analytics, and consistent analytics. So, one
thing that I was thinking about is, yeah,
if you're sending out an e-mail with a newsletter, maybe instead of doing an attachment of a
PDF newsletter, you turn that into a link,
and it's a webpage. And, so, that way, you're
seeing how many people actually went to the newsletter, versus how many people you sent
that newsletter out to, and, that will also
give you that richer information, that could
be helpful for future newsletters, around,
how many minutes they stayed on the page,
and things like that. So, just some ideas. This will be in the slide
deck that you can download, as well, and I
just wanted to say thank you. You know, I think
we're all doing some things, very similar,
in some things differently, and it's great
to be able to learn and reflect on what each
other is doing. [GG] Thank you so much.
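As a concrete sketch of the newsletter-as-a-webpage idea above: one common way to attribute page views to a specific e-mail send is to tag the link you mail out with UTM query parameters, a widely supported web-analytics convention. This is a generic, hypothetical illustration (the URL, campaign name, and helper function are made up, not anything from the presentations), not a prescribed method.

```python
from urllib.parse import urlencode

def tagged_newsletter_link(page_url, campaign):
    """Build a newsletter link with UTM parameters so page views
    can be attributed to a specific e-mail send in web analytics."""
    params = {
        "utm_source": "newsletter",  # where the traffic came from
        "utm_medium": "email",       # the channel
        "utm_campaign": campaign,    # which send, e.g. "june-2020"
    }
    return f"{page_url}?{urlencode(params)}"

# Hypothetical example:
link = tagged_newsletter_link("https://example.org/june-news", "june-2020")
# link ends with "utm_source=newsletter&utm_medium=email&utm_campaign=june-2020"
```

Comparing visits from this tagged link against the number of e-mails sent gives the "how many went to the newsletter versus how many we sent it to" figure, and most analytics tools' page-level reports then add details like time on page.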
My camera's coming back on, OK. There we go. We're going to be moving on to Q&A. Thank
you, Lila, that was a wonderful presentation. We've got lots of questions here. Let
me pull myself and the presenters back in,
so that you can see everyone's e-mail
addresses here. OK, this slide is gonna
be up for a little while, so I thought it
would be helpful if you wanted to jot down
contact information for any of our wonderful
speakers. You can do that now, as well as
the SNAP-Ed Connection inbox, if you
need to get in touch with us. So, thank you,
again. Wonderful presentations. In case anyone
missed the very beginning of the presentation today, we have had some questions pop in,
so I'll just repeat. Sorry for those of you
who've heard this now, like, six times: yes, we
are recording these. We will have them on
the SNAP-Ed Connection shortly. If you haven't
done so already, the certificate of completion and the speaker slides are all available for
download under the Handouts tab. The certificate has kind of a long file name, but it is in there, down at
the end there. So, go ahead and download
that if you want it before the webinar ends.
Let's see, what else did I need to cover?
I think that is it. So, I'm going to
get to the questions, and if the speakers
could pull up their cameras, I think everybody's
back on. Great. So, our first set of
questions is going to be for Missouri. OK,
"Is the PSE project tracker proprietary software you developed with your Extension Web App,
or is it part of Qualtrics or other? Could
it be shared?" [KK] It is other. It is actually
an Excel document. So, if you have an Excel program that we could, we would be able to
share that with you and you could use it.
So, if you would like to send me an e-mail,
we can make sure you're connected with that. 
[GG] Thank you. "Are the
garden maps only available internally? Sort of a follow up question. [KK] So the garden maps, I
don't know how useful there'll be to other
states, but it is available freely on the
Web as part of Extension, Missouri Extension's All Things Missouri platform-- GIS platform.
[GG] OK, thank you. "What application do you use for GIS projects, and why?"
[KK] All right. I'm not the GIS
tech person. I do know that they set up a
Qualtrics data entry system. So for the Food Finder, producers can just answer a Qualtrics
survey. Our garden map, same thing: Qualtrics survey. And they take that address information
and put it in their GIS infrastructure, and
are able to, to match based on address, using
some program that I don't know anything
about. [GG] So, the next question, sorry
if this is redundant. "What is the interface
for the Missouri FoodFinder.org site?
Who manages it, and who pays for it?" [KK] It's a partnership with our Center for Applied
Research and Engagement Systems, here at the University of Missouri. And
they've been doing GIS work for years, decades, perhaps. And we had the opportunity
with our garden maps, they wanted to do some proof of concept work, and so we were able
to partner with them and provide them a small, manageable project, and it really is a partnership.
There is some SNAP-Ed funding, a small
bit of SNAP-Ed funding, but they're also
really able to leverage their platform and
their services. And actually, the person that
is our contact, that is doing a lot of the
GIS work, was one of our nutrition educators.
So they really know our program well. [GG] OK, thank you. "What is the primary reason
that you now have to collect demographic data differently? Were field staff visually screening before?
Or how do you use that data later?" [KK] Good question. I appreciate the opportunity to
clarify. A couple of things are playing
into that decision. First of all, we're
moving away from pencil and paper because, you know, we don't want to pass items between people.
And we're also, with more of our education
moving to online, we are finding ourselves
potentially in positions where we can recruit audiences that are not specifically tied
to a specific agency. So we want to offer
ourselves the flexibility of not having to
rely on agencies to collect that
information. You know, visual scans are always a
last resort. We always want to make sure that participants have the opportunity to self-identify.
And when we're able to capitalize
on agency information where they've already
self-identified their demographics to that
agency, you know, that's a great opportunity
for us to reduce participant burden and not
have them answer those questions again.
So we're just looking at those different ways of collecting that information in different
delivery modes. [GG] Great, thank you. OK,
next set of questions will be for Aaron. You ready?
"You mentioned that you're already using
some of the COVID-19 impact options for EARS.
Can you provide a little more detail about how that is working or is going to work? [AS] Yeah.
Maybe I misspoke a little. We're working on
building those options into the automated
EARS form for PEARS. So, what we'll
do is, certain of those 4 COVID-19 impacts
will affect how the data gets split. We're coming up with a plan, and then we'll be sharing
that with our advisory committee. So we have representatives from every state, and then
just kinda getting feedback, making sure we're going the right direction, maybe tweaking things
a little, then our goal is to have that ready
to go towards the end of this fiscal year
and start of the next one. So, when people are starting to run EARS, we'll have the option
available. [GG] Thank you. "With platform
analytics, how do we account for unduplicated
reach?" [AS] That's tough. I think it depends
on the platform. I know some, like,
Facebook, if you're tracking posts and things like that, they actually do a fairly good
job of showing you individuals, I think it
kinda depends on the platform. I'm not super
experienced with all of them, but would be
happy to help, look at that. A lot of times,
they're just showing views. So, yeah,
you kind of have to make a determination on
how many people do you think actually viewed this multiple times. Probably not too
many, but, yeah, it kind of depends on the platform.
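To make the duplication problem Aaron describes concrete, here is a hypothetical sketch. Real platforms rarely expose individual viewer identifiers, so the platform names, viewer sets, and function here are all made up for illustration: if you could see per-platform viewer sets, summing the per-platform counts overstates unique reach, while taking their union does not.

```python
def reach_summary(platform_viewers):
    """Compare total views against an unduplicated-reach estimate.

    platform_viewers maps a platform name to a set of (anonymized)
    viewer identifiers. Summing the per-platform counts double-counts
    anyone who appears on more than one platform; taking the union
    counts each person once.
    """
    total_views = sum(len(viewers) for viewers in platform_viewers.values())
    unique_people = set().union(*platform_viewers.values())
    return total_views, len(unique_people)

# Hypothetical data: "b" and "c" saw content on both platforms.
views, reach = reach_summary({
    "facebook": {"a", "b", "c"},
    "youtube": {"b", "c", "d"},
})
# views == 6, reach == 4
```

In practice, since platforms usually report only aggregate views, the gap between these two numbers is exactly the judgment call Aaron mentions about repeat viewers.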
[GG] Yeah, OK, thank you. "For collecting
demographics, what do you think about the
option of using an anonymous poll for platforms
like Zoom that have that functionality?" [AS] Yeah, I think that's a great option. I would say,
if you're --especially Zoom-- you can do a poll that has multiple questions. So, it would be a great
way to quickly get demographics. I would say, if you're using PEARS, though, it's going
to be a lot more efficient to just go ahead
and create the survey and use the link, because
then the data go directly back into the system. And you can just click auto calculate and
have your information there. [GG] OK, and
this question also came in during your presentation,
but anybody could probably answer it, and
I'm guessing it's going to be an 'it depends,'
but: "Would you consider distribution of educational materials as indirect or direct educational activity?"
I'm thinking it depends... We can just move past that one.
[AS] Our understanding is that direct ed
needs to have some kind of an interactive
component to it. So, yeah. [GG] OK,
I'm going to move on to questions for Michigan.
First one is "What is the YouTube channel's
name?" [LG]Oh, I should have put that in! It's
"Online learning in a SNAP!"  [GG]Nice!
Yeah, yeah, we're pretty happy with it. [GG] Excellent, OK, we'll check that out.
OK, we'll check that out. 1:22:48Um, what
do you mean by guided book reading? 1:22:53So
[LG]So we added --so if you go to Online Learning in a SNAP--you'll be able to see, but it's
literally an educator will go through
one of the books that they would have done,
potentially in a school-based program,
so the one that's reaching out to me
is that "I never will not ever eat a tomato."
I don't know if anyone's familiar with that
one. So it's basically a way for educators
to just kinda keep in contact with
the schools and the classrooms that they were supporting. So, while anyone can use one of
those videos that's up there, if they don't
have the capacity or the time to create their
own video, we have found that some educators really had some, some deep relationships.
And so, seeing Miss Carrie's face and seeing her do that book reading is really meaningful
to kids. [GG] OK, I'm going to jump into
the questions here, but it's related to that.
It says, "I joined late, but did anyone talk
about SNAP-Ed direct education, being adapted
by teachers for use virtually?
It sounded like, Michigan Fitness Foundation
may have had that experience. If yes, could
you talk about how it happened? What changes
did teachers need, and any technical considerations? And results/future directions?"
[LG] Yes, all those things! Yes. So, we did look at converting. So, in addition to being
a SNAP-Ed implementing agency, we also do have some programs and interventions that
we've developed, like the Farmers Market
Food Navigator program. So, we did take the
ones that were more summer-based, like My Garden, which is for younger kids and has a garden
component, and looked at ways to adapt that to an online delivery method. And then for
the Farmers Market Food Navigator program, we basically just tried to problem-solve
and create a variety of ways that Food Navigators could still support farmers markets, not always
physically in the farmers market. So,
it's definitely more than we could cover in
the next couple of minutes, but we're looking at things sort of along a progression, from in-person
all the way through asynchronous, recorded video on YouTube, and seeing ways
that we can use different platforms to provide interactivity. So, I think Aaron mentioned that
word, interactive. So, doing things on
Zoom, partially with a prepared PowerPoint,
partially with links to videos, being able to
have a different person run the chat, so
that people can engage by asking questions that way. And you're adapting to your actual
audience, versus just going off of notes.
So, those are all things that we're looking
at and trying to understand truly, and more
of that formative and process of what's working
well. [GG] Thank you. And for folks that are
interested in more ideas and concepts on that
line, definitely check out the recording of
our first session of these of the series because
we really kinda dug deep into a lot of those
issues. Another question for Lila:
"How many surveys were sent out for the priority population, and where the surveys given out
to the whole state, like rural and inner city?"
[LG] So, yes, to rural, and suburban, and
urban. We did 18 counties, mostly in the Lower Peninsula of Michigan. The
counties that got it were the ones where
we had the most presence for our social marketing
campaign, because this truly was our typical
survey for that program. And what I
did was just tag on, because it was so
of-the-moment, ready to go out. I tagged these
on. So, it was not a true sort of
slice of our entire state. There
is rural in there, but it probably is weighted
more towards suburban and urban. And we
go through the process of purchasing a mailing list. So, I think we sent out maybe
6,000. We also supplemented with an online web panel, so I can talk with folks if they're
interested in that. If they want to e-mail
me. It's something we've been doing the
last couple of years to supplement our mailing with, with folks aligned with our priority
population and SNAP eligibility as well. [GG] Thank you! We've got time for two more quick
questions. Aaron, I'm giving you one, and it's a two-parter. "Any new projects in
the works for PEARS, and how much does PEARS cost?" So, I think that's really a
how-do-we-get-hooked-up-with-PEARS type of question. [AS] Yes, there are always many new projects
in the works! So, that could go on.
And how much it costs, e-mail me, and we can get in touch.
[GG] OK, great question. And I'm
going to ask this again, even though, I think
it's kind of been addressed a few times in
the content of your presentations, but it's
one that we keep seeing a lot. And I know,
Kimberly, you mentioned a little bit, but
the question is, how do you ensure that the
reach that you're pulling from various social
media and online platforms is a SNAP-eligible population? It's a very big question
that everyone's having, so if anyone would
like to address that, more, now's your opportunity.
Anyone? [LG] So, I mean, one thing I would
say: my experience is deeper
with social marketing campaigns, because we use online channels for that, and have for
years. Think about the ways, if you are
boosting those social media posts. Or, if
you are promoting those social media posts through, through a program, or through an
e-mail, or things like that. So, sort of how
are you connecting people to that information?
That would be one way to think about it. There's definitely, I mean, I know in my Online Learning
in a SNAP, YouTube channel, I have educators that are watching those videos, right? And
so it's not a pure number of participants.
But you can think about the ways that people
are sort of being pulled into your channel,
or your online posts, as one way. I don't
know if Kim or Aaron have other, other suggestions? [GG] Again, if folks are interested
in that particular topic, we have kind of
talked about it in various capacities in
all 3 of the sessions now, so please be
sure to check those out, if you haven't seen
them. We'll also shortly be publishing
a Q&A document from the first two sessions,
and then any questions that we didn't get to. We did get a chance to get to most of the
questions today, but there are a few that
we're not going to be able to get to. So we'll
send those out to our speakers, and give them an opportunity to provide written responses.
So that is our time. Thank you so much to
Dr. Kimberly Keller, Aaron Schroeder, and
Dr. Lila Gutuskey for sharing your time and your expertise with us today. I know I'm not the
only one who learned a lot today. Thank you to all of our attendees who sent us wonderful
questions and helped us to create a presentation that you would be interested in. All of these
sessions will be available on the SNAP-Ed
Connection shortly. So please let your colleagues
know. If they didn't have a chance to attend
the live sessions, that they can still get
this information. So thank you again. Be safe everyone. Have a wonderful day.
