Sima: Hi everyone. Thank you so much
for joining us today for our webinar,
Tools of the Trade:
Privacy in the Digital Age.
Before we get started I just want
to go over a few housekeeping items.
All callers will be muted, so if you have
any questions feel free to use the chat box
that you see on the left-hand side of your
screen. If you lose your Internet connection,
try refreshing your browser, and
using the link that was emailed to you
when you registered. If you are interested,
or if you have to drop off early,
or if you want to watch the
webinar again at a later time,
you can visit our website at
www.techsoup.org/community/events-webinars
to watch the webinar at a later
time. We’ll also be sending an email
with the presentation, the
recording, and the links.
And if you are on social media, feel free to tweet at us @TechSoup using the hashtag #tswebinars. And again, we’ll be checking the questions as we go through the presentation today.
So just a little bit about TechSoup,
we are in 236 countries and territories
and we work with over
a million nonprofits.
So I just want to take this
opportunity to give you guys a chance
to use the chat box, so if you want to
chat in where you are dialing in from today
I can read out a few of the responses.
Okay, we have San Francisco, so
somebody right next door to us.
Plainfield, Indiana; Portland; Lowell.
Do we have any international folks?
I don’t see any yet. We have
Prescott, Arizona, and Fort Wayne, Indiana.
There are a lot of Indiana folks on the
line. All right, so the chat is working.
That’s good.
So just a little bit more about TechSoup,
we work with several technology partners
who make our mission possible, so
Adobe, Intuit, Microsoft, Symantec,
just to name a few. If you
are interested in learning more
about the products that we offer
nonprofits, you can visit our website.
The URL is here,
TechSoup.org/get-product-donations,
and see what is available
to you and your nonprofit.
So today our presenter is Tracy Ann Kosa, a privacy researcher at Stanford University and adjunct faculty at the Seattle University School of Law.
Her current work proposes
privacy patterns for enforcement
and regulatory activity. Dr. Kosa
has held a number of leadership roles
in privacy at Microsoft, and in government.
She holds a doctorate in computer science focused on privacy and a master’s degree in ethics and public policy, and did her undergraduate work in economics and political science.
So I’m going to go ahead
and pass it off to Tracy.
Tracy: Thanks, Sima. I appreciate
it. I wanted to start off today
by saying thank you very much to
everybody for taking the time to join us,
and I am very much looking forward
to seeing and hearing your questions,
so please do actively use the
chat box if you have any things
you’d like to raise
specifically for today.
I’m going to tell you a little bit about the Stanford center that I come from, just so you have some good background information.
The Stanford Center on
Philanthropy and Civil Society
is known as the Stanford PACS.
And their mission is to develop and share knowledge to improve philanthropy, strengthen civil society, and effect social change. PACS is mandated to connect students, scholars, and practitioners. And they also publish a journal
called the “Stanford Social Innovation Review.”
Our research work is funded out of PACS.
To that end, I wanted to take a minute
and ask you if you would mind sharing
with me what type of
organization you work for or with.
These are some loose categories, and
if you don’t quite fit the bucket,
I understand. It’s not scientific, but
it would be great to see your responses.
This is fantastic. Thank you, all. I’m
just watching the poll results come in.
Sima, I just hit “Skip to results” to share them with everyone, correct?
Sima: That’s correct, yeah.
Tracy: Okay, great. So you can
all see where we are at today.
Fantastic. It is great to
see such a diverse group,
and I know you will probably have
questions related to your specific area.
So please again, feel free
to raise those as we go along,
and I will do my best to address
them now or at a later point.
The next question I have for you is to tell
me a little bit about how comfortable you feel
in the privacy area. So I would
assume lots of you are here
because you are interested,
but I would like to gauge
how many people feel like they are experts, or are really just starting at the beginning, or maybe are familiar with some of the basics from one source or another. It sounds like that’s where a lot of you are probably sitting. I’ll share those results with you. It looks like we have a good, standard bell curve
which is great to see. And again,
questions, comments, concerns,
feel free to drop them
in. So let’s get started.
So the Digital Civil Society Lab
investigates challenges and opportunities
for civil society to thrive in
the digital age. And this of course
has expanded the potential for
civil society participation,
but it also presents new
challenges and threats.
And our dependence on digital software and infrastructure that is commercially built and government-surveilled requires new insights into how we make these digital systems work, and how we can engage with them safely, ethically, and effectively for civil society purposes.
What that means for our work
specifically, is that we are focused
on looking at how privacy and privacy
requirements fit into digital civil society.
So the lab overall seeks to engage researchers, practitioners, and policymakers, and it touches on technology, organizations, policy, and norms.
I firmly believe that privacy
fits into all 4 of those buckets
and was really happy to see that the civil
society lab was interested in funding this work.
Today, I am going to walk through a
little bit about some of the issues
in privacy for a digital civil society.
We are going to talk a little bit
about the hypothesis and design of
our research to give you some context
for the web app that we built. It
is called “Privacy Patterns.”
And I want to be really clear with
you, it was designed and intended
to be a free tool. It will always
be a free tool. It is currently up
and available to be used and
explored, but it is very much in beta,
and that is part of why we
wanted to share it with you today
to begin to get some
feedback, and better understand
who is going to be using the
tool and how we can improve it.
So with that, let’s jump into privacy. The first thing I want to do is level-set what the expectations are when we use this word, “privacy.” There are a lot of legal definitions for it. There are a lot of personal reactions to it. I like to think of privacy as broken out into 3 different components.
We deal with territorial privacy.
We deal with physical privacy.
And we deal with informational
privacy. And all 3 of these things
come together in how each individual
thinks about and represents
their privacy decisions and choices.
To talk a little bit about physical, I
think the best example I can give you,
given that we are in a webinar,
is that if you look around you
to examine the personal space you have in
a room, that is very much a privacy decision
that you make. There’s been
some really fascinating research
in this space done in social psychology.
And as a Canadian I am happy to tell you
that they definitely got
it right for me personally.
Canadians on average require
2 to 3 feet of physical space
around them in order
to feel comfortable,
probably indicative of a
30 million person population
in a very, very large country. But
other countries, other cultures,
and even families would operate with
a different sense of personal space,
and that is very much a privacy issue.
That privacy issue can also be extended
to what is called territoriality in law.
It refers to your stuff. When I
drive my car somewhere and park it,
I fully expect it to be there when I
get back. But what kind of car it is,
what is in that car also
speaks to me and my identity.
Personalized license plates
could be a good example of that.
That is another kind of privacy,
the expression of our identity,
what information we choose to share.
And then of course,
you have informational.
And this isn’t just about the information that we create about ourselves when we send an email or a text message. It also includes all of the information that is created around the transmission of that information. So you may have heard the term “metadata”; over the last few years it has become more commonplace,
but this refers to information
that is created by machines
when we use them that can actually
be used for some really interesting
and possibly scary behavioral analysis.
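As an illustration of that point, here is a minimal sketch (not from the webinar, with invented addresses) showing how much machine-generated metadata rides along with even a short email:

```python
# A minimal, hypothetical example: the body of this email is one line,
# but the machine-generated headers record who, when, and through which
# servers, which is exactly the metadata that enables behavioral analysis.
from email import message_from_string

raw = """Received: from mail.example.org (192.0.2.10) by mx.example.net
Date: Mon, 7 May 2018 09:15:02 -0700
From: alice@example.org
To: bob@example.net
Subject: Lunch?

See you at noon."""

msg = message_from_string(raw)
for header, value in msg.items():
    print(f"{header}: {value}")
print("Body:", msg.get_payload())
```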
With that, let’s go a little deeper
into some of the common requirements
when we talk about privacy. You’ve
probably seen things like policies,
and consent statements,
notice and contact information.
These are some of the basic requirements
that we see in privacy in law,
and also in the expectations that
people will have with organizations
that collect, use, and
disclose their information.
Consent statements are generally an area of contention right now. For example, in health care, consent for treatment can often be conflated with consent for the management of information. There has been quite a lot of research lately that also suggests that, quote, “consent is broken.” This is not, I think, particularly helpful, since we don’t have an alternative to date. But I want you to know
that there is a lot of
discussion and debate
around some of these mechanisms
and these common requirements
as we move forward today.
So when we think about privacy, one
of the easiest ways to look at it
is by country. The Internet, I think, was very much intended to be global; “information superhighway” was the phrase for a good long time, for those of you who might be old enough to remember. And what we have seen, of course, is that with increasing privacy laws in different countries, the Balkanization of the Internet is much more of a reality. The EU’s GDPR, I think, represents yet another aspect of that: privacy laws passed specifically in and for the European Union that will nonetheless have an impact on how we all use information, and on what our expectations are.
So let’s take that back to the United
States. I apologize for this eyechart.
I want you to take away from it the complexity
as opposed to the actual content here.
Our research team looked state-by-state
to examine what privacy requirements
there might be that we could
pull out issue by issue.
Now tools like this where you can do
these kinds of searches are fantastic.
However, they are mostly accessible
to lawyers, and by subscriptions
with not insignificant costs associated
with them. They are also behind firewalls.
And for example, they can often
still be written in legalese.
So if you are turning to a law to try and understand what your requirements are, it can be a very challenging situation.
Let me give you an example.
In Alaska, which I am picking because
alphabetically it was our first result
after Alabama which has no requirements,
Alaska defines a privacy breach
as quote, “the unauthorized
acquisition or reasonable belief
of the unauthorized acquisition
of personal information
that compromises the security,
confidentiality, or integrity
of personal information maintained
by the information collector.”
Now, I can tell you as someone who
has studied and worked in this field
for about 20 years, I’m already lost.
That is not a notably long sentence, but it contains a lot of words that are open to interpretation. For example, “reasonable belief” of an “unauthorized acquisition.” And then you deal with the concept of what is considered compromised confidentiality. What is considered a compromise to integrity? How do you define an information collector? All of these notions, and thoughts, and expectations are really open to interpretation.
Let me dive a little further
and give you another example.
If we go straight into that Alaska requirement, we know, for example, that you can multiply this across most states and different categories, and then you would have a pretty in-depth notion of breach notification alone.
On the left side of this slide you will see a general breach notification statute quote that comes straight out of Alaska. It effectively attempts to narrow the definition by suggesting that if the organization owns the personal information on a resident of the state, then that is a breach, but notification is not required if there is no reasonable expectation of harm to that resident.
But if the organization doesn’t
own the personal information,
then it must notify the person
that does own that information.
And again, to me and to several of the
folks that I worked with on this project,
this becomes an extremely complex
almost Venn diagram or flowchart,
and bearing in mind, contextually we are
looking at one particular requirement,
for one particular breach notification
law, in one particular state.
So you need to kind of push
further, and push further,
and push further on each of these
definitions and it gets complicated.
The definitions of personal information also change substantially.
So for example here, there are specific data
points laid out on the right hand of the slide
that talk about first names, last names, initials,
Social Security numbers, driver’s licenses,
etc. etc. You can think about all
of the pieces of personal information
in your day-to-day life, or even
just look at your driver’s license
and begin to understand that
if you list out data sets
in terms of what is considered
covered by privacy requirements,
and what is not, that again, gets
very complicated very quickly.
Well, given that there is no consistency around breach notifications, or requirements, or definitions of what constitutes a breach, the next question we had was, well, do breaches actually happen? If you are looking at a largely unregulated sector when it comes to privacy, can you even say that there was a breach?
And this is where
things get interesting.
This chart represents some data that we
obtained from the Privacy Rights Clearinghouse
which is a California
nonprofit corporation.
It is a 501(c)(3). It
has a two-part mission.
It deals with consumer
information and consumer advocacy.
It was established in 1992, so
they have been working in this area
for quite a while. And they have an
open searchable database of data breaches
that are reported by anyone. So for
example, if you know of a data breach,
you can go to their website,
enter a few data points,
and they begin to track those things.
In the NGO category that the Privacy Rights Clearinghouse uses, which is nongovernmental organizations, or effectively civil society nonprofits, you
can see that they have tracked
some really interesting
breaches. These numbers represent
the actual number of
records that were breached.
So you’ll see there is a
fairly significant spike in 2011
of 3 million records that were breached.
This refers to the 2011 US
Chamber of Commerce China hack
where hackers in China were able
to breach the computer systems
of the Chamber of Commerce, and
had access to the information
of roughly 3 million members
from November 2009 to May 2010.
It was discovered in May 2010, but there is some evidence that suggests those systems were still compromised as of March 2011.
Email communications were also
compromised, and there were also things
like trip reports, schedules,
policy documents, meeting notes,
all kinds of information around expense
reports, and other bits and pieces
of folks who had dealt with
the US Chamber of Commerce.
So I think we know breaches
are definitely an issue.
You will see reporting of them
trailed off a bit in 2016 and 2017.
Nobody really knows why, since it is sort of an unscientific study, but it represents an interesting notion that NGOs are still experiencing these kinds of breaches.
So for another data point, we turned to look at the coverage of civil society and privacy in the news.
What does the coverage look like?
Is it advisory? Is it about breaches?
What kind of reputational
risks are nonprofits facing?
And I pulled into this deck sort of
a highlight of some of these issues
based on our literature review.
This is one of the earliest
ones we found in 2012.
Compliance SAI Global is a newsletter that operates predominantly in the UK, but is of course online. This article references the ICO, the UK Information Commissioner’s Office, announcing that a charity called Enable Scotland had breached the Data Protection Act. The ICO stated that two unencrypted memory sticks and papers, which contained the personal details of up to 101 people, had been stolen from an employee’s home. That information included people’s names, addresses, dates of birth, and health information, so a not insignificant breach.
In this particular article they report
that the ICO stated that the charity
did not have specific guidance for
homeworkers on keeping data secure.
Portable media devices were
not routinely encrypted.
And the chief executive officer
of the charity had signed
an undertaking committing the
charity to improve its compliance
with the Data Protection Act. So
again, this is a really good example
of an article where you would
expect some advice or guidance
about how to address some of these
issues, but it’s a little bit
more general than that. It’s mostly about
the actual context of the issue itself.
Let’s look at another example
also coming out of the UK in 2014.
This is a “Computer Weekly”
article. And I should mention
all of these links are in the slides,
so you can check on any of the details
that you would like. The takeaway from this particular article was that the ICO investigation found that the charity failed to realize that the website was storing the name, address, date of birth, and telephone number of anybody who had requested a call back for advice on pregnancy-related issues. The personal data
was not stored securely and a
vulnerability in the website’s code
allowed a hacker to access the
system and locate the information.
This was a really specific, targeted example of where data breach protections were not in place, because the folks who were running the charity, of course, didn’t have the cybersecurity expertise necessary to address some of these issues that are now a little more common and well known when it comes to running websites.
Let’s try the next one here.
Another article from 2014 was published in The NonProfit Times, which actually did a really good job in this article of framing these issues around data breach notification as something nonprofits could act on. In this particular case
they talk about a targeted attack by hackers: the LA Gay and Lesbian Center, last year, so that would have been 2013, was the victim of what the charity described as a sophisticated cyberattack designed to collect credit card numbers, Social Security numbers, and other financial data. In a statement published at the end of 2013, the Center said there was no actual evidence that anyone’s information was acquired, but approximately 60,000 clients and former clients were notified that their information might have been compromised over a 4- or 5-month period.
This is a good example of how the reputational issues associated with charities can have a real impact on how they manage personal information, and it lends more credibility to why this issue needs to be looked at in greater detail.
Another newsletter we found was from a law firm that deals specifically with a lot of nonprofits. They offered the advice that even smaller breaches should be of real concern to leadership and boards at nonprofit organizations.
They are managing a lot of
personal information. You all are.
And that is not just about
the members and volunteers.
It is also donor information,
employee information, all of which
can include all kinds of sensitive
data points, name, address,
phone number, email addresses etc.
Some nonprofits have mature
privacy and security programs,
but I think for the most part, what
we need is some standardization.
And we are starting to see in these articles the inconsistency in how these requirements or best practices should be implemented from a resourcing perspective.
In 2015 things start
to shift a little bit.
And again, Nonprofit News raises some really good points about best practices.
For example, permitting personal
information to be stored on laptops
or smartphones is problematic,
especially if they are not encrypted.
Drawing a line between what employees
do on their personal devices v.
what they do on work computers
is another important point.
One of the interesting things that
the Nonprofit Times highlights here
is that the standards, and this is in
2015, are becoming increasingly complicated.
And following best practices in data management and security is especially challenging when
data is moving between devices
and storage sites. You have many privacy
regulations that require businesses
to protect personal information
no matter where it resides.
So that could be on a network.
That could be on an employee device.
It could be of course, on paper as well.
So thinking about a holistic approach
to dealing with data privacy
becomes more, and more important.
An insurance company newsletter in
2015 had some interesting case studies
about a Utah food bank breach that occurred that year. Unauthorized access was gained to the records of 10,385 donors through the website. And the Utah food bank had to absorb the costs of offering credit monitoring and restoration services to people who were victims of identity theft as a result, which took a significant bite out of their budget.
I am happy to report we also found a TechSoup article, a really good one, that outlined different notions of nonprofit mandates for the protection of data beyond simple regulation, a.k.a. trying to do the right thing, whatever that means. One
particular quote jumped out at us,
“Most nonprofit leaders admit
knowing too little about the risks
and consequences of failing
to adequately protect
personal information collected from
employees, volunteers, clients, and donors.”
I think the important take away
from that is that we are talking
about different kinds of personal
information with different contextual risks
around it. The potential breach of donor
data has very significant consequences
for a nonprofit, but they are very
different from a potential breach
of employee data. And that is an important thing to consider as you build out privacy requirements.
In 2017, we start to see a little bit
more specificity in some of these articles
that are being written. This is a law
firm newsletter specifically geared
toward nonprofits. And it reviews a privacy breach that resulted in, essentially, malware being installed on a nonprofit’s computer, after which the files and data on those computers were held for ransom. The hackers demanded $43,000, and when the organization refused to pay, the hackers posted on Twitter private letters that the organization held in its capacity as a service provider. And consequently,
in addition to losing all of their
files, the organization also lost funding
because they didn’t have any of the
administrative data they needed to apply
for grants. This article really stresses that this is a much more common kind of breach that doesn’t necessarily make the mainstream news,
but has dramatic and drastic
consequences for the nonprofits
that operate in the space, and can
be subject to those kinds of attacks.
Sorry, skipped ahead there.
Charity Digital News published a series of notifications written in 2017 by Elizabeth Denham, who is the new Information Commissioner in the UK, at the ICO. A lot of these posts and myth-busting pieces have much more specific guidance in them, and talk about how quickly the details need to be provided to a regulator when a breach occurs. This is a fairly substantial change from what we’ve seen in privacy in the past, where there was an expectation that an organization could identify, stop, solve, and remediate breaches within as little as 72 hours. The
ICO and other data protection authorities
are now recognizing that that is not
feasible, and that you may become aware
of a breach within a certain time
period, and there is an expectation
to share that information with the regulator,
but not necessarily full remediation plans
or comprehensive reports at the
outset of the discovery or detection
of an incident. Regulators seem to be pivoting to understand that nonprofits specifically need some support and guidance as they go through these kinds of incidents. And I think that is a really good sign.
There is another data
breach tracking site.
This one particularly
highlights specific data breaches
related to employee data. It
was kind of an interesting case.
There was an employee who
was working at a nonprofit
who sent spreadsheets containing
information of vulnerable clients
to his personal email
address without the knowledge
of the data controller which
was his employer, the nonprofit.
The defendant sent 11 emails from his work email account to his personal account. They contained sensitive data on 183 people, including children. The data points were full names, dates of birth, telephone numbers, and medical investigations. And he had apparently done this in the past, as the nonprofit discovered when it dug into this breach.
I find this one particularly interesting, because there really is no financial motive here, no suspected hacking, no identity theft, none of these other things. This really does appear to be a case of an employee who wanted to do work at home and just wasn’t familiar with what the best practices around the management of client data should be. And it’s probably a good example of why training employees on these best practices is a really important part of doing business.
And last but not least, I wanted to share
this article from the Information Age,
also published in the UK. Data protection laws have never been so stringent, and this was evident as it came to light that 11 UK charities had been fined by the ICO for misusing information about millions of past donors to seek further funds for future projects.
The different organizations
are listed in this article.
They include Oxfam, Cancer Research,
Cancer Support, The Royal British Legion.
And these fines ranged anywhere from £6,000 all the way up to £16,000. Collectively, the charities were fined £138,000.
And the ICO decided to keep these individual fines quite low, because the donors, the ones who had actually, quote, been “exploited,” had specifically given statements to the ICO that they would be unhappy at more severe financial punishments, which is kind of a good news story, I guess. The donors of these
charities really stood behind them
as they faced some rather rigorous
review for their data practices.
And the ICO also talked specifically
about why these charities were targeted.
They had screened millions of donors
to target them for additional funds.
Other charities had traced or
targeted new or lapsed donors
by putting together
personal information obtained
from other sources. And still other charities had traded personal details with different charities to create a large pool of donor data for sale.
And I think some of those
practices are fairly common.
And in the UK you will see an increasing
tendency toward trying to shut those down
which presents some really
interesting challenges.
So to sum up from a literature review perspective, there were really only 5 points that we managed to cull from the massive amount of academic and industry literature that we found.
First of all, make sure that
you have advice for employees
on how to keep data secure.
Second of all, make sure you’ve got some mechanism for understanding your compliance obligations related to managing personal information.
Thirdly, be careful that your employees only have access to the information that they need to have access to, and that that access is for legitimate use.
And train them about not quote “snooping”
in data just for interest’s sake.
We have seen similar things
in the health care industry.
Fourthly, there are lots of recommendations about not permitting personal information to be stored on a laptop or smartphone, even a work- or organization-owned one.
And then finally, I think what we
really see in a lot of this coverage
and literature is just that
bad things are going to happen.
It’s not a question of
preventing breaches entirely,
it is a question of exercising
the best due diligence
you can in context under those
circumstances, and then accepting
that you need to have
a well-established plan
for how to deal with
things when they go wrong.
Those observations, though, were just the tip of the iceberg.
What we also began to look at was
what are the open questions that exist
for civil society and privacy.
So right now for example,
we know there is no overarching
privacy regulatory framework
that is specifically geared toward
civil society in the United States.
However, there are other regulations, both in the United States and beyond, that can help inform how civil society should address privacy issues.
There is definitely media coverage, which means there are lots of case studies available, and different things we can look at as opportunities to learn what we need to do better. We know that breaches do occur, but there is a fairly significant dearth of data on what the actual harms of those breaches are and how we quantify them.
There is a lot of advice, definitely a
lot of advice, but really no information
or guidance on how to evaluate all of
it. How do you know which privacy advice
is good privacy advice? How
do you know what to implement
under what circumstances,
and in what order?
So that led us to sit down and begin to talk through what the hypothesis and design should be to help solve this problem.
The first thing that we did
was toss out that global map
that I showed you at the
beginning of our seminar today
that contains a list of all of the “privacy
laws, and regulations, and obligations”
by country. And there is some
fascinating academic research on this.
And the last total I believe
was 183 different laws globally.
If you go down below the nation-state level to, say, provinces, states, territories, etc., then you will discover a whole other wealth of privacy law
and legislation. And again, if you go
further still to look at sector specific laws
and requirements, then you have
even more. In the United States
there’s probably upwards of
several hundred requirements
that pertain to different aspects
of both information management,
data breach notification, and
a little bit about how you need
to deal with specific data
points, for example, in HIPAA.
We decided that that was not helpful. We could have looked at different ways of concording all of those things, but the approach that we took was to ask: where do those laws come from?
And in large part they are
based on a set of principles
that were generally developed by region.
One of the first was the OECD privacy guidelines framework, which came out of Europe in the late 70s and began to talk about what the principles associated with privacy are that we would see and expect, not just legal principles, but also the values and ethics that people hold around privacy.
Well, we looked for that kind
of document across the globe
and we found them. And what
was particularly interesting
about these principles is they begin to
align, not necessarily in implementation,
but certainly in the values and
notions behind what the expectations
of privacy are by data subjects.
So let me give you an example. This is
a sample principle that we pulled out
of one of the guidance
documents that we have.
And this particular principle talks specifically about program management. What are the expectations for having a really strong program management function for privacy, but also for security? In this case, it particularly highlights the need to have guidelines, the need for those guidelines to be context-specific, the need for safeguards, specifically privacy risk assessments, the need to incorporate that into a governance structure, the need to have incident management programs, and the need to have compliance monitoring.
These are effectively the best
practices for program management,
and they have been customized for
privacy. So we were quite excited
to see some of these commonalities
start to pop out as we began to review
these specific documents.
Here is another example. One of the
principles calls on organizations
to demonstrate that their privacy
program is quote “appropriate”
which leaves a lot of room
open for interpretation.
But if you go back to these requirements
set out in the first principle,
you can begin to see where you
could establish a code of conduct,
or give a binding effect to this
program by committing your employees
either through an employee
agreement or some other mechanism
to follow these program
requirements the same way you would
with any other best
practices or code of conduct
that are required in an organization.
Another principle that we saw very
well established in these documents
was the need to provide notice when
there has been a security breach
of any kind. Now, different jurisdictions
will quantify that differently.
Some will say notification is required when there has been a significant breach. Others will say notification is required when there has been a breach that will adversely affect a data subject. I think the takeaway from this really is, if you suspect a breach,
go to your regulator and let
them decide, or help you decide
what kind of notifications
if any are necessary.
These patterns that we started to see were really exciting, because once you start to see patterns, you can begin to adopt a language to describe what those requirements are, and be consistent in how you apply them.
And these were the patterns that
we found globally, 12 patterns.
We’re completely open to the
fact that there may be more,
but in effect what we see globally is
a requirement for some due diligence
around the collection of data, some
safeguards around the access to that data,
limitations on the use of that data aligned
often with consent or accountability,
limitations on the disclosure of data,
concerns around the safeguarding
of the disposal of data,
requirements limiting
the retention of data,
same for archiving and backup purposes,
and a real push for organizational transparency, which I think we can all see is becoming more and more important, not just in the civil society sector, about how organizations do all of these things, being open and transparent with data subjects, proactively, about what the uses of their data are, not simply asking for blanket consents or signatures on end user license agreements.
We also see a right of
access in a lot of countries
where an individual can ask an
organization to see everything
that they have on
that particular person.
Now, there is a lot of debate in
this space. In Ontario for example,
there is a carveout for meta data
generated about a person’s information.
So in other words, if I was on Facebook
and wrote a post, I could ask Facebook
to give me back that post
which they will share with me
along with any other
information I’ve uploaded,
but there is still a lot of
debate as to whether or not
I would also be able to get access
to the logs that Facebook tracks
on everybody who uses their website.
And that is not just a Facebook thing.
That is every organization. Because are those logs just about me, or are they my data as well?
So there is some interesting ethical
issues that need to be sorted out
with right of access. Data
quality is an interesting one.
We often see this represented
as integrity. In other words,
the information you have about me has to be accurate, and that leads nicely into the right to correction.
So if I discover that an
organization has data about me
that it is inaccurate, I have a
right to have that data corrected.
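To give a feel for the shared vocabulary these patterns make possible, here is a hypothetical sketch (not from the talk) encoding the patterns named above so that a tool could tag requirements consistently:

```python
# A hypothetical enumeration of the privacy patterns described above.
# (The talk counts 12 patterns globally; this sketch encodes the ones
# named explicitly in the transcript.)
from enum import Enum

class PrivacyPattern(Enum):
    COLLECTION = "due diligence around collecting data"
    ACCESS_SAFEGUARDS = "safeguards around access to data"
    USE_LIMITATION = "limits on use, tied to consent or accountability"
    DISCLOSURE_LIMITATION = "limits on disclosure of data"
    DISPOSAL = "safeguarded disposal of data"
    RETENTION = "limits on retention of data"
    ARCHIVING_BACKUP = "limits on archiving and backup"
    TRANSPARENCY = "organizational transparency"
    RIGHT_OF_ACCESS = "individual right of access"
    DATA_QUALITY = "data quality / integrity"
    CORRECTION = "right to correction"

# Any law or recommendation can then be tagged with the patterns it
# touches, e.g. a breach notification statute:
alaska_rule_tags = {PrivacyPattern.ACCESS_SAFEGUARDS, PrivacyPattern.TRANSPARENCY}
```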
Once we see these patterns across all of the privacy requirements, it helps us get to the question of how we then do privacy.
There is still a lot of work, and a lot of open questions, under each of those patterns.
So we began to look at what recommendations would apply to a specific situation.
We needed to talk about that though.
We needed to talk about
that in a meaningful way.
We developed a questionnaire
to help us understand
what users were looking for
in the management of data.
But then we also had to look at how
do we actually implement those things.
How do we prioritize them? How
do you find the right answers?
It requires a significant effort to curate data, and this was from people who were doing this full-time, without another job on their hands.
We settled initially on
resources that were published
by regulatory groups, or
nonprofits, or advocacy groups
that specialize in privacy. For
example, the Future of Privacy Forum,
the Electronic Frontier Foundation,
the ACLU. We looked for templates,
for questionnaires, and for documents that could be used immediately, without requiring subject matter expertise or significant cost or time resources. As you can
imagine this is a significant effort,
and we have just begun to scratch the surface
of how we evaluate those kinds of tools.
We also wanted to look at other
jurisdictions that have different actions
based on the type and sensitivity of
data, so we are including that as well.
We need to know what location our users
are in so we can customize our advice.
But one of the real questions that we
have is how do we set civil society up
to do things in a way that is easy,
and repeatable, and defensible,
in a prioritization that makes sense?
To that end, one of the things we researched specifically was prioritization. From a practical perspective, I will walk you through a personal example.
When I was first hired to create a privacy
program for a health care organization
20 years ago, the people I worked
for had no idea what I was doing.
They knew we needed
something to do “privacy”
and somebody to do “privacy,”
but they didn’t know what that meant,
or what it was I would need to do that.
They just wanted me to take care of it.
In effect, I sat at a desk and made
judgment call, after judgment call,
after judgment call, about what
those privacy requirements were,
how they should apply to
different lines of business,
what constituted good enough,
and how to create some kind of
effective program that could live on.
And I’m not trying to
toot my own horn at all.
I made a lot of mistakes doing that.
And I think that’s normal. The
problem is there is no scientific basis
for prioritizing, or looking
at what those requirements are,
and how they should be implemented.
I sat and I remember very well
counting out the list one by one
of all the legal requirements,
all the contractual obligations,
all the vendor commitments
we had at the time,
and came up with a list of 250
things that needed to be done.
Where do you start?
Where do you start with the question?
And that issue of
prioritization remains today.
We cannot reasonably
ask any organization,
but particularly a civil society
organization to undertake 80+, 200+, 500+
consecutive privacy tasks. We
need to know which are the ones
that are a better time
and resource investment.
So the way we did this was we counted.
Privacy regulators publish their
findings. And the FTC in the United States
is no different from any other privacy
regulator. We looked at every single case
that they published by the tags that
they used or the filings on their site.
The lawyers on our team looked at each
case, pulled out specific data points
that would help us understand
what the advice was,
what was being recommended. And then we used that information to weight the recommendations and the prioritization in our tool.
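As a rough illustration of the counting-and-weighting approach described here, assuming invented tags and counts rather than the team’s actual FTC data:

```python
# A hypothetical sketch: tally regulator findings by tag, then rank
# recommendations so the most frequently cited issues come first.
from collections import Counter

case_tags = [
    "transparency", "safeguards", "transparency", "consent",
    "safeguards", "retention", "transparency", "safeguards",
]

tag_counts = Counter(case_tags)
total = sum(tag_counts.values())

# Convert raw counts into weights, highest-risk areas first.
priorities = sorted(
    ((tag, count / total) for tag, count in tag_counts.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for tag, weight in priorities:
    print(f"{tag}: {weight:.0%} of enforcement findings")
```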
So with that – oh, actually, let
me speak to this lovely pie chart.
I didn’t put data points on here
because what I wanted you to see
was just the visual. I think, I hope, your
eye is drawn to the bigger parts of the pie
which are in fact, exactly where the vast
majority of the findings from the FTC sit.
As a person who wanted to implement
privacy in an organization,
you can be damn certain that is
probably where I’m going to start.
And that gives me a really nice slice of risk: if I just look at half of my risk situation, there are 1, 2, 3, 4, 5, 6 things that I can do right off the bat that will take care of 50% of the regulatory and enforcement risk I have for privacy in my organization.
To me, that’s when we
started to see real value add.
So we created a Web app, and
we called it Privacy Patterns
because we were looking for patterns,
and we want to surface those patterns.
This is our site. I will stress
again that it is very much in beta,
so we are working with the
designer and some other folks now
to make it a little
bit more user friendly.
We started, of course, with the standard thing that you always do in this space, which is a disclaimer. The lab
is providing this information
to help people try and prioritize what
actions they want to take for privacy
for their organizations. It is not
legal advice. It is best practices.
It is general guidance. It is
education. We always will recommend
to talk to an attorney about
some of these more complex issues,
but we want to surface for you the ones that can be dealt with using some general information.
Once you get past that disclaimer,
the first thing we want to talk about
is what is being done with
your data. We broke this out
into some of the different patterns
that we saw looking at for example,
are you interested in collecting data?
Are you interested in
accessing someone else’s data?
Are you interested in using the
data your organization already has
for some other new purpose?
Are you sharing that data with someone
else, or a different organization?
Do you have retention
schedules in place?
Are you interested in archiving
and backing up data that you have?
This helps us categorize the
advice we would want to give
based on those data points
that you are sharing with us.
We also ask for location. You’ll see a sort of random smattering of countries and states here that is not complete, reflective again of our working through these proof-of-concept issues. So
you select particular jurisdictions
that you are interested in. And
then we will show you the research
that we have found in that space and give
you a top 3 list of what are the things
that you should be doing, or taking
care of from a privacy perspective
for that area that you are interested in,
both geography and the privacy patterns.
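As a hypothetical sketch of how those two inputs, a privacy pattern and a jurisdiction, could drive a top-3 list (the patterns, jurisdictions, and advice strings here are invented; the real Privacy Patterns app may work differently):

```python
# Map (pattern, jurisdiction) pairs to ranked recommendations.
ADVICE = {
    ("collection", "California"): [
        "Publish a notice describing what you collect and why",
        "Collect only the fields the stated purpose requires",
        "Record the legal basis or consent for each collection",
    ],
}

def top_three(pattern: str, jurisdiction: str) -> list[str]:
    """Return the top-3 recommendations for a pattern and location."""
    return ADVICE.get((pattern, jurisdiction), [])[:3]

for step in top_three("collection", "California"):
    print("-", step)
```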
So let me drill into
a little example here.
For example, one of the pieces of advice that we often give is that it’s important to be specific about data handling practices. We then provide you a link
with a document that is a
template that shows both notice
and policy requirements
that you can download,
customize for your organization, and
publish in about 10 to 15 minutes.
Lo and behold, that tool specifically
comes from the Digital Impact Group
which I’m sure some of you have heard
of, and is actually part of our lab.
So we were quite excited to
see that some of our research
actually bounced right back to some
of the tools that DCSL offers already.
Again, these documents are templatable.
It was our intention not simply to
say, “you need to be transparent,”
but to provide an organization with a vehicle
to be transparent that was easy to use.
We also wanted to put in
a quick glossary of terms,
recognizing that a lot of these
are specific for a given geography,
a given industry. A health care
definition might be different
from an education definition. We are
trying to solve the greater problem
of the sector, so a lot of these
are going to remain somewhat vague
with the intention of leaving
it up to the organization
to customize for their own purposes.
And that is our tool in a nutshell. So
as I promised, I wanted to leave some time
for specific questions, or
other things to walk through
and then I will share
some next steps with you.
Sima: Thank you Tracy,
that was super informative.
So if you guys have questions,
again, please use the chat box
to have Tracy answer them.
I can read them out to her.
One question kind of surfaced in my brain when you were going through this.
Is there a sort of – I know
for a lot of for-profit businesses
they have like a seal of approval that
shows they’ve taken all the steps
to be private and compliant. Is there
such a thing with nonprofit organizations?
Tracy: That’s a great question.
I think off the bat I would tell you
that TRUSTe used to be the signature seal for privacy, and I believe they have now rebranded themselves as TrustArc. And I think that they do, in fact, offer a similar kind of service for nonprofits.
The only thing I am not sure
of is how much it would cost.
And I think that’s the special
consideration for this sector,
is thinking about the cost benefit of
having those kind of certifications,
largely dependent on the kind of data
that you hold and what your data subjects,
and employees, and
donors are interested in.
So it is definitely worth exploring.
Sima: Okay, got it.
Tracy: It also looks like
there is a question here –
Oh, George Hamilton weighed
in with a fantastic quote.
If you haven’t all read
it, you definitely should.
It’s a standard, standard
security statement which is,
“there are 2 groups of organizations,
those who have been breached,
and those who don’t know they have
been breached.” And it’s really true.
It is not a question of “if.” It’s
absolutely a question of “when.”
And then I see Whitney raised a question about an ethics and compliance program framework for organizations, and how to guide them on developing a stronger privacy posture.
Yes, okay, this is a great document.
For those of you who are not aware of it,
the US Department of Justice guidance on compliance effectiveness is a fantastic resource, and
has also a really good document
that outlines what core compliance
programs should have in them
as the beginning of
creating a privacy program,
or something related
to privacy compliance.
I really think the next step after
establishing that kind of program framework
is to figure out how to measure it.
And I know that’s a much longer topic,
one I speak about regularly and
I’m happy to share resources on.
But the next question becomes
not just having a policy in place,
but having a vehicle and mechanism by
which you can demonstrate effectiveness
of that policy. So that is a
really good starting place for that.
And Whitney, I’m happy to talk more about
that with you if you have more questions
in that space. Let’s
see what else we’ve got.
The best first step for privacy,
Jennifer, that’s a great question.
I’m going to be a little weaselly
here, and give you two steps.
The first thing I think you want
to do as someone who is concerned
about privacy at an organization
is understand what data you have.
There are lots of resources to
help conduct a data inventory.
AIMA is one of them, and I
will send a link around to Sima
afterward to share with you.
They have a lot of great resources
on their website. That is a
really good place to start,
because if you discover you have
a lot of really sensitive data,
you may want to take a
slightly different first step
than if you have somewhat
less sensitive data.
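As a rough sketch of what a first-pass data inventory might capture, with invented systems, fields, and sensitivity labels:

```python
# A hypothetical first-pass data inventory: for each data store, note
# what personal data it holds and how sensitive it is, so the most
# sensitive holdings drive your first privacy steps.
inventory = [
    {"system": "donor CRM", "fields": ["name", "email", "donation_history"], "sensitivity": "medium"},
    {"system": "case files", "fields": ["name", "health_info"], "sensitivity": "high"},
    {"system": "newsletter list", "fields": ["email"], "sensitivity": "low"},
]

# Review the most sensitive systems first.
rank = {"high": 0, "medium": 1, "low": 2}
for entry in sorted(inventory, key=lambda e: rank[e["sensitivity"]]):
    print(f'{entry["system"]}: {entry["sensitivity"]} ({", ".join(entry["fields"])})')
```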
I think it’s probably also
worthwhile to think about establishing
some kind of policy or guidance
document. And I know some organizations
are a little loath to
implement formal policies.
So what I’ve seen some
organizations do is create
what is called “guidance
documents for privacy” as in,
these are the values that we hold
in the management of your data,
and then kind of work
through those issues together
and make sure that that is
reflective of the business practices.
In terms of your question
about Google Drive,
I’m probably going to have to
punt that one if for no other reason
than a lot of it depends on what you mean
by “safe,” what kind of information,
and how Drive is implemented.
And that is really the same for any type of cloud storage, whether that is Microsoft, or AWS, or Google. They are all going to have different positives and negatives around their services in relation to compliance.
So if you want to reach out to me, I’m happy to
discuss that with you a little further off-line.
Melissa, “Are there resources
for how to decide who sees what
in an organization?” And
yes, many, many resources.
One that springs to mind specifically, and it is a little obscure but I still like it: the government of Ontario has a security team that publishes advice and guidance on role-based access control, containing a really cool matrix chart that effectively sets out what your role is, what your function is, and, okay, this is the data that you should have access to, with recommendations in a really handy little chart.
We are going to be including
that in our tool as a link.
It is not up yet, but again, if you would
like me to point you to that document,
I am happy to do so.
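For illustration only, here is a toy role-based access matrix in the spirit of the Ontario guidance mentioned above; the roles and data categories are invented:

```python
# A hypothetical role-based access control matrix: each role maps to
# the categories of data that role legitimately needs.
ROLE_ACCESS = {
    "volunteer_coordinator": {"volunteer_contact_info"},
    "development_officer": {"donor_contact_info", "donation_history"},
    "case_worker": {"client_records"},
}

def can_access(role: str, data_category: str) -> bool:
    """Check whether a role is permitted to see a data category."""
    return data_category in ROLE_ACCESS.get(role, set())

assert can_access("case_worker", "client_records")
assert not can_access("volunteer_coordinator", "donation_history")
```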
Sima, is that it, or are there
more questions that I don’t see?
Sima: I think that’s it. We
have just a few more minutes
so I think that’s good. If you
guys have any other questions,
Tracy has her information I
think in the next couple slides.
Tracy: I do. Could I flip to that next slide now? Would that be okay?
Sima: Yes, yes.
Tracy: Okay, perfect. So thank
you very much for your questions.
It sounds like you guys are really
on point, and on top of all of this.
I would also ask you for your feedback, both on anything I talked about today and on any other aspects of the project and what we are building. We really want it to work for all of you, and it is really important to us that we both represent your concerns and answer your questions.
So I have created a really lightweight, open-ended survey on SurveyMonkey.
If you want to fill that out, I would
greatly appreciate your feedback.
Also, you can just email me directly
with any thoughts that you have.
And again, I want to stress, this
is not about building a product.
It is absolutely going to be an
open source tool that we maintain
through the lab for this community. So we
would be delighted to have that feedback,
and those comments or concerns, or
additional questions that you have.
So with that I will say thank you
very much to all of you for your time,
and turn it back over to Sima.
Sima: All right, thank you Tracy.
And we can send out that link
also in the follow-up email
when the webinar is over.
So just to wrap up, a little
bit about TechSoup again,
here is the link if you are
interested in our product donations.
And then one thing we always like
to do is just ask our attendees
to chat one thing that they
learned in today’s webinar.
We also have a post event survey, so any
feedback that you have for us really helps.
And then also, we are on social media.
So if you are on Facebook, Instagram,
or Twitter, we love social media love,
so please give us a like, or a heart, or whatever is relevant to the platform.
And then also please feel free to visit
our blog which is blog.TechSoup.org.
And we have a few more webinars
coming up. We have one next week,
Video Storytelling Made
Easy with Adobe Spark.
And on 5/31 we have 5 Clear Steps
to Get Your Nonprofit Cloud Ready.
If you are interested in watching
any of our archived webinars,
you can go to this link to watch this
one, or any of our other prior webinars.
And again, thank you Tracy
for all that information.
That was super helpful.
And thank you to LaCheka
for helping on chat.
And lastly, thank you to our
webinar sponsor ReadyTalk.
Thanks, and hope to see you all soon.
Tracy: Thank you.
