
Nonprofits Are Not Immune From Data Breach Risks

Digital tools bring privacy challenges to social enterprises.

March 08, 2016


A woman uses a computer in a lounge area.

Reuters/Thomas Peter

Each year, Lucy Bernholz, a research scholar at Stanford University’s Center on Philanthropy and Civil Society, releases a forecast for the philanthropic world called Blueprint.

This year, Bernholz used Blueprint to call nonprofits to action and to urge all kinds of social enterprises to be cautious in the way they use data and digital tools. While technology spurs innovation and empowers citizens, it also enables government and corporate surveillance and control.

This is not the easiest message for innovators like the people working in social enterprises to hear. We’re usually at the forefront of changes, embracing opportunities and risk. Who would welcome the need to slow down and think about possible collateral damage? But we should, Bernholz argued in a conversation we had recently.

Her biggest concern, Bernholz says, is how digital tools are transforming civil society, which she defines as the space where private resources are voluntarily used for public good.

“The biggest worry for American civil society is that we have fully drunk the Kool-Aid that digital technologies are only democratizing,” she says. “Ethical, safe, and just use of tools cannot be assumed.”

What Does a Digital Civil Society Look Like?

Bernholz cited the #yesallwomen discussion that exploded on Twitter in 2014 after murders near a Southern California college campus. The discussion broadened and moved to the mainstream media, helping to drive a political agenda just as #blacklivesmatter is doing now.

Protesters participating in civil unrest by joining a march can remain anonymous. Posting a tweet with a hashtag, on the other hand, is traceable and, Bernholz pointed out, likely being monitored and kept by the government. Surveillance knows no party lines: Over the past few years, in a slowly unfolding scandal, it came to light that the IRS had singled out conservative groups applying for tax-exempt status for extra scrutiny, flagging applications with words such as “Tea Party” in the organization’s name.

It’s easy to get hyper-focused on the cause and forget that in order to work on it, you need a safe space to do so.
Neil Malhotra

“Civil society exists so that individual citizens can express themselves outside the marketplace and outside of governmental obligations,” Bernholz says. “But the digital version of that space is entirely owned by government and corporations. We’re playing in their house.”

Most nonprofits and social enterprises have been absent from the debate over digital civil society, Bernholz argues. But, she says, they shouldn’t be. Many of them collect and use data to further their missions, many engage with partners and donors who have high expectations of privacy, and many rely on digital tools that put their work in public view.

For instance, last year’s Blueprint included the example of an app set up in China to gather daily readings of airborne particulate matter. The information was made available via Twitter and a phone app, and by 2013, individuals had launched dozens of apps comparing pollution data.

The Chinese government eventually passed laws requiring publication of pollution details. There are many examples in the United States, too — and many of them come from outside the traditional nonprofit sector. For instance, mobile phones are being used to send text messages that inform pregnant women about prenatal care; crowdsourcing enables urban entrepreneurs to start their own businesses.

Yet, what questions do those efforts raise about data, privacy, and the risks to the people running social enterprises and to the people benefiting?

Bernholz suggests that social enterprises that are employing such digital tools be aware of how vulnerable anyone collecting data is to hackers and people gathering information for their own purposes: “Know what you’re collecting and what you’re going to use it for, and only collect what you can protect.”

She offered three guidelines:

  • Consent: Obtain permission from those whose data you are gathering.
  • Privacy: Only collect what you can protect.
  • Default to openness: When possible, make the data you have collected available publicly.

The tensions and contradictions Bernholz pointed out in the Blueprint aren’t easily resolved. It’s worth remembering that while we’re working on our enterprises with all the digital tools now available, we’re also creating new rules and norms. It’s easy to get hyper-focused on the cause and forget that in order to work on it, you need a safe space to do so — the civil society.
