Death and digital civil society

Last week's episode of the Raw Data podcast is mostly about death.

That said, the last 7 minutes or so include some thoughts from me about our relationships to our digital data, why we need new rules for this resource, and why it matters during life as well as after it.

Take a listen: https://soundcloud.com/rawdatapodcast/episode-9-the-digital-afterlife

I show up around minute 20:00.

In addition to being fun to record, the interview process prompted me to think hard about perpetuity, immortality, the law, and digital data. This is exciting. It also ties in nicely with an event on Giving in Time that Stanford PACS is co-hosting with Boston College School of Law - a public event on campus on April 4, 2016. Stay tuned for more details.

Thanks to the folks at Worldview Stanford and the Stanford Cyber Initiative.

Civil society's (and philanthropy's) digital roots

Today's assigned readings on digital civil society:

Cathy O'Neil on "ethical data science." She looks at the way that society's values, our assumptions, and software code influence each other. They are mutualistic, and increasingly inseparable. Those who write the algorithms, those who use them, and those whose lives are affected by them - in other words, all of us - need to understand this, question it, and use data and tools to lend new insights, not reinforce existing power imbalances.
 
Neil Richards on the need to be able to regulate code - software code and those who create it - in its many uses and forms in the digital age, and his admonition that Apple's arguments about privacy are sound, while its arguments about free speech are problematic. Applying a free speech framework to software code will make it very difficult to monitor and regulate uses of code that discriminate or cause other harms. And, increasingly, we are going to recognize that our civil rights battles are being fought on digital turf.

Both articles focus on our need to treat software code as fundamental now - to how decisions get made in society, business, and policymaking. Indeed, they argue that software code undergirds how we act as private citizens, associate with one another, and express ourselves. These rights, in turn, support civil society as we know it. Those of us focused on improving nonprofit or foundation action, on using digital tools for social outcomes, or on building globally influential digital tools for social good need to take these lessons to heart.

Philanthropy and civil society now rest on software code - this is digital civil society.


It's the data, stupid (note to self)

Last Friday I was part of the International Data Responsibility Group's second conference, where I heard incredible examples of how the World Food Programme is using data to guide its work feeding people in conflict zones, and thought long and hard about data collaboratives and the possibility of data philanthropy.

On Saturday I read about the ways Russia is targeting attacks on human rights and aid-related NGOs' digital systems in Syria.

On Monday, I participated in an incredible seminar on crowdsourcing and public decision making, partly organized around Beth Noveck's book - Smart Citizens, Smarter State.

On Wednesday, I heard Kevin Carey discuss his book, The End of College, which looks at the economic opportunities that digital tools bring to higher education. It only hints at how the digital data generated in those environments will become a key resource and point of contention between schools, students, employers, regulators, researchers, and teachers.

Today, I pushed forward in my attempts to convene scholars of crowd (sourcing and funding) - or what we're calling CrowdX - from across the Stanford campus, in disciplines as diverse as civil engineering and business, social algorithms and democratic theory.

I also read that the charges that Airbnb "cooked its data books" when releasing information to New York regulators hold up. It took intrepid journalists to demonstrate this to the public and to regulators - and to get the company to fess up.

And, of course, like many people I am absorbed in the legal, ethical, and democratic arguments unfolding between the FBI and Apple.

All of these disparate events raise concerns about privacy, publicness, and data. But the one issue that I think they all raise - that has actionable, policy-related implications for philanthropy and civil society - is this:

  1. What data must be made public (and auditable) by platforms that facilitate public services (transportation, shelter, funding charitable or public goods)?
We require publicly traded businesses, nonprofits, and foundations to report on their activities. This information provides some form of accountability, is helpful in fighting fraud, and - in the aggregate - provides a critical lens into the state of our economy, our social sector, and our democracy.

So, what data do we need access to, as a public, to understand, oversee, and, yes, audit platform companies that facilitate transactions that meet the same criteria of public interest? These transactions may be charitable in nature, public-good supporting, or mutual-aid related.

Setting the bar at its lowest - shouldn't we at least have access to the same information from these platforms that we would gather were the transactions happening in some other way? Why would the platforms - even as proprietary as they are - be given a pass on reporting data on transactions that we'd otherwise publicly report? And, given their own self-touted role in a "big data" economy, setting the bar that low seems, well, practically analog and 20th century.
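To make that parity argument concrete, here is a minimal sketch - purely illustrative, and mine, not a proposed standard - of the kind of transaction-level record a platform could disclose, roughly mirroring what a nonprofit or foundation already reports for comparable activity. Every field name below is a hypothetical assumption for illustration, not drawn from any existing reporting requirement.

from dataclasses import dataclass
from datetime import date

@dataclass
class PlatformTransactionReport:
    # Hypothetical fields; actual requirements would be set by regulators.
    platform: str              # e.g., a crowdfunding or home-sharing platform
    transaction_date: date
    purpose: str               # charitable, public-good, or mutual-aid category
    amount_usd: float
    recipient_type: str        # nonprofit, individual, public agency, etc.
    jurisdiction: str          # where the transaction took place
    fees_retained_usd: float   # what the platform kept

    def to_public_record(self) -> dict:
        """Return only the fields a regulator, researcher, or journalist
        could publish - roughly what would already be public if the same
        transaction ran through a traditional charity."""
        return {
            "date": self.transaction_date.isoformat(),
            "purpose": self.purpose,
            "amount_usd": self.amount_usd,
            "recipient_type": self.recipient_type,
            "jurisdiction": self.jurisdiction,
        }

The split between the full record and to_public_record is the point: a platform could keep proprietary detail private while still publishing the fields that would be public if the same transaction happened some other way.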

We're working on this question. We don't yet have answers (although here are some ideas). We do know the answer is not "nothing, trust us." And we do know that getting the answer right matters to - and depends on - the active participation of existing institutions in civil society and philanthropy.

Let the lawsuits begin


We launched digitalIMPACT.io on Tuesday at the Data on Purpose conference. It was great.

One of the conversations I had during the event was initiated by someone asking me the question, "When will the lawsuits begin?"

The point was that organizations won't change until they really have to, so nonprofits won't start digging into data governance practices and policies (what digitalIMPACT.io provides) until they're legally required to do so. Lawsuits that lead to regulatory or legislative change, this person was suggesting, are a step toward the same type of organizational behavior change we're trying to support with digitalIMPACT.

History bears out this "theory of change."

Bad behavior, lawsuit, lawsuit, lawsuit, regulatory change is a plot line (or subplot) through a great deal of social and political history. Environmental protection. Civil rights. Gun laws. Campaign funding. And, yes, the protection of civil liberties online.

Tonight, I was catching up on email and half-watching the news when I heard this sentence from the professionally alarmed local newscaster:
"Big news for parents. Your child's private data is likely on its way to a nonprofit advocacy group."
Needless to say, I gave the TV my undivided attention.

A nonprofit that advocates for special education has sued California school districts as part of its efforts to make sure kids are getting appropriate services. The school districts are (allegedly) sending electronic files on all students - with names, addresses, and social security numbers - to the organization. The news story went on to describe the privacy risks for children's data and to point viewers (parents) to an opt-out process.

How long will it be until there is a countersuit?

Now we have an answer to Tuesday's question. Lawsuits between nonprofits and public agencies about data have begun. Give it a minute and it will be the nonprofits getting sued. (Of course, lawsuits over digital data started years ago, at least as early as 1990, when EFF was founded.)

Legal challenges that force regulatory change are a tried-and-true means of changing policy. They are not the only way. There are things nonprofits and foundations can do to treat digital data with integrity and respect, and perhaps avoid litigation. Check out digitalIMPACT.io.