I spent two challenging days at the first face-to-face meeting of the Privacy Advisory Group of UN Global Pulse in Den Haag. It was challenging because it was scheduled adjacent to a privacy commissioners’ conference, and because the location was in Europe, where privacy protection has been elevated to quasi-religious status. We as researchers are trying to solve problems that affect millions of people in developing countries: traffic, unresponsive and poorly planned cities, the spread of diseases, and so on. To us, privacy and other harms matter, but in the foreground of our thinking we always place the social problems we are trying to solve. We attack the privacy problems because they get in the way of the larger purpose. But sometimes I got the sense that it’s the other way around in Europe. Privacy, not competition, not marginalization, is the main purpose. Everything else is secondary.
But UN Global Pulse did something very good in framing the issues: it asked the advisory group to consider the risks of harms caused by misuse of big data alongside the risks of harms caused by non-use of data.
So on one side, there was the possibility that the political preferences of a village or group would be precisely identified. I pointed out to the people presenting this scenario that these kinds of things were already known in the real world of developing countries. One did not have to conduct big data research to identify the voting preferences of villages and groups.
On the other side were the harms of not studying the underlying behaviors of groups for fear of some kind of group harm being visited upon them. Forms of “redlining” come to mind as I search for real examples of group harms. I will give this more thought and write about it at length. Hopefully, there will be more people who have gotten their hands dirty with data at future meetings, and I will have less of a responsibility to represent the people who are actually working with data for public purposes.