harms


I have been a fan of Daniel Solove’s approach to privacy, which foregrounds actual harms suffered by individuals rather than deriving remedies from abstract principles. I have often said that the informed-consent model is of zero value when people find that their personally identifiable information stored by an organization has been stolen. The US Federal Trade Commission has called for comments on informational harms or injuries. I am tempted to respond. I would, if there were 28 hours in a day.
I have been impatient with people who think that inform-and-consent is the be-all and end-all of privacy. One of the greatest actual dangers is personally identifiable information being stolen from service providers by hackers. This is a real privacy harm. I have not gone into the details of the FCC’s decision and its competitive implications. But it’s worth knowing they were paying attention to real privacy harms.
An unexpectedly detailed description of our big data session was included in the Day 3 highlights: Big data is usually in the headlines for the wrong reasons – surveillance, exploitation of personal data for commercial or governmental ends, intrusion on privacy – but it can also serve a valid and immensely exciting social purpose for development. Kicking off a fascinating, packed and highly interactive session, moderator Rohan Samarajiva, Founding Chair and CEO, LIRNEasia, set out this contradiction in perceptions of big data as a “competition of imaginations” between hype and pessimism, reminding us that big data is “of interest to all of us, as we are the creators of this data, the originators of this data”. Our mobile telephones, and by extension we ourselves, are permanently in communication with the nearest towers, sending out details of our whereabouts and activities in an ever-growing, highly personal call record. This session aimed to “talk not about the imagination, but about what has been done”, exploring current and future trends in the use of big data for development.