privacy Archives — Page 3 of 5 — LIRNEasia


An unexpectedly detailed description of our big data session was included in the Day 3 highlights: Big data is usually in the headlines for the wrong reasons – surveillance, exploitation of personal data for commercial or governmental ends, intrusion of privacy – but can also serve a valid and immensely exciting social purpose for development. Kicking off a fascinating, packed and highly-interactive session, moderator Rohan Samarajiva, Founding Chair and CEO, LIRNEasia, set out this contradiction in perception of big data as a “competition of imaginations” between hype and pessimism, reminding us that big data is “of interest to all of us, as we are the creators of this data, the originators of this data”. Our mobile telephones, and by extension we ourselves, are permanently in communication with the nearest towers, sending out details of our whereabouts and activities in an ever-growing, highly personal call record. This session aimed to “talk not about the imagination, but about what has been done”, exploring current and future trends in the use of big data for development.
Much of what is discussed as “big data” does not include the poor, because smartphone penetration is still low, social media are not used by all classes and datafied records are rare in developing countries. Therefore, the session focused on research that has been, or is being, done on pseudonymized mobile network big data in developing countries. Instead of the usual “battle of imaginations”, which pits the optimistic scenarios that tend toward hype against the pessimistic scenarios that imagine all sorts of bad things that could happen, we began with reality. What had actually been done on the ground in countries as different as Namibia, Afghanistan and Sri Lanka was presented by data scientists who knew the ins and outs of data cleaning, pseudonymization, and what software needs to be used to analyze petabytes of data at a time. The active audience raised a range of questions.
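The pseudonymization the researchers described can be sketched minimally as salted hashing of the subscriber identifier, so that records from the same phone remain linkable without exposing the raw number. The field names, the salt handling and the truncation length here are illustrative assumptions, not a description of the actual research pipelines.

```python
import hashlib

# Assumed: the salt is held by the operator, separately from the released data,
# and rotated between data releases to prevent cross-release linkage.
SALT = "rotate-this-secret-per-release"

def pseudonymize(record: dict) -> dict:
    """Replace the subscriber number (msisdn) with a salted hash.

    The same input always maps to the same pseudonym, so longitudinal
    analysis (e.g. mobility patterns) still works, but the raw number
    never appears in the analysis dataset.
    """
    out = dict(record)  # leave non-identifying fields untouched
    digest = hashlib.sha256((SALT + record["msisdn"]).encode()).hexdigest()
    out["msisdn"] = digest[:16]  # truncated pseudonym
    return out

# A hypothetical call detail record (CDR)
cdr = {"msisdn": "+94771234567", "cell_id": "C042", "ts": "2013-06-01T08:15:00"}
p1 = pseudonymize(cdr)
p2 = pseudonymize(cdr)
# p1 and p2 carry the same pseudonym, and neither contains the raw number
```

Note that this is pseudonymization, not anonymization: anyone holding the salt can re-identify a number, which is why custody of the salt and of the raw data matters as much as the hashing itself.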
I resisted the notion that we should start our work on guidelines for “big data” from the settled law of other jurisdictions. I did not do that in 1987 when I did one of the earliest policy studies on ICTs and the law in Sri Lanka, and I was not about to start in 2013. I had reservations about both the chaotic and piecemeal nature of US privacy law and the over-bureaucratic nature of European law that made even a simple list of course attendees a subject of “data protection” enforced by a Data Protection Commissioner. In addition, I sensed that big data was a qualitative jump from what existed before and it was wrong to simply extrapolate from the existing law. Looks like I was right.
Last week, LIRNEasia taught a course on broadband policy and regulation in Sohna. One of the modules was on privacy and surveillance. One of the instructors was Sunil Abraham, acknowledged for his thoughtful and creative approach to sticky ICT policy questions. Drawing a diagram, he pointed out that if surveillance were exclusively focused on the small percentage, perhaps five percent, of people who were engaged in terrorism or other bad acts, law enforcement would be more efficient and the liberties enjoyed by the non-terroristically inclined majority would be that much safer. On its face, a beguiling proposition.
John Podesta is no stranger to privacy issues. I can remember some interactions with him in the context of the Electronic Privacy Information Center (EPIC) during the Clinton Presidency. He has now been tasked with producing a big data-privacy report in 90 days. We are undergoing a revolution in the way that information about our purchases, our conversations, our social networks, our movements, and even our physical identities are collected, stored, analyzed and used. The immense volume, diversity and potential value of data will have profound implications for privacy, the economy, and public policy.
President Obama’s first response to the revelations of NSA malfeasance was jarring to many, an unhappiness articulated by Pratap Bhanu Mehta. Now we have Obama’s considered response: Mr. Obama also said he was taking the “unprecedented step” of extending privacy safeguards to non-Americans, including requiring that data collected abroad be deleted after a certain period and limiting its use to specific security requirements, like counterterrorism and cybersecurity. “The bottom line,” he said, “is that people around the world — regardless of their nationality — should know that the United States is not spying on ordinary people who don’t threaten our national security.” Full report.
For too long, the field of privacy has been becalmed by religious fealty to a concept propounded by two New England aristocrats who were annoyed by paparazzi taking pictures of a party in a home. The ill-considered explosion set off by the NSA in its zeal to prevent all future acts of terror has opened up space for new thinking on the subject. An op-ed in the Washington Post is a good example: This is an anonymity problem: The NSA cannot create a dossier on you from your metadata unless it knows that you made the calls the agency is looking at. The privacy question is all about data-gathering: Should the NSA have access to nationwide metadata? The right answer to that question is yes.
We think about transaction-generated data (TGD) a lot. The essence is that data generated as a by-product of some activity (and which is therefore highly accurate) can tell us more about behavior (even future behavior) than all the questionnaires in the world. Behavior associated with music, closely tied to emotion, seems like an even better candidate than reading. During the next federal election cycle, for instance, Pandora users tuning into country music acts, stand-up comedians or Christian bands might hear or see ads for Republican candidates for Congress. Others listening to hip-hop tunes, or to classical acts like the Berlin Philharmonic, might hear ads for Democrats.
Prof Hal Abelson of MIT recently shared his thoughts on privacy in the digital realm, at an online alumni webcast. Amongst the noise that one hears on this topic these days, his thoughtful comments resonated. Partly for sharing and partly for my own memory, I felt it justified a blog post and I capture his main points below: People don’t really know what they want when they think of privacy. They describe their privacy needs through use-case scenarios, e.g.
Viktor Mayer-Schönberger and I have been debating privacy since the early 1990s. We both had chapters in Technology and Privacy: The New Landscape, which remains a seminal book on technology and privacy, published in 1997. Just last month, we continued our conversation at IGF in Bali. He was not so forthright in Bali, but now he is putting into words what we have been kicking around in our internal discussions. At the just-concluded IAPP Data Protection Congress in Brussels, the audience heard a bold proposal from closing keynoter Viktor Mayer-Schönberger: “The naked truth is that informational self-determination has turned into a formality devoid of meaning and import.
This was a central claim in the highly significant ruling made by a Federal District Court in Washington DC: In a 68-page ruling, Judge Leon said the N.S.A. program that systematically gathers records of Americans’ phone calls was most likely unconstitutional, rejecting the Obama administration’s argument that a 1979 case, Smith v. Maryland, was a controlling precedent.

Trust as the key ingredient

Posted on November 25, 2013

Even when one disagrees with a speaker, one can learn from the engagement. I enjoyed myself at the talk given by Futurist Gerd Leonhard at ITU Telecom World, partly because I was actively engaging his stream of consciousness by tweeting. One thing I agreed fully with was his emphasis on trust. As the tweet said: “If you are in ICT Business & don’t have trust, you will be out of business in 5 yrs. Futurist at #ituworld” This caused me to dig through some old writing. Here is what I wrote back in 1999 in a UNESCO publication: The overall environment of a society has an impact on how its members approach electronic commerce.
My work on privacy in the 1990s greatly benefited from my teaching. My classes were like laboratories where we tested out scenarios and concepts. I (and my students) also engaged with science fiction. I still talk about the extraordinarily powerful, low-tech surveillance techniques described by Margaret Atwood in The Handmaid’s Tale. That was brought to me by a student.
I was reminded of that old chestnut about a flagman having to walk in front of early automobiles when I heard some participants talk at the workshop on big data, social good and privacy. Imagine imposing inform and consent rules on transaction-generated data (big data) belonging to large corporate entities such as mobile operators. They need the data on user mobility patterns to manage their networks; they need financial transaction data to manage their finances. All these things can be covered under broad inform and consent procedures that will be presented to customers as they sign up. What will not be possible would be to permit use by third parties for traffic management, energy management, urban planning etc, since these uses could not be conceptualized at the time of signing up customers.
President Obama’s support for surveillance predates his election. I believe that he has assessed the pros and cons of surveillance and concluded that it is necessary. The question then is how it is to be regulated, so that negative outcomes can be minimized. One possible path is a variation of the FISA oversight solution, but with greater transparency. This may be the path being explored by Senator Markey, perhaps one of the most well informed US legislators on telecom and ICT matters.
The ethic of reciprocity is perhaps the most fundamental principle governing human interaction. I once studied this in some depth for the purpose of teaching interconnection, of all things. My favorite was Rabbi Hillel’s formulation: “That which is hateful to you, do not do to your fellow. That is the whole Torah; the rest is the explanation; go and learn it.”—Talmud, Shabbat 31a, the “Great Principle” So now, Russia wants the ethic of reciprocity applied to the metadata, the collection of which President Obama said was no problem at all.