Tuesday, November 10, 2015, at 12:00 pm
Berkman Center for Internet & Society at Harvard University
Harvard Law School Campus, Wasserstein Hall, Milstein East
Lawyers and computer scientists hold very different notions of privacy. Notably, privacy laws rely on narrower and less formal conceptions of risk than those described by the computer science literature. As a result, the law often creates uncertainty and fails to protect against the full range of data privacy risks. In contrast, emerging mathematical concepts provide robust, formal models for quantifying and mitigating privacy risks. An example of such a model is differential privacy, which provides a provable guarantee of privacy against a wide range of potential attacks, including types of attacks currently unknown or unforeseen.
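To give a concrete sense of the kind of formal guarantee described above, the sketch below implements the Laplace mechanism, one of the basic building blocks of differential privacy. This is an illustrative example and not drawn from the talk itself; the function names and the sample data are hypothetical. The idea is that a counting query has sensitivity 1 (one person's record changes the true count by at most 1), so adding Laplace noise with scale 1/ε masks any single individual's contribution.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon):
    """Release a count satisfying epsilon-differential privacy.

    Adding or removing one record changes the true count by at most 1
    (the query's sensitivity), so Laplace noise with scale 1/epsilon
    is enough to provide the guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data: how many values exceed 20? The released answer is
# the true count (4) plus random noise calibrated to epsilon.
data = [12, 45, 7, 33, 28, 51, 19]
noisy = dp_count(data, lambda x: x > 20, epsilon=0.5)
```

The key property, and the reason the guarantee holds against unforeseen attacks, is that the noise distribution depends only on ε and the query's sensitivity, not on any assumption about what the attacker already knows.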
The subject of much theoretical investigation, these new technical methods for privacy protection have recently been making significant strides towards practical implementation. For example, researchers are now building and testing the first generation of tools for differentially private statistical analysis. However, because the law generally relies on very different methods for mitigating risk, a significant challenge to implementation will be demonstrating that the new privacy technologies satisfy legal requirements for privacy protection. In particular, most privacy laws focus on the identifiability of data, or the ability to link an individual to a record in a release of data. In doing so, they often equate privacy with heuristic “de-identification” approaches and provide little guidance for implementing more formal privacy-preserving techniques.
In this talk, Kobbi Nissim and Alexandra Wood will articulate the gap between legal and technical approaches to privacy and present a methodology for formally proving that a technological method for privacy protection satisfies the requirements of a particular law. This methodology involves two steps: first, translating a legal standard into a formal mathematical requirement of privacy and, second, constructing a rigorous proof for establishing that a technique satisfies the mathematical requirement derived from the law. The presenters will walk through an example applying this new methodology to bridge the requirements of the Family Educational Rights and Privacy Act (FERPA) and differential privacy. They will conclude the presentation with a discussion of how the methodology could help further the real-world adoption of new privacy technologies.
This talk summarizes early results from ongoing research by Kobbi Nissim, Aaron Bembenek, Mark Bun, Marco Gaboardi, and Salil Vadhan from the Center for Research on Computation and Society, together with Urs Gasser, David O’Brien, and Alexandra Wood from the Berkman Center for Internet & Society. Further work building from this approach is anticipated to form the basis of a future publication. This research is also part of a broader collaboration through the Privacy Tools for Sharing Research Data project, which aims to build legal and technical tools, such as tools for differentially private statistical analysis, to help enable the wider sharing of social science research data while protecting the privacy of individuals.
Kobbi Nissim is a Professor of Computer Science at Ben-Gurion University and a Senior Research Fellow at the Center for Research on Computation and Society at Harvard. Trained in cryptography, Kobbi maintains a healthy level of paranoia and feels the ground is shaky whenever issues of security and privacy are not formally defined and analyzed.
Nissim's current work is focused on the mathematical formulation and understanding of privacy. His work from 2003 and 2004 with Dinur and Dwork initiated rigorous foundational research on privacy and presented a precursor of differential privacy, a strong definition of privacy in computation that he introduced in 2006 with Dwork, McSherry, and Smith. With collaborators, Nissim established some of the basic constructions supporting differential privacy and studied it in a variety of contexts, including statistics, computational learning, mechanism design, and social networks. Since 2011, Kobbi has been involved with the Privacy Tools for Sharing Research Data project at Harvard University, developing privacy-preserving tools for the sharing of social science data. Nissim's other contributions include the BGN homomorphic encryption scheme, with Boneh and Goh, and research on private approximations. In 2013, Nissim and Irit Dinur received the Alberto O. Mendelzon Test-of-Time Award for their PODS 2003 work on privacy. In 2016, he will receive, with Dwork, McSherry, and Smith, the TCC Test-of-Time Award for their TCC 2006 work on differential privacy.
Alexandra Wood is a fellow at the Berkman Center for Internet & Society and a member of the Privacy Tools for Sharing Research Data project at Harvard University. A lawyer by training, she explores new and existing regulatory frameworks for data privacy and their compatibility with approaches to privacy emerging from the literature in other fields. Alexandra has also been contributing to the development of new legal instruments, analytical frameworks, and policy recommendations to better support the sharing and use of research data while preserving privacy, utility, transparency, and accountability. Before joining the Berkman Center, she served as a legal fellow with U.S. Senator Barbara Boxer and as a law clerk with the Center for Democracy & Technology and the Electronic Privacy Information Center.
About the Privacy Series
Starting in Fall 2015, the Berkman Center for Internet & Society at Harvard University is highlighting a series of talks, papers, and other activities focused on data privacy. In recent years, concerns about government surveillance and Big Data have focused national and international attention on questions of online privacy. With this series, we aim to illuminate many of the legal, economic, technological, and behavioral issues at play when it comes to data privacy, to foster discussion among multiple perspectives, and to explore alternative mechanisms for balancing consumer privacy with the potential benefits of Big Data.