Micah Altman, Aloni Cohen, Kobbi Nissim, and Alexandra Wood—collaborators with the Privacy Tools project—published a new article “What a Hybrid Legal-Technical Analysis Teaches Us About Privacy Regulation: The Case of Singling Out” in the Boston University Journal of Science & Technology Law.
The article presents a novel hybrid legal-technical approach to evaluating technical measures designed to render information anonymous and thereby bring it outside the scope of data protection regulation. It addresses the substantial uncertainty practitioners face when assessing whether privacy-enhancing technologies comply with legal privacy standards.
Doubts about the feasibility of effective anonymization and de-identification have gained prominence in recent years in response to high-profile privacy breaches, enabled by scientific advances in privacy research, improved analytical capabilities, the wider availability of personal data, and the unprecedented richness of available data sources. At the same time, privacy regulations recognize the possibility, at least in principle, of data anonymization that is sufficiently protective to free the resulting (anonymized) data from regulation. As a result, practitioners developing privacy-enhancing technologies face substantial uncertainty about the legal standing of these technologies. More fundamentally, it is not clear how to determine compliance even when a tool is fully described and available for examination.
The authors argue that new hybrid concepts, created through technical and legal co-design, can inform practices that are practically complete, coherent, and scalable, and thereby address gaps in the current regulatory framework. As a case study, the article focuses on a key privacy-related concept appearing in Recital 26 of the EU General Data Protection Regulation (GDPR) called singling out. It first identifies a compelling theory of singling out that is implicit in the most persuasive guidance available and demonstrates that this theory is ultimately incomplete. It then uses the theory as the basis for a new and mathematically rigorous privacy concept called predicate singling out, and argues that any technology purporting to anonymize arbitrary personal data under the GDPR must prevent predicate singling out. This enables, for the first time, a legally and mathematically grounded analysis of how privacy technologies like k-anonymity and differential privacy stand with respect to the GDPR's anonymization standard.
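The article's analysis of k-anonymity and differential privacy is legal-mathematical rather than computational, but as a rough illustration of the two kinds of technical measure under discussion, the sketch below shows a minimal k-anonymity check and a Laplace-noised count (the basic mechanism satisfying differential privacy for counting queries). The dataset, field names, and parameters are invented for illustration and do not come from the article.

```python
import random
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every combination of quasi-identifier values appearing in
    the data is shared by at least k records (standard k-anonymity)."""
    counts = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(c >= k for c in counts.values())

def laplace_count(true_count, epsilon):
    """Release a count with Laplace noise of scale 1/epsilon; the difference
    of two i.i.d. exponentials with rate epsilon is Laplace-distributed."""
    return true_count + random.expovariate(epsilon) - random.expovariate(epsilon)

# Invented toy records; "zip" and "year" play the role of quasi-identifiers.
data = [
    {"zip": "02138", "year": 1980, "dx": "A"},
    {"zip": "02138", "year": 1980, "dx": "B"},
    {"zip": "02139", "year": 1975, "dx": "C"},
]
print(is_k_anonymous(data, ["zip", "year"], 2))  # False: the 02139/1975 group has size 1
print(laplace_count(3, epsilon=0.5))             # a noisy count centered at 3
```

The contrast mirrors the article's finding: the first approach constrains the released records themselves, while the second constrains the release mechanism.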
The article concludes with specific recommendations for policymakers and scholars on how to conduct a hybrid legal-technical analysis. It demonstrates how such analysis can inform the design of new regulations and guidance documents that are consistent and meaningful from both legal and technical points of view. Rather than formalizing or mathematizing the law, the article provides approaches for wielding formal tools in the service of practical regulation.