Get To Know 22-23 RSM Visiting Scholar: Jon Penney

Jon Penney is a legal scholar and social scientist based at Osgoode Hall Law School, York University, in Toronto. He is also a Faculty Associate at Harvard’s Berkman Klein Center for Internet & Society and a longtime Research Fellow at the Citizen Lab based at the University of Toronto’s Munk School of Global Affairs and Public Policy.

You have a background in law, as well as one in technology. For someone like me who might be well-versed in only one of those fields, how would you explain the intersection of law, human rights, and technology? 

The intersection of law, technology, and human rights speaks to some of the most complex and difficult, but for me, the most interesting questions and public policy challenges today. 

As technologies are developed and as they emerge and they become deployed in society, they have important implications. They impact people, they impact society, impact existing social relations. They impact existing legal rights and interests. And naturally, then people look to the law and they look to human rights to think about those implications—the law, of course, being one of the central tools and instruments that we have in society to protect certain interests and to protect certain human values, but also to foster the emergence and innovation and development of technologies that can benefit society. 

But I think as time has gone on, and we leave some of those more traditional perspectives, we also understand how each of these different social facts—that is, the law, technology, and human rights—they all shape each other. Law constructs technology; technology constructs law.  

It's one of those areas where, to answer those tough questions about technology [and its] social dimensions, approaching them from a legal perspective is really valuable. And vice versa–I think understanding technology and its development and how it's shaped and designed, I think that can really inform the law as well. So for me, it's interesting, it's fascinating, but it's also about learning and coming up with the best possible solutions to some of these complex challenges. 

You mentioned why this intersection, specifically, is important to you. Would you like to expand on that, or on why people overall should be aware of or educated on this intersection? 

In order to get at these hard questions, I think what we've increasingly learned over time is that the most difficult challenges today—public policy challenges, social challenges, economic challenges—technology is right at the center of these. But what we've also learned… is that approaching these problems from a single discipline or from a single perspective, you're just not going to come to a sufficient answer or solution to the problems that technology raises in society. 

I think, increasingly, to understand these difficult problems, you have to approach them from a multidisciplinary, multi-methodological, multi-field view. You're going to need help from people who really understand technology, like technologists; people who understand the history of technology, like historians; people who theorize the social side of technology, like somebody who understands critical theories, or someone from science and technology studies. You're going to have to talk to lawyers who understand the law and how it can be designed and implemented to resolve these problems. You might have to talk to a sociologist or an economist to understand the economic and the social side. So to me, that's why this kind of intersection is important to come to the right answers, or the best possible answers. 

Often, I see a lot of technologists or lawyers who have an interest in tech, but they don't necessarily take the social science route into explaining any of these concepts, and then often their solutions are very single-minded. I saw in some of the previous work that you've done with chilling effects that you employed social theory to explain some of the phenomena that are typically explained through other channels. 

How has your experience been with utilizing theories of sociology in explaining legal or technical concepts and their intersections? And do you think that this technique of applying direct social science concepts should be further utilized in these fields?  

Absolutely. You’ve really touched on a core aspect of my legal scholarship, at least in my recent work…Naturally, if you're legally trained, you think through the legal problems and you come up with mainly legal solutions. You theorize things in legal terms and in doctrinal terms. But I think there's...a lot of fruitful work that comes about if you engage with social theory and try to inform these legal concepts, trying on decades of social theory and literature that's been developed in these other disciplines and other fields to inform what's happening in the law. 

Now, of course, social theory does impact the law, and you see that in a variety of areas. Often, it's through expert testimony in courtrooms, and from time to time, eventually, actual social findings will sort of dribble through and impact legal doctrine. And so, it does happen, but it happens far too rarely. There’s some real value in scholarship that looks to other fields to inform the law, especially given the impact of the law itself, how it affects people every single day, how it can hurt people if it's applied unfairly, if it's applied ignorantly, if it's applied in a way that's inconsistent with how we understand certain social phenomena. 

And that's what I attempted to do with chilling effects. It's a concept that has a particular legal understanding…and the outcome has been, I think, an improper application of the phenomenon in practice. I try to correct that, drawing on social theory, and then elaborate the implications. But there's a variety of areas in law where I think you could do the same, and I think technology, in particular, is one of those areas. Courts tend to simply defer to expertise. Courts tend to defer to technology and science, and sometimes that's important and valuable, but other times it can lead to great injustice. And I think in order to mitigate those injustices, courts need to have an informed perspective on technology. 

You mentioned that it's important to have people from all these backgrounds talk and interact with one another. I know that you've done a lot of work in surveillance and privacy from legal and social and technological frameworks. I was wondering if you've noticed any gaps in knowledge that people from each of these communities hold that you feel should be better bridged? 

Absolutely, and one particular area that is really interesting right now is “privacy by design.” For the longest time, this was sort of a staid concept; I think it was around the 2000s that the term was coined, and there was a bit of a growth in scholarship that focused on privacy law, but we also need to think about how you design privacy into systems. 

Ari Waldman, in his recent book [Industry Unbound] ... what he sees, with the way these companies operate through the lens of surveillance capitalism, is that there are different sorts of practices that emerge within these companies and make it really difficult for privacy principles to be operationalized within design, within systems, and enforced within systems. Even though you have people working within these companies who really are privacy advocates and are operating and working in good faith, it's really difficult within the systems themselves, given all the different broader forces that they're operating within.  

This is one really interesting area, and I think there's a lot more work to be done. In fact, there's an emerging body of literature that's thinking about these problems slightly differently…and that's something called friction by design. The idea here is it's not necessarily that you're implementing privacy principles, but you're implementing design features that slow systems down, and in particular, slow down and throw up barriers to the extraction of data, the sharing of data across and in between systems. The idea here is that you can protect not just privacy, but a range of other democratic values by designing in certain features in the system. So it's thinking about design, not just in relation to privacy, but in broader terms–maybe you could say “human rights by design.” 

[Waldman’s] book shows, in the end… that we need to come up with new ways to address these problems because the law can't solve everything. We need systems to be designed in a responsible way. And the only way that's going to happen is if we have computer scientists who have ethics, who are aware of privacy problems, who are aware of human rights issues, who are aware of the social implications of the technologies that they're developing, and who, rather than moving fast and breaking things, move slowly and design thoughtfully and carefully with these rights, interests, and values in mind. 

Interviewer 

Aarushi Dubey is a computer science student at the University of Maryland. She is interested in exploring privacy and security, as well as improving computer science education. During the summer of 2023, she interned with the Berkman Klein Center’s Institute for Rebooting Social Media.