
Q + A with Danielle Citron, Professor of Law at UMD

Tomorrow's Tuesday Luncheon Series guest will be Danielle Citron, Assistant Professor of Law at the University of Maryland School of Law, and she'll be speaking on "Technological Due Process." Berkman intern Yvette Wohn helped put together this Q + A with Danielle as a primer to her talk; they discussed technology and government policy, online reputation, and privacy.

Q. A lot of organizations, including schools and major companies, do initial sorting of applicants through automated systems, such as setting a minimum GPA. How do you think this kind of system discriminates against people?

Such a system certainly could discriminate against groups of individuals.  Systems reflect the biases of their programmers.  For instance, Helen Nissenbaum studied an automated loan program that assigned negative values to applicants from certain locations, such as high-crime and low-income neighborhoods.  We can imagine a graduate school’s automated system that dilutes the GPAs of applicants from designated community colleges, such as certain rural schools whose student body is disproportionately less affluent or representative of particular minorities.  Because the source code for these systems is typically closed, no one can view the programmer’s instructions to the computer.  The bias remains hidden from interested individuals.
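
To make the hidden-bias point concrete, here is a minimal, purely hypothetical sketch of a closed-source scoring routine; the field names, weights, and ZIP codes are invented for illustration and do not describe the loan program Nissenbaum studied:

    # Hypothetical sketch: how a hidden penalty can live inside a
    # closed-source scoring routine.  The weights, field names, and ZIP
    # codes below are invented for illustration only.

    PENALIZED_ZIP_CODES = {"21201", "21213"}  # assumed "high-risk" areas

    def loan_score(income, credit_history, zip_code):
        """Return an applicant score; higher means more likely to be approved."""
        score = 0.5 * (income / 10000) + 2.0 * credit_history
        # The discriminatory step: a silent penalty for certain neighborhoods.
        # Applicants who see only the final score never see this rule.
        if zip_code in PENALIZED_ZIP_CODES:
            score -= 5.0
        return score

Because only the resulting score leaves the system, the penalty is invisible to applicants and reviewers alike unless the source code itself is opened for inspection.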

Q. Could automated systems be programmed to detect "exceptional" material?

As Professor James Grimmelmann explains, automated systems lend themselves to rules.  If the term “exceptional” can be broken down into a variety of rules, then an automated program can certainly detect and categorize a scenario as exceptional.  If, for example, a particular policy recognizes three situations that constitute “exceptional circumstances” warranting an extension of cash benefits for certain individuals, then programmers could translate those three conditions into a rules-based program.  However, if the policy cannot be broken down into defined rules and instead relies on a human being to consider variables that defy precise weighting, then the policy should not be automated.
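
If the three conditions were, say, a recent natural disaster, a documented medical emergency, or household income below a set threshold, a rules-based encoding might look like this minimal sketch (the conditions, field names, and threshold are invented for illustration, not drawn from any actual benefits policy):

    # Hypothetical sketch of a rules-based encoding of "exceptional
    # circumstances."  The three conditions and field names are invented
    # for illustration; they do not come from any actual policy.

    def is_exceptional(applicant):
        """Return True if any of the three defined conditions applies."""
        return (
            applicant.get("recent_natural_disaster", False)
            or applicant.get("documented_medical_emergency", False)
            or applicant.get("monthly_household_income", float("inf")) < 500
        )

    def benefit_extension_months(applicant):
        # The rule is simple precisely because each condition is well
        # defined; a judgment that "defies precise weighting" could not
        # be reduced to a check like this.
        return 6 if is_exceptional(applicant) else 0

The sketch also illustrates the limit Grimmelmann identifies: once a condition requires human judgment rather than a defined test, it no longer fits in a function like this, and the policy should not be automated.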

Q. With technology becoming a more vital tool to execute policies, is the role of the programmer becoming more political?

It certainly has the potential to do so.  Consider Colorado’s public benefits system known as CBMS.  There, IT employees working for state agencies built the decision tables that encoded the rules-based portion of CBMS, which private vendor EDS incorporated into the system.  The automated program required eligibility workers to ask individuals seeking food stamps and other cash benefits whether they were “beggars,” even though neither federal nor state law required an answer to that question.  According to newspaper reports, eligibility workers expressed their dismay at the patently offensive question and urged its removal.  It is possible that a programmer’s political views or bias influenced the wording of that question.  And it is possible that a programmer could have encoded consequences for a positive answer to that question that would reflect the programmer’s political views.

But the political beliefs of computer programmers should play no role in the construction of these systems.  Such a role would defy the rule of law.  Legislatures and agencies have not delegated policy-making power to programmers.  Instead, a programmer’s role should be limited to ensuring that an automated system accurately reflects the policy determinations of the legislature or agency.  Programmers should not usurp agency or legislative expertise with their own political views.   

Q. I know this seems like an extreme scenario from a science fiction movie, but do you believe that computers could some day rule the world?

Ray Kurzweil contends that artificial intelligence is just beyond the horizon.  If Kurzweil is right, then perhaps your scenario is not science fiction.  But I am very skeptical of this idea.  Even if computers provide human civilization with a tremendous amount of insight, it seems unlikely that computers could make the fundamental value choices that political decisions require.  And I imagine that we would not want computers to play that role.  Some have suggested that online books and media will replace our hard-copy books and newspapers.  But that does not appear likely.  Books play too important a role in our cultural history.  In much the same way, individual creativity plays an indispensable role in our collective self-determination.  I certainly hope that computers do not someday rule the world.

Q. You are an active blogger. How do you use your blog to develop your research projects?

Many law professors use their blog to generate ideas for their scholarship.  Professor Dan Markel comes to mind on that score.  A blog is a terrific way to engage insightful minds in your project.  It is certainly something that I might do when I guest blog for Concurring Opinions and Prawfsblawg this year.

Q. How much of your personal side do you reveal in your blogs? Do you ever feel that blogging invades or alters your online reputation and personal privacy?

I reveal very little of my personal life when I blog.  According to tireless blogger and information privacy law scholar Frank Pasquale, blogging can provide an important public service.  It helps inform interested individuals about issues that they care about but have little time to research on their own.  Blogging can help crystallize arguments made by litigants.  Blogs can inform judges and politicians.  Like Professor Pasquale, I hope that my blogging contributes to the discourse on important issues, both for the public and for my students.  Details of my personal life contribute little to this goal. 

I also worry that revealing personal information could endanger my family or be misconstrued in a reputation-damaging way.  Information privacy scholar Daniel Solove has written a terrific book, The Future of Reputation, about the dangers of revealing too much of ourselves online.  As Professor Solove explains, we live in an age when our off-line reputations hinge upon our online ones.  With this in mind, I focus my blog on ideas and public events, not on my personal life.

Q. The Berkman Center will soon be celebrating its 10th anniversary. Reflecting upon the past ten years, how fast do you think technological advances will affect our society in the next decade?

Unlike the technologies that drove the Industrial Age, the technologies of the Information Age bring rapid-fire change.  Because law tends to move slowly, an interesting question that spans many fields is whether law can keep abreast of those changes.  The Berkman Center will have much to tackle in its next ten years.  Congratulations on your tenth anniversary!