Free Software

Professor Jonathan Zittrain, Berkman Faculty Director

From the Berkman Center Blog:

The first topic on this last day of ILAW is online privacy, taught by professors Jonathan Zittrain and Molly Shaffer Van Houweling.

[Also see weblog notes and commentary from participants Frank Field, Aaron Swartz and Jim Flowers; they're catching what I miss.]

Larry: Good morning and happy Fourth of July. Great of you to show up today; you shall be rewarded with a barbecue downstairs. Also, the wrap-up session today is intended for you to be able to ask as many questions as you'd like about anything in the program.

With that, let me introduce Professor Zittrain and Molly Shaffer Van Houweling. Molly was a student at HLS and a Berkman Fellow; she is now a law professor and works on Creative Commons.

JZ: Good morning. I'll add that Molly also worked for ICANN, possibly a bit longer than she wanted to. So she brings several levels of expertise to this.

We're going to talk about privacy. We've been talking about how to do this session: it's what we all feel strongly about, but discussing it can be like a big pile of mush. We don't have the answers.

Then we'll talk about other things we don't like to teach about--breaking down the pile of mush into smaller bowls of mush.

What are the categories of "privacy"? It means many things to many people. First: collection of personal data.

Molly: So: collections of data. You don't like it when people ask for your Social Security number; yet in some circumstances, it's all right. Another category is your personal environment (spam), and another is your right to make personal decisions (privacy in the bedroom).

We want control over certain things: the government is one entity that collects information about us; corporations are another. There are also nefarious individuals with sophisticated, cheap tools for gathering data. Basic snooping.

We will discuss certain topics: Carnivore and the Kyllo decision; the "DIF" formula and TIA; cookies; price and service discrimination; the Nuremberg Files.

I'll start with the government. Example: Carnivore. Carnivore is for wiretapping at ISPs. Collecting and filtering communications. We are looking for "probable cause" and a court order. The trick is that everyone is watched so some can be targeted. It captures all, but the government argues it looks only at the data for individuals for whom they have a court order.
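The capture-everything-but-filter model Molly describes can be sketched roughly. Everything here is invented for illustration (the addresses, the field names, the filtering rule); it is not how Carnivore actually worked, just the shape of the argument: the tap sees all traffic, but only traffic matching a court order is retained.

```python
# Hypothetical sketch of the Carnivore model: the tap sees every message
# on the wire, but retains only those matching a court order.
TARGETS = {"suspect@example.com"}  # addresses named in a court order (invented)

def filter_traffic(packets):
    # everyone's traffic passes through; only targeted traffic is kept
    return [p for p in packets if p["from"] in TARGETS or p["to"] in TARGETS]

traffic = [
    {"from": "alice@example.com", "to": "bob@example.com"},
    {"from": "suspect@example.com", "to": "bob@example.com"},
]
kept = filter_traffic(traffic)
print(len(kept))  # 1
```

The privacy objection is visible in the code itself: `filter_traffic` receives everyone's packets as input, even though it returns only the targeted ones.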

Concerns: the Fourth Amendment. Kyllo deals with this. The police used thermal imaging technology to determine whether a man was growing marijuana under grow-lamps. He invoked the Fourth Amendment before the Supreme Court, and he won.

There's a catch, though: Scalia left a loophole. The technology used was not okay because it was not in "general use." But the technology could come into general use--does the bright line then move?

JZ: What's the speed limit on 280? 65 miles per hour. There is the radar gun, then the radar detector, then the radar detector detector. It's similar in the Kyllo case: someone invents the grow lamp, then police get grow-lamp detectors... each side cries foul. But this could change: they could track you differently--tag your car and check it at the toll booth. Another way to monitor cars: surveillance over the highway.

The first move can be in either direction. Calls can be encrypted, etc., to thwart wiretapping. Until the day comes that everybody can get heat detectors, the police can't use them. But once the technology is there, they can.

Jim Flowers: Isn't the basis of that decision our expectations? This is about the evolution of society.

JZ: Yes. But it's circular reasoning. Part of your expectation has to do with what the law allows. But you're right, it's an attempt to peg the law to current expectations.

Molly: I think Kyllo is interesting because it's about privacy in the home. Supreme Court said you can have privacy in the home. If, in the future, your expectation of privacy in the home is "unreasonable," then you may not be protected there.

[...missed a bit...]

JZ: The typical protection you have against the government is the law, usually through the exclusion of evidence at a criminal trial. That's how you go after the government.

Lisa Rein: I was wondering about Scalia's comment. You can buy heat-scanning equipment; the tech is in the hands of the general public.

JZ: Let's keep that to ourselves, shall we?

Let's talk about cookies, and the level of specificity in privacy law. As Larry described, cookies were a technical solution to the statelessness of the Web. You needed to tag a person to better sell to them. A cookie is a bit of data placed on your machine--dropped onto the beachhead. The simplest use: the site labels you with a unique number.

Molly, is this a privacy problem?

Molly: So far I love it.

JZ: [Surfs] They have set a cookie on my machine. They have started compiling a dossier on me. They know which pages I am visiting. The site takes the "mouse droppings" and creates a profile. One advantage from the point of view of the website: they can serve you appropriate ads.
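The tagging-and-dossier mechanism JZ is describing can be sketched in a few lines. This is a toy stand-in, not any real site's code: the class name, the pages, and the ID scheme are all invented. On a first visit the server mints a unique number (sent back as a `Set-Cookie` header in real HTTP), and every later page view is filed under that number.

```python
import uuid

# Toy illustration of cookie tagging: tag the visitor with a unique
# number on first contact, then log each page view against it.
class TrackingServer:
    def __init__(self):
        self.profiles = {}  # visitor_id -> list of pages viewed

    def handle_request(self, page, cookie=None):
        if cookie is None:
            cookie = uuid.uuid4().hex  # first visit: mint a unique number
            self.profiles[cookie] = []
        self.profiles[cookie].append(page)  # collect the "mouse droppings"
        return cookie  # in real HTTP, returned via a Set-Cookie header

server = TrackingServer()
vid = server.handle_request("/dogs")
server.handle_request("/dog-food", cookie=vid)
print(server.profiles[vid])  # ['/dogs', '/dog-food']
```

The dossier is just the `profiles` dictionary: innocuous on any single request, revealing in aggregate.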

But it seems there is also a "cookie consortium": its members decide as a group which ads to serve me. DoubleClick may have more information about me; it may remember me as a dog-lover from prior behavior. So, Molly, what's so bad about this?

Molly: Do they know it's Jonathan Zittrain?

JZ: Perhaps; though I am always free to say I am Larry instead.

[...] see ads follow you around. Your IP address can also be passed along without your knowing it. Are you scared yet?

Molly: Not yet. But I am getting more concerned. It seems there is a link between the information gathered about me in one spot and the ads that follow me around. Can't I turn off my cookies?

JZ: Sites are designed to throw cookies at you, and you can choose to take it or not. But how do you know if you want it, or don't want it?

There have been legal interventions at this cookies stage. European and US versions of privacy are roughly the same, I think; the differences are overplayed. So long as your expectations are set properly, the law thinks you're okay. What should the privacy policy say? It can say anything at all. It's when it is hidden from you that the legal problem arises.

Let's look at the difference between opt-in and opt-out. Did you read the policy? Should we put the burden on you, or the company? Switching a cell phone company today is an odd process; you get questions like "Are you of sound mind and body? Do you know what you're doing?"

There are generally two options in a privacy policy: take it or leave it. "I agree" or "I can't shop here."

Molly, what's your thought on opt-in, opt-out?

Molly: [...] I admit that I have cookies turned on. Tell me a scarier story.

JZ: Here's my best cut at a commercial story that may scare you. [Shows a mock-up of a shopping site.] Two people come to the site and show interest in the same product. They are quoted two different prices. Why? The price is tagged to the profile. On the Internet, casing the customer is a craft. How many seconds elapsed between looking and buying? That tells them what kind of customer you are.

Amazon will know that you're a cheapskate; it will offer you coupons. Coupons on steroids.
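The profile-tagged pricing JZ describes might look something like this in code. Every detail here is invented for illustration (the base price, the signals, the multipliers); the point is only the mechanism: the quoted price is a function of the dossier, not just the product.

```python
# Hypothetical sketch of profile-based pricing: the quote depends on
# what the tracker already believes about the shopper.
BASE_PRICE = 100.00

def quote(profile):
    price = BASE_PRICE
    if profile.get("seconds_before_buying", 999) < 5:
        price *= 1.10  # quick decider: likely price-insensitive, charge more
    if profile.get("bargain_hunter"):
        price *= 0.85  # flagged cheapskate: show the "coupon" price
    return round(price, 2)

print(quote({"seconds_before_buying": 3}))  # 110.0
print(quote({"bargain_hunter": True}))      # 85.0
print(quote({}))                            # 100.0 -- no dossier, list price
```

Two people, same product, different prices: the discrimination lives entirely in the `profile` argument.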

Scared you yet, Molly?

Molly: Well, to me it just keeps getting better and better. We've talked about copyright enough to know that price differentiation happens.

JZ: Okay, so here's a really scary picture. The frequent-flier scenario. You know that card at Home Depot? They give you a card, you get a discount. But how does this help them? [...] How about if the quality of the service you get depends on how much you've previously purchased? Swipe your card and it says, "Freeloader."

Molly: I am a little scared. But I thought you were going to say something else. Something about poor people "purchasing" items with personal information.

JZ: Equivalent of selling a kidney, perhaps?

Molly: Tell me more about the dossier that is being compiled.

JZ: Well, let's go back to the government and the "DIF" formula. They ran statistics on tax returns to see whether there is a correlation between certain factors and whether someone is a tax cheat. When you turn in your form, it is run through the DIF formula.

Someone I knew FOIA'd the DIF formula. The government wouldn't release it. It is an aggregation of otherwise innocuous data that can be used in a very powerful way.
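The actual DIF formula is secret, so this is an invented stand-in showing only the idea JZ names: weight several individually innocuous fields into a single audit-risk score. The field names and weights are made up.

```python
# Invented stand-in for a DIF-style scoring formula: each field is
# harmless alone, but a weighted sum of them ranks returns for audit.
WEIGHTS = {
    "charitable_deductions_ratio": 4.0,  # deductions relative to income
    "home_office_claimed": 2.5,          # 1 if claimed, else 0
    "cash_income_share": 3.0,            # fraction of income paid in cash
}

def dif_score(tax_return):
    return sum(WEIGHTS[k] * tax_return.get(k, 0.0) for k in WEIGHTS)

score = dif_score({
    "charitable_deductions_ratio": 0.3,
    "home_office_claimed": 1,
    "cash_income_share": 0.5,
})
print(score)  # ≈ 5.2 (0.3*4.0 + 1*2.5 + 0.5*3.0)
```

This is why releasing the formula matters: whoever knows the weights knows exactly which innocuous data points the government finds incriminating.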

TIA. These are judgments that a computer is putting together.

Participant: It reassures me when Amazon offers me something I am not interested in. I am more worried if General Poindexter has the wrong information about me.

JZ: [...] A Minority Report world? Things like race, gender, age could be collected. Have I worried you?

Molly: It does seem scary, this black box w/data about me. What's the alternative? Could the computer be doing a better job than the humans?

Cindy Southworth: What about databases with information about battered women? The first time a woman dies because you're looking for Al Qaeda, I am going on CNN.

JZ: Odd the energy that is going into protecting copyright, but not as much to our medical privacy.

Andrew Rens: You can cross-examine a human; you can't cross-examine a machine. If at some point you can get human answers, it is better. We cannot make certain judgments just because the box says so. Do we require the state to bring the guy who created the box in to a criminal trial every time?

JZ: ...But if it's going to help you, why not use the box?

Participant: Let's say the box is completely accurate...

Andrew: Even if it's entirely accurate, there are some things society won't allow.

JZ: What would it take to change that?

Participant: One more terrorist attack.

Andrew: Haven't the terrorists already won, then?

JZ: ...But practical concerns will generally hold sway--once our physical security is threatened.

Andrew: So what are the rights given to me as a member of a civil society? Can I get a civil warrant to monitor what the police are doing to me?

JZ: ...Levels of privacy allowed: the Fair Credit Reporting Act, etc.; the sorts of rights found in the European Data Directive. I tried to do this in Massachusetts: get the data they were collecting about heart attacks. I asked them what they had on me. After three months, they said, "No, we have nothing." Three months of waiting. So again, lots of energy goes into the protection of IP; not so much focus on our needs for privacy.

[...missed participant comment...]

JZ: You think that it would be okay to give the government all your data?

Participant: With qualifications, yes.


2nd participant: Fifty percent of people said they'd trade their personal location for two dollars off a sandwich. Do you see an evolution in which class distinctions emerge? Privacy levels to be purchased? Privacy insurance?

JZ: ...Privacy may be harder to do under an insurance model.


Molly: There may be a market developing to increase accountability--removing the hassle Jonathan went through, with the three months. An intermediary doing it would be easier.

So, I am getting more scared. I think there is another aspect of your story: RFIDs.


JZ: Yes. RFIDs are the next "cookies." The line between Internet and Internot is getting quite blurred. RFIDs are physical cookies: objects that, when scanned, broadcast back the right number. I have an RFID tag on my dog. Yes, there is a chip in the dog; she ran away twice. Can I afford not to have that? If it is good enough for a dog, wouldn't it be good enough for a child?

When will it become the responsible thing to do?

You can take a dish and go wardriving and scan for RFIDs. But again, this is a society where it is easier to digitize what was once analog.

Too useful not to have. But the consequences are as yet unmapped.

Participant: I am feeling weird about this whole thing. How is this the same conversation that we've had before? What is the CC version of this conversation?

JZ: Fred von Lohmann yesterday was the Scott McNealy of intellectual property.

Participant: What is the CC-style solution to this?

Molly: How about this: P3P. I don't know much about it but will attempt to describe it: a technical tool for describing your personal privacy preferences, to see if a site matches what you want...Here's when we see it's all data.


So this helps with the problem of conforming with privacy policies by default.
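The P3P idea Molly sketches is essentially a matching problem, which can be illustrated with a toy version. This is not the real P3P vocabulary (the practice names and preference format here are invented); it shows only the shape: the browser compares the user's stated preferences against the policy a site declares, before loading it.

```python
# Toy sketch of P3P-style matching: reject a site if its declared policy
# includes any practice the user's preferences disallow.
def policy_acceptable(site_policy, prefs):
    # for each practice the site declares, the user must permit it
    return all(prefs.get(practice, True) or not declared
               for practice, declared in site_policy.items())

prefs = {"share_with_third_parties": False}

print(policy_acceptable({"share_with_third_parties": True}, prefs))   # False
print(policy_acceptable({"share_with_third_parties": False}, prefs))  # True
```

Note what this does and does not do: it is a declaration check, a machine-readable handshake over stated intentions, which is exactly why it negotiates rather than enforces.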

JZ: This doesn't stop the site from violating its own policy. This is negotiating a contract, not enforcement.

Larry: Microsoft rolled P3P provisions into IE 6. They discovered that sites all over the world were then implementing P3P. This is an example of code rolling out a default that increased the level of conformance with P3P.


Molly: You've scared me into being a bit more careful, but there is certain information that's unavoidably available. My name, my face, etc. Here's the case of the Nuremberg Files.

Abortion doctors: their names, their numbers, and a chart that marked which physicians had been killed or wounded. This was interpreted by the 9th Circuit as a true threat.

JZ: This is their post-lawsuit website. No graphic devices anymore. But the information is the same.

Molly: Hard to argue that you have a reasonable expectation of privacy in your name and address; yet this restricts your personal choices. Abortion cam. People walk into a clinic and their information is broadcast on the Web.


JZ: Two more things on the table. This case was not posed as a privacy case; it was posed as a threat case. It is illegal to interfere with people entering a clinic or a place of worship.

Here's another instance in which data framed in a certain way is powerful. Identifying spammers. People publish spammers' known email addresses, home phone number, etc. People then subscribe the spammers to catalogs, etc. Vigilantes.

Another instance: people convicted of sex crimes. A map showing the offenders; their homes are marked with a skull and crossbones. These are government efforts to publish identifying information.

Participant: What about weblogs?


Molly: Some solutions to the IP problem cause a privacy problem.

Participant: There is a website about female infanticide. People go online and identify the doctors that perform abortions when there are female fetuses.

Participant: We haven't been talking about "why privacy" in the first place. People are supposed to live an autonomous life. We should have the concept of people having freedom, so we could arrive at answers.

Molly: This is part of why I find this difficult to teach; this topic is broad and deep. Multiple levels on which to discuss the competing interests here.

Chris Kelly (former CPO of Excite @ Home): This is complex because privacy is hard to define in the first instance. Giving pieces of data away can be all right.


In Europe, if it's personal data, it's regulated. Here that is not the case; it has to be a certain type of personal data. The US doesn't have that type of regulation. Huge difference in terms of regulatory overhang.

Lisa Rein: Regarding IE implementing P3P--the good news is that it brought attention to P3P. The bad news is that they didn't follow the spec.

JZ: So, what makes this an Internet law issue? Incredibly rapid advances in tech have an incredible potential to change our expectations. You used to be a grain of sand on the beach. Increasingly, no more. The first wave of ways to deal with it--law--has become unwieldy. Technology meant to protect consumers won't get as much attention as other kinds of technology.

Convenience will win. This is a distinct Internet problem. RFIDs will be everywhere.

Molly: Final food for thought for the future: 9/11. Expectations changed in its wake. What are the implications?

[Concluding remarks; end of session.]



Past Event
Jul 16, 2003
12:30 PM - 12:30 PM