Session 2 Transcript: Content Control and Encryption

Speaker: Daniel Weitzner

Zittrain

HA 102

9.17.97

P: -- if you've already written it, you need not write it again, and you can kind of pass that along, and that way we know exactly who's in the class. Are there new people here? The people here who weren't here last week. Five. I recognize some of you. Did you get permission?

Perhaps, then, the new people can quickly just say a few things about themselves, it's only fair, since we have been doing this last week, but it was quite pleasant and actually informative to see the actual skills, talents and interests in the class.

So why don't we start -- there was nobody in the front row, right? You want to start us off?

Greg: I'm Greg Harris, I'm a 2-L, and - skills and interest in computers.

P: Actually, is there any particular topic that's been floating your boat recently, not the internet.

Greg: Freedom of speech is probably at the top of it, and whether there's going to be any sort of regulation about, then, what Congress is probably going to have as a follow-up to the CDA.

P: You think of the right day to start coming to class. This is good.

Alice: I'm Alice Johnson, I'm a mid-career MBA at the

Kennedy School. I'm a legislator from Minnesota, and my interest is the recent case, and also taxation.

P: So this is wonderful, we now have a resident legislator in the class who has agreed, as a condition of entrance to the seminar, to indulge her hypotheticals about various bills that we then ...(inaudible) produce. And in fact, we might as well play this card right now. Is it not true that you've actually introduced a bill touching upon the internet and taxation?

Alice: Yes. Had a hearing on it. I had a bill that proposed putting a 6 percent sales tax on just the access fee on America Online, or whatever. Minnesota has a 6 percent sales tax on services.

P: So part of our proposal, actually, was to get some background materials together and actually review this bill as a text and see what amendments our class would make to it, what we would recommend, and kind of go through that. And Alice has very graciously agreed to --

Alice: I also have a bill that was introduced last year that was requiring schools to put a filter on their computers for ...(inaudible) for students, and we had a great debate on that.

P: Yeah. Filters do have to be changed -- every six months or so you have to put a new filter in.

Alice: We modified that.

__: ...(inaudible)

Jennifer: I'm Jennifer Capinelli, I'm a master of theological studies for ...(inaudible) Divinity School, and what I'm interested in researching is religious communities on the internet and the ways that those religious communities have been establishing themselves there, ...(inaudible) to those communities as a result.

P: Wonderful.

Melanie: My name is Melanie Scheck, I'm losing my voice, I'm sorry. I'm a 2-L here, and my interest ...(inaudible) is encryption, and I find that, in various ways, one of which was working in technical capacity for National Security Agency, one was working in ...(inaudible) capacity for ...(inaudible) standards for ...(inaudible), and one was war policy related, that was this past summer, doing work for the Office of ...(inaudible) area. And, through encryption, becoming interested in ...(inaudible) internet ...(inaudible)

P: Like Dow Chemical: better living through encryption. Wonderful to hear what you have to say as part of today's festivities.

Aaron: My name is Aaron Maller, I'm a 2-L, and I have an internet background as well, so I got interested in computers. And I guess most recently, this summer, I did a little bit of research at the law firm I was working with, as far as ethics of attorneys using the Web for e-mail and different issues like that.

P: Whether they opt to, or just whether they --

Aaron: Well, there's the --

P: How much they have to charge for it, actually.

Aaron: Exactly.

P: I remember one law firm told me with a straight face that they actually had an accounting method, so that every time an internal e-mail got sent in the firm, you had to put a client billing code on it, and a client would get billed. And they said, well, it just seems odd because it doesn't actually cost anything, marginally, to send an e-mail. They said, Yes, but that's a memo that we're not writing out and xeroxing, that would cost something. I saw the list on the back.

So, moving right along --

Larry: I'm Larry Lessig, I'm a 1-L. I'm actually just visiting the class.

George: George Danenhour, I'm a 3-L. My interest in the internet -- I guess I'm interested in the kind of business fraud issues on the internet, ...(inaudible) tracking down.

P: OK. How about in the back here.

__: ...(inaudible) since '81 and I'm here ...(inaudible) school.

P: Wonderful. OK. Well, might as well cut, then, to introducing the last person in the room here. His name is Daniel Weitzner. He founded, together with Jerry Berman, the Center for Democracy and Technology, I believe it was in December of, '93 was it?

Daniel Weitzner: '94.

P: December of '94. This is one of those issues where it may be that supply creates demand, because shortly after the founding the Communications Decency Act was proposed, and suddenly a mission was substituted for --

(laughter)

And that has thus engaged the CDT now for several years. They actually were basically the umbrella organization, as I understand it, sort of putting together the coalition, including the American Library Association, that fought the CDA. And Daniel has also had an interest in the Clipper chip and in internet encryption, which we'll hear about today. He also introduced some text, through Senator Leahy, who was kind enough to consider the text and introduce it on Daniel's behalf, that passed, with some amendments to the Electronic Communications Privacy Act, to cover transactional information on the internet.

So with that introduction, which you're free to amend and revise, why don't you get us started.

DW: Thanks, ...(inaudible). Thank you everyone. ...(inaudible). We've actually often joked that we should put Senator ...(inaudible) on our Board of Directors, because he was so instrumental in getting us off the ground. But the truth of the matter is that we work on a number of other issues related to the internet besides just the communication pieces and ...(inaudible). We work on privacy issues, as well as free speech issues.

And in thinking about what I was going to come and talk with you all about, first of all, I have to say, I was really -- I'm kind of jealous of all of you, that you all get to sit here and have at least some amount of time to sit and think about all these questions, whatever interests all of you. It really is such an exciting field, it's such an explosive area, and it seems to me that every day some new issue walks in the door, and unfortunately, I'm often in the position -- the first thing is essentially trying to do triage and say, Well, I don't understand that issue, I don't know what to do about that issue, so it must not be our mission. You're in a more fortunate position than I.

In trying to understand what is part of our mission, and what's important to civil liberties and democratic values on the internet, I think that we've arrived at one framework for looking at a variety of issues that helps us to sort them out, helps us to sort out what our positions are, where we ought to be focusing our energy, where we're going to need to focus energy.

And I want to kind of try and offer that framework to you as a way of looking at some of the questions that you're probably going to look at. And that's really a matter of looking at what goes on on the internet, and looking at policies related to the internet, by asking the question: Does this promote a centralized approach to whatever it is, or does this promote a decentralized approach?

Our preference, from a civil liberties perspective, from the perspective of constitutional values and from what we think is healthy democratic discourse, is to have a preference for decentralism. We think that decentralized solutions, decentralized approaches, policies which emphasize decentralism, and network architectures which emphasize decentralism, are good for democracy, are good for what the First Amendment is really about, are good from what I think a lot of the Constitution is about, and a lot of our democracy is about.

Decentralized solutions, when they really are that, I think, tend to empower users, if you will, whether those be users as citizens or users as consumers or users as speakers or artists or business people, or whatever they are.

And I think that what's so exciting about the internet, from our perspective, is that it does have such a decentralizing cast to it. We can have a long argument about whether that was intentional or accidental, whether this was all just sort of stumbled into or not, and I don't know if I even know the answer, I think I have an answer, to that question. But we have, I think, as a culture really, been handed an extraordinary decentralizing force that has come increasingly into the mainstream of our communications arena.

And, I think, from some people's perspectives that's a very positive thing, from other perspectives, I think, that's regarded as a threatening thing. People who are particularly concerned about the spread of sexually explicit material, or use of encryption, or the breakdown of cities, or a whole lot of things which are very serious, real issues, I think can look at the internet and say, Well, this is really not a very good thing. This is something that is promoting a whole lot of values which are ultimately not good.

I think that, hopefully as we talk I'm going to try and illustrate, also, that decentralism is a relatively fragile value. It is, I think, as I said, very positive for our culture, but has a tendency, I think, to flip very quickly into, at least in the case of the internet, into centralized control situations, and we'll see that in the area of content regulation, in the area of encryption, and others.

So, with that framework, I want to just start with the whole question of content regulation, the question of material that some people think is not appropriate for kids, some material that almost everyone thinks is probably not appropriate for kids, but which is nevertheless unquestionably out there on the internet, easily accessible from both the United States and outside the United States.

And I think the debate over this question, which really began with the debate over the Communications Decency Act in the U.S. Congress, really is a debate that has as its core questions about decentralism, empowerment of users, versus an effort to reimpose some kind of centralized control over this decentralized and, some may say, chaotic, out-of-control communications environment.

I know that most of you have had a chance to look at the Supreme Court decision, Reno v. ACLU. I think it's a really important decision, not just because I worked on it but also because it really is the first time that the Supreme Court has taken a serious look at the internet; it's the first time just about any court has taken a serious look at the internet.

If I was giving you any kind of suggestion about how to read it, I would really suggest that you focus on the findings of fact, much more so than the conclusions of law, although my own views of the conclusion of law are a little bit mushy. I mean, it came out right, from my perspective, but there's probably some work to do on that side.

But the findings of fact, I think, are tremendously important. If you Shepardize or do whatever you might do with Lexis on that case, you will find an increasingly large number of cases all around the country citing the findings of fact in Reno, and in the lower court decision, as a way of just understanding what this medium is.

The majority opinions stress aspects of the architecture of the internet that bear directly on this question of centralization versus decentralization. The opinions stress that there is really, in many respects, no control point at the center of the internet, or at the center of the Web. There is no one who controls membership in the Web, there is no one who decides how old you have to be to get onto the Web or any individual Website. That sort of control, which we have seen in the broadcast media, whether it be radio or television, simply does not exist.

The nexus that has existed for years between the Federal Communications Commission -- or the Congress and the Federal Communications Commission -- and broadcasters, whether they be radio or television, who are a relatively small number, a few thousand around the country, just does not exist. We are talking, and the Justices observed this, about millions and millions of speakers, not a few -- commercial speakers, non-commercial speakers, individuals, organizations, not just a relatively small number of commercial entities that broadcast over the radio waves.

So the courts stress that there's no control at the center. In their findings of fact, they also stress that there's a lot of control, a lot of potential for control, at the ends, at the edges of the network, with the users.

P: Can I ask quickly -- what you're figuring comes from those findings of fact is that ought implies can. For the Government to say, you ought to do this, it must be the case that you could do it; you can't be held to what you can't comply with. And you're saying that the findings of fact help underpin the argument that, given that there's currently no age ID on the internet, to demand that people somehow verify that people are old enough to see this material is not a fair demand, constitutionally, to put on a supplier or a speaker.

DW: Both the lower court opinion and the high court opinion looked at the question of whether it was possible to verify the ages of users visiting a particular Website, and determined that even in the cases where it's technically possible -- for example, on the Web, you could imagine a variety of mechanisms that check age, that require you to fax a copy of your driver's license, or whatever you would do -- that even though that's possible on the Web, it's actually not at all possible, it's impossible in environments like Usenet, in truly distributed discussion groups or chat rooms, where all of the communication just gets lumped into a cloud, as it were, for anyone to access, where there's no single person, there's no Web master controlling access, either for the speakers or the readers.

And in either one of those cases, whether it's technically possible or technically impossible, the court found that the burden was simply too high, in some respect because of the clear financial burden that would exist for non-profit organizations. We had a lot of testimony from librarians, talking about how much it would cost them just to go through even their card catalogue to figure out which items, which cards in the catalogue, in the virtual catalogue, as it were, might be considered indecent and would have to be somehow segregated.

Justice O'Connor, in dissent, really looked to try to figure out whether there was a way to somehow zone the net, and obviously that view was just rejected.

__: Do those ...(inaudible) the alternative of tagging, which she suggests in her white paper is becoming easier and easier, and is something that Web page producers and other internet speakers can participate in, in terms of ...(inaudible) standards of ...(inaudible)

DW: Let me back up and say just a couple of things about PICS and filtering, because I think it's an important question. As Jonathan said, we argued, in this case, that the government does not need to regulate content on the internet, does not need to step in and protect kids, because parents have effective means of keeping whatever kind of content they want away from their kids.

And there are essentially two ways -- not getting into all the different technologies, because there are lots of technologies out there that do this, lots of products you can buy or that are given away, or that come with on-line services -- but there are fundamentally two ways to do the kind of filtering of content that we were talking about.

One way relies on the tagging approach. This is an approach that was proposed by the government. In the lower court opinion, this was -- what did they say? Eighteen? I forgot now. They had some sort of funny tag -- X-18 or something. They wanted to be able to put some sort of tag, I can't remember what it was, on the ...(inaudible).

That's a self-labeling approach, according to whatever system, whether it's the Motion Picture Association G, PG, R, etc., tag, or there's a tagging system that has recently been promoted by a group called RSAC, together with the Software Publishers Association, that comes from the video game arena, where they tried to do tagging of video games, and they've now brought that to the internet.

That's self-rating, where lots of people get together and say, We're all going to tag our sites according to some code, and you as the user or as the parent can look at the tags, the tag that we put onto our site and decide whether that meets your criteria for access, either for yourself or for your kids. That's self-rating, and filtering based on self-rating.
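The self-rating-plus-filtering arrangement Weitzner describes can be sketched as a small filter run at the user's end: the publisher attaches labels to a page, and the parent sets thresholds locally. The rating vocabulary, the thresholds, and the sample pages below are invented for illustration; real systems of the era, such as PICS, defined their own label formats.

```python
# A minimal sketch of filtering against self-applied labels, in the spirit
# of self-rating schemes like PICS. The dimensions ("language", "nudity")
# and all sample values are hypothetical, not any real rating vocabulary.

def allowed(page_labels, parental_limits):
    """Permit access only if every rated dimension is within the limit.

    Unrated dimensions are treated as level 0 (permitted) here; a stricter
    client could just as easily block anything unrated.
    """
    return all(page_labels.get(dim, 0) <= limit
               for dim, limit in parental_limits.items())

limits = {"language": 1, "nudity": 0}      # chosen by the parent, at the edge
tame_page = {"language": 0}                # self-rated by the publisher
racy_page = {"language": 3, "nudity": 2}

print(allowed(tame_page, limits))   # True
print(allowed(racy_page, limits))   # False
```

The point of the sketch is where the decision happens: the publisher only describes the content, and the blocking choice stays with the user, which is the decentralized arrangement Weitzner contrasts with a government-mandated tag.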

__: Although the government could require ...(inaudible)

DW: The government could require that, that --

P: Not defined by you?

DW: No.

P: The Government can try --

DW: No, I think that's compelled speech and it's unconstitutional. I think that raises a different --

P: When you're saying the Government could require, you mean the Government could try it --

DW: The Government could certainly try it, and there's legislation floating around, not yet introduced, but there are a number of members of Congress who are talking about requiring this. I think that raises a different set of First Amendment issues than we had in the CDA case.

__: Although yours is structured very much the same, using tagging as a safe harbor, maybe, clearing up the ...(inaudible) of the indecency definition, and making it explicit in the text that tagging is a good-faith effort to keep the material from children --

DW: It could, and then we would come back to the question of a burden, and we would also come back to the question of whether that is a least restrictive means, given the fact that the second category of user empowerment approaches is a third party filtering or labelling approach, where someone else decided whether your content is appropriate or not on whatever set of values they happen to have.

__: Right. Although wouldn't self-labelling help fill some of the holes in that, in terms of not being able to keep up with new sites and needing to exclude the ones that --

P: There are several megs of new Web pages cropping up every hour.

DW: Number one, I do think that mandatory self-labelling is unconstitutional compelled speech.

P: Can you differentiate that from the Surgeon General's warning, because -

DW: I think that, number one, that's not to ...(inaudible) -- I mean, that's an FDA regulation, which has always been thought of in another category. Any time the Government says you must add or subtract content from your speech, or you must identify your speech -- the most recent case is the McIntyre case, on anonymity, which struck down an Ohio statute, saying, you can't require people to put their names on political pamphlets. Now, there are a lot of good reasons to require people to put their names on political pamphlets. It's important to know who's saying what in the political process, etc.

But the Supreme Court said, very clearly, that is --

P: Of course, that's an anonymity thing. I mean, if you were to mislabel a carton of glass as a toy and put it in a store, I mean, that's -- you can imagine consumer protection law.

DW: That's fraud, and that's a different -- fraudulently labelling something is different than choosing not to label it, I think.

P: The surprise box, is what he's saying.

DW: And in Congress, you have consumer protection issues that are always --

P: And this is consumer protection, right?

__: Isn't it the same as consumer protection?

DW: No, I don't think speech is the same as consumer protection.

__: ...(inaudible) speech ...(inaudible)

DW: I think speech is an importantly distinct category from every other kind in Congress, absolutely.

__: Can we just go back for a second to the issue of ID systems. Now, I mean, this might be -- violates CDT secrets, but as I remember --

DW: We have no secrets.

__: OK. Well, tell me if this should be a secret. Because I remember --

P: How about the ruling sanction.

__: I guess, by definition, it won't be, right?

__: When the CDA was being proposed, you guys had suggested language, which tracked Ginsberg, a sort of harmful-to-minors standard, rather than the absurd standard that they actually ...(inaudible). And you had a good reason for that, in that at least that category of speech was much narrower than what has been spoken of, and the court had, ...(inaudible) Ginsberg 20 years ago, 30 years ago, essentially ...(inaudible).

I want you to imagine a CDA which has CDT's language. And the second part of the CDA, the Government sets up, not exclusively, but the Government offers, for a dollar, IDs which verify your age. And the Government says you can have other IDs if you want, but basically we'll set up a server and we'll sell anybody an ID for a dollar, that verifies their age, and any site can basically echo off that server to verify somebody's age ...(inaudible).

So, if you went to a Web page, put your number in, it lets you ...(inaudible) Government. So CDA 2 says, for Ginsberg's speech, or CDT's speech, if you're ...(inaudible), so long as you use a screening technology like this, you have a safe harbor, but if you don't use a screening technology you don't have a safe harbor and you can be prosecuted for exposing minors to explicit material.

What would the problem be with that?

DW: Well, let me just say something about the Ginsberg harmful to minors language, which was not our language and we never really ever supported it. We were asked --

(simultaneous voices)

There was -- let me say what that language was. At the end of the legislative debate on the CDA, we had gotten through the Senate, lost big time, gotten through the House, nearly lost, or we were about to lose. Rick White, who is a member of Congress from Seattle, asked a number of people, including us, what might be a constitutionally sound way of really protecting kids from what people are really concerned about -- the stuff in Senator Exon's famous blue book that he had on the floor of the Senate, that a lot of Senators were very interested in looking at.

P: You're talking about the blue book with which most of you are familiar.

DW: It's not that blue book. Rick White proposed a different version of the CDA, which would have criminalized material which is harmful to minors. Essentially 48 states have laws that say material which is "harmful to minors," or obscene as to minors, can't be sold to minors and has to have various types of display blinders put up on it. So when you see Playboy, Penthouse, Hustler, etc., on the newsstand, you see a blinder in front of it. That's legally required, because it's harmful to minors.

So that was going to be a law that -- the White proposal was that commercial speech which was harmful to minors could not be made available on the internet, unless it was labeled as such, in such a way that people could block it.

I have to say, I'd be far more comfortable with that approach, the commercial harmful to minors ban with a labelling defense, than an approach that says we're going to somehow now go out and give every single person in the country a national identifier to prove their age.

Number one, I don't think that's going to work very well at all. I mean, you can imagine --

__: Well, if it doesn't work, ...(inaudible) say it doesn't work, but the question is whether --

DW: Well, whether it works matters to the court, when looking at restrictions on ...(inaudible). I think that does matter.

P: Now, when you offer this tagging proposal, you're saying, there, that's a less restrictive alternative to placing the burden on the supplier of the information to actually block it, to stop it at the warehouse door, at the Website door, from going out. Instead of, you're saying --

DW: Actually, it doesn't. I think that's not the right metaphor. I think that, in fact, the speech can go wherever anyone wants it to go, can go wherever it is requested. What the tag does, the label, again, whether it be a third party label or a self-label that would be applied as a defense under this statute, what it does is it stops it from coming into people's doors who don't want it to come in.

P: It also makes it a less restrictive alternative to some kind of scheme like Larry's, where you say you have to embargo it at your door unless someone presents to you the ID. It's less restrictive, but it also may be less effective.

DW: I mean, I'm just not very comfortable, from a privacy standpoint, with the idea of the government going through a whole extra process of creating a new national identification system. Look at the social security number, which has been widely abused -- the Privacy Act ...(inaudible) will give you details up and down about how badly that has been abused. I think it's a very dangerous thing to do, for this reason or for any other reason.

And I guess what it also is, is it's, in my mind, it's a centralized solution. It says we've got to go sort of one place to figure out who's a certain age. Instead of making people make the choice about what content they want to come into their homes or to their kids, and what they don't.

__: But, I mean, the story here wasn't that it required ...(inaudible) you could have any number of IDs. It was just the Government's offering this in a subsidized way.

DW: So there's no burden.

__: The part of the story that I was trying to get you to focus on was the findings, which you said we should focus on ...(inaudible), which seem to me to say that the problem with this kind of screening technique was that it imposed such a huge burden on Web sites. My story has eliminated that. The Web site just has to echo out to one of these ID ...(inaudible). They don't have to maintain their own ID system.

DW: So is the Government also going to provide all Web masters with whatever service software they need to do that?

__: Right. Let's say the Government stayed out of my ...(inaudible) key checking system, and it's for free. You can have alternatives, private ones -- people who are worried about privacy can set up with CDT their own ID system.

P: What a way to coopt to the opposition.

__: But then the question of the burden is, if this burden's eliminated, what's left as an --

DW: Well, I think you may have almost the same burden. Let's take a Web site. First of all, that's a Web solution, so we have to remember that that's not a chat room solution, it's not an IRC server solution, it's not a newsgroup solution. There are a lot of parts of the internet for which that's not a solution, but let's put that aside. There's no one to go to to accept an age verification for Usenet access. It's a different issue.

But even for the Web, let's say that my site, for argument's sake, has all kinds of stuff on it, including material that would be classified as harmful to minors. And I am still going to have to effectively go through the process of labelling my own site, because I'm going to have to decide what material is subject to ID check and what is not. And I'm not sure exactly what's involved in that. But I think there's definitely a burden there on speech. And I think there's also a stigma that's going to be attached to the speech that gets put into the bad category.
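The ID-echo hypothetical, and the residual burden Weitzner points to, can be sketched in a few lines: a central registry vouches for ages, a site "echoes" the visitor's ID out to it, and the site must still classify its own pages to decide which ones sit behind the check. Every name here (the registry, the IDs, the page paths) is invented for illustration; no such system existed.

```python
# A rough sketch of the hypothetical government ID-echo scheme discussed
# above. The registry stands in for the central verification server; in
# the hypothetical, a real site would query it over the network.

ID_REGISTRY = {"4471": 34, "9012": 15}   # ID number -> verified age (fictional)

def id_is_adult(id_number):
    """Echo the ID to the (simulated) central server; True if 18 or over."""
    age = ID_REGISTRY.get(id_number)
    return age is not None and age >= 18

# The burden that remains even with a free central server: the site must
# still decide, page by page, which material is "harmful to minors" and
# therefore subject to the check -- effectively labelling its own content.
SITE = {
    "/essays/politics.html": {"restricted": False},
    "/gallery/adult.html":   {"restricted": True},
}

def serve(path, id_number=None):
    page = SITE[path]
    if page["restricted"] and not id_is_adult(id_number):
        return "403: age verification required"
    return "200: content served"

print(serve("/essays/politics.html"))          # 200: content served
print(serve("/gallery/adult.html", "9012"))    # 403: age verification required
print(serve("/gallery/adult.html", "4471"))    # 200: content served
```

Note how the self-classification step survives even though the ID infrastructure is centralized and free, which is the point Weitzner is making about the burden on speakers.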

__: It's the same burden that ...(inaudible) real space right now.

DW: That's right.

__: It's not that expensive a burden in bookstores. And the stigma is less, too, because nobody sees where you're going with that. In bookstores, you do see -- you're going into the adult section and ...(inaudible)

DW: Right. It's less of a stigma for the reader, it is a stigma for the publisher, for the Web site operator, because someone's going to click on some link on your site and get this notice that says --

P: Surgeon General's warning: this may be harmful to children.

DW: And I think that whole question of stigma just wasn't raised in Reno. It was sort of touched on, I guess, a little bit in the ...(inaudible) cable indecency case, and it's been looked at, obviously, in other compelled speech cases. So I'm not quite sure what the answer is about that, but I think there's going to be a very real issue there.

Also, the harmful to minors laws which we're kind of modelling this on are only about commercial speech, they only are about people who really are in the business of selling what we would all understand to be pornography. And I guess I have less problem with either stigmatizing or burdening or doing whatever to that category of speakers, principally because they're businesses and they are in the business, they're trying to sell something, and to place reasonable regulations on the way you sell something is, I think, relatively well-accepted in First Amendment traditions.

Again, it's the same problem: whether the standard is harmful to minors, or indecency, whatever the defenses are, the burden still is on the tens of millions of speakers out there on the internet who have to wonder whether their speech falls into this category.

Now, I will grant that if you say it's harmful to minors, they have less to worry about, but I'm not sure they have nothing to worry about. So the problem with the White, the sort of modified White CDA proposal, is still that it addresses millions and millions of speakers, unlike broadcast indecency law or the harmful to minors laws, which really only address a pretty narrow class of commercial speakers.

JJ: Apart from indecency -- under, I forget, the Sixth Circuit decision, Smith or Johns, whatever it was called, or even going back to Chaplinsky -- certain forms of speech have been considered obscene, such as the type of pornography we've been discussing, or other forms of hate speech, however defined. Is there a method that would not overburden other types of indecent speech or make people worried about whether they can say something political? For example, can you think of a method of controlling speech which is illegal, and has always been considered legal for the Government to control in some way, so that people with children aren't running into it in the streets of cyberspace, so to speak?

DW: I think there are two questions there. Is there a method of controlling it, and is there a law that can control it without burden? My view is that there really is a method: all of these various filtering technologies out there are the method of controlling this kind of material. And I would suggest, and Larry's pointed this out, that parents on the internet have so much more control over what their kids see than in any other medium, including walking on the street. The power of these filtering technologies, a lot of people argue, goes too far.

So I think that there are ways to control the speech. Speech which is illegal, whether it's obscenity or child pornography, is really not, in my mind, part of this discussion. It's illegal, it's a crime, and that's -- there are occasionally problems on the margin with obscenity law and child pornography law. It is true that occasionally a parent loses his or her child for six months to a foster home because some photo finisher has found a nude picture of a little kid. There are abuses of the child pornography law, and the obscenity law, but for the most part I think they actually work pretty well.

So we do have laws which control the really, in my view, the really serious problems here. And I think we have ways to help parents with the other set of problems.

__: I guess where I'm somewhat confused is, I see a lot of the same arguments that could be made against laws such as the CDA, which are over-broad and cast a shadow over speech that would not otherwise be affected. I'm having difficulty distinguishing, either technically or in terms of First Amendment jurisprudence, the arguments for not allowing broader laws while still preserving the Government's right to restrict the really obscene or otherwise impermissible speech.

DW: Well, I guess that the potential chilling effect of obscenity laws, of child pornography laws, has been resolved, really, in Miller, with the three-pronged ...(inaudible), to make sure that there's the minimum possible burden, the minimal possible chill, on speech.

And frankly, I don't think that you see people around the country worried about whether they're violating obscenity laws. I think when you see the ...(inaudible), there were really a lot of people, librarians, a lot of newspaper publishers, a lot of people who really were worried that they were going to violate that law, and weren't quite sure when that was going to happen, but they were worried about it.

By putting the illegal material, material that's not protected under the First Amendment, in the same category as indecent material, which is protected under the First Amendment, you kind of create a question that's a little hard to answer.

__: Well, now, in a way that seems to get at perhaps a belief that stuff that's really harmful to minors is probably stuff that is flat-out obscene; that there might be a belief on your part that if you've got some grey zone, it's not yet obscene but maybe it's harmful to minors, the harm is outweighed by the larger harm to the community of having individuals trying to worry about self-censorship and filtering their speech and doing tags. It's just kind of a balance that has to be struck. And you say maybe there's some speech that's harmful to minors that's not obscene, but let's assume it kind of gets through the net sometimes, and let the parents clean up that stuff on the margin.

DW: When you say harmful to minors, are you using the legal term harmful to minors, or the descriptive term?

P: Presumably they have some relationship.

DW: I assume they do, but I assume the kind of, the common meaning of harmful to minors is broader than the legal meaning of harmful to minors. What I think is harmful to minors are a lot of --

P: Just like an episode of "Married... with Children," or something. I guess a question that might be asked is, among kids, we have some kids -- kids are the protected group here, this is what we all want to protect. Some kids have responsible parents, and if you empower the parents to filter out this stuff, then we might not have to worry about them so much. Some kids you might call ...(inaudible), and they are kids who have either irresponsible parents, or the kids are so crafty that they get up in the middle of the night and sneak over to the computer and use it while the parents are sleeping or something. And basically they could get the benefit, or the harm, as you might see it, of this material, that might not be obscene but is still fairly, arguably, harmful for them to be delving into.

And unless the supplier is embargoing the material through some system, those kids are going to end up seeing it.

Now is your response to -- sometimes you break some eggs making omelettes, and those kids will just deal with it? Or is it OK to make a law targeted to protect those kids?

DW: Well, I think there are two issues there. One issue, I think, comes back to Larry's point about IDs: at some point, I do think, the harmful to minors cases, and other sort of adult material cases, have recognized that, especially if you're in the business of providing this kind of material, you are going to bear some of the burden, whether it's figuring out the age or keeping it behind blinders or arranging your store so that kids don't walk into it, whatever it is. At a certain point I do think we're going to have to say yes, the burden is not going to fall only on parents or only on kids who don't happen to be well taken care of. Part of the burden is going to have to ultimately fall on the providers of materials that are kind of generally regarded as bad stuff for kids.

And I think that it would be very hard -- I just think that's what most people in this country are going to believe, and I think that's what Congress is -- if Congress ever figures out how to deal with this issue coherently, they're going to come down at some point and say there's a burden on the providers of some kinds of materials to deal with this problem, because they're creating the problem.

The second point, though, is the question about the role and responsibility of the Government. Is the role of the Government to come in and be the parent, be in loco parentis, or is it the role and responsibility of Government to help parents be parents, help parents make the choices that they believe they should make, based on their own family's individual values?

And I do think we have to remember, we're not talking here about toxic substances that will poison children, or rusty nails, or lead paint -- we are not talking about stuff that is sort of scientifically established to be physically harmful to kids.

P: Your ...(inaudible) is an empirical matter to the scientific study that comes out that says ...(inaudible) is harmful to kids.

__: They look at it and the next day they beat up somebody in the school yard.

DW: I'm open to that, but I guess I don't think -- I guess what I'm standing for is that there is not the scientific study that is going to settle, for every single person in this country, what's right for a particular kid and what's not. Number one, because a 12-year-old is very different than a 17-year-old, and this statute, and possibly an ID solution, treats all kids exactly the same, and I think that's constitutionally suspect, you can't do that, and I think it just doesn't make sense.

So I guess what I'm saying is that it is a more complicated question than what to do about the fact that five-year-olds are going to be exposed to a ...(inaudible)

(simultaneous voices)

Michelle: I wanted to see if you had any conception of how, maybe in legislative terms or something, you could actually define this stuff that we generally regard as pornography? Because, you know, I mean, there are obscenity laws, and they're pretty clear. There's some vagueness, obviously, at the borderline, but they're pretty clear about what's obscenity. I think one of the major things wrong with the CDA was indecency -- that was the way it tried to get across that idea, but, I mean, that is just way too broad.

But some -- you know -- how do you say that this is the kind of material we're talking about ...(inaudible) self-regulate, and then you're also talking about businesses --

(end of side A)

__: -- website and just post it because they're jerks. Things like that --

__: They've got a job for years under copyright law, don't worry.

__: My own personal opinion on it, I really think exactly what you're saying: the government should provide a means for parents to be able to do something, if anything at all. But as a parent myself, I just really bristle at the idea of someone else making those decisions for me. I would hope I could bring my daughter up to have a decent set of values, and I think what a lot of legislators are afraid of right now is just turning around and saying, Listen, nobody else can watch your kids for you, you're going to have to raise them, and if you don't want them to do this, you kind of have to raise them that way, because that's a reality you have to deal with. But I think there's a real concern: how can the government help if it seems just about every definition you can come up with to target these types of places is also going to include somebody that needs to be protected?

__: Can I ask you, as a parent, what do you think you need help with?

__: Well, basically, I think -- I'm on the 'net a lot, and there is very little inadvertent stumbling upon something you don't want to see sort of thing. I'm not terribly concerned about that, so I wouldn't be really concerned about her getting on and accidentally bumping into some pornography. I mean, I was actually kind of amazed, I went into a newsgroup to download a fine art image for a desktop, and some of the titles were, like, fine art, but still, the title's there, I didn't have to open them, it was pretty obvious what I was about to get into if I did.

But I mean, things like Cyber Patrol and Net Nanny, and things like that -- they have a huge problem, from what I can see. So far, they're not products that I would be terribly interested in using because, while they block out a lot of stuff that you obviously wouldn't want your kids seeing, at the same time they block out access to stuff that I really wouldn't mind. I mean, there's not a great level of control. I mean, they have the lists that have all the bad sites stored, whatever. But that list isn't released, because they don't want the 12-year-olds getting the list to go to all those sites.

__: Well, the problem is you have no idea who Cyber Patrol is.

__: Right. I don't know who they're banning and that kind of thing, and I would like to have a lot more ...(inaudible) control over what can be seen and what can be seen in my house.

DW: And what would you say the Government ought to do to give you that control, or help to get you that control?

__: See, I would fully agree, I don't like net regulation, I don't like some Government, especially the United States Government, saying, hey, we're going to tell you guys what to do here. It didn't start that way, it shouldn't continue that way. But kind of like somebody had to take over domain names, like we were talking about with trademark, I think somebody's got to offer their services in some way, maybe it's some sort of contractual relationship, kind of like we were talking about with PICS, the filtering systems, last week.

If there's something like that offered, where you can get more control over what's going on. I'm not sure about asking people to self-regulate, because I really wouldn't trust a lot of these people to do it properly. I mean, I don't feel comfortable with saying, OK, I believe what these people say as far as the rating on their site, I'd really have to see it to believe it kind of thing.

So the idea of a third party is a lot more appealing, kind of like you ask your friends for advice. Like, what do you think, you've seen that movie, should my kids see it?

P: You might trust the CDT rating.

__: Right, exactly, that's something that I might have, but do you know what I mean? If there was something, an organization or something, that I had some level of faith in, I might be more inclined, but I would still want the ability to maybe de-select some of these things, that would be really important to me. If you blocked off 2,000 sites, I would want to know what they were in some way, and maybe be able to pull back 100 of them.

DW: Well, let me suggest that I think you've hit on a huge practical problem on the net, which is that these filtering companies have sort of sprung up, they do these things, they're basically software people who do this. I know a lot of them, they're very nice people, most of them, and based on the fact that I know them, if I wanted one of these products, I might decide that I'm going to buy this one instead of that one because I know who did this one.

But I'm in a very strange position. And I think that a real problem with this whole filtering arena is that there are not clear reputations established, there is no real basis for trusting or not trusting any of these filters.

Yet, a lot of the -- Consumer Reports has done a couple of tests, and they tend to say, Well, we found sites that they didn't block that they should have blocked, or we found sites that they did block and shouldn't have. But you still don't get -- my belief is that's probably inadvertence or carelessness on their part, not a reflection of any set of values that says, well, we are going to block the National Organization for Women site because we believe they're pernicious.

P: When you say yet, you're saying soon they will be --

DW: Let me say, the hope in creating the PICS standards, which are really just -- PICS is the Platform for Internet Content Selection -- which provide a platform, a set of technical standards, by which anyone can either self-rate their own site, or any organization can create third party ratings of other sites, whether they're newsgroups or Websites or particular pages on Websites.

So the hope, and it may have been a naive hope -- I was involved in this quite a bit -- the hope, our reason for getting involved in this, was to enable the development of these kinds of trusted institutions, or to enable institutions that we already trust, whether it's TV Guide or the Christian Coalition or the PTA.

P: ...(inaudible)

DW: Millions of people open up TV Guide and decide what to watch every night, because they trust TV Guide. I don't read TV Guide, but whether they have stars or they have descriptions or whatever they have, they have something that makes people trust them.

And I think, my own view is that to deal with this problem, which I think is a problem about stuff that's not good for kids, you have to have institutions that can help with that.

I happen to think the Government is a terrible institution to do that. It's going to become like the NEA or the NEH and become -- you're shaking your head.

Alice: I do have a comment. I think it's interesting that there's so much concern on the part of President Clinton and the administration, and I'm a Democrat, that we don't want any interference with the internet because that will stifle the development, and all this wonderful stuff.

And then, in fact, they're proposing something that many people -- and I think there's pretty much agreement on this -- say will be a real burden on, or have a stifling effect on, the internet.

So it's a contradictory thing. Also, the people who are pushing it are generally people who say no government interference in business.

DW: As a politician, you're going to have to explain that to us.

Alice: Well, I think people have a right to question people's inconsistencies in what's going on.

P: Can you describe the Bill you introduced about filtering?

Alice: I didn't introduce that one; that was introduced by a conservative member of --

P: And have you come out with a position on that?

Alice: And, in fact, it's the person right now who is running for Attorney General, because our Attorney General is stepping down to run for Governor. And so this legislator, whose name is Charlie Weaver, from ...(inaudible), which is right next to me, in my district, introduced this Bill, to say that schools had to have that on --

P: Have you come out with a position on it?

Alice: Yes. We, in fact, defeated his original legislation and modified it to say that the Department of Education had to provide some information and make that available -- some software and recommendations -- to schools. So that if they would want it, schools would have a place to go to, but they would have a choice whether to do it or not. Which to me seems OK. If a school board wants to do it, and there are enough people in that area who say this is what they want, and it reflects their values -- much like monitoring books -- then OK.

So that was what we ended up passing. But it was a major bill, and this person is going to be running for Attorney General.

Melanie: I have a couple of things. One was on the role of government, because it seems like one of the things that PICS was designed to do was to be something that would allow you not to have Government take full control. The idea, I think, was that there would be a voluntary movement toward the ...(inaudible) systems, and the role of government would be limited to something like making people liable if they were to mislabel a site.

And PICS itself, actually, is a little more general; although it was developed for content selection, it's apparently generalizing now. What PICS lets you do is really just put a label on a site and send the labels. It's just a ...(inaudible) to do that. So you can actually build intellectual property protection systems based on PICS. You can build -- for libraries, they're trying to figure out what the best ...(inaudible) resources on the 'net, and if ...(inaudible) Websites were labeled as to what they contained, a library could build a resource system.

So PICS itself provides a base on which to build a labeling system that can be used for many different things. One of those things is content selection. So here the question is, Well, who gets to decide who gets to put the label on? Who gets to decide what the label says? That's really the issue.

The argument about the thing, I think, is ancillary, because the real concern is just, what is the content? So, for example, if the Christian Coalition is deciding what the content is, it'll be very different from CDT deciding what the content is.

So, one of the things that's come up, a big issue that I was going to raise, is a practical question: what do you do with something like a newspaper? I mean, if I'm a newspaper, every day I have to go through some labeling, and there's a question as to, do you label the newspaper, do you label every page? If there's a picture of something happening in Bosnia, is that not a paper for children to see because it's graphic? Those are really tough questions.

The news organizations are an exception. And then there's the question of, well, if you get a news organization that accepts ...(inaudible). So there's some practical issues, too.

Anyhow, all this is to say I think that government doesn't necessarily have to be involved in order for the PICS movement to take off, because there will be enough demand from the parents' side, in terms of trying to find a way to limit children's access, that there would actually be a marketplace for this. Therefore businesses would self-label, because the rating systems that exist now will block out any ...(inaudible) site, which means that a business would have a strong incentive to label purely as a business model --

DW: As long as we're on the framework of PICS, what would you say is the level of refinement that PICS is designed to get to? Is it down to the URL, the Web page? Is it a site in general, is it a particular paragraph?

__: It's designed to go down to a URL. Labels are associated with URLs, so that can be the whole site or it can be any page on the site, or any set of pages that fall underneath a particular URL.
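[To make the labeling idea concrete: a sketch of what a PICS-1.1 label can look like, here using the RSACi vocabulary (violence, sex, nudity, language, each on a 0-4 scale). The site URL, date, and rating values are purely illustrative.]

```
(PICS-1.1 "http://www.rsac.org/ratingsv01.html"
  labels on "1997.09.17T10:00-0500"
    for "http://www.example.org/index.html"
    ratings (v 1 s 0 n 0 l 2))
```

[A label in this form can be served by the publisher itself or distributed by a third-party rating bureau from its own server, which is what makes both the self-rating and third-party-rating models discussed here possible on the same technical base.]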

I just have to relate one story from the CDA trial, in Philadelphia, that has to do with newspapers. I can't remember which day of the trial, but Judge Dalzell, who is a political appointee who, in my view, was essentially acting as an ...(inaudible) up there -- I shouldn't say that, but he really was. One day, when the Government was presenting its case, he had a copy of the Philadelphia Inquirer with him, and there was a color picture below the fold on the front page that some of you may have seen. It was a terrible picture from Liberia, of some police officer executing some Liberian citizen in the course of some protest. It was a very gruesome, horrible picture.

And ...(inaudible) a little bit rude of the Judge, but right in the middle of a cross-examination by the Justice Department attorney, Judge Dalzell held up this newspaper and said, So are you saying we need a newspaper decency act now? Because there was this picture in the newspaper that was clearly of the class of information that a lot of people would consider bad for kids.

Let me just say one thing about the question of newspaper ratings, and I think it bears on the effectiveness of the self-rating approach versus the third party rating approach. My own totally subjective view -- this is not based on constitutional arguments -- is that self-rating as a general solution is never going to work because, first and foremost, as someone said, number one, most people don't have any reason to trust the self-rating of a lot of publishers. Number two, because most people who publish on the Web have absolutely no incentive to rate their own site.

And you can argue that it's simple, but I can tell you --

P: They have no incentive unless there's a government law.

DW: Right. The only way self-rating, I think, could ever be even remotely effective is if it's required. The other way that it might be effective is if, as you say -- let me just say. You mentioned the browser options for blocking unrated sites. The way that the Microsoft Internet Explorer Web browser is currently configured to support PICS, it gives you the option of blocking out a site that does not have any rating at all. Now that does create what I view as a somewhat perverse incentive to get people to rate.

CDT tried to rate our site, based on the ...(inaudible) rating system, and we found we couldn't -- or we could, but the result didn't make sense -- because we have testimony that we were invited to give by the Senate Judiciary Committee on the counter-terrorism bill, having to do with bomb-making information on the internet, and other sorts of issues about terrorism and law enforcement.

So we have graphic descriptions of violence, in some cases, we have graphic descriptions of explosions and things like that. And we ended up with a violence rating, according to the ...(inaudible) scale, that made us look like the Jean-Claude Van Damme fan page.

P: If the shoe fits.

DW: And that -- now, were it the case that there was either a law requiring rating, or if browsers were set up so that we were penalized for not rating, by essentially denying access to anyone who wanted to see the page that we didn't rate, I think that would be a very unfortunate result.

What I think is better about third party rating is that it does not need to pretend to be objective. The ratings can be, by definition, subjective. They can be the Christian Coalition's value judgments about content; they can be the Boy Scouts of America's value judgments about content. And no one worries about -- and you can decide that if the Boy Scouts of America doesn't happen to rate the CDT page, you can say, Well, don't go there. But that's going to be a decision based on, hopefully, the reasonable judgment of someone you trust, as opposed to the fact that someone got sort of pushed into a self-labeling system that doesn't make much sense.

__: Are you concerned about the impact of that model on the little guy, the non-commercial speaker, who is confused and can't afford to figure out how to rate his site on a daily basis? And, two, he may not be likely to be on the radar screen of any third party rater, and so will be an unrated site --

P: Thanks to the default in the Internet Explorer that says, Don't look at sites --

__: Which makes sense, from an effectiveness standpoint, from the parents' point of view, but means that commercial speakers will even more dominate the net than they do already.

DW: I don't know whether I'm -- I guess that I am -- I think that I am concerned about the small versus large speaker. I'm not sure that there's a difference between the commercial versus non-commercial speaker, but I do think there may be an issue about small speakers.

At the same time that this browser default issue was going on, the people who were involved in it were trying to get search engines, Alta Vista, Yahoo, to not index pages that were not rated. The search engines ...(inaudible) said, no, we're not going to do that because -- we're just not going to do that.

I guess I have some amount of comfort that there's going to be a kind of equilibrium here, that people are not going to choose this sort of default option if what it means is that they all of a sudden lose access to 90 percent of the sites.

It may be that some parents do make a choice to say, I only want my kids to see a certain list of sites that have been approved by someone. That may be a very reasonable decision for a parent to make, and it may mean that that kid doesn't get access to the CDT site, or doesn't get access to a site that's frankly much more controversial -- sex education sites, etc. That's an issue, I guess, I think that's an issue in the real world, and it's an issue about choices that parents make. And all of our opponents, who support the CDA, always point out how easy it is to go around filters, or go to the library, or go to the neighbor, or go to wherever, and I think it is always going to be important to have that as a kind of safety net.

__: Just to add to that, just directly related to that, isn't there something kind of unfair about the third party rating systems? There's never any kind of notification, or it's very hard to know, if you're speaking, whether it's going to be heard --

P: It sounds like a credit rating or something.

__: You just don't really know what's going on. You put your speech out there, but you've been axed by everybody because -- for whatever reason. And there's no sense of whether your speech is ever being heard.

P: So you're calling for Government regulation requiring third party raters to notify the ratee how they've been rated.

__: I'm not saying what I'm calling for, but doesn't that bother you at all?

DW: I don't know if that bothers me. I mean, I've thought about it. I guess that my view of third party rating is that what it will do, ultimately, if it ever really happens on a large scale, is give people who use the 'net the option of making the 'net somewhat more like traditional media, and that may not be a good thing. But it will essentially say that, yeah, you wake up in the morning and you read the New York Times, or you read USA Today, or you read The Boston Globe, or you listen to NPR -- you do whatever to kind of filter your view of the world.

And if it were required -- if people were legally required to look at a filtered view of the 'net every day, yes, I would be very concerned. If what we're doing is enabling lots of different slices of the 'net, lots of different perspectives on what's out there on the 'net, I can understand the potential problem with that, but my reaction is that it's the way we engage in kind of mass speech anyway.

__: I have a question about -- I was wondering what your thoughts are on the view that seemed more common during the sort of "raunchy" TV debates, about NYPD Blue and the like: the view that violent and racist speech, the Nazi Web page, is much more harmful to children, and to everybody else, and has caused much more damage in history than has sexually explicit speech. And a society that regulates sexually explicit speech at this fast structure, and blah-blah-blah, what does that say about ...(inaudible).

DW: I tell you. I think that -- you go around this country and you talk to different people who are concerned about one kind of speech or another, and the fact of the matter is, you go around the world, you go to Germany, the OECD, which is an organization of about thirty of the more industrialized countries around the world, you look at the question of content regulation on the internet, and what is being found in that process is that virtually every country has its own idea about what's the really bad stuff.

And I guess the reason that I'm drawn to the third party filtering and rating approach is because people can make their own choices about what's the bad stuff and what's not the bad stuff.

I'm frankly most concerned -- I'm going to be a father in about a week or so, so when I kind of think forward to my future child, whoever he or she is, I'm probably more concerned, myself, about commercials than anything else. I'd really like to have a C-chip to get rid of commercials.

And once I have this child I imagine I'll have a different set of ideas. But I think that this brings us back, really, to this question of harm, and what's harm and who decides what's harm, and it's not as simple as lead paint. And I know there are a lot of studies of violence on television ...(inaudible). Some people think they are very conclusive studies and other people think they really don't demonstrate anything at all, other than that we live in a violent culture.

So, I think when it comes to speech, it's just better for people to be able to make their own decisions.

__: Especially ...(inaudible) it seems like sort of ...(inaudible) a common belief that at least obscenity ...(inaudible) and wouldn't that point about, if we've made a choice to allow the harms of Nazi speech to go on because of other values, isn't it somehow perverse to not do the same for obscenity?

DW: Well, we already do it for obscenity, I mean, obscenity is illegal.

__: Not illegal, but Nazi ...(inaudible)

P: You're saying, as long as you're letting anything go you might as well let everything go.

__: Well, I'm saying it says something about priorities and value choices, about letting this genre --

DW: It's a political statement about the fact that the United States has chosen to focus on obscenity and other sexually explicit speech, where Germany and the European countries have chosen to focus on neo-Nazi speech or xenophobia. Yeah, it is reflective of whatever our cultures think is especially -- I don't know that there's a kind of universal consistency standard that we ought to apply. It's a cultural choice that has real reasons behind it, I think.

Larry: I want to get back to your commercial/non-commercial ...(inaudible) figure this out. Let's say that you ...(inaudible) person and you wanted to spread porn in the world, in this case. Say you wanted to have 12 million people a day seeing porn. It's just the fact that, in real space, to do that you've got to be a commercial venture; it costs a lot of money to do that. You want to take pictures, you want to print it ...(inaudible).

So the only way to do that, unless you're a billionaire, is to charge.

Now, go to cyber space. You can distribute 12 million pictures in cyber space, every day, for pretty cheap, just about the cost of the scanning, and sending it off onto the usenet, or even have ...(inaudible). And the consequence of this is that the biggest boon in the porn industry is the ...(inaudible) producers, or amateur porn producers. This is what the usenet is filled with, is this stuff.

Now, you were saying that it was OK in real space, because in real space ...(inaudible) commercial ...(inaudible) it's OK to impose ...(inaudible) on that, because they're already in this business.

But I guess I'm wondering about the proper base line when you get to cyber space, because you could look at it, you could say, Look, cyber space has given the porn producer an extraordinary subsidy. It's now possible to do for practically free what used to cost millions of ...(inaudible) Playboy ...(inaudible).

So why not sort of think about it -- if you thought of it as this kind of subsidy, then you already ...(inaudible) just imposing a little bit more of the burden that you would be facing in real space anyway, in order to filter this ...(inaudible) not to have it. And now, relative to real space, doesn't seem to be such an extraordinary thing.

So how -- what is the significance of the commercial/non-commercial distinction ...(inaudible).

DW: I guess my impression of the case law is that the significance of the discussion of burden has to do with the concern about not burdening speech that might somehow be important, and that the -- so that the burden on the pornographer in the real world, in the physical world, is OK because, number one, they can bear it, and number two, because the speech is not quite as valuable.

__: I think bear it means, there are people that can publish and bear it, but if they didn't have the burden in real space, there'd be more porn publishers, in this case. If it were cheaper to produce porn in real space, there'd be more publishers in real space.

P: You're saying the commercial/non-commercial, the distinction may be just a proxy designed to capture the evil person, or the highly motivated person you describe, who just wants to spread porn. That person, in real space, has to end up charging for it. It may be a sort of non-profit, eleemosynary organization, it might be NAMBLA, but they still have to charge because, in real space, you just have hard costs.

But now, do you think it is, it's a proxy or is it just commercial/non-commercial gets at, hey, if you're in the business, if you're there to turn a buck, then we assume you're a sophisticated player and you should internalize this cost like any other, so it's fair to give you laws that are difficult to parse, you have to hire an attorney or something.

__: I don't think it's a proxy, I really think it's a reflection of whether -- you've got a government purpose, right, which is protecting kids from -- whatever, and you've got the question, I think, that the courts have to face: can that government purpose be met without doing violence to the free speech rights of the producers and of the adult receivers?

And I think that, off the net, the commercial/non-commercial distinction has helped sort out the answer to that question: Can this speech which is constitutionally protected, maybe a little less, but it's still constitutionally protected, be accorded the appropriate amount of First Amendment protection, while still --

__: But ...(inaudible) kids, right, because there are no non-commercial porn distributors in this space. And so, in the 48 states where we have laws that sort of regulate harm to minors, you don't have the sort of Jonathan's group set up next to Playboy, that says, well, we're doing it for non-commercial reasons, so we don't have to live up to ...(inaudible). The point is, you just don't have them in real space.

P: So you're saying it could be a proxy, we just don't know, right?

__: I'm not sure your proxy ...(inaudible). I mean, I'm not sure ...(inaudible) the proxies. I'm just wondering about how easily you can carry over the commercial/non-commercial distinction in cyber space, given there's no non-commercial distribution of porn in real space.

DW: Right. Well, I guess the reason that -- I still think that the appropriate analysis is about -- it is still -- it's always, I think, going to be, whether, regardless of which kind of speech is...(inaudible), I really think it is always going to be the balance between the legitimate interest in protecting children and the free speech rights associated with that content.

And so I think that, yes, in the physical world, the commercial/non-commercial distinction is a way of sorting out that balance, it's a line that I think you can draw. And I guess I think that it's appropriate to draw that line in cyber space also, because, at least what we saw in Reno was that there are real costs associated with complying with the defense in the CDA, and that that will have a real effect on the free flow of this constitutionally protected information.

And I guess I do think there's some kind of equity principle in the background that says, yeah, some people can bear that cost and on some people the cost is too great, because it will interfere with their ability to exercise their right.

P: As a matter of doctrine, is commercial speech done by a profit-making entity, or is it speech for which there's a charge to hear or view the speech?

__: Or is it advertising?

__: What is commercial or non-commercial speech, actually, even in relevant ...(inaudible). Has it?

DW: It's not come up, and that's a nice point, is that we don't have that case. My point, though, is that the cases we have are all about speech of commercial entities who have the money to deal with the regulatory burden.

__: Even with George Carlin?

DW: It wasn't George Carlin, it was WBAI and Pacifica. George Carlin wasn't -- I mean, he wasn't even in that issue there, it just happened to be his monologue. It was 'BAI and Pacifica could go and hire lawyers and deal with the FCC and deal with the courts, and yet Carlin didn't. And the other big indecent speaker these days is Howard Stern, and it's not Howard Stern the private citizen, it's Howard Stern --

__: So you think now that the ...(inaudible) of Howard Stern ...(inaudible)

P: There must be a lot of Howard Stern wannabes actually on the internet, right? Maybe they're even broadcasting RealAudio.

DW: Could be.

P: And they're non-commercial.

__: using ...(inaudible)

DW: I think Reno said that that is the difference, though. I think that Reno said the burdens on those individuals are simply too great. And I think that is, in substantial part, because of the ...(inaudible) who came before the court. Because you had librarians, because you had Stop Prisoner Rape, because you had Critical Path AIDS Project, it was really, I thought, instructive to just sit and listen to the whole argument, and see that what the justices were concerned about was, How is this going to affect librarians? How is this going to affect people who use the 'net? Not, How is this going to affect Penthouse?

I think the case would have been totally different if it were Reno v. Penthouse. I think we would have lost. I mean, Playboy lost ...(inaudible)

__: ...(inaudible) protection of non-commercial porn providers. Do you think ...(inaudible) can be extended to say --

DW: I don't really accept porn provider as a legal category. I mean, the pro-censorship people try to create that as a legal category. But yeah, I think --

__: What about Ginsberg speech providers?

DW: I don't think it's answered the question of Ginsberg speech providers. Of the speech providers who are squarely and solely in the Ginsberg category, I don't think we know the answer to that. Do you?

__: Well, Ginsberg's a commercial ...(inaudible) commercial ...(inaudible) provider of Ginsberg's speech. That's an easy answer. But you're saying, if it's generalized behind the commercial, I'm not sure whether Ginsberg's speech would ...(inaudible)

DW: Right. The harmful to minors cases are interesting. There is one harmful to minors case that was litigated, ALA v. Pataki, which is a litigation about an extension of the New York harmful to minors law to the 'net.

And the interesting issue that came up there is that the whole case essentially turned on the commerce clause. I mean, there was some nice First Amendment language there, but it was a commerce clause case. Now, that's partly because we're at the state level. But I haven't thought about what happens when you bring the 'net into that arena.

P: We're actually prepared to shift gears shortly, but did you have something to state that --

__: Yeah, it's a ...(inaudible) question. I don't think it will take very long, though. You talk a lot about the 'net being decentralized, I think that's apparent for a lot of reasons. Isn't there an exception in the world, though, where 80 percent of the speech on the Web is ...(inaudible) to a Netscape browser, and still always either be ...(inaudible) Bill Gates has his way, in a couple of years it'll be 80 percent Internet Explorer and 20 percent Netscape.

If that's the case, isn't it in fact true that this is almost like all the communication, it's like ...(inaudible) except one country owns and controls all the television sets. So aren't there a whole set of issues that you've raised, then, just vis-a-vis Netscape and Microsoft, and aren't they a lot like the providers of a scarce broadcast medium?

Can they be free to do what they want? What if Netscape tomorrow said, we're not going to let anybody see on our browser democratic ...(inaudible) sites, which is ...(inaudible) proposing. Can they do that? What if they decided to regulate pornography. Or could the government tell them that they have to dole out this scarce medium and operate their window to the world? Because there are special kind of rights, duties, ...(inaudible) and all that.

And this would go back to all the First Amendment case law that regulates the ...(inaudible) and all that.

So, I don't know, isn't there a huge issue there? There is, in fact, a central ...(inaudible) the Web, and it's not ever going to change, or it's not likely. You ask anyone about the browser wars, there's going to be one or two people on it, but no more.

DW: It's a really interesting issue, and I think the World Wide Web Consortium up the street here is engaged in a very delicate balancing process between Netscape and Microsoft and all the other kind of, the one and two percent browser categories, and all the other companies that have an interest in Web standards.

I think that if -- so far the Web consortium, I think, has been reasonably successful at keeping the standards moving in a way that reflects the general interests of the commercial internet community, if you will. Now, not totally because I think there certainly have been points where they've been heavily influenced by Microsoft and by Netscape.

I think the moment that the appearance of that open standards process falls apart, someone's in for big anti-trust problems, and I think all the players know that, and I think that's why they keep the Web consortium going, in some part.

So there's an anti-trust issue aside from a free speech issue. I guess my view of the scarcity categorization in broadcast regulation is a little bit different than Netscape, because I don't really think the browser is the medium. Right now, the browser, at least my sense is that right now the browser does not impose content restriction; the browser does not say, you can only have three channels.

Now, I notice that in Internet Explorer 4.0 they have this thing called channels, and I honestly haven't figured out what it is. But there's PointCast, which -- that's right.

And so certainly on the PointCast network, I guess, they have eight channels, or something like that. Is that broadcast scarcity? I don't really think so, because I can go out and create the CBT network and have my own eight channels.

__: But you don't own 80 percent of the population. Microsoft, they're going to have netcasting in all the next versions, they're fighting about it, so there are going to be two players who decide -- Netscape is going to decide what eight channels get broadcast to 80 percent of the people on the Web, and it's not unimaginable that Bill Gates, because he's fighting in the home segment, is going to say, By the way, we have set up, these are the 400 worst porn sites in the world, and just click here and your browser can't get into them. I mean, these people have enormous power to dole out --

DW: But can ...(inaudible) get it to them?

__: They may give you an option, they may not.

DW: Well, is that still scarcity, is that still control?

__: You can't deny that there's control.

DW: I mean, I can't click on my radio station and say, I want your radio station, because it just won't happen. I think there's a difference.

__: Wait a minute, there can't be a difference here. If Jim Barksdale and Netscape can put anything they want into that browser, unless you believe that there's different First Amendment considerations, they can publish a browser tomorrow which says that you only get these eight channels, or these or the eight channels we recommend -- they can do anything they want.

DW: They'll get a different browser, right?

__: There's no alternative, really, to internet explorer, because it's not compatible --

__: Well, ...(inaudible) that are not doing this, there are only two ...(inaudible) All of sudden Netscape's saying they can only ...(inaudible) And there's nothing to block alternatives coming in.

__: What's the alternative? On the desktops in the world there are only two browsers.

(simultaneous voices)

__: There's nothing to stop an entry.

P: But now, do you have to worry about disclosure? I mean, you don't know what's in the innards of Netscape or Internet Explorer, or Yahoo. I mean, we have some sense that when you do a search on Yahoo, the things that come up first might be things that have some financial relationship with Yahoo. I mean, you think they must be paying a fee to come up first. And if it turned out Pat Robertson owned Yahoo, would it be a scandal? Wait a minute, I'm not going to search with Yahoo anymore. What if he's tweaking the results?

__: But I think that would be a good thing, because I think that everybody who likes Pat Robertson can say, Great, I'm going to go to Yahoo.

P: Right. But now should there be disclosure? That's what I'm asking.

__: Does The New York Times disclose how they decide what they put on the front page, or what they left on the editing floor? Should they disclose? I think we'd probably be up in arms and say that's a horrible First Amendment violation, to force our way into your editorial decisions, that's what I think we'd say.

P: And is it an editorial decision, when you click on the Microsoft Internet Explorer button that says, do a net search, tell me everything you can find with such-and-such, and ...(inaudible) Microsoft software.

__: That is absolutely ...(inaudible) I think it absolutely is. And in libel law, I think they would be liable not as just the sort of carrier of that stuff, they would have a higher level of liability for those decisions. Absolutely.

P: Well, I think we should probably shift gears. Actually, ...(inaudible) we realize we may only hit two of our topics today rather than three. Perhaps it's partly a fortuity that the computer apparently is not working, with no explanation given by AV, but it doesn't work. They actually said, well, the internet card isn't initialized.

__: It's Netscape. Barksdale was here.

P: But we do, I think, have at least somebody assigned with telling us any late-breaking news, from last week to this, of relevance to internet and society. So maybe we should just take two or three minutes and hear that news. Kathryn, is that you?

Kathryn: That's me. I did a search, a million things with just doing internet in the last couple of days, and I didn't find anything of late breaking news. But I can spend ten minutes telling you what I found.

P: No, that's OK. It's sort of, John Brown's still dead.

Kathryn: Yeah. The only ...(inaudible) of somewhat importance is something that happened last Wednesday, and I don't even remember if we talked about it or not, but we talked about the sale of CompuServe, that the anti-trust division at the Justice Department is investigating whether or not it's going to make America Online and WorldCom, whether it's going to violate ...(inaudible) positions, anti-trust positions, because they're going to be the only ...(inaudible)

P: Except maybe Microsoft. It's small.

Kathryn: That's it. But I'll tell you what I found. One is a survey that was just published that says that the internet's dramatic growth is continuing, according to a new survey.

P: This really is John Brown's still dead. Internet growth has halted.

Kathryn: It's been doubling every year, and it's still doing it.

P: And what are the units?

Kathryn: The units?

P: How do you measure internet growth?

Kathryn: Actually, don't say that. They count centralized server computers, workstations, and each modem ...(inaudible) internet service providers. And as of September, this month, there were 26 million.

__: In the United States, or everywhere?

P: Somewhere.

Kathryn: Everywhere.

P: 26 million things.

__: 26 million what, modems, servers and whatever, 26 million of those?

P: That's more internet. I mean, it's precise, it's not necessarily accurate.

(simultaneous voices)

Kathryn: Christian Returna, the chief scientist in ...(inaudible) internet architectural research laboratory, did the survey, so you can give him a call if you're not satisfied with the survey.

And ...(inaudible) Digital and Intel ...(inaudible) Monday the first in a series of online art installations. This is going to be the first online art gallery.

(end of tape 1)

__: -- treaties for copyright. I was just saying that the industry is putting more pressure on Congress to sign the ...(inaudible) treaties, and to draft and pass alternative, or not alternative but additional legislation clarifying the liability of online service providers.

P: Yes. In fact, there's a bill in Congress right now about that, too.

__: Yeah. By Ashcroft.

__: Johnny Cash testified on that bill today.

P: Johnny Cash did?

__: Yep.

P: By voice or by song?

__: No, I don't know if he sang.

P: And your methodology for those stories was you went to a search engine and you typed in internet.

(laughter)

__: No, actually, I didn't do this on the Web, I did it on ...(inaudible), and I typed in internet and the last set of days. But I only went into --

P: How many hits did you get?

__: Huh?

__: Well, I limited it to the Wall Street Journal, the New York Times, and a couple of online magazines. That wasn't too successful, I would say. But I had about 1,000, so I picked up the first 100.

P: Great.

__: Yeah.

__: Well, I have another piece of news to add. The -- what we talked about last week, not this week, last week, last week the F.B.I. proposed, for the first time publicly, clearly, in legislative terms, that they want to ban the manufacture or sale of any encryption product or service which does not provide law enforcement what they call immediate access to the plain text. So that essentially means that it would be illegal to manufacture or sell a product or service, and the addition of the service is really important -- any kind of encryption technology that the F.B.I. doesn't have back-door access to.

This was really partly because we've been bombarding the press with every little sort of tit for tat in the encryption debate. They've gotten sick of us.

P: It's the crying wolf syndrome.

__: That's right. But it's very significant, and I'm sure you've all, for one reason or the other, followed this encryption debate that's been going on since the Clipper chip. The Clipper chip was a proposal whereby the Government would build something called the clipper chip, which would be an encryption device for use on telephones, you could hook it up to your telephone and have a conversation with someone else who had a clipper chip device on their telephone, and the conversation could be scrambled. The only hitch about it is that two Federal agencies, which was going to be the Treasury Department and the Commerce Department, would hold keys to unscramble that encrypted communication, which they would turn over to appropriate law enforcement agencies upon presentation of appropriate legal process.
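[The two-agency escrow idea just described can be sketched in a few lines of Python. This is an illustrative toy only: the real Clipper chip used the classified Skipjack cipher and a Law Enforcement Access Field, whereas here a simple XOR split of a session key stands in for the concept that neither agency alone can recover the key, but the two shares combined can.]

```python
# Toy sketch of Clipper-style two-agency key escrow (illustrative only,
# not the actual Clipper design). A session key is split into two XOR
# shares; each escrow agency holds one share, and only by combining
# both shares is the key reconstructed.
import secrets

KEY_LEN = 16  # bytes

def split_key(session_key: bytes):
    """Split a key into two XOR shares, one per escrow agency."""
    share_a = secrets.token_bytes(len(session_key))               # e.g. Treasury's share
    share_b = bytes(k ^ a for k, a in zip(session_key, share_a))  # e.g. Commerce's share
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    """Combining both shares reconstructs the session key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(KEY_LEN)
a, b = split_key(key)
assert recover_key(a, b) == key  # both shares together recover the key
```

A single share is statistically independent of the key (it is uniformly random), which is the property that lets the proposal claim neither agency alone can decrypt anything.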

I think encryption is another one of these issues where you have sort of radically decentralized network technology that puts really amazing power in the hands of individuals, the power to scramble up your communication in a way that makes it very hard, if not impossible, for law enforcement government agencies to decrypt it.

The reaction now that we've seen is that there's an effort to sort of reimpose centralized control by saying at various times there have been different proposals, but the proposals essentially amount to what is called key recovery, or key escrow, whereby when you encrypt a particular piece of communication, or information, the key to decrypt that piece of information would have to be held by some third party, a bank or an insurance company or -- who would be answerable to law enforcement process.

So that the same way that the F.B.I. goes to the phone company with a Title 3 ...(inaudible) based on probable cause that says, we have authorization to tap this particular phone line to get these particular conversations, that some process like that would be presented to the recovery agent, the key escrow agent, and law enforcement would get your key.
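[The key-recovery flow just described, where an escrow agent releases a key only against legal process the way a phone company responds to a Title III order, can be sketched as follows. Every name and class here is an illustrative assumption, not part of any actual proposal.]

```python
# Toy sketch of the key-recovery / key-escrow flow described above:
# a third-party escrow agent (a bank, an insurance company, etc.)
# holds users' keys and releases one only upon presentation of valid
# legal process. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class LegalProcess:
    target: str
    court_authorized: bool  # e.g. a Title III order based on probable cause

class EscrowAgent:
    def __init__(self):
        self._keys = {}  # user -> deposited key

    def deposit(self, user: str, key: bytes) -> None:
        """Users (or their encryption products) must deposit keys in advance."""
        self._keys[user] = key

    def release(self, process: LegalProcess) -> bytes:
        """Release a key only against valid legal process."""
        if not process.court_authorized:
            raise PermissionError("no valid court order presented")
        return self._keys[process.target]

agent = EscrowAgent()
agent.deposit("alice", b"alice-session-key")
order = LegalProcess(target="alice", court_authorized=True)
assert agent.release(order) == b"alice-session-key"
```

The operational objections discussed next (cost, scale, and the database becoming a hacking target) are all about running something like this `_keys` store for hundreds of millions of users on two hours' notice.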

There are a whole lot of technical and operational issues with this. We worked with a bunch of cryptographers in the last year to look at just the practicalities of doing this. What they found is that it's very expensive to do this. You're talking about figuring out how to store probably hundreds of millions or billions of keys in databases somewhere, that have to be provided to law enforcement, and not provided to anyone who pretends they're law enforcement, on two hours' notice.

That's a complicated, expensive proposition. It's also risky because it means that these key recovery databases become incredible targets for hackers, and not just kids who are fooling around but people who think they can make money by having access to people's private keys.

P: Now, if they have access to the keys, they also need access to the conversation -- I mean, they have to have access to the door, right?

DW: That's right.

P: So, the key is only of use if they're also eavesdropping on the telephone line.

DW: Well, this goes far beyond telephone lines, and I think that's really -- in my mind, one of the really interesting constitutional issues, Fourth Amendment issues, about this is that if you -- if anyone is interested in going back to 1968 and looking at the debate over the Federal wiretapping statute, the civil liberties community at the time, joined by Arlen Specter, who was then the District Attorney of the City of Philadelphia, all said that this amounts to a search that violates the Fourth Amendment because it's a search without notice. The famous dictum in, I can't remember what 17th century case, a man's home is his castle, is that the bailiff can't come into your house without knocking on the door and presenting a warrant.

The knock and announce requirement is still a requirement that is very important in just day-to-day criminal cases. Law enforcement can find exceptions to this requirement based on exigent circumstances, the usual sort of Fourth Amendment analysis.

But as a general matter, the bailiff can't come into your house without explaining to you, or at least having a document that explains to you, the justification for doing that.

So, wiretapping, if you think about it, is a search, or a seizure, depending on how you think about it, without notice. And it's sort of obvious why it's without notice. If John Gotti picked up a telephone and got a little beep that said the F.B.I. was listening in, he probably wouldn't say what he was going to say.

And the argument presented by law enforcement during the '68 debate was that this was not just going to make law enforcement's life harder, which the Fourth Amendment often does, but it would discourage the very creation of evidence that would otherwise just not exist.

So that was the way that everyone talked themselves out of this Fourth Amendment requirement and decided it was OK to make an exception.

When we're talking about access to keys, access to any information on the internet, we're talking about much more than just phone conversations. Now, we know that there are now phone conversations on the internet -- they're silly, but you can do it.

P: You think they're silly?

DW: Yeah.

P: Because?

DW: Because you have telephones. It's not free. But it's free if you have free internet access.

P: Or flat rate.

DW: In any case, it sounds bad. But there are also on the internet documents, stored information, and in traditional, in the Fourth Amendment context, if a law enforcement officer wants to come into your house and go through your filing cabinets and get some documents, the knock and announce requirement applies to that law enforcement officer; and if the police break into your house without presenting you the warrant, without knocking and announcing, and they don't have an accepted reason for doing that, then the search is not valid.

P: Do you think the obtaining of the key itself, at the moment the government gets the key, is a search ...(inaudible).

DW: Absolutely. I think a key is a very private, personal thing that has every bit as much privacy interest attached to it, uniquely, by itself, as a lot of other things that ...(inaudible) possess.

P: Like a diary.

DW: No, like a key. I mean, like a key.

P: If the Government were to, I don't know, take a picture of your front door lock, the lock is in plain view, and somehow, through examination of the picture, was able to take it to a locksmith and generate a key to your house, have they searched and seized you?

DW: This is a hard Fourth Amendment question that I don't think has a good answer. When you use magnification devices to intrude on private spaces, which, arguably, anything past the front of your lock is, you can't -- you don't necessarily get a plain view exception. You might, but you don't necessarily.

So I don't think it's just like that. And certainly, law enforcement cannot break into your house to get a key to your office because they need to get into your office. They have to ask you for it; it's your private property, that key.

P: But now, at the time you transmit the key to, let's say, the title company, the insurance company or something, I suppose that's not a Fourth Amendment search, unless you're saying they're an agent of the Government.

DW: Well, if you're compelled to, by law, which the F.B.I. proposal essentially does, it's a general search, which is also a violation of the Fourth Amendment. You can't tell every citizen in the country to undergo a search just because they might later be the subject of an investigation.

P: Of course, this law doesn't compel you, the phone user, to do it; it compels the phone manufacturer to see that the phone does it when you the phone user uses the phone.

DW: Right, it amounts to the same thing. I mean, that's worse, in my view, because you don't even necessarily have notice that that's happening. So it amounts to the same thing. It amounts to something that, I think, you have a privacy interest in being taken from you, being seized, in advance of any investigation, just in case there might be an investigation.

Now, it's not to say there aren't regulatory searches, that there aren't tax returns and things like that, that everybody has to do, but those kinds of requirements are not made lightly and have to meet a whole series of requirements. When police want to do a roadside checkpoint for alcohol, for DUI, they have to have a good reason for why they're placing the checkpoint where they're placing it. I mean, there's a whole lot of steps they've got to go through before they can do that. And this is the equivalent of sort of one big, you know, checkpoint, that everyone ...(inaudible).

Larry: One way of describing what they're doing, and I think what they're doing is something to really be upset about ...(inaudible); one way to describe it is they're requiring that the architecture of the internet be designed such that it's possible for the Government to search ...(inaudible).

Now, when the Government did that in the digital telephony act, it basically said that the architecture of the telephone network shall be such that we can ...(inaudible). One might say, why isn't the very same issue raised?

__: ...(inaudible)

Larry: Well, what they said was that they had to, they were going to ...(inaudible) from redesign.

__: They're not going to redesign the architecture. The original ...(inaudible) only said that the phone companies would be compensated to help the F.B.I. get things on the existing architecture, and then in the future. But the F.B.I. was originally not envisioned to be part of the redesign of the network. The redesign, in fact, was ...(inaudible) the network itself would still be designed solely by the phone companies.

__: That's fine, but whoever is doing the designing, what they're doing --

P: They have to design it for a certain purpose.

__: They have to design it for a certain purpose such that you can tap it.

__: But they don't.

DW: What you're getting to, I think, is the critical question, which is, is this a status quo proposal or not? Is this -- I mean, the digital telephony act, CALEA, as it's now called, was arguably a guarantee that law enforcement could maintain their status quo electronic surveillance capabilities, that people were still making phone calls but what was happening was that the network underneath was changing in ways that did, in fact, make it hard, and in some cases impossible, for the F.B.I. to do the kind of surveillance that they had been doing, that they'd been authorized to do for quite some time.

This, I think, is different, because, as you say, it's prospective, it's saying there are certain kinds of services that cannot be offered now. Interestingly enough, the F.B.I. proposal probably would make illegal ...(inaudible), the Secure Electronic Transaction standard that was just designed and is going to be used for credit card transactions on the internet -- it's a very heavily encrypted infrastructure and there's no key recovery in it, there's no way to get immediate access. You can eventually get access to a lot of the information as you go to the ...(inaudible) transaction, the same way that law enforcement does now.

So I think that's the critical question, is that this is the FBI -- and the Justice Department really does believe that what they're doing, or they say they believe that what they're doing is they are guaranteeing status quo ante -- What we think is that they're building a new surveillance network, and that's, I think, the crux of the policy question that's going to have to be figured out.

__: You spoke of the decentralizing nature of the internet in general. I'm concerned somewhat that with the possibility of encryption that cannot be broken by the government, unless they force them to perform ...(inaudible) on people, people with bad intent would be able to centralize relatively more power in their hands, since they would have the time, money and motivation to design the equivalent of, say, a house that the government could not break into, even if they knocked, announced and tried to beat the door down all they wanted.

DW: I think that there's no question that this technology gives new power to criminals, there's no question about that. I think that -- and that's a simple issue, I don't think anyone can deny that.

I think the more complicated issue is, where's the balance? What's happened to the overall balance between criminals and law enforcement?

And that's an even more complicated question. When, as you know, when you use the internet, you leave traces of what you do all over the place, in lots of different ways. And those traces, I think, are very powerful investigative tools for law enforcement to follow. It's true, you can try to cover them up, but the fact is it's very hard to cover them all up. The fact of the matter is that -- I'll just give you an example from my office, and hopefully we're not criminals, but there are things in my office that we don't put in e-mail, that we just don't want ever --

P: I thought you had no secrets.

DW: That's right. This is why we have no secrets, this is how we have no secrets. They never exist. But seriously, you know, the simple fact that we're talking about a shift from a communication environment where you have a bunch of conspirators sitting around the table, drinking coffee and talking about what the conspiracy is, that's very hard to ever get at, unless you have one of them turn, as an informant.

If they're sitting all around the world, drinking their own separate cups of coffee, or having a conversation by e-mail, that conversation is recorded in multiple places. It's recorded on the computer, it's recorded in the on-line services, there are lots of places to find that.

And the thing about encryption, remember, is that encryption ultimately has to be decrypted, otherwise it's just wasting a lot of processing cycles scrambling things up.

So even if it's just that there's a video camera placed up above someone's monitor to capture a picture of the conspiratorial e-mail message that they just got and had to decrypt in order to read, that's just one example of a way in which this technology actually creates a lot more evidence --

P: So this is a view that sort of says, let's just keep this a fair fight, let's make it so that the criminals can invoke whatever technological aid they have to protect themselves, and the Government can pay the NSA and the F.B.I. a lot of money to try and crack it on their own, and let the best organization, the best technician --

DW: I don't really think the Fourth Amendment is about a fair fight; I think the Fourth Amendment is about fundamental limits on Government power. And what's happening -- as was the case with digital telephony, as has now been the case with encryption, and in the case of a lot of other crime policy issues -- is that you have law enforcement coming and saying, we need to push that Fourth Amendment line towards more power for the Government and less protection for individual citizens, and we need to do it because there's x kind of crime that we can't solve, but there's some huge threat of this -- there's some reason to push that line ...(inaudible) public safety demands it.

And I think that's what most Fourth Amendment issues that we have, probably have ever had, are really about, is where is that line. And the line's always pushed based on ...(inaudible) that law enforcement has somehow diminished capacity or needs more capacity.

And so I'm just raising -- and I don't pretend to have the answer to this question of balance, but when law enforcement says x y and z technologies are eliminating our ability to conduct investigations, I think we also have to ask, what are those same technologies doing to enhance their ability?

I'll just give you one other quick example. Messages, even if the -- let's just take a conspiracy case, that's really fundamentally where big, bad crimes can ...(inaudible) these days, big drug crimes, big terrorist rings, tend to be conspiracies. That's why ...(inaudible).

And traffic analysis, the ability to look at who's communicating with whom, even if you don't know what they're communicating, because the content is encrypted -- I mean, you probably can't say what you do at the NSA, but the NSA spends the bulk of its time doing traffic analysis around the world, not trying to decrypt or intercept ...(inaudible). A huge part of what they do, to the extent that anybody really knows what they do, a huge part of what they do is about looking at who's talking to whom, and where a message flow is going.
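Weitzner's point about traffic analysis can be sketched concretely: even when every message body is unreadable, the pattern of who talks to whom falls out of the headers alone. The Python sketch below uses entirely hypothetical sender/recipient records; it reflects no real system or dataset, only the general idea.

```python
# Toy traffic analysis: the contents of each message are assumed to be
# encrypted and unreadable, but sender/recipient metadata is visible.
from collections import Counter

# Hypothetical intercepted headers: (sender, recipient) pairs only.
intercepted_headers = [
    ("alpha", "bravo"), ("alpha", "charlie"), ("bravo", "charlie"),
    ("alpha", "bravo"), ("charlie", "bravo"), ("delta", "echo"),
]

# Count traffic volume per unordered pair of correspondents;
# frozenset makes ("a", "b") and ("b", "a") the same link.
link_volume = Counter(frozenset(pair) for pair in intercepted_headers)

# The heaviest links suggest the core of the network, with no
# decryption needed at any point.
for link, count in link_volume.most_common():
    print(sorted(link), count)
```

The design choice worth noticing is that no cryptographic operation appears anywhere: the analysis runs entirely on routing metadata, which is exactly why encrypting message contents alone does not defeat it.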

P: Of course, some of the encryption is going to be encrypting the header information, so you route the packets in an encrypted way, and you just know it's disappearing into the ether?

DW: Yeah, but that's pretty hard.

P: This is the fair fight again. It's hard to do.

DW: But there's also not particularly much of a reason to do that. There's less of a reason to do that than to encrypt the contents of messages.

__: ...(inaudible)

DW: Right, that's right. But I mean, that's an infrastructure. I mean, that's where everyone who's sitting around the IETF has to decide that the next version of the IP routing protocols is going to totally encrypt all addressing information and you're never going to be able to find it.

And I just think that there's not very much demand for that kind of security, because there are not many legitimate uses of that kind of security.

Tim: Just from what you said, I don't see how there's a logical link between increasing law enforcement's power and decreasing Fourth Amendment protection. I mean, it seems like it might often happen, but is it -- I mean --

DW: I just think in this case. I think if you -- I think if you give law enforcement the ability to do DNA testing in a week instead of a month, I don't think that creates a Fourth Amendment problem. No, I didn't mean to say that there's a necessary --

Tim: But again, in this case -- so your argument for -- what does ...(inaudible), for example, give law enforcement? How does it move the line to less than Fourth Amendment protection, exactly, over wiretaps?

Is it just the scope of things?

P: ...(inaudible) a little bit of a fiction that says this key is a personal -- I mean, it's a string of numbers.

Tim: Just that question, how key escrow moves that line to less than Fourth Amendment protection, without them --

DW: Without a key escrow requirement, I could choose to either escrow my key or not escrow my key, I have a choice, and I'm going to choose that based on what my needs are.

For example, if I'm having an encrypted phone conversation with you I'm probably going to decide it's not worth the money to escrow the key to that conversation, because once we've had that conversation, it's over, and if I wanted to keep a copy I would have had a tape recorder running, I wouldn't go back and get the encrypted text and try to decrypt it, I just wouldn't do that.

But under the F.B.I. proposal, I have to keep a key, or a key has to be kept, somewhere; and it is then accessible to law enforcement.

And that key may do a number of things. That key may decrypt that particular conversation, which otherwise was not going to be accessible, by law enforcement or anyone else, without the key. And that key may be useful for other things. Maybe I use that key somewhere else. Maybe I've got a badly designed encryption system where that key --
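The worry Weitzner raises here -- that in a badly designed system one escrowed key opens far more than the single conversation a warrant covers -- can be illustrated with a toy cipher. This is a deliberately simplified sketch (SHA-256 abused as a keystream generator), not a real cryptosystem, and every key and message in it is hypothetical.

```python
# Toy illustration of key-reuse risk under escrow. Not a real cipher.
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from the key (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext against the keystream."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))


decrypt = encrypt  # an XOR stream cipher is its own inverse

# A badly designed system reuses one long-term key for every conversation.
escrowed_key = b"long-term key held by escrow agent"
msg1 = encrypt(escrowed_key, b"meeting at noon")
msg2 = encrypt(escrowed_key, b"shipment arrives friday")

# A warrant aimed at msg1 yields the key -- but the same key opens msg2 too.
assert decrypt(escrowed_key, msg1) == b"meeting at noon"
assert decrypt(escrowed_key, msg2) == b"shipment arrives friday"
```

A better-designed system would generate a fresh session key per conversation, so surrendering one key exposes exactly one exchange -- which is the distinction the discussion turns on.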

(simultaneous voices)

__: But you've moved your baseline out of the encryption. I'm saying as compared to a pre-encryption design --

__: Can I try ...(inaudible) That argument is that ...(inaudible) stuff, but there's an argument that minimization procedures are a lot better in this context.

So, for example, under Title III what you actually have to do is show probable cause, but it's actually more than that; for the police to get to your communication, they have to show that they're minimizing use of it, and some other things as well, in order to show that they're not ...(inaudible) problem.

So the issue here becomes not just the Fourth Amendment arguments, but you can also argue that they're statutory, because once you get the key, what he was just saying is, look, once you get the key to one piece of information, you might be able to get -- you might have that same key being able to decrypt a lot more information.

And so minimization procedures ...(inaudible) minimize the amount of information for ...(inaudible).

__: You think that creates a Fourth Amendment issue?

__: No. I'm saying that --

__: The statute, but I mean, do we really care if we can ...(inaudible) the statute, right?

__: I think we do care. Right now, there's no movement to change the statute. So I think what you're doing is ...(inaudible) F.B.I. proposed, and the issue of the Fourth Amendment ...(inaudible) how you treat the key, but if you're still treating the underlying information or the ...(inaudible) requirement, then in some sense I don't think it requires how you treat the key.

__: But you're saying it's not a Fourth Amendment issue.

__: It could be a First Amendment --

DW: The Fourth Amendment issue, I think, is hard to get at by making a comparison between the pre-encryption world and the post-encryption world, because what we're really talking about is the pre-internet and the internet world, a world where a lot of what we do, hopefully in private, or a lot of activities that would be accorded Fourth Amendment privacy interests under any analysis, are all of a sudden going to be accessible by law enforcement in a whole variety of ways.

So there is a shifting scale here, I agree with that, I think you're right to point that out. But those activities on the internet are activities which would have required traditional Fourth Amendment knock-and-announce access, and that's not what's being suggested by law enforcement here.

__: But ...(inaudible) still do things the old way, having a conspiracy around the kitchen table.

P: We actually have run out of time, I'm afraid. Dan, thanks so much for coming today.

(applause)

P: And as for next week, stay tuned to your --

(end of class)
