Pre-class Discussion for Jan 16

From Cyberlaw: Internet Points of Control Course Wiki

Filtering

  • A very cool tool for checking whether a given URL is filtered in particular countries can be found at this link. Data and search from the good folks at ONI. As an example, "playboy.com" is blocked by the United Arab Emirates, Azerbaijan, Bahrain, China, Iran, Myanmar, Oman, Saudi Arabia, Sudan, Singapore, Syrian Arab Republic, Thailand, Tunisia, and Yemen. harvard.edu, on the other hand, is blocked only by China. Kp 01:01, 16 January 2008 (EST)
  • Also, the obligatory link to google.cn and google.com side by side image search. Kp 01:08, 16 January 2008 (EST)

Robert Faris & Nart Villeneuve, Measuring Global Internet Filtering

  • This article provides a quantitative analysis of various internet filtering methods used around the world. It maps out which countries are using which methods and (perhaps more importantly) what content is being filtered. (Mshacham 18:20, 15 January 2008 (EST))
  • Perhaps the most interesting part of this article is Table 1.6, which details the extent to which different types of content are filtered out by countries that block content providers. "Militant groups" accounted for only 1% of the filtered content, while "political parties" and blogs accounted for 19% and 20% respectively. (Mshacham 18:26, 15 January 2008 (EST))
    • The fact that blogs are the most blocked sources of content might be the best rejoinder to Andrew Keen. Jason 07:12, 16 January 2008 (EST)
  • There is an important means of filtering that the article seems to ignore: pressure on private companies. The article discusses how some countries block keywords in URL paths, a strategy that "most often affects search queries in search engines" (pg. 15). But a much more direct way of doing this is simply forcing the company that provides the search engine to filter itself. Google's corporate philosophy is to "make money without doing evil," but it nonetheless agreed to create a special, censored search engine for China. How does this sort of strategy compare to the more direct filtering strategies discussed in the article? (Mshacham 13:55, 15 January 2008 (EST))
    • This may relate to the discussion of the "Proposal" below, although the issue may be broader. A government could use private companies as a point of control by imposing direct regulation with the force of law (this seems implicit in the Proposal). But it may not need to do this, since economic incentives may suffice. (Mshacham 17:44, 15 January 2008 (EST))
  • Are there ever situations where filtering should be considered appropriate? Filtering of the sort discussed in this article would generally not be legally feasible in the U.S.: blocking political content would obviously violate the 1st Amendment, and while obscenity is not protected, attempting to block such content (which the authors categorize as "social") would run into the overinclusiveness problems we discussed earlier in the course. But suppose we had a hypothetical way of blocking only websites that provide nothing but obscenity, without affecting access to legitimate material; or suppose we had a way of filtering pure "hate speech" or defamation without in any way affecting political speech -- what would be the arguments against such filtering? (Mshacham 13:55, 15 January 2008 (EST))
    • It would be especially interesting to consider this in light of the comments of Danielle Citron, which we read for the previous class. If certain online environments "accelerate[] dangerous group behavior," and if such environments have a disproportionately negative effect on women (which the author compares to the effect that the KKK had on racial minorities), would it be acceptable for the government to attempt to remedy this problem through filtering (again, assuming you could get around the overbreadth problem)? Cf. Beauharnais v. Illinois. (Mshacham 13:55, 15 January 2008 (EST))
  • On a related note, how does the filtering discussed in this article compare/relate to state and federal laws that require internet blocking software in schools and libraries? (Mshacham 14:08, 15 January 2008 (EST))
  • A few technical questions
    • Can someone elaborate on the concept of an "international gateway"? This article focuses primarily on a point of control that is very familiar to us at this point (ISPs), but the notion of filtering at the "international gateway" seems different -- and perhaps more particularly relevant to the U.S. context. (Mshacham 13:55, 15 January 2008 (EST))
    • One fairly simple way to get around some of the more common filtering techniques, such as IP blocking and DNS tampering, would be to use a proxy server that allows for anonymous web browsing; if I understand it correctly, such services allow a user to browse other sites through the proxy site, thus circumventing any direct blocks. The article notes that fourteen countries use filters to block access to such sites (pg. 12). But isn't blocking all such proxies effectively impossible? That is, a country may block sites that are known to provide proxy services -- but can't any computer basically serve this same function? A person in the country that implements the filtering could connect directly to a computer that is outside that country, and then use that computer as a proxy to get whatever material he or she wants (see the sketch below). As such, all it would take is a friend in another country who is willing to help out. Am I understanding this correctly? (Mshacham 13:55, 15 January 2008 (EST))
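
To make the proxy idea concrete, here is a minimal Python sketch of routing a request through a friend's machine abroad. This is not from the article; the proxy address and URL are hypothetical placeholders, and it assumes the friend is running an ordinary HTTP proxy reachable from inside the filtering country. It uses the widely available requests library.

  # Minimal sketch (hypothetical addresses): fetch a page through a proxy
  # running on a friend's computer outside the filtering country, rather
  # than connecting directly, so local IP blocking and DNS tampering on the
  # direct route don't apply to this request.
  import requests

  PROXY = "http://friend.example.org:8080"  # hypothetical proxy abroad

  def fetch_via_proxy(url: str) -> str:
      response = requests.get(
          url,
          proxies={"http": PROXY, "https": PROXY},  # send traffic via the friend
          timeout=10,
      )
      response.raise_for_status()
      return response.text

  if __name__ == "__main__":
      print(fetch_via_proxy("http://example.com/")[:200])

The obvious counter, as the article notes, is for the filtering country to block the proxy's address once it is discovered -- which is why the cat-and-mouse question above matters.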

Proposal To Use Search Engines To Administer the Internet and Suggested Next Steps

  • I must admit, I'm rather confused about this piece. Neither the syllabus nor the paper itself lists an author (unless I'm completely missing something), so it's very difficult to put this in context. A search on "Ministry of Information Industry" reveals a Chinese government website, and ditto for the Market Administration Department, so my best bet is that this is a plan for effective internet filtering in China. If this is the case, then it connects well to the comment I made above, relating to how private companies can be used as a powerful point of control. If I'm wrong about China, then my only other guess is that this is intended as a parody -- a sort of reductio ad absurdum for internet control; the whole thing sounds ominously 1984-ish (the ministry names, and the stated goals of "control[ling] bad information" and achieving a "clean network"). (Mshacham 17:41, 15 January 2008 (EST))
  • As the title suggests, this piece is a proposed method for harnessing search engines and government oversight in order to cleanse the net of "bad information." The basic idea is that internet search engine companies will perform day-to-day monitoring of internet content (by scanning "all content in the websites to which it is connected"); when it discovers a "problematic link," it must immediately delete that link and "report the relevant particulars (including the name of the offending website, the internet address/IP address, etc.)" to the government. The search engine companies must coordinate among themselves and must meet regularly to share information (so that if one missed a site blocked by the others it can take corrective action). The government, meanwhile, will perform regular evaluations to see whether each search engine has met its goal of reducing bad content; the results are measured by the number of links that each search engine returns on a set of "bad" keywords. The ultimate goal, presumably, is that none of these keywords will produce any results in any of the available search engines -- a "clean" internet environment. (Mshacham 18:02, 15 January 2008 (EST))
  • How technologically feasible is this proposal? Each search engine is required to scan "all content" of every page to which it connects "in real time." Each company has the responsibility of installing whatever "monitoring equipment" is necessary for this and for "ensur[ing] that its own technical supervision platform has the ability" to perform this task. This would seem like an extraordinarily expensive system to implement; and even cost aside, would a search engine like Google, which is connected to billions of sites, be technically able to do this? (Mshacham 18:10, 15 January 2008 (EST))
    • It all depends on how "deep" the filtering is expected to go. If all this entailed was banning any URL, or even any site, that featured certain key terms (e.g., Falun Gong), then it would be fairly easy to implement, especially if the government helped in the creation of this sort of list (a minimal sketch of this shallow approach appears after this list). This of course would make the filtering very shallow, and it would be easy to get around such bans through the use of code words and the like. If the filtering were expected to go deeper, then they would either need a powerful AI or they would have to go extremely broad, generating a great many false positives and probably still missing a few sites. The most drastic measure, short of completely blocking all access to the internet, would be to go as broad as possible by banning all sites that aren't on a government-approved list. Anna 00:54, 16 January 2008 (EST)
  • The more important feasibility question is whether there will be sufficient time to complete the weekly reports by 3pm each Monday! Perhaps 4pm or 5pm would be better? Jason 07:24, 16 January 2008 (EST)
  • How does this method for filtering internet content compare to the ones discussed in the article above? Is it more or less effective? (Perhaps it is meant to work in tandem with the others, so this point may be moot.) I think one could get around this just as easily by using a proxy server (to connect to a search engine that is not part of the proposal). But perhaps the real power of this is that it makes it considerably more difficult for a person to find that proxy server (or any other problematic site) in the first place. Maybe we rely on search engines to such a great extent that a website that does not appear in Google is, for all intents and purposes, "dead," even though it can still be accessed. (Mshacham 18:17, 15 January 2008 (EST))
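
Picking up on Anna's comment above about how shallow keyword-based filtering would be, here is a minimal Python sketch of that approach. The blocklist terms and sample results are invented for illustration and are not drawn from the Proposal itself.

  # Shallow keyword filtering: drop any search result whose URL or page text
  # contains a term on a (government-supplied) blocklist. Terms and sample
  # data below are invented for illustration.
  BLOCKLIST = {"banned-topic", "forbidden-term"}

  def is_blocked(url: str, page_text: str) -> bool:
      haystack = (url + " " + page_text).lower()
      return any(term in haystack for term in BLOCKLIST)

  def filter_results(results):
      """results: iterable of (url, page_text) pairs from the search index."""
      return [(url, text) for url, text in results if not is_blocked(url, text)]

  sample = [
      ("http://example.com/news", "an ordinary news story"),
      ("http://example.org/banned-topic", "a page about a banned-topic"),
  ]
  print(filter_results(sample))  # only the first result survives

As Anna notes, code words defeat this immediately, which is why anything "deeper" pushes toward either heavy over-blocking or a whitelist of approved sites.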

Jurisdiction

JZ, Jurisdiction in Cyberspace

Exercising jurisdiction requires an analysis of three different things: personal jurisdiction, choice of law, and effective enforcement of judgments. From an American perspective the first two seem relatively straightforward. Personal jurisdiction is what we learned in civ pro, and conflicts of law, while complicated, are still governed by traditional legal principles. Where things get dicey is in the enforcement area. Kp 00:36, 16 January 2008 (EST)

Enforcement of Judgments

For US companies, at least currently, a lot of policy seems to be decided based on where companies geographically locate hardware and personnel. For example, Microsoft has servers and employees in China and as a result doesn't have all that much choice about following Chinese law. On the other hand, Facebook seems to have a policy of not locating any servers in China to avoid this problem. I'm not sure that there is any way around this. So long as a company has assets in a geographic location, it is going to have to follow the laws there. While it might want to try to petition the local government for a certain legal process to be followed, in the end, if the local government wants something badly enough and you have a physical presence there, you aren't going to be left with any alternative but to comply. Kp 00:36, 16 January 2008 (EST)

  • While Sealand might have seemed like a neat idea, I think it failed for two main reasons. 1) It was still dependent for internet access on an ISP that was in turn subject to the control of a traditional government. 2) Governments don't seek to coerce entities through direct force against the data; rather, they seek coercion against the individuals in control of that data. For anyone who doesn't actually reside on Sealand, it thus provides no protection from the power of the state. Kp 00:50, 16 January 2008 (EST)

Ideal Extent of Jurisdiction

There is a debate about the appropriate extent of jurisdiction for governments. While we might think that an international free community is ideal, that isn't reality and doesn't seem likely in the future. To the contrary, while the promise of the net might have been international freedom, we are seeing that the net is becoming more and more geographically discrete and that states are becoming capable of even more regulation of it.

  • While this balkanization might seem to be a bad thing, I'm not sure that it is. By dividing the internet into different neighborhoods we allow local groups (admittedly defined by current physical boundaries) to set their own norms about what behavior is acceptable. As a result we can avoid a race to the bottom and, for example, European speech restrictions don't have to be applied to speakers in the United States. Of course the flip side of this is that we aren't able to force our ideas of free expression into places where we might think they should be (e.g., oppressive regimes). Kp 00:42, 16 January 2008 (EST)
    • The downside to this is that the restrictions of any one country will be compromised at best and futile at worst, even if they are for beneficial purposes rather than repressive ones. For instance, if defamatory information about an individual is removed from a site or de-linked from a search engine in the US, that would have only limited effect, since the material will still be available in other countries. While the reputational damage thus caused might be slight for the majority of Americans, since their reputations don't extend beyond the border, there will still be a subset adversely affected by this with absolutely no remedy. Anna 01:17, 16 January 2008 (EST)
      • I'm not sure of that. It seems that if the content at issue isn't accessible in any way that can cause the damage, there isn't any harm. If defamatory material is inaccessible in Germany but accessible in the United States, who is harmed? Germans have decided that their society is better without this content, while Americans have decided that our society is better with it. Seems like a win-win for both polities. There might be single individuals who are harmed by this policy, but I'm not sure how it is any different from the fact that any law passed will end up having negative consequences for someone. Kp 01:49, 16 January 2008 (EST)

I think there is one other interesting issue that wasn't addressed here: how much of this identification technology do we actually encourage as consumers? For example, as consumers don't we want websites to know where we are so that they can better deliver us content/services/goods? This sort of technology seems to mirror some of the privacy issues: we love it when it helps us, but we still get scared by its potential. Kp 01:42, 16 January 2008 (EST)
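
For a sense of how little "identification technology" a website actually needs, here is a rough Python sketch of the kind of server-side geolocation lookup a site might run on a visitor's IP address. It uses the real geoip2 library from MaxMind; the database file name and the policy example in the comments are placeholders, not anything from the readings.

  # Rough sketch: look up a visitor's country from their IP address using a
  # local GeoIP database. Database path and sample policy are placeholders.
  import geoip2.database

  def country_for_ip(ip: str, db_path: str = "GeoLite2-Country.mmdb") -> str:
      with geoip2.database.Reader(db_path) as reader:
          return reader.country(ip).country.iso_code  # e.g. "US", "DE"

  # A site could then tailor (or restrict) content by country, e.g.:
  # if country_for_ip(visitor_ip) == "DE":
  #     serve_german_language_version()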

Net Neutrality

Chris Yoo vs. Tim Wu, Keeping the Internet Neutral?

  • The "Bits" technology blog at the NY Times featured a debate between Tim Wu and Rick Cotton, the GC of NBC Universal, regarding ISP filtering. The post, along with one about copyright from the day before, can be found at Bits debate on filtering --Tseiver 08:46, 16 January 2008 (EST)
  • Below I've provided a summary of Yoo and Wu's back-and-forth. I think it will be easier to post questions or comments in response to each of their sub-points instead of taking the argument as a whole.--Will 02:21, 16 January 2008 (EST)

Yoo 1

  • Yoo argues against government-enforced network neutrality, calling the current less regulated environment “network diversity.” --Will 02:29, 16 January 2008 (EST)
  • Yoo says that not every bit should be treated equally because some bits are particularly time-sensitive (e.g., streaming video), while others are less so (e.g., e-mail). Allowing different networks to offer different services – some that specialize in less time-sensitive delivery at a lower cost and some that are faster at a premium – gives consumers greater choice.--Will 02:29, 16 January 2008 (EST)
  • Finally, with the rapid changes in network uses, government regulation should be put off until it is actually shown that consumers have been harmed.--Will 02:29, 16 January 2008 (EST)

Wu 1

  • Wu agrees with Yoo that there are permissible types of discrimination such as dealing with congestion or offering different types of networks. However, there are also types of discrimination that harm the consumer yet are beneficial to the ISP, such as blocking VoIP when it competes with an existing ISP service. Wu doesn’t think that ISPs or consumers will properly discount the blocking of VoIP services in that example because companies don’t tend to “chang[e] business models and establish[] new consumer pricing patterns…” Wu says that VoIP blocking is not a far-off hypothetical but has occurred in the US and Mexico.--Will 02:29, 16 January 2008 (EST)
  • Wu notes that blocking is anticompetitive and deters new market entrants and new technologies. However, Yoo says that former FCC Chair Michael Powell declared such blocking illegal.--Will 02:29, 16 January 2008 (EST)

Yoo 2

  • TCP/IP delivers bits on a “first come, first served” basis, and it does not guarantee that every packet will be delivered (“best efforts”). While this might be acceptable for some services – the missing pixel in Britney Spears’ eye might not bother you – other services require faster delivery with greater accuracy (analogizing to USPS vs. FedEx). This is access tiering: charging websites to deliver their content with premium service (a toy contrast between the two delivery models appears after this list).--Will 02:29, 16 January 2008 (EST)
  • Antitrust jurisprudence has only prohibited outright those practices that are always anticompetitive and has applied the “rule of reason” to practices that can be beneficial but may not always be so, which requires challengers to the practice to show concrete competitive harm. Thus, the type of VoIP blocking Wu posed may be found to be illegal per se, and so does not require a broader legislated neutrality rule.--Will 02:29, 16 January 2008 (EST)
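
A toy contrast, not drawn from the debate itself, between neutral first-come-first-served delivery and the prioritized delivery that access tiering would allow; the packet labels and priorities are invented.

  # Toy contrast between neutral FIFO delivery and prioritized ("tiered")
  # delivery. Each packet is (priority, arrival_order, label); a lower
  # priority number stands for a paid premium tier. Entirely illustrative.
  import heapq
  from collections import deque

  packets = [
      (1, 0, "video frame (premium)"),
      (2, 1, "e-mail chunk"),
      (2, 2, "web page"),
      (1, 3, "video frame (premium)"),
  ]

  # Neutral network: serve strictly in arrival order.
  fifo = deque(sorted(packets, key=lambda p: p[1]))
  print("FIFO order:       ", [p[2] for p in fifo])

  # Tiered network: premium traffic jumps the queue (ties keep arrival order).
  tiered = list(packets)
  heapq.heapify(tiered)
  print("Prioritized order:", [heapq.heappop(tiered)[2] for _ in range(len(tiered))])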

Wu 2

  • Wu says he is more skeptical of large firms and their anticompetitive impulses because he believes innovation is driven by new market entrants, not incumbents, and so “the challenge is to bar the worst abuses without destroying an incentive to become an incumbent in the first place.”--Will 02:29, 16 January 2008 (EST)
  • Wu fears that if one company has a monopoly over broadband in a given area, it can choose various services to work more efficiently on its network thus blocking out competition from other content providers. Thus, one monopoly builds on another.--Will 02:29, 16 January 2008 (EST)

Yoo 3

  • Yoo again frames the issue in antitrust law terms as one of vertical integration between content and access providers. Antitrust law has evolved since the 1970s to be less hostile to vertical integration, so long as the integrated unit doesn’t hurt competition. This brings Yoo to his main point: we should not be too worried about competition in content and applications, which is a highly competitive market; we should be worried about last-mile ISP competitiveness.--Will 02:29, 16 January 2008 (EST)
  • Yoo says that the Brand X case showed that content and application providers did not have a right to cable or DSL access. The result was that the big content companies invested in creating their own last-mile services (or substitutes). Ensuring this type of market entry would solve the problem at which net neutrality aims.--Will 02:29, 16 January 2008 (EST)
  • Finally, Yoo questions the administrability of net neutrality between so many network and content players.--Will 02:29, 16 January 2008 (EST)

Wu 3

  • Wu takes on Yoo regarding market entry for last-mile service. He says that it is the classic economic problem of building infrastructure: high fixed cost and very low marginal cost. This has deterred new market entrants such that most homes have at best two last-mile services, cable and DSL/phone.--Will 02:29, 16 January 2008 (EST)
  • There are three solutions to the infrastructure problem: (1) have a central authority build it (e.g., the government builds highways), (2) subsidize infrastructure, or (3) have the government do nothing and hope the market takes care of it.--Will 02:29, 16 January 2008 (EST)
  • Wu doesn’t provide guidance on which solution he would choose but, it seems, in the interim he would deal with a potential consequence of the infrastructure problem – temptation to discriminate in network delivery – by requiring net neutrality.--Will 02:29, 16 January 2008 (EST)

Yoo 4

  • Yoo just seems to disagree with Wu on the ability of new participants to enter the market for last-mile service. The cost of deploying last-mile networks has dropped, he argues, and spectrum can be repurposed if telecoms fail, such that real entry cost (that is, risk) has gone down. Also, while the cost of deployment has gone down, the value of the service that the last-mile provider offers to the consumer has gone up.--Will 02:29, 16 January 2008 (EST)

Wu 4

  • Wu cites Verizon’s FiOS service – estimated at a cost of $2500/household – as an example of the high entry cost for last-mile service.--Will 02:29, 16 January 2008 (EST)
  • He poses two final questions/ideas:
    • 1. Will the new entrants come onto the scene quickly enough? Will they cover all of our country?
    • 2. Net neutrality won't cost last-mile providers a lot of money relative to their size, but it does make a difference to start-up application companies.--Will 02:29, 16 January 2008 (EST)
  • Since the argument between Yoo and Wu seems to boil down to the actual ease or difficulty of market entry for last-mile providers, here are a few links to information about recent or attempted entrants:
    • Sprint WiMAX
    • Ricochet -- an older wireless service that my family had at one time.
    • Google potentially bidding on wireless spectrum.
    • Cell phone companies such as Verizon.
    • Motorola tried Powerline Broadband.
    • Satellite Internet access.

--Will 03:56, 16 January 2008 (EST)

General Reaction

  • I was shocked that Yoo pointed to the NFL Sunday Ticket as a positive analogy for a non-neutral net. He pointed out that DirecTV offers customers exclusive access to the NFL Sunday Ticket, a premium service you can subscribe to so you can watch all NFL games every week. He analogizes this to the ability of ISPs to differentiate their products by offering certain content at extra-fast speed (or normal-speed access to content other ISPs are slowing). The Sunday Ticket exclusivity, and its baseball equivalent, are very unpopular with sports fans, in part because satellite is not available everywhere and in part because of the high up-front costs of switching to satellite (examples [1] [2]). Doesn't this example actually help undermine Yoo's argument? If users live in a market where none of the (typically two or fewer, as Wu points out) high-speed ISPs offer reasonable access to the content the consumers desire, the consumers will be in the same boat as football fans who can't get satellite - unable to get the content they want. Dankahn 01:13, 16 January 2008 (EST)


DoJ, In the Matter of Broadband Industry Practices

Initially, I’m most struck by how DOJ’s antitrust lawyers frame net neutrality so squarely within antitrust jurisprudence. In other words, they can’t see it as anything but an antitrust issue and, accordingly, think the FCC should step off their turf (I think we can see this in DOJ’s strong suggestion that the FCC not “substitute special economic regulation of the Internet for free and open competition enforced by antitrust laws”).

The application of the “rule of reason” requires that the private action not always be anticompetitive. This seems to come back to Yoo’s challenge to Wu to show actual harm instead of conjecture. Again, we can see this in DOJ’s language (calling net neutrality a prophylactic rule, and consistently putting the term ‘net neutrality’ in scare quotes), and we can see this in DOJ’s list of potential harms caused by net neutrality.

The potential harms DOJ lists include:

  • NN would prevent networks from managing their traffic efficiently.
    • Wu concedes that this is a legitimate function of ISPs.
  • NN would threaten consumer choice because it would prevent differentiated products (and, presumably, pricing).
    • Wu responds that last-mile providers form a duopoly and are not very susceptible to the competitive forces that drive product differentiation.
  • NN would shift the cost of network upgrades to all consumers instead of the particular content providers that sought to use the higher-grade network capacity.
    • This one makes some sense to me. Let’s assume there is a high cost to upgrading the network, and the primary use for the upgraded network is for a limited set of high-end content providers to deliver bits to a limited number of high-end consumers. This scenario poses two difficult questions: (1) How can we assign costs to those who benefit from the upgrade the most? Implicit in this question is that it is fairer or more market-oriented to assign costs to those who benefit; my grandma who only checks her e-mail shouldn’t have to pay for the spammer or download junkie who lives in her neighbor’s basement. (2) How do we assign costs between the high-end content provider and the high-end user?
    • A simple way to assign costs fairly is to charge high-end content providers – they’re easily identifiable. Another simple solution [I say this without much tech knowledge, so bear with me] is to prevent any user on the network from accessing high-end sites without paying the ISP a fee. It would seem that NN knocks out both of these solutions. An alternative (without sticking it to my grandma) is to come up with some proxy for high-end use and to assign costs accordingly. For example, one could imagine ISPs transitioning from an all-you-can-eat payment system to tiered access pricing based on how much data one downloads (a rough sketch of what that might look like follows below). Cell phone providers are in a similar situation (high fixed costs, low marginal costs, different usage patterns) and charge by the minute or by the data increment. Thoughts?--Will 03:32, 16 January 2008 (EST)
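
Here is a rough sketch of what the tiered, usage-based pricing suggested above might look like; the tier boundaries, prices, and overage rate are invented purely for illustration.

  # Usage-based ("tiered") pricing instead of all-you-can-eat. All numbers
  # are invented for illustration only.
  TIERS = [          # (monthly GB included, monthly price in dollars)
      (5,   20.00),  # light use: e-mail and a few web pages
      (50,  40.00),  # typical household
      (250, 70.00),  # heavy streaming and downloading
  ]
  OVERAGE_PER_GB = 0.50  # charged beyond the largest tier

  def monthly_bill(gb_used: float) -> float:
      for cap, price in TIERS:
          if gb_used <= cap:
              return price
      largest_cap, largest_price = TIERS[-1]
      return largest_price + (gb_used - largest_cap) * OVERAGE_PER_GB

  for usage in (2, 30, 400):
      print(f"{usage:>4} GB -> ${monthly_bill(usage):.2f}")

Under a scheme like this, the grandmother in the example pays the bottom tier while heavy users bear most of the upgrade cost, without the ISP having to inspect or prioritize any particular content provider's traffic.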

Optional Material

ONI, Pulling the Plug

I'm not sure that I agree with the conclusion put forth here, namely, that this is evidence of irreversible gains in the power of IT to get information into and out of a country in a time of crisis. Rather, I was left feeling that this shows that the internet isn't really free and that local governments can and do exercise substantial control over it. While this may have allowed some information to get out to the world more readily, I don't see how it gets information to the people any more effectively than traditional methods (think De Gaulle using the BBC to broadcast into France during WWII). Kp 01:38, 16 January 2008 (EST)

Other Links

  • I'd add this to the home page, but it's locked for editing. I thought this might be of interest: a man in Vermont refuses to give police the password to his PGP-encrypted hard drive, claiming a Fifth Amendment right against self-incrimination. See the washingtonpost.com story here. kim 00:42, 16 January 2008 (EST)