Pre-class Discussion for Jan 10

From Cyberlaw: Internet Points of Control Course Wiki

Revision as of 12:39, 10 January 2008

There's an interesting episode of the "Cranky Geeks" (great name!) webcast, hosted by PC Magazine columnist John Dvorak, that features Whit Diffie of public key encryption fame. It's a half-hour long, but it covers a wide range of the topics considered today and in this class as a whole. It is also fairly entertaining. At the end of the interview Diffie said he's proposing a goal of strict product liability for software in 10 years. See ref [1] --Tseiver 20:35, 9 January 2008 (EST)

  • One interesting topic discussed in this clip was whether encryption systems have a backdoor through which hackers can access the protected information. Dan Farmer suggested that the mathematics behind an encryption system is extremely difficult to understand, and that only a few people would be able to hack into it. Regardless, this should be a concern for publishers who want to use encryption programs to secure their copyrighted works: it will only be a matter of time before hackers discover the backdoor into the encryption. I found one article from 2000 discussing DVD encryption hackers posting the decryption code on the Internet (the same decision discussed in "Cyber Law Journal: Assessing Linking Liability"): DVD Hackers. My understanding is that the few are the ones we should be concerned about in these hacking situations. Generally, most computer users are not the ones developing software or decrypting codes to hack into systems; it is the few who make the decryption codes available to others. KStanfield 21:47, 9 January 2008 (EST)
  • Hearkening back to our discussions about defamation, the NY Times has an article today reporting that federal attorneys in LA have issued a subpoena to MySpace relating to the suicide of the teenage girl in Missouri. The investigation is to determine whether setting up a false identity online for the purpose of harassment can be defined as Internet fraud under federal statutes. See ref subpoena article --Tseiver 08:00, 10 January 2008 (EST)
  • Here are two other short NY Times articles directly related to trusted systems technologies. The first concerns an open-source authentication program being sponsored by Yahoo! See ref Yahoo! OpenID. The other reports on a panel discussion at the Consumer Electronics Show of ISPs and telcos regarding network-level packet filtering. See ref ISP filtering --Tseiver 08:12, 10 January 2008 (EST)


Zittrain: Technological Complements to Copyright

  • Mark Stefik’s “Trusted Systems (I)”

In this article Stefik argues that usage-rights language is “essential to electronic commerce: and the range of things that people can or cannot do must be made explicit so that buyers and sellers can negotiate and come to agreements.” While I agree that it is important for end-users to understand the fees and conditions of any particular trusted system they may want to connect with, is there really a negotiation? I think this has an interesting tie-in to Zittrain’s “The Future of the Internet,” which describes how generativity allows a variety of software to be built and content exchanged without anticipating what the market wants, and which worries that the harm arising from generativity will create a lockdown. However, I wonder if the opposite could be true as well: if end-users are unable to negotiate the terms of trusted systems, and trusted systems continue to lock down their products into very specific, single-use machines, then perhaps even more generativity will result. More and more unsatisfied end-users who are unable to participate in deciding the terms of their locked-down machines may be incentivized to think of alternative software or generative platforms to run their programs (or perhaps, as we’ve seen in many cases where the chances of being caught for legal liability under § 1201 of the DMCA are slim, to create code circumventing trusted systems’ lockdowns). Cseif 08:25, 10 January 2008 (EST)

  • Would more and more end users necessarily be unsatisfied? Might it be reasonable to expect that manufacturers (keeping a close eye in turn on lead users and their innovations) would push trusted systems just up to the intersection of enforceable security and demand, if for no other reason than that the vast majority of end users probably tend to be far more easily satisfied, less knowledgeable, and thus less likely than lead users to look for alternatives (even if they feel some vague impulse to do so, or would if they understood the extent of the lockdown)? Jhliss 10:59, 10 January 2008 (EST)
  • I was particularly thinking of the NY Times article describing the DeCSS code developed by hackers in Europe to decode DVDs: although the content of the code is illegal in the United States, and sites linking to the illegal content may be legally liable under the DMCA, it seems nothing was done about the code itself (perhaps because of unenforceability). The point above is very interesting, though, and I agree; I think it depends a lot on how the market reacts to consumer wishes and demands, and satisfied consumers would in turn be less incentivized to find creative alternatives. Cseif 11:15, 10 January 2008 (EST)

Cyber Law Journal: Assessing Linking Liability

  • Why are copyright holders (or those being defamed, etc.) going after the linker? Shouldn't they be more concerned about the destination site? If the answer is simply that the destination site is judgment-proof or that the linker has deeper pockets, such motivations seem skewed and out of line with the goals of copyright law. Why not view linking as a double-edged sword: although it provides surfers with access to forbidden information, it also provides copyright holders and others with free police work. The offended party can now go after the destination of the link. Cjohnson 10:37, 10 January 2008 (EST)
  • The linked site may be in a jurisdiction that doesn't recognize posting such information as a violation of copyright law; or it might be in a jurisdiction that doesn't recognize U.S. subpoenas, on an ISP that refuses to give out the name of the site owner. Going after linking sites would then be an imperfect way of stopping the flow of information, but in the eyes of the copyright holders, better than nothing. Eroggenkamp 10:45, 10 January 2008 (EST)
  • What happens with news organizations? Although they may not meet the last prong of Kaplan's test, couldn't news organizations still be liable for linking to pages with decrypting codes? If not, many users will be able to more readily access the codes and potentially infringe on copyright holders' rights. Mark Lemley's concern about subjecting too many people to liability is legitimate. Given Kaplan's decision, I do not see why organizations or companies like the MPAA will not distribute a number of cease-and-desist letters/e-mails to linkers, including news organizations. As the article states, following Kaplan's decision the MPAA sent approximately 100 cease-and-desist letters to linkers that it believed had "intentions" to distribute DeCSS. Furthermore, this still leaves the MPAA (and other copyright holders) subject to decryption by existing sites that do not meet Kaplan's good/bad links test. However, I agree with Kaplan's hesitancy to extend the ruling to linkers that do not intend to distribute infringing software. The chilling effect could be detrimental to the information available on the Internet. Additionally, restricting linkers' ability to post links to pages with valuable information can compromise a person's free speech rights. KStanfield 11:34, 10 January 2008 (EST)

Cohen: Copyright and the Jurisprudence of Self-Help

  • Can manufacturers of tethered appliances avoid privacy concerns via contract? If the consumer is bound by a EULA that states that the device might be shut off remotely (paralleling terms of service on Web 2.0 platforms - see comment by Jendawson below), it seems that there would no longer be a "reasonable expectation of privacy" and the controversy is avoided. Cjohnson 10:37, 10 January 2008 (EST)


Zittrain: FOI/Web 2.0

I think this article makes a good point about the dependence that Web 2.0 spawns, even if it untethers us from the physical boxes we used to rely on. Looking over the Facebook terms of service provided, the lack of recourse users have if their data is lost or the service is discontinued is obvious. I've recently started doing some productivity-type tasks online with Google Documents, and their TOS (which I just looked up!) are just as harrowing, if not more so. "You acknowledge and agree that Google may stop (permanently or temporarily) providing the Services (or any features within the Services) to you or to users generally at Google’s sole discretion, without prior notice to you. ... You acknowledge and agree that if Google disables access to your account, you may be prevented from accessing the Services, your account details or any files or other content which is contained in your account." In essence, the only guarantee I have that Google won't ruin all my stuff and delete my documents is their reputation. I may start doing more frequent exports of what I post there--Microsoft Word suddenly seems to have one big advantage. Jendawson 09:48, 10 January 2008 (EST)

  • I wonder how businesses, especially sensitive-information businesses like banks and credit card companies, deal with this when they use wildly successful SaaS business solutions such as Salesforce.com. Does the market/contracting take care of ensuring that data is secure and preserved in case the service is disrupted? I'm not certain of this, but I think Google intends to target businesses as well with its hosted productivity solutions, as a substitute for costly in-house IT. --Jumpingdeeps 10:24, 10 January 2008 (EST)
  • I'm betting that this contract is more CYA than something they expect to be perfectly enforced if, for some reason, they suddenly lose all of somebody's valuable data. I think of this as a parallel to, for example, an airline's frequent-flyer miles program. All of those agreements say something like 'this program may be modified or ended at any time,' but the companies would never do so without warning their customers, because they'd lose business. The main difference I see is the potential for outside interference; it would be very difficult for someone malicious to destroy an airline's frequent-flyer program (although if they could get at the database where it is stored, they could probably do some damage), but it would probably be much easier for a malicious hacker to destroy the data on the Google Documents server. We've already seen how easy it is for a random MIT grad student to write a virus that gives him root control over most of the computers in the world; computer security is better now than it was in 1988, but so are the hackers. I think the real danger in using trusted systems is not that their contracts of adhesion are skewed in their favor; it is that it's difficult to really trust that they won't be the subject of attack, particularly once they reach a size that makes them a tempting target. The obvious solution is to periodically load all of the shared documents onto an autarkic backup computer, but even that is imperfect if a document is itself infected with a destructive virus. Eroggenkamp 10:56, 10 January 2008 (EST)