Paradigms for Studying the Internet

From Technologies of Politics and Control

January 31

Before we can even begin exploring the whos, whats, and whys, we need to answer the critical question of how. Indeed, the phrase "studying the web" could embrace a staggering world of possible routes to explore, even before we begin to examine the web's relationship with society and culture. We need something to guide us through this massive field of (very interesting!) rabbit holes, and to link the ideas we encounter into a coherent whole. We need some kind of structure that allows us to understand what we are looking at, the same way a chemist thinks in terms of atoms and molecules, or a philosopher in terms of schools of thought.

This class will explore different frameworks for studying the web, which will structure both the discussion and topics covered in the course and the methodology you should apply to your assignments.


Readings


Optional Readings

Videos Watched in Class

Links

For people interested in a more technical primer on the architecture of the web, how email works, etc., check out Ethan Zuckerman and Andrew McLaughlin's Introduction to Internet Architecture and Institutions.

Class Discussion

Lessig’s view on the “Principle of Bovinity” — that a mass can be steered toward a way of being or thinking — is apt. However, internet regulation differs from the past examples (i.e. seatbelts, discrimination, drugs, etc.) in one inherent property: connectedness. You may steer groups toward a thought or way of being through whatever means (architecture, laws, norms, market) because, after a while, they become disconnected from the original pioneering idea. With regards to the internet, though, there seem to be too many means of communication: the original ideas will continue to be re-introduced, and hence the mass will never forget. Mvalerio 17:06, 31 January 2012 (UTC)


I find it very interesting to observe Lessig's constraints in action. I also find some of the examples troubling, and the likelihood of feedback loops disturbing. Take, for example, privacy policies and user license agreements. As people get used to the idea of having information online, on Facebook or the now largely irrelevant MySpace, internet privacy becomes less of a priority. The social norm becomes a lower expectation of privacy. Since few people thoroughly read privacy policies anyway, a company can start putting things in them that are not in the user's best interest. This becomes the market norm, because it gives a company a slight competitive edge. Because it helps them become better integrated, companies now start encoding this lower level of privacy into their applications. Users may figure out what is happening, but because it falls less far outside their societal norms, consumers shrug, say everyone else is doing it (market norm), suppose that things might not work the same otherwise (architecture), and change their societal norms. Meanwhile, laws are kept nonrestrictive, because the companies make money from this level of expectations (market constraints acting on legal constraints), there is little pressure from consumers (societal constraints), and it would be difficult to figure out how to regulate anyway because of how things are programmed (architecture).

I'm not saying that this pattern has to happen, but it certainly can. As a change gains momentum, it can become very difficult to stop. I wonder if there is any particular constraint that is more or less likely to change the others.

BlakeGeno 05:08, 31 January 2012 (UTC)



Not sure this is exactly how we're supposed to use this section, but I thought I'd post a few of my thoughts on the readings for comment and discussion. I'd love any criticism, since I don't have much of a background in these areas:

The point in Benkler about Lessig's "Principle of Bovinity" highlighted one of the most interesting aspects of the readings to me: How easy it is to fall into the trap of oversimplifying and assuming that a lot of these issues of control and technology boil down to what will be a recognizable total victory for one side and total defeat for the other. The ratio of creators of technology to consumers does not have to become fully in favor of the business/government creators in order for them to win; it just has to get close enough, resulting in a herd of consumers who are unable to break out of their pattern of accepting technologies exactly as they are when purchased. This reduces the generativity (from Zittrain) that can produce the sort of unanticipated evolutions and improvements that drive technology and innovation forward, often in ways that are more to the benefit of the public than to the groups who are trying to impose their specific type of control.

Indirect and hidden regulation of how these technological resources are used is troubling not only because it (obviously) attempts to stifle this sort of unintentional and often subversive innovation, but also because it (less obviously) has a major effect, one we often cannot easily perceive, on the forces that control our interactions and creations online (law/norms/architecture/market/etc., from Lessig). None of those forces exist in a vacuum. So not only is one possible outcome of the current technological/cultural upheaval a victory for the strictly control-oriented creators, in which the majority of people become simple consumers; it is also quite possible that many will not even notice the driving factor behind that change. As someone who isn't very adept at using the internet as a creator anyway, that was a troubling conclusion for me.

On a mostly unrelated note, the other most interesting aspect of the readings for me came in Zittrain, where he noted that while the PC and Internet are almost endlessly adaptable, we have often been dealing with problems by making them relatively smaller, not by actually solving them (his examples being increased bandwidth for ISPs to deal with spam and more computing cycles for PCs to deal with malware). This reminded me of a presentation I just saw by the head of the EPA for the New England region. He pointed out that for the last 30 years or so, the EPA was solving problems with total disregard for sustainability. If water was dirty, create a water plant to purify it, and who cares how much power that takes! Only in the last few years has the realization really taken hold that we are unsustainably wasting energy, even in the pursuit of worthwhile results. Now, for example, some purifying plants are being powered by solar, not coal, with the same direct results (pure water) and much better side benefits (no pollution, an endlessly renewable resource). Early on in any new field or technology, it is easy to simply minimize and defer issues, but that practice is never sustainable, and a day will come when specific problems actually have to be solved. It would be nice if we could all do ourselves a favor and reach that point voluntarily for internet/computing, and not through necessity. AlexLE 21:03, 28 January 2012 (UTC)


I really appreciated Benkler’s exploration of how the "industrial producers of information" have a vested commercial interest in controlling information and communication at the expense of the commons, and how the US Government’s focus has been on restricting freedoms as opposed to upholding rights (Lessig). To appropriate language from the 9/11 Commission Report [1], many Internet users believe the “need to share” trumps the “need to know”, which is disruptive to the off-line status quo that clearly delineates between producers and consumers.

The idea of “need to share” also has important tie-ins to innovation and collaboration, two benefits of an unrestricted (or minimally restricted) Internet that are currently threatened. Benkler questions the concept of innovation based on exclusive rights as opposed to innovation from commons- and services-based sources, and the legal framework for valuing the former over the latter. Additionally, to mash-up Zittrain and Lessig, the features and benefits of a “generative” system - leverage, adaptability, ease of mastery, and transferability - can be seriously constrained by the interaction of law, social norms, the market, and architecture on users.

All of this leads me to wonder why more companies don’t see disruption as an opportunity rather than a crisis. The marketplace is clearly communicating to them that their current model is headed for obsolescence and that this might be a great opportunity to build on some of the ideas and innovations already floating around. It’s also interesting to me that the US’s “capitalist” system seems more willing to protect current industry leaders than to step back and permit a true competition over ideas and users in the marketplace.

@AlexLE Thanks for bringing up the insidious impact that a lack of accountability/transparency can have on "law/norms/architecture/market." There were so many great topics in this week's readings - looking forward to discussing them in class. Aditkowsky 19:48, 29 January 2012 (UTC)

Once again, I enjoyed reading Zittrain’s discussion of generativity, which he defines in terms of “unfiltered contributions” from a wide range of diverse audiences leading to “unanticipated change”.

Especially interesting was his discussion of the “generative pattern,” which demonstrates how an idea, such as the WordPress blogging software (which I use daily), can start in relative obscurity. When first launched, it was only partially developed, yet it was put out on the internet for others to use (and fix later, another application of the procrastination principle). As with other open source projects like OpenOffice, contribution to the WordPress community is encouraged, which results in even greater use. According to the WordPress developers' site, updates to the software are made nearly every day.

As with other generative software, when WordPress was launched, no one could have foreseen the “unanticipated change” brought about by the widespread use of this blogging software. But, as with other generative technologies, people new to the neighborhood have started abusing the "openness" of the system. One example is the widespread use of WordPress software (free, easy to use) to host spam sites and scraper websites. To help curtail these abuses, the WordPress community created an easy form for people to report spammers (http://en.wordpress.com/report-spam/). This is just one example of the WordPress community’s attempts to curtail abuse. Another issue frequently discussed on WordPress forums is the unreliability of many plugins created by amateur developers. Trusting users (such as myself) have discovered the hard way that installing a WordPress plugin can have a catastrophic impact on your blog (the dreaded socket error). Undoubtedly, following Zittrain's generative pattern, there will be increasing movement toward “enclosure” in the WordPress community to prevent such abuses going forward.

Lessig’s discussion of the four types of regulation and their interdependence was also enlightening. Clearly, undesirable behavior on the internet can be curtailed by more than one means. Of course, this brings SOPA and PIPA legislation to mind, making one wonder if there is a better way to attain the same goal without directly threatening punishment for the undesirable behavior.

Perhaps an indirect approach, for example altering the social norm, could be just as effective. For many Americans, sharing software and music is perfectly acceptable behavior. People like to share things they enjoy with other people. However, if doing so becomes unacceptable in the eyes of most Americans, overall behavior will start to change. Clearly, this is a complex issue, but Lessig does an excellent job of communicating alternatives to direct legislation, which often has consequences reaching far beyond the law's original scope. Joymiller 02:37, 30 January 2012 (UTC)

Benkler’s essay on laws that regulate the internet is interesting in the sense that our behavior on the internet can be seen as a reaction to regulations. We naturally believe our online behavior is based on our own desires and consumer needs, when in fact it can be a function of the copyright infringement laws imposed upon us. Benkler indicates that “laws do affect human behavior by changing the payoffs to regulated actions directly” and that “they also shape social norms” (386). The jaywalking example effectively demonstrates how a variety of other behaviors can be affected by a law.

When I look back at major changes in my online behavior as a result of copyright laws, iTunes comes to mind. Naturally, I have been concerned about any ramifications of illegally downloading software. Copyright laws effectively required end users to purchase entire albums, which can restrict access to newer, lesser-known music; Apple responded to these laws by providing a legal service that allowed individual songs to be previewed and downloaded. The nature of our internet behavior, particularly internet music exploration, changed, all because of internet laws. How we view music and the internet now is fundamentally different than before iTunes. Jimmyh 17:50, 30 January 2012 (UTC)


All of the readings for this week were very interesting and got me thinking about how some of the concepts cited in the works can be tied to my everyday usage of certain programs. At one point in the Benkler article, the author seems to paint a symbolic picture of two entities in perennial contrast: the big industries, or "giants", and the opposition formed by the majority of the population. On one hand, the big industries are trying to protect their economic interests; on the other, the majority of the population is trying to resist this forced regulation. It also seems to me that when something new is discovered, people who learn about such an unregulated and innovative feature try to exploit it to their own benefit. After some time, the founders or industries realize that they have to protect their "money making machines" from public, unregulated usage; laws are therefore passed, patents are filed, trademarks are extended, and so on. I personally recall what was done to fight, or at least limit, piracy in the music industry a few years ago. Once, anyone was able to illegally download music from programs such as "eMule" or "BearShare", where the chances of actually getting caught were very small, even though these actions violated certain federal laws enforced by the F.B.I. What happened afterward was that illegal downloading, which violated the law and also carried the risk of contracting computer viruses, was transformed into a legal process where songs could be bought for a dollar or so on specific programs, the most famous being "iTunes". This allows artists to make some profit from their work while letting people buy songs at a cheap price and, most importantly, in a legal manner.
As for "sharing", I once heard from a friend that "if you upload material on a specific program, you are allowed to download other stuff without facing legal charges", but then the question is: how does someone regulate that "material sharing"? It was very easy for anyone to just download without uploading anything in return. But returning to the authors, the most interesting part for me was Lessig's "principle of bovinity", as described by Benkler: the principle of controlling a large number of people, or metaphorically a herd of animals, with few resources and very few rules, though constantly enforced. In conclusion, I personally think that the whole process of protecting and regulating the usage of a given site or program is part of a never-ending cycle: someone creates a program, it is used and abused, laws and protections are established in response, those are eventually overcome, and the whole cycle starts again. Perhaps this is a pessimistic view of how things actually function, but my feeling is that this process is an eternal battle of virus versus antivirus. Emanuele 18:56, 30 January 2012 (UTC)