Paradigms for Studying the Internet
January 31
Before we can even begin exploring the whos, whats, and whys -- we need to answer the critical question of how. Indeed, the phrase "studying the web" could encompass a staggering range of possible routes to explore, even before we begin to examine its relationship with society and culture. We need something to guide us through this massive field of (very interesting!) rabbit holes, and to link the ideas we encounter into a coherent whole. We need some kind of structure that allows us to understand what we are looking at, the same way a chemist thinks in terms of atoms and molecules, or a philosopher in terms of schools of thought.
This class will explore different frameworks for studying the web. These frameworks will structure the discussion and topics covered in the course, as well as the methodology you should apply to your assignments.
Download this week's slides (PDF)
Readings
- Yochai Benkler, The Wealth of Networks (Read pages 379-396. The rest of the chapter expands the discussion of each layer in more detail, if you want to read further.)
Optional Readings
Videos Watched in Class
Links
For people interested in a more technical primer on the architecture of the web, how email works, etc., check out Ethan Zuckerman and Andrew McLaughlin's Introduction to Internet Architecture and Institutions
Class Discussion
I was reading The Whale and The Reactor: A Search for Limits in an Age of High Technology and found the theme of the book very relevant to the topic of this class. In The Whale, the author tries to build a philosophical framework for exploring the relationship between technological change and political power.
Zittrain's article on generativity is a celebration of the limitless creative possibilities associated with the Internet and PC technologies. I agree with his assessment and appreciate his analysis. But I think the article manifests a form of naive technological determinism. It focuses narrowly on Net technology and its social impacts, but fails to look behind technical devices to see the social circumstances of their development, deployment, and use.
So I think Lessig's and Benkler's theories are extremely useful for understanding the interacting forces that shape society and the Net. I like how Lessig points out the four forces that help regulate cyberspace (law, the market, norms, and architecture).
Germaine (You Hwa) Hsiao (YHHsiao)
Lessig’s view on the “Principle of Bovinity” with regard to steering the masses toward a way of being or thinking is an apt one. However, this situation is unique: unlike the past examples given (i.e. seatbelts, discrimination, drugs, etc.), internet regulation has the inherent property of being connected. What I mean to say is that you may steer groups toward a thought or way of being through whatever means (architecture, laws, norms, the market), and after a while they will be disconnected from the original pioneering idea. With the internet, though, there just seem to be too many means of communication for that to happen; the original ideas will continue to be re-introduced, and hence the masses will never forget. Mvalerio 17:06, 31 January 2012 (UTC)
I find it very interesting to observe Lessig's constraints in action. I also find some of the examples troubling, and the likelihood of feedback loops disturbing. Take, for example, privacy policies and user license agreements. As people get used to the idea of having information online, on sites such as Facebook or the now largely irrelevant MySpace, internet privacy becomes less of a priority. The social norm becomes a lower expectation of privacy. Since few people thoroughly read privacy policies anyway, a company can start putting things in them that are not in the user's best interest. This becomes the market norm, because it gives a company a slight competitive edge. Because it helps them become better integrated, companies now start encoding this lower level of privacy into their applications. Users may eventually figure out what is happening, but because it is less outside their societal norms, consumers shrug, say everyone else is doing it (market norm) and maybe things wouldn't work the same otherwise (architecture), and change their societal norms. Meanwhile, laws are kept nonrestrictive, because the companies make money from this level of expectations (market constraints acting on legal constraints), there is little pressure from consumers (societal constraints), and it would be difficult to figure out how to regulate it anyway because of how things are programmed (architecture).
I'm not saying that this pattern has to happen, but it certainly can. As a change gains momentum, it can become very difficult to stop. I wonder if there is any particular constraint that is more or less likely to change the others.
- Edit: Speak of the devil. Last year, Mark Zuckerberg declared that societal norms had changed with regard to privacy, right after Facebook changed its architecture to force that trend.
BlakeGeno 05:08, 31 January 2012 (UTC)
Not sure this is exactly how we're supposed to use this section, but I thought I'd post a few of my thoughts on the readings for comment and discussion, and I'd love any criticism since I'm not someone with much of a background in these areas:
The point in Benkler about Lessig's "Principle of Bovinity" highlighted one of the most interesting aspects of the readings to me: How easy it is to fall into the trap of oversimplifying and assuming that a lot of these issues of control and technology boil down to what will be a recognizable total victory for one side and total defeat for the other. The ratio of creators of technology to consumers does not have to become fully in favor of the business/government creators in order for them to win; it just has to get close enough, resulting in a herd of consumers who are unable to break out of their pattern of accepting technologies exactly as they are when purchased. This reduces the generativity (from Zittrain) that can produce the sort of unanticipated evolutions and improvements that drive technology and innovation forward, often in ways that are more to the benefit of the public than to the groups who are trying to impose their specific type of control.
Indirect and hidden regulation of how these technological resources are used is troubling not only because it (obviously) attempts to stifle this sort of unintentional and often subversive innovation, but also because it (less obviously) has a major effect, one we often cannot easily perceive, on the forces that control our interactions and creations online (law/norms/architecture/market/etc., from Lessig). None of those forces exist in a vacuum. So not only is one possible outcome of the current technological/cultural upheaval a victory for the strictly control-oriented creators, in which the majority of people become simple consumers, it is quite possible that many will not even notice the driving factor behind that change. As someone who isn't very adept at using the internet as a creator anyway, that was a troubling conclusion for me.
On a mostly unrelated note, the other most interesting aspect of the readings for me came in Zittrain, where he noted that while the PC and Internet are almost endlessly adaptable, we have often been dealing with problems by making them RELATIVELY smaller, not by actually solving them (his examples being increased bandwidth for ISPs to deal with spam and more computing cycles for PCs to deal with malware). This reminded me of a presentation I just saw, given by the head of the EPA for the New England region. He pointed out that for the last 30 years or so, the EPA was solving problems with total disregard for sustainability. If water was dirty, build a water plant to purify it, and who cares how much power that takes! It is only in the last few years that a realization has really taken hold that we are unsustainably wasting energy, even in the pursuit of worthwhile results. Now, for example, some purifying plants are being powered by solar, not coal, with the same direct results (pure water) and much better side benefits (no pollution, an endlessly renewable resource). Early on in any new field or technology, it is easy to simply minimize and defer issues, but that practice is never sustainable, and a day will come when specific problems actually have to be solved. It would be nice if we could all do ourselves a favor and reach that point voluntarily for internet/computing, and not through necessity. AlexLE 21:03, 28 January 2012 (UTC)
I really appreciated Benkler’s exploration of how the "industrial producers of information" have a vested commercial interest in controlling information and communication at the expense of the commons, and how the US Government’s focus has been on restricting freedoms as opposed to upholding rights (Lessig). To appropriate language from the 9/11 Commission Report [1], many Internet users believe the “need to share” trumps the “need to know”, which is disruptive to the off-line status quo that clearly delineates between producers and consumers.
The idea of “need to share” also has important tie-ins to innovation and collaboration, two benefits of an unrestricted (or minimally restricted) Internet that are currently threatened. Benkler questions the concept of innovation based on exclusive rights as opposed to innovation from commons- and services-based sources, and the legal framework for valuing the former over the latter. Additionally, to mash-up Zittrain and Lessig, the features and benefits of a “generative” system - leverage, adaptability, ease of mastery, and transferability - can be seriously constrained by the interaction of law, social norms, the market, and architecture on users.
All of this leads me to wonder why more companies don’t see disruption as opportunity rather than a crisis. The marketplace is clearly communicating to them that their current model is headed for obsolescence and that this might be a great opportunity to build on some of the ideas and innovations already floating around. It’s also interesting to me that the US’s “capitalist” system seems more willing to protect current industry leaders than to step back and permit a true competition over ideas and users in the marketplace.
@AlexLE Thanks for bringing up the insidious impact that a lack of accountability/transparency can have on "law/norms/architecture/market." There were so many great topics in this week's readings - looking forward to discussing them in class. Aditkowsky 19:48, 29 January 2012 (UTC)
@Aditkowsky - I often wonder the same thing: why do companies/industries seemingly refuse to build on or latch on to the innovations already out there? Why won't the movie industry first make new movies available for download on demand (simultaneous to the opening night of a new movie) on a platform similar to iTunes in order to "combat" movie pirating? For many people, it's a matter of convenience and not only a matter of free movies. If piracy is as poisonous and profit-eating as Hollywood claims, why is there a continual increase in [bad] movies being made and record profits? The industry must appreciate innovation to some degree, as they now offer digital downloads, movies on demand, etc., but it has historically resisted new technologies and innovations, including the VCR. Aberg 20:03, 31 January 2012 (UTC)
Once again, I enjoyed reading Zittrain’s discussion of generativity, which he defined in terms of “unfiltered contributions” from a wide range of diverse audiences leading to “unanticipated change”.
Especially interesting was his discussion of the “generative pattern,” which demonstrates how an idea, such as Wordpress blogging software (which I use daily), can start in relative obscurity. When first launched, it was only partially developed, yet it was put out on the internet for others to use (and fix later, another use of the procrastination principle). As with other open source projects like OpenOffice, contribution to the Wordpress community is encouraged, which results in even greater use. According to the Wordpress developer's site, updates to the software are made nearly every single day.
As with other generative software, when Wordpress was launched, no one could have foreseen the “unanticipated change” brought about by the widespread use of this blogging software. But, as with other generative technologies, people new to the neighborhood have started abusing the "openness" of the system. One such example is the widespread use of Wordpress software (free, easy to use) to host spam sites and scraper websites. To help curtail these abuses, the Wordpress community created an easy form for people to report spammers (http://en.wordpress.com/report-spam/). This is just one example of the Wordpress community’s attempts to curtail abuse. Another issue frequently discussed on Wordpress forums is the unreliability of many plug-ins created by amateur developers. Trusting users (such as myself) have discovered the hard way that installing a Wordpress plugin can have a catastrophic impact on your blog (the dreaded socket error). Undoubtedly, following Zittrain's concept of a generative pattern, there will be more and more movement toward “enclosure” in the Wordpress community to prevent such abuses going forward.
Lessig’s discussion of the four types of regulation and their interdependence was also enlightening. Clearly, undesirable behavior on the internet can be curtailed by more than one means. Of course, this brings SOPA and PIPA legislation to mind, making one wonder if there is a better way to attain the same goal without directly threatening punishment for the undesirable behavior.
Perhaps an indirect approach, for example, altering the social norm, could be just as effective. For many Americans, sharing software and music is seen as perfectly acceptable behavior. People like to share things they enjoy with other people. However, if doing so becomes unacceptable in the eyes of most Americans, overall behavior would start to change. Clearly, this is a complex issue, but Lessig does an excellent job of communicating alternatives to direct legislation, which often has far-reaching consequences, far beyond the original scope of what was intended by the law. Joymiller 02:37, 30 January 2012 (UTC)
Benkler’s essay on laws that regulate the internet is interesting in the sense that our behavior with the internet can be seen as a reaction to regulations. We logically believe our online behavior is based on our own desires and consumer needs, when in fact our behavior can be a function of the copyright infringement laws imposed upon us. Benkler indicates that “laws do affect human behavior by changing the payoffs to regulated actions directly” and that “they also shape social norms” (386). The jaywalking example is effective in demonstrating how a variety of other behaviors can be affected by a law.
When I look back at major changes in my online behavior as a result of copyright laws, iTunes comes to mind. Naturally, I have been concerned about any ramifications related to illegal downloading. Copyright laws effectively required end users to purchase entire albums, which could restrict access to newer, lesser-known music; Apple responded by providing a legal service that allowed individual songs to be previewed and downloaded, and the nature of our internet behavior, particularly internet music exploration, changed, all because of internet laws. How we view music and the internet now is fundamentally different than before iTunes. Jimmyh 17:50, 30 January 2012 (UTC)
All of the readings for this week were very interesting and got me thinking about how some of the concepts cited in the works can be tied to my everyday usage of certain programs. At one point in the Benkler article, it almost seems to me that the author is painting a symbolic picture of two entities, the big industries or "giants" and the opposition formed by the majority of the population, in perennial contrast. On one hand, the big industries are trying to protect their economic interests, while on the other hand, the majority of the population is trying to resist this forced regulation. It also seems to me that when something new is discovered for the first time, as soon as people learn about such an unregulated and innovative feature, they try to exploit it to their own benefit. But after some time, the founders or industries realize that they have to protect their "money making machines" from public and unregulated usage; therefore laws are passed, patents are filed, trademarks are extended, etc. to counter this phenomenon.

I personally recall what was done to fight, or at least limit, piracy in the music industry a few years ago. Once, anyone was able to illegally download music from programs such as "eMule" or "BearShare", where the chances of actually getting caught were very small, but at the same time these actions violated certain federal laws enforced by the F.B.I. What happened afterward was that downloading illegal music, which would "violate the law" and also carried the risk of contracting computer viruses, was transformed into a legitimate process where songs could be bought for a dollar or so through specific programs, the most famous being "iTunes". This allows artists to make some profit from their work, but at the same time it allows people to buy songs at a cheap price and, most importantly, in a legal manner. As for "sharing", I once heard from a friend that "if you upload material on a specific program, you are allowed to download other stuff without facing legal charges", but then the question is: how does someone regulate that "material sharing"? It was very easy for anyone to just download without uploading anything in return.

But returning to the authors, the most interesting part for me was Lessig's "principle of bovinity", described by Benkler, and how it explains, in a metaphorical sense, controlling a large number of people or animals with few resources and very few rules, although constantly enforced. In conclusion, I personally think that the whole process of protecting and regulating the usage of a given site or program is just part of a never-ending cycle: someone creates a program, it gets used and abused, laws and protections are then established, those are eventually overcome, and the whole cycle starts again. Perhaps this is a pessimistic view of how things actually function, but my feeling is that this process is like an eternal virus vs. antivirus battle. Emanuele 18:56, 30 January 2012 (UTC)
I found the readings very thought provoking. In “What Things Regulate”, I found it really interesting how the author evaluates threats to liberty. It is interesting how norms, law, the market, and architecture provide a powerful combination for regulation. The contrast between norms and law was quite insightful, and I was left wondering how much norms rule our behavior in our lives. I agree with the author's opinion that the state should not be using nontransparent means when transparent means are available. I spent some time puzzling over why the government would feel compelled to use non-transparent means, and how it would defend using them. In Benkler's article, the discussion of enclosure and openness seems very relevant and can be applied to real-life examples. Facebook succeeded over MySpace due to its open platform. The iPhone gained much of its popularity due to the applications developed on its platform. It is worth evaluating how enclosure restricts creativity and innovation. Is it worth it for the little girl to spend time replacing her picture frame by frame in the clip from the movie Schindler's List? It is interesting to see how regulation will evolve with today's technological advancements and the very need to share. The recent strikes by Wikipedia and Google point to that very conflict between government and technology. Pgaur 17:07, 31 January 2012 (UTC)
The article “What Things Regulate” brings up an interesting point. I think there will be more government involvement and regulation of the internet in the future, particularly in terms of privacy. For example, there is currently no law against Facebook's privacy settings, and one is not banned from dispersing such private and personal information. It is up to users to control their own privacy. Moreover, there is no guarantee that your personal information will not be breached. I think the government will attempt to exert more control and regulation over the internet. At the same time, however, there will be an outcry from consumers who demand the right to open information. Qdang 20:39, 31 January 2012 (UTC)
@AlexLE thanks for starting off the discussion, particularly on the parameters/constraints/theories surrounding the internet. Zittrain's discussion of hierarchy and polyarchy provoked many discussions about the iPhone/Android, particularly open- versus closed-source applications. The iPhone's sandbox approach allows programmers to create applications, subject to Apple's approval. Some might argue that these enforced limitations create a barrier between the programmer and the consumer. Google, unlike Apple, does not require approval to release applications on its Android phones, relying instead on the consumer to weed out harmful applications. Personally, I prefer Apple's approach; it creates a more user-friendly environment. Linux, similarly to Android, is completely open source, which is great for programmers but too time-consuming and complicated for the average consumer. Szakuto 20:56, 31 January 2012 (UTC)
I agree with Eric von Hippel’s idea that firms should welcome improvements to their products by customers instead of relying solely on internal research and development departments. Zittrain states, “Activity led by amateurs can lead to results that would not have been produced in a firm-mediated market model.” I believe a good example of a company or “firm” incorporating customer feedback is Facebook. Facebook continually updates its format and access to information based on customer feedback. The following are examples of how Facebook incorporates Zittrain’s five principal factors of generativity:
1. Leverage: Facebook makes the difficult task of reaching out to friends and family extremely easy.
2. Adaptability: Since Facebook's launch, a news feed has been added to make it easier for users to get updates when friends' information has changed.
3. Ease of mastery: Facebook users do not have to be technologically savvy to understand how to use the program.
4. Accessibility: A user only needs access to the Internet, and the program is free to use.
5. Transferability: Users are given notices about updates to the program along with tutorials.
HSolomon 21:12, 31 January 2012 (UTC)
I also have to agree that the articles this week were very thought provoking, particularly how Benkler reviews the legal aspects of regulating the internet and how the battle over the “institutional ecology of the digitally networked environment is waged precisely over how many individual users will continue to participate in making the networked information environment.” It is key to its survival that we average consumers continue to actively pursue the creation of various platforms, technologies, and the digital environment, and fight against wanton regulation and attempts to constrict and impose superfluous standards/laws such as SOPA and PIPA. This is also in accordance with his observation about “legal responses” and how the “primary role of law has been reactive and reactionary”; thus it is to society’s advantage to continue to produce, create, and contribute.
I also really enjoyed reading about Zittrain’s concept of generativity and how he demonstrates the way the system operates at a fundamental level. This is what is so fascinating about these digital technologies: how extensively and powerfully they can affect the growth of a society, e.g. the remarkable growth of mobile technology in emerging markets, or the evolution of the internet resulting in significant increases in digital/media/business companies in conflict-affected countries such as Afghanistan.
@HSolomon, I was thinking of the exact same example and thought of Facebook upon reading that. JennLopez 22:03, 31 January 2012 (UTC)
Benkler really uncovers a tug-of-war between those sectors of society which want freedom and those that don’t, and exposes how the traditional industrial-commercial sector of society prefers to keep control. To me, this is not surprising; having lived in many parts of the world, I have found that this attitude has no national boundaries. Human nature is international, and those who control economy and power tend to want to keep that control and stay on top. As for Lessig's “Principle of Bovinity”, I believe it applies to many spheres of influence in society, not just the internet, and shows that there are those who truly seek to control and who quite consciously, quite astutely, study the most effective socio-psychological means to do it. I agree that freedom can need guidelines at times, or the freedom of one can infringe upon the freedom of another, and who or what can be the judge of that? It’s a hard question to answer and requires deep reflection. I believe we need freedom, yet also responsibility. Still, we come back to the question of who can be the unbiased judge of that. (By the way, I'm pretty new at editing wikis, couldn't find how to upload this, was late in the process, and maybe edited everything else in sight during the process. I'll get better!) Mike 22:17, 31 January 2012 (UTC)
@Szakuto While I agree that Apple's sandbox approach is an excellent way for 'average' consumers to participate in the proliferation of that platform, if Lessig's approach to freedom of expression, as he defines it using Mill's principles, is applied to the Apple sandbox, then it is nothing more than a limit on the freedom of expression of developers. I do agree that they have simplified the process for easy development, but since every app must be approved by Apple, as must every program delivered through Apple's iTunes, it limits the creativity of developers and is a form of censorship rather than enabling the generativity that Zittrain speaks of. Nthib 22:25, 31 January 2012 (UTC)
I’ve enjoyed this week’s readings, particularly “What Things Regulate”, and I tend to agree that we need regulation and stiff penalties so that the message is very clear to those who break the law. I like Zittrain’s account of how the interdependency of “architecture” and “norms” dictates our lives, and of the importance of the PC. I am looking forward to this class. Sophia 5:30 January 31, 2012 (UTC)
Weekly Response
January 31: Paradigms for Studying the Internet
In terms of the architecture: the primary problem is that if we start to give things away for free, then we start to resemble communism somewhat. So companies have to find new ways to make people spend their money. This becomes an issue in the digital landscape, particularly when trying to maintain or enforce capitalism within the framework of the internet. By making the architecture of the internet consistent with market economy structures, it becomes easier to reinforce and perpetuate market capitalism. And there must be real reasons to sustain the public's faith, not just gimmicks and tricks meant to maintain a state of mass confusion, especially when getting something for free is way better than having to pay for it! Still, strictly enforcing this as the only option, without any freedom of choice, should not be looked at as a long-term strategy, particularly as the common man starts to wake up to this reality. Legal measures alone will not work as a means of control; corporations have to offer honest incentives that give consumers real reasons to buy their products.
When it came to Google Books, there was a bit of controversy surrounding this notion of accessibility. Google had a great idea in digitizing all of the books in the world and putting them online, allowing anyone to access the material like a library. It was a brilliant idea, but it was stopped and instead turned into an Amazon-like store. Not very good for the progress of all humanity, but a sure step back toward capital control, understanding, of course, that this is one of Google's main aims. Keeping in mind that the spreading of ideas and information is partly what the internet was designed for, this goes against what the internet is fundamentally meant to be about.
The double standard, of course, is that if we are going to continue within a capitalist system, then we must hold on to the thought that this is my computer, this is my money, and this is my life. And this stems from the natural world – for example: this is my head, and my hand, not yours. Once we start to stray away from that, then we are not talking about progress or innovation anymore. We are talking about something fundamentally different from what we are all supposedly working for. We are talking about a form of fascism. And this becomes somewhat deceptive on the part of the corporations, governments, and so forth – even if the next generation of consumers has no idea what the deal is to begin with.
If companies want to make a product, then they should be able to do so in a free and open market economy. And they should try to make it as good as possible. Companies should continue to invent and innovate in order to improve their products. People will always modify products and customize them however they want. Tinkering with things goes back to "caveman" days, and it is what continues to move society forward -- even today. When a person purchases a car and decides to give it a new paint job, or to install a new stereo system, this is not the concern of the company that manufactured the automobile, nor should it be. Although I do understand the concerns of manufacturers trying to maintain a public image: if you see a Rolls Royce painted with blue flames and monster truck wheels, the manufacturer might disapprove under some circumstances, because it wants to maintain control over the public perception of the brand.

When I purchase something, it is mine – I am the owner of it – and not the company that I bought it from. Therefore, I see no reason to think that a company should have control over something once it has been purchased. However, when I purchase something I want to be awed; I need a reason to purchase it. So the presentation-showroom "complete package" that I am given by corporations becomes a good idea in this sense. And I understand how corporations try to maintain this composure by making sure consumers do not break their warranty. Heaven forbid the dumb consumer should break their laptop by installing some new RAM by themselves! Of course, this leaves room for new jobs and industries – for example, Best Buy's "Geek Squad." Still, I think that once I purchase an item I should be able to do whatever I want with it. And if I no longer like it, I should be able to return it, or donate it, or even smash it if I so choose, and I do not think that this should be illegal as a condition of my purchase. Yet this becomes very much a "put your hands where my eyes can see them" kind of mentality, in effect treating consumers as though they are criminals even before any crime has been committed.
The thing about participation is that it is already a controlled process, but a matter of choice to initiate. We have religion to control us, along with laws, government, and the educational system.
“Fortunately, there are many ways in which people have a chance to build and contribute. Many jobs demand intellectual engagement, which can be fun for its own sake. People take joy in rearing children: teaching, interacting, guiding. They can also immerse themselves in artistic invention or software coding” (Zittrain).
And so, there is this notion of an underlying architecture already in place that is reinforced through society. Our parents raise us, and we fall in line with the rest of our surroundings. So, do I want to see the advertisements through Google, or do I want to chat with my friends on Facebook? Depending on who your friends are, there is no difference between the two, because of the architecture already existing within society. You think: how was that movie, or that book, or that song, or that sports game? When in fact everything we are surrounded by reinforces the system. So, in this sense, just because the internet is fairly new does not automatically mean that it is not safe. Even if that new song by the Black Eyed Peas is being pirated somewhere, it is still reinforcing Capitalism -- it is a byproduct of it, programmed only to reinforce control. So, maintaining a balance over the long term will be a challenge, because the internet is fundamentally about me and you.
Just Johnny 01:01, 7 February 2012 (UTC)