FOR EDUCATIONAL USE ONLY
Copr. © West 2000 No Claim to Orig. U.S. Govt. Works
10 SHCLJ 1133
(Cite as: 10 Seton Hall Const. L.J. 1133)
Seton Hall Constitutional Law Journal
Summer 2000
Comment
*1133 CASTING A NET OVER THE NET: ATTEMPTS TO PROTECT CHILDREN IN
CYBERSPACE
Jennifer Zwick
Copyright © 2000 Seton Hall University, Seton Hall Constitutional Law Journal; Jennifer Zwick
I. INTRODUCTION
Ten years ago, relatively few Americans had
ever heard of the Internet, let alone logged onto it. By contrast, at the end of 1998, studies estimated that eighty‑four
million Americans used the Internet from either home or work. [FN1] As of mid‑year 1999, that number had
risen to over one hundred million users. [FN2]
Consequently, the Internet "has gone from being a curiosity to
being a daily source for e‑mail, shopping, research, and news."
[FN3] A wealth of information from across
the globe is now accessible with a few simple mouse clicks.
With every technological innovation comes
drawbacks, and the Internet is no exception.
While permitting instant communication, the worldwide dissemination of
ideas, as well as easy access to the ideas of others, the Internet has also
generated concern regarding its effects on the constitutional rights of its
users, especially in the context of children.
This Note explores the implications of the
Internet on the privacy rights of children, as well as the possible
infringement of Web publishers' First Amendment rights resulting from attempts
to protect children. The first section
addresses both government and industry regulation of online speech in efforts
to shield minors from inappropriate material.
Thereafter, this Note examines such endeavors as Congress' Child
Online Protection Act ("COPA"), [FN4] which is *1134 currently under review in the Third Circuit, as well as
attempts by Congress, the states and the public to force libraries to install
filtering software, thereby prohibiting users from accessing questionable
material. This section concludes that
both the COPA and forced library filtering will likely be held
unconstitutional.
The second section scrutinizes efforts to
guard the informational privacy of children online. Special focus is placed on the Children's Online Privacy Protection
Act ("COPPA") [FN5] and its accompanying Rule [FN6] as set forth by
the Federal Trade Commission ("FTC"). This Note concludes that the combination of federal mandate and
industry initiative created by the COPPA is a reasonable formula by which to
protect children from the collection of their personally‑identifiable
data.
II.
PROTECTING CHILDREN ONLINE AND THE FIRST AMENDMENT IMPLICATIONS
Jimmy is a fifteen‑year‑old boy and a
sophomore in high school. He has
thought for a long time that he might be gay but has been too scared to mention
it to his family or friends. Needing
someone to talk to, Jimmy decides to find Web sites and chat rooms for gay and
lesbian teens. It seems like the
perfect solution, allowing him to talk to others in his situation, while not
having to reveal his name or any details about himself. His parents cannot afford a computer at
home, and using a school computer is out of the question because one of his
classmates could walk in at any time.
Instead, Jimmy goes to the public library to use one of the computers
there. Surely none of his friends would
show up at the library.
Once there, Jimmy accesses the Internet on
one of the library computers and attempts to locate an appropriate chat
room. Unfortunately, the library has a
policy of filtering its Web access; therefore, all chat rooms, as well as sites
having to do with sex, have been blocked.
While patrons may request that the librarian unblock legitimate sites
that have been inadvertently filtered, Jimmy is hesitant to tell the librarian
of the type of site he wishes to reach.
His whole reason for wanting to access the chat room through the library
was the anonymity it would provide.
"Now what?" he laments to himself.
Across town, Brittany is researching a
report for her fourth grade social studies class which is due on Friday. Her assignment requires her to write a two‑page
paper concerning the United States government.
With just a few mouse clicks on her home computer, she is connected to
the Internet. Not understanding the
difference between the top‑level domains, [FN7] she types *1135 in
"www.whitehouse.com" and is greeted, not by the President, but rather
by multiple pictures of topless women. [FN8]
The warning to children under 18 is small and escapes Brittany's notice
as she clicks on a picture labeled "First Ladies." [FN9] There, Brittany encounters both "Miss
White House" and April, the "Intern of the Month," in two
revealing photographs. [FN10] Realizing
that this Web site is not going to help her with her report, Brittany decides a
better way might be to run a search using Yahoo!®. [FN11]
Although fictional, neither of these
scenarios is far from reality. Chief Judge Sloviter of the Third Circuit, sitting on the three‑judge district court panel in the Eastern District of Pennsylvania, aptly observed that ". . . content on the Internet is as diverse as human
thought." [FN12] Human thought,
however, is not always profound or academic.
While we applaud the breadth and depth of information that the Internet
has brought to our fingertips, we are at the same time cognizant of a slightly
darker element that has found a home on the Web, one from which children should
be protected. [FN13] Few would argue
that Brittany's best interests are served by exposure to pornography without
parental guidance. On the other hand,
in protecting children from such exposure, we should not hinder Jimmy's quest
for legitimate information regarding his sexuality. Most agree that something should be done. So far, however, neither industry initiative
nor legislative action has been successful in *1136 achieving an acceptable balance between shielding minors
from harmful material on the Internet and preserving the First Amendment right
to free speech of publishers, as well as the adult population at large. [FN14]
A.
Non‑Internet Attempts to Protect Children Through the Regulation of
Speech
Not all types of speech receive the
constitutional protection furnished by the First Amendment. [FN15] While the First Amendment generally
guarantees the almost unlimited right to disseminate, as well as receive, ideas
and information, [FN16] speech deemed unprotected [FN17] can be more heavily
regulated by the Government based on its content. [FN18] For example, in 1957, the Supreme Court of
the United *1137 States determined
that the First Amendment's protection did not extend to obscene speech. [FN19]
That same year, in Butler v. Michigan,
[FN20] the Court addressed the injury that might befall a child from exposure
to unprotected speech. [FN21] In
Butler, a Michigan statute prohibited the sale of any literature
"manifestly tending to the corruption of the morals of youth."
[FN22] The book in question was sold,
not to a child, but to a police officer. [FN23] Though the state argued that
the purpose of the statute was to "shield juvenile innocence," the
law simultaneously rid the general public of any material potentially harmful
to minors. [FN24] Finding the law
overly broad, the majority held that such material could not be completely
banned, as adults should not be reduced to reading only that which is
appropriate for children. [FN25]
Since Butler, the federal and state
legislatures have continued to pass statutes aimed at protecting minors from
sexually‑explicit materials. The
Supreme Court, in Ginsberg v. New York, [FN26] upheld a state statute
proscribing the sale of material considered obscene as to minors under the age
of seventeen. [FN27] The *1138 Court found that protecting the
well‑being of minors was a compelling state interest. [FN28] Furthermore, the New York statute was
distinguishable from that struck down in Butler because adults were not hindered
from purchasing such material. [FN29]
Thus, the Court found that the statute was narrowly tailored and did not
violate the First Amendment rights of adults. [FN30]
FCC v. Pacifica Foundation [FN31] marked the
federal government's first successful attempt to regulate not just obscene, but
"indecent" [FN32] speech in the name of protecting children. [FN33] The case evolved from a sanction levied by
the Federal Communications Commission ("FCC") against Pacifica as a
result of its radio broadcast of George Carlin's now infamous "Seven
Filthy Words" monologue. [FN34] A
father and his child heard the morning broadcast while listening to the car
radio. [FN35] In addition to the time
of day of the broadcast, the Court focused on the medium of communication
utilized in upholding the FCC action. [FN36]
Radio, the *1139 Court
opined, was a scarce [FN37] and intrusive medium, which could come into the
home when children might be in the audience. [FN38] A listener cannot simply avert her ears when something indecent
is aired. [FN39] Consequently, speech may enter the home without warning,
thereby taking the listener by complete surprise. [FN40] Additionally, the majority found that the
scarcity of the broadcasting spectrum gave the Government more flexibility in
its regulation for the public interest; accordingly, the Court upheld the FCC
action. [FN41]
In contrast, the cable industry is not based
upon the allocation of a limited broadcast spectrum and, in the past, has been
treated differently by the courts than the radio and television broadcasting
industries. [FN42] A more recent
Supreme Court case, however, has put this tenet on unstable ground. In Denver Area Educational Telecommunications Consortium, Inc. v. FCC, [FN43] the Court reviewed a federal statute, which regulated "indecency" on cable television and was aimed at "protect[ing] children from exposure to patently offensive sex‑related material."
[FN44] Refusing to apply strict
scrutiny, a plurality of the Court cited the pervasive and intrusive character
of cable television as similar to that of traditional broadcast media, which
dictated a less demanding analysis. [FN45]
The dissent disagreed, stating that strict scrutiny was, in fact, the
appropriate test. [FN46] Thus, the
dissent opined, the *1140 statute
should have been found to violate the First Amendment. [FN47]
The Supreme Court also grappled with the
constitutionality of prohibiting
"dial‑a‑porn" in Sable Communications of
California, Inc. v. FCC. [FN48] The federal statute at issue in Sable forbade obscene material, as well as that which was merely indecent. [FN49] While recognizing a compelling state
interest in protecting children from the harmful emotional effects of such
communications, [FN50] the Court found that less restrictive means existed to
achieve the Government's goal. [FN51]
Applying strict scrutiny, the Court distinguished the medium of
telephone communications from that of broadcast, finding that the danger of a child being exposed to the prohibited material was correspondingly lower. [FN52] Relying on the Butler decision, the Court
advocated an approach that would protect the rights of the adult population,
while simultaneously shielding children from pornographic speech. [FN53]
B.
Attempts to Protect Children Through the Regulation of Speech Over the Internet
Unlike traditional broadcast media, the
Internet is not a scarce medium of communication, [FN54] nor is it intrusive.
[FN55] Similar to the telephone, in
order to receive information, it must actively be sought. [FN56] The Internet is, however, the most pervasive
medium the world has ever known and, as a result, great concern has arisen with
regard to what children might see, read and, increasingly, hear on *1141 their computers. Both
legislative action on the state and federal levels, as well as industry
initiative, have endeavored to tackle this problem.
1.
Proscription of Material Harmful to Minors
a.
Federal Regulation
Congress' first attempt to regulate sexually
explicit material on the Web came in the form of the Communications Decency Act
("CDA") [FN57] a small section of the Telecommunications Act of 1996.
[FN58] Two provisions in particular
were noteworthy. Section 223(a)
prohibited the knowing transmission of any "communication which is obscene
or indecent, knowing that the recipient of the communications is under
[eighteen] years of age." [FN59]
Section 223(d) prohibited the knowing transmission or display of any
"communication that, in context, depicts or describes, in terms patently
offensive as measured by contemporary community standards, sexual or excretory
activities or organs" to recipients under the age of eighteen. [FN60] The statute further provided an affirmative
defense for those defendants who had made a reasonable, good faith effort to
"restrict or prevent access by minors . . . [utilizing] any method . . .
feasible under available technology." [FN61] A defense was also afforded to those who had attempted to employ
restrictive measures such as the "use of a verified credit card, debit
account, adult access code, or adult personal identification number."
[FN62]
In the words of one academic, "[i]f we
had sat down and purposely set out to write a statute that completely violated
the First Amendment and [would] undoubtedly be struck down by the Supreme
Court, we could not ourselves have come up with better language to accomplish
that purpose." [FN63] The American *1142 Civil Liberties Union
("ACLU"), and ultimately the Supreme Court, apparently agreed. In ACLU v. Reno, the District Court for the
Eastern District of Pennsylvania ruled that the "indecent
transmission" and "patently offensive display" provisions of the
CDA were overly broad, vague and thus, unconstitutional violations of the First
Amendment and the Due Process Clause of the Fifth Amendment. [FN64] In 1997, the Supreme Court affirmed the
district court's ruling, but solely on the basis of the First Amendment issue,
[FN65] and only to the extent that the Act went beyond the prohibition of
"obscene" material. [FN66]
The Court identified many problems with the
language of the CDA. First, the
terminology of the CDA was vague and inconsistent. Section 223(a) proscribed "indecent" speech, while section 223(d) forbade "patently offensive" speech; but nowhere
within the Act was either term defined. [FN67]
Not only would such ambiguity confuse potential speakers, the Court
observed that it would further "undermine the likelihood that the CDA
ha[d] been carefully tailored to the congressional goal of protecting minors
from potentially harmful materials." [FN68]
Second, the Act was unconstitutionally
broad. Applying strict scrutiny,
Justice Stevens explained that, while the Court had "repeatedly recognized
the governmental interest in protecting children from harmful materials,"
[FN69] the CDA had the effect of limiting communications to which adults were
constitutionally entitled. [FN70] This
burden on free speech, the Court surmised, would be especially onerous in the
context of the Internet where, for example, among the many recipients of an
indecent communication in a chat room, at least one minor could be expected to
be in the audience. [FN71]
Additionally, the CDA did not distinguish between commercial and non‑commercial speakers. [FN72] Therefore, the affirmative defenses provided would be *1143 prohibitively expensive and unavailable to many non‑commercial defendants. [FN73] As a result, many potential publishers might
decline to speak for fear of conflicting with the law. [FN74] Since such a restraint on protected speech
would be a violation of the First Amendment, [FN75] the Supreme Court struck
down the Communications Decency Act on June 26, 1997. [FN76]
Returning to the drawing board, on October
21, 1998, Congress enacted the Child Online Protection Act (COPA) as part
of the Omnibus Appropriations Act for the Fiscal Year 1999. [FN77] Within twenty‑four hours, the ACLU,
along with sixteen other plaintiffs, [FN78] filed suit seeking an injunction to
block enforcement of the COPA. On
November 20, 1998, the date that the COPA was slated to take effect, the
District Court for the Eastern District of Pennsylvania issued a temporary
restraining order, [FN79] stating that "no one, the government included,
has an interest in the enforcement of an unconstitutional law." [FN80] On February 1, 1999, the court granted a
preliminary injunction. [FN81]
The COPA was an attempt by Congress to
respond to the Supreme Court's comments in Reno, and has been alternatively
referred to as "CDA II" or "Son of CDA." The COPA differs from the CDA in two main
respects. First, the COPA discards the
"indecent" and "patently offensive" language of the CDA and
is directed instead to speech that is "harmful to minors." [FN82] Second, the COPA applies *1144 only to online speech for "commercial purposes."
[FN83] However, telecommunication *1145 carriers, Internet Service
Providers, and search engine operators are explicitly exempt from liability
under the COPA. [FN84] Others found in
noncompliance with the COPA might be subject to criminal and civil penalties.
[FN85] The COPA, like its predecessor,
provides affirmative defenses for those defendants who, in good faith, attempt
to restrict access by children to communications that are harmful to minors.
[FN86] Interestingly, the COPA does not
include the CDA's requirement that such restrictions be "effective."
[FN87]
Despite these statutory changes, Judge Reed
ruled that the plaintiffs would likely prevail in a case on the merits, and
thus issued a preliminary injunction enjoining enforcement of the COPA.
[FN88] Defendant, Attorney General
Janet Reno, has appealed the decision to the Third Circuit. Oral arguments were heard on November 4,
1999, but the court is not expected to issue an opinion for several months.
b.
State Regulation
To date, attempts by state legislators to
regulate speech on the Web have also been unsuccessful. In American Library Ass'n v. Pataki, [FN89]
the District Court for the Southern District of New York struck down a New York statute criminalizing the knowing communication over the Internet, to children, of material deemed harmful to minors. [FN90]
In granting a preliminary injunction, the court found *1146 that the statute violated the
Commerce Clause [FN91] in three ways. [FN92]
First, the communications falling within the purview of the statute were
not intrastate in nature given the worldwide scope of the Internet and thus,
could not be regulated by a state government. [FN93] Second, the court found that the burden on interstate commerce
created by the statute outweighed any benefit to be derived. [FN94] Finally, the court recognized that federal
regulation would be more appropriate given the widespread nature of the
Internet, and thus New York was pre‑empted from legislating such
communications. [FN95]
In November of 1999, the Tenth Circuit Court
of Appeals enjoined a similar New Mexico statute on First Amendment grounds, rather than under the Commerce Clause analysis employed by the New York court. [FN96] Affirming the District Court of New Mexico's
grant of preliminary injunction, in ACLU v. Johnson, [FN97] the Tenth Circuit
held that the statute was unconstitutionally broad, burdening otherwise
protected adult communications on the Internet. [FN98] Furthermore, the court
briefly addressed the Commerce Clause implications of the statute and found
that, as in Pataki, the Act created "an extreme burden on interstate
commerce." [FN99]
*1147 c. Filtering in Libraries
Libraries have traditionally been the
purveyors of constitutionally protected expression. That which a library may stock on its shelves, as well as that
which a library may remove from its shelves, has long been a source of heated
debate. [FN100] In striving to provide
their patrons with a wealth of information, almost three quarters of public
libraries across the country offer public Internet access. [FN101] As concern regarding sexually‑explicit
material on the Web has increased, however, so has pressure on libraries to
restrict public access.
The American Library Association (ALA) has
long supported a policy of unrestricted access to information, regardless of
content. [FN102] Additionally, the Library Bill of Rights provides that access
shall not be denied on the basis of age. [FN103] The ALA interprets this right to mean that libraries, unlike
schools, do not stand in loco parentis, [FN104] thereby forcing parents to bear
the responsibility for what their children view. [FN105]
Despite these policies, many libraries have
nonetheless decided to install filtering software on their computers.
[FN106] Filters essentially block
access to Web sites that are either on the blocking software's
"blacklist" or contain certain
*1148 words that have been deemed unacceptable. [FN107] Consequently, in
the earlier hypothetical situation, if Brittany's parents had installed such
filtering software on her computer, the program would have examined
"www.whitehouse.com" before the site was displayed. Most likely, the software would have blocked
Brittany's access. However, in Jimmy's
scenario, the filtering software worked to the user's detriment, effectively
blocking access to information that was not pornographic, but of legitimate
concern.
Indeed, filters have been known to block
access to Web pages that are in fact not obscene, but rather designed to help
people. [FN108] Software screening for
the word "breast," for example, may inadvertently block Web sites for
breast cancer support groups or even Web sites providing recipes in which an
ingredient is chicken breasts.
Moreover, many blocking software producers will not allow potential
purchasers to review their "blacklists" before buying their software.
[FN109] Thus, these software
manufacturers have total discretion as to which sites are blocked and could
potentially limit access to sites that criticize their blocking techniques.
[FN110]
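The blacklist‑and‑keyword mechanism described above, and the overblocking it produces, can be illustrated with a short Python sketch. The list contents, word set, and URLs here are hypothetical; commercial filtering products keep their actual blacklists proprietary, as the text notes.

```python
# A deliberately naive sketch of blacklist plus keyword filtering.
# BLACKLIST and BLOCKED_KEYWORDS are invented for illustration only.
BLACKLIST = {"www.whitehouse.com"}          # sites blocked outright
BLOCKED_KEYWORDS = {"sex", "breast"}        # words deemed unacceptable

def is_blocked(url: str, page_text: str) -> bool:
    """Block if the URL is blacklisted or the page contains a flagged word."""
    if url.lower() in BLACKLIST:
        return True
    words = page_text.lower().split()
    return any(word.strip(".,") in BLOCKED_KEYWORDS for word in words)

# Overbreadth in action: a recipe page is blocked along with pornography.
print(is_blocked("www.recipes.example", "Grill the chicken breast lightly."))  # True
print(is_blocked("www.library.example", "Story hour is at noon."))             # False
```

The first call shows exactly the failure mode described in the text: a page whose only offense is the word "breast" in a cooking context is treated the same as the sites the filter was bought to block.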
One library in Virginia installed such
software on all of its computers and consequently, found itself in a lawsuit
with the ACLU. In Mainstream Loudoun v.
Board of Trustees of the Loudoun County Library, [FN111] the District Court for
the Eastern District of Virginia found that the library's policy of prohibiting
access to all sites containing "material deemed harmful to
juveniles," [FN112] was a prior restraint on free speech, thereby
violating the First Amendment. [FN113]
The library argued that its policy was enacted not to control
expression, but to prevent the sexual harassment of its staff and to conform to
laws protecting minors from obscenity and child pornography. [FN114] Accordingly, the library asserted that the *1149 policy was content‑neutral.
[FN115]
Judge Brinkema, however, found that these
stated objectives were not legitimate secondary effects because ". . .
neither [could] be justified without reference to the content of the speech at
issue." [FN116] Rather, the court concluded that the policy was an
attempt to regulate speech based on its content, mandating the application of
strict scrutiny. [FN117] Judge Brinkema
further found that although the library's contended interests were compelling,
no demonstrable problems existed that the policy was needed to cure.
[FN118] Therefore, the library's
interests were not advanced by its chosen course of action. [FN119] Recognizing that circumscribing the access
of all patrons to only materials fit for minors was over‑inclusive, the
court emphasized that less restrictive alternatives existed. [FN120]
In addition, the policy, as adopted, lacked
the standards and procedural safeguards necessary to survive strict scrutiny.
[FN121] Decisions to censor were left
to the discretion of the blocking software manufacturer. [FN122] As the court explained, "a defendant
cannot avoid its constitutional obligation by contracting out its
decisionmaking to a private entity." [FN123] While the policy included a method by *1150 which patrons could request the unblocking of materials that
were wrongfully blocked, the court noted that requiring a patron "to
publicly petition the Government for access to disfavored speech ha[d] a severe
chilling effect." [FN124]
The court concluded by pointing out that it
was the choice of the library to provide Internet access to its patrons.
[FN125] In so doing, the library was
obligated to follow the mandates of the First Amendment in any attempt at
limiting such access. [FN126] The
policy of the Loudoun County Library, however, was not in accordance with the
constitutional guarantee of free speech. [FN127] The Board of Trustees subsequently decided not to appeal and
removed the filtering software from its computers. [FN128]
Although free speech advocates might
consider this a victory, it is a small one because the Mainstream Loudoun
decision binds only the Eastern District of Virginia. [FN129] To have nationwide precedential value, the
case must be heard and affirmed by the United States Supreme Court. Until another case of its kind makes its way
through the court system, libraries will remain uncertain as to whether
filtering is an appropriate means by which to protect young patrons from
pornographic material. [FN130]
One such case, with an interesting
variation, has been filed in California. Rather than patrons suing to have
filters removed from library computers, this plaintiff sued the city and its
public library to have filters installed.
A parent, known only as Kathleen R., filed suit after her 12‑year‑old
son, on multiple occasions, distributed printouts of pornographic images he
had downloaded to a disk from a library computer. [FN131] The plaintiffs initially filed suit on
public nuisance *1151 grounds.
[FN132] That lawsuit was dismissed pursuant to a surviving provision of the CDA, which exempts access providers from
liability for third‑party speech. [FN133] Plaintiffs then amended their complaint and returned to court,
arguing that permitting unrestricted access to the World Wide Web was a
violation of a citizen's constitutional right to protection against the
arbitrary actions of the state under the Fourteenth Amendment. [FN134] Plaintiffs contended that "the library
actively places children in danger of severe psychological harm by providing
them with obscene pornography." [FN135]
On January 13, 1999, the judge dismissed the second complaint without an
opinion, [FN136] and in July of 1999, the plaintiffs filed an appeal with the
California Court of Appeals. [FN137]
That decision is pending.
Kathleen R. is not the only party trying to
force libraries to limit Internet access. Library filtering has also been the
focus of several federal bills. Arizona Senator John McCain introduced the Children's Internet Protection Act in
January of 1999. [FN138] This Act would
require all schools and libraries receiving E‑rate funding, that is,
discounts for Internet access, to install filtering software. [FN139] The American Library Association has stated
that it will fight the bill in court if it is signed into law. [FN140] Representative Bob Franks of New Jersey
introduced a *1152 similar bill in
the House. [FN141] The Neighborhood
Children's Internet Protection Act, introduced in August of 1999 by Senator Rick
Santorum, a Republican from Pennsylvania, is slightly less rigid in that the
installation of filters would not be made mandatory. [FN142] Instead, each community would be required to
adopt a plan to best protect its children from harmful materials on the
Internet. [FN143] The Act would require
only a certification either that a system to filter or block inappropriate
material had been implemented or that an Internet use policy addressing the
issue had been adopted. [FN144] All
three of these bills allow each community to determine what is inappropriate
for minors to view. Finally, the Child
Protection Act of 1999, [FN145] or Istook Amendment to the Labor/HHS
Appropriations bill, requires public schools and libraries receiving Federal
funds "for the acquisition or operation of computers to install software
to protect children from obscenity." [FN146]
As of July of 1998, thirty thousand schools and libraries had applied for E‑rate funding, [FN147] making the ultimate outcome of these bills significant. The ALA has reluctantly endorsed the approach
proposed by Senator Santorum, finding it more reasonable than that of Senator
McCain. [FN148] However, the ALA would
ultimately prefer no federal mandate. [FN149]
It appears that the libraries cannot
win. Although one court has held that
filtering is unconstitutional, [FN150] the federal government nonetheless
continues in its *1153 attempts to
obligate libraries to filter. [FN151] Until a definitive decision is reached,
some libraries have implemented alternative policies, thereby attempting to
find a compromise between the conflicting concerns. One such method is the "Tap on Shoulder" approach.
[FN152] If a librarian happens to see a
patron viewing material that is inappropriate, the Internet user will be tapped
on the shoulder and told to stop. [FN153]
This, of course, raises the issue of providing librarians with unlimited
discretion in determining what library patrons may view. [FN154] Alternatively, some libraries have placed
privacy screens over their monitors, making it more difficult for anyone, other
than the immediate user, to view what is on the computer screen. [FN155] Other libraries have created a kids' computer section, thus installing filtering software on only some of their
computers. [FN156] Finally, and
probably most disturbing, some libraries have decided to discontinue the
availability of Internet access to their patrons altogether. [FN157]
2.
Industry Initiatives
In order to avoid more potential federal
regulation of Internet content, several initiatives have been undertaken by the
industry to regulate itself. Web‑based
search engines, including AltaVista® and Yahoo!®, have
established family friendly search engines, which purport to "reduce
objectionable content from your search results." [FN158] In addition, industry leaders are developing
rating systems *1154 to better allow
users to filter content based on their own opinions and values.
The World Wide Web Consortium has advanced a protocol, called the Platform for Internet Content Selection (PICS), to bring such ratings systems to fruition. [FN159]
The PICS is not a rating system in and of itself, "specif[ying]
little more than the syntax and protocols used to label content and transmit
the labels." [FN160] Essentially,
Web site publishers or third parties label content according to a standard
generated with the PICS. [FN161] Users then choose from an assortment of
existing ratings systems, [FN162] configure their browsers or PICS software to
use that ratings system and finally, determine the level of information they
want to allow. [FN163] Such ratings systems could conceivably allow users to
filter out not only violent and indecent materials, but also opinions with
which they do not agree. [FN164] The
dangers of such a system stem from the imperfections of ratings systems in
general, including imprecision and, of course, mislabeling by content creators.
[FN165] Moreover, labels are
subjective, and what might be obscene or indecent to one, may be art to
another. [FN166]
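The client side of the PICS scheme described above, in which the user configures the maximum level of labeled content to allow, can be sketched as follows. The category names and numeric levels are hypothetical stand‑ins for whatever rating system a user selects; PICS itself standardizes only the label syntax, not the categories.

```python
# A simplified sketch of how a PICS-aware browser might apply a user's
# chosen rating system. The categories ("language", "nudity", "violence")
# and levels are illustrative, not part of PICS itself.
def allowed(label_ratings: dict, user_limits: dict) -> bool:
    """Permit a page only if every rated category is at or below the
    maximum level the user has configured for that category."""
    return all(level <= user_limits.get(category, 0)
               for category, level in label_ratings.items())

# A parent permits mild language (level 1) but no nudity or violence.
limits = {"language": 1, "nudity": 0, "violence": 0}

print(allowed({"language": 1, "nudity": 0}, limits))  # True: within limits
print(allowed({"nudity": 2}, limits))                 # False: exceeds limit
```

Note that a category absent from the user's configuration defaults here to zero tolerance, a design choice that mirrors the imprecision problem the text raises: the outcome depends entirely on how the labels and thresholds are defined, and mislabeled or unrated content is handled only as well as those defaults allow.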
In September of 1999, representatives from
world governments, the Internet and media industries, as well as the Internet‑user
community, gathered in Munich, Germany to discuss the possibility of an
international rating scheme. [FN167] *1155 One proposal considered involves a
voluntary rating system by which content creators would label their materials.
[FN168] It further makes available to
the end user a choice of many blocking systems, reflecting the various
ideologies and cultures of the world. [FN169]
Some applaud the effort, believing it to not only enable parents to
protect their children more effectively, but also to stave off government
regulation. [FN170] Others, however,
see it as the first step towards a government‑mandated, uniform labeling
system in a space where the government should have no involvement. [FN171]
III.
PROTECTING CHILDREN'S PRIVACY
As one commentator has noted, "privacy
is central to freedom and autonomy and the rights of the individual to have a
private life." [FN172] Individuals
have become more concerned than ever over protecting their informational
privacy, and the widespread collection of such personal data over the Internet
into huge databases has only heightened the concern. [FN173] The gathering of personally‑identifiable
information regarding children has become an especially hot topic. Involved in the debate are privacy groups
who wish to protect children from exploitation, as well as those doing business
on the Web, who recognize the economic value in mining information regarding
this group to whom advertisers market heavily. [FN174]
*1156 A. Background
While no express grant of a right to privacy
can be found within the language of the Federal Constitution, the Supreme Court
has long recognized certain personal privacy rights. [FN175] Such protections prohibit government
intrusion into the lives of individuals concerning particular intimate
decisions. [FN176] Nonetheless, to
date, the Supreme Court has not acknowledged a right to have personal information
kept private, [FN177] although the Court did briefly address this issue in
Whalen v. Roe. [FN178] In dicta, the Court recognized that as technology allows
for the increased collection of personal information in computerized databases,
an implicit threat to privacy arises. [FN179]
While admitting that a privacy right may be implicated, the Court
declined to address the possible issues that might become manifest from the
unjustified disclosure of such information. [FN180]
In spite of, or perhaps due to, the Court's
unwillingness to interpret the Constitution as providing an individual right to
informational privacy, a patchwork of federal and state legislation has
emerged. In the public sector, the
Privacy Act of 1974, [FN181] the Computer Matching and Privacy Protection Act
(CMPPA) of 1988, [FN182] and the Freedom of Information Act (FOIA) of 1966,
[FN183] all serve to *1157 provide a
basis for privacy protection of information stored within government
files. The private sector, on the other
hand, remains largely unregulated in its collection and disclosure of personal
information. Statutes that do govern
private enterprise cover only specific industry sectors, [FN184] leaving other
areas of industry relatively uninhibited in their data collection practices.
[FN185] For example, the Fair Credit
Reporting Act (FCRA) of 1970 prevents private sector misuse of an individual's
informational privacy, but the FCRA does little to restrict the sale or
exchange of such personal data. [FN186]
A consumer report may be provided to anyone whom the consumer
reporting agency has reason to believe has "a legitimate business need for
the information." [FN187] Notice
to the individual of disclosure of a credit report, however, is required only
in limited situations, such as where it is used as a ground for the rejection
of a consumer‑initiated insurance, credit or employment benefit request.
[FN188]
State and common law have filled in some of
the gaps left by federal legislation.
Most notable is the 1972 amendment to the California State Constitution
providing an express, inalienable privacy right, [FN189] which includes
informational *1158 privacy [FN190]
and is enforceable against both private and public entities. [FN191] At common law, there are four invasion‑of‑privacy
torts, first articulated by William Prosser, [FN192] and adopted by the
Restatement (Second) of Torts. [FN193]
Still, these provide little assistance to one seeking to protect
personally identifiable information, because such a breach of privacy is
generally based upon the disclosure of facts that are highly offensive or
embarrassing. [FN194]
*1159 As citizens of the information age become ever more paranoid
that George Orwell's "Big Brother" [FN195] is developing into a
reality, increasing attention is being focused on protecting informational
privacy and, in particular, the informational privacy of children. Although Supreme Court cases, such as
Vernonia School District 47J v. Acton [FN196] and New Jersey v. T.L.O., [FN197]
recognized a child's right to privacy, these cases examined the issue of search
and seizure in schools. Indeed, in
these cases, the Court found that a child's right to privacy was actually
diminished in settings demonstrating "special needs," such as a
school. [FN198]
*1160 B. Efforts to protect children's
privacy online
In general, a child's informational privacy
has been regulated not in terms of a right belonging to the child, but rather
as an interest of the parent or guardian.
The Supreme Court, in Ginsberg v. New York, [FN199] interpreted the
Constitution to include a parent's right to control the raising of her
children. [FN200] Additionally, with
the Family Educational Right to Privacy Act (FERPA), Congress provided parents
with control over the collection, maintenance and use of data gathered and
contained in their children's educational records. [FN201]
On the Internet, the privacy of children is
not so easily granted or maintained.
Web site operators can seamlessly collect data, both openly and covertly.
[FN202] Many sites invite children to
sign a "guest book," while others require children to convey personal
information in order to participate in activities within the site or to
register for contests. [FN203] In
addition, the use of "cookies" permits an operator to follow a user's
clickstream, that is, to track the user as she browses through a site,
observing how long she stays, what links she follows, and how often she
returns. [FN204] Thus, a Web site
operator may compile multitudes of information *1161 about a person and her online behavior both with and without
her knowledge or consent. [FN205] More
importantly, a Web site operator may compile such information about an
unsuspecting child without the parent's knowledge or consent.
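The clickstream tracking described above can be sketched in a few lines. The following Python is an illustrative toy, not any actual operator's code; it shows only how a cookie identifier, once issued, lets a server link successive page requests into a single visitor's browsing record:

```python
import uuid

class ClickstreamTracker:
    """Toy sketch of cookie-based clickstream tracking, for
    illustration only (no real site's implementation)."""

    def __init__(self):
        self.visits = {}  # cookie_id -> list of pages viewed

    def handle_request(self, page, cookie_id=None):
        # First visit: the browser has no cookie, so the server
        # issues one (in practice, via a Set-Cookie header).
        if cookie_id is None or cookie_id not in self.visits:
            cookie_id = uuid.uuid4().hex
            self.visits[cookie_id] = []
        # Later requests carry the cookie back, letting the operator
        # append each page view to that visitor's clickstream.
        self.visits[cookie_id].append(page)
        return cookie_id

tracker = ClickstreamTracker()
cid = tracker.handle_request("/home")       # no cookie yet: one is issued
tracker.handle_request("/bands/jazz", cid)  # same cookie returned
tracker.handle_request("/contests", cid)
print(tracker.visits[cid])  # → ['/home', '/bands/jazz', '/contests']
```

Notably, nothing in this exchange requires the visitor's knowledge or consent, which is precisely the concern where the visitor is a child.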
Of course, personally identifiable
information is collected, bought and sold every day in the brick‑and‑mortar
world. Junk mail arrives in our
mailboxes daily from companies we have never heard of, let
alone given our names to, revealing the common practice of selling names
through mailing lists. We are inundated
with opportunities to apply for new credit cards, donate to charities and
purchase items from catalogues.
Information is valuable, especially to companies wishing to market their
products to the consumers most apt to make a purchase. [FN206] Despite this offline parallel, Internet
users seem to be especially protective of their personal information gathered
online. [FN207]
Data collection on the Internet does,
however, serve a non‑clandestine purpose, as it allows Web site operators
to better serve returning visitors. For example, by remembering the type of
bands a cyber guest researches when surfing through a music Web site, an
operator can alert that guest to upcoming concerts, new album releases and
general news concerning those bands through e‑mails. *1162 Alternatively, an
operator might program the home page to retrieve the user's preferences upon
her next visit and display such information.
The consumer benefits from this personalization because she avoids
clicking through a myriad of pages in which she has no interest. Instead, the
information she most desires is provided immediately to her.
The problem, however, and the basis for much
of the wariness associated with greater utilization of the Internet, is that
this data collection also allows for the aggregation of huge amounts of
personally‑identifiable information into digital databases, which can be
manipulated with considerable ease. [FN208] Not only is the amount of accumulated
information daunting, but the manner in which its collectors are controlling it
is cause for serious concern. [FN209]
Moreover, instances of inadvertent Web site mishaps have been known to
reveal personal data to the masses. [FN210]
In its report to Congress regarding online privacy, the Federal Trade
Commission (FTC) cautioned that while the online marketplace has the potential
for robust, worldwide economic activity, such activity would likely be stunted
without consumer confidence regarding the use of private information. [FN211]
Although the federal government has been
studying the need for adequate privacy controls on the Internet, its main focus
has been in the context of commerce. In
1995, the Clinton Administration's National Information Infrastructure Task
Force (NIITF), an interagency work group led by Vice President Al Gore, issued
its Principles for Providing and Using Personal Information, thereby seeking to
sustain "information privacy, information integrity, and information quality."
[FN212] Hoping to encourage industry
self‑regulation, the NIITF *1163
put forth general maxims by which all online companies should abide. [FN213]
*1164 The principles were meant to further the global electronic
marketplace and thus, did not specifically address the privacy needs of
children. [FN214] However, the NIITF
principles have served as a starting point for subsequent attempts to protect
informational privacy in general by focusing on notice and consent as the two
key elements. [FN215]
In July of 1997, under the leadership of
then Senior Advisor to the President for Policy Development, Ira Magaziner, the
NIITF published A Framework for Global Electronic Commerce (Framework).
[FN216] With the objective of fostering
"increased business and consumer confidence in the use of electronic
networks for commerce," [FN217] the NIITF again looked to the private
sector to monitor itself and provided broad guiding principles without
government involvement. [FN218] In
terms of privacy, the Framework recommended that Web site operators work to
assure personal privacy. [FN219]
Specifically,
[d]ata gatherers should tell consumers what
information they are collecting and how they intend to use it. Consumers should have meaningful choice with
respect to the use and re‑use of their personal information. Parents
should be able to choose whether or not personal information is collected from
their children. In addition, redress
should be available to consumers who are harmed by improper use or disclosure of
personal information or if decisions are based *1165 on inaccurate, outdated, incomplete or irrelevant personal
information. [FN220]
Compared to the self‑regulatory
approach of the United States, the European Union (EU) has taken a decidedly
more hands‑on stance. [FN221] In
October 1995, the Data Privacy Directive (EU Directive) was published, adopting
rules for the collection and use of the private information of EU citizens.
[FN222] Member States were given three
years in which to pass legislation implementing the Directive. [FN223] Aside from requiring unambiguous consent
before any personal information can be collected and used, [FN224] the EU
Directive demands "specified, explicit and legitimate purposes" for
the collection of such data. [FN225] Article 25 of the Directive, compelling
countries outside of the EU to guarantee "an adequate level of
protection" before any personal data will be transferred, has the most
direct impact on the United States. [FN226]
In response, on November 15, 1999, the U.S.
Department of Commerce issued a draft of International Safe Harbor Privacy
Principles (Draft) in an attempt to assist American companies in adhering to
the EU Directive. [FN227] As issued, *1166 the Draft focuses on notice,
choice of the subject to opt‑out of data disclosure, security of the
information, data integrity, access to one's own personal information and
enforcement to ensure compliance. [FN228]
The EU has not yet agreed to the Draft, because concern still exists
that the guidelines fail to meet the EU's standard of an "adequate"
level of control over personal information. [FN229] Recognizing that without the ability to gather data regarding EU
citizens U.S. industry will be significantly hampered, the Department of
Commerce is making progress in attempting to balance the European Union's
privacy concerns with those of American enterprise.
While the U.S. has generally allowed the
private sector to lead the way in its handling of online privacy issues, the
Administration has been willing to take a more active role when it comes to the
collection, use and exchange of personally identifiable information of children
under the age of thirteen. In general,
it is widely held that children in this age bracket do not understand the
ramifications of conveying personal data such as name, address and phone number
to unseen information compilers, [FN230] nor are they capable of legitimately
giving their consent for the use and dissemination of this information. [FN231]
The fear of exposing children to
exploitation is not unfounded. On
October 14, 1997, the FTC took a "snapshot" of Web sites geared
towards children to examine the level of informational privacy protection
accorded minors. [FN232] The results
were far from impressive. Of the 126
sites visited, eighty‑six percent were found to collect personal
information directly from children. [FN233]
Less than thirty percent of those Web sites furnished a privacy policy
outlining their information collection procedures, [FN234] and only four percent
provided a method for obtaining parental consent. [FN235]
*1167 The FTC conducted a second survey in 1998. [FN236] Of the 212 child‑directed Web sites
analyzed, eighty‑nine percent (188 sites) collected personal data.
[FN237] Forty‑two percent of
those (seventy‑nine sites) posted a privacy policy, and less than eleven
percent (twenty sites) attempted to secure permission from parents.
[FN238] While online business was
showing some improvement, it did not seem to be taking the initiative to voluntarily
regulate itself.
Indeed, the deceptive practices of Web site
operators, who collect and use the personally identifiable information of
children, were brought to the forefront through three FTC opinions. The first involved a petition from the
Center for Media Education (CME), alleging unfair and deceptive operation of
the Web site "KidsCom," a site directed at children and a self‑described
"[a] Communications Playground for kids ages 4 to 15." [FN239] The site required children to register, which
included the completion of a survey asking for such information as the child's
name, sex, birthday, e‑mail address, home address, number of family
members and grade. [FN240] Activity
sections also requested that children provide preferences regarding products
and activities. [FN241]
While the FTC declined to take action
against KidsCom due to the site's modification of its information collection
procedures, [FN242] the Commission used the opportunity to comment upon the
informational privacy of children and to
*1168 suggest steps to avoid potential, future law enforcement action.
[FN243] The FTC defined personally
identifiable information as including the child's name, e‑mail address,
home address, and/or telephone number. [FN244]
The Commission also identified parental consent and notice to parents
regarding the intended use of the child's information as the linchpins of fair
data collection practices. [FN245]
The second FTC enforcement action was
brought in 1998 against GeoCities, a popular California‑based company
known for providing such services as free e‑mail accounts to over two
million members, in addition to hosting and organizing members' personal home
pages into "virtual neighborhoods." [FN246] Members under the age of thirteen, numbering 50,000, were
encouraged to join the "Enchanted Forest," a community created
especially for children. [FN247]
Additionally, the "GeoKidz Club" allowed minor users to enter
special online contests. [FN248] *1169 Registration for membership, as
well as participation in each contest, required filling out online forms
requesting additional personal information. [FN249]
Although its privacy policy stated that
GeoCities would "NEVER give [a
customer's] information to anyone without [her] permission," [FN250] the
FTC alleged that in reality, GeoCities had sold, rented, or otherwise marketed
or disclosed this information, including information collected from children,
to third parties who ha[d] used this information for purposes other than those
for which members [had] given permission. [FN251]
In addition, GeoCities characterized itself
as the sole moderator of the Enchanted Forest, its accompanying promotions and
its data collection practices. [FN252]
According to the FTC, however, GeoCities played no part in these
activities. [FN253] In fact, third
parties directly collected and maintained the information gathered from child
members. [FN254] The FTC deemed these
acts unfair, deceptive and in violation of Section 5(a) of the Federal Trade
Commission Act. [FN255]
Ultimately, GeoCities settled the matter
with the FTC in a consent agreement reached in August of 1998, which ordered
GeoCities to discontinue its misrepresentations regarding the collection and
use of its members' personal information. [FN256] To comply with the order, GeoCities agreed to place a "clear
and prominent" notice to customers on its Web site, explaining its
procedures with respect to information collected online. [FN257] The order also prohibited the collection of
information from a minor when GeoCities had actual knowledge of a lack of
parental consent. [FN258] Thus,
GeoCities is now required to obtain express permission *1170 from a child's parent before any personally identifiable
information is collected. [FN259]
GeoCities was given the flexibility to comply with this provision by
alerting parents via e‑mail and advising them how to grant consent if
they so choose. [FN260] Information already obtained from minors, prior to the
order, was to be deleted unless the parent affirmatively gave permission for
its continued retention and use. [FN261]
In addition, the agreement required GeoCities to request that third
parties, to which information was previously disclosed, delete such information.
[FN262]
Most recently, the FTC lodged a complaint
against Liberty Financial Companies, Inc. (Liberty Financial), a Massachusetts
asset management corporation, for its deceptive practices in gathering
information from children. [FN263]
Liberty Financial invited minors to complete its "Young Investor
Measure Up Survey." [FN264] In
addition to eliciting the child's name and e‑mail address, the survey
requested financial information including the amount of the child's weekly
allowance, the types of financial gifts the child received (e.g., stocks, bonds
and mutual funds), spending habits, plans for college, and family finances. [FN265] In exchange for this information, the child
was promised an e‑mail newsletter. [FN266] Participants were further urged to register for a quarterly
drawing to win a prize. [FN267]
Three misrepresentations were cited by the
Commission's complaint. First, Liberty
Financial had claimed that any data obtained would remain "totally
anonymous," meaning that it would be impossible to determine who had
provided *1171 what information.
[FN268] All data collected, however,
were stored in one central database. [FN269]
Thus, responses to the "Young Investor Measure Up Survey"
could, in fact, be connected to the names and e‑mail addresses of those
who had provided the information. [FN270]
Second, no e‑mail newsletter was ever sent to the survey
participants as promised. [FN271]
Lastly, no prizes were ever distributed in conjunction with the
quarterly drawings. [FN272] In fact,
there was no evidence that the drawings ever took place. [FN273]
Liberty Financial agreed to a consent order
similar to that of GeoCities.
[FN274] The order prohibited
Liberty Financial from making future misrepresentations regarding their data
collection and maintenance practices for children under the age of eighteen.
[FN275] As in the GeoCities matter,
when actual knowledge exists that a child under the age of thirteen does not
have a parent's consent to supply personal information, such information may
not be collected. [FN276] In addition,
any *1172 information collected from
minors prior to the agreement must be deleted. [FN277] Finally, a "clear and prominent"
statement of the Web site's privacy policy must be displayed in areas directed
at children. [FN278] This statement
must include "what information is collected, its intended uses, to whom it
will be disclosed, and the means by which a parent can access and remove the
information that has been collected." [FN279]
Dissatisfied with the online business
community's attempts toward self‑regulation, Congress enacted the
Children's Online Privacy Protection Act of 1998 (COPPA). [FN280] The COPPA seeks to protect children under
the age of thirteen from unfair and deceptive conduct in the collection of
personally identifiable information over the Internet. [FN281] It codifies the "actual knowledge"
standard, as articulated by the FTC in its opinions regarding KidsCom,
GeoCities and Liberty Financial. [FN282]
Thus, when a general‑audience Web site attempts to collect
information from a minor and actual knowledge exists that the parent has not
granted consent, such collection is impermissible. [FN283] Finally, the COPPA directs the FTC to promulgate
regulations implementing the Act. [FN284]
One year later, on October 20, 1999, the FTC
released its final version of the rule implementing the COPPA. [FN285] The COPPA Rule is scheduled to take effect *1173 on April 21, 2000, thereby
giving Web site operators six months to comply. [FN286] Five general requirements of Web sites or
online services directed to children are outlined. According to the COPPA Rule, an operator [FN287] must:
(a) Provide notice on the website or online
service of what information it collects from children, how it uses such
information, and its disclosure practices for such information; [FN288]
(b) Obtain verifiable parental consent prior
to any collection, use, and/or disclosure of personal information from
children; [FN289]
(c) Provide a reasonable means for a parent
to review the personal information collected from a child and to refuse to
permit its further use or maintenance; [FN290]
(d) Not condition a child's participation in
a game, the offering of a prize, or another activity on the child disclosing
more personal information than is reasonably necessary to participate in such
activity; [FN291] and
(e) Establish and maintain reasonable
procedures to protect the confidentiality, security, and integrity of personal
information collected from children. [FN292]
The COPPA Rule provides for a "sliding
scale" approach to verifiable parental consent which will sunset in April
of 2002, two years after its effective date. [FN293] Under this approach, the amount of information collected, and the
use of such information, will determine the level of permission required.
[FN294] If a child *1174 wishes to take part in a chat room discussion or other
activity involving disclosures, a more reliable method of consent will be
required, including credit card or print‑and‑send verification.
[FN295] If, on the other hand, information will be used only internally, an e‑mail
from the parent will suffice to signify consent as long as the operator can
verify that the e‑mail was not falsified. [FN296]
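The sliding scale thus amounts to a mapping from the intended use of the information to the strength of consent required. The following Python sketch is a deliberate simplification for illustration; the Rule itself (16 C.F.R. pt. 312) contains further conditions and exceptions:

```python
def required_consent(information_use: str) -> str:
    """Simplified sketch of the COPPA Rule's pre-April-2002
    'sliding scale' of verifiable parental consent."""
    if information_use == "disclosure":
        # Chat rooms and other activities that reveal a child's
        # information to others demand a more reliable method.
        return "credit-card or print-and-send verification"
    if information_use == "internal":
        # Purely internal use may rest on a parent's e-mail reply,
        # provided the operator can verify it was not falsified.
        return "verified parental e-mail"
    raise ValueError(f"unrecognized use: {information_use!r}")

print(required_consent("internal"))  # → verified parental e-mail
```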
The COPPA Rule does carve out a few
exceptions, however, to allow children to communicate without parental consent.
[FN297] Children may, for example, send
questions via e‑mail to online companies and enroll to receive e‑mailed
newsletters. [FN298] In addition, Web
sites offering chat rooms may avoid the verifiable parental consent requirement
by adopting a practice of removing all personal information from the postings
of a minor and deleting such information from the site's databases.
[FN299] These exceptions will allow a
child to utilize the interactive capabilities of the Internet while at the same
time avoiding exploitation.
The COPPA Rule further provides a safe
harbor for operators who adopt approved self‑regulatory guidelines.
[FN300] TRUSTe and BBBOnline (R) are
the most publicized independent self‑regulatory programs that might
qualify for safe harbor status. TRUSTe
is a non‑profit organization that has worked since 1996 to become one of
the pre‑eminent "seals of approval" for Web site privacy
policies. [FN301] Sites that comply
with TRUSTe's privacy principles are licensed to display *1175 a "trustmark," [FN302] which links directly to
that Web site's privacy statement. [FN303]
If a Web site is geared towards children under thirteen, the site must
also meet TRUSTe's seal requirements for children. [FN304] Generally, these requirements mirror those
of the COPPA Rule, demanding notice of privacy policies, the ability to access
personal information and verifiable parental consent before the collection and/or
distribution of a minor's data. [FN305] TRUSTe's program is slightly stricter than the COPPA Rule,
however, in that it also prohibits the offline collection of identifiable
information from children without prior verifiable parental consent. [FN306] Monitoring its licensees through periodic
reviews, TRUSTe also provides consumers with a resolution process in the event
that a licensee violates its posted privacy policy. [FN307] Microsoft (R), AOL (R), eBay (R) and Yahoo!
(R) are listed as some Web publishers meeting TRUSTe's standards. [FN308]
BBBOnline (R) is a competitor program
launched in March of 1999 by the Council of Better Business Bureaus.
[FN309] Like TRUSTe, licensees of
BBBOnline (R) are permitted to display the BBBOnline (R) privacy seal. [FN310] To display the BBBOnline (R) Kid's Privacy
seal, sites designed for children under thirteen, and those which collect
information from minors known to *1176
be under thirteen, must also file a separate Children's Supplemental Assessment
Questionnaire. [FN311] In addition to
random, surprise audits of its licensees' Web sites, the seal program further
requires participants to adhere to the BBBOnline (R) Dispute Resolution Policy
and abide by the decisions reached. [FN312]
As of September 1999, however, only two of the eight claims received
[FN313] have been found eligible for review. [FN314]
Both the BBBOnline (R) and TRUSTe programs
are gaining influence in the industry. [FN315]
Nevertheless, neither organization has the ability to punish
violators beyond revoking the right to display its seal. [FN316] For example, the *1177 purpose of BBBOnline (R)'s Privacy Policy Review Service in
its Dispute Resolution Process is "for determining the eligibility of a
complaint and evaluating, investigating, analyzing, and making a decision on
the merits of an eligible complaint." [FN317] However, privacy violations under the BBBOnline (R) regimen must
be reported to the FTC to initiate any legal action. [FN318] The same holds true for TRUSTe's program.
[FN319] In addition, at least one
industry expert has expressed concern that the seal programs will not be
completely understood by consumers and will only serve to provide a "false
sense of consumer security . . . tend[ing] to raise a false expectation of
privacy." [FN320]
The use of filtering software may enable the
consumer to take a more active role in the protection of her informational
privacy. Currently under development is
the Platform for Privacy Preferences, better known as P3P, [FN321] a project of
the World Wide Web Consortium (W3C). [FN322]
P3P would allow users to specify and automate their desired privacy
policies. [FN323] It would then
identify Web sites having policies matching the preferences of the user, thereby
sparing users from sifting through endless privacy statements in search of those
conforming to the *1178 user's
wishes. [FN324] Furthermore, with the
installation of P3P, parents may determine, in advance, how much information
their children may disclose to Web sites, as well as how much information Web
sites may disclose, if any, to third parties. [FN325]
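The matching step that P3P would automate can be illustrated with a toy preference check. The practice names below are hypothetical stand-ins, not the actual P3P vocabulary, which was still in working-draft form at the time:

```python
# A P3P-style user agent compares a site's declared practices against
# the user's stated preferences before any data is exchanged.

def policy_acceptable(site_policy: dict, user_prefs: dict) -> bool:
    """Accept the site only if every practice the user cares about
    is one she has said she will tolerate (for booleans, a declared
    True practice fails against an allowed value of False)."""
    return all(
        site_policy.get(practice, False) <= allowed
        for practice, allowed in user_prefs.items()
    )

prefs = {"share_with_third_parties": False, "collect_child_data": False}
site_a = {"share_with_third_parties": False, "collect_child_data": False}
site_b = {"share_with_third_parties": True,  "collect_child_data": False}

print(policy_acceptable(site_a, prefs))  # → True
print(policy_acceptable(site_b, prefs))  # → False
```

In practice the browser, not the user, would perform this comparison on every visit, which is what makes the approach workable for a parent configuring limits on a child's disclosures in advance.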
A P3P‑compatible Web browser would be
necessary to enjoy the benefits of P3P. [FN326] More importantly, Web sites would have to incorporate P3P
specifications into their sites. [FN327]
For this purpose, AT&T has developed a Proposal Generator based on
the W3C's latest working draft, which would allow Web site operators the
ability to easily generate a privacy statement in P3P code. [FN328]
A 1999 two‑week survey conducted by
the Center for Media Education (CME) demonstrates that while the most‑visited
children's Web sites are improving their informational privacy practices,
others are not taking the concerns seriously.
The CME reviewed 155 Web sites, consisting of a random sampling of
seventy‑five sites and a second sample of eighty of the most popular
children's sites. [FN329] The random
sample revealed that ninety‑five percent (seventy‑one sites)
collected personally identifiable information from children. [FN330] Of those, less than six percent attempted to
obtain any kind of consent from parents, and only twenty‑seven percent
posted a privacy policy. [FN331] The
sample of popular children's Web sites fared slightly better. Of the eighty sites observed, eighty‑eight
percent (seventy sites) were found to collect information from minors.
[FN332] Of those seventy sites, less
than twenty‑six percent (eighteen sites) attempted to obtain parental
consent, but almost three‑quarters (fifty‑two sites) posted a
privacy policy. [FN333] When compared
with the 1997 and 1998 FTC surveys, *1179
the progress is significant. Still,
there is much room for improvement.
IV.
CONCLUSION
Governing the Internet has proven to be no
easy matter. While the intentions of
legislators may be noble, their solutions are short‑sighted. Congress has jumped too quickly in
attempting to net the Net before its potential has been fully realized. Former Representative Rick White of
Washington, who before losing his re‑election bid was known as one of the
most "Net‑friendly" legislators in the House of
Representatives, [FN334] has noted that Congress' view is too high‑level.
[FN335] Because members of Congress
have to know a little about many topics, it is impossible for them
to fully understand the intricacies of that which they seek to regulate.
[FN336]
For example, despite improvements over its
predecessor, the Children's Online Protection Act ("COPA") [FN337]
will likely meet the same fate as that of the Communications Decency Act
("CDA"). [FN338] The Third
Circuit will most likely find that the COPA suffers from many of the same
problems as the CDA, thereby affirming the ruling of Judge Reed, of the Eastern
District of Pennsylvania, that the COPA is unconstitutional because it still
unduly burdens protected speech. [FN339]
Additionally, methods of adult verification
called for by the COPA are prohibitively expensive. Therefore, potential speakers will be deterred from publishing
questionable material on the Web. By
the same token, adult users may be deterred from accessing information behind
verification walls due to their desire to keep credit card numbers or other
required information confidential.
Moreover, the COPA only applies to commercial Web sites. [FN340] Since non‑commercial, as well as
foreign‑based, sites will not fall under the scope of the Act, the COPA
will prove to be an ineffective method by which to protect minors from harmful *1180 materials. Furthermore, the COPA is not the least
restrictive means available. Tools
empowering parents to filter out material that they find objectionable on their
own computers may better serve the needs of families without government
intervention. Finally, the COPA does
not specify which community's standards would apply in deciding whether
material is harmful to minors. [FN341]
During oral arguments, the Third Circuit voiced concern over the
implications of international transmissions imposing United States values on
foreign countries or vice versa. [FN342]
Aside from being unconstitutional, the COPA,
as with any Internet legislation at the national level, "fails to
recognize the international nature of the Internet and electronic
commerce." [FN343] In his
statement to Congress regarding the COPA, Representative White asserted that it
would "breed a false sense of security" in American Internet users
because, in practice, it would not cover commercial sites residing
internationally. [FN344] Additionally,
he surmised that the bill as written might "lock us into the wrong
technology [for shielding minors from exposure to sexually explicit materials],
technology that is obsolete and will not do as good a job as technology that
might come along in the future." [FN345]
Mandating filters on public library and
school computers is an issue that will undoubtedly continue to cause
controversy. The Mainstream Loudoun
[FN346] decision *1181 was just the
first step in attempting to determine how best to protect children using
federally funded Internet access. [FN347] Kathleen R.'s appeal [FN348] will
likely be dismissed, as were her first two attempts at a court‑ordered
filtering policy, because, although she has presented an imaginative cause of
action, the Fourteenth Amendment [FN349] simply does not apply in Kathleen R.'s
case. A library's decision not to
install filters onto its computers does not constitute a state action violating
the rights of its patrons as prohibited by the Fourteenth Amendment. [FN350] Even
conservative groups seem opposed to obligating public libraries and schools to
install filtering software. [FN351] A
letter written to Representative Thomas Bliley, Chairman of the House Commerce
Committee, reflects the view of fourteen such groups [FN352] concerned about
the implications of the Istook Amendment. [FN353] While the letter "applaud[s] Mr. Istook for his moral
concern over this issue," [FN354] it states that private industry is
better equipped to cope with the situation at a local level and, in fact, the
Internet community has already begun to implement *1182 procedures to protect children online. [FN355]
Interestingly, while the government has
introduced a proliferation of bills mandating policies to curb indecency on the
Internet, its solution to the issue of protecting children's privacy was the
result of compromise with the industry.
The Children's Online Privacy Protection Act [FN356] and its
accompanying FTC Rule [FN357] have been lauded by privacy groups and industry
insiders alike. [FN358] Why the difference
in strategy? It might be true that the sight of a nude woman in a suggestive
pose has a more direct and immediate impact on a child's well‑being than the
misuse of the same child's private information. Perhaps the more likely explanation, however, is that the
government finds it more important to foster a robust online marketplace
than a robust marketplace of ideas. The
approach taken in formulating the COPPA Rule could be emulated in regulating
sexually explicit material on the Web.
A little bit of legislation can go a long way. [FN359] Too much legislation can be stifling.
An important outgrowth of the effort to
protect children online is recognizing how the safeguards implemented will
hinder their freedom of speech. While
the First Amendment rights of children are not "co‑extensive with
those of adults," [FN360] children should not be hindered in their use and
enjoyment of the Internet. The COPPA
Rule commands that Web sites that require children to register obtain
verifiable parental consent before such information is collected. [FN361] Silence is therefore not enough. Parents must affirmatively express their
permission before a child can register and enjoy Web site offerings including *1183 chat capabilities. [FN362]
However, children may decide to surf away rather than take the time to get the
permission necessary for them to gain access to the Web site. Children are thereby denied the opportunity
to view whatever unique information may be presented on the Web. Additionally, children may want to access
material without alerting their parents, especially when the information sought
is of such a private nature that, due to fear or possibly embarrassment,
children do not want their parents to know.
Children under the age of thirteen are part
of a generation that will not remember life before the World Wide Web. Similar to those born after the advent of
the telephone, who cannot imagine an existence without the ability to
communicate with family and friends by simply picking up a handset and dialing,
the children of today will be accustomed to immediate communication through e‑
mail and instant messaging. Their
development and future success will no doubt be dependent upon their ability to
utilize these ever‑evolving new technologies. Additionally, the wealth of information to which children have
access today is far broader than any set of encyclopedias that parents of
yesterday might have bought for their children. An understanding of the possibilities awaiting them on the
Internet will enrich a child's learning experience and create a desire for
knowledge.
Steps taken to protect children from
sexually explicit material, as well as from exploitation of their informational
privacy, may, in the long run, hinder their ability to receive information and
use the Internet to its fullest capacity.
Parental supervision is still the most effective method of overseeing a
child's Internet use, [FN363] but parents cannot always be present. Thus, future guidelines and legislation
should recognize that in the process of protecting the privacy and safety of
children, any interference with their right and ability to communicate should
be carefully avoided.
[FN1].
See CyberAtlas, Internet Becoming a Daily Essential (last modified Apr. 7,
1999) <http://cyberatlas.internet.com/big_
picture/demographics/article/0,1323,5901_150321,00.html> (citing Strategis
study estimating the number of Internet users as more than one hundred
million).
[FN2].
See The Strategis Group, U.S. Internet Breaks The 100 Million Mark (visited Jan. 17, 2000)
<http://www.strategisgroup.com/press/pubs/iut99.html>.
[FN3].
CyberAtlas, supra note 1 (quoting Jeff Moore, a Strategis Internet Consultant).
[FN4].
47 U.S.C.A. § 231 (West 1999) [hereinafter "COPA" ].
[FN5].
15 U.S.C.A. §§ 6501‑6506 (West 1999) [hereinafter "COPPA" ].
[FN6].
16 C.F.R. § 312 (1999) [hereinafter "COPPA Rule" ].
[FN7].
A top level domain (TLD) is made up of the letters coming after the "dot" in a domain name. See Heather N. Mewes, Memorandum of
Understanding on the Generic Top‑Level Domain Name Space of the Internet
Domain Name System, 13 Berkeley Tech. L.J. 235, 236 (1998). Examples include ".com" and ".net." See id.
In Brittany's case, ".gov" would have been the appropriate TLD
signifying a government Web site.
Brittany instead typed in ".com", which took her to a
commercial Web site.
[FN8].
See WhiteHouse.com (visited Jan. 20, 2000) <http:// www.whitehouse.com>.
[FN9].
Id.
[FN10].
WhiteHouse.com, First Ladies (visited Jan. 20, 2000) <http://
www.whitehouse.com/tour1_ladies.html>.
[FN11].
See Yahoo! Search Engine (visited Jan. 20, 2000) <http://
www.yahoo.com/>.
[FN12].
American Civil Liberties Union ("ACLU") v. Reno, 929 F.Supp. 824, 842
(E.D. Pa. 1996).
[FN13].
See Sarah E. Warren, Filtering Sexual Material on the Internet: Public
Libraries Surf the Legal Morass, 73 Fla. B. J. 52, 53 (Oct. 1999) (quoting
Jeannette Allis Bastian, Filtering the Internet in American Public Libraries:
Sliding Down the Slippery Slope, First Monday Internet Journal (1997)). Ms. Bastian remarked, "[c]an and should
the Internet be censored by filtering is a question bedeviling thousands of
public librarians who have rushed to embrace this seemingly limitless and
economical information source only to find that it includes a distinctly dark
and dirty side." Id.
[FN14].
U.S. Const. amend. I (providing "Congress shall make no law... abridging
the freedom of speech").
[FN15].
See Schenck v. U.S., 249 U.S. 47 (1919) (stating that not all speech is
constitutionally protected); see also New York v. Ferber, 458 U.S. 747 (1982)
(holding that child pornography, even if not legally obscene, is nevertheless
unprotected speech); Brandenburg v. Ohio, 395 U.S. 444 (1969) (concluding that
language directed at inciting imminent illegal conduct, which is likely to
produce such conduct does not fall under First Amendment protection); New York
Times v. Sullivan, 376 U.S. 254 (1964) (deciding that defamatory statements
made with knowing or reckless falsity are afforded no constitutional
protection); Roth v. U.S., 354 U.S. 476 (1957) (stating that obscenity is not
within the area of constitutionally protected speech); Chaplinsky v. New
Hampshire, 315 U.S. 568 (1942) (finding that fighting words are not protected
under the First Amendment).
[FN16].
See Kleindienst v. Mandel, 408 U.S. 753, 775 (1972) (Marshall, J., dissenting)
(arguing that the First Amendment forbids the federal government from denying
American citizens the opportunity to hear even the words of one with whose
political views the Government disapproves); Stanley v. Georgia, 394 U.S. 557,
565 (1969) (opining that "[i]f the First Amendment means anything, it
means that a State has no business telling a man, sitting alone in his own
house, what books he may read or what films he may watch"); Griswold v.
Connecticut, 381 U.S. 479, 482 (1965) ("The right of freedom of speech and
press includes not only the right to utter or to print, but the right to
distribute, the right to receive, the right to read... ").
[FN17].
Generally, categories of expression will be deemed unprotected if they
"are of no essential part of any exposition of ideas [and] of... slight
social value as a step to truth that any benefit that may be derived from them
is clearly outweighed by the social interest in order and morality."
Chaplinsky v. New Hampshire, 315 U.S. 568, 572 (1942).
[FN18].
See Warren supra note 13 and accompanying text. Regulation directed at the communicative impact of an expression
requires a court to first determine whether the speech is protected or
unprotected. If protected, the
regulation is presumed unconstitutional and a strict scrutiny analysis is
applied. The regulation will be
sustained only if it serves a compelling government interest and is necessary
in that it is narrowly tailored to achieve that objective. See Widmar v. Vincent, 454 U.S. 263, 269‑70
(1981) (finding that legislative acts limiting unprotected expression are
subject to a lower level of scrutiny); but see R.A.V. v. City of St. Paul, 505
U.S. 377, 382‑84 (1992) (holding that while the government may proscribe
unprotected speech, disagreement with the viewpoint of such speech is not a
valid motive for doing so).
[FN19].
See Roth, 354 U.S. at 484 (stating that "implicit in the history of the
First Amendment is the rejection of obscenity as utterly without redeeming
social importance"); see also Miller v. California, 413 U.S. 15, 20 (1973)
(refining the obscenity standard to the definition used today); Chaplinsky, 315
U.S. at 571‑72 (noting in dicta that the "prevention and
punishment" of "lewd and obscene" expression does not violate
the First Amendment).
[FN20].
352 U.S. 380 (1957).
[FN21].
See id.
[FN22].
Id. at 381.
[FN23].
See id. at 382.
[FN24].
See id. at 383.
[FN25].
See Butler, 352 U.S. at 383.
[FN26].
390 U.S. 629 (1968).
[FN27].
See id. at 637. The statute defined
"harmful to minors" as:
... that quality of any description or
representation, in whatever form, of nudity, sexual conduct, sexual excitement,
or sado‑masochistic abuse, when it:
(a) Considered as a whole, appeals to the
prurient interest in sex of minors; and
(b) Is patently offensive to prevailing
standards in the adult community as a whole with respect to what is suitable
material for minors; and
(c) Considered as a whole, lacks serious
literary, artistic, political and scientific value for minors.
N.Y.
Penal Law § 235.20 (McKinney 1999).
[FN28].
See Ginsberg, 390 U.S. at 639‑40.
[FN29].
See id. at 634.
[FN30].
See id.
[FN31].
438 U.S. 726 (1978).
[FN32].
The Court defined "indecent" as that language not conforming to
generally held standards of morality, but not necessarily appealing to a
prurient interest. See id. at 740.
[FN33].
See FCC v. Pacifica Found., 438 U.S. 726 (1978).
[FN34].
See id. at 729‑30.
[FN35].
See id.
[FN36].
See id. at 731.
[FN37].
The broadcasting spectrum is limited.
See id. at 731 n.2. Only a
certain number of broadcasters may occupy the airwaves and use them to reach
listeners. See id. Thus, the government may allocate the
spectrum through licenses to those who are deemed to serve the public
interest. See id. See also Red Lion Broad., Co. v. FCC, 395
U.S. 367, 394 (1969) (determining that the FCC's practice of licensing
broadcasters for the public interest was consistent with the First Amendment
due to spectrum scarcity).
[FN38].
See Pacifica, 438 U.S. at 731.
[FN39].
See Cohen v. California, 403 U.S. 15, 21 (1971) (holding that the proscription
of speech is unwarranted where unwilling viewers or listeners may "avert[
] their eyes" from the offensive material).
[FN40].
See Pacifica, 438 U.S. at 732.
[FN41].
See id. at 731 n.2.
[FN42].
See, e.g., Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 637‑41 (1994) (finding that cable broadcasts should
receive the full constitutional protection of the First Amendment).
[FN43].
518 U.S. 727 (1996) (plurality opinion).
[FN44].
Id. at 743.
[FN45].
See id. at 748.
[FN46].
See id. at 783 (Kennedy, J., concurring and dissenting in part).
[FN47].
See id. (Kennedy, J., concurring and dissenting in part).
[FN48].
See Sable Communs. of Cal., Inc. v. FCC, 492 U.S. 115 (1989).
[FN49].
See id. at 126.
[FN50].
See id.
[FN51].
See id.
[FN52].
See id. at 128. One must affirmatively
pick up the phone and dial in order to receive the indecent messages with
"dial‑a‑porn," whereas a radio broadcast may invade a
listener's home without warning. See
id.
[FN53].
See id. at 126.
[FN54].
The Internet is not based on a limited broadcast spectrum and therefore, does
not require allocation by the government.
Anyone with a computer and an imagination may set up a home page and
make her ideas available to the masses.
[FN55].
See ACLU v. Reno, 929 F.Supp. 824, 844 (E.D. Pa. 1996).
[FN56].
See id. at 845.
[FN57].
47 U.S.C.A. § 223 (Supp. II 1996) (repealed 1997).
[FN58].
See Pub. L. No. 104‑104, 110 Stat. 56 (codified as amended in scattered
sections of 47 U.S.C. and 15 U.S.C.).
[FN59].
47 U.S.C.A. § 223(a) (1) (B).
[FN60].
47 U.S.C.A. § 223(d) (1) (B).
[FN61].
47 U.S.C.A. § 223(e) (5) (A).
[FN62].
47 U.S.C.A. § 223(e) (5) (B).
[FN63].
Dan L. Burk, Panel I: Direct/Government Regulation of the Internet, 15 N.Y.L.
Sch. J. Hum. Rts. 1, 56 (1998).
[FN64].
ACLU v. Reno, 929 F.Supp. 824, 857 (E.D. Pa. 1996).
[FN65].
See Reno v. ACLU, 521 U.S. 844, 864 (1997).
[FN66].
See id. at 883.
[FN67].
See id. at 870‑71.
[FN68].
Id.
[FN69].
Id. at 875.
[FN70].
See id. at 875.
[FN71].
See Reno, 521 U.S. at 876.
[FN72].
A commercial speaker is one who publishes for a profit. As written, the CDA made liable all non‑profit
Web publishers, including individuals.
See id. at 877.
[FN73].
See id. at 876‑77.
[FN74].
See id.
[FN75].
See id.
[FN76].
See Reno, 521 U.S. at 876‑77.
[FN77].
47 U.S.C.A. § 231 (West 1999).
[FN78].
The plaintiffs included Androgyny Books, Inc. d/b/a/ A Different Light Book
Stores, American Booksellers Foundation for Free Expression, Artnet Worldwide
Corporation, Blackstripe, Addazi Inc., d/b/a Condomania, Electronic Frontier
Foundation, Electronic Privacy Information Center, Free Speech Media, Internet
Content Coalition, OBGYN.net, Philadelphia Gay News, Planetout Corporation,
Powell's Bookstore, Riotgrrl, Salon Internet, Inc. and West Stock, Inc. See Reno v. ACLU, 521 U.S. 844, 864 (1997).
[FN79].
ACLU v. Reno, 1998 WL 813423, at *3 (E.D. Pa. 1998).
[FN80].
Id. at *3‑4.
[FN81].
See ACLU v. Reno, 31 F.Supp.2d 473, 498‑99 (E.D. Pa. 1999).
[FN82].
See COPA, 47 U.S.C.A. § 231(a) (1).
COPA defines "harmful to minors" as:
any communication, picture, image, graphic
image file, article, recording, writing, or other matter of any kind that is
obscene or that ‑
(A) the average person, applying
contemporary community standards, would find, taking the material as a whole
and with respect to minors, is designed to appeal to, or is designed to pander
to, the prurient interest;
(B) depicts, describes, or represents, in a
manner patently offensive with respect to minors, an actual or simulated sexual
act or sexual contact, an actual or simulated normal or perverted sexual act,
or a lewd exhibition of the genitals or post‑pubescent female breast; and
(C) taken as a whole, lacks serious
literary, artistic, political, or scientific value for minors.
47
U.S.C.A. § 231 (e) (6).
[FN83].
Id. The COPA defines "commercial
purposes" as:
A person shall be considered to make a
communication for commercial purposes
only if such person is engaged in the
business of making such communications.
47
U.S.C.A. § 231(e) (2) (A). The COPA defines "engaged in the business"
as:
... the person who makes a communication, or
offers to make a communication, by means of the World Wide Web, that includes
any material that is harmful to minors, devotes time, attention, or labor to
such activities, as a regular course of such person's trade or business, with
the objective of earning a profit as a result of such activities (although it
is not necessary that the person make a profit or that the making or offering
to make such communications be the person's sole or principal business or
source of income). A person may be
considered to be engaged in the business of making, by means of the World Wide Web,
communications for commercial purposes that include material that is harmful to
minors, only if the person knowingly causes the material that is harmful to
minors to be posted on the World Wide Web or knowingly solicits such material
to be posted on the World Wide Web.
47
U.S.C.A. § 231(e) (2) (B).
[FN84].
See 47 U.S.C.A. § 231(b).
[FN85].
See 47 U.S.C.A. § 231(a) (1) ‑ (3).
[FN86].
See 47 U.S.C.A. § 231(c) (1) ‑ (3).
[FN87].
See COPA, 47 U.S.C.A. § 231(c) (1) ‑ (3); see also CDA, 47 U.S.C.A. §
223(e) (5) (A) (Supp. II 1996) (repealed 1997).
[FN88].
See ACLU v. Reno, 31 F.Supp.2d 473, 498 (E.D. Pa. 1999).
[FN89].
969 F.Supp. 160 (S.D.N.Y. 1997).
[FN90].
See id. at 163. An individual was in
violation of the law when:
Knowing the character and content of the
communication which, in whole or in part, depicts actual or simulated nudity,
sexual conduct or sadomasochistic abuse, and which is harmful to minors, [to]
intentionally use any computer communication system allowing the input, output,
examination or transfer, of computer data or computer programs from one
computer to another, to initiate or engage in such communication with a person
who is a minor.
Id.
at 163 (citing challenged amendment to N.Y. Penal Law § 235.21 (McKinney 1999)).
[FN91].
See U.S. Const. art. I, § 8, cl. 3 (granting Congress the power "[t]o regulate Commerce with foreign
Nations, and among the several States, and with the Indian Tribes"). Congress' power to regulate interstate
commerce pre‑empts the ability of the individual States to do so.
[FN92].
See Pataki, 969 F.Supp. at 169.
[FN93].
See id. The court defined interstate
communications via the Internet as commerce for purposes of the Commerce
Clause. See id. at 173.
[FN94].
See id. (explaining that the statute,
in attempting to enforce New York law upon communications originating outside
of New York, subjected Internet users to "inconsistent regulations,"
thereby burdening interstate commerce).
[FN95].
See id. at 182. "The Internet,
like... rail and highway traffic..., requires a cohesive national scheme of
regulation so that users are reasonably able to determine their
obligations." Id.
[FN96].
See ACLU v. Johnson, 194 F.3d 1149, 1152 (10th Cir. 1999).
[FN97].
See id.
[FN98].
See id. at 1160.
[FN99].
Id. at 1162; see also Am. Library Ass'n ("ALA") v. Pataki, 969
F.Supp. 160, 179 (S.D.N.Y. 1997).
[FN100].
See Bd. of Educ. v. Pico, 457 U.S. 853, 866 (1982) (holding that "the
State may not, consistently with the spirit of the First Amendment, contract
the spectrum of available knowledge").
[FN101].
See American Library Association/OITP, The 1998 National Survey of U.S. Public
Library Outlet Internet Connectivity (1998)
(last modified Oct. 22, 1999)
<http://www.ala.org/oitp/survey98.html>.
[FN102].
See ALA, Access to Electronic Information, Services, and Networks: an
Interpretation of the Library Bill of Rights (last modified Nov. 11, 1999) <
http://www.ala.org/alaorg/oif/electacc.html> (adopted by the ALA Council
Jan. 24, 1996).
[FN103].
Article V of the Library Bill of Rights provides "[a] person's right to
use a library should not be denied or abridged because of origin, age,
background, or views." ALA,
Library Bill of Rights of the American Library Association (visited Jan. 22,
2000) <http://www.ala.org/work/freedom/lbr.html>.
[FN104].
In loco parentis is defined as "in the place of a parent; charged,
factitiously, with a parent's rights, duties, and responsibilities." Black's Law Dictionary 708 (6th ed. 1991).
[FN105].
See ALA, Access for Children and Young People to Videotapes and Other Nonprint
Formats (visited Jan. 22, 2000) <http://
www.ala.org/alaorg/oif/acc_chil.html>.
[FN106].
See Warren, supra note 13, at 53 (stating that as of the summer of 1998,
fifteen percent, or 1,700 libraries, used filtering software, and of those,
over half had filters on all of their computers).
[FN107].
See PeaceFire, Blocking Software FAQ (visited Jan. 20, 2000) <
http://www.peacefire.com/info/blocking‑software‑faq.html>.
[FN108].
See id.
[FN109].
See id.
[FN110].
By trial and error, PeaceFire has come up with a list of sites which are, or
were at one time, blocked by filtering software. Under Cyber Patrol, for example, Web sites for Planned Parenthood
and freedom of expression advocates, as well as newsgroups discussing atheism
and feminism, are listed as blocked.
See PeaceFire, Cyber Patrol Examined (visited Jan. 22, 2000) <http://
www.peacefire.com/censorware/Cyber_Patrol/>.
[FN111].
24 F.Supp.2d 552, 556 (E.D. Va. 1998).
[FN112].
Id.
[FN113].
See id. at 570.
[FN114].
See id. at 564.
[FN115].
See id. A law that regulates the
activity of speaking for reasons other than the speech's content is considered
"content‑neutral." See
City of Renton v. Playtime Theatres, Inc., 475 U.S. 41, 47 (1986). It is less offensive to the First Amendment
because the law does not seek to restrict or limit the idea expressed, and
therefore must meet a lower level of scrutiny than a "content‑based"
regulation. See id. In order to be constitutional, a law must
serve a significant government interest (as opposed to compelling), be narrowly
tailored to achieve such interest and leave open alternative channels by which
information may be communicated. See
id.
[FN116].
Mainstream Loudoun, 24 F.Supp.2d at 564.
[FN117].
See id.
[FN118].
See id. at 565‑66. Defendants
cited only one complaint in another library in Virginia, and three other
complaints in libraries nationwide. See
id. In the Loudoun County Library, no
incidents had been reported by either the library employees or library patrons
of inappropriate Internet use. See id.
[FN119].
See id.
[FN120].
See id. at 567.
[FN121].
See Mainstream Loudoun, 24 F.Supp.2d at 569.
[FN122].
See id.
[FN123].
Id.
[FN124].
Id. at 570 (quoting Mainstream Loudoun v. Board of Trustees of the Loudoun
County Library, 2 F.Supp.2d 783, 797 n.25 (1998) (citing Lamont v. Postmaster
General, 381 U.S. 301, 307 (1965))).
[FN125].
See id.
[FN126].
See id.
[FN127].
See id.
[FN128].
See 2 Andrews Telecomm. Indus. Litig. Rep. 16 (June 1999) (citing Mainstream Loudoun v. Board of
Trustees of the Loudoun County Library, No. 97‑2049‑A (E.D. Va.,
Alexandria Div., Apr. 19, 1999 vote)).
[FN129].
See Michael Schuyler, Porn Alley: Now at Your Local Public Library, 19 Cptr.
Lib. 32 (1999).
[FN130].
See id. (asking for volunteers to involve themselves in a case that would
settle the issue).
[FN131].
See Kathleen R. v. City of Livermore, Plaintiff's Complaint, PP 1‑14 (May
28, 1998) available at <http://www.filteringfacts.org/liv‑comp.htm>.
[FN132].
See id. PP 28‑30.
[FN133].
See Janet Kornblum, Library Filtering Suit Dismissed, C‑Net News.com,
Oct. 21, 1998 <http://news.cnet.com/category/0‑1005‑200‑334499.html>;
see also Pub. L. No. 104‑104, 110 Stat. 137 (codified at 47 U.S.C. §
230(c) (1) (2000)).
[FN134].
See Janet Kornblum, Library Net access
under renewed attack, C‑Net News.com, Jan. 4, 1999
<http://news.cnet.com/category/0‑1005‑200‑336932.html>.
[FN135].
Id.
[FN136].
See ACLU, Court Upholds Livermore Library's Uncensored Internet Access Policy
(last modified Jan. 14, 1999) <http:// aclu.org/news/1999/n011499a.html>.
[FN137].
See Kathleen R. v. City of Livermore, No. A086349 (Cal. Ct. App. filed July 16,
1999).
[FN138].
See S. 97, 106th Cong. (1999).
[FN139].
See id.
[FN140].
See Karen MacPherson, Compromise in the Works for Children's Internet Access,
Pittsburgh Post Gazette, Nov. 14, 1999, at A12.
[FN141].
See Children's Internet Protection Act, H.R. 896, 106th Cong. (1999).
[FN142].
See S. 1545, 106th Cong. (1999).
[FN143].
See id.
[FN144].
See id.
[FN145].
See Child Protection Act of 1999, H.R. 2560, 106th Cong. (1999).
[FN146].
Id.
[FN147].
See ALA, Capitol Hill Rally Urges Rapid E‑Rate Implementation, American
Libraries (July 27, 1998) (quoting Illinois Representative Bobby Rush)
(available on the ALA Web site at http://
www.ala.org/alonline/news/1998/980727.html).
[FN148].
See Internet Content Update: Recent Legislative Actions, ALAWON (ALA, Washington, D.C.), Aug. 13, 1999
(available on the ALA Web site at
http://www.ala.org/washoff/alawon/alwn8082.html).
[FN149].
See id.
[FN150].
See Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 24
F.Supp.2d 552 (E.D. Va. 1998).
[FN151].
See supra notes 138, 141, 145.
[FN152].
See Ann Beeson, Symposium, 15 N.Y. L. Sch. J. Hum. Rts. 1, 46‑47 (1998).
[FN153].
See id.
[FN154].
See id.
[FN155].
See Warren, supra note 13, at 55; see also Mainstream Loudoun, 24 F.Supp.2d at
565 (recalling that the only incident of complaint in a Virginia library was
remedied by installing privacy screens which, according to the librarian,
"work[ed] great").
[FN156].
See Warren, supra note 13, at 55.
[FN157].
See Anick Jesdanun, Libraries Caught in Internet Crossfire, Indianapolis Star,
Dec. 11, 1999, at A19 (describing one such Michigan library).
[FN158].
Alta Vista, Family Filter Frequently Asked Questions (visited Jan. 22, 1999)
<http://doc.altavista.com/help/search/family_help.shtml>. Alta Vista blocks "offensive
images," chat rooms, information relating to drugs, tobacco, gambling,
hate speech, sexual explicitness and violence, as well as those Web sites
"deemed inappropriate by [their] editors." Id.
[FN159].
See W3C, Platform for Internet Content Selection (PICS), (visited Jan. 22,
2000) <http://www.w3.org/PICS/>.
[FN160].
R. Polk Wagner, Filters and the First Amendment, 83 Minn. L. Rev. 755, 764
(1999).
[FN161].
See id.
[FN162].
See id. at 767 (reporting that as of summer 1998, six self‑ rating
systems had been developed as well as several third‑party PICS compatible
ratings systems).
[FN163].
See id.
[FN164].
See id. at 765 (illustrating that one could create a personal ratings system
which blocked out pieces by Rush Limbaugh).
[FN165].
See Wagner, supra note 160 at 762‑63, 765 (describing the problems
associated with ratings systems).
[FN166].
See id.; see also Miller v. California, 413 U.S. 15, 40‑41 (1973) (Douglas, J., dissenting) (explaining
that the determination of what is obscene deals with individually subjective
tastes).
[FN167].
See Bertelsmann Foundation, Internet Content Summit (visited Jan. 22, 2000)
<http://www.stiftung.bertelsmann.de/internetcontent/english/frameset_
home.htm>.
[FN168].
See id.
[FN169].
See id.
[FN170].
See Robert MacMillan, Munich Conference Worries Privacy Advocates, Newsbytes,
Sept. 6, 1999.
[FN171].
See id. (quoting David Sobel of the Electronic Privacy Information Center
("EPIC")).
[FN172].
Frank James, Somebody's Watching You: Concerns About Privacy Issues Bridge the
Political Divide in Washington, Greensboro News & Record, Oct. 31, 1999, at
H1 (quoting Deirdre Mulligan, senior counsel for the Center for Democracy and
Technology).
[FN173].
See id.
[FN174].
Groups supporting the individual's privacy rights include the Electronic
Privacy Information Center ("EPIC") <http://www.epic.org>, the
Center for Media and Education, ("CME") <http://www.cme.org>
and the Center for Democracy and Technology ("CDT")
<http://www.cdt.org>. Information
gatherers such as DoubleClick <http://www.doubleclick.net>, which designs
Internet advertising plans for companies wishing to reach Internet users, are
on the other side of the spectrum.
[FN175].
See Griswold v. Connecticut, 381 U.S. 479, 484 (1965). The Court explained that "specific
guarantees in the Bill of Rights have penumbras, formed by emanations from
those guarantees that help give them life and substance. Various guarantees create zones of
privacy." Id.
[FN176].
See Paul v. Davis, 424 U.S. 693, 713 (1976) (identifying individual choices
"relating to marriage, procreation, contraception, family relationships,
and child rearing and education" as protected under a right to privacy);
Loving v. Virginia, 388 U.S. 1, 12 (1967) (prohibiting the government's
interference with an individual's right to marry); Griswold, 381 U.S. at 485‑86
(holding the right to marital privacy includes decisions regarding the use of
contraceptives).
[FN177].
See Susan E. Gindin, Lost and Found in Cyberspace: Informational Privacy in the
Age of the Internet, 34 San Diego L. Rev. 1153, 1185 (1997) (noting that no
constitutional right to privacy of personal information has been found to exist
either explicitly or implicitly).
[FN178].
429 U.S. 589 (1977).
[FN179].
See id. at 605.
[FN180].
See id. at 605‑06.
[FN181].
5 U.S.C. § 552a (1999) (regulating collection and disclosure procedures
regarding data of federal agencies).
[FN182].
5 U.S.C. § 552a (1999). An amendment to
the Privacy Act, CMPAA establishes mechanisms to regulate the matching of
personal information across federal databases.
See id.
[FN183].
5 U.S.C. § 552 (1999). FOIA provides
access to federal records, but precludes the disclosure of information
constituting an unwarranted invasion of privacy such as medical files. See id.
[FN184].
See, e.g., Video Privacy Protection Act of 1988, 18 U.S.C. §§ 2710‑2711
(1999); Right to Financial Privacy Act of 1978, 12 U.S.C. §§ 3401‑3422
(1999).
[FN185].
See Elizabeth deGrazia Blumenfeld, Privacy Please: Will the Internet Industry
Act to Protect Consumer Privacy Before the Government Steps In?, 54 Bus. Law.
349, 354 (1998). The article explained
that "current federal privacy legislation takes an industry sectorial
approach." Id. at 360. See also Maureen S. Dorney, Privacy and the
Internet, 19 Hastings Comm. & Ent. L.J. 635, 642 (1997) (noting that laws
regulate only "particular industries" within the private sector in
the collection and use of personal information).
[FN186].
15 U.S.C.A. §§ 1681, 1681(a)‑(t) (West 1999); see also George B. Trubow,
Protecting Informational Privacy in the Information Society, 10 N. Ill. U. L.
Rev. 521, 531‑32 (1990) (discussing privacy protection in the private
sector); Dorney, supra note 185, at 648.
[FN187].
15 U.S.C. § 1681b(a) (3) (F).
[FN188].
15 U.S.C. § 1681m(a) (1)‑(2).
[FN189].
Cal. Const. art I, § 1 (stating "[a]ll people are by nature free and
independent and have inalienable rights.
Among these are enjoying and defending life and liberty, acquiring,
possessing, and protecting property, and pursuing and obtaining safety,
happiness, and privacy"); see also Ill. Const. art. I, §§ 6, 12 (granting
every person the right to be secure from invasions of privacy and also
guaranteeing a remedy for such invasions).
[FN190].
See White v. Davis, 533 P.2d 222 (Cal. 1975).
In seeking votes to pass the amendment, proponents of California's
privacy initiative listed the objectives sought as:
... [preventing] government snooping and
data collecting... [preventing] government and business interests from
collecting and stockpiling unnecessary information about us and from misusing
information gathered... [and providing] the ability to control circulation of
personal information.
Id.
at 233‑34.
[FN191].
See Hill v. NCAA, 865 P.2d 633, 644 (Cal. 1994) (en banc) (holding that the privacy amendment to the
California State Constitution conferred upon individuals the right to bring
invasion of privacy actions against private parties).
[FN192].
See William L. Prosser, Privacy, 48 Cal. L. Rev. 383, 389 (1960).
[FN193].
See Restatement (Second) of Torts §§ 652B‑E (1977) (listing the causes of
action associated with the invasion of one's privacy as (1) unreasonable
intrusion upon the seclusion of another, (2) appropriation of another's name or
likeness, (3) unreasonable publicity given to another's private life, and (4)
publicity that unreasonably places another in a false light). Professor Samuel
D. Warren and Justice Louis Brandeis are generally credited for first
identifying a general right to privacy, which later led to Prosser's seminal
work distinguishing the four privacy causes of action in tort. See Blumenfeld, supra note 185, at 349. Warren and Brandeis argued that
technological innovations created a need to provide security of an individual's
right "to be let alone," and that "[i]nstantaneous photographs
and newspaper enterprise have invaded the sacred precincts of private and
domestic life; and numerous mechanical devices threaten to make good the
prediction that 'what is whispered in the closet shall be proclaimed from the
house‑tops.' " Samuel D. Warren & Louis D. Brandeis, The Right
to Privacy, 4 Harv. L. Rev. 193, 195 (1890).
[FN194].
Intrusion "upon the solitude or seclusion of another or his private
affairs" protects an individual against the gathering of personal
information without his consent. Restatement (Second) of Torts § 652B (1977). The individual must not have voluntarily
disclosed the information, and the intruder's conduct must be highly offensive
to the reasonable person. See
Restatement (Second) of Torts § 652B cmt. c. "Misappropriation"
protects the individual from the use of one's name or likeness by another for
commercial gain without one's consent.
Restatement (Second) of Torts § 652C (1977). Generally, this tort is used
by celebrities to prevent the unauthorized conversion of their names, images or
voices. See, e.g., Midler v. Ford Motor
Co., 849 F.2d 460 (9th Cir. 1988) (alleging misappropriation of Bette Midler's
vocal style in a commercial for Ford when a "sound‑alike" was
used). "Public disclosure of private facts" involves the
dissemination of information highly offensive to the reasonable person. Restatement (Second) of Torts, § 652D, cmt.
c (1977). First Amendment free
expression precludes this cause of action where such information is of a
legitimate public concern. See id.
Finally, the tort of "False Light" is triggered when a false
representation is made to the general public subjecting the victim to ridicule
and contempt, but not rising to the level of defamation. Restatement (Second) of Torts, § 652E
(1977).
[FN195].
George Orwell, Nineteen Eighty‑Four (Downtown Book Center 1966) (1949).
[FN196].
515 U.S. 646 (1995). In Vernonia, the
Supreme Court examined the Vernonia School District's program of randomly
testing student athletes for drug use and created a three‑part test analyzing
(1) the nature of the students' privacy interest, (2) the character of the
intrusion, and (3) the nature and immediacy of the governmental concern, and
the efficacy of the policy to address the problem. See id. at 654‑64.
The majority found that the student athletes had a lower expectation of
privacy in light of the element of communal undress associated with
participation in sports and the voluntary nature of such participation. See id. at 657. Further, the Court opined that the character of the intrusion was
"negligible" since the test, as applied to all those screened, was designed
to detect only drugs, and the results were viewed only by certain school
administrators. See id. at 658. Finally, the problem in Vernonia was an
immediate crisis of "epidemic proportions." Id. at 662‑63. In implementing the drug testing program,
the Vernonia School District had focused on the students it knew to be the
source of the problem, the student athletes.
See id. at 649‑50. The
policy used to address the problem in Vernonia was therefore found to be
effective in light of the nature and concern of the school. See id. at 664‑65.
[FN197].
469 U.S. 325 (1985) (recognizing a student's Fourth Amendment right against
unreasonable search and seizure in the context of a school setting, but finding
that the search as conducted was reasonable).
[FN198].
See Vernonia, 515 U.S. at 653 (noting that certain settings demonstrate
"special needs" and therefore do not require the Fourth Amendment
formalities of probable cause and a search warrant). Due to the nature of an educational environment, courts have
recognized schools as such a setting. See id.
The role school administrators play as supervisors and tutors requires
that they be given greater latitude in the administration of discipline. See id.
Additionally, the constraints under which they work do not afford
administrators adequate opportunity to obtain a warrant. See id.
[FN199].
390 U.S. 629, 639 (1968).
[FN200].
See id.; see also Meyer v. Nebraska, 262 U.S. 390, 399 (1923) (recognizing a liberty interest in
child rearing).
[FN201].
20 U.S.C. § 1232g (1999) (giving the parents of minor students the right to
inspect, correct and control the disclosure of educational information).
[FN202].
See The Privacy Exchange, Protecting Web Privacy (visited Jan. 4, 2000)
<http://www.privacyexchange.org/tsi/webout.htm> (describing the methods
in which personal information is given both knowingly and unknowingly).
[FN203].
See FTC, Staff Report: Public Workshop on Consumer Privacy on the Global
Information Infrastructure § IV.B. (Dec. 1996), available at <http://
www.ftc.gov/reports/privacy/privacy5.htm> [hereinafter "FTC Staff
Report" ] (identifying the information collection practices of the
Internet sites included in its survey).
[FN204].
See FTC, Privacy Online: A Report to Congress, § II.B.1 n.4 (June 1998)
[hereinafter "FTC, Privacy Online" ] (defining the "cookie"
as used on the Web) (available on the Commission's Web site at http://
www.ftc.gov/reports/privacy3/priv‑23a.pdf).
'Cookie' technology allows a Web site's
server to place information about a consumer's visits to the site on the
consumer's computer in a text file that only the Web site's server can read. Using cookies a Web site assigns each
consumer a unique identifier (not the actual identity of the consumer), so that
the consumer may be recognized in subsequent visits to the site. On each return visit, the site can call up
user‑specific information, which could include the consumer's preferences
or interests, as indicated by documents the consumer accessed in prior visits
or items the consumer clicked on while in the site. Web sites can also collect information about consumers through
hidden electronic navigational software that captures information about site
visits, including Web pages visited and information downloaded, the types of
browser used, and the referring Web sites' Internet addresses.
Id.
However, the ability for users to set their
browsers to reject all cookies diminishes the cookie's effect as an intrusion
on one's privacy. See id.
[FN205].
See FTC Staff Report, supra note 203, at 4‑5.
[FN206].
See Trubow, supra note 186, at 521 (likening the holding of information to a
"key to power" and "the foundation for rational
decisions"); see also Erika S. Koster, Zero Privacy: Personal Data on the
Internet, 16 No. 5 Computer Law. 7, 8‑9 (May 1999) (estimating that the
private information of one individual could be worth as much as $500).
[FN207].
See CyberAtlas, Consumers Fear for Their Online Privacy (last modified Nov. 1,
1999) <http://
cyberatlas.internet.com/markets/retailing/article/0,1323,6061_228341,00.html>
(citing Forrester Research survey finding that "[n]early [ninety] percent
of online consumers want the right to control how their personal information is
used after it is collected").
[FN208].
See FTC, The FTC's First Five Years: Protecting Consumers Online, 19 (Dec.
1999) [hereinafter "FTC, Five Year Report" ] (citing FTC, Self
Regulation and Privacy Online: A Report to Congress, 2 (July 1999), which states that "87 percent of
experienced Internet users were somewhat or very concerned about threats to
their privacy online") (available on the Commission's Web site at
http://www.ftc.gov/os/1999/9912/fiveyearreport.pdf); see also Koster, supra
note 206, at 7 (pointing to multiple surveys in which consumers named privacy
concerns as the main reason for not engaging in online commerce).
[FN209].
See generally Trubow, supra note 186, at 531‑32; Blumenfeld, supra note
185, at 354.
[FN210].
See Koster, supra note 206, at 8 (citing occurrences of accidental disclosures
over the Internet of large amounts of collected personal information by Yahoo!,
Nissan and the federal government).
[FN211].
FTC Privacy Online, supra note 204, Executive Summary at i.
[FN212].
NIITF, Privacy and the National Information Infrastructure: Principles for
Providing and Using Personal Information, § I.1 (last modified June 6, 1995)
<http://www.iitf.nist.gov/ipc/ipc/ipc‑pubs/niiprivprin_
final.html>.
[FN213].
Id. at §§ I‑III. The principles
state:
I.A.
Personal information should be acquired, disclosed, and used only in
ways that respect an individual's privacy.
I.B.
Personal information should not be improperly altered or destroyed.
I.C.
Personal information should be accurate, timely, complete, and relevant
for the purpose for which it is provided and used.
II.A. Information users should:
Assess the impact on privacy in deciding
whether to acquire, disclose, or use personal information.
Acquire and keep only information reasonably
expected to support current or planned activities.
II.B. Information users who collect personal
information directly from the individual should provide adequate, relevant
information about:
Why they are collecting the information;
What the information is expected to be used
for;
What
steps will be taken to protect its confidentiality, integrity, and quality;
The consequences of providing or withholding
information; and
Any rights of redress.
II.C. Information users should use
appropriate technical and managerial controls to protect the confidentiality
and integrity of personal information.
II.D. Information users should not use
personal information in ways that are incompatible with the individual's
understanding of how it will be used, unless there is a compelling public interest
for such use.
Id.
[FN214].
See Dorney, supra note 185, at 653 (examining the approaches taken by the NIITF
and the National Telecommunication and Information Administration (NTIA) to
create fair guidelines that would not unduly hinder commerce, yet still protect
consumer privacy).
[FN215].
See id.
[FN216].
National Information Infrastructure Task Force, A Framework for Global
Electronic Commerce, Executive Summary (visited Jan. 7, 2000) <http://
www.whitehouse.gov/WH/New/Commerce/summary.html>.
[FN217].
Id.
[FN218].
See id. The first principle articulated
in the NIITF Report reads, "[t]he private sector should lead. The Internet should develop as a market
driven arena not a regulated industry.
Even where collective action is necessary, governments should encourage
industry self‑regulation and private sector leadership where
possible." Id.
[FN219].
See id.
[FN220].
Id.
[FN221].
See Koster, supra note 206, at 11 (contrasting the three‑person Clinton
Administration privacy team with the independent private agencies of the
Netherlands, having fifty employees, and the United Kingdom, with one hundred
employees, to monitor privacy procedures).
[FN222].
Council Directive No. 95/46/EC of 24 October 1995 on the Protection of
Individuals with Regard to the Processing of Personal Data and on the Free
Movement of Such Data, 1995 O.J. (L 281) 31 [hereinafter "EU
Directive" ].
[FN223].
See id., art. 32(1), at 49.
[FN224].
See id., art. 7(a), at 40.
[FN225].
See id., art. 6(1)(b), at 40.
[FN226].
See id., art. 25, at 45‑46. While
"adequate" is not explicitly defined within the EU Directive, case‑by‑case
analysis of a country's privacy protection will include:
... the nature of the data, the purpose and
duration of the proposed processing operation or operations, the country of
origin and country of final destination, the rules of law, both general and
sectoral, in force in the third country in question and the professional rules
and security measures which are complied with in that country.
Id.
[FN227].
U.S. Department of Commerce, Draft: International Safe Harbor Privacy
Principles (last modified Nov. 15, 1999) <http://
www.ita.doc.gov/td/ecom/Principles1199.htm>.
[FN228].
See id.
[FN229].
See id. at nn.1‑2.
[FN230].
See FTC Staff Report, supra note 203, § IV.A (acknowledging the vulnerability of
minors and the legislation passed to protect them).
[FN231].
See Restatement (Second) of Contracts, § 12 (1979) (allowing minors to
disaffirm contractual obligations).
[FN232].
See FTC, FTC Surfs Children's Web Sites to Review Privacy Practices (last
modified Dec. 15, 1997) <http:// www.ftc.gov/opa/1997/9712/kids.htm>.
[FN233].
See id.
[FN234].
See id.
[FN235].
See id.
[FN236].
See FTC Privacy Online, supra note 204.
[FN237].
See id. at iii.
[FN238].
See id.
[FN239].
Jodie Bernstein, Letter to the President and Executive Director of the Center
for Media Education (last modified July 15, 1997) <http://
www.ftc.gov/os/1997/9707/cenmed.htm> [hereinafter "KidsCom Letter"
]. In her letter, Ms. Bernstein, the Director of the FTC's Bureau of Consumer
Protection, responded to the CME's petition against the KidsCom Web site.
[FN240].
See id.
[FN241].
See id.
[FN242].
See id. In the interim between the
CME's petition to the FTC and the release of Ms. Bernstein's letter, KidsCom
revised its information collection practices.
See id. The company began
notifying parents via e‑mail upon receiving a registration from a minor,
thereby informing parents of the company's procedures regarding the collection
of personally identifiable information.
See id. In addition, parents
were given the opportunity to choose whether to allow the release of their
child's data to third parties. See id.
No release of such information would occur without verifiable,
"prior parental approval," i.e., a signed statement mailed or faxed to
KidsCom. Id.
[FN243].
See id. For example, the Commission
stated:
It is a deceptive practice to represent that
a Web site is collecting personally identifiable information from a child for a
particular purpose (e.g., to earn points to redeem a premium), when the
information will also be used for another purpose which parents would find
material, in the absence of a clear and prominent disclosure to that effect.
Id.
[FN244].
See id.
[FN245].
See KidsCom Letter, supra note 239. The
Commission directed:
To be effective, any disclosure regarding
collection and use of children's personally identifiable information must be
made to a parent, given the limited ability of many children within the target
audience to comprehend such information...
An adequate notice to parents should
disclose: who is collecting the personally identifiable information, what
information is being collected, its intended use(s), to whom and in what form
it will be disclosed to third parties, and the means by which parents may
prevent the retention, use or disclosure of the information.
Id.
[FN246].
FTC, In the Matter of GeoCities ‑ Complaint, ¶ 4 (last modified Aug. 13,
1998) <http://www.ftc.gov/os/1998/9808/geo‑cmpl.htm> [hereinafter
"GeoCities Complaint" ].
[FN247].
See id. ¶¶ 8‑9, 17.
[FN248].
See id. ¶ 17.
[FN249].
See id.
[FN250].
Id. ¶ 12.B.
[FN251].
GeoCities Complaint, supra note 246, at ¶ 14.
[FN252].
See id. ¶ 18.
[FN253].
See id. ¶ 19.
[FN254].
See id.
[FN255].
See id. ¶ 20.
[FN256].
GeoCities, 63 Fed. Reg. 44,624, Part I (F.T.C. 1998) [hereinafter "GeoCities Proposed Final Order" ].
[FN257].
See id. Part IV.
[FN258].
See id. Part III.
[FN259].
See id.
[FN260].
See id. Part V.
[FN261].
See GeoCities Proposed Final Order, supra note 256, Part VI.
[FN262].
See id. Part VI.H.2.
[FN263].
See In re Liberty Fin. Co., Inc., File No. 982‑3522 (F.T.C. May 6, 1999)
[hereinafter "Liberty Financial" ].
[FN264].
Liberty Financial Complaint ¶ 4 (F.T.C. 1999) [hereinafter "Liberty Financial
Complaint" ] (available on the Commission's Web site at http://
www.ftc.gov/os/1999/9905/lbrtycmp.htm); see also Liberty Financial, 64
Fed. Reg. 29,031‑02 (F.T.C. 1999).
[FN265].
See Liberty Financial Complaint, supra note 264, at ¶ 4.
[FN266].
See id.
[FN267].
See id.
[FN268].
Id. ¶¶ 4‑5.
[FN269].
See id. ¶ 6.
[FN270].
See id.
[FN271].
See Liberty Financial Complaint, supra note 264, at ¶¶ 7‑9.
[FN272].
See id. ¶¶ 7‑9.
[FN273].
See id. ¶ 9.B.
[FN274].
See Liberty Fin. Co., Inc., Agreement Containing Consent Order, Part I (F.T.C.
1999) [hereinafter "Liberty Financial Agreement" ] (available on the
Commission's Web site at http://www.ftc.gov/os/1999/9905/lbtyord.htm); see also
FTC, Young Investor Website Settles FTC Charges (last modified May 6, 1999)
<http://www.ftc.gov/opa/1999/9905/younginvestor.htm> (analyzing the
Liberty Financial Agreement).
[FN275].
See Liberty Financial Agreement, supra note 274, at Part I. In the order, the Commission defines
"personal information" as:
... individually identifiable information
about an individual collected online, including first and last name, home or
other physical address including street name and name of a city or town, e‑mail
address, telephone number, Social Security number, or any information
concerning the child or the parents of that child that the website collects
online from the child and combines with an identifier described in this
definition.
Id.
[FN276].
See id. Part II.
[FN277].
See id. Part V.
[FN278].
See id. Part III.
[FN279].
Liberty Financial Agreement, supra note 274, at Part III.
[FN280].
See COPPA, 15 U.S.C. §§ 6501‑6506 (1998).
[FN281].
See 15 U.S.C. § 6502.
[FN282].
See 15 U.S.C. § 6502(a)(1); see also KidsCom Letter, supra note 239; GeoCities
Proposed Final Order, supra note 256; Liberty Financial Agreement, supra note
274. In those cases, the FTC ordered
the companies involved to refrain from the collection of any information from
children, if the company had actual knowledge that the child did not have his
or her parent's permission to provide the information.
[FN283].
See 15 U.S.C. § 6502(a)(1).
[FN284].
See COPPA, 15 U.S.C. § 6502(b)(1).
[FN285].
See COPPA Rule, 16 C.F.R. Part 312 (1999); see also FTC, New Rule Will Protect
Privacy of Children Online (last modified Oct. 20, 1999) <
http://www.ftc.gov/opa/1999/9910/childfinal.htm>. The Rule was published on
October 20, 1999 after a draft proposal in April of 1999 and a workshop to
discuss the meaning of "verifiable parental consent." Id.
[FN286].
See id.
[FN287].
According to § 312.2 of the COPPA Rule, an "Operator" is a person who
runs a Web site for a commercial purpose and "who collects or maintains
personal information from or about the users of or visitors to such website or
online service, or on whose behalf such information is collected or
maintained." COPPA Rule, 16 C.F.R.
at § 312.2. Thus, if the personal
information collected flows to a third party, and the Web site or online
service serves merely as a conduit having no further control over the
information, it shall not be considered an operator. See id.
[FN288].
See COPPA Rule, 16 C.F.R. § 312.4(b).
[FN289].
See 16 C.F.R. § 312.5.
[FN290].
See 16 C.F.R. § 312.6.
[FN291].
See 16 C.F.R. § 312.7.
[FN292].
See 16 C.F.R. § 312.8.
[FN293].
See COPPA Rule, 16 C.F.R. § 312 at 59902.
[FN294].
See id.
[FN295].
See id.
[FN296].
See id. Methods suggested by the FTC
include confirming the consent through e‑mail, postal mail or telephone
call; see also Interactive PR, Teens Surf the Net Like Nobody Else, But That
Doesn't Mean They Trust It, Dec. 10, 1999 (finding that 77 percent of 10‑to‑12
year olds surveyed would attempt to circumvent parental consent requirements by
lying about their age).
[FN297].
See 16 C.F.R. § 312.5(c)(1)‑(4).
[FN298].
See 16 C.F.R. § 312.5(c)(3). Parents
must be notified of their child's enrollment for the newsletter, however, and
be given the opportunity to cancel it.
See id.
[FN299].
See COPPA Rule, 16 C.F.R. § 312.2(b).
The practice of deleting a child's personally identifiable
information before posting that child's message on an online chat room or
bulletin board brings the operator into compliance with the COPPA Rule, because
no information is deemed to have been collected. See id.
[FN300].
See 16 C.F.R. § 312.10.
[FN301].
See generally TRUSTe, The TRUSTe Story (visited Jan. 4, 2000) <
http://www.truste.org>; see also Koster, supra note 206, at 13.
[FN302].
TRUSTe, About TRUSTe: Frequently Asked Questions (visited Jan. 4, 2000)
<http://www.truste.org/about/about_faqs.html#trustmark>.
[FN303].
See TRUSTe, Program Principles (visited Jan. 4, 2000) <http://
www.truste.org/webpublishers/pub_principles.html>.
[FN304].
See TRUSTe, Children's Privacy Seal Program (visited Jan. 4, 2000)
<http://www.truste.org/webpublishers/pub_child.html>.
[FN305].
See id.
[FN306].
See id. The COPPA Rule "applies
only to personal information submitted online, and, therefore, a parent's
access rights under the Act do not generally extend to data collected
offline." See COPPA Rule, 16
C.F.R. § 312 at 59904.
[FN307].
See TRUSTe, TRUSTe Oversight (visited Jan. 4, 2000) <http://
www.truste.org/webpublishers/pub_oversight.html>; see also TRUSTe,
Resolution Process (visited Jan. 4, 2000)
<http://www.truste.org/webpublishers/pub_ recourse.html>.
[FN308].
A list of TRUSTe participants can be found at <http://
www.truste.org/users/users_lookup.html>.
[FN309].
See BBBOnline (R) (visited Jan.4, 2000) <http://www.BBBOnline.org>.
[FN310].
See BBBOnline (R), Eligibility Criteria for BBBOnline (R) Privacy Seal (visited
Jan. 4, 2000) <http://
www.BBBOnline.org/businesses/privacy/eligibility.html> [hereinafter
"BBBOnline (R) Eligibility" ].
[FN311].
See BBBOnline (R), Compliance Assessment Document (visited Jan. 4, 2000)
<http://www.bbbonline.org/businesses/privacy/assess‑html.html>. The Questionnaire asks seal applicants to
describe their privacy practices concerning the collection of information from
minors, including procedures with regard to verifiable parental consent and disclosure
to third parties. See id.
[FN312].
See BBBOnline (R) Eligibility, supra note 310.
BBBOnline (R)'s Dispute Resolution Policy is available at <http://
www.BBBOnline.org/download/DR.html>.
[FN313].
See BBBOnline (R), Privacy Program Dispute Resolution Statistics (March 17, 1999‑September 30, 1999)
available at <http://
www.BBBOnline.org/businesses/privacy/dr/statistics.html> [hereinafter
"BBBOnline (R) Statistics" ].
The first eligible complaint involved a Web site that violated its
privacy policy when it failed to act on complainant's request to have his e‑mail
address removed from a mailing list.
See id. The violation was
remedied upon BBBOnline (R) contact with Respondent. See id. As of this writing, the second complaint is still under
review and involves allegations of misuse of credit card information collected
online. See id.
[FN314].
See id. Complaints are analyzed for
eligibility by a subsidiary of the Council of Better Business Bureaus, Inc.
called the Privacy Policy Review Service (PPRS). See BBBOnline (R),
Privacy Program Dispute Resolution Process, § 1.1 (Feb. 11, 1999) [hereinafter
"BBBOnline (R) Dispute Resolution" ] (available on the BBBOnline (R)
Web site at http:// www.bbbonline.org/download/DR.html). The PPRS found the remaining claims
ineligible because they did not meet the criteria necessary for review. See BBBOnline (R) Statistics, supra note 313
(finding respondents were not involved in questionable data collection
practices, did not have posted privacy policies to which to compare practices
or were not participants in the BBBOnline (R) seal program).
[FN315].
See BBBOnline (R), Search Results (visited Apr. 4, 2000) <http://
www.BBBOnline.org/database/search/default.cfm> (boasting 4567 participants
of BBBOnline (R)); see also TRUSTe, News & Views (visited Feb. 22, 2000)
<http:// www.truste.org/about/about_1000th.html> (announcing approval of
its 1000th Web site).
[FN316].
See BBBOnline (R) Dispute Resolution, supra note 314, § 2.5 Available Remedies
(making available to the complainant such corrective actions as the correction
of the individual's information and modification of a respondent's privacy
policy); see also TRUSTe, Resolution Process
(visited Jan. 4, 2000)
<http://www.truste.org/webpublishers/pub_recourse.html>.
[FN317].
BBBOnline (R) Dispute Resolution, supra note 314, § 3.1 (explaining the
function of the PPRS in the dispute resolution process).
[FN318].
See BBBOnline (R), BBBOnline FAQs, (visited Jan. 4, 2000) <http://
www.bbbonline.org/about/FAQs.html#privacy>.
[FN319].
See TRUSTe, Resolution Process (visited Feb. 22, 2000) <http://
www.truste.org/webpublishers/pub_recourse.html>.
[FN320].
Chris Oakes and James Glave, The Web Privacy Seal, Take 2, Wired News, Mar. 17, 1999, at 3, available at
<http:// www.wired.com/news/news/politics/story/18517.html>.
[FN321].
See W3C, P3P Working Draft (last modified Nov. 2, 1999) <http://
www.w3.org/TR/P3P> (explaining the latest P3P specifications).
[FN322].
The World Wide Web Consortium (W3C) is an independent, international group co‑hosted
by MIT's Laboratory for Computer Science as well as the National Institute for
Research in Computer Science and Control (INRIA) in France and Keio University
in Japan. See W3C, About the World Wide
Web Consortium (last modified Mar. 2000)
<http://www.w3.org/consortium>.
The W3C works to create common standards to ensure the interoperability
of the Web. See id.
[FN323].
See W3C, Platform for Privacy Preferences (P3P) Project (visited Jan. 4, 2000)
<http://www.w3.org/P3P> [hereinafter "P3P" ]; see also W3C, P3P
and Privacy on the Web FAQ (visited Jan. 4, 2000) <http://
www.w3.org/P3P/P3FAQ.html>.
[FN324].
See P3P, supra note 323.
[FN325].
See id.
[FN326].
See The Privacy Exchange, P3P (visited Jan. 4, 2000) <http:// www.privacy
exchange.org/tsi/p3p.htm>.
[FN327].
See id.
[FN328].
See id.
[FN329].
See CME, Assessment of Data Collection Practices of Children's Web Sites (last
modified July 20, 1999) <http://www.cme.org/ministudy.html>.
[FN330].
See id.
[FN331].
See id.
[FN332].
See id.
[FN333].
See id.
[FN334].
See Bilge Ebiri, Coming Down off the Hill, Yahoo! Internet Life from ZDWire,
Nov. 1, 1999, available in 1999 WL 14789049.
[FN335].
See id.
[FN336].
See id.
[FN337].
47 U.S.C.A. § 231 (West 1999).
[FN338].
47 U.S.C.A. § 223 (Supp. II 1996) (repealed 1997).
[FN339].
See ACLU v. Reno, 31 F.Supp.2d 473 (E.D. Pa. 1999).
[FN340].
See 47 U.S.C.A. § 231(e) (2) (A).
[FN341].
See U.S. v. Thomas, 74 F.3d 701, 710‑11 (6th Cir. 1996) (applying Tennessee community standards to an
electronic bulletin board operated from California and finding the operators liable for
publishing obscenity).
[FN342].
See Pamela Mendels, Judges Raise Questions About Federal Anti‑
Pornography Law, New York Times Online, Nov. 5, 1999, available at <http://
www.nytimes.com/library/tech/9911/cyber/articles/05child.html> (quoting
Third Circuit Judge Leonard I. Garth in the COPA appeal oral arguments, who
stated, "[i]t seems to me that in terms of the World Wide Web, what the
statute contemplates is that we would be remitted to the most severe,
conservative community standards, perhaps those in Iran or Iraq where exposure
of a woman's face is deemed to be inappropriate...").
[FN343].
Rep. George W. Gekas and James W. Harper, Annual Regulation of Business Focus:
Regulation of Electronic Commerce, 51 Admin. L. Rev. 769, 790 (1999) (summarizing
the comments of Rep. White cited in 144 Cong. Rec. H9908‑09 (daily ed.
Oct. 7, 1998)); see also ACLU v. Reno, 929 F.Supp. 824, 848 (E.D. Pa. 1996) (estimating
that as much as 40% of all content on the Web originates abroad).
[FN344].
Gekas & Harper, supra note 343, at 790; see also Michael Hatcher et al.,
Computer Crimes, 36 Am. Crim. L. Rev. 397, 443‑44 (1999) (recognizing
that "[a] universal jurisdictional standard has yet to be developed by the
courts").
[FN345].
Gekas & Harper, supra note 343, at 790.
[FN346].
24 F.Supp.2d 552 (E.D. Va. 1998).
[FN347].
See id. at 556.
[FN348].
See Kathleen R. v. City of Livermore, No. A086349 (Cal. Ct. App. filed July 16,
1999).
[FN349].
See U.S. Const. amend. XIV, § 1.
[FN350].
See id. (stating "[n]o State shall
make or enforce any law which shall abridge the privileges or immunities of
citizens of the United States; nor shall any State deprive any person of life,
liberty, or property, without due process of law").
[FN351].
See Letter to Honorable Thomas J. Bliley (last modified Oct. 18, 1999)
<http://www.eff.org/pub/Censorship/Internet_censorship_bills/1999_
bills/19991018_hr2560_conserv_letter.html>.
[FN352].
The letter was signed by members of the Technology Policy Free Congress
Foundation, the Association of Concerned Taxpayers, Neighborhood
Research/Mountaintop Media, Alabama Citizens for Truth, Citizens Against
Repressive Zoning, Missouri Christian Coalition, the Eagle Forum of Wisconsin,
the Wisconsin Information Network, the American Policy Center, Gun Owners of
America, the Eagle Forum of Ohio, Tradition, Family, Property, Inc., the
Liberty Council, and the Delaware Home Education Association. See id.
[FN353].
See id. (citing instances of filtering
software blocking the Web sites of the American Family Association, Eagle Forum
and Gun Owners of America, as well as the discretion given to teachers and
librarians to use the software to block Web sites in opposition to their
political beliefs, and concluding that filtering "harms conservative
organizations as well as anyone with a website carrying a conservative
message"); see also Child Protection Act of 1999, H.R. 2560, 106th Cong.
(1999).
[FN354].
See id.
[FN355].
See id. (describing solutions already
being implemented by the industry such as server‑level filtering and the
Center for Democracy and Technology's project entitled GetNetWise, which
advises parents, teachers and librarians attempting to shield children from
inappropriate online material).
[FN356].
15 U.S.C.A. §§ 6501‑6506 (West 1999).
[FN357].
16 C.F.R. Part 312 (F.T.C. 1999).
[FN358].
See Jeri Clausing, New Privacy Rules for Children's Web Sites, New York Times,
Oct. 21, 1999, at G11.
[FN359].
See Wagner, supra note 160, at 801‑02 (suggesting that Congress take an
"indirect" approach to regulating Internet labeling by legally
mandating that operators label their Web sites according to the PICS standard
or through the use of "market‑influencing techniques" tied to
federal subsidization).
[FN360].
Tinker v. Des Moines Independent Community School District, 393 U.S. 503, 515 (1969)
(Stewart, J., concurring).
[FN361].
See COPPA Rule, 16 C.F.R. § 312.5(a) (1999).
[FN362].
See id.
[FN363].
See Cyberspace Communications, Inc. v. Engler, 55 F.Supp.2d 737, 752‑53
(E.D. Mich. 1999).
Although it is difficult in today's society
to constantly monitor the activities of children, it is still the right, and
duty, of every parent to teach and mold children's concepts of good and bad,
right and wrong.... This includes
setting limits, and either being there to enforce those limits, or utilizing
the available technology to do so. With
such less restrictive means to monitor the online activities of children, the
government need not restrict the right of free speech guaranteed to adults.
See
id.