A brief from the American Civil Liberties Union and other groups asking a federal judge to delay enforcement of the Communications Decency Act of 1996. (From Academe Today - Posted February 12, 1996)
UNITED STATES DISTRICT COURT
EASTERN DISTRICT OF PENNSYLVANIA
________________________________________
)
AMERICAN CIVIL LIBERTIES UNION, et al., )
)
Plaintiffs, )
)
v. ) Civ. No.__________
)
)
JANET RENO, in her official capacity as )
ATTORNEY GENERAL OF THE UNITED STATES, )
)
Defendant. )
)
________________________________________
PLAINTIFFS' MEMORANDUM OF LAW IN SUPPORT OF A MOTION FOR A TEMPORARY RESTRAINING ORDER AND PRELIMINARY INJUNCTION
INTRODUCTION
The plaintiffs in this First Amendment challenge to the
"Communications Decency Act of 1996" seek emergency relief to
stop the enforcement of provisions of the Act that criminalize
their expression of constitutionally protected information and
ideas over computer communications systems(1). The Act bans all
expression that is "indecent" or "patently offensive" from all
online systems that are accessible to minors. Not only does this
ban unconstitutionally restrict the First Amendment rights of
minors and those who communicate with them about important
issues, but, because of the nature of the online medium, it
essentially bans "indecent" or "patently offensive" speech
entirely, thus impermissibly reducing the adult population to
"only what is fit for children." Butler v. Michigan, 352 U.S.
380, 383 (1957).
The prohibitions are also unconstitutionally vague and
overbroad. The terms "indecency" and "patently offensive" are
not further defined. None of the plaintiffs knows how to define
the Act's terms or how much of their communications are criminal
under the Act. The Act explains neither how to comply, nor which
participants in the distribution of online speech may be held
liable. Further, there are many alternatives already available
for those parents who wish to shield their children from online
communications that they deem inappropriate. Finally, the Act
interferes with the privacy rights of minors, and impermissibly
discriminates against computer communications by imposing
censorship that would not be permitted for the print medium(2).
The plaintiffs are providers and users of online
communications with significant educational, political, medical,
artistic, literary, and social value that deal with issues such
as sexuality, reproduction, human rights, and civil liberties.
The censorship provisions that they challenge threaten not only
to chill these important communications but to dismantle the free
and open nature of a promising new medium that could empower
citizens and promote democracy in the next millennium. The
exponential growth in computer technology, and international
computer networks like the Internet, is transforming the nature
of communication. Computer networks have created new communities
with new opportunities for people with similar interests to
communicate with each other. Computer networks embody the values
that underlie the First Amendment by nurturing the robust
exchange of ideas. By imposing vague and broad-ranging standards
wholly inappropriate for this new medium, the Act would stifle
the creativity and breadth of expression occurring in cyberspace.
This result cannot be reconciled with the First Amendment.
Because plaintiffs and their members and online audiences face
the irreparable loss of First Amendment rights, plaintiffs ask
the Court to enter preliminary relief enjoining the Act's
enforcement.
STATEMENT OF THE CASE
A. The Plaintiffs' Online Speech
Plaintiffs include more than twenty organizations and individuals
who use online computer networks to send, display and view
information. All of the plaintiffs are both online speakers
and online listeners or recipients of information who communicate
through electronic mail ("e-mail"), online discussion groups, and
online databases that can be accessed by millions of other online
users simultaneously. Plaintiffs sue on their own behalf and on
behalf of those who access their online communications.
Plaintiffs who are membership organizations sue on their own
behalf and on behalf of their members who use online
communications.
All of the plaintiffs use online networks to send, display
or view information that could be considered to be "indecent" or
"patently offensive." Some communicate important health-related
information about sex(3). Others communicate important news
and educational information about human rights and civil
liberties(4). Still others communicate material that contains
strong language that many consider unsuitable for
minors to read or hear and that the Federal Communications
Commission has found "indecent" in the broadcast context(5).
Notwithstanding the social value of plaintiffs' speech for both
minors and adults, all face possible prosecution under the Act.
More specifically, plaintiffs include(6):
* American Civil Liberties Union (ACLU): a national civil
rights organization, the ACLU posts online information that
includes the language deemed offensive in the Supreme Court's
decision in FCC v. Pacifica, 438 U.S. 726 (1978), and hosts
online discussions on civil liberties issues such as arts
censorship, obscenity and indecency law, discrimination against
gay men and lesbians, and reproductive freedom. The ACLU also
sends and receives information about abortion through online
networks, the mails, telephone and FAX lines(7).
* Human Rights Watch (HRW): an international human rights
organization, HRW uses computer technology to communicate around
the world with members, interested persons, and the public. These
discussions, and HRW's published online human rights reports,
sometimes contain graphic language about prostitution, rape and
torture involving sexual mutilation(8).
* Electronic Privacy Information Center (EPIC): a research
organization advocating for free speech and privacy rights in the
online medium, EPIC maintains extensive online resources that
include references to censored material. For example, EPIC has
posted poems that were written by subscribers of America Online
(AOL) and then removed from AOL on the grounds that they were
"vulgar or [contained] sexually oriented language."(9)
* Electronic Frontier Foundation (EFF): a national non-partisan
organization advocating for civil liberties in the online medium,
EFF maintains extensive online resources. EFF's electronic
resources, like those of the ACLU and EPIC, include considerable
material about censorship including quotations from previously
censored material(10).
* Journalism Education Association (JEA): a national organization
of high school journalism teachers, JEA members teach minors how
to access information on computer networks and assist minors with
online research on many subjects, including censorship, gay and
lesbian issues, teenage sexuality, reproduction, abortion, art,
literature, and law(11).
* Computer Professionals for Social Responsibility (CPSR): a
national organization of computer professionals, CPSR and its
members are involved in every aspect of computer technology.
They use the online medium as a primary method of communication
and also host a number of online discussion groups that include
frank discussions of sex(12).
* National Writers Union (NWU): a national organization of
writers, NWU and its members use computer technology to
communicate with each other, often in frank terms, and some of its
members post erotic fiction on the networks(13).
* ClariNet: publishers of an electronic newspaper, ClariNet
distributes news articles that sometimes use frank, strong
language, and describe sexual subjects. ClariNet also publishes
a humor newsgroup which posts jokes, some of which include vulgar
language or sexually explicit material(14).
* Institute for Global Communications (IGC): a national
online service provider, IGC provides Internet web sites, access
to the Internet, and other online services primarily to nonprofit
organizations, including SIECUS (the Sex Information and
Education Council of the United States), the Family Violence
Prevention Fund, Stop Prisoner Rape, Human Rights Watch, and
Pacifica Radio(15).
* Stop Prisoner Rape (SPR): an organization dedicated to
advocacy to end prison rape, SPR hosts an Internet site that uses
frank street terms to discuss the problem of rape in the nation's
jails, prisons, and juvenile facilities in order to assist
inmates or former inmates in dealing with the consequences of
that experience(16).
* AIDS Education Global Information System (AEGIS) and
Critical Path AIDS Project (Critical Path): organizations that
offer vital information about AIDS and HIV, AEGIS and Critical
Path materials often necessarily contain discussions of sex because
HIV/AIDS is a sexually transmitted disease. In order to ensure
that those accessing the information fully understand prevention
methods, Critical Path and AEGIS discussions often use street
terms for sexual organs and/or acts. Critical Path also provides
access services for connection to other online networks(17).
* Safer Sex Page: an Internet site that provides safe sex
education materials, Safer Sex Page often uses frank and explicit
language and pictures. Safer Sex Page also hosts an online
discussion group that allows individuals to discuss sexual
subjects relevant to safer sex(18).
* BiblioBytes: a publisher of electronic books for sale over
the World Wide Web ("the web"), BiblioBytes offers romance
novels, erotica, classics, adventure, and horror(19).
* Wildcat Press: a publisher that specializes in classic gay
and lesbian literature, Wildcat Press advertises its books by
publishing excerpts online. Wildcat also sponsors two online
youth magazines that publish poetry, fiction, essays, fine art
and photography by teenagers, some of which is sexually
explicit(20).
* Queer Resources Directory (QRD): one of the largest online
distributors of gay, lesbian, and bisexual resources on the
Internet, QRD includes some material about human sexuality that
is sexually explicit(21).
* Justice On Campus (JOC): a student-operated Internet site
on free speech, JOC posts and discusses material that has been
censored, particularly material censored by schools(22).
* Cyberwire Dispatch (CWD): an online editorial column about
telecommunications issues, CWD often uses vulgar and graphic
language to protest censorship. Brock Meeks, publisher and
editor of Cyberwire Dispatch, also writes for other print and
online magazines(23).
* The Ethical Spectacle: an online monthly newspaper, The
Ethical Spectacle discusses ethical issues including Nazi
experimentation and the morality of pornography. In the course
of those discussions, works that have in the past been censored
or considered pornography are discussed and quoted(24).
* Planned Parenthood Foundation of America (PPFA): the
leading national voluntary health organization in the field of
reproductive health care, PPFA sends and receives, through online
communications, telephone, FAX, and regular mail, a broad range
of information about abortion(25).
B. The Censorship Provisions of the Act
Plaintiffs principally challenge two sections of the Act.
Section 502, amending 47 U.S.C. Section 223(a)(1)(B) (hereinafter
Section 223(a)(1)(B) or "the indecency provision"), provides
in part that anyone who, "by means of a telecommunications
device," "makes, creates, or solicits" and "initiates the
transmission" of any material "which is obscene or indecent,
knowing that the recipient of the communication is under 18 years
of age," "shall be criminally fined or imprisoned." Section 502,
adding 47 U.S.C. Section 223(d)(1) (hereinafter Section 223(d)(1)
or "the patently offensive provision"), makes it a crime to use
an "interactive computer service" to "send" or "display in a
manner available" to a person under age 18, any material that
in context, depicts or describes, in terms patently
offensive as measured by contemporary community standards, sexual
or excretory activities or organs ...
Plaintiffs also challenge Section 223(a)(2) and Section
223(d)(2), which make it a crime for anyone to "knowingly
permit[] any telecommunications facility under his control to be
used for any activity prohibited" in Sections 223(a)(1)(B) and
223(d)(1).
Finally, plaintiffs challenge 18 U.S.C. Section 1462, as
amended by the Act, which prohibits the sending and receiving of
information by any means regarding "where, how, or of whom, or by
what means" "any drug, medicine, article, or thing designed,
adapted, or intended for producing abortion ... may be obtained
or made."
C. The Nature of the Online Medium
To understand the urgency of the issues presented by this
case, it is necessary to appreciate the unique nature of the
online medium. Online networks represent a revolutionary
synthesis of several traditional means of communication and
places for communicating and exchanging information -- including
the telephone system, the postal service, a television or radio
broadcast, a newspaper, a library or book store, a fax machine, a
town hall or public park, and a shopping mall. The following
section discusses the ways people communicate over online
networks, the types of online systems and how they operate, and
the distinctions between the online medium and traditional forms
of communication.
1. Types of Online Systems
Although computer communications systems are various and
complex, there are a few basic types and functions that are
critical to understanding why censorship of material that is
"indecent" or "patently offensive" is unnecessary and
unconstitutional on these networks. An estimated 75,000 online
systems currently exist, varying widely in size, subject matter,
scope and features. These systems are accessed with a computer,
phone line, and modem. There is usually a start-up and
subscription fee, which varies in price depending on the size and
features of the system. Subscribers are provided with a user
name and a password that allows them to access the online
service. While some users employ their full proper names as
their online user names, others have online names that are
pseudonyms. These users therefore may send, view, and receive
online communications anonymously.
Most online systems offer a package of services that can
include e-mail to transmit private messages to one or a group of
users or to an established mailing list on a particular topic;
chat groups that allow simultaneous online discussions; ongoing
discussion groups; informational databases; and access to the
Internet. Text, audio, and video files can all be exchanged on
an online system if the user has the right computer hardware and
software. Once users obtain online access, they may generally
use all of the services without providing further identification
or paying an additional fee(26).
The quintessential online system is the Internet, the
largest online network in the world. The Internet is an enormous
network that links a large number of smaller networks set up by
universities, industry and government. While estimates are
unreliable due to its astronomical growth, the Internet is
believed to connect at least 59,000 computer networks and 2.2
million computers in 159 countries(27). There are an estimated
20-40 million users of the Internet(28). The Internet grows at a
rate of 10-15 percent per month, and a new online network is
connected to the Internet every 30 minutes(29).
Many Internet users are connected to the service through an
Internet Service Provider (ISP). ISPs provide connection,
software, and tools for using the Internet(30). Larger
businesses
and institutions often have a direct connection to the Internet.
Most universities in the United States are now directly connected
to the Internet and provide free accounts on their participating
computers to students, faculty, and staff.
Online users can communicate over the Internet in many
different ways. E-mail is the most basic online communication
method; users are given a personal e-mail address that allows
them to exchange messages or files with anyone else with an
Internet e-mail address. Gopher and the World Wide Web ("the
web") are two popular ways to create and access permanent
information databases, or online sites, established by thousands
of organizations and individuals through the Internet. Both
gopher and the web allow the user to print or download documents
from the Internet(31). The web, the newest Internet tool,
provides thousands of sites that contain menus, text, and
graphics. Most sites allow users to link instantly to other
documents and web sites by clicking on highlighted words in the
text of the document being viewed.
The Internet and other online services also provide access
to "online discussion groups," which are set up by particular
computer networks connected to the Internet. The host of the
discussion group sets up a section on the network that is devoted
to the discussion of a particular issue (akin to a public
bulletin board), and any other online user with access to the
host network can post messages on the topic by sending an e-mail
message to the discussion group. Users can also post responses
to particular messages(32). Plaintiffs host online discussion
groups on topics such as AIDS education; safer sex practices; and
university censorship(33).
Online users can also communicate using "chat rooms," which
are usually dedicated to a particular topic and allow users to
engage in simultaneous live interactive discussion (similar to a
multi-party phone call). Like online discussion groups, chat
rooms are usually hosted by particular networks that are
connected to the Internet(34).
Software is also available that allows any online user to
establish an "online mailing list" for a particular topic or
purpose. Other online users "subscribe" to online mailing lists
by sending messages from their own e-mail addresses. Any
subscriber can then send a message that is distributed to all of
the other subscribers on the list(35).
There are a number of methods available for searching for
information on the web. These methods, often called "search
engines," allow an online user to insert a string of words and
simultaneously search the thousands of databases on the web for
information on a particular subject(36). While users may tailor
their searches to exclude some extraneous information, it is not
possible to screen all unrelated information from appearing in
the search results. The search results provide users with a
citation list of sites on the subject searched, and the user then
chooses which of those sites to access(37).
The summary above provides only a cursory overview of a very
complex and promising new communications medium(38). All online
systems, though, have two important features in common:
* users must seek out with specificity the information they wish
to retrieve and the kinds of communications in which they wish to
engage; and
* online systems provide users with a multitude of options for
controlling and limiting, if desired, the kinds of information
they access through the networks.
2. Who Runs Cyberspace
Nobody owns cyberspace, and the ability of anyone to control
what goes into or through online networks varies widely depending
on the nature of the system. Many aspects of online networks and
sites run automatically without the active involvement of the
host. For example, online system software automatically answers
the telephone when a user attempts to log on, verifies passwords,
connects the user to the system, allows users to exchange
messages, downloads and uploads files when requested by users and
disconnects the user when the user logs off the system.
Large online services like America Online and Prodigy create
their own content files and also negotiate with other information
providers to post content on their systems. Some of these online
services review the content from outside information providers
before it is posted(39). However, in contrast to the control and
review of information they create themselves or receive from
third parties, these systems have little prior control over the
content of subscribers' e-mail or the speech that takes place in
their simultaneous chat rooms. In addition, it is impossible
to monitor access to other networks and sites through the host
network. For example, Plaintiff ACLU's web site provides a
"link" to Plaintiff EFF's web site, but Plaintiff ACLU has no
power to monitor EFF's web site communications(40).
There are other gatekeepers in cyberspace known as
moderators. Online mailing lists, online discussion groups and
chat rooms on a particular subject are often "moderated." Some
moderators are employed by universities or companies that set up
the list or newsgroup, but the overwhelming majority are people who
volunteer to serve as moderators because they are interested in
the topic. These moderators review incoming messages before they
are posted to a public site or sent to a mailing list to
determine whether the message is related to the subject matter or
conforms to other standards set up by the discussion group. For
example, Plaintiff Safer Sex Web Page hosts an online discussion
group about safe sex, but the creator of the web page reviews
messages posted by others before he posts them to the public
discussion group in order to screen out messages that do not
relate to the topic(41). Given the lack of centralized Internet
gatekeepers and the huge flow of online information, moderators
play a valuable role in focusing online discussion and
eliminating superfluous messages.
3. How Cyberspace Differs From Other Media
Users of online networks are producers as well as consumers
of information. Perhaps the most revolutionary aspect of
cyberspace is its ability to turn the passive consumer into a
mass producer of information. Online users, through services
like e-mail, online discussion groups, or the web, can publish or
post information to other users -- or to the entire Internet --
and then use the same services to read or receive information.
In fact, online networks make no distinction between information
providers and information users, and "most users play both roles
from time to time."(42) Unlike radio or television networks, in
which spectrum scarcity limits the number of potential
information producers, an online network can accommodate a
virtually unlimited number of both users and producers of
information(43).
Cyberspace is also more decentralized than any other
communications medium. It is comprised of thousands of individual
computers and computer networks, with thousands of individual
speakers, information providers, and information users, and no
centralized distribution point. Access to start-up technology,
content production, and connectivity are all decentralized in
cyberspace. Anyone can purchase the necessary equipment to get
online or to create a web site from her home computer. Once a
person becomes connected to global networks like the Internet,
there are no central gatekeepers who determine where that person
can travel in cyberspace. Many commentators have noted that the
decentralized nature of cyberspace is what has made the medium
flourish(44). It also makes cyberspace fundamentally different
from the broadcast medium(45).
Attempts to control content in cyberspace affect not just a
few distributors and producers, but the millions of U.S. citizens
and international users who speak daily online(46). The effect
of
censorship is thus much broader than on radio and television,
which have a limited and identifiable number of producers; it is
even broader than print because information travels
instantaneously across national boundaries. Congress has
conducted no study to determine how the Act's censorship
provisions would affect the interactive environment, or indeed,
whether they would be effective in keeping ostensibly harmful
materials from children.
Cyberspace also differs from print, television or radio
because it is "interactive." Other, traditional media are
one-way communications systems with no opportunity for input from
the user. Online communications, by contrast, allow users to
shift fluidly from the position of listener to that of speaker,
and from the role of consumer to that of information provider.
Moreover, unlike the traditional phone or fax, cyberspace
communications can be more than just two-way. There is no limit
to the number of people on either side of the sending or
receiving end of the communication.
Also unlike traditional media, cyberspace contains various
types of interactive communications. Online users can exchange
e-mail to one or a specified group of other users; engage in an
ongoing exchange of postings on a particular subject through
online discussion groups; talk simultaneously with others in an
online chat group; or retrieve documents from web sites.
Online media thus "offer users tremendous control over the
information that they and their children receive. Unlike
traditional mass media which 'assaults' viewers with content,
interactive media requires users to seek out information from any
number of the millions of available [online sites]."(47) Viewing
messages or files in cyberspace does not happen automatically.
Each participant in this form of communication chooses not only
whether, when and where to participate, but also whether to send
or receive information at any specific time; at what rate writing
and reading (sending and receiving) will occur; and what topic
this communication will concern. Thus, in contrast to television
or radio, it is very difficult to be "assaulted" with images
online. There is little risk of accidental exposure to
established online files, because an online user sees a subject
line or headline describing the content before it is viewed, and
actively chooses what she wishes to see or hear in cyberspace.
Computer communications and online communities also differ
from other media in their global reach. The Internet is
accessible from a growing number of countries around the
world(48). Once information is posted to an international online
network like the Internet, it is not possible to allow only
residents of a particular country to download that information;
the information becomes available to anyone in the world who can
access the Internet. Similarly, it is impossible to prevent
persons in other countries from posting information to
international online networks. There is currently no
technological method for determining with specificity the
geographic location from which users access or post to online
systems.
Finally, unlike other media, online systems offer both
"public" and "private" spaces for communication. E-mail and
online mailing lists are private. Web sites, online discussion
groups and chat rooms are "public" in the sense that any Internet
subscriber can access them, but they are not akin to a town hall
or public park because it is impossible to identify the physical
characteristics of other online users. This fact is particularly
relevant to legislation targeting minors. In public parks and
other public spaces in the geographical world, adults can easily
determine whether children are present, and may decide to alter
their speech and conduct accordingly. On the Internet, as it
currently functions, it is impossible to determine whether a
child or teenager is participating in a chat room or whether a
minor is accessing a public space on the network. Thus, any
regulations governing communications to minors inevitably affect
communications among adults.
4. Screening and Filtering Devices Available to Control Content
As described above, the very nature of the online medium
puts control of information and content in the hands of the
users. In addition, there are an increasing number of devices
that assist users in screening and blocking access to certain
kinds of information. Almost all online information has a
headline or subject line that tells the online user what will be
viewed if the user chooses to access the information. Online
users can simply choose not to view or download information if
the headline relates to information the user finds objectionable.
There are also methods that allow users to block out all incoming
messages from a particular person (for example, a harassing
e-mailer), or messages related to particular subject matter in
Usenet newsgroups or mailing lists(49).
In addition, some online services offer filtering and
screening devices specifically designed for parents, and industry
continues to develop software programs for the specific purpose
of assisting online users in controlling the information they
receive through their systems. Currently, there are four general
categories of technological options, each providing "a slightly
different, but equally effective, point of intervention."(50)
First, commercial online services like America Online, Prodigy,
and Compuserve provide optional features to prevent children from
accessing simultaneous chat rooms and to block access to Usenet
newsgroups based on keywords, subject matter, or specific
newsgroup. They also offer screening software that automatically
blocks messages that contain language such as the "seven dirty
words," and tracking and monitoring software so that parents can
see which sites their children have accessed. In addition, there
are "kids-only" discussion groups that are closely monitored by
adults. Finally, these services offer telephone help and
detailed instructions for parents(51).
Parents who subscribe to the Internet through an ISP can
also purchase software applications to control access to
content(52). "SurfWatch" software allows parents to block access
to Usenet groups and Internet sites which are known to contain
sexually explicit material. SurfWatch employs a group of
professional "net.surfers" who monitor the Internet for new
sites; these findings are then reviewed by a group of parents and
educators, and the list is automatically updated on the home
computer. "NET NANNY" allows parents to block any areas on the
Internet that the parent deems appropriate, to prevent children
from giving personal information to strangers by e-mail or in
chat rooms, and to keep a log of all online activity that occurs
on the home computer. "CYBERsitter" allows parents to monitor
their children's computer activity and can prevent children from
downloading specified files(53).
Other available software caters to schools and businesses
that provide access to the Internet. Products such as the
"Netscape Proxy Server" and "WEBTrack" allow schools and
businesses to block specific sites from access by all users on
the network, and to track and monitor Internet use. WEBTrack is
providing its software free to all K-12 schools(54).
These products are only some of the currently available ways
that parents can control their children's access to the Internet
and other online services. New products are constantly being
developed. About two dozen online companies have formed a
coalition entitled the Platform for Internet Content Selection
(PICS) to develop technical standards to enable voluntary rating
of a variety of online content. The standards would enable
content creators voluntarily to label their own content so that
individuals and families could block material, if they chose.
The group will also create standards to allow multiple
third-party rating of online content(55).
These programs are not foolproof. New online sites are
created daily and no software can guarantee that it will block
access to every site that discusses sex or uses "vulgar" words.
However, the various blocking mechanisms are much more effective
than a government ban in keeping minors away from material that
their parents and teachers deem inappropriate. Particularly
given the inability of any government to ban material posted
outside its borders, blocking mechanisms are a more effective
alternative than censorship.
ARGUMENT
Plaintiffs more than satisfy the requirements for preliminary
injunctive relief. In order for this Court to grant a Temporary
Restraining Order and Preliminary Injunction pursuant to Rule 65,
Federal Rules of Civil Procedure, plaintiffs must establish: (a)
that they are likely to prevail on the merits; (b) that they will
suffer irreparable harm if injunctive relief is not granted; (c)
that potential harm to the defendants from issuance of a
temporary restraining order does not outweigh possible harm to
the plaintiffs if such relief is denied; and (d) that the
granting of injunctive relief would not be against the public
interest. See In re Arthur Treacher's Franchise Litigation, 689
F.2d 1137, 1143 (3d Cir. 1982); Constructors Association of
Western Pennsylvania v. Kreps, 573 F.2d 811, 814-15 (3d Cir.
1978).
A. Plaintiffs, Their Members, and Audiences Will Suffer
Irreparable Harm if Preliminary Relief is not Granted
Plaintiffs have no adequate remedy at law for deprivation of
the constitutional right of free expression. As the Supreme
Court ruled in Elrod v. Burns, 427 U.S. 347, 373-74 (1976), "The
loss of First Amendment freedoms, for even minimal periods of
time, unquestionably constitutes irreparable injury." Plaintiffs
all use the online medium to communicate information that fits
within the broad definitions of "indecent" and "patently
offensive," and the statute's vague and overbroad terms will
force some plaintiffs to self-censor(56). Others, who either
choose not to self-censor or are unable to apply the statute's
vague and overbroad terms, will face the risk of criminal
prosecution if the Act is not temporarily enjoined(57).
In addition, many of the plaintiffs rely on online providers
and other carriers to distribute their online information. If
the statute is not temporarily enjoined, these providers will
likely ban communications that they consider potentially
"indecent" or "patently offensive" in order to avoid criminal
prosecution themselves(58). By doing so, they would deprive the
plaintiffs, their members, and those who use their online
resources of the ability to communicate about important issues.
B. Plaintiffs Have a Substantial Likelihood of Success on
the Merits
1. The Act Violates The First Amendment Because It
Criminalizes Constitutionally Protected Expression
The "indecency" and "patently offensive" standards in
Sections 223(a)(1)(B) and 223(d) are unconstitutional because
they criminalize constitutionally protected expression(59). As
noted, Section 223(a)(1)(B) criminalizes "mak[ing], creat[ing], or
solicit[ing]" and "transmi[tting]" any communication "which is
obscene or indecent, knowing that the recipient of the
communication is under 18 years of age." Section 223(d)(1) makes
it a crime to use an "interactive computer service" to "send" or
"display in a manner available" to a person under age 18, any
communication that in context, depicts or describes, in terms
patently offensive as measured by contemporary community
standards, sexual or excretory activities or organs ...
Yet "indecency" (unlike obscenity) is constitutionally protected
speech that often has substantial social value. Sable Comm. v.
FCC, 492 U.S. 115, 126 (1989).
Subject only to "narrow and well-understood exceptions, [the
First Amendment] does not countenance governmental control over
the content of messages expressed by private individuals."
Turner Broadcasting System v. FCC, 114 S. Ct. 2445, 2458-59
(1994) (citing R.A.V. v. St. Paul, 112 S. Ct. 2538, 2547 (1992);
Texas v. Johnson, 491 U.S. 397, 414 (1989)). The "indecency" and
"patently offensive" provisions of Sections 223(a)(1)(B) and
223(d) are unquestionably content-based bans, and thus are
presumptively unconstitutional. Content-based regulations of
speech will be upheld only when they are justified by
"compelling" governmental interests and "narrowly tailored" to
effectuate those interests. See Turner Broadcasting System, 114
S. Ct. at 2445; Simon & Schuster, Inc. v. New York State Crimes
Victims Bd., 502 U.S. 105 (1991); Sable Comm. v. FCC, 492 U.S.
115, 126 (1989); Fabulous Assoc., Inc. v. Pennsylvania Pub. Util.
Comm., 896 F.2d 780, 784 (3d Cir. 1990). The Supreme Court has applied
strict scrutiny to content-based regulations because "[a]t the
heart of the First Amendment lies the principle that each person
should decide for him or herself the ideas and beliefs deserving
of expression, consideration, and adherence." Turner
Broadcasting System, 114 S. Ct. at 2458. The censorship
provisions of the Act fail this strict scrutiny test.
First, there is simply no evidence of a "compelling
government interest" in protecting minors from a vague category
of "indecent" or "patently offensive" material in the online
medium. When First Amendment rights are at stake, courts cannot
defer to a legislative judgment but must make an independent
inquiry to assess whether the record supports the government's
interests. Sable, 492 U.S. at 129; Landmark Comm., Inc. v.
Virginia, 435 U.S. 829, 843 (1978); Turner Broadcasting System,
114 S. Ct. at 2471. The Court has found this "particularly true
where the Legislature has concluded" that the statute "does not
violate the First Amendment." Sable, 492 U.S. at 129. The chief
proponent in the Senate of the online indecency legislation
described its purpose by waving around a "blue book" of images
and declaring that children must be protected from such
images(60). Most of these images were of hard-core pornography
or child pornography, both of which are currently subject to
criminal prosecution under existing federal law(61). While
courts have found "a compelling interest in protecting the
physical and psychological well-being of minors," Sable, 492 U.S.
at 126; FCC v. Pacifica Foundation, 438 U.S. 726, 749 (1978);
Ginsberg v. New York, 390 U.S. 629, 640 (1968), to discuss that
interest in the abstract "is not to scrutinize the Government's
assertions as applied to this case." ACT III, 58 F.3d at 678
(Edwards, C.J., dissenting). Even assuming that the images
circulated among Congressmen as a justification for the
Communications Decency Act were not already illegal under
existing obscenity and child pornography laws, Congress provided
not one iota of evidence that minors are actually harmed by
exposure to communications deemed by some government agent to be
"indecent" or "patently offensive."(62) Unlike Ginsberg, 390 U.S.
at 639, which targeted speech that was obscene under a variable
obscenity test applied to minors, and that therefore lacked any
serious value, the Act targets any reference to sexual activity
or body parts that is considered "offensive," even if the ideas
or information in question undeniably has serious literary,
artistic, scientific, or educational value. Offensiveness is
surely not a proxy for harm(63). The Act thus bans much material
that is unquestionably valuable rather than harmful to older
minors, including information on safe sex practices, human rights
abuses, and civil liberties issues(64).
The Supreme Court has stated, "It is cardinal with us that
the custody, care and nurture of the child reside first in the
parents, whose primary function and freedom include preparation
for obligations the state can neither supply nor hinder."
Ginsberg, 390 U.S. at 639 (quoting Prince v. Massachusetts, 321
U.S. 158, 166 (1944)). Yet rather than facilitate parents'
supervision over what material their children should view, the
Act establishes the Government as final arbiter. A far cry from
gentle government assistance to parents, the Act actually puts
parents at risk of criminal prosecution if they choose to expose
their children to material deemed "indecent" or "patently
offensive" by a government officer. In fact, the inquiry-driven
nature of online communications may put parents and teachers at
risk of prosecution if they simply allow children to use online
communications, since any online use by a minor could result in
the "transmission" or "display" of "indecent" or "patently
offensive" material.
The statute fails the second prong of the strict scrutiny
test as well. That is, even if the government could establish a
compelling interest in protecting minors from "indecent" or
"patently offensive" material in cyberspace, the government
cannot show that a total ban on indecency is a "narrowly
tailored" way to achieve that interest. Indeed, the
Congressional Record shows no consideration of alternative ways
to restrict children's access to indecent materials, although
alternatives clearly exist(65). This failure to examine alternatives
itself makes the statute constitutionally infirm. How can the
Government argue that a ban on indecency is the "least
restrictive means" without ever having examined other means? In
Sable, the Supreme Court struck down a content-based statute
banning "indecent" commercial telephone messages on the ground
that "the congressional record contains no legislative findings
that would justify us in concluding that there is no
constitutionally acceptable less restrictive means, short of a
total ban, to achieve the Government's interest in protecting
minors." 492 U.S. at 29; see also id. at 131 (Scalia, J.,
concurring) (available technological alternatives render statute
invalid, even though some children "would manage to secure
access")(66).
Had Congress bothered to hold hearings on various ways to
restrict minors' access to communications with sexual content or
vulgar words, it would have learned of a myriad of ways in which
all online users, including parents, can control the information
they receive(67). While not foolproof, these methods put
responsibility for making choices about minors' access to
sexually explicit material "where our society has traditionally
placed it -- on the shoulders of the parent." Fabulous Assoc.,
Inc., 896 F.2d at 788 (citing Bolger, 463 U.S. at 73-74).
In fact, because of the nature of the online medium, even a
total ban will be ineffective at ridding online networks of
"indecent" or "patently offensive" material. See TBS, 114 S. Ct.
at 2470 (regulation must "in fact alleviate the ... alleged
harms in a direct and material way.") (citing Edenfield v. Fane,
113 S. Ct. 1792, 1798-1800 (1993) (emphasis added). Unlike
broadcasting, where the vast majority of Americans receive radio
and television only from broadcasts within the United States,
cyberspace is a global medium. Anyone in the world who has
access to an online network can post information that can be
viewed by anyone else in the world with access to the same
network. While the jurisdiction and practical ability of the
U.S. Justice Department to enforce this law outside the United
States is an open question, in fact online users and content
providers in other countries may not even know of the law and are
unlikely to follow it in any event. Therefore, it is highly
unlikely that online users, information providers and access
providers based in other countries will "purge" their systems of
material that could be "indecent" or "patently offensive" in the
United States. Censorship by any government (including the United
States) is simply not an effective way to eliminate "indecent" or
"patently offensive" communications from online services because
these services transcend national boundaries. Technologies that
allow online users to control the material from the receiving end
are a much more effective way to shield minors from allegedly
harmful material than any attempt to ban expression from the
distribution end.
In Fabulous Assoc., Inc., the Third Circuit held that
requiring adults to obtain an advance identification code in
order to obtain access to sexually explicit phone messages failed
the least restrictive means test and thus violated the First
Amendment. 896 F.2d at 788. As discussed above, in cyberspace,
screening and identification methods are not technically or
economically feasible, are ineffective, and result in an
effective total ban of "indecent" speech. The "indecency"
provisions of the Act are clearly even more restrictive than
those held unconstitutional in Fabulous Assoc., Inc.(68)
The statute is thus clearly not the least restrictive means
for controlling minors' access to objectionable material. The
rationale that led the Supreme Court to uphold time channeling
(not a total ban) of "indecent" language in broadcast, Pacifica,
438 U.S. 726 (1978), does not apply in cyberspace(69). Recently,
the Supreme Court emphasized the narrowness of the Pacifica
holding when it said, "the rationale for applying a less rigorous
standard of First Amendment scrutiny to broadcast regulation" --
"the unique physical limitations of the broadcast medium" --
"does not apply in the context of cable regulation." Turner
Broadcasting System, 114 S. Ct. at 2456; see also Sable, 492 U.S.
at 127; Bolger v. Youngs Drug Prods., 463 U.S. 60, 74 (1983);
Fabulous Assoc., Inc., 896 F.2d at 794 (Pacifica an
"emphatically narrow" holding). Just as the Supreme Court
clarified in Turner Broadcasting System for the cable medium,
"[t]he broadcast cases are inapposite" in the cyberspace medium
because cyberspace "does not suffer from the inherent limitations
that characterize the broadcast medium." 114 S. Ct. at 2456. The
Supreme Court has emphasized many times that its "decisions have
recognized that the special interest of the Federal Government in
regulation of the broadcast media does not readily translate into
a justification for regulation of other means of communication."
Youngs, 463 U.S. at 74; Turner Broadcasting System, 114 S. Ct. at
2456.
2. The Act Is Unconstitutionally Vague
a. "Indecency" and "Patent Offensiveness"
Vague laws violate two fundamental principles of due
process: (1) they leave the public guessing as to what actions
are proscribed; and (2) they invite arbitrary and discriminatory
enforcement by giving unbridled discretion to law enforcement
officers. Grayned v. City of Rockford, 408 U.S. 104, 108-09
(1972); Connally v. General Construction Co., 269 U.S. 385, 391
(1926). Vagueness is a particular problem where laws regulate
expression; and, accordingly, the Supreme Court has ruled that
"perhaps the most important factor affecting the clarity that
the Constitution demands of a law is whether it threatens to
inhibit the exercise of constitutionally protected rights. If,
for example, the law interferes with the right of free speech or
association, a more stringent vagueness test should apply."
Village of Hoffman Estates v. Flipside, Hoffman Estates, Inc.,
455 U.S. 489, 499 (1982)(70). In short, "[p]recision of
regulation ... must be the touchstone" where free expression is
concerned. NAACP v. Button, 371 U.S. 415, 438 (1963). Such
exactitude is necessary since "[u]ncertain meanings" inevitably
lead citizens to "steer far wider of the unlawful zone' ...
than if the boundaries of the forbidden areas were clearly
marked.'" Baggett v. Bullitt, 377 U.S. 360, 372 (1964) (quoting
Speiser v. Randall, 357 U.S. 513, 526 (1958)).
The "indecency" and "patently offensive" standards violate
both objectives of the vagueness doctrine and fail to meet the
stringent constitutional test for laws regulating speech. Both
standards are so vague that a "person of ordinary intelligence"
could not possibly "know what is prohibited." Grayned, 408 U.S.
at 108; Smith v. Goguen, 415 U.S. 566, 572 (1974). "Indecency" itself is
a completely imprecise term -- wholly subjective and dependent on
individual values and attitudes that no person engaged in speech
can be expected to anticipate(71). Any one person's notion of
"indecency" will be influenced by such factors as his or her age,
occupation, race, level of education, socioeconomic status,
geographic location, personal interests and politics. Rock or
country music fans are likely to have very different ideas on the
subject from conservative ministers; a New York sophisticate's
notions will contrast dramatically with those of many rural
residents; artists, students, intellectuals, and political
leaders are also likely to have different definitions(72).
Some courts have resolved challenges to the vagueness of the
term "indecent" or "patently offensive" by construing it to mean
"obscene." See, e.g., Hamling v. United States, 418 U.S. 87
(1974)(73). Such a saving construction is not possible in this
case because Congress plainly intended to suppress more than
"obscene" communications. Its deliberate use of the FCC's
"indecency" definition and the legislative history citations to
FCC v. Pacifica, 438 U.S. 726 (1978), demonstrate that Congress
intended to expand the narrow plurality decision in Pacifica,
explicitly limited to time-channeling and to the broadcast
medium, to a total ban on "indecency" throughout cyberspace.
142 Cong. Rec. at H1128-29 (Jan. 31, 1996). In fact, the
"patently offensive" language of Section 223(d) shows that
Congress, like the FCC in Pacifica, intended to use only one part
of the three-part test for obscenity, set out in Miller v.
California, 413 U.S. 15, 24 (1973), thus banning communications
even if they lack prurient appeal and have serious literary,
artistic, political, or scientific value.
Defining "indecency" as material that is "patently offensive
as measured by contemporary community standards" does little to
resolve the vagueness problem. First, what is "patently
offensive" is purely a matter of personal taste. As Justice
Harlan succinctly put it, "one man's vulgarity is another's
lyric." Cohen v. California, 403 U.S. 15, 25 (1971). Second,
the addition of "contemporary community standards" does nothing
to solve the vagueness problem. Just as nobody can predict what
a particular individual, organization, government agency, judge,
or prosecutor will consider "indecent," so no one can predict
what the varied and multifarious elements in our society might
consider "patently offensive as measured by contemporary
community standards."
Nor does the ban's confinement to "sexual or excretory
activities or organs" resolve the vagueness problem. The Supreme
Court has recognized that sex is "a great and mysterious motive
force in human life," and thus "one of the vital problems of
human interest and public concern." Roth v. United States, 354
U.S. 476, 487 (1957). Most speech about "sexual ... activities
or organs" is therefore entitled to full constitutional
protection, see, e.g., Sable, 492 U.S. at 126; it is only
"hardcore pornography" that lacks any serious literary, artistic,
political, or scientific value that may be outside the First
Amendment's umbrella. Miller v. California, 413 U.S. 15, 24, 27
(1973)(74). Since most communications on the subject of human
sexuality are constitutionally protected, and the Act seeks to
ban such communications only if they are "patently offensive"
according to "community standards," it is these latter completely
vague terms that are pivotal.
While much sex-related speech was considered taboo in the
past, contemporary America absorbs a great deal of such speech
through mainstream culture. For example, birth control,
masturbation, and orgasm are discussed on popular prime time
television shows such as "Seinfeld," "Roseanne," "Married with
Children," and "Mad About You;" "excretory functions" are a
popular source of humor on the animated show "The Simpsons,"
watched by children and adults. Sexually explicit literature
such as Philip Roth's Sabbath's Theater, winner of last year's
National Book Award, is on best-seller lists and available
through numerous bookstores and libraries. World-renowned
museums such as the Museum of Modern Art in New York and the Walker
Art Center in Minneapolis exhibit the art of Robert Mapplethorpe
and Andres Serrano. Broadway musicals and plays such as "Love!
Valour! Compassion!" and "Oh! Calcutta!" with all-nude scenes win
theater awards and are attended by hundreds of thousands of
people. Given the availability and consumption through
mainstream print, broadcast, and live performance of art and
information about sexuality by both adults and older minors, it
is harder than ever for anyone to judge what material
would constitute a non-obscene but "patently offensive"
description or depiction of "sexual or excretory activities or
organs." Thus, the fact that the only "patently offensive"
expression barred by Section 223(d)(1) deals with sex, excretion, or
body parts does nothing to alleviate the imprecision of the
operative terms: "patently offensive as measured by contemporary
community standards."(75
Both the "indecency" and the "patently offensive" standard
also violate the second prong of the constitutional vagueness
doctrine because they give unbridled discretion to prosecutors
and invite the worst type of arbitrary and
viewpoint-discriminatory censorship actions. See Forsyth County
v. The Nationalist Movement, 505 U.S. 123, 132 (1992); City of
Lakewood v. Plain Dealer Publishing Co., 486 U.S. 750, 757-58
(1988). The arbitrary effect of applying such terms to online
speech has already been experienced by some of the plaintiffs
when their online providers have attempted to screen "vulgar" or
"offensive speech." The host of plaintiff ACLU's web site has
expressed the possible need to remove certain explicit
information from the ACLU site(76). Plaintiff ClariNet had its
entire set of newsgroups about sexual subjects (covering news
stories about anti-abortion activists, gay rights, and
information) blocked by certain online providers who feared
liability under German law for sex-related speech(77). For
prosecutors to be allowed to prosecute persons or organizations
using such vague standards is to invite the most serious kind of
constitutional harm. As Justice Harlan said in Cohen, 403 U.S.
at 25, "it is largely because governmental officials cannot make
principled distinctions in this area that the Constitution leaves
matters of taste and style so largely to the individual."
Subjective standards for speech are not cured by a
government's interest in protecting children from exposure to
harmful material. As the Supreme Court said in Interstate
Circuit v. Dallas, 390 U.S. 676 (1968):
The permissible extent of vagueness is not directly
proportional to, or a function of, the extent of the power to
regulate or control expression with respect to children. . . .
"It is ... essential that legislation aimed at protecting
children from allegedly harmful expression -- no less than
legislation enacted with respect to adults -- be clearly drawn
and that the standards adopted be reasonably precise so that
those who are governed by the law and those that administer it
will understand its meaning and application."
Id. at 689 (quoting People v. Kahan, 15 N.Y.2d 311, 313 (1965)
(Fuld, C.J., concurring)).
The Supreme Court's 1978 plurality decision in FCC v.
Pacifica Foundation, 438 U.S. 726, does not salvage the vagueness
of such terms as "indecency" or "patently offensive." Pacifica
narrowly upheld the application of an earlier version of the
FCC's "indecency" test -- confined to the time channeling of
programs using specific vulgar words -- to a comic monologue that
involved repetitive, "shock value" use of common vulgar
words(78).
Without addressing the inherent vagueness of an "indecency"
definition that turned on such ineffable concepts as "patent
offensiveness" and "community standards" -- indeed, without
addressing the facial constitutionality of the FCC's definition
at all(79) -- the Pacifica plurality narrowly held that it did
not violate the First Amendment for the agency to prohibit a
radio station from broadcasting the repetitive use of these
particular words during hours when children were most likely to
be in the listening audience. Id. at 750. Because the Pacifica
decision was narrowly limited to its facts,(80) involved only
time-channeling, and rested on the "'unique' attributes of
broadcasting," Sable, 492 U.S. at 127, Bolger, 463 U.S. at 74, it
does not foreclose a vagueness challenge to the "indecency" and
"patently offensive" provisions of Sections 223(a)(1)(B) and
223(d)(1)(81).
The Supreme Court is scheduled on February 21, 1996 to hear
oral argument in Alliance for Community Media v. FCC, 56 F.3d 105
(D.C. Cir. 1995), a case challenging "indecency" regulations for
cable television. The Supreme Court may decide a vagueness
challenge to the "indecency" regulations in that case.
Meanwhile, a stay is in effect preventing application of the
"indecency" regulations to cable television. Certainly, this
Court should stay application of vague "indecency" bans to all of
cyberspace at the very least until a decision in Alliance.
b. The Vagueness of the Liability Provisions
The Act puts access providers like plaintiffs IGC, AEGIS,
and Critical Path AIDS Project at risk of criminal prosecution
simply for allowing users to access their online systems,
regardless of whether they produced the content of the material
that is "indecent" or "patently offensive." Sections
223(a)(1)(B) and 223(d)(1) contain no specific intent clause.
Any access provider that provides general access to a variety of
online information databases, online discussion groups, and chat
rooms "knows" that "indecent" or "patently offensive" information
could be "displayed" to a minor if a minor gains access through
its system. The only sure way to comply with the statute would
be to provide accounts only to adults, or to provide two separate
networks -- one for adults, and one for minors -- which would be
economically infeasible and would impermissibly restrict the
First Amendment rights of minors to engage in online
communication.
The defense set out in Section 223(e) exacerbates the
uncertainty of the liability provisions. Section 223(e)
provides:
(1) No person shall be held to have violated subsection (a) or
(d) solely for providing access or connection to or from a
facility, system, or network not under that person's control,
including transmission, downloading, intermediate storage, access
software, or other related capabilities that are incidental to
providing such access or connection that does not include the
creation of the content of the communication.
Because even access providers who do not themselves create the
content of communications on their systems can technologically
exercise "control" over the communications for which they are
conduits, it is far from clear that this defense relieves access
providers of liability.
The Act also puts information providers at risk for content
posted by others on their sites. Many of the plaintiffs sponsor
online discussion groups and chat rooms in which they allow
online users to post messages on a particular topic or to discuss
a topic simultaneously with other online users(82). While these
plaintiffs do not themselves create the posted messages, their
online resources are used to "display" or "initiate the
transmission" of the messages, and thus they could be held liable
if someone posted an "indecent" or "patently offensive" message
and a minor gained access to the message through their sites.
3. The Act is Substantially Overbroad
a. The Act Bans Speech That Is Constitutionally Protected for
Minors
The constitutional infirmity of overbroad legislation "is
that it sweeps protected activity within its proscription." M.S.
News Co. v. Casado, 721 F.2d 1281, 1287 (10th Cir. 1983) (citing
Erznoznik v. City of Jacksonville, 422 U.S. 205, 212-13 (1975);
Grayned v. City of Rockford, 408 U.S. 104, 114 (1972)); see also
NYS Club Ass'n v. City of New York, 487 U.S. 1 (1988); Maryland
v. Munson, 467 U.S. 947 (1984); Broadrick v. Oklahoma, 413 U.S.
601 (1973). Sections 223(a)(1)(B) and 223(d) are overbroad
because they ban much expression that is protected even for
minors. The Supreme Court has ruled in many contexts that the
First Amendment protects minors as well as adults; and that
minors have the constitutional right to speak and to receive the
information and ideas necessary for their intellectual
development and their participation as citizens in a
democracy,(83) including information about reproduction and
sexuality, Carey v. Population Servs. Int'l, 431 U.S. 678, 693
(1977). With only narrow exceptions, therefore, it is
unconstitutional for the government to restrict minors'
participation in the marketplace of ideas.
The statute impermissibly burdens minors' First Amendment
rights in two ways. First, the Act could result in the outright
exclusion of minors from many of the vast public spaces in the
online medium that are currently accessible to both minors and
adults. Most information providers and other online users do not
currently know whether they are communicating with a minor.
Thus, there is no way to ensure that they could not be held
criminally liable for "transmi[tting]," Section 223(a)(1)(B), or
"displaying," Section 223(d), to minors "indecent" or "patently
offensive" material. Minors would have to be completely excluded
from online public spaces to ensure that adult users and
information providers could post material they are
constitutionally entitled to post. (This would also require an
identification scheme that in the online medium is both
technically and economically infeasible, and would impermissibly
burden adult access.) While minors could access online public
spaces that had been cleared of all "patently offensive" or
"indecent" speech, most information providers, including the
plaintiffs in this case, do not have the resources to create two
versions of their online communications -- one for adults, and
one for minors(84).
Second, the statute impermissibly burdens minors' First
Amendment right to ideas and information about sexuality,
reproduction, and the human body -- subjects of interest not only
to humanity generally, but of special interest to maturing
adolescents. While there are limited exceptions to minors' First
Amendment rights to sex-related materials, those exceptions do
not apply to the Act's vague and overbroad speech ban. One such
exception is obscenity, a category of expression which the Court
has ruled does not merit First Amendment protection because it is
"no essential part of the exposition of ideas" and is "utterly
without redeeming social importance." Roth v. United States, 354
U.S. at 484-485. In Ginsberg v. New York, 390 U.S. 629 (1968),
the Court held that a state could ban the dissemination to minors
of materials not obscene when distributed to adults, but only if
the material met an adjusted three-pronged "obscene" or "harmful
to minors" test(85). Thus, states may only prohibit the
dissemination to minors of material that lacks serious value for
them, appeals to their shameful or morbid (not healthy) interest
in sex, and contains depictions or descriptions of specified
sexual activities that a local community would consider patently
offensive for minors(86). And in upholding "harmful to minors"
laws as constitutional, some courts have been careful to consider
the First Amendment rights of older adolescents and have
construed such laws to prohibit only that material that would
lack serious value for a 17 year-old. American Booksellers
Assoc. v. Webb, 919 F.2d 1493, 1504 (11th Cir. 1990); American
Booksellers Assoc. v. Virginia, 882 F.2d 125 (4th Cir. 1989)(87).
The statute deliberately encompasses a vast amount of
valuable material that falls outside the Ginsberg "harmful to
minors" test -- material that has serious value to minors and
that appeals only to a healthy interest (or no interest) in
sexuality. It is therefore unconstitutionally overbroad because
it criminalizes speech and information that minors have a First
Amendment right to engage in and receive, including the
information provided by plaintiffs in this case. Plaintiff ACLU,
EPIC, and EFF believe that it is important for minors to be able
to access their online educational materials about civil
liberties issues so that they can recognize when their rights are
being infringed(88). Plaintiffs Journalism Education
Association, AEGIS, Critical Path AIDS Project, and Safer Sex Web
Page believe that it is important that minors, many of whom are
sexually active, have access to their online safe sex education
material because it could literally save their lives(89).
b. The Act Unconstitutionally Restricts the Free Speech Rights
of Adults
Even if an "indecency" or "patent offensiveness" standard
could constitutionally be applied to restrict the ideas and
information available to minors, government may not, by asserting
its interest in "protecting" minors, ban the exercise of First
Amendment rights by adults. Butler v. Michigan, 352 U.S. at 382;
see also Erznoznik v. City of Jacksonville, 422 U.S. at 212-13
(holding unconstitutional ordinance that prohibited the showing
of films containing nudity ostensibly in the interests of
protecting minors); Fabulous Assoc., Inc., 896 F.2d at 788
(holding unconstitutional statute that inhibited adult access to
sexually explicit phone messages ostensibly in the interests of
protecting minors). Because of the nature of the online medium,
the Act is effectively a total ban on "indecency" and "patent
offensiveness" in cyberspace and thus violates the free speech
rights of adult online users. In the words of Justice
Frankfurter, the Act "burn[s] up the house to roast the pig."
Butler, 352 U.S. at 383.
The Act criminalizes the "transmission" or "display" to
minors of "indecent" or "patently offensive" material. The vast
majority of information in online networks is displayed in
"public" spaces -- spaces that act as online libraries or
bookstores -- and that minors as well as adults can access. As
described above, World Wide Web sites, gopher sites, online
discussion groups, chat rooms, and Usenet news groups are all
accessible by minors. There is no way to prevent transmission or
display to minors of "indecent" or "patently offensive" material
in the public areas of cyberspace without seriously restricting
the rights of adults and minors to constitutionally protected
material and crippling the potential of a new communications
medium(90).
A review of the nature of online communications illustrates
the problem. There are two ways in which plaintiffs and other
online users and providers could attempt to comply with the Act.
The first would be to attempt to screen all "indecent" or
"patently offensive" material from all of the public spaces on
online networks. This would de facto reduce all of the
information in online public spaces to material that is suitable
only for children, in direct violation of Butler. In addition,
because "indecent" and "patently offensive" are inherently vague
terms, plaintiffs and other online users and providers have no
idea how to determine which material on the subject of sexuality
or reproduction, or containing "vulgar" language, they need to
screen. An attempt to screen would thus inevitably lead to
suppression of constitutionally protected material both for
adults and for minors(91). Baggett v. Bullitt, 377 U.S. at 372.
Screening is also practically if not technically infeasible.
Unmoderated Usenet discussion groups would be eliminated because
there is no one distribution point for such services at which to
even attempt screening. Other online discussion groups, web
sites, and interactive information databases would be eliminated
because of the enormous burden of attempting to screen postings
from outside users. To shut down the "interactive" feature of
online communications would be to stifle its communication
potential as a diverse marketplace of ideas in which anyone can
participate. For many noncommercial providers, such a screening
requirement would also be economically infeasible, given the
enormous time and human resources it would take to screen the
vast amount of information that flows into a given online site.
See Fabulous Assoc., Inc., 896 F.2d at 788 (noting economic
burden of requiring access codes substantially burdens
constitutionally protected speech)(92).
In addition, the mechanisms that allow users to search
hundreds of different online information databases simultaneously
could no longer be used because it is technically impossible to
screen all "indecent" or "patently offensive" material from
appearing in the search results(93). Similarly, the tremendously
useful "linking" feature of online communications would have to
be eliminated because it would be impossible to screen all the
material on the linked sites in addition to the content on the
home site(94).
The second way in which plaintiffs and other online users
and providers could attempt to comply with the Act would be to
forbid minors from accessing online resources that might be
"indecent" or "patently offensive." This would require online
information providers to create two versions of their online
communications -- one for adults, and one for minors. It would
also require an identification scheme to ensure that minors could
not gain access to the adult sites. While requiring payment via
credit card or other identification card (e.g., driver's license)
would be a way to exclude most minors, most online information is
currently provided for free (once the user has paid a general
access subscription fee), and without identification
requirements. Imposing such requirements would be economically
infeasible for all but the largest corporate online providers and
would prevent adults without credit cards or proper
identification from accessing online resources, thus excluding a
large class of American adults and an even larger class of adults
outside the United States. Identification requirements would
also prevent anonymous access, and would require content and
access providers to maintain records of users who had accessed
their sites in order to prove that a particular user was not a
minor. All of these burdens on adult access are impermissible in
the context of constitutionally protected material(95). See
Fabulous Assoc., Inc., 896 F.2d at 788.
4. The Act is Impermissibly Underinclusive
The Act constructs an impermissible system of discrimination
by imposing regulations on those who communicate through the
online medium but not on those who communicate the same
information through the print medium. See First Nat'l Bank of
Boston v. Bellotti, 435 U.S. 765, 784-85 (1978) ("In the realm of
protected speech, the legislature is constitutionally
disqualified from dictating ... the speakers who may address a
public issue."); Minneapolis Star & Tribune Co. v. Minnesota
Comm'r, 460 U.S. 575, 592 (1983). The statute distinguishes
among speakers for content-based regulation based upon a
criterion that "bears no relationship whatsoever to the
particular interests ... asserted." City of Cincinnati v.
Discovery Network, Inc., 113 S. Ct. 1505, 1514 (1993).
Many of the plaintiffs provide educational materials through
both the print and the online medium. For example, plaintiffs
ACLU, Human Rights Watch, Electronic Privacy Information Center,
and Electronic Frontier Foundation all create educational
materials about civil liberties and human rights issues that are
distributed through brochures, books, and pamphlets and that are
also available through their online sites(96). Irrationally, the
statute would make it a crime to send an electronic version of a
document that is "indecent" or "patently offensive" even though
print distribution of the same document is unquestionably
protected by the First Amendment. It makes no sense for
plaintiffs to deny a minor access to an online version of a
publication that the minor could easily -- and legally -- request
from plaintiffs in printed form.
The discrimination between online speakers and print
speakers is particularly problematic because online
communications systems have provided a low-cost forum for many
speakers who do not own mainstream newspapers, broadcast, or
other mass media. With absolutely no constitutional
justification or rationale, the statute creates second-class
First Amendment rights for online users and information
providers(97).
5. The Act Unconstitutionally Criminalizes Private E-mail
Sections 223(a)(1)(B) and (d) criminalize "indecent" or
"patently offensive" communications to or between minors. The
Act thus prohibits a 17-year-old from having an online discussion
with a romantic partner about activities in which they had
lawfully engaged, and which they had lawfully discussed on the
phone, in person or by regular mail. The Act also prohibits
e-mail distribution among or between minors of important
information about sexuality, reproduction and the human body that
minors are clearly constitutionally entitled to receive.
The Act has an additional constitutional defect when applied
to private e-mail. E-mail is the equivalent of a private,
personal correspondence sent through the U.S. mail, or private
conversations by telephone or within the home. Such
communications are protected against governmental invasion or
censorship by the constitutional right to privacy found in the
First, Fourth, and Ninth Amendments and the substantive due
process clause of the Fifth Amendment. See Griswold v.
Connecticut, 381 U.S. 479, 484-85 (1965); id. at 487-91 (Goldberg,
J., concurring) (Ninth Amendment); id. at 500 (Harlan, J.,
concurring) (substantive due process); Carey, 431 U.S. at 684-85
(due process clause).
As the Supreme Court recognized in Katz v. United States,
389 U.S. 347 (1967), individuals ordinarily have a "reasonable
expectation of privacy" in their telephone conversations. Minors
as well as adults have constitutional privacy rights that include
a right to personal decisionmaking about intimate matters
concerning sex and reproduction. See, e.g., Bellotti v. Baird,
443 U.S. 622, 642-43 (1979). The exercise of such a privacy
right requires access to information, particularly when acquired
through personal, intimate, one-on-one conversations with family
members, romantic partners, or trusted friends. If, as the Court
ruled in Stanley v. Georgia, 394 U.S. 557 (1969), the right of
privacy, combined with the First Amendment principle of freedom
of thought, prohibits the government from criminalizing the
possession of even obscenity within the privacy of the home,
surely the combined weight of free thought and privacy principles
protect "indecent" communications through private e-mail from
governmental invasion or control.
6. The Act Violates the First Amendment Right to Access
Information Anonymously
Currently, online users have a password and user name which
they use to sign on to their online service. Many user names are
pseudonyms that allow users to send, view, and receive online
communications anonymously(98). Several of the plaintiffs offer
information that users might want to access anonymously. For
example, AEGIS provides information and opportunities for
discussions about AIDS or HIV. Similarly, Queer Resources
Directory provides information and discussion opportunities to
gays, lesbians, and bisexuals. The Safer Sex Page provides
information on safer sex and sponsors a Safer Sex Forum for open
discussion(99). Users of these online services have reasons to
remain anonymous. For instance, users could fear discrimination
and harassment, or simply want to maintain their privacy on
sensitive issues. Some people might forego or be inhibited from
discussing issues and receiving information if they had to
disclose their identities(100).
In order to comply with the Act, plaintiffs would have to
require identification of those seeking access to a web site,
chat room, discussion group, or other online forum. Such an
identification requirement would remove the current option of
anonymity for both speakers and receivers of information.
As the Court recently stressed, anonymity "exemplifies the
purpose behind the Bill of Rights, and of the First Amendment in
particular: to protect unpopular individuals from retaliation --
and their ideas from suppression -- at the hand of an intolerant
society." McIntyre v. Ohio Elections Com'n, 115 S.Ct.
1511, 1524 (1995). Thus, with regard to speakers, the Court has
held on a number of occasions that the right to publish
anonymously is protected by the First Amendment. E.g., id. at
1516; Talley v. California, 362 U.S. 60, 64-65 (1960).
Cyberspace represents a new frontier for literary, political, and
other publishers who must be granted the same rights of anonymity
in this medium as they are accorded in print.
The anonymous receipt of information in cyberspace is also a
First Amendment right. The Supreme Court has held that the right
to receive literature is protected by the First Amendment.
Lamont v. Postmaster General, 381 U.S. 301, 305, 307 (1965). The
government cannot require a written request to receive mail
because such a requirement limits "the unfettered exercise of the
addressee's First Amendment rights." Id. at 305. Because the
receipt of information is protected by the First Amendment, the
importance accorded anonymous receipt should be as great as that
given to anonymous publishing as discussed above. Further, the
Supreme Court has concluded that compelled disclosure of identity
may unconstitutionally deter the exercise of First Amendment
rights. E.g., Brown v. Socialist Workers '74 Campaign Comm., 459
U.S. 87 (1982) (holding unconstitutional as applied to unpopular
political party statute requiring candidates for political office
to disclose identities of contributors and recipients of campaign
funds); NAACP v. Alabama, 357 U.S. 449 (1958) (denying state
access to membership lists). This is no less so for recipients
of information than for publishers.
As the Third Circuit has discussed, "[a]n identification
requirement exerts an inhibitory effect ... and such deterrence
raises First Amendment issues comparable to those raised by
direct state imposed burdens or restrictions." Fabulous
Associates v. Pennsylvania Public Utility Com'n, 896 F.2d 780,
785 (3d Cir. 1990) (citing Talley, 362 U.S. at 64-65). Thus, the
court held unconstitutional a statute mandating access codes with
an identification requirement for the use of phone sex services
because there was a less restrictive alternative. Fabulous, 896
F.2d at 787-88. The identification requirement for receipt of
information over the computer is analogous and a fortiori
violates First Amendment rights of anonymity.
7. 18 U.S.C. § 1462(c), as Amended, Criminalizes
Constitutionally Protected Speech About Abortion
Plaintiffs ACLU and Planned Parenthood, and others routinely
engage in communications barred by 18 U.S.C. § 1462 by
providing information about how and where to obtain abortions or
abortifacient drugs and devices, and when and under what
conditions doctors may perform abortions(101). These plaintiffs
also receive information about the conditions under which
abortions are performed, how they are performed, where they are
performed, and how to use abortifacient drugs and devices(102).
Plaintiffs receive this information from physicians performing
abortions, abortion rights advocates, and others(103). Through
these communications, Plaintiffs send and receive information
regarding "where, how, or of whom, or by what means" "any drug,
medicine, article, or thing designed, adapted, or intended for
producing abortion ... may be obtained or made." 18 U.S.C. §
1462(c). Thus, the activities of Plaintiffs fall squarely
within the § 1462 ban.
Speech regarding abortion, including the communications
barred by § 1462(c), is protected by the First Amendment.
Bigelow v. Virginia, 421 U.S. 809 (1975); see also Bolger v.
Youngs, 463 U.S. 60 (information regarding contraceptives); Carey
v. Population Servs. Int'l, 431 U.S. at 700-02 (plurality
opinion) (same); Virginia State Bd. of Pharmacy v. Virginia
Citizens Consumer Council, Inc., 425 U.S. 748 (1976) (information
regarding prescription drugs in general). Restrictions on
abortion-related speech are impermissible even under the lesser
scrutiny applied to commercial speech. Bigelow, 421 U.S. at
828-29; see also Bolger, 463 U.S. 60; Carey, 431 U.S. at 700-02.
In this case, where the speech is non-commercial and the
restriction is content-based, the statute is "presumptively
invalid," R.A.V. v. City of St. Paul, 505 U.S. at 382, and is
subject to "the most exacting scrutiny," Turner Broadcasting
System v. FCC, 114 S. Ct. at 2459 (1994).
The government has no compelling interest in restricting
speech about abortion. On the contrary, speech about abortion
"relates to activity which is [constitutionally] protected from
unwarranted state interference." Bolger, 463 U.S. at 69
(information regarding contraception); see also Carey, 431 U.S.
at 700-01 (same); Bigelow, 421 U.S. at 822 (abortion). The
Supreme Court has never found a government interest in
suppressing speech related to abortion or contraception
sufficient to uphold a restriction. See, e.g., Bigelow, 421 U.S.
at 827 (rejecting, inter alia, argument that ban on
abortion-related advertising furthered the state's interests in
maintaining the quality of medical care); Bolger, 463 U.S. at 71
(rejecting, inter alia, argument that alleged offensiveness could
justify suppression of protected speech); Carey, 431 U.S. at 701
(same). Therefore, A71462 is invalid on its face.
C. Any Asserted Harm to Defendants from Issuance of a
Temporary Restraining Order and Preliminary Injunction does not
Outweigh the Potential Harm to Plaintiffs if Censorship
Provisions of the Act are Not Enjoined
The harm to the plaintiffs, their members, and audiences, if
the Act is not enjoined, is of constitutional dimension.
Plaintiffs face suppression of constitutionally protected speech.
The banned speech includes material of significant educational,
political, medical, artistic and social value that deals with
issues such as sexuality, reproduction, human rights and civil
liberties. Some of the speech could literally save lives.
Moreover, the vagueness and subjectivity of the bans will result
in suppression of even greater amounts of information than "if
the boundaries were clearly marked." Speiser v. Randall, 357 U.S.
513, 526 (1958). Plaintiffs will either have to self-censor
their communications or face criminal prosecution if the Act is
not enjoined. Plaintiffs who rely on online providers to
distribute their information may be deprived of the ability to
communicate about important issues if the Act is not enjoined and
the providers ban their material in order to avoid prosecution
themselves.
The only governmental interest is in suppression of speech
that is "indecent" or "patently offensive" and transmitted by
computers. Such a purpose is flatly unconstitutional. Even if
the government could show some harm to minors from some
subcategory of "indecent" speech, there are numerous less
burdensome methods available for protecting those minors.
Finally, the identical speech is available to minors in print
form and will continue to be available whether or not the Act is
in effect.
D. Preliminary Relief Would Serve the Public Interest
For all the foregoing reasons, the injunction will not
disserve the public interest. There is no public interest in
suppressing constitutionally protected speech or in reducing all
speech in the promising new online medium to a level considered
by the government to be acceptable for minors. On the contrary,
the public interest is served by robust exchange of ideas, and
many alternatives are available for those parents who wish to
shield their children from online communications that they deem
inappropriate.
CONCLUSION
For all these reasons, plaintiffs respectfully ask that the
motion be granted.
_________________________
Christopher A. Hansen
Marjorie Heins
Ann Beeson
Steven R. Shapiro
Laura K. Abel
Catherine Weiss
Reproductive Freedom Project
American Civil Liberties Union Fdn.
132 West 43 St.
New York, NY 10036
212-944-9800
Stefan Presser
ACLU of Pennsylvania
125 South Ninth St. Suite 701
Philadelphia, PA 19107
215-923-4357
David L. Sobel
Marc Rotenberg
Electronic Privacy Information Center
666 Pennsylvania Ave. SE Suite 301
Washington, D.C. 20003
202-544-9240
Michael Godwin
Electronic Frontier Foundation
1550 Bryant St., Suite 725
San Francisco, CA 94103
415-436-9333
Roger Evans
Legal Action for Reproductive Rights
Planned Parenthood Federation of America
810 Seventh Avenue
New York, NY 10019
212-261-4708
February 8, 1996
END NOTES
1. The Terms "computer communications systems," "online
medium," "interactive computer service," "online networks," and
"cyberspace" will be used synonymously in this brief to refer to
the combination of all online computer technologies affected by
the Act.
2. Plaintiffs also challenge a portion of the Act that bans
communications about abortion, and argue that the Act violates
the First Amendment rights of those who wish to receive ideas and
information anonymously.
3. Sister Mary Elizabeth Aff. Paragraphs 3, 6.
4. Chatelle Aff. Para. 9; McCullagh Aff. Para. 4; Rotenberg
Aff. Para. 7.
5. Glasser Aff. Para. 11; Meeks Aff. Para. 4.
6. For further detail on the plaintiffs and the basis for
their fear of prosecution under the Act, see the Complaint and
the affidavits submitted with this motion.
7. Glasser Aff.
8. Mariner Aff.
9. Rotenberg Aff.
10. Godwin Aff.
11. Perkins Bowen Aff.
12. Krause Aff.
13. Chatelle Aff.
14. Templeton Aff.
15. Sears Aff.
16. Donaldson Aff.
17. Sister Mary Elizabeth Aff.; Kuromiya Aff.
18. Troyer Aff.
19. Hauman Aff.
20. Nell Warren Aff.
21. Casti Aff.
22. McCullagh Aff.
23. Meeks Aff.
24. Wallace Aff.
25. Johnson Aff.
26. Computer bulletin board systems that specialize in
adult material generally require identification and payment and
screen out minors, and are thus not at issue in this case.
27. General Accounting Office, Report to Congress:
Information Superhighway-An Overview of Technology Challenges,
January 1995, at ch. 1.
28. Peter Lewis, "On the Net," New York Times, May 29,
1995; see also White House Interagency Task Force on the National
Information Infrastructure, "The Global Information
Infrastructure: Agenda for Cooperation," Feb. 15, 1995, at 5.
29. White House Task Force, at 6. See also statistics on
the Internet, at gopher://nic.merit.edu:7043/11/statistics/nsfnet
/history/netcount.
30. See, e.g., Sears Aff.
31. Gopher, the older of the methods (and now becoming
somewhat obsolete), is a menu-driven program that allows the user
to "gopher" through multiple layers of menus to search for
information on a particular topic, and to link to other sites on
the Internet. For other ways to access information on the
Internet, see Plaintiff EFF's Guide to the Internet, available in
print form and on the Internet at http://www.eff.org.
32. "Usenet" is another set of online discussion groups
that has its own generalized distribution system. Usenet
discussion groups are known as "newsgroups." Usenet carries more
than 40 million characters a day -- "roughly the equivalent of
volumes A-G of the Encyclopedia Britannica." EFF's Guide to the
Internet, ch. 3. Most online systems that provide access to the
Internet also provide access to Usenet.
33. See, e.g., Kuromiya Aff. ¶ 7; Troyer Aff. ¶ 17; McCullagh
Aff. ¶ 9.
34. Glasser Aff. ¶ 3; Godwin Aff. ¶ 3.
35. Godwin Aff. ¶ 5; Mariner Aff. ¶ 7; Sister Mary Elizabeth
Aff. ¶ 6; Kuromiya Aff. ¶ 10.
36. The search mechanisms work somewhat similarly to the
searching features used on the popular online legal databases
LEXIS and Westlaw. Internet search engines differ from LEXIS and
Westlaw, though, because they allow users to search hundreds of
computer networks located all over the world rather than simply
searching one centralized computer network.
37. Kuromiya Aff. ¶ 12; Godwin Aff. ¶ 4.
38. For example, many online services were developed separately
from the Internet. These services provide their own content to
subscribers and now also usually provide access to content over
the Internet. Large commercial services like America Online,
CompuServe, and Prodigy have over a million subscribers each and
contain thousands of databases and chat groups on a variety of
topics. Smaller online networks, known as Computer Bulletin Board
Systems (BBS's), usually cater to people interested in
specialized subject matter or to people from a particular
geographical region.
39. For example, America Online negotiated with businesses to
advertise on its "Downtown AOL" section. ATKOL Video, a gay
video store, negotiated a contract to post its mail order catalog
on the service. After signing an agreement with ATKOL, AOL
reviewed the catalog and then censored several video titles
from the online version before it would allow posting. See "AOL
Censors Gay Video Titles, Finds `Buns' Acceptable but `Studs' Too
Sleazy," ACLU Cyber-Liberties Update, 12/6/95.
40. Glasser Aff. ¶ 12; Godwin Aff. ¶ 4; Troyer Aff. ¶ 17;
Wallace Aff. ¶ 10; McCullagh Aff. ¶ 9.
41. Troyer Aff. ¶ 11.
42. Jerry Berman and Daniel Weitzner, "Abundance and User
Control: Renewing the Democratic Heart of the First Amendment in
the Age of Interactive Media," 104 Yale L.J. 1619, 1623-24
(1995).
43. Id.
44. "Any attempt to impose centralized content control in a
bureaucratic manner on this fundamentally decentralized medium is
bound to stifle the growth of the medium, squander the democratic
potential of the Internet, and may even cut the United States off
from the growing global information infrastructure." Interactive
Working Group Report to Senator Leahy, Parental Empowerment,
Child Protection, & Free Speech in Interactive Media, 7/24/95,
(Leahy Report), at 4-5; see also Berman and Weitzner, Abundance
and User Control, at 1624.
45. "Unlike centralized broadcast radio and television services,
there are no central control points through which either a single
network operator or government censors can control particular
content. ... [The] proliferation of individual speakers
stands in sharp contrast to broadcast television or even cable
television, where one may count five, ten or perhaps one hundred
speakers, each of whom controls a channel." Leahy Report, at
4-5.
46. Leahy Report, at 4-5.
47. Leahy Report, at 5-6.
48. Unlike the phone system, cost does not depend on the
distance between sender and receiver. Thus, it may cost no more
for an online user in Australia to communicate by e-mail with an
Irishman than it does for two neighbors in Beijing to exchange
messages on a computer bulletin board.
49. See EFF's Guide to the Internet, at 4.2.
50. Leahy Report, at 6.
51. Leahy Report, at 7-8.
52. Because there is no central distribution point on the
Internet, unlike large commercial online systems like AOL,
Prodigy, and Compuserve, it is difficult if not impossible for
Internet Service Providers to monitor all of the content that
passes through their networks.
53. Leahy Report, at 8-10.
54. Leahy Report, at 10-11.
55. "On-Line Firms Team Up on Technology," Washington Post,
September 9, 1995.
56. Glasser Aff. ¶ 21; Perkins Bowen Aff. ¶ 12; Sister Mary
Elizabeth Aff. ¶ 12; Donaldson Aff. ¶ 20.
57. Glasser Aff. ¶ 16; Wallace Aff. ¶ 8; Rotenberg Aff. ¶ 9;
Perkins Bowen Aff. ¶ 14; Kuromiya Aff. ¶ 16.
58. Troyer Aff. ¶ 16; Meeks Aff. ¶ 8; McCullagh Aff. ¶ 8;
Templeton Aff. ¶ 8; Wallace Aff. ¶ 9.
59. The Federal Communications Commission has defined the term
"indecency" for purposes of regulating broadcast radio and
television as material that "depicts or describes, in terms
patently offensive as measured by contemporary community
standards for the broadcast medium, sexual or excretory
activities or organs." See Action for Children's Television v.
FCC, 58 F.3d 654, 657 (D.C.Cir. 1995) (en banc), cert. denied,
133 L.Ed.2d 658 (1996). The FCC has issued the same
definition of "indecency" in other statutes or regulations
designed to suppress speech on sexual subjects. See, e.g.,
Alliance for Community Media v. FCC, 56 F.3d 105, 124
n.4 (D.C.Cir. 1995) (en banc), cert. granted, 64 U.S.L.W. 3347
(U.S. Nov. 13, 1995) (No. 95-124) ("indecency" on leased access
and public, educational, and governmental access cable channels);
Dial Information Services v. Thornburgh, 938 F.2d 1535, 1540
(2d Cir. 1991), cert. denied, 112 S.Ct. 966 (1992) ("indecent"
telephone communications). Although § 223(d) tracks this FCC
definition, the "indecency" provision in § 223(a)(1)(B) does not
incorporate the "patently offensive" language and thus differs
from the cases considering "indecency" in broadcast, cable
television and telephones because the provision contains no
further definition of "indecent" and the FCC is given no
jurisdiction under the statute to issue any such definition. In
fact, other provisions of the Act expressly deny jurisdiction to
the Federal Communications Commission. See § 509, adding 47
U.S.C. § 230(d); § 502, adding 47 U.S.C. § 223(e)(6).
60. See 141 Cong. Rec. S8130-31, June 12, 1995.
61. Indeed, the Justice Department has already prosecuted online
obscenity and child pornography in several cases. See, e.g., U.S.
v. Thomas, 1996 U.S. App. LEXIS 1069 (6th Cir. Jan. 29, 1996)
(Nos. 94-6648/94-6649); "Use of Computer Network For Child Sex
Sets Off Raids," The New York Times, 9/14/95.
62. The Supreme Court has not ruled precisely on what is
required in terms of a showing of harm from exposure to
"indecency" or "patently offensive" material. The harm to
children from exposure to indecency was not at issue in Pacifica,
which narrowly upheld the FCC's time channeling of vulgar words
in the broadcast medium.
63. The Supreme Court has repeatedly made it clear that the
government "may not prohibit the expression of an idea simply
because society finds the idea itself offensive or disagreeable."
Texas v. Johnson, 491 U.S. 397, 414 (1989). See also Hustler
Magazine v. Falwell, 485 U.S. 46, 55-56 (1988); Bolger v. Youngs
Drug Prods. Corp., 463 U.S. 60, 71 (1983) ("offensiveness" not
a justification for suppressing speech); Cohen v. California, 403
U.S. 15, 25 (1971) (government cannot "cleanse public debate" of
certain "offensive" words).
64. Troyer Aff.; Sister Mary Elizabeth Aff.; Kuromiya Aff.;
Mariner Aff.; Glasser Aff.; Rotenberg Aff.; Godwin Aff.
65. See 142 Cong. Rec. H1128-29, Jan. 31, 1996.
66. After Sable, Congress passed more narrowly tailored
legislation designed to bar minors' access to "dial-a-porn"
services. This statute was upheld in Dial Information Services
v. Thornburgh, 938 F.2d 1535 (2d Cir. 1991), and in Information
Providers v. FCC, 928 F.2d 866 (9th Cir. 1991). Those decisions
are irrelevant to the present case, however, because there
Congress had made findings that the blocking mechanisms mandated
by the statute were the least restrictive means of achieving the
government's compelling goal. The present statute, by contrast,
is supported by no such legislative findings and will
effectively deny adults as well as minors access to valuable
speech.
67. See discussion at section C4 supra.
68. In addition, the adult access restrictions in Fabulous
Assoc., Inc. applied only to material deemed "harmful to minors"
under the Ginsberg test, a much narrower restriction than the
Act's ban of "indecent" and "patently offensive" speech. 896
F.2d at 896.
69. One commentator has noted that "First Amendment lawyers
should be wary of applying current legal metaphors to the newer
electronic communication spaces without substantially immersing
themselves in the experience of using such cyberspaces.
Legislators should be equally cautious." Anne Wells Branscomb,
Anonymity, Autonomy, and Accountability: Challenges to the First
Amendment in Cyberspaces, 104 Yale L.J. 1639, 1676 (1995). See
also Thomas G. Krattenmaker and L.A. Powe, Jr., Converging First
Amendment Principles for Converging Communications Media, 104
Yale L.J. 1719 (1995) ("How can one reconcile the fact of
technology and media convergence with the legal presumption of
distinct treatments?"); Donald Lively, The Information
Superhighway: A First Amendment Road Map, 35 B.C. L. Rev. 1067
(1994) (arguing for the abandonment of different levels of First
Amendment protection for different media).
70. See also Hynes v. Mayor & Council of Oradell, 425 U.S. 610,
620 (1976) (more exacting vagueness scrutiny required where First
Amendment rights are implicated); Smith v. Goguen, 415 U.S. 566,
573 (1974) (where statute "is capable of reaching expression
sheltered by the First Amendment, the [vagueness] doctrine
demands a greater degree of specificity than in other contexts");
Cramp v. Board of Public Instruction, 368 U.S. 278, 287 (1961)
(same).
71. See, e.g., Smith v. Goguen, 415 U.S. at 573 (law barring
"contemptuous" treatment of the flag is unconstitutionally
vague); Coates v. Cincinnati, 402 U.S. 611, 614 (1971) (law
prohibiting "annoying" conduct is unconstitutionally vague);
Southeastern Promotions Ltd. v. Conrad, 420 U.S. 546, 552-53
(1975) ("clean and healthful and culturally uplifting" standard
impermissibly gives city unbridled discretion); Cinevision Corp.
v. City of Burbank, 745 F.2d 560, 573 (9th Cir. 1984), cert.
denied, 471 U.S. 1054 (1985) ("family entertainment" standard not
sufficient to limit "arbitrary and capricious action"); Finley v.
National Endowment for the Arts, 795 F. Supp. 1457, 1471-72
(C.D.Cal. 1992), app. pending ("general standards of decency" is
unconstitutionally vague); Gay Men's Health Crisis v. Sullivan,
792 F. Supp. 278, 295 n.35 (S.D.N.Y. 1992) ("offensiveness" is
unconstitutionally vague).
72. With absolutely no other guidance, the use of the term
"indecency" in 15223(a)(1)(B) encompasses material that does not
even relate to sexuality or contain "vulgar" language.
73. See also United States v. 12 200-Ft. Reels of Film, 413
U.S. 123, 130 n.7 (1973); Manual Enterprises, Inc. v. Day, 370
U.S. 478, 482-83 (1962); Carlin III, 837 F.2d at 558-60; Osborne
v. Ohio, 495 U.S. 103, 132 n.7 (1990) (Brennan, J., dissenting)
(reviewing cases in which "courts found it necessary to equate
`lewd' [and `indecent'] with `obscene' in order to avoid
overbreadth and vagueness problems.").
74. Under Miller, even socially worthless hardcore pornography
is not necessarily unprotected; it must also appeal to a shameful
or morbid, rather than a healthy, interest in sex, and be
patently offensive according to the relevant local community.
Id. at 24; see also Brockett v. Spokane Arcades, 472 U.S. 491
(1985).
75. As the Supreme Court noted in Miller v. California, 413 U.S.
at 30, "[O]ur Nation is simply too big and too diverse for this
Court to reasonably expect that such standards [as `patently
offensive'] could be articulated for all 50 States in a single
formulation, even assuming the prerequisite consensus exists."
No court could possibly determine what is "patently offensive"
according to the "community standards" of cyberspace, which is
not a national, but a global communications medium. Such
decisions plainly should not be left to myriad local prosecutors,
judges, and juries.
76. Glasser Aff. ¶ 9. The company has not yet decided what
action to take.
77. Many other examples of private application of standards like
"indecency" have received attention from the mainstream and
online press. America Online recently banned a poetry
discussion group for using "vulgar" and "offensive" speech. See
Rotenberg Aff. ¶ 8.
78. The original FCC definition of "indecency" pursuant to the
broadcast prohibition in 18 U.S.C. § 1464 referred only to
"patently offensive" language; it was later expanded to include
descriptions or depictions. See Action for Children's Television
v. FCC, 852 F.2d 1332, 1336 (D.C. Cir. 1988); John Crigler &
William Byrnes, Decency Redux: The Curious History of the New
FCC Broadcast Indecency Policy, 38 Cath.U.L.Rev. 329 (1989).
79. Pacifica, 438 U.S. at 742-43.
80. See 438 U.S. at 750; Sable Communications, 492 U.S. at 128;
Bolger v. Youngs, 463 U.S. at 74.
81. In the context of vagueness challenges to "indecency"
restrictions in other media, the Second, Ninth, and D.C. Circuits
have held that Pacifica determined that "indecency" was not
unconstitutionally vague. Dial Information v. Thornburgh, 938
F.2d at 1541 (2d Cir. 1991) (dismissing vagueness challenge to
FCC's "indecency" regulations for dial-a-porn); Information
Providers v. FCC, 928 F.2d 866 (9th Cir. 1991) (same); Alliance
for Community Media v. FCC, 56 F.3d 105 (D.C. Cir. 1995)
(rejecting vagueness challenge to "indecency" restrictions on
cable, but welcoming" correction" from "Higher Authority"). But
see Finley v. National Endowment for the Arts, 795 F. Supp. 1457,
1471 (C.D. Cal. 1992), appeal pending (finding statute imposing
"general standards of decency" to be unconstitutionally vague).
82. Glasser Aff. ¶¶ 4, 13; Troyer Aff. ¶ 4; Godwin Aff. ¶ 3.
83. See, e.g., Board of Education v. Pico, 457 U.S. 853, 864
(1982); Erznoznik v. City of Jacksonville, 422 U.S. at 213-14;
Tinker v. Des Moines Ind. School Dist., 393 U.S. 503 (1969);
West Virginia Bd. of Educ. v. Barnette, 319 U.S. 624 (1943).
84. See Glasser Aff. ¶ 18; Mariner Aff. ¶ 13; Meeks Aff. ¶ 5;
Donaldson Aff. ¶ 14; Godwin Aff. ¶ 11; Nell Warren Aff. ¶ 9;
Wallace Aff. ¶ 7.
85. Ginsberg pre-dated Miller. While the Supreme Court has not
revisited the so-called "obscene as to minors" or "harmful to
minors" test since Ginsburg, circuit courts dealing with "harmful
to minors" laws have concluded that the Ginsburg standard
should simply be adjusted to take account of the most recent
Supreme Court definition of obscenity in Miller. See American
Booksellers Assoc. v. Webb, 919 F.2d 1493, 1496 (11th Cir.
1990); American Booksellers Assoc. v. Virginia, 882 F.2d 125,
127 (4th Cir. 1989); Upper Midwest Booksellers Assoc. v.
Minneapolis, 780 F.2d 1389, 1391 (8th Cir. 1985); M.S. News Co.
v. Casado, 721 F.2d 1281, 1286 (10th Cir. 1983).
86. See ABA v. Webb, 919 F.2d 1493; ABA v. Virginia, 882 F.2d
125; Upper Midwest Booksellers Assoc. v. Minneapolis, 780 F.2d
1389; M.S. News Co. v. Casado, 721 F.2d 1281.
87. Another very limited exception to minors' First Amendment
rights has been the "indecency" rules for radio and television
broadcast. See Pacifica, 438 U.S. 726 (1978). For the reasons
articulated in section B.1. supra, Pacifica cannot be
expanded to ban all "patently offensive" communications in the
completely different medium of cyberspace.
88. Glasser Aff. ¶ 15; Godwin Aff. ¶ 9; Rotenberg Aff. ¶ 8.
89. Sister Mary Elizabeth Aff. ¶ 7; Kuromiya Aff. ¶ 17; Troyer
Aff. ¶ 8.
90. Kuromiya Aff. ¶ 21; Sears Aff. ¶ 7; Krause Aff. ¶ 21; Sobel
Aff. ¶ 10.
91. Chatelle Aff. ¶ 7; Templeton Aff. ¶ 6; Troyer Aff. ¶ 12;
Sister Mary Elizabeth Aff. ¶ 12; McCullagh Aff. ¶ 7; Meeks Aff.
¶ 7; Donaldson Aff. ¶ 20; Casti Aff. ¶ 11; Mariner Aff. ¶ 9; Nell
Warren Aff. ¶ 8.
92. Sears Aff. ¶ 6.
93. JEA Aff. ¶¶ 6, 14.
94. Wallace Aff. ¶ 10; Templeton Aff. ¶ 10; Krause Aff. ¶ 24;
Troyer Aff. ¶ 17; Donaldson Aff. ¶ 8; Chatelle Aff. ¶ 7;
McCullagh Aff. ¶ 9; Casti Aff. ¶ 12.
95. Ginsberg-type "harmful-to-minors" display statutes that have
been upheld under narrowing constructions are readily
distinguishable from the statute at issue in this case. First,
those statutes prohibited the display only of material that fit
the three-part Ginsberg/Miller "obscene as to minors" standard,
as opposed to the much more broad and vague categories of
"indecency" and "patent offensiveness." Second, courts have
careful to interpret those statutes to require compliance that
does not impermissibly burden adult access to constitutionally
protected materials. ABA v. Virginia, 882 F.2d at 127
(booksellers only need take "reasonable steps" to prevent
juveniles from browsing material deemed "harmful to minors"); ABA
v. Webb, 919 F.2d at 1507 (booksellers need only put material
deemed "harmful to minors" be hind blinder racks, and need not
physically segregate it from other materials, in order to comply
with the statute). In contrast, because of the unique nature of
the online medium, any strategy by which an online information
user or provider might comply with the Act would unduly burden
both minors' and adults' access to constitutionally protected
speech.
96. Glasser Aff. ¶ 3; Mariner Aff. ¶¶ 5-6; Rotenberg Aff. ¶ 3;
Godwin Aff. ¶ 4.
97. While the Supreme Court has held that speakers in the
broadcast medium have less First Amendment protection than
speakers in the print medium, see Pacifica, 438 U.S. 726 (1978),
the rationale for the distinction is absent in cyberspace. See
discussion at section B.1. supra.
98. Troyer Aff. ¶ 9; Kuromiya Aff. ¶ 21; Nell Warren Aff. ¶ 9.
99. Sister Mary Elizabeth Aff.; Casti Aff.; Troyer Aff.
100. Nell Warren Aff. ¶ 7; Sister Mary Elizabeth Aff. ¶ 7;
Kuromiya Aff. ¶ 21; Troyer Aff. ¶ 9.
101. Glasser Aff. ¶¶ 23, 25; Johnson Aff. ¶¶ 3, 4, 7, 11.
102. Glasser Aff. ¶¶ 24, 26; Johnson Aff. ¶¶ 8, 11.
103. Glasser Aff. ¶¶ 24, 26; Johnson Aff. ¶¶ 8, 11.