Originally published on November 21, 2002

The author wishes to thank Ms. Rachel Zeehandelaar (University of Pennsylvania, B.A. expected 2005) for her valuable research assistance in preparation of this article.

An updated article for the Libel Defense Resource Center

Introduction

The "right to privacy" has been around since the early part of the last century. It has evolved to apply – more or less – to a disparate array of social and economic issues, ranging from the desire to avoid publicity (Time v. Hill) to abortion (Roe v. Wade). The recent explosive growth of Internet use has created its own set of privacy concerns arising from this new medium. By mid-2001, the Federal government already had these major privacy laws on the books:

  • Fair Credit Reporting Act
  • Privacy Act
  • Family Educational Rights and Privacy Act
  • Right to Financial Privacy Act
  • Privacy Protection Act
  • Electronic Communications Privacy Act
  • Video Privacy Protection Act
  • Employee Polygraph Protection Act
  • Telephone Consumer Protection Act
  • Health Insurance Portability and Accountability Act
  • Driver’s Privacy Protection Act
  • Identity Theft and Assumption Deterrence Act
  • Gramm-Leach-Bliley Act (Title V)
  • Children’s Online Privacy Protection Act

This laundry list of legislation was in place before the terrorist attacks of September 11, 2001. Until then, debate had centered around what new measures could protect individuals’ private information and communications while online. Now, public opinion regarding the primacy of privacy is dramatically different. The principal developments in the law of online privacy in the past twelve months have involved the government’s response to the reality and ongoing threat of terrorism, and the American public’s altered attitudes about the proper "balance" between privacy and self-preservation.

How Privacy and Cyberspace Mix

The law of online privacy has focused primarily on users' unhappy experiences with web pages, and how those pages collect and handle information about those who visit them. To understand how these disputes arise, it is important to understand that a web page can, in fact, learn quite a lot about those who click through to it, browse it and interact with it. This can occur even without the visitor's knowledge or consent. For an explanation of how this happens, and the technology behind it, visit the Consumer Project on Technology's privacy library online: http://www.cptech.org/privacy. There is also an informative collection of links and resources on this subject at the Electronic Privacy Information Center's site: http://www.epic.org/.
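
By way of illustration only, the short Python sketch below (hypothetical, and not drawn from the resources above) runs a bare-bones web server and logs what every ordinary page request reveals about the visitor: the visitor's network address, the browser in use, the page that referred the visitor, and any cookies a prior visit left behind. None of it requires anything from the visitor beyond loading the page.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class LoggingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Every ordinary page request volunteers this much information.
            visitor_ip = self.client_address[0]
            browser = self.headers.get("User-Agent", "unknown browser")
            came_from = self.headers.get("Referer", "no referring page")
            cookies = self.headers.get("Cookie", "no cookies")
            print(f"{visitor_ip} | {browser} | from: {came_from} | cookies: {cookies}")

            # Serve a perfectly innocuous page in return.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Welcome, visitor.</body></html>")

    if __name__ == "__main__":
        # Hypothetical port; any browser pointed at it is logged as above.
        HTTPServer(("", 8080), LoggingHandler).serve_forever()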

In response to users’ distaste for such surreptitious intelligence-gathering, many web sites now post "Privacy Policies" that are usually accessible on the front page of the site. Creating a Privacy Policy is easy, and there are web resources to help: http://cs3-hq.oecd.org/scripts/pwv3/pwhome.htm takes you to the Organization for Economic Co-operation and Development’s Privacy Policy Generator, and http://www.siia.net/govt/toolkit.asp has the Software & Information Industry Association’s Privacy Toolkit.

Using Privacy Policies, honorable web pages disclose to their visitors when and how they collect private information, as well as what is done with the information thereafter. The concept is simple: say what you do, and do what you say. The execution of that concept, however, has been uneven. Most commonly, a web page asks for a user's name, address and e-mail information in exchange for providing something to the user. More than one naïve user has signed up for a "free" goodie by providing his e-mail address to a web page, only to find his e-mailbox flooded with unsolicited offers, come-ons and outright cons commonly known as "spam". How? The web page has shared his e-mail address with a direct-email marketer. As a way of regaining users' trust, a number of pages go out of their way to promise their visitors that private information will never, ever be disclosed to anyone under any circumstances.

Cyber-Privacy in the Courts

Not all web pages proved worthy of such trust, and still others faced unanticipated difficulty in keeping their confidentiality promises. Today's debate about online privacy is framed by some of the best-known betrayals.

One of the earliest such incidents involved GeoCities, a web site devoted to creating online "communities." In signing up for the privilege of participating, users were asked for a great deal of information about themselves, with the express assurance that it would not be used beyond the GeoCities space. Sadly, it turned out that GeoCities actually did use the information, but in a well-publicized Consent Decree with the Federal Trade Commission it promised not to do it again. In the Matter of GeoCities, Docket No. C-3849 (Feb. 12, 1999).

Others misbehaved as well. Liberty Financial operated the Young Investors web site, devoted to adolescents and teens. The site included a survey that gathered private information (social security numbers and telephone numbers, for instance), promised prizes for completing it and assured users that "all of your answers will be totally anonymous." In fact, the FTC found that Liberty did not keep the information anonymous and did not even award the prizes it had promised. Liberty entered into a Consent Decree in 1999, promising to (a) stop making false claims about anonymity; (b) post a Privacy Policy; and (c) obtain "verifiable parental consent" before gathering private information from children under 13 years old. In the Matter of Liberty Financial, Docket No. 01-cv-939 (1999).

In the Liberty action, the FTC was foreshadowing the requirements of a law that went into effect the following year – the Children's Online Privacy Protection Act ("COPPA"), 15 U.S.C. § 6501 et seq. Under COPPA, a web site that is principally directed towards children under 13 years old must abide by some very strict rules before gathering personal information from users. COPPA requires a much more detailed Privacy Policy, and goes further to require direct notice to the parents and that the web site operator obtain "verifiable parental consent," as was done in the Consent Decree with Liberty Financial. According to the FTC, to be "verifiable" the site must have the parent send a signed form by mail or fax, provide a valid credit card number, call "a toll-free telephone number staffed by trained personnel" or send an e-mail that contains a digital signature.

In an effort to assist web site operators trying to comply with these rules, the Act helpfully established a "safe harbor" provision. If a web site follows a self-regulation program, approved in advance by the FTC, the web site operator is protected in any FTC enforcement proceeding. The Children's Advertising Review Unit of the Council of Better Business Bureaus (CARU), the Entertainment Software Rating Board and TRUSTe have all had programs approved for such a safe harbor status.

A very different kind of dilemma faced Toysmart.com, a web-only toy retailer that had attracted many customers and, under an iron-clad privacy policy promising never to share the information with anyone, collected personal information and built an extensive customer list. As with many dot-com businesses, Toysmart encountered financial difficulties, filed for bankruptcy, and its creditors sought to raise cash from the company's assets. One of those assets was the customer list, which, but for pre-bankruptcy management's promise, would have been very valuable to a marketer seeking direct access to toy-buyers. The FTC stepped in to protect the customers who were about to be the victims of a broken privacy promise, filed a lawsuit and quickly negotiated a settlement agreement. Under the terms of the settlement, the bankrupt entity could sell the list only along with the remainder of the business, and only to a "Qualified Buyer" – an entity that was in a related market and that expressly agreed to be Toysmart's successor-in-interest as to the customer information. Moreover, the Qualified Buyer had to agree to abide by the terms of the Toysmart privacy statement. FTC v. Toysmart.com, Civil Action No. 00-11341-RGS (D. Mass. 2000) and In re Toysmart.com, Debtor, Case No. 00-13995-CJK (D. Mass. Bkcy. 2000).

Privacy litigation has also involved technology that is invisible to the user. One of the leading web advertising companies, Doubleclick, places banner ads onto a user's screen when the user browses a particular web site that has sold advertising space. While the user sees what appears to be a single screen, in reality that screen is composed of elements that come from a variety of origins. Doubleclick's service is to place the right ad at the right spot, but it does more than that – it keeps track of what ads a user has already been presented with, and the user's responses to those ads, by placing a software "cookie" on the user's computer. It thereby builds a database of user profiles and uses them to sell targeted ads. Doubleclick was sued by a purported class of individuals who claimed that Doubleclick had invaded their privacy and violated both the Electronic Communications Privacy Act ("ECPA"), 18 U.S.C. § 2701 et seq., and the Computer Fraud and Abuse Act ("CFAA"), 18 U.S.C. § 1030 et seq.
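
To make the mechanics concrete, here is a deliberately simplified Python sketch (a hypothetical illustration, not Doubleclick's actual system) of how an ad network can use a single browser cookie to tie together one user's visits to unrelated sites and accumulate a profile.

    import uuid
    from collections import defaultdict

    # The ad network's profile database: cookie identifier -> list of sightings.
    profiles = defaultdict(list)

    def serve_ad(request_cookies, site, ad):
        """Return ad markup plus any cookie the ad server asks the browser to keep."""
        user_id = request_cookies.get("ad_id")
        new_cookie = {}
        if user_id is None:
            # First sighting of this browser: tag it with a unique identifier.
            user_id = str(uuid.uuid4())
            new_cookie = {"ad_id": user_id}
        # Record which ad was shown on which site under that identifier.
        profiles[user_id].append((site, ad))
        return f"<img src='/ads/{ad}.gif'>", new_cookie

    # The same browser visits two unrelated sites carrying the network's ads;
    # the cookie it sends back ties the two visits into a single profile.
    _, cookie = serve_ad({}, "news-site.example", "sports-car")
    serve_ad(cookie, "recipe-site.example", "kitchen-gadget")
    print(profiles)  # one identifier, two sites, two ads

The identifier means nothing to the user who carries it; it is valuable only to the network that assigned it, which is the point of the "bar-code" analogy the court drew in the litigation discussed next.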

In a very thorough analysis, Judge Naomi Reice Buchwald granted Doubleclick's Rule 12(b)(6) Motion to Dismiss. In re Doubleclick Inc. Privacy Litigation, 2001 U.S. Dist. LEXIS 3498 (S.D.N.Y. 2001). The ECPA claims were dismissed because, the court found, the statute only protects "users," a word which it defines as "any person or entity who (A) uses an electronic communication service and (B) is duly authorized by the provider of such service to engage in such use." (p. 30, citing 18 U.S.C. § 2510(13)). Under these facts, the plaintiffs were not the "users" of internet access – the web sites that hired Doubleclick were, and they of course consented. The court noted that "in every practical sense, the cookies identification numbers are internal Doubleclick communications – both "of" and "intended for" Doubleclick…. In this sense, cookie identification numbers are much akin to computer bar-codes or identification numbers placed on "business reply cards" found in magazines. These bar-codes and identification numbers are meaningless to consumers, but are valuable to companies in compiling data on consumer responses (e.g. from which magazine did the customer get the card?)." (pp. 44-45) The court found that since the people who were online were not "users," Doubleclick did not violate the ECPA. In a memorable critique of this reasoning, Professor Paul Schwartz asked "so what are the individual consumers, chopped liver?" (http://www.nytimes.com/2001/04/06/technology/06CYBERLAW.html)

Plaintiffs fared no better with their claims under the Computer Fraud and Abuse Act. As the court correctly observed, one of the essential elements for civil recovery under the CFAA is that plaintiffs suffer "damage or loss" in excess of $5,000 (18 U.S.C. § 1030(g) and (e)(8)), consistent with congressional intent to "limit the CFAA to major crimes." (p. 75). Plaintiffs' alleged "damage or loss" included "(1) their cost in remedying their computers after Doubleclick's access and (2) the economic value of their attention (to Doubleclick's advertisements) and demographic information." (p. 78) Neither, in the court's view, was sufficient to meet the statutory threshold.1 It is clear that the court's analysis was influenced by the view that users could easily and completely protect themselves from the "cookie monster" (by changing the settings on the browser or e-mailing an opt-out request to Doubleclick). Moreover, the cookies only tracked the user's interaction with other Doubleclick content. The court found that there was no suggestion that Doubleclick had accessed any "files, programs or other information on users' hard drives." Common law privacy claims were therefore dismissed as well.

Privacy claims can also arise out of good intentions gone inadvertently awry. Pharmaceutical maker Eli Lilly established "Prozac.com," a very helpful web page for people struggling with depression. The page included information about the disease and the medicine Lilly makes that can be used to treat it. Among the services Lilly offered was a "Medi-Messenger," essentially an individualized e-mail reminder that the patient should take Prozac on schedule and refill the prescription when necessary (or simply to "remind yourself to have a great day!" http://www.ftc.gov/os/2002/01/eliappadpdf.pdf). Naturally, in order to implement this service, Lilly needed the patient's e-mail address, but the web page's Privacy Policy assured users that Lilly kept personally identifiable and sensitive consumer information in confidence.

The Lilly program ran fine until June 27, 2001, when the "Medi-Messenger" for the day was sent. Unfortunately, that day's message placed the recipients' e-mail addresses in the "to" field. As a result, each person who received the e-mail could see the e-mail addresses of hundreds of others who had also been sent the same e-mail. (By contrast, if you use Outlook for your e-mail, placing the addresses in the "bcc" field makes all addresses invisible to the recipients.) Lilly's "Medi-Messenger" had – inadvertently – disclosed a list of Prozac users, contrary to Lilly's stated Privacy Policy.
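
The safer pattern is straightforward to automate. The Python sketch below (hypothetical addresses and mail server) sends one announcement to many recipients while keeping every address out of the visible headers, which is precisely what the faulty mailing failed to do.

    import smtplib
    from email.message import EmailMessage

    # Hypothetical recipients and mail server, for illustration only.
    recipients = ["patient1@example.com", "patient2@example.com"]

    msg = EmailMessage()
    msg["From"] = "reminders@example.com"
    msg["To"] = "Undisclosed recipients <reminders@example.com>"
    msg["Subject"] = "Your daily reminder"
    msg.set_content("Remember to have a great day!")
    # No recipient addresses appear anywhere in the message headers.

    with smtplib.SMTP("mail.example.com") as server:
        # The actual recipients travel only in the SMTP envelope, so none
        # of them can see who else received the message.
        server.send_message(msg, to_addrs=recipients)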

The Federal Trade Commission investigated, and soon reached an agreement resolving the matter with Lilly, including a consent order. The FTC secured Lilly's promise to beef up its privacy practices with appropriate systems, supervision, training and annual reports regarding compliance with the new program and its effectiveness. In the Matter of Eli Lilly and Company, FTC Docket No. C-4047 (2002), available online at http://www.ftc.gov/os/2002/01/lillyagree.pdf. Lilly is currently in the first year of this program, and privacy advocates are eagerly awaiting its first report in the Summer of 2003.

Current Technology and New Laws

The post-9/11 anti-terrorism law, known as the USA PATRIOT Act, made important changes to the CFAA and broadened its scope enormously. Section 814 expands the reach of the CFAA to include computers outside of the United States and re-defines "loss" to include "any impairment to the integrity or availability of data, a program, a system or information" – effectively, the definition used by the Shurgard court. As a result, plaintiffs with balky computers will have an additional argument to assert in litigation against the operator of a web page that they claim is the cause of their problem.

Fortunately for legitimate media, the USA PATRIOT Act also provides new protection for the technology industry. While the CFAA now reaches far more of the world's computers than ever, and substantially lowers the threshold of what constitutes a "loss," §814 also provides that a plaintiff cannot invoke the CFAA to sue hardware or software (or firmware) companies for negligent design or manufacture.

This past year has also seen the dramatic increase in the use of technology that obtains information directly from a user's computer. "Spyware" is the common name for an application that a user downloads without knowing it – frequently, it is packaged with something the user wants to download, such as a game – and proceeds to send information about the computer system it now inhabits back to whoever programmed it. You can think of Spyware as a cross between a stool pigeon and a homing pigeon. Just as Darwin would have predicted, this new form of computer activity has led to the development of defensive measures that users can employ to protect themselves. http://www.starlightcreate.com/adaware/lsaaw.htm

More pervasive, and potentially much more insidious, has been the dramatic growth of "peer-to-peer" systems. So-called "P2P" is the ultimate form of computer networking and file-sharing among PCs, literally allowing one user to access the hard drive of another user's PC. The most quickly-adopted computer application in history – Napster – was nothing more than P2P. As employed by Napster, this technology allowed music files ("MP3"s) to be shared and copied worldwide on a huge scale until that service was shut down as a result of litigation by music publishers. For a recent opinion in the Napster litigation from the Court of Appeals for the Ninth Circuit, see http://news.findlaw.com/hdocs/docs/napster/napster032502opn.pdf. Since then, however, many P2P services have emerged, and while music publishers have filed suit against them, the fate of those suits is far less certain due to critical differences in how those services operate. Regardless of who wins the battle over copyright income, the technology itself presents a serious privacy issue for uninformed users. One popular system, Kazaa, was the subject of a recent study that confirmed how easy it is for unscrupulous users to extract private information from others' PCs: www.hpl.hp.com/shl/papers/kazaa/KazaaUsability.pdf. And Kazaa itself appears to have been loaded with spyware. Although Kazaa's spyware is currently inactive, the software is said to be ready to be activated by the transmission of the appropriate code. http://news.com.com/2008-1082-875620.html.
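
The exposure the Kazaa study documents is largely a matter of over-sharing: users who believe they are offering strangers a single folder of music may in fact be offering far more. The Python sketch below (hypothetical paths, with a plain web server standing in for a P2P client) shows how small the gap can be.

    from functools import partial
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
    from pathlib import Path

    # Hypothetical paths: the folder the user means to share versus what a
    # careless configuration actually exposes.
    intended_share = Path.home() / "shared_music"
    actual_share = Path.home()  # one level too high: documents, mail, tax files

    print(f"User believes only {intended_share} is visible to strangers.")
    print(f"In fact, everything under {actual_share} is now browsable.")

    handler = partial(SimpleHTTPRequestHandler, directory=str(actual_share))
    ThreadingHTTPServer(("", 8081), handler).serve_forever()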

In an increasingly futile effort to keep up with technology, the U.S. Congress has dozens of new privacy bills pending. Which ones (if any) will become law is guesswork, but one that has attracted a lot of attention is the proposed "Online Personal Privacy Act," S. 2201, sponsored by Senator Hollings. According to the Senator, the "bill mirrors the European Union privacy directive," sets a "uniform, federal standard for the protection of online personal information and codif[ies] the five core privacy protection principles outlined by the Federal Trade Commission: consent, notice, access, security and enforcement."

http://hollings.senate.gov/~hollings/press/2002517A41.html. The bill would give consumers the right to choose who may send them e-mail solicitations and would preempt State privacy standards, but, in a bizarre twist, it too has been attacked as anti-privacy and pro-spyware: http://www.salon.com/tech/feature/2002/04/26/hollings_spyware/?x.

Is Privacy a Trojan Horse for Violations of the First Amendment?

Paradoxically, the demand for privacy may be one of the most clear and present dangers to the First Amendment since the Nixon administration. What may be an asserted privacy right to one person could, simultaneously, be a new legal weapon to restrict the free flow of ideas and comment. This tension was explored in detail by Professor Eugene Volokh, in a law review article entitled Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People From Speaking About You, 52 Stanford L. Rev. 1049 (2000):

The difficulty is that the right to information privacy--my right to control your communication of personally identifiable information about me--is a right to have the government stop you from speaking about me. We already have a code of "fair information practices," and it is the First Amendment, which generally bars the government from controlling the communication of information (either by direct regulation or through the authorization of private lawsuits), whether the communication is "fair" or not. While privacy protection secured by contract is constitutionally sound, broader information privacy rules are not easily defensible under existing free speech law. (footnotes omitted)

Similarly, scholars are now exploring the societal costs imposed in the name of "privacy." In their article Putting People First: Consumer Benefits of Information-Sharing, Professors Fred H. Cate and Michael E. Staten (published by the National Retail Federation as part of its Protecting Privacy in the New Millennium series, available online at http://www.privacyalliance.org/resources/consumerbenies.pdf) posit that a reasonable balance between privacy interests and individual preferences is the best approach. "Information-sharing plays a significant role in reducing the prices that consumers pay for goods and services and in expanding the range and affordability of methods of paying for them…. Widespread information-sharing provides consumers with unprecedented convenience, and greatly enhances the speed with which decisions can be made and services provided….we tend to take the information infrastructure for granted, until we are faced with the daunting prospect of learning to live without the many benefits that flow from it."

And history teaches that the cause of privacy can quickly be redirected into repression. Ominously, that lesson is being learned again under the banner of the European Union's 1995 Data Protection Directive, which declared privacy to be "a fundamental human right." As reported by Bruce Johnson in The Battle over Internet Privacy and the First Amendment, The Computer & Internet Lawyer, Volume 18, No. 4 (April 2001), the Spanish Ministry of Justice shut down the web site of the Association Against Torture in March 2000, on the grounds that it named the government agents who had been accused of torture or brutality. Spain had passed a broad privacy law, making it a crime to disclose information about someone without that person's consent. What public figure could resist using such leverage? While the First Amendment would undoubtedly prevent enforcement of such a law in the United States, it illustrates the potential for mischief that can be accomplished under the guise of protecting privacy.

Conclusion

The familiar parade of horribles about the need to protect privacy invariably includes identity theft, credit card fraud and receiving unwanted advertisements. Yet the first two are already illegal under Federal (and many states') laws, and the third is an annoyance at worst. Proposed legislation will never be an adequate or timely deterrent to privacy problems, since bad actors (and malicious code-writers) are global and they move fast. There are ample laws protecting privacy on the books, and increasingly useful technologies that users can employ to protect themselves. Moreover, new laws can have unintended consequences that would dramatically limit the freedom of expression. Privacy is valuable, but not more valuable than the First Amendment. The Internet has made everyone a publisher; the right of privacy should not be permitted to make everyone a censor.

Endnote

1. Cf. Shurgard Storage Centers, Inc. v. Safeguard Self Storage, Inc., 119 F. Supp. 2d 1121 (W.D. Wash. 2000) (unauthorized access to private information constitutes "loss" because integrity of data is diminished).

The content of this article does not constitute legal advice and should not be relied on in that way. Specific advice should be sought about your specific circumstances.