Friday, October 05, 2012

High Court to Rule on Exception to Driver's Privacy Law

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

The U.S. Supreme Court on September 25 agreed to hear a case that could clarify what uses of personal information obtained from driver's license and vehicle registration databases are "permissible" under the federal Driver's Privacy Protection Act ("DPPA"), 18 U.S.C. §§2721-2725. The DPPA regulates the disclosure of personal information contained in the records of state motor vehicle departments. The DPPA permits the disclosure of protected personal information for several "permissible uses" listed in §2721(b).

The Court granted a petition for certiorari filed by individuals seeking review of a decision of the U.S. Court of Appeals in Richmond (CCH Privacy Law in Marketing ¶60,751). The appellate court held that four attorneys who obtained motor vehicle records of South Carolina vehicle buyers in order to engage in mass solicitation without consent (a purpose prohibited by the DPPA) could not, as a matter of law, be liable for violating the DPPA. According to the appellate court, the attorneys made a permissible use of the buyers' protected personal information, specifically, use in connection with litigation, including investigation in anticipation of litigation, and the solicitation was conducted in the course of that permissible use.

The attorneys had sent several FOIA requests to the South Carolina Department of Motor Vehicles seeking information regarding individuals who purchased automobiles during specific periods of time, including the name, address, and telephone number of the buyer; the dealership where the automobile was purchased; the type of vehicle purchased; and the date of the purchase. The attorneys then mailed a letter to the individuals whose information was obtained, offering a free consultation and inviting the individuals to hire the attorneys to represent them in a lawsuit against certain dealerships. The letter included the label "ADVERTISING MATERIAL."

The DPPA provides that a state DMV may disclose personal information for use in connection with an investigation in anticipation of litigation. It was a matter of settled state law and practice that solicitation was an accepted, expected, and inextricably intertwined element of conduct satisfying the litigation exception under the DPPA, the Fourth Circuit said. Accordingly, such solicitation was not actionable by the buyers. Dismissal of the buyers' DPPA claims by the federal district court in Greenville, South Carolina, was affirmed.

The petition is Maracich v. Spears, 12-25.

Thursday, July 05, 2012

Apple’s Collection of iPhone Data Could Violate California Law

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Users of mobile applications (“apps”) on Apple’s “iOS” devices (iPhone, iPad, iPod Touch, etc., or “iDevices”) could go forward with claims under California’s Consumers Legal Remedies Act and Unfair Competition Law against Apple for violating their privacy rights by unlawfully allowing third-party apps that run on the devices to collect and make use of personal information, for commercial purposes and without users’ knowledge or consent, the federal district court in San Jose, California, has ruled.

The court dismissed the users’ claims against Apple and mobile app developers for violations of the Stored Communications Act, the Wiretap Act, the Computer Fraud and Abuse Act, and California’s constitutional right to privacy.

Two Putative Classes

The users’ amended consolidated complaint asserted claims with respect to two putative classes of individuals. The first class, referred to as “the iDevice Class,” contended that Apple-approved apps created by third-party companies (Admob, Inc., Flurry, Inc., AdMarval, Inc., Google, Inc., and Medialets, Inc., collectively, “Mobile Industry Defendants”) unlawfully collected information about the users, including their addresses and current whereabouts, gender, age, zip code, time zone, and information about which functions the users performed on the app. They alleged that Apple violated its express privacy policy by allowing the Mobile Industry Defendants to design apps with the capability to track and collect data about their app use or other personal information.

The second class, referred to as the “Geolocation Class,” consisted of iDevice purchasers who alleged that they “unwittingly, and without notice or consent transmitted location data to Apple servers.” They alleged that, starting in July 2010, Apple began intentionally collecting data on their precise geographic location and storing that information on the iDevice in order to develop a database about the geographic location of cellular towers and wireless networks. They asserted that Apple continued collecting geolocation information about them even after they switched off the location services settings on their iDevices, despite the fact that Apple had represented that they could prevent the collection of such data in that way.

Article III Standing

Both the iDevice Class and the Geolocation Class users alleged sufficient injury to have standing to sue under Article III of the U.S. Constitution, the court decided. In their initial complaint, the users had relied on a theory that collection of personal information itself created a particularized issue for purposes of standing, which the court rejected (CCH Privacy Law in Marketing ¶60,676).

In their amended complaint, the users’ allegations had been significantly developed to allege particularized injury, in the court’s view. The users articulated additional theories of harm beyond their theoretical allegations that personal information has independent economic value. In particular, they alleged actual injury, including diminished and consumed iDevice resources, such as storage, battery life, and bandwidth; increased, unexpected, and unreasonable risk to the security of sensitive personal information; and detrimental reliance on Apple’s representations regarding the privacy protection given to users of iDevice apps.

In addition, the users described the specific iDevices used; the specific defendants that allegedly accessed or tracked their personal information; which apps they downloaded that accessed their personal information; and what harm resulted from the access or tracking of their information. They also identified the types of information collected, such as their home and workplace locations, gender, age, zip code, terms searched, and ID and password for specific app accounts.

The users also identified an additional basis for Article III standing, the court said. The injury required by Article III may exist by virtue of statutes creating legal rights, the violation of which creates standing. The users alleged violations of their statutory rights under the Wiretap Act and the Stored Communications Act. The alleged injuries were fairly traceable to the actions of the defendants. The Geolocation Class asserted that Apple intentionally designed its software to retrieve and transmit geolocation information located on its customers’ iPhones to Apple’s servers.

The iDevice Class alleged that Apple designed its products and App Store to allow individuals to download third-party apps and that Apple represented to users that it took precautions to safeguard their personal information. The app developers were accused of accessing personal information without users’ knowledge or consent. These allegations were sufficient to establish standing, the court concluded.

Stored Communications Act

The users’ claims under the Stored Communications Act (SCA) failed because the SCA was not applicable to the alleged conduct by Apple and the Mobile Industry Defendants, the court determined. Stating an SCA claim requires an allegation that the defendants accessed without authorization a “facility through which electronic communication service is provided.” The users’ mobile devices did not meet the SCA’s definition of “facility.” The users’ iDevices did not provide an electronic communications service simply by virtue of enabling use of electronic communication services.

In addition, the storage of real-time location information and other data on the iDevices did not qualify under the SCA as “electronic storage,” the court said. The iDevices stored location data for up to a year; such storage did not constitute the temporary, intermediate storage of data incidental to its transmission that the statute contemplates.

Wiretap Act

The users asserted that Apple’s collection of precise geographic location data from WiFi towers, cell phone towers, and GPS data on users’ devices constituted “interceptions” of data prohibited by the Wiretap Act. However, such data was not “content” covered by the Wiretap Act, the court said. Data automatically generated about a telephone call did not constitute “content” because it contained no information about the substance of the communication. The geolocation data was generated automatically and was not part of the information intentionally communicated by the users.

Computer Fraud and Abuse Act

The court also rejected the users’ Computer Fraud and Abuse Act (CFAA) claims. Apple had the authority to access iDevices and to collect geolocation data as a result of the voluntary installation of software by the users and, therefore, could not have violated the CFAA. In addition, the users failed to allege damage or “impairment” to their devices or an interruption of service.

California Constitutional Right to Privacy

Collection of the users’ data by Apple and the Mobile Industry Defendants did not violate the users’ right to privacy under the California Constitution, the court found. The alleged disclosure of device identifier numbers, personal data, and geolocation information from the users’ iDevices—even if transmitted without their knowledge or consent—was not an egregious breach of social norms, as required to state a claim for invasion of privacy. Rather, it was routine commercial behavior, according to the court.

California Consumers Legal Remedies Act

Apple could be liable for violating California’s Consumers Legal Remedies Act (CLRA), the court determined. The users sufficiently alleged that they sustained harm as a result of the alleged data collection practices. With respect to geolocation data, the users alleged that Apple had stored such data on the users’ iDevices for Apple’s own benefit, at a cost to the users, and that if Apple had disclosed the true cost of the geolocation features, the value of the iDevices would have been materially less than what the users paid.

In addition, the users contended that because of Apple’s failure to disclose its practices with regard to collection of personal data via apps, the users overpaid for their iDevices. At the pleadings stage, the users sufficiently alleged that they were consumers under the CLRA, and their allegations related to the purchase of goods, the court said.

California Unfair Competition Law

The court also decided that the users could go forward with their claims under the California Unfair Competition Law (UCL). The users had standing under the UCL because they alleged that they paid more for their iDevices than they would have if Apple had disclosed its privacy practices. Apple’s conduct could be illegal under the Consumers Legal Remedies Act and therefore covered by the UCL. In addition, the conduct could be “unfair,” for purposes of the UCL. The users met their burden of pleading fraud with particularity, according to the court.

The decision is In re iPhone Application Litig., CCH Privacy Law in Marketing ¶60,775.

Further details regarding CCH Privacy Law in Marketing appear here.

Wednesday, November 02, 2011





Customers Could Seek Costs of Mitigating Harm from Data Security Breach

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Customers of supermarket chain operator Hannaford could pursue claims for breach of implied contract and negligence under Maine law against Hannaford for failing to prevent a data security breach, the U.S. Court of Appeals in Boston has held.

The customers stated valid claims for damages based on the costs of replacing their credit and debit cards and of purchasing credit insurance after a breach resulted in the theft of an estimated 4.2 million debit and credit card numbers, expiration dates, PINs, and other personal information.

Mitigation Damages

The damages sought by the customers amounted to “mitigation damages,” the court said. These damages were reasonably foreseeable, and recovery for them had not been barred by Maine for policy reasons.

Under Maine common law, a plaintiff may recover for costs and harms incurred during a reasonable effort to mitigate its damages resulting from a defendant’s negligence, regardless of whether the harm is nonphysical. To recover mitigation damages, plaintiffs needed to show only that the efforts to mitigate were reasonable and that those efforts constituted a legal injury, such as actual money lost, rather than time or effort expended.

The case involved a large-scale, sophisticated, apparently global criminal operation conducted over three months and the deliberate taking of credit and debit card information. There had been actual misuse of customer data by the thieves, the court noted. The data had been used to run up thousands of improper charges to customers’ accounts; the customers were subject to a real risk of financial loss, making their mitigation efforts reasonable.

By the time Hannaford had notified customers of the breach, over 1,800 fraudulent charges had been identified, and the customers could have reasonably expected that many more fraudulent charges would follow. The customers’ claims for identity theft insurance and replacement card fees involved actual financial losses from credit and debit card misuse. Such damages were recoverable in Maine under both tort law and contract law, according to the court.

The customers could not, however, recover damages for their claims for loss of reward points, loss of reward point earning opportunities, and fees for pre-authorization arrangements. These injuries were too attenuated from the data breach because they were incurred as a result of third parties’ unpredictable responses to the cancellation of the customers’ credit or debit cards, the court said.

Breach of Implied Contract

With regard to the customers’ claims for breach of an implied contract, the court determined that a jury could find that, in a grocery transaction in which a customer uses a debit or credit card, there was an implied contract that Hannaford would not use the credit card data for other people’s purchases, would not sell the data to others, and would take reasonable measures to protect the information.

A customer using a credit card in a commercial transaction intended to provide that data to the merchant only and did not expect the merchant to allow unauthorized third parties to access the data, the court said.

Breach of a Fiduciary Duty

The customers failed, however, to assert a claim for breach of a fiduciary duty. First, the customers did not have a “confidential relationship” with Hannaford that would give rise to a fiduciary duty, according to the court. The “trust and confidence” allegedly placed by the customers in Hannaford was not the type of trust and confidence contemplated by Maine’s common law. Such claims typically involved family relationships, joint ventures or partnerships, and lender/borrower relations in which one party had taken advantage of another for purposes of acquiring or using the other’s property or assets. No such relationship existed in this case.

Second, the grocery purchase relationship between the parties was not characterized by a disparity in bargaining positions. Hannaford did not have a monopoly on the sale of groceries and did not require the use of credit or debit cards.

Third, the customers failed to allege that Hannaford abused a position of trust, the court said. There was no suggestion in the complaint that Hannaford provided anything but a fair exchange in groceries in return for the customers’ payments or that Hannaford somehow took advantage of the system of allowing customers to use credit and debit cards.

Unfair Trade Practices

Hannaford’s failure to disclose the breach did not give rise to a cause of action under Maine’s Unfair Trade Practices Act, the court decided. The private remedies provision of the Act required that the plaintiff suffer a loss of money or property as a result of the defendant’s unlawful act. Maine’s highest court had interpreted the Act as only allowing private actions for “substantial” injuries. The private remedies provision was to be read narrowly, particularly when common-law actions for negligence and breach of implied contract were available.

The decision in Anderson v. Hannaford Brothers Co. appears at CCH Privacy Law in Marketing ¶60,687.

Monday, October 31, 2011

Users Lacked Standing to Assert Privacy Claims Against Apple, Mobile App Developers

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Users of mobile applications on Apple’s devices could not maintain an action against Apple and mobile app developers for alleged violations of various federal and state privacy laws, because the users failed to allege that they had suffered any injury, the federal district court in San Jose has decided.

Because the users failed to sufficiently allege any injury in fact, the court concluded that they lacked constitutional standing.

Users may download apps for Apple devices only through Apple’s "App Store" application and website. According to the complaint, Apple represented to users that it took precautions to safeguard their personal information against "theft, loss, and misuse, as well as against unauthorized access, disclosure, alteration, and destruction."

Apps Access User Information

However, the devices’ operating system allows apps—without consent of the users—to access, use and track the following information: address book, cell phone numbers, file system, geolocation, International Mobile Subscriber Identity, keyboard cache, photographs, SIM card serial number, and unique device identifier. Developers of apps are able to exploit this access to collect and track personal data without the user’s permission or knowledge.

The users brought suit against Apple and eight mobile app developers for violations of various federal and state laws, including the Computer Fraud and Abuse Act and California’s Computer Crime Law. Apple and the developers argued that the users lacked standing to bring suit, because they did not allege any injury in fact. Apple also argued that its privacy agreements with users barred the users’ claims.

Injury in Fact

To satisfy the constitutional standing requirements of Article III, plaintiffs must show that:

(1) They have suffered an injury in fact that is concrete and particularized and actual or imminent;

(2) The injury is fairly traceable to the challenged action of the defendant; and

(3) It is likely, as opposed to merely speculative, that the injury will be redressed by a favorable decision.

In their complaint, the users alleged three injuries:

(1) Misappropriation or misuse of personal information;

(2) Diminution in value of the personal information, which is an "asset of economic value" due to its scarcity; and

(3) "Lost opportunity costs" in having installed the apps and diminution in value of the Apple devices because their insufficient security made them less valuable in light of the privacy concerns.

The court determined, however, that the users failed to allege any injury to themselves. The users did not identify which devices they used, if any of the developers accessed or tracked their personal information, and what harm, if any, resulted from such activity. As a result, the users failed to identify any concrete harm from Apple’s or the developers’ activities.

Injury Traceable to Defendants

In addition, the users failed to allege any injury that was fairly traceable to Apple or the developers. The users’ only allegation as to Apple was that Apple designed a platform that could potentially be used by the developers for harmful acts. Such conjectural or speculative allegations about the risk of harm are not sufficient for standing, the court concluded.

Lastly, Apple argued that "click-through" agreements with the users governed any potential liability for third-party apps on the users’ devices, and the express terms and conditions of the agreements barred claims against Apple for any alleged injuries.

The users argued that the agreements were unconscionable, providing no meaningful choice for users. While the court declined to determine whether the agreements were an absolute bar to the users’ claims, it noted that there is always a meaningful choice when a challenged term in a contract involves nonessential recreational activities—forgoing the activity.

The decision is In re iPhone Application Litigation, CCH Guide to Computer Law ¶50,268.

Tuesday, October 18, 2011

“Do-Not-Track” Approach to Consumer Privacy Questioned by FTC Commissioner Rosch

This posting was written by Jeffrey May, Editor of CCH Trade Regulation Reporter.

Federal Trade Commissioner J. Thomas Rosch reiterated his doubts about the viability of a “do-not-track” mechanism to protect consumer privacy in the United States, in a speech delivered at the Loyola Chicago Antitrust Institute Forum last Friday. Commissioner Rosch has called the FTC staff’s recent endorsement of such a mechanism “premature.”

A do-not-track mechanism would purportedly enable consumers to choose whether to block the tracking of their online searching and browsing activities in order to limit targeted advertising. In an FTC staff report issued in December 2010, entitled “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers,” the staff recommended the implementation of a do-not-track mechanism.

“Serious Reservations”

Commissioner Rosch concurred in the decision to issue the staff report for comment, but expressed “serious reservations” about the do-not-track proposal advanced in it. At the time the report was released, Commissioner William E. Kovacic also questioned the wisdom of do not track. However, Commissioner Kovacic left the agency earlier this month, leaving Rosch the only member of the Commission skeptical of the staff’s recommendation.

In his remarks in Chicago, Commissioner Rosch explained how the do-not-track approach to privacy protection has “generated attention not only from the Commission and the media, but also from Congress, the online industry, and a host of consumer advocacy groups.”

While there are bills in Congress that address broader privacy concerns without providing for a specific do-not-track mechanism, two pieces of legislation have been proposed this term that instruct the FTC to develop a specific do-not-track mechanism, according to Rosch.

Proposed Federal Legislation

The proposed “Do Not Track Me Online Act” (H.R. 654) would require the FTC to issue rules: (1) establishing standards for an online opt-out mechanism; (2) requiring mandatory disclosures regarding the collection, use, and sharing of information; and (3) allowing consumers to otherwise prohibit the collection or use of a broad array of information transmitted online.

The proposed “Do-Not-Track Online Act of 2011” (S. 913) would require the FTC to issue rules: (1) establishing a mechanism whereby consumers can simply and easily opt out of having their personal information collected online—including on mobile devices; and (2) prohibiting the collection of personal information from consumers who have opted out.

Online Industry’s Efforts

The Commissioner criticized the online industry’s efforts to implement do not track. He questioned claims that these efforts provide consumers with the choice to eliminate behavioral advertising, tracking, or targeted advertising. Specifically, he mentioned the browser-related mechanisms associated with Microsoft’s Internet Explorer 9, Mozilla’s Firefox, and Google’s Chrome and the self-regulatory regime of the Digital Advertising Alliance, which uses cookies to effectuate the choice mechanism.

According to Rosch, there are four overarching shortcomings with the industry’s efforts:

(1) Some of the mechanisms only allow consumers to opt out of behavioral advertising, but not all “tracking,” and there is a failure to alert consumers to this fact.

(2) Consumers may not be fully informed about the benefits or consequences of subscribing to a do-not-track mechanism. Commissioner Rosch expressed concern that “across-the-board” opting out by consumers might reduce the overall financing that supports free content across the Internet and, accordingly, result in a decrease in innovation.

(3) There was not much evidence that the mechanisms were really working to alert consumers about the existence of tracking and online behavioral advertising. The rates of adoption are very low.

(4) The current proposals involve well-entrenched firms that might favor barriers to consumer tracking in order to create or raise entry barriers to rivals. The firms’ intentions might not be solely to protect consumers against behavioral tracking.

“[W]e cannot be blinded so much by our zeal to protect consumers from behavioral tracking that we lose sight of our competition mission,” Commissioner Rosch said. “There is probably nothing worse than to have firms with an anticompetitive agenda designing consumer protection initiatives.”

The text of Commissioner Rosch’s October 14 remarks, entitled “Do Not Track: Privacy in an Internet Age,” appears here.

Monday, October 10, 2011

Courts Differ on Whether Publisher’s Purchase, Use of Data Violated Driver’s Privacy Protection Act

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

A class of Missouri residents could go forward with claims under the federal Driver’s Privacy Protection Act (DPPA) against West Publishing Co. for acquiring and disseminating personal information derived from driver’s license records from various states, the federal district court in Jefferson City, Missouri, has ruled.

However, a class of Illinois residents could not pursue a DPPA class action against West Publishing for the same conduct, the U.S. Court of Appeals in Chicago has held.

Missouri Action

The federal district court held that the class representative had standing under the DPPA and granted the representative’s motion to certify the class.

According to the district court, the DPPA did not:

(1) permit the publisher to obtain driver’s license information from the state when its sole purpose was to resell the information to third parties; or

(2) permit the publisher to disclose the entire driver’s license database to a business or individual having only a potential future use for some of the information sold.

The Missouri residents were not required to present evidence of specific misuse of their personal information, according to the court. The publisher obtained, and continued to obtain, large databases of motor vehicle records from several states. The databases contained personal information belonging to millions of licensed drivers.

The DPPA made nondisclosure of personal information the default rule. Under the statute, states were permitted to disclose driver’s license information only to “authorized recipients” who obtained information for one of the permissible uses under the DPPA, and not simply a recipient whom the state had authorized to receive information.

The DPPA did not delegate to the states the decision of who was an “authorized recipient.” The publisher, therefore, was not an “authorized recipient” of the information based on its mere purported purpose of reselling information for permissible uses, in the district court’s view.

Rather than specifically listing prohibited uses for driver’s license information, the statute generally prohibited all but 14 permissible uses. Bulk resale was not included in those permissible uses. Only one DPPA exception made reference to “bulk distribution,” and that provision required that individuals opt in by providing their express consent to such bulk release for marketing and solicitation.

Given the strict linkage between the method of obtaining data and the restrictions on resale, Congress could not have intended to create a “gaping hole” in the statute for resellers by authorizing them to obtain the entire driver’s license database simply by identifying themselves as resellers, in the court’s opinion. Congress could have created a separate exception for resellers, but it did not.

Class Certification

The class consisted of all persons who registered a motor vehicle in or were issued a driver’s license or state identification card by 29 states and the District of Columbia since February 19, 2006, and whose personal information was obtained, disclosed, or sold by the publisher. There was no dispute as to the numerosity requirement for class certification.

The publisher’s business model for obtaining and then selling information was a question of fact common to all class members, the court said. This was true regardless of whether this business model resulted in some individual sales for uses that were authorized by the DPPA.

Class action litigation was the superior method for adjudicating the claims, according to the court. Given the large number of potential plaintiffs and the commonality of their claims, certifying the class would allow a more efficient adjudication of the controversy than would individually litigating the claims.

The decisions are Johnson v. West Publishing Co., CCH Privacy Law in Marketing ¶60,672 and ¶60,673.

Illinois Action

The Seventh Circuit held that the publisher’s acquisition of personal information contained in the motor vehicle records did not violate the DPPA. The DPPA did not prohibit the publisher from reselling the residents’ personal information to third parties with permissible uses for the data under the statute.

The Illinois residents asserted that the publisher acquired the personal information contained in motor vehicle records of millions of drivers from state DMVs for resale. The residents failed, however, to state a DPPA claim.

The DPPA authorized the acquisition and resale of personal information by “authorized recipients” for 14 “permissible uses.” There was no allegation that the ultimate users of the records compiled and sold by the publisher lacked a permissible use for the records. The publisher did not have to have an immediate permissible use of its own in order to be an authorized recipient of the data, the appeals court said.

The decision in Graczyk v. West Publishing Co. will appear in CCH Privacy Law in Marketing.

Further information about CCH Privacy Law in Marketing appears here.

Friday, October 07, 2011

Final Approval Granted to Ameritrade Data Security Breach Settlement

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

A federal district court in San Francisco has granted final approval to a settlement of class action claims against online investment broker Ameritrade, which allegedly failed to prevent a data security breach that exposed more than six million account holders’ private information to spammers and rendered the same information vulnerable to others.

Preliminary approval was granted to the settlement in December 2010 (CCH Privacy Law in Marketing ¶60,574).

Cash Payout, Attorneys’ Fees

Ameritrade agreed to pay a minimum of $2.5 million and a maximum of $6.5 million in claims. Under the settlement, class members are eligible for a cash payout in amounts ranging from $50 to $2,500, depending on the nature of the account affected by the breach and the specific loss incurred. Attorneys’ fees for class counsel were capped at $500,000.

The settlement class was defined as “All persons who are or were accountholders or prospective accountholders of the Company and who provided physical or email addresses to the Company on or before September 14, 2007.”

The weakness of the plaintiffs’ case weighed in favor of granting approval to the settlement, the court said. Federal courts had viewed private class actions involving data breaches with skepticism, particularly where the only alleged injury was the receipt of spam, increased risk of identity theft, or loss of the benefit of the bargain.

Prosecuting the case through trial and the appellate process would involve a large amount of risk and expense. In addition, obtaining and maintaining class action status during the course of litigation would pose considerable risks to the plaintiffs.

The settlement afforded tangible benefits to the plaintiffs that were not available in prior proposed settlements, which had been rejected by the court.

Objections, Opt Outs

Notice of the settlement had been sent to the attorney general’s office of each of the 50 states, and none had objected. Of the approximately six million class members who were mailed the notice of the settlement, only 23 had submitted objections and fewer than 200 chose to opt out.

Objections to the settlement were overruled by the court. There was no evidence that class members had been misled about the terms of the settlement by the claims administrator.

Even though the settlement did not provide for a payout to every conceivable accountholder who might have been affected by the breach in some way, the settlement was a reasonable compromise that balanced the possible recovery against the risks inherent in litigating further.

The decision is In re TD Ameritrade Account Holder Litigation.

Thursday, September 15, 2011

FTC Proposes Amendments to Children’s Online Privacy Protection Rule

This posting was written by John W. Arden.

The Federal Trade Commission has proposed amendments to the Children’s Online Privacy Protection Rule in order to ensure that the rule continues to protect children’s privacy as online technologies evolve. The agency is seeking public comment on the proposal through November 28, 2011.

According to a September 15 press release, the proposed amendments would give parents control over what personal information websites may collect from children under 13 years of age.

The Children’s Online Privacy Protection Act (COPPA) (CCH Trade Regulation Reporter ¶27,590) requires operators of websites or online services directed to children under 13—or those having actual knowledge that they are collecting personal information from children under 13—to obtain verifiable consent from parents before collecting, using, or disclosing such information.

The FTC rule implementing COPPA—the Children’s Online Privacy Protection Rule (CCH Trade Regulation Reporter ¶38,059)—became effective in 2000.

In April 2010, the Commission sought public comment on the COPPA Rule, posing numerous questions for public consideration, holding a public roundtable, and reviewing 70 comments from industry representatives, advocacy groups, academics, technologists, and members of the public.

Proposed changes to the rule, released today, include:

Definitions. The FTC proposes updating the definition of “personal information” that may not be collected from children under 13 without parental consent to include geolocation information and certain “persistent identifiers” such as tracking cookies used for behavioral advertising. The agency further proposes a change to the definition of “collection” to allow children to participate in interactive communities, without parental consent, as long as the operators take reasonable measures to delete children’s personal information before it is made public.

Parental notice. The Commission seeks to streamline and clarify the direct notice that operators must give parents prior to collecting children’s personal information, requiring a succinct “just-in-time” notice rather than disclosure only in a privacy policy.

Parental consent mechanisms. New proposed methods of obtaining verifiable parental consent would include electronic scans of signed parental consent forms, video-conferencing, and use of government-issued identification checked against a database. These new methods would supplement the existing methods of obtaining parental consent, which include signed parental consent forms, parents’ use of a credit card in connection with a transaction, and parents' calls to a toll-free telephone number. The FTC proposes eliminating parental consent through “e-mail plus,” an e-mail to a parent coupled with another step such as sending an e-mail confirmation.

Confidentiality and security. Proposed rules would strengthen confidentiality and security by requiring that operators ensure that any third party to whom they disclose personal information has reasonable procedures to protect that information, retain the information for only as long as reasonably necessary, and properly delete that information.

Safe harbor. The FTC proposes to strengthen its oversight of self-regulatory “safe harbor programs” by requiring groups to audit their members at least annually and to report the results of audits to the Commission.

The 122-page notice of proposed rulemaking and request for comments appears here on the FTC website.

Submission of Comments

Interested persons may submit comments online here or may send a hard copy of comments to: Federal Trade Commission, Office of the Secretary, Room H-113 (Annex E), 600 Pennsylvania Avenue, N.W., Washington, D.C. 20580.

Write “COPPA Rule Review, 16 CFR Part 312, Project No. P-104503” on the submissions.

Friday, September 09, 2011

Breach Notification, Disposal Standards Added to Illinois Personal Information Protection Act

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

The Illinois Personal Information Protection Act has been amended to provide additional safeguards and penalties surrounding the protection of personal information, including prevention of and response to a security breach.

A new provision added to the Act requires the disposal of “materials containing personal information in a manner that renders the information unreadable, unusable and undecipherable.” The law containing the amendments (H. 3025, Public Act No. 483) was approved on August 22, 2011, and will be effective on January 1, 2012.

Detailed Notification of Breach

The amended Act provides additional details as to what security breach notifications must contain. Previously, the Act required entities to notify affected individuals that a breach had occurred, but it did not specify what the notification should include.

The changes require notifications to include:

• Toll-free numbers and addresses for consumer reporting agencies;

• The toll-free number, address, and website for the Federal Trade Commission; and

• A statement that the individual can obtain information from these sources about fraud alerts and security freezes.

Application to Storage of Data

The amended Act will apply security breach notification requirements to any data collector that maintains or stores computerized data. The current version of the Act does not apply to data collectors that merely store data for others. Moreover, service providers will be required to cooperate with data owners or licensees in regard to the breach.

Data Disposal Requirements

The new data disposal provision specifies the following proper methods for disposal of personal information:

• Paper documents containing personal information may be redacted, burned, pulverized, or shredded so that personal information cannot practicably be read or reconstructed; and

• Electronic media or other non-paper media containing personal information may be destroyed or erased so that personal information cannot practicably be read or reconstructed.

Any person, entity, or third party is subject to a civil penalty of $100 per individual whose personal information was not disposed of properly, capped at $50,000, and the attorney general may bring a civil suit to impose the penalty.
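The penalty arithmetic can be sketched as follows. This is a hypothetical illustration only, assuming the $50,000 cap applies to the total penalty for a disposal incident rather than per individual; the function name and structure are not drawn from the statute.

```python
# Illustrative sketch (not the statute itself): $100 per affected
# individual, with the total assumed to be capped at $50,000.
PENALTY_PER_INDIVIDUAL = 100
PENALTY_CAP = 50_000

def disposal_penalty(num_individuals: int) -> int:
    """Return the civil penalty for improper disposal of personal
    information affecting the given number of individuals."""
    return min(num_individuals * PENALTY_PER_INDIVIDUAL, PENALTY_CAP)

print(disposal_penalty(120))  # 120 individuals -> $12,000
print(disposal_penalty(900))  # 900 individuals -> capped at $50,000
```

Under this reading, the cap is reached once 500 or more individuals are affected by a single improper disposal.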

Text of Public Act No. 483 appears here. The current version of the Illinois Personal Information Protection Act is reported at CCH Privacy Law in Marketing ¶31,300.

Wednesday, September 07, 2011

California Enhances Data Breach Notification Requirements

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Under new legislation that will take effect next year, persons and entities doing business in California will be required to make additional disclosures in the event that the security of their computerized data systems is breached.

Existing law requires companies doing business in California to disclose data breaches involving the personal information of California residents.

The recent legislation (Senate Bill 24, Chapter 197) amended California Civil Code Sec. 1798.82, adding several specific requirements as to the form and substance of breach notifications. As amended, the statute requires breach notifications to be in plain language.

At a minimum, notifications must contain the following:

• The name and contact information of the notifying person or business.

• The types of personal information that were the subject of the breach.

• The date or estimated date of the breach.

• Whether notification was delayed as a result of a law enforcement investigation.

• A general description of the breach incident.

• The toll-free telephone numbers and addresses of the major credit reporting agencies, if the breach exposed California residents’ Social Security, driver's license, or identification card numbers.

At the discretion of the notifying company, the security breach notification may also include any of the following:

• Information about what the notifying company has done to protect individuals whose information has been breached.

• Advice on steps that persons whose information has been breached may take to protect themselves.

In addition, if notification is made to more than 500 California residents as a result of a single breach of the security system, the notifying company must electronically submit a single sample copy of the notification, excluding any personally identifiable information, to the California Attorney General.

The legislation was signed by Governor Jerry Brown on August 31, 2011, and will take effect on January 1, 2012. Similar bills were vetoed by former Governor Arnold Schwarzenegger in 2009 and 2010.

The current version of the law appears at CCH Privacy Law in Marketing ¶30,500.

Wednesday, April 13, 2011

Senators Introduce Proposed “Commercial Privacy Bill of Rights Act”

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

A new comprehensive regulatory framework to protect consumers’ personal information would be established by a Senate Bill unveiled by Senators John Kerry (D-Mass.) and John McCain (R-Ariz.) on April 12. If enacted, the proposed “Commercial Privacy Bill of Rights Act of 2011” (S. 799) would regulate the collection, use, and dissemination of covered information.

“John and I start with a bedrock belief that protecting Americans’ personal, private information is vital to making the Information Age everything it should be,” said Senator Kerry. “Americans have a right to decide how their information is collected, used, and distributed and businesses deserve the certainty that comes with clear guidelines.”

Senator McCain said, “Consumers want to shop, browse and share information in an environment that is respectful of their personal information. Our legislation sets forth a framework for companies to create such an environment and allows businesses to continue to market and advertise to all consumers, including potential customers.”

Covered Information

The measure would cover personally identifiable information (PII)—including name, postal address, e-mail address, phone number, Social Security number, credit card number, and biometric data—as well as any information that is used, collected, or stored in connection with PII in a manner that may reasonably be used to identify a specific individual—such as a birth date or an IP address.

Coverage would exclude PII obtained from public records; PII obtained from a forum where the individual voluntarily shared the information, that is widely and publicly available, and that contains no restrictions on who can access and view such information; PII reported in public media; and PII dedicated to contacting an individual at the individual’s place of work.

Covered Entities

The law would apply to any person who collects, uses, or transfers covered information concerning more than 5,000 individuals during a 12-month period, and over whom the Federal Trade Commission has authority pursuant to Sec. 5(a)(2) of the FTC Act. The law also would apply to common carriers under the Communications Act of 1934 and to nonprofit organizations.

Right to Security and Accountability

The bill would call on the FTC to create rules requiring covered entities to carry out security measures to protect covered information.

Taking a “privacy by design” approach to data protection, the bill would require covered entities to implement a comprehensive information privacy program by incorporating development processes and practices throughout the product life cycle that are designed to safeguard PII based on the subject individuals’ reasonable expectations and any relevant threats.

Covered entities also would be required to maintain appropriate management processes and practices throughout the data life cycle.

Right to Notice and Individual Participation

Collectors of information would be required to provide clear notice to individuals on their collection practices and the purpose for such collection. Additionally, individuals would have to have the ability to opt out of any information collection that would otherwise be unauthorized by the law, as well as the ability to opt out of having their information used by third parties for behavioral marketing or advertising.

Affirmative consent (opt-in) would be required for the collection of sensitive personally identifiable information, including information related to a medical condition or religious affiliation.

Individuals would be given the right to access and correct their information, or to request cessation of its use and distribution.

Data Minimization, Constraints on Distribution, Data Integrity

Collectors of information would be permitted to collect only as much information as necessary to process or enforce a transaction, to deliver a service, to prevent or detect fraud, to investigate a possible crime, to engage in advertising or marketing, for research and development, or for certain internal operations.

Information could be retained only as long as it takes to provide or deliver goods or services to the subject individual or as long as the information is necessary for research and development purposes.

Collectors would have to contractually bind third parties to which they transfer information to ensure that the third parties comply with the law’s requirements. The bill would require the collector to attempt to establish and maintain reasonable procedures to ensure that PII collected and maintained is accurate, if that PII could be used to deny consumers benefits or could cause significant harm.

Enforcement

A knowing or repetitive violation of the law would be treated as an unfair or deceptive act or practice in violation of the FTC Act. The FTC would be charged with enforcing the measure. State attorneys general also would have enforcement powers, unless the FTC takes action first. Violations would be subject to civil penalties.

The bill provides that it may not be construed to provide any private right of action.

The measure would supersede state laws relating to the collection, use, or disclosure of covered information. It would not preempt state laws (1) addressing health or financial information, (2) addressing notification requirements in the event of a data breach, or (3) relating to acts of fraud.

Safe Harbor

The bill would direct the FTC to create requirements for the establishment and administration of voluntary safe harbor programs to be overseen by nongovernmental organizations. Safe harbor programs would have to achieve protections at least as rigorous as those enumerated in the bill.

As incentive for enrolling in a safe harbor program, participants would be permitted to design or customize procedures for compliance and would be exempt from some requirements of the bill.

Further Information

Further information, including the text of the bill, is available here on Senator Kerry’s website.

Wednesday, March 02, 2011

Corporations Lack “Personal Privacy” Interests: High Court

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Corporations do not have “personal privacy” interests for the purposes of a provision of the Freedom of Information Act, the U.S. Supreme Court held on March 1 in an 8 to 0 decision. AT&T could not block disclosure of certain documents under FOIA’s Exemption 7(C), which covers law enforcement records that “could reasonably be expected to constitute an unwarranted invasion of personal privacy.”

A trade association had submitted a FOIA request for documents AT&T had provided to the Federal Communications Commission Enforcement Bureau during an investigation of that company. The Bureau found that Exemption 7(C) applied to individuals identified in AT&T’s submissions, but not to the company itself, because corporations do not have “personal privacy” interests as required by the exemption.

The FCC upheld the Enforcement Bureau’s interpretation, but the U.S. Court of Appeals in Philadelphia disagreed, reasoning that “personal” is the adjective form of the term “person,” FOIA’s definition of which included corporations.

"Person" as Individual

The Supreme Court rejected the appellate court’s reasoning. Although “person” was a defined term in the statute, “personal” was not. When a statute does not define a term, the Court typically applies the term’s “ordinary meaning.”

Chief Justice Roberts, writing for the Court, stated that “personal” ordinarily referred to individuals. Corporations are not usually regarded as having personal characteristics, personal effects, or personal tragedy, he said.

“Adjectives typically reflect the meaning of corresponding nouns, but not always,” Roberts reasoned. “Sometimes they acquire distinct meanings of their own.” For example, Roberts explained, the meaning of “crabbed” was distinct from “crab,” and “corny” was distinct from “corn.”

Absence of Other Statutory References

AT&T did not cite any other instance in which a court had expressly referred to a corporation’s “personal privacy,” Roberts noted, and it did not identify any other statute that did so. In addition, the term “personal privacy,” as used in FOIA Exemption 6—regarding personnel and medical files—had been interpreted by the Court as involving an individual’s privacy rights.

Justice Kagan did not take part in the consideration or decision of the case.

The March 1 decision in Federal Communications Commission v. AT&T will appear in CCH Privacy Law in Marketing.

Thursday, February 03, 2011

High Court Agrees to Review Vermont Prescriber Privacy Law

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

The U.S. Supreme Court has agreed to decide whether the First Amendment prohibits the enforcement of a Vermont law that restricts access to information in prescription drug records.

At issue is a decision of the U.S. Court of Appeals in New York City (CCH Privacy Law in Marketing ¶60,558) holding that a Vermont statute regulating the collection and use of data identifying health care providers’ prescribing patterns impermissibly restricted commercial speech.

The statute banned the sale, transmission or use of prescriber-identifiable data for marketing or promoting a prescription drug unless the prescriber gave consent.

The appellate court determined that Vermont had failed to show that the statute directly and materially advanced the substantial state interests of lowering health care costs and protecting public health. The law had been challenged by three data-mining companies.

Similar laws in New Hampshire and Maine have been upheld by the U.S. Court of Appeals in Boston (CCH Privacy Law in Marketing ¶60,270 and CCH Privacy Law in Marketing ¶60,527, respectively).

The petition is Sorrell v. IMS Health Inc., Docket 10-779, cert granted January 7, 2011.

Monday, January 10, 2011

“Truth in Caller ID Act” Signed by President Obama

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

A law that amends the Telephone Consumer Protection Act to prohibit the transmission of misleading or inaccurate caller identification information with the intent to defraud, cause harm, or wrongfully obtain anything of value was signed by President Obama on December 22, 2010.

The “Truth in Caller ID Act of 2009” (Public Law No. 111-331, Senate Bill 30) is intended to combat “spoofing,” which occurs when a telephone caller alters the number or other information that appears on a recipient’s caller ID to conceal his or her identity.

Transmissions are exempted if they are made in connection with authorized activities of law enforcement agencies or pursuant to a court order specifically authorizing the use of caller ID manipulation.

The law calls for the Federal Communications Commission to prescribe implementing regulations no later than six months after enactment.

Further details will appear in an upcoming issue of CCH Privacy Law in Marketing.

Friday, December 03, 2010

Vermont Prescriber Privacy Law Violates First Amendment

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

A Vermont statute regulating the collection and use of data identifying health care providers’ prescribing patterns impermissibly restricted commercial speech in violation of the First Amendment, the U.S. Court of Appeals in New York City has held.

A district court decision (CCH Privacy Law in Marketing ¶60,330) denying declaratory and injunctive relief from enforcement of the statute, sought by three data-mining companies, was reversed and remanded.

The statute banned the sale, transmission, or use of prescriber-identifiable data (“PI data”) for marketing or promoting a prescription drug unless the prescriber gave consent. The law restricted speech and did not regulate merely non-expressive conduct, the court said. Restricting the sale of prescription information was a restriction on disclosure of information, which was a regulation of speech.

Substantial State Interests

Vermont alleged that the law advanced three substantial state interests: (1) protecting the public health, (2) protecting the privacy of prescribers and prescribing information, and (3) containing health care costs.

Vermont purportedly sought to discourage marketing practices regarding new brand-name prescription drugs that may not be efficacious or may be no more effective than generic alternatives.

The state’s asserted interest in medical privacy was too speculative to qualify as a substantial state interest, in the court’s view. Vermont had not shown any effect on the integrity of the prescribing process or the trust patients have in their doctors from the use of PI data in marketing.

Lowering Costs, Protecting Public Health

Vermont did have a substantial interest in both lowering health care costs and protecting public health, but the court held that Vermont failed to show that the statute directly and materially advanced those interests.

The statute did not directly restrict the prescribing practices of doctors or the marketing practices of pharmaceutical companies. Rather, it restricted the information available to marketers so that their practices would be less effective and less likely to influence the prescribing practices of physicians.

This indirect approach was antithetical to a long line of Supreme Court cases stressing that courts must be very skeptical of government efforts to prevent the dissemination of information in order to affect conduct.

More Limited Restriction

In addition, Vermont's interests could be served as well by a more limited restriction on commercial speech, according to the court. The statute targeted the use of PI data to market all brand-name prescription drugs, not merely new brand-name drugs or those brand-name medications for which there were no generic alternatives. Thus, the statute banned speech beyond what the state’s evidence purportedly addressed.

The state had alternative means to promote its interests, such as mandating the use of generic drugs as a first course of treatment, absent a physician’s determination otherwise, for all patients receiving Medicare Part D funds.

The decision is IMS Health Inc. v. Sorrell, CCH Privacy Law in Marketing ¶60,558.

Wednesday, December 01, 2010

FTC Privacy Report Proposes “Do Not Track” Mechanism for Web Users

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

The Federal Trade Commission has proposed, as part of a framework to balance the privacy interests of consumers with innovation that relies on consumer information, the implementation of a “Do Not Track” mechanism for Internet users.

Described in a preliminary staff report, the mechanism would likely be a persistent setting on consumers’ web browsers, the FTC said, and would enable consumers to choose whether to allow the collection of data regarding their online searching and browsing activities.

“Technological and business ingenuity have spawned a whole new online culture and vocabulary—email, IMs, apps and blogs—that consumers have come to expect and enjoy,” said FTC Chairman Jon Leibowitz. “The FTC wants to help ensure that the growing, changing, thriving information marketplace is built on a framework that promotes privacy, transparency, business innovation, and consumer choice.”

Failure of Self Regulation

The report—titled “Protecting Consumer Privacy in an Era of Rapid Change”—states that industry efforts to address privacy through self-regulation “have been too slow, and up to now have failed to provide adequate and meaningful protection.” The framework outlined in the report is designed to reduce the burdens on consumers and businesses.

The proposed framework also is intended to inform policymakers, including Congress, as they develop solutions, policies, and potential laws governing privacy, and to guide and motivate industry as it develops more robust and effective best practices and self-regulatory guidelines.

Leibowitz added that the FTC, in addition to making policy recommendations, “will take action against companies that cross the line with consumer data and violate consumers’ privacy—especially when children and teens are involved.”

“Privacy by Design”

To reduce the burden on consumers and to ensure basic privacy protections, the report recommends that companies adopt a “privacy by design” approach by building privacy protections into their everyday business practices. Such protections would include security measures for consumer data, limited collection and retention of such data, and reasonable procedures to promote data accuracy.

Companies also should implement and enforce procedurally sound privacy practices throughout their organizations, including assigning personnel to oversee privacy issues, training employees, and conducting privacy reviews for new products and services.

Consumer Choice

Consumers should have the opportunity to make choices about the collection and sharing of their data at the time and in the context in which they are making decisions—not after having to read long, complicated privacy policies that they often cannot find.

The report adds that, to simplify choice for both consumers and businesses, companies should not have to seek consent for certain commonly accepted practices, such as product fulfillment, fraud prevention, and legal compliance.

A “Do Not Track” mechanism would constitute a simplified means of consumer choice, the report said, allowing consumers to opt out of the collection of information about their Internet behavior for targeted ads.

Transparency

The report also recommended other measures to improve the transparency of information practices, including consideration of standardized notices that allow the public to compare information practices of competing companies. Consumers should have “reasonable access” to the data that companies maintain about them, particularly for non-consumer-facing entities such as data brokers. In addition, the report proposed that stakeholders undertake a broad effort to educate consumers about commercial data practices and the choices available to them.

Text of the report is available here on the FTC website.

“Safe Harbor” Proposed

In response to the report, Senator John Kerry (D-Mass.) stated, “The Federal Trade Commission’s report should be a wakeup call for every Internet user in this country. The report confirms that many companies—both online and offline—don’t do enough to protect consumer privacy.”

Kerry continued, “The report also makes clear that properly protected information and respect for consumer trust can be good for both business and consumers.”

The senator called for the creation of FTC-approved safe harbor programs through which organizations can establish procedures to ensure compliance with standards on data protection, disclosures on information collected and the uses of such information, and the right of consumers to opt out.

“Those actors participating in safe harbor programs would be subject to FTC oversight and penalties,” Kerry said, “but because of their voluntary participation and commitment to high standards, they would be free from a private right of action and the complaint and adjudication process.”

Wednesday, October 13, 2010

“Smart Grid” Needs Consumer Education, Choice to Protect Privacy: Department of Energy

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

The long-term success of “Smart Grid” energy technologies depends upon understanding and respecting consumers’ reasonable expectations of privacy, security, and control over who has access to energy-usage data, the U.S. Department of Energy said in a report released October 5, 2010. The report, Data Access and Privacy Issues Related to Smart Grid Technologies, focuses on the ways legal and regulatory regimes are evolving to protect consumer privacy and choice, while promoting the growth of innovative energy-management services and technologies that rely on detailed energy-usage data.

Advances in Smart Grid technology could significantly increase the amount of information about personal energy consumption that is available to utility companies and third parties, the report found. For example, “advanced metering” technology that closely monitors electricity usage could reveal such personal details as consumers’ daily schedules, whether their homes are equipped with alarm systems, whether they own expensive electronic equipment like plasma TVs, and whether they use certain types of medical equipment.

“Consumers rightfully expect that the privacy of this information will be maintained,” the DOE said. At the same time, access to consumer data will be necessary to achieve the goals that Smart Grid technologies will advance, such as improved reliability in power delivery, reduced transmission costs, and increased energy efficiency.

According to the DOE, information privacy and access, in the context of a Smart Grid, are complementary values, rather than conflicting goals.

“The practical impact of a Smart Grid depends on its capacity to encourage and accommodate innovation,” the DOE said, “while making usage data available to consumers and appropriate entities and respecting consumers’ reasonable interests in choosing how to balance the benefits of access against the protection of personal privacy and security.”

Utility companies, the DOE said, should be able to access and use consumer-specific energy usage data (CEUD) for utility-related business purposes, such as managing their networks, coordinating with transmission and distribution-system operators, and billing for services. The report recommended, however, that consumers be able to choose whether to affirmatively opt in to any non-utility, third-party use of their CEUD through a secure and trustworthy process. In particular, according to the DOE, the practice of disclosing or selling CEUD to third parties for the purpose of targeted advertising should require affirmative and informed consumer consent.

Consumer education will be critical to the successful adoption and deployment of Smart Grid technologies like advanced metering, the DOE said.

“It is important for consumers to understand the long-term benefits of these technologies, like lowering energy bills,” the report stated.

The Federal Communications Commission’s National Broadband Plan, issued last spring, called for the DOE to study the privacy and access implications of Smart Grid technologies and how they were likely to affect the communications needs of utilities. The report complements a companion DOE report, also released October 5, 2010, Informing Federal Smart Grid Policy: The Communications Requirements of Electric Utilities.

Full text of the DOE’s report, Data Access and Privacy Issues Related to Smart Grid Technologies, is available here on the DOE’s website and will be published in CCH Privacy Law in Marketing.

Thursday, October 07, 2010

Retail Chain’s Data Security Breach Did Not Injure Customers

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Customers of retail grocery chain Hannaford Bros. whose financial information was stolen during a breach of the chain’s computer system did not sustain actual harm and could not pursue claims for negligence and implied contract under Maine common law, according to Maine’s highest court.

The customers asserted that they were injured by Hannaford’s failure to prevent the breach and to notify them of it.

Dealing with Fraudulent Charges

The expenditure of time and effort to identify and remediate fraudulent charges on their credit and debit card accounts did not constitute a cognizable injury, in the absence of physical harm, economic loss, or identity theft, the court held.

The time and effort expended by the customers represented the ordinary frustrations and inconveniences confronted by everyone in daily life, the court said. The question had been certified to the court by the federal district court in Portland, Maine (CCH Privacy Law in Marketing ¶60,382).


The decision is In re Hannaford Bros. Co. Customer Data Security Breach Litigation, CCH Privacy Law in Marketing ¶60,534.

Further information about CCH Privacy Law in Marketing appears here.

Monday, October 04, 2010

Schwarzenegger Again Vetoes Amendments to California Data Breach Law

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Proposed legislation to amend California's data security breach notification law was vetoed by Governor Arnold Schwarzenegger on September 29, 2010.

Senate Bill 1166 would have required any agency, person, or business required to issue a notification under existing law to meet additional requirements regarding that notification.

The legislation would have required security breach notifications to be written in plain language and to contain certain specified information, including contact information regarding the breach, the types of information breached, and, if possible to determine, the date of the breach.

It also would have required notification to the California Attorney General of breaches affecting more than 500 California residents.

Schwarzenegger vetoed an identical bill in 2009.

Veto Statement

In a message to the members of the California Senate, Schwarzenegger said:

“California's landmark law on data breach notification has had many beneficial results. Informing individuals whose personal information was compromised in a breach of what their risks are and what they can do to protect themselves is an important consumer protection benefit. This bill is unnecessary, however, because there is no evidence that there is a problem with the information provided to consumers. Moreover, there is no additional consumer benefit gained by requiring the Attorney General to become a repository of breach notices when this measure does not require the Attorney General to do anything with the notices.

“Since this measure would place additional unnecessary mandates on businesses without a corresponding consumer benefit, I am unable to sign this bill.”

Further information about CCH Privacy Law in Marketing appears here.

Thursday, June 10, 2010

Data Breach Notice, Anti-Spam Laws Proposed in Canada

This posting was written by Thomas A. Long, Editor of CCH Privacy Law in Marketing.

Proposed privacy legislation introduced on May 25, 2010 in the Canadian Parliament would create an obligation for Canadian businesses to notify the government and, in some cases, individuals of data security breaches and would place new restrictions on Internet and wireless spam.

Bill C-29 would add provisions to Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) (CCH Privacy Law in Marketing ¶42,200) to require organizations to report material breaches of data security safeguards to the Privacy Commissioner. Organizations would have to notify individuals of data security breaches only when such breaches create a risk of significant harm.

Disclosure of Personal Information

In addition, the measure would amend PIPEDA to permit the disclosure of personal information without the knowledge or consent of the individual for the purposes of:

(1) Identifying injured, ill, or deceased individuals and communicating with their next of kin;

(2) Performing police services;

(3) Preventing, detecting, or suppressing fraud; and

(4) Protecting victims of financial abuse.

Spam, Spyware

Another bill (Bill C-28) proposes the enactment of a new statute, the “Fighting Internet and Wireless Spam Act.” That legislation would prohibit the sending of commercial electronic messages without the prior consent of the recipient and would provide rules governing the sending of such messages, including a mechanism for the withdrawal of consent.

The statute would also prohibit the alteration of data transmissions and the unauthorized installation of spyware programs on computers. Violations would be subject to administrative monetary penalties by the Canadian Radio-television and Telecommunications Commission. Persons affected by violations would be able to bring a private action for actual and statutory damages.

Bill C-28 also would amend PIPEDA to prohibit the collection of personal information by means of unauthorized access to computer systems, as well as the unauthorized compilation of lists of electronic addresses.

“Canadian shoppers should feel just as confident in the electronic marketplace as they do at the corner store,” said Minister of Industry Tony Clement.

“With today’s two pieces of legislation, we are working toward a safer and more secure online environment for both consumers and businesses—essential in positioning Canada as a leader in the digital economy,” he added.