By Justine Young Gottshall and Damien Wint
As we approach six months since the Federal Trade Commission’s (FTC) amendments to the Children’s Online Privacy Protection Act (COPPA) Rule, 16 C.F.R. Part 312 (the “Rule” or, as amended, the “Amended Rule”) became effective, it is essential that any website or online service that is not in full compliance take steps to do so. This article summarizes the Amended Rule’s changes and discusses their practical implications for website and online service operators.
- I. Introduction and Key Highlights
On December 19, 2012, the FTC issued amendments to the Children’s Online Privacy Protection Act Rule, 16 C.F.R. Part 312 (the “Rule” or, as amended, the “Amended Rule”) that subsequently became effective on July 1, 2013.
According to the FTC, the amendments are intended “to clarify the scope of the Rule and strengthen its protections for children’s personal information.” FTC 2012 Amended COPPA Rule Summary, 78 Fed. Reg. 3972, 3972. Consistent with such an ambitious goal, the amendments address a number of the Rule’s ambiguities and make significant changes to its requirements.
The Amended Rule modifies the definitions of:
- “Operator” to make clear that the Rule covers an operator of a child-directed site or service where it integrates outside services that collect personal information from its visitors, such as plug-ins or advertising networks;
- “Web site or online service directed to children” to (i) include plug-ins or ad networks that have actual knowledge that they are collecting personal information through a child-directed Web site or online service, and (ii) allow a subset of child-directed sites and services (those that target children but not as their primary audience) to “age screen” their users and require such properties to meet COPPA’s notice and consent obligations only for those users who self-identify as younger than age 13;
- “Online contact information” to include not only email, but also any other “substantially similar identifier” that permits direct online contact with a person;
- “Collect or collection of information” to make clear that a wide range of actions are covered under the Amended Rule, including the passive tracking of children’s personal information; and
- “Personal information” to (i) add “screen or user names” that function in the same manner as “online contact information,” (ii) add “photographs, videos, or audio files” that contain a child’s image or voice, and (iii) create new sub-categories for
- “Geolocation information” sufficient to identify street name and name of a city or town, and
- “Persistent identifiers” that can be used to recognize a user over time and across different Web sites or online services. The Amended Rule also adds a definition of “support for the internal operations of the Web site or online service,” and takes the position that operators need not obtain prior parental consent if the operator or third parties collect only persistent identifiers and do so solely for such purposes.
The Amended Rule also:
- Streamlines and clarifies the direct notice requirements to ensure that key information is presented to parents in a succinct “just-in-time” notice;
- Expands the non-exhaustive list of acceptable methods for obtaining prior verifiable parental consent and creates three new exceptions to the Rule’s notice and consent requirements;
- Strengthens data security protections by requiring operators to take reasonable steps to release personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of such information;
- Adds a data retention and deletion provision, 16 C.F.R. § 312.10, requiring operators to (i) delete personal information after it is no longer needed to fulfill the purpose for which it was collected, and (ii) use reasonable measures to protect against the unauthorized access to, or use of, such deleted personal information;
- Strengthens the FTC’s oversight of self-regulatory safe harbor programs by (i) requiring entities seeking FTC approval of their safe harbor program guidelines to submit comprehensive information about their capability to run an effective safe harbor program, (ii) increasing the base-level of oversight that safe harbor program providers must have over their members, and (iii) requiring safe harbor programs to submit annual reports to the FTC; and
- Institutes voluntary pre-approval mechanisms for new consent methods and for activities that support the internal operations of a Web site or online service.
- II. Updates to the Web Sites and Online Services Covered by COPPA
New Factors Relevant to FTC’s “Child-Directed” Site or Service Determination
The Amended Rule adds several criteria to the list of factors the FTC will consider when facing the pivotal question of what makes a website or online service a “child-directed” one under COPPA. The new factors include: (i) the site or service’s musical content, (ii) the presence of child celebrities, and (iii) the presence of celebrities who appeal to children.
The FTC has not provided additional guidance on how it intends to analyze these new factors (such as how it will determine whether a celebrity “appeals to children”), but has advised that these new factors will not predominate over existing criteria and will not cause the Rule to capture sites that should be deemed general audience sites. See 2012 Amended COPPA Rule Statement of Basis and Purpose (“2012 SBP”), 78 Fed. Reg. 3972, 3978.
Child-Directed Sites and Services Are Responsible for Collection of Personal Information by Third Parties Through Their Sites and Services
The Amended Rule clarifies that operators of sites and services that integrate plug-ins or advertising networks that collect personal information from visitors are responsible for that collection by third parties, regardless of whether the operator itself collects personal information from visitors. See 2012 SBP at FR 4001. What’s more, the Amended Rule also provides that operators will be held strictly liable for personal information collected by third parties through their sites. Id.; see also Section IV, below, for a detailed discussion of operators’ responsibilities with respect to personal information collected by third parties.
COPPA Applies to Plug-ins and Ad Networks with “Actual Knowledge” That They Are Collecting Visitor Information from a Child-Directed Site or Service
The Amended Rule also clarifies that it covers third parties (such as providers of plug-ins or online advertising services) with “actual knowledge” that they are collecting personal information through a child-directed Web site or online service (“Collectors”). The FTC expects its “actual knowledge” determinations to be highly fact specific, but has indicated that it will likely find “actual knowledge” where: (i) a child-directed content provider directly communicates the child-directed nature of its content to the entity collecting information through its site, or (ii) a representative of such an entity recognizes the child-directed nature of the content. See 2012 SBP, 78 FR 3972, 3978.
In addition, a Collector acquires “actual knowledge” if a site from which it collects visitor information sends the Collector a signal certifying that it is a child-directed site. Conversely, Collectors may ordinarily rely on a “not child-directed” signal received from a site. The FTC has cautioned, however, that such reliance is advisable only if sites affirmatively signal that they are not child directed. In other words, Collectors are advised not to rely on a signal that a site is not child-directed if the signal is sent by default. See COPPA FAQs, D.12.
- NOTE: A Collector may still acquire “actual knowledge” where it receives concrete evidence of a site’s child-directed nature (e.g., a screenshot), even if the Collector has also received a contradictory signal or representation from the site. That said, if the evidence of the site’s child-directed nature is inconclusive, then the Collector may still rely on a site’s specific affirmative representation or signal that it is not a child-directed site. Id.
Collectors should also beware that the FTC may find an entity has “actual knowledge” just because the child-directed nature of a site from which it collected personal information was communicated to one of its employees. Id., D.10.
- TIP: Collectors may want to set up a hotline for communications concerning their collection of data from child-directed sites, as the FTC has hinted that it will be less likely to find that a Collector has obtained actual knowledge because of a communication sent to an individual employee where the Collector has provided a mechanism for the public to share COPPA related information. Id.
The FTC’s Complying with COPPA: Frequently Asked Questions guide (“COPPA FAQs”) also provides helpful guidance for Collectors concerning the common scenario in which a Collector receives a list of sites that a third party claims are child-directed under COPPA. First, it is unlikely that the receipt of a list of purportedly child-directed Web sites alone would constitute actual knowledge, and such receipt does not, by itself, trigger any duty to investigate. But where the Collector of personal information receives screenshots or other forms of concrete evidence concerning a site’s status, such information is sufficient to provide the Collector with actual knowledge that the Web site is directed to children.
Second, where the plug-in or ad-network service provider receives information that makes them uncertain as to whether the site is child-directed, the provider may ordinarily rely on a specific affirmative representation from the Web site operator that its content is not child-directed.
- NOTE: For this purpose, a Web site operator would not be deemed to have provided a specific affirmative representation if it merely accepts a standard provision in a third party service provider’s Terms of Service stating that, by incorporating its code, the first party agrees that it is not child directed. See COPPA FAQs, D.11.
- III. Changes to the Definition of “Personal Information” Covered by COPPA
The Amended Rule expands the universe of information and activities covered under the Rule, adding the following to the Rule’s definition of personal information:
- Photos or videos containing a child’s image, and audio files with a child’s voice
- Operators must (i) meet COPPA’s parental notice and consent requirements before allowing kids to upload photos of themselves or other children, or (ii) prescreen and delete from children’s submissions any photos of themselves or other children.
- TIP: A site may still allow children to upload photos of themselves without first obtaining parental notice and consent if the site blurs such images before they are posted. See 2012 SBP, 78 FR 3972, 3982 (“The Commission believes that operators who choose to blur photographic images of children prior to posting such images would not be in violation of the Rule.”)
- Persistent identifiers that can be used to recognize a user over time and across different websites or online services, regardless of whether they are combined with individually identifiable information.
- Examples of persistent identifiers covered by the rule include: (i) cookies, (ii) IP addresses, and (iii) mobile device IDs. See FTC May 15, 2013 Model Letter to Business That May Be Collecting Persistent Identifiers.
- The Amended Rule provides that persistent identifiers can be collected without providing parental notice or obtaining verifiable parental consent provided certain conditions are met, as described more fully in Section VI below, but generally where the persistent identifiers are used only for internal operations purposes.
- Screen or user names that function in the same manner as online contact information that permits direct contact with a person online.
- Geolocation information sufficient to identify street name and name of a city or town.
- IV. Changes in Compliance Requirements
New and Modified Requirements for Protecting Personal Information
The Amended Rule revises the Rule’s data confidentiality, security, and integrity provision (16 CFR § 312.8) to require operators to inquire about data security capabilities of third parties to whom they release children’s personal information and, either by contract or otherwise, receive assurances from such entities about how they will treat the personal information they receive. Though they will not be required to “ensure” that these third parties secure the information absolutely, they must take reasonable steps to release children’s personal information only to service providers and third parties who are capable of maintaining the confidentiality, security and integrity of such information, and who provide assurances that they will maintain the information in such a manner.
Data Deletion and Data Retention
The Amended Rule includes a new data retention and deletion provision (16 C.F.R. § 312.10), which requires operators to anticipate the reasonable lifetime of the personal information they collect from children, and apply the same concepts of data security to its disposal as they are required to do with regard to its collection and maintenance.
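The retention-and-deletion duty described above can be operationalized in code. Below is a minimal sketch, assuming a simple in-memory store of records tagged with a collection date and an anticipated retention period; the record shape and field names are illustrative and are not drawn from the Rule itself.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

# Hypothetical record of a child's personal information, tagged with the
# anticipated lifetime for the purpose it was collected for.
@dataclass
class ChildRecord:
    user_id: str
    data: dict
    collected_at: datetime
    retention: timedelta  # anticipated reasonable lifetime of the data

def purge_expired(records: List[ChildRecord], now: datetime) -> List[ChildRecord]:
    """Return only the records still within their retention window.

    A production system would also need to dispose of the expired records
    with reasonable security measures (e.g., secure deletion), since the
    provision requires protecting against unauthorized access to deleted data.
    """
    kept = []
    for record in records:
        if now - record.collected_at < record.retention:
            kept.append(record)
        # else: securely dispose of record.data here
    return kept
```

The key design point is that each record carries its own retention period, since the "reasonable lifetime" depends on the purpose for which the information was collected.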
New and Modified Requirements Related to Third Party Collection of Personal Information
Operators’ Responsibility for Personal Information That Third Parties Collect Through Their Sites
The Amended Rule imposes specific requirements both on child-directed sites and services that allow third parties to collect personal information from site users, and the third parties themselves. As an initial matter, the FTC has left no doubt that covered operators are accountable for actions taken by third parties on their Sites, warning operators of child-directed sites and services that the “Amended Rule holds you liable for the collection of information that occurs on or through your sites and services, even if you yourself do not engage in such collection.” COPPA FAQs, D.7.
And, ignorance is not a defense, especially for operators of child-directed applications: “As the operator of a child-directed app, you must conduct an inquiry into the information collection practices of every third party that can collect information via your app.” Id. at D.8. According to the FTC, this is because app operators need to determine each third party’s information collection practices so that they can “make an informed decision as to whether its presence on [their] app will require [them] to give parents notice and obtain their consent prior to their collection of personal information from children.” Id.
Further, the Amended Rule requires that operators do more than “merely inform third parties of the child-directed nature of [their] site or service” to meet their COPPA Rule obligations. See COPPA FAQs, D.6. Operators should note that, despite not alone being sufficient to meet COPPA’s requirements, the FTC still recommends that their sites signal their status as a child-directed site to third parties.
Absent an exception under the Amended Rule, operators must (i) not allow any other entity to collect personal information from visitors; or (ii) in the alternative, provide notice and obtain prior parental consent before allowing any entity to collect personal information from their visitors, as well as provide all of the other COPPA protections. See 16 C.F.R. 312.2.
The FTC recommends operators arrange with third parties collecting personal information from their Site visitors to provide adequate COPPA protections; but no matter what the arrangement, the ultimate responsibility for providing these protections still rests with the Operator.
The FTC also strongly suggests that operators get answers to the following questions (amongst others) before entering into an arrangement under which a third party will serve advertising on their Site:
- Is there a way to control the type of advertising that appears on our Sites? For example, can we stipulate that, or contractually require you to, engage only in contextual advertising, and prohibit you from employing behavioral advertising or retargeting techniques? See COPPA FAQs, D.7
- What categories of information will be collected from users on the sites and services in connection with the ads they are served? Will persistent identifiers be collected for purposes other than support for internal operations? Will geolocation information be collected in connection with the ads served? Id.
- V. Changes to Online Privacy Notice Requirements
The Amended Rule cuts much of the granular detail that the original Rule required operators to disclose in their privacy policies. Instead, the Amended Rule dictates that online privacy notices for covered sites and services take a “shorter and more streamlined” approach to disclosing the operators’ information collection and use practices that would be “most critical to parents.” The FTC made this change “to encourage operators to provide clear, concise descriptions of their information practices…” Id. at C.2.
The FTC also notes that reducing the volume of required disclosures has the added benefit of making sites’ privacy policies easier to read on smaller screens (e.g., smartphones or other Internet-enabled mobile devices). Specifically, the online policy must provide:
- The name, address, telephone number, and email address of all operators collecting or maintaining personal information through the site or service (or, after listing all such operators, provide the contact information for one that will handle all inquiries from parents);
- A description of what information the operator collects from children, including (i) whether the operator enables children to make their personal information publicly available, (ii) how the operator uses such information, and (iii) the operator’s disclosure practices for such information; and
- A disclosure stating that the parent can review or have deleted the child’s personal information and refuse to permit its further collection or use, and state the procedures for doing so. See 16 C.F.R. 312.4(d).
- VI. Changes in Parental Notice and Consent Requirements
The Amended Rule made a few key modifications to the Rule’s notice and consent requirements and provides substantial guidance as to the steps Operators need to take in order to meet those requirements.
New and Modified Exceptions to Parental Notice and Consent Obligations
Modified Exception Based on the Deletion of Personal Information from Children’s Online Submissions
The Amended Rule removed the requirement that operators delete 100 percent of all personal information from postings made by children before publishing the posting or, in the alternative, provide parents with notice and get parental consent to publish that content. Now, it is sufficient that operators take reasonable measures to delete personal information from children’s posts before they are made public. 2012 SBP, 78 FR 3972, 3973-74.
New Exception for Persistent Identifiers
The Amended Rule also creates an exception to notice and consent obligations for the collection of a persistent identifier, where (i) no other personal information is collected alongside the persistent identifier, and (ii) the persistent identifier is used solely to provide “support for the internal operations of the Web site or service.” See 2012 SBP, 78 FR 3972, 3994.
Helpfully, the Amended Rule expands the definition of “Support for the internal operations of the Web site or online service” to provide guidance as to which uses of persistent identifiers will qualify for this exception:
- “Support for the internal operations of the Web site or online service means:
- (1) Those activities necessary to: (i) maintain or analyze the functioning of the Web site or online service; (ii) perform network communications; (iii) authenticate users of, or personalize the content on, the Web site or online service; (iv) serve contextual advertising on the Web site or online service or cap the frequency of advertising; (v) protect the security or integrity of the user, Web site, or online service; (vi) ensure legal or regulatory compliance; or (vii) fulfill a request of a child as permitted by § 312.5(c)(3) and (4);
- (2) So long as the information collected for the activities listed in paragraphs (1)(i)-(vii) of this definition is not used or disclosed to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, or for any other purpose.” See 16 CFR 312.2.
The major takeaway here is that the persistent identifier exception likely will apply where a persistent identifier is used in connection with site analytics and contextual advertising activities; but will not apply if it is used in connection with behavioral advertising.
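That takeaway can be expressed as a simple policy check. The sketch below encodes the two prongs of the exception: no other personal information may be collected, and every use of the identifier must fall within the enumerated internal-operations purposes. The use-category labels are hypothetical names chosen for this illustration, not regulatory terms.

```python
# Use categories loosely mirroring the enumerated internal-operations
# activities in the definition quoted above (16 CFR 312.2). The mapping
# of real-world practices onto these labels is an illustrative assumption.
INTERNAL_OPERATIONS_USES = {
    "site_analytics",           # maintain or analyze the site's functioning
    "network_communications",
    "authentication",
    "personalization",
    "contextual_advertising",
    "frequency_capping",
    "security_integrity",
    "legal_compliance",
    "fulfill_child_request",
}

def requires_parental_consent(uses, other_personal_info_collected):
    """Return True if the persistent-identifier exception does NOT apply.

    The exception applies only when (i) no other personal information is
    collected alongside the identifier, and (ii) every use of the
    identifier supports internal operations.
    """
    if other_personal_info_collected:
        return True
    return not uses <= INTERNAL_OPERATIONS_USES  # any non-listed use fails
```

Note that "behavioral_advertising" is deliberately absent from the set: under the second prong, using an identifier to amass a profile or contact a specific individual takes the operator outside the exception.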
New Exception Based on Age Screening
The Amended Rule has also introduced a narrow exception for sites or services that are classified as “child-directed,” but do not target children as their primary audience. An example of such a site would be one that attracts a disproportionate number of children younger than the age of 13, but has an audience composed mostly of the teenagers and adults that the site targets.
An operator of such a site or service may age-screen its users if it (i) does not collect personal information from any visitor prior to collecting age information, and (ii) prevents the collection, use, or disclosure of personal information from visitors who identify themselves as under age 13 without requisite parental notice and consent. See 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (3)).
Once the age screening is complete, these sites may not block children younger than age 13 from visiting or using their entire site. Instead, they may offer different activities and functions to users depending on their age. The portions of their sites that are accessible to children must meet all of the stringent protections contained in COPPA, while the portions of their sites that are accessible only to users who are 13 years old or older need not. Id.
- TIP: Operators who qualify for and wish to take advantage of this new screening mechanism are free to rely on a child’s representation that she is 13 years old or older, but the FTC still suggests that such operators ask age-verification questions in a neutral manner (e.g., do not include a check box stating “I am over 13 years old.”). Id. at D.3.
New Specifications for Form and Content of Direct Notices
The Amended Rule fleshes out in considerable detail the required form and content of the notice operators must provide in a variety of circumstances, including before (i) collecting a child’s personal information, (ii) collecting a parent’s online contact information, (iii) contacting the child more than once, and (iv) collecting a child’s and parent’s name and contact information for the child’s safety. Some of the specific requirements are as follows.
Form of Notice Required before Collecting a Child’s Personal Information
Where an operator seeks to obtain a parent’s verifiable consent prior to the collection, use, or disclosure of a child’s personal information, the direct notice must:
- State that (i) the operator has collected the parent’s online contact information from the child, and, if such is the case, the name of the child or the parent, in order to obtain the parent’s consent; (ii) the parent’s consent is required for the collection, use, or disclosure of such information, and that the operator will not collect, use, or disclose any personal information from the child if the parent does not provide such consent; (iii) if the parent does not provide consent within a reasonable time from the date the direct notice was sent, the operator will delete the parent’s online contact information from its records; and
- Include a hyperlink to the operator’s online notice of its information practices.
Form of Notice Required When Voluntarily Collecting a Parent’s Online Contact Information to Provide Notice of a Child’s Online Activities
Where an operator voluntarily seeks to provide notice to a parent of a child’s online activities that do not involve the collection, use or disclosure of personal information, the direct notice must:
- State that (i) the operator has collected the parent’s online contact information from the child in order to provide notice to, and subsequently update the parent about, a child’s participation in a Web site or online service that does not otherwise collect, use, or disclose children’s personal information; (ii) the parent’s online contact information will not be used or disclosed for any other purpose; (iii) the parent may refuse to permit the child’s participation in the Web site or online service and may require the deletion of the parent’s online contact information, and how the parent can do so; and
- Include a hyperlink to the operator’s online notice of its information practices. Id.
Form of Notice Required Before Using a Child’s Contact Information to Communicate With the Child More than Once
Where an operator intends to communicate with the child multiple times via the child’s online contact information and collects no other information, the direct notice must:
- State that (i) the operator has collected the child’s online contact information from the child in order to provide multiple online communications to the child; (ii) the operator has collected the parent’s online contact information from the child in order to notify the parent that the child has registered to receive multiple online communications from the operator; (iii) the online contact information collected from the child will not be used for any other purpose, disclosed, or combined with any other information collected from the child; (iv) the parent may refuse to permit further contact with the child and require the deletion of the parent’s and child’s online contact information, and how the parent can do so; (v) if the parent fails to respond to this direct notice, the operator may use the online contact information collected from the child for the purpose stated in the direct notice; and
- Include a hyperlink to the operator’s online notice of its information practices. See Id.
Form of Notice Required Before Collecting a Child’s and Parent’s Name and Online Contact Information to Protect the Child’s Safety
Where the operator’s purpose for collecting a child’s and a parent’s name and online contact information is to protect a child’s safety and the information is not used or disclosed for any other purpose, the direct notice must:
- State that (i) the operator has collected the name and the online contact information of the child and the parent in order to protect the safety of a child; (ii) the information will not be used or disclosed for any purpose unrelated to the child’s safety; (iii) the parent may refuse to permit the use, and require the deletion, of the information collected, and how the parent can do so; (iv) that if the parent fails to respond to this direct notice, the operator may use the information for the purpose stated in the direct notice; and
- Include a hyperlink to the operator’s online notice of its information practices. See Id.
New and Modified Mechanisms for Obtaining Parental Consent
The Amended Rule makes some significant changes to the Rule’s requirements for obtaining verifiable parental consent, which include modifications to the Rule’s non-exhaustive list of FTC-approved mechanisms for obtaining verifiable parental consent.
First, the Amended Rule clarifies that verifiable consent may be obtained by requiring a parent to use a credit card in connection with a monetary transaction. Though the FTC has long interpreted the Rule as limiting the use of credit cards as an acceptable mechanism for obtaining parental consent to situations involving actual monetary transactions, the Rule did not contain any specific language to this effect. See 2012 SBP, 78 FR 3972, 3987.
Second, the Amended Rule adds several newly approved ways to obtain verifiable parental consent, including via:
- Electronic Scanning: Involves providing a parent with a consent form that the parent may sign and return via an electronic scan (of course, it is still acceptable under the Amended Rule for parents to return consent forms via mail or fax)
- Video Verification: Involves having a parent connect to trained personnel via video conference
- Government-issued ID Verification: Involves checking government-issued ID (e.g., a driver’s license, or a segment of the parent’s Social Security number) that the parent provided against a database; the ID must be deleted from the Operator’s records when the verification process is complete
- Debit Card or “Other Online Payment System” Verification: Involves requiring the parent, in connection with a monetary transaction, to use a debit card or other online payment system that provides notification of each discrete transaction to the primary account holder. See Id. at 3986-87.
Third, the Amended Rule added a voluntary process that allows entities to seek FTC approval of new methods for obtaining verifiable parental consent. Anyone seeking approval for a novel consent method under this new process will be required to present a detailed description of the proposed parental consent mechanism along with an analysis of how it is “reasonably calculated, in light of available technology, to ensure that the person providing consent is the child’s parent.” 2012 SBP, 78 FR 3972, 3987.
Finally, the Amended Rule provides that any operator participating in a FTC-approved safe harbor program may use any parental consent mechanism that the safe harbor program deems to be compliant with the Rule’s requirement for obtaining verifiable parental consent. See 2012 SBP, 78 FR 3972, 3992.
- VII. Changes in Safe Harbor Programs
The Amended Rule also made some significant changes to COPPA’s safe harbor program that were intended to strengthen the Commission’s oversight of the program and to raise the minimum level of oversight that the programs exercise over participating operators. Specifically, the Amended Rule:
- Requires safe harbor programs to conduct annual, comprehensive reviews of each of their members’ information practices;
- Requires applicants seeking to become an FTC-approved safe harbor program provider to explain in detail their business models and the technical capabilities and mechanisms they possess; and
- Requires safe harbor program providers to submit annual aggregated summaries of the annual assessments they conducted on program members’ information practices.
- VIII. Conclusion
The Amended Rule makes significant changes, some of which are complex and may be difficult to fully address. Compliance with the Amended Rule will require operators to pay careful attention to the audience age composition of their sites and services, the kinds of data they collect, and the data collection practices of third party business partners and service providers. As a result, covered operators (and operators who aren’t sure whether they are covered) would be wise to study the Amended Rule and other FTC COPPA guidance, perform an initial assessment of how their current practices correspond to the Amended Rule’s new requirements, and, where there are gaps, work to address them as soon as possible.
 The full list of factors set forth in the Amended Rule now includes the site or service’s: (i) subject matter; (ii) visual content; (iii) use of animated characters or child-oriented activities and incentives; (iv) music or other audio content; (v) age of models depicted on the site; (vi) presence of child celebrities or celebrities who appeal to children; (vii) other characteristics of the Web site or online service; and (viii) whether advertising promoting or appearing on the Web site or online service is directed to children. See 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (1)).
 Complying with COPPA: Frequently Asked Questions: A Guide for Business and Parents and Small Entity Compliance Guide, revised July 2013, available at http://business.ftc.gov/documents/Complying-with-COPPA-Frequently-Asked-Questions.
 According to the FTC, this was less of an addition, and more of a clarification of the original rule, which already covered any geolocation information that provides information precise enough to identify the name of a street and city or town. See 2012 SBP, 78 FR 3972, 3982.
 The definition for online contact information lists email addresses, instant messaging (“IM”) user identifiers, voice over Internet protocol (“VOIP”) identifiers, and video chat user identifiers as examples of online contact information. The FTC has also confirmed that online contact information does not include mobile phone numbers. See 2012 SBP, 78 FR 3972, 3975.
 According to its Statement of Basis and Purpose, the FTC added this voluntary approval process because “little technical innovation in the area of parental consent has occurred.” The FTC established this new process to address the concerns of operators that have been reluctant to use consent methods other than those listed in the Rule, and hopes “to encourage the development of new consent mechanisms, and to provide transparency regarding consent mechanisms that may be proposed.” 2012 SBP, 78 FR 3972, 3991.
The Federal Trade Commission’s long-awaited “Internet of Things” public workshop was held Nov. 19, 2013, and webcast live (with presentations, transcripts, and videos to be archived for ready access at http://www.ftc.gov/video). The workshop explored a wide range of potential privacy and security issues associated with Internet-connected devices everywhere: at home, at work, and in the car. Because the workshop follows closely on the heels of the FTC’s first “Internet of Things” enforcement action in September (covered by the InfoLawGroup in “FTC Enters ‘Internet of Things’ Arena With TRENDnet Proposed Settlement”), FTC watchers have eagerly awaited the agency’s view on how it may push deeper into this nascent area.
New Study Finds that Two Thirds of U.S. Adults Would Not Return to a Business Where Their Personal Information was Stolen.
From hackers to stolen laptops, security breaches have been on the rise. While most businesses are aware of the dangers associated with potential security breaches, few truly understand the full ramifications. Calculating the time and money spent on investigations and notifications is fairly straightforward, but measuring the damage to a company’s reputation or customer confidence is more complicated. A recent survey sponsored by Cintas is helping to shed some light on this issue. An online survey of 2,061 U.S. adults ages 18 and older was conducted by Harris Interactive in August of this year, and the results are surprising. Nearly two thirds of the participants indicated that they would not return to a business where their personal information was stolen. For specific types of businesses:
– 55 percent would change banks
– 46 percent would switch insurance companies
– 42 percent would go to a different drug store/pharmacy
– 40 percent would get a new doctor or dentist
– 39 percent would get a new lawyer
– 38 percent would donate to a different charity/non-profit organization
– 35 percent would not return to their hospital
– 24 percent would no longer donate to their alma mater or another educational institution they attended.
It should be noted that the discrepancy between the two-thirds rate and the industry-specific rates suggests that while consumers are concerned about security breaches on the whole, a certain amount of customer loyalty maintains the relationship. As expected, that loyalty is strongest with educational institutions and charities and weakest with banks and insurance companies. Nevertheless, the survey results indicate that loyalty will only get you so far and that businesses should be proactive in safeguarding confidential information.
Earlier today the U.S. District Court for the Southern District of New York granted Google’s motion for summary judgment in the 8-year-running Google Book Search case. The court held that Google’s copying and display of in-copyright books is a noninfringing fair use. The decision is a signal that modern copyright law, despite its many flaws that become apparent in the digital age, will make at least some room for technological innovation.
About the Case
In 2004, Google announced plans to scan the full text of millions of books, both in the public domain and in-copyright, and to make those scanned works searchable on the web. For in-copyright works, Google would make “snippets,” each comprising one-eighth of a page, available as search results for keywords contained within them.
Plaintiffs (some individual authors and the Authors Guild) sued Google in 2005 for copyright infringement. The parties reached a settlement agreement in 2008, but the court later rejected that agreement. Earlier this year, plaintiffs suffered a setback when the Second Circuit held that a class action was not appropriate because of the fact-specific fair use questions.
Back at the trial court level, the parties cross-moved for summary judgment. Google argued, as it had from the outset, that the scanning and display is a noninfringing fair use. The court granted Google’s summary judgment motion.
The Court’s Decision
Fair use is an affirmative defense to copyright infringement. Section 107 of the Copyright Act guides a court on how to determine whether a defendant’s use is fair. That provision instructs that it is not copyright infringement if one uses another’s work “for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research.” It goes on to provide that a court should apply four non-exhaustive factors in the fair use analysis:
- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
- the nature of the copyrighted work;
- the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
- the effect of the use upon the potential market for or value of the copyrighted work.
In this case, the court began its fair use analysis by emphasizing how “[c]opyright law seeks to [provide] sufficient protection to authors and inventors to stimulate creative activity, while at the same time permitting others to utilize protected works to advance the progress of the arts and sciences.”
In applying the fair use factors, the court focused extensively on the first factor (purpose and character of the use). In this analysis, the court looked to the key question of whether the use was “transformative.” Borrowing from the Supreme Court, the court looked to whether Google’s scanning and display of the works “superseded” or “supplanted” the original creation, or whether the conduct:
instead adds something new, with a further purpose or different character, altering the first with new expression, meaning, or message; it asks, in other words, whether and to what extent the new work is “transformative.”
The court found the use to be transformative, comparing the “snippets” to thumbnail images, which were found to be protected by fair use in a 2003 case (Kelly v. Arriba Soft) involving an image search engine. Moreover, the court’s opinion noted the extensive benefits occasioned by Google’s efforts, including:
- The creation of new and efficient ways for readers and researchers to find books.
- The increased ability to conduct “data mining” or “text mining” research. For example, using the tool, researchers can examine word frequencies, syntactic patterns, and thematic markers to consider how literary style has changed over time.
- The expansion of access to books. In particular, the court noted how traditionally underserved populations will benefit as they gain knowledge of and access to far more books. Digitization facilitates the conversion of books to audio and tactile formats, increasing access for individuals with disabilities.
- The preservation and “new life” of old and out of print books.
The court acknowledged that Google was engaged in commercial activity through the scanning project. But that fact was not dispositive to the fair use finding. Given the nature of the benefits, the court was satisfied that the purpose and character of the use supported a fair use finding.
On the second fair use factor, the court found that because the works were already published and were non-fiction, the balance tilted in favor of fair use.
The third factor came out “slightly” in Google’s favor. While copying all of a work will generally weigh against the defendant in considering the “amount and substantiality of the portion used,” the fact that the search feature would not function properly had the entire work not been scanned put this factor on Google’s side.
Finally, on the fourth fair use factor — effect on the market — the court rejected plaintiff’s arguments that the scanning and display will negatively affect the sale of plaintiff’s works. Google does not sell the scanned copies, nor would it be feasible or practicable for a user to obtain an entire copy of the work through an assemblage of snippets. In fact, the court noted how the platform will help readers and researchers identify books, thus benefiting authors and publishers. From this, the court found, Google Books will generate new audiences and create new sources of income.
What This Says About Innovation and the Law
Google’s use of technology in this situation was disruptive. It challenged the expectation of copyright holders, who used copyright law to challenge that disruption. It bears noting that in the court’s analysis, it assumed that copyright infringement had taken place. But since fair use is an affirmative defense, it considered whether Google had carried its burden of showing that the circumstances warranted a finding that the use was fair. In this sense, fair use serves as a backstop against copyright ownership extremism. Under these particular circumstances — where Google demonstrated incredible innovation — that backstop provided room for the innovation to take root and grow. Technological innovators should be encouraged.
Educational institutions at all levels have begun to realize that they hold a treasure trove of student-related information that, if analyzed using “Big Data” techniques, could yield valuable insights to further their educational missions. Educational institutions hold a broad variety of student-related information that may be analyzed, including grades, financial information, health information, location-related information (both on and off campus), email, online and offline behavioral data, and more. Those advocating for the use of advanced analytics in this context hope that Big Data will enhance student performance, improve the educational experience, expand student financial aid options and their allocation, provide additional marketing opportunities and revenue for the school, allow for more effective teaching techniques, and lead to a more efficient and impactful allocation of school resources.
Of course, as one can imagine, Big Data projects using student-related information can implicate significant privacy issues. Schools are regulated under the Family Educational Rights and Privacy Act and, depending on a school’s specific activities, may be subject to GLB and HIPAA. In addition, many educational institutions have internal policies and public-facing privacy policies that apply to, and may limit, the collection, use, and disclosure of student personal information. The impact of applicable privacy laws and existing privacy-related policies should be taken into account well before engaging in a Big Data project. We have looked at Big Data privacy issues generally before, and the following is a framework for analyzing high-level legal considerations and action items for educational institutions considering Big Data projects involving student-related information.
The act of using editorial content for promotional and marketing purposes, or what has come to be known as “native advertising,” is a burgeoning and profitable area of advertising. And not surprisingly, the practice has caught the eye of regulators as a potentially deceptive trade practice.
In recent weeks, the National Advertising Division (NAD), a division of the Advertising Self-Regulatory Council that is administered by the Council of Better Business Bureaus, issued two rulings on the practice of native advertising. Typically, NAD cases address complaints brought by one competitor against another. Here, in contrast, the NAD brought the two complaints itself as a result of the NAD’s regular monitoring of national advertising claims.
In the first case, the NAD analyzed articles made available on the technology website http://mashable.com/. Qualcomm paid for the articles to advertise its Snapdragon microprocessors, which are designed for use in cell phones and tablets. Mashable employees wrote the majority of the articles posted on Mashable. The main issue in the case was whether the articles should have retained the Qualcomm sponsorship label after the sponsorship period terminated.
Qualcomm responded that it did not direct the creation or subject matter of the articles and that the articles did not address devices that contain Snapdragon or other Qualcomm products. Additionally, Qualcomm stated that the articles existed independently, without mention of Qualcomm, before the sponsorship began and continued to exist on the Mashable website after the sponsorship period concluded. Therefore, Qualcomm argued, there was no continuing obligation to disclose that the articles were sponsored.
The NAD began by pointing out that consumers could be misled when an advertiser conveys a commercial message without disclosing that it is the author of the message. This is because sponsored content can convey an explicit or implicit message about a product, the benefits of using the product, or the disadvantages of a competing product.
Here, the NAD determined that the sponsored content was independently created before the sponsorship began and was controlled by Mashable. The NAD found that Qualcomm’s sponsorship was more akin to an advertisement that ran alongside an article for a period of time, rather than content written to further an advertiser’s commercial purpose. Therefore, the NAD found that it was appropriate for Qualcomm to disclose itself as the sponsor of the articles when its advertisements ran in conjunction with the articles. But the NAD determined it was not necessary for Qualcomm to continue to identify itself as the sponsor after the sponsorship period ended and its advertisements ceased.
In the second recent native advertising case, the NAD analyzed advertising claims made by eSalon on http://www.esalon.com/, a hair coloring website for women, a secondary website eSalon runs called http://www.haircolorforwomen.com/, and online content eSalon published on its social media channels, including Facebook, Twitter, and Pinterest. The NAD had requested eSalon provide substantiation for several express advertising claims. After its review of the submitted evidence, the NAD found that eSalon could substantiate several of the claims at issue.
With respect to the other challenged claims, the NAD found that several articles and blog posts on http://www.haircolorforwomen.com/, which included hair styling tips, necessitated more conspicuous disclaimers to put consumers on notice of the fact that the articles and blog posts were sponsored by eSalon. Specifically, the NAD recommended that eSalon:
- Disclose, clearly and conspicuously, at the top of www.haircolorforwomen.com and on each page or post, that eSalon maintains the blog.
- Advise reviewers of their disclosure obligations when it provides incentives for posting online reviews or content about eSalon, and eSalon disclose any incentives it provides for posts about eSalon when eSalon promotes or otherwise redistributes such posts.
- Disclose its connection to the blog, www.haircolorforwomen, when it posts content from the blog on Pinterest or other social media.
- Discontinue its use of non-endorser celebrity photos on its website or in social media, because such use implies an endorsement of eSalon by the depicted celebrity.
What This All Means:
These cases highlight the increasingly blurred line between online advertising and pure editorial content as a growing number of advertisers embrace the practice of native advertising to create additional online revenue. In recognition of the trend, on December 4th, the Federal Trade Commission plans to host a roundtable on native advertising. See http://www.ftc.gov/opa/2013/09/nativeads.shtm. Whether the resulting discussions will lead to regulations and enforcement actions remains to be seen. In the interim, advertisers need to pay particular attention to native advertising and its placement online to ensure that the distinction between editorial and sponsored content is made clear and conspicuous to the consumer.
Insurers providing privacy liability coverage were collectively breathing a sigh of relief last week given a decision from the California Court of Appeals. Interpreting the California Confidentiality of Medical Information Act (CMIA), the court in Regents of the Univ. of Cal. v. Superior Court of Los Angeles County, No. B249148 (Cal. Ct. App. October 15, 2013) significantly limited the ability of plaintiffs to obtain nominal statutory damages of $1,000 per patient under CMIA. For the past several years, CMIA was pretty much the best game in town when it came to data breach litigation. Although enacted in 2008, only over the past several years was CMIA successfully used by plaintiffs’ counsel to obtain settlements previously unattainable post-breach. The CMIA “statutory damages” bonanza reaped by class counsel was significant – the prospect of such damages allowed counsel to overcome Article III and other “lack of injury” arguments, potentially allowed for class certification even with an otherwise uneven plaintiff pool, and created an early incentive to settle on the part of a defendant – and its insurer – given the potential size of an award.
It is no surprise that CMIA was the bane of a good number of network security and privacy insurers – it led to settlements that would not have otherwise occurred. The Regents decision is noteworthy because it is the first appellate decision to address the availability of CMIA statutory damages, and it rejected the notion that mere negligence coupled with disclosure could trigger statutory damages. This is a major departure from how the law was interpreted by the lower courts and instantly dried up a good part of the statutory damages manna drunk by plaintiffs’ counsel.
The facts of the case would provide a nice law school hypothetical – a doctor’s home is burglarized and his encrypted external drive is stolen – and, just for good measure, he cannot locate the note card containing the drive’s password. Was there unauthorized access to the stolen information? A CMIA private right of action allowing for statutory damages turns on whether “negligence results in unauthorized or wrongful access to the information.” It is easy to assume when someone may have also stolen the password located near a stolen hard drive that the theft will result in an unauthorized access – especially when the stolen drive is never found.
After reviewing the statute’s legislative history and related laws, the Court of Appeals strictly construed the statute to allow for nominal, or statutory, damages of $1,000 only when there was actual “unauthorized or wrongful access to the information.” Because the class plaintiff was unable to allege that her information was improperly viewed or otherwise accessed, the superior court was ordered to dismiss the case.
In effect, the Court of Appeals significantly neutered CMIA by requiring actual improper access to a patient’s medical information. In most likely breach scenarios, ID theft and “actual access” can go hand in hand. Armed with evidence of potential or actual ID theft, most plaintiffs’ counsel would withstand some level of motion practice – with or without CMIA. In other words, the benefits derived from CMIA’s availability of nominal damages may have dwindled to some potential commonality assistance during a class certification motion.
Although it remains to be seen whether insurers will lower healthcare privacy premiums due to this one decision, one thing is certain – claims adjusters will have “a little” extra free time on their hands.
In Part I of our two part series on FDA’s wireless medical device guidance, we provided a high-level overview of FDA’s wireless medical device quality control requirements and summarized the agency’s general recommendations to medical device manufacturers (MDMs) for securing wireless medical devices.
Here, in Part II, we dive into FDA’s more technical recommendations and concrete suggestions for designing, manufacturing, and maintaining safe wireless medical devices. In an era of enhanced FDA supervision (see Part I for further discussion), wireless medical device manufacturers would be wise to take these “recommendations” and “suggestions” to heart.
Medical and healthcare-related security and privacy concerns have been front page news in 2013, especially with recent launches of federal and state medical healthcare exchanges and changes stemming from the “HIPAA Omnibus Final Rule” enacted early this year that went into effect as of September 23rd. In a timely and notable report, the Ponemon Institute released a study sponsored by the Medical Identity Fraud Alliance, called the 2013 Survey on Medical Identity Theft (the “Survey”). The Survey attempts to measure “the prevalence of medical identity theft in the United States and its impact on consumers” by, in part, analyzing the experiences of 788 adults who self-reported that they or their family members had been victims of medical identity theft. What did Ponemon uncover?