Senior Counsel Richard Santalesa will address the Connecticut Association of Paralegals at its June 12, 2013 member event on how technological developments have affected the practice of law — from the way we now conduct legal research and present evidence at trial to the way law firms are managed — and on the changing expectations attorneys have of paralegals in today’s digital world.
Data security — and what qualifies as “reasonable” security — is on everyone’s mind these days, at least if you’re involved in IT or responsible for addressing any aspect of the “GRC” troika of governance, risk management and compliance.
Sometimes overlooked on the cyber side, however, is the interaction between cyber security and real-world physical security, and how the two can mutually reinforce each other to the benefit of security overall.
This fact was brought home this week as I attended ASIS International’s Security Conference and Expo in New York City, which was co-located with the Computer Forensics Show and the CyBit (cyber security and IT security) Expo.
The frequently beefy, bull-necked attendees at the NYC ASIS conference — where you couldn’t turn around without running into someone wearing the dress uniform of a federal, state or municipal law enforcement agency — were a far cry from the crowd that generally patrols and sits on panels at cyber security events. But we should rub elbows more with our colleagues manning the physical security wall, for a variety of reasons, not least because many physical “security” solutions have already embraced the digital, or soon will, and because security controls and contracts increasingly address — or should be addressing — physical security specifics with more particularity than in days past.
As reported by the LA Times, “a powerful coalition of technology companies and business lobbies, the California Chamber of Commerce, insurers, bankers and cable television companies as well as direct marketers and data brokers” were able to stop a California bill aimed at giving consumers greater insight as to the use of their personal data.
First introduced in February by Assemblywoman Bonnie Lowenthal (D-Long Beach), the proposed Right to Know Law (AB 1291) would have implemented major revisions to existing law and created new rights for consumers. Specifically, the proposed law would require
any business that has a customer’s personal information, as defined, to provide at no charge, within 30 days of the customer’s specified request, a copy of that information to the customer as well as the names and contact information for all 3rd parties with which the business has shared the information during the previous 12 months, regardless of any business relationship with the customer.
This new level of transparency might have helped soothe consumer concerns. According to a 2012 USC Dornsife/Los Angeles Times poll, 82 percent of Californians said they are “very concerned” or “somewhat concerned” about Internet and smartphone companies collecting their personal information. On the other hand, providing a full and accurate accounting of who had access to a consumer’s data — even to only the small percentage of consumers who would actually take the time to request it — would have been a major undertaking for a wide range of companies. It is not surprising that the companies who fought so hard to pull the plug on this bill represent a very diverse coalition of businesses.
Even if this bill does not get revived in a new form sometime in the future, the prospect of what it might have brought to the table should serve as a wake-up call to those businesses deep into online behavioral advertising. It may be time to better understand just who has access to what information — and it may ultimately not matter whether that information belongs to a current client or consumer, or whether it was anonymized. As usual, staying in front of the regulatory curve remains a sound business practice.
Last week, Arkansas enacted H.B. 1901, joining California, Illinois, Maryland, Michigan, New Mexico, and Utah in restricting employer access to employees’ social media and other personal accounts; a total of seven states now have such laws. New Jersey’s harsh bill, which we have covered, has cleared the Assembly and awaits the Governor’s signature. The Arkansas law provides in pertinent part:
An employer shall not require, request, suggest, or cause a current or prospective employee to:
(A) Disclose his or her username and password to the current or prospective employee’s social media account;
(B) Add an employee, supervisor, or administrator to the list of contacts associated with his or her social media account; or
(C) Change the privacy settings associated with his or her social media account.
The Arkansas law closes a potential loophole left open by some similar state laws, which did not prohibit employers from requiring employees or job applicants to become a “friend” or “connection” of the employer or its employees. That very provision, however, may raise new concerns that could be tested in a future case. For instance:
- Is a supervisor prohibited from sending a friend request to an employee he or she supervises? One could argue that the act of sending a request constitutes a ‘request’ or ‘suggestion’ that is prohibited by the statute. If so, potential First Amendment problems may arise. Is this particular act of the supervisor imputed to the employer if the employer otherwise has no hand in causing the friend request to be sent?
- From whose perspective is it determined whether a connection request is a statutory “request”? Employers and employees or job applicants may have different perspectives on this question.
Even those employers that do not maintain a policy of requiring access to employee social media accounts may wish to keep an eye on the development of these laws, given the possible issues noted above. As more states enact similar laws and tinker with the restrictions on employer conduct, the waters could get murkier still. Proactive employers may wish to begin considering revisions to their social media policies.
For operators of web sites, apps and other online services, change is definitely coming – and quickly. On April 25, 2013, the Federal Trade Commission (“FTC”) issued updated Frequently Asked Questions (the “FAQs”) for its amended implementing rule (the “Rule”) for the Children’s Online Privacy Protection Act (“COPPA”). The FAQs give some additional insight regarding the changes and updates to the Rule (for a summary of those changes, click here). Key takeaways of the FAQs include:
- The Rule will take effect on July 1, 2013 – the FTC has not granted an extension.
Anticipate that the FTC may act quickly to enforce the new Rule. All web sites and other online services should evaluate whether they are in full compliance with COPPA and the amended Rule and, if applicable, take immediate steps toward compliance in order to meet the July 1 deadline.
- The FTC clarified the extent to which the new Rule will apply to previously collected data:
Geolocation Data: The Rule only covers geolocation information precise enough to identify street name and city or town. The FTC considers its specific delineation of this type of data a mere clarification of existing regulation. Accordingly, if prior to July 1 an online service collected precise geolocation data without parental consent, the online service must immediately obtain parental consent. Note that this data may be collected passively and unintentionally, as geolocation information is sometimes automatically associated with uploaded files (such as pictures and video).
Photos, Videos and Audio Files: Online services may retain previously collected files containing a child’s image or voice without obtaining parental consent. However, the FTC recommends ceasing use or disclosure of this type of information without parental consent — as a “best practice.” Thus, businesses should consider carefully whether they will continue to use, disclose or retain photographs, videos and audio files after the Rule takes effect July 1. Note, too, that the FTC states it is acceptable to post photos where the child’s face is blurred and not recognizable.
User Names and Screen Names: The amended Rule defines user and screen names as personal information more broadly: as of July 1, these identifiers are “personal information” if they permit direct online contact with a person. As with photos, videos and audio files, organizations may retain, without parental consent, previously collected user and screen names (and similar identifiers) that are newly subject to the Rule. However, the Rule will apply to these identifiers if an organization associates any new information with them after the Rule takes effect July 1. In addition, the FTC recommends obtaining parental consent — as a “best practice.” Thus, online services should fully analyze their use of screen names and similar identifiers to determine whether they will trigger the new Rule, and consider obtaining parental consent.
Persistent Identifiers: Starting July 1, a persistent identifier (such as an IP address) is “personal information” subject to the Rule if it can be used to recognize a user over time and across different web sites or online services (whether or not combined with individually identifiable information). Here, too, organizations may retain, without parental consent, persistent identifiers that are now covered by the Rule but were not previously subject to it. However, the Rule applies to any collection on or after July 1 of that persistent identifier or any association of information with that persistent identifier (e.g., association of an IP address with browsing activity on a web site). Accordingly, beginning July 1, online services will need prior parental consent to collect data using persistent identifiers unless the information is used solely for support of internal operations or falls within another exception under the Rule.
- Mobile phone numbers are not “online contact information” as defined by the Rule.
Accordingly, operators of online services must not collect mobile phone numbers from children as part of the process to obtain parental consent. Instead, operators should collect an email address, IM user identifier, VOIP identifier, video chat user identifier or other substantially similar identifier. However, once in contact with the parent, the parent may provide his or her mobile phone number for further communications.
- The FTC provided App-specific guidance:
Parental Notice and Consent: All operators of online services (whether app, website or other online service) may collect from the child the parent’s online contact information for the sole purpose of providing direct notice to the parent. The FTC also recognizes that other acceptable means are available through apps, and operators may use those other means, such as through the mobile device, so long as the mechanism provides notice and obtains the parent’s consent prior to the collection of personal information from the child and is reasonably designed to ensure that it is the parent who receives the notice and gives consent. Note, however, that an app may not rely on collection of an app store account number or password to fulfill the Rule’s notice and consent requirements without other indicia of reliability, which may include knowledge-based authentication questions, because the app store account information (user name or account number and password) may well have been provided by the child and not the parent.
Locally Stored Content: An app is not “collecting” personal information and does not trigger compliance obligations merely because it includes features that allow a user to upload photos or otherwise interact with personal information stored on the device — so long as that information remains locally stored and is never transmitted from the device.
- The FTC provided specific guidance regarding online advertising:
Key points from the FAQs with regard to how the Rule impacts online advertising include:
- Behavioral advertising triggers the Rule; it does not fall within the term “support for internal operations,” and thus there is no exception for collecting persistent identifiers if they are used for behavioral advertising.
- A child-directed content provider will be strictly liable for any collection of personal information (including persistent identifiers such as IP address) by a third party.
- A child-directed content provider must provide notice and obtain prior parental consent before allowing any third party to collect personal information from visitors.
- It is acceptable for a child-directed content provider to allow for contextual advertising on the site – but it must ensure that doing so does not otherwise violate COPPA or the Rule.
- A company (for example, an online advertising service) that collects information through a third party web site or other online service will have “actual knowledge” that it has collected personal information from users of a child-directed site or service if: (1) the content provider directly provides that information; or (2) a representative of the company recognizes that the nature of the content on the third party site or service is child-directed.
- Online services partially directed to children may age-screen but may not block users who are younger than age 13
The Rule allows a web site, app or other online service that falls under the definition of being directed toward children — but where children are not the primary audience — to use an age screen to differentiate between child and non-child users. Businesses must then either obtain parental consent or not allow children to participate in features and activities that collect personal information as defined by the Rule. However, the FTC makes clear repeatedly in the FAQs that businesses must not altogether prohibit children from participating in a site or service that is “child-directed” as determined by a preponderance of factors as set forth in the Rule and FAQs. Organizations should take care in determining whether their online service is “directed to children” and, if so, whether it is directed to children as the “primary audience.” Any age-screening mechanism should comply with FTC guidance (including with regard to not blocking child-users for at least certain types of sites and taking care not to encourage children to falsify their information).
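To make the routing logic concrete, here is a minimal, purely illustrative sketch of a neutral age screen for a mixed-audience service. It is a hypothetical example — not legal advice, not an FTC-endorsed mechanism, and the function names and the `parental_consent_flow` route are our own invented labels. The key points it illustrates are that the screen collects a birth date neutrally (it does not hint at a cutoff) and that users under 13 are routed to a parental consent flow rather than blocked outright.

```python
# Hypothetical age-screen sketch for a mixed-audience service that is
# "directed to children" but where children are not the primary audience.
# Illustrative only; names and routing labels are assumptions.
from datetime import date

COPPA_AGE = 13  # under this age, verifiable parental consent is required


def age_from_birthdate(birthdate: date, today: date) -> int:
    """Compute age in whole years from a neutrally collected birth date."""
    years = today.year - birthdate.year
    # Subtract one year if this year's birthday hasn't occurred yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def route_user(birthdate: date, today: date) -> str:
    """Route the user; children are not blocked, only routed differently."""
    if age_from_birthdate(birthdate, today) < COPPA_AGE:
        # Do NOT collect personal information at this point; instead begin
        # the parental notice-and-consent flow (e.g., collect the parent's
        # email address for the sole purpose of providing direct notice).
        return "parental_consent_flow"
    return "full_experience"
```

The design point mirrored from the FAQs is that the under-13 branch changes *what happens next* (consent before collection) rather than denying access, and nothing in the prompt shown to the user should suggest which birth dates pass the screen.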
- Reasonable security measures include contract provisions and periodic monitoring
Organizations must determine that third parties have reasonable practices in place to maintain the confidentiality and security of data prior to sharing personal information of children with those third parties. The FAQs state that contracts with service providers should specifically address this issue and that the entity sharing the data must use reasonable means, such as periodic monitoring, to confirm that the third party is, in fact, maintaining the confidentiality and security of the information.
- Operators of online services may need to update their privacy policies and online practices.
Verizon’s annual “Data Breach Investigations Report” (“DBIR”) is a must-read for data and information security professionals, and we eagerly await each release. The 2013 DBIR is now out. Now in its sixth year, the DBIR provides a broad overview of the changing information security and data breach landscape from year to year, combining Verizon’s own Risk Team breach data with data from 19 participating organizations around the world to glean lessons learned from the analysis of more than 47,000 security incidents and 621 confirmed data breaches.
What does this year’s Verizon Data Breach Investigations Report reveal? Read on.
Serenity Springs v. LaPorte County Convention and Visitors Bureau, 2013 WL 1560206 (Ind.App. April 15, 2013)
Here’s a case that shows how, with just a couple of minutes’ effort and only a few dollars, marketing professionals can prevent loads of trouble and expense for their organizations down the road.
Plaintiff, a local government-run tourism bureau, announced at a public meeting that it had come up with a marketing phrase (“Visit Michigan City LaPorte”) to promote local commerce. An employee of one of the businesses attending the meeting went home and registered the phrase as a domain name, and had it redirect to the company’s website.
Plaintiff sued (in state court), and got the trial court to order the domain name be transferred, and a permanent injunction against defendant’s use of the purported mark. Defendant sought review with the Indiana Appellate Court. On appeal, the court reversed and remanded.
The appellate court found that the mark comprising the domain name was geographically descriptive and had not acquired secondary meaning prior to defendant’s registration of the domain name. Without trademark protection, the court held, plaintiff’s claims of infringement and cybersquatting failed, and the lower court erred in finding in plaintiff’s favor.
The real moral of the story is that organizations that are adopting trademarks as part of a branding campaign should take at least the minimal steps necessary to protect those proposed marks. In this case, a simple $12 investment in the prospective domain name hours before the meeting would have saved thousands of dollars in litigation costs over the next three years.
And you thought the privacy legal landscape couldn’t get any more challenging to traverse for online operators. Guess again.
California legislators recently proposed a bill that would significantly broaden the state’s “Shine the Light Law,” Cal. Civ. Code § 1798.83. Enacted ten years ago, the Shine the Light Law became a de facto national standard for online privacy, requiring online operators to disclose to consumers how they use and share consumers’ personal information.
Proponents of the bill argue that it would provide more transparency to consumers about the data collection and sharing practices of online operators. Proponents also contend that the bill would more closely align United States law with the data disclosure laws in Europe, laws with which many online operators in the United States already comply.
Opponents of the bill see things differently. They argue that the bill is too broad and unworkable. For example, the California Chamber of Commerce contends that the bill unnecessarily expands the definition of “personal information” to include device identifiers. Opponents also argue that it would be impractical to require online operators to provide the name and address of every entity with which they share consumer information. Even more harrowing, opponents argue, are the bill’s failure to define what constitutes injury to the consumer and the bill’s stiff penalties. If an online operator fails to comply, the consumer may recover a civil penalty of up to $500 per violation and up to $3,000 per willful, intentional or reckless violation. The bill provides, however, that non-willful violations may be cured within 90 days of notice to avoid a penalty.
The bill for the “Right to Know Act” is scheduled for a hearing in the state legislature at the end of this month.
As a firm focused on all evolving aspects of privacy law, InfoLawGroup is obviously often called upon to assist its clients with consumer privacy legal issues. This post takes a detour into privacy theory terrain, prompted by a recent New York Times article. In “Letting Down Our Guard With Web Privacy,” published on March 30, 2013, the author details ongoing research by Alessandro Acquisti, a behavioral economist at Carnegie Mellon University. Mr. Acquisti’s research is cutting-edge when it comes to online behavioral advertising (OBA) and associated consumer behavior. Indeed, he is the academic who famously announced in 2011 that one might be able to discover portions of someone’s Social Security number simply by virtue of a posted photograph. His research often distills to one major premise: consumers may not always act in their best interests when it comes to online privacy decisions.
It appears consumers and merchants alike may be missing out on fully cultivating a very valuable commodity. According to the World Economic Forum, “personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.” Rethinking Personal Data: Strengthening Trust, at 7, World Economic Forum Report (May 2012). Before this asset class can ever be completely exploited and fully commercialized, however, its constituent value components must be correlated by all in the privacy food chain.
Chicago attorney Ben Stein has joined InfoLawGroup LLP as Counsel. Ben comes to InfoLawGroup from Edwards Wildman Palmer LLP and brings experience in matters involving advertising, promotions, new media, privacy, trademarks and intellectual property. He has been published in the World Trademark Review and is a member of the International Trademark Association. Ben holds both a B.A. and a J.D. (cum laude) from the University of Michigan.