Do You Really Need to Store That IoT Data?

Not only are companies collecting a massive amount of data generated by the Internet of Things (IoT), they are storing it too. According to a survey of 1,000 enterprises conducted by 451 Research, 71 percent of enterprises are gathering IoT data and nearly half of the data generated is being stored. What the survey doesn’t reveal is whether companies are considering the legal implications of storing IoT data and preparing to deal with demands for that data from outside entities.

Some contend that the IoT is on the brink of changing life as we know it. According to Gartner, 20.8 billion objects will be connected to the internet by 2020. On their own, droves of these data-generating things will churn out an inconceivable amount of intriguing data about our patterns of behavior. And, when they begin talking to each other, the IoT will be as prevalent as oxygen.

With the IoT’s capability to fade into the background of our lives and quietly play witness to our worlds, the data it generates will pique the interest of a parade of public and private third parties that we can only begin to imagine, but that’s exactly what general counsel need to do. Sit down with business executives who are spearheading IoT projects and talk about the near-future risks associated with storing IoT data, prepare to respond to requests for IoT data and think about the impact on consumers.

When product development and marketing executives are working on an IoT initiative, top of mind are returns on investment, storage options and product development lifecycles. The last thing they are thinking about is criminal investigations, subpoenas and expectations of privacy rights. But in-house counsel need to engage business executives about the nature of the data—whether it needs to be stored and for how long—before the data is collected, not after, when it’s much more difficult to walk back.

Without legal counsel, business executives are more likely to store IoT data they might not necessarily need, laboring under the notion that they may decide to use it down the road. Business executives need to understand not just the costs but also the risks associated with storing IoT data, as well as consider risk mitigation options such as building in automatic data deletion when data is no longer needed.

With its ability to tell stories about who we are, what we are doing and why, IoT data will reveal information that public and private parties are willing to fight for in court and pay big money to gain access to. Most of us have heard about police requesting data from Amazon’s personal digital assistant to help solve a murder case. In a different case, police are attempting to use Fitbit data to prove a husband was involved in his wife’s death.

When data requests arise, will you institute a do-no-evil policy like that of Alphabet Inc.’s Google, which vows to turn over data only if a proper search warrant is issued? Will you fight court requests tooth and nail and make a public show of it to strengthen your privacy-protecting brand? Will you cooperate with any police request, or only with a court order? And how will you disclose all of this to consumers?

As with a breach response plan, in-house counsel should be prepared to respond to any third-party request for data—be it from law enforcement, a government regulator or a third-party litigant—in accordance with a corporate position and plan. And the plan should take into account the potential legal and public relations issues, given that these types of unanticipated issues can provoke an emotional response from the public.

The IoT is booming, devices are proliferating and IoT-generated data is already beginning to attract the attention of police. And, it’s only a matter of time before more parties come knocking on your digital door with data requests. In-house counsel need to be involved with IoT initiatives from the start, regardless of the form they take. Early intervention will minimize the risk of unnecessary collection and storage of IoT data, as well as give GCs the opportunity to promulgate policies for how the company will deal with IoT data demands.

 

Reprinted with permission from the August 7, 2017, edition of Corporate Counsel. ©2017 ALM Media Properties, LLC. All rights reserved. Further duplication without permission is prohibited.

 

A Reasonable Security Blanket

Fear the data breach.  Companies large and small worry that a security lapse compromising personal information may hurt their customers or employees and expose the organization to costly liability and a damaged reputation.  But recent developments suggest that comfort may still be found in keeping privacy promises and keeping up with “reasonable security” best practices.

This week’s $11.2 million settlement of the Ashley Madison class action is a reminder that companies handling potentially sensitive personal information can pay a heavy price for lax security. Of course, that case included allegations of “deceptive” as well as “unfair” practices under FTC Act section 5(a), since the company, for example, charged a fee for deleting data from closed accounts and then failed to do so. See the Bloomberg Law article (in which I’m quoted). But it follows last month’s $115 million proposed settlement of consolidated class actions against Anthem, Inc. after the first of a wave of cyberattacks against large health insurance companies in 2015 and 2016. In such cases, liability generally comes down to a simple question of whether the company kept up with reasonable security measures, not whether it broke specific privacy promises. These cases demonstrate that keeping up with reasonable security is a real challenge even for large organizations with substantial in-house IT resources.

The Federal Trade Commission has handled more than 60 complaints and consent orders concerning data breaches exposing sensitive personal data. Its “Start with Security” guide offers ten practical principles for businesses handling personal information. Today, the FTC announced that, over the next few months, it will publish a weekly Friday blog post called “Stick with Security” to offer insights drawn from its experience with data breach investigations. The first post explains how the FTC decides to take enforcement action in some publicized breach incidents and not others.

Good summer reading for a few minutes on Fridays.  Beach blanket optional.

Are You Keeping Up with COPPA? The FTC Just Updated Its Compliance Plan for Businesses

On June 21, 2017, the FTC released its updated six-step COPPA Compliance Plan for Businesses (“Compliance Plan”). The changes to the Compliance Plan are intended to help businesses keep up with changes to technology and evolving business models in connection with the Children’s Online Privacy Protection Act (“COPPA”). The FTC states in its blog post announcing the updated Compliance Plan that the changes fall into three categories:

New business models/technologies: COPPA applies to websites and other “online services.” The FTC is making clear, if there was any doubt, that voice-activated devices that collect information are “online services” subject to COPPA.

New products: The FTC has specifically highlighted connected toys and other IoT devices as being subject to COPPA.

New Methods for Obtaining Parental Consent: The FTC has added two recently approved methods for obtaining parental consent to its list: (1) having parents answer a series of knowledge-based questions that only a parent should be able to answer; and (2) using facial recognition technology and photographs submitted by the parent.

Key Takeaways: The FTC has not updated its rules but has clarified its position. Companies should be continuously aware of how the use of new technologies and the launch of new products could implicate COPPA — and in an IoT world many products that previously would not have been an “online service” now likely are.

Partner Heather Nolan To Be Interviewed This Wednesday, June 21, 2017 on Privacy Issues In Marketing and Broadcasting

On Wednesday, June 21, 2017, InfoLawGroup partner Heather Nolan will be interviewed by host Mari Frank on the Privacy Piracy radio program, broadcast out of the University of California. Heather will be discussing privacy issues in marketing and broadcasting. The show is scheduled for 3:30 p.m. Central Time and will be available for listening on the show’s website after the live stream.

 

InfoLawGroup Thanks Clients, Attorneys and Staff for 2017 Chambers Recognition

Chambers and Partners has again recognized InfoLawGroup in its new 2017 guide. Along with the firm’s ranking for Media & Entertainment: Transactional in the USA guide, two of our attorneys, Justine Gottshall (privacy) and Jamie Rubin (media & entertainment), were again recognized as leaders in their fields.

We are delighted to receive public acknowledgment from Chambers and to be recognized along with other great firms.  We are grateful to our clients for recommending us, and thank all of our attorneys and staff whose hard work continues to make InfoLawGroup a success.

Does #Partner Mean “I Was Paid To Post This Message”?

The FTC says no! Specifically, the FTC said: “terms like ‘Thank you,’ ‘#partner,’ and ‘#sp’ aren’t likely to explain to people the nature of the relationship between an influencer and the brand.” Before now, I might have approved the use of #partner in the right context. But last week, the FTC sent letters to more than 90 influencers (athletes, celebrities, etc.) reminding them of their obligation to disclose if they are being paid to post or otherwise have a relationship with a company or brand mentioned in their post. Letters were also sent to marketers to remind them of their obligations in this arena. The letters warn against the wrong ways to make these disclosures (e.g., don’t place the disclosure after a “more” button in a post, don’t bury it in a string of unrelated hashtags, and don’t make it vague – no #sp or “thank you”). Below are links to samples of the letters the FTC sent to influencers and marketers regarding improper/proper material connection disclosures. And check out our other posts that explain related compliance obligations when engaging influencers: HERE and HERE.

Influencer Sample Letter From FTC

Marketer Sample Letter From FTC

NY AG Settles with TRUSTe Over its COPPA Safe-Harbor Program

Last week, the New York Attorney General’s Office announced that it had entered into a settlement with privacy compliance company TRUSTe, signaling the AG’s continuing interest in children’s privacy and potentially portending an uptick in state-level enforcement under the Children’s Online Privacy Protection Act (“COPPA”).

TRUSTe operates an FTC-approved safe-harbor program for online services subject to COPPA. As an approved safe-harbor provider, TRUSTe is required to conduct annual audits of its clients’ online services to assess compliance with its program requirements. According to the New York Attorney General, TRUSTe’s assessments failed to adequately address the presence of third-party tracking technologies prohibited on child-directed sites under COPPA.

Continue Reading

New Litigation Against National Clothing Retailer for Use of “Up To % Off” Messaging

What Have Retailers Been “Up To” In New Jersey?

The last few years have been quite interesting for retailers, with a number of pricing- and advertising-related legal issues coming to the forefront. Recently, another front in this battle opened in New Jersey, where a national clothing retailer became the subject of a class action related to its discount price advertising. This time the claim relates to a seemingly tried-and-true method of retail advertising: the use of “Up To X% Off” promotional messaging. This also marks the next chapter in our ongoing discussion of “Up To” claims.

The suit, filed against JoS. A. Bank Clothiers, Inc., alleges that the retailer did not adequately display the minimum percentage of savings to customers when it used an “Up to 70 Percent Off” message and other similar messages in its advertising. The plaintiff, Michael Leese, alleges that this type of message violates New Jersey’s consumer protection laws, which require advertisements to state the minimum percentage of savings just as conspicuously as the maximum percentage of savings.

Continue Reading

BYOB: Be Your Own Broadcaster (and Studio) – What to Watch Out for in Content Creation and Distribution

2017 is starting to look like the year of DIY content creation and distribution. Companies are becoming their own studios and broadcasters at a seemingly record pace. If your organization doesn’t already have its own channel or stream, there’s a good chance someone is at least considering making it happen. We work hand in hand with clients as they move into this space, and here are some important things to take note of and plan to address:

Continue Reading
