EU Regulators Provide Guidance on Notice and Consent under GDPR

By Mark Young, Joseph Jones and Ruth Scoles Mitchell

The Article 29 Working Party (WP29) has published long-awaited draft guidance on transparency and consent. We are continuing to analyze the lengthy guidance documents, but wanted to highlight some immediate reactions and aspects of the guidance that we think will be of interest to clients and other readers of InsidePrivacy. The draft guidance is open for consultation until 23 January 2018.

Transparency

Updating existing notices. The guidance is clear that if processing already is underway, “a data controller should ensure that it is compliant with its transparency obligations as of 25 May 2018.” In other words, notices need to be updated to include all of the information set out in Articles 13 and 14.
Content of notices. A schedule in the guidance sets out all of the required information (under Articles 13 and 14) and WP29’s corresponding comments, such as: notices preferably should include different means to communicate with the controller; notices should specify the “relevant” legal bases; and categories of recipients should be as specific as possible (and the default should be to “provide information on the actual (named) recipients”).
Clear language. The guidance emphasizes the need to use clear language, and states that expressions such as the following are not sufficiently clear: “’We may use your personal data to develop new services’ (as it is unclear what the services are or how the data will help develop them); ‘We may use your personal data for research purposes’ (as it is unclear what kind of research this refers to); and ‘We may use your personal data to offer personalised services’ (as it is unclear what the personalisation entails).”
Website notices. The guidance includes some specific pointers on providing notice on websites and in other online contexts, and making sure that notices are easily accessible. In relation to websites, for example, it states, “Positioning or colour schemes that make a text or link less noticeable, or hard to find on a webpage, are not considered easily accessible.”
App notices. The guidance acknowledges that it can be difficult to provide notice but that users should not have to go searching for it. In the app context, it states that, “once the app is installed, the information should never be more than ‘two taps away’. Generally speaking, this means that the menu functionality often used in apps should always include a ‘Privacy’/ ‘Data Protection’ option.”
Notices to children. Language should be tailored to the audience. When processing children’s data, the language should be age-appropriate. The guidance notes that, “A useful example of child-centred language used as an alternative to the original legal language can be found in the ‘UN Convention on the Rights of the Child in Child Friendly Language’.”
Means of providing notice. Providing information in writing is the default method, and the guidance refers to various options, including layered privacy statements/ notices, “just-in-time” contextual pop-up notices, 3D touch or hover-over notices, and privacy dashboards. Additional “means” include “videos and smartphone or IoT voice alerts . . . , cartoons, infographics or flowcharts” (see WP29 Opinion 8/2014 on Recent Developments in the Internet of Things). The guidance goes on to set out recommendations for each of these methods of providing information, including for providing notice orally and in-person.
Icons. The guidance clarifies that icons should not replace all of the information required under Articles 13 and 14, but should be used in combination with such information (citing Article 12(7)). The draft guidance recognizes that “the development of a code of icons should be centred upon an evidence-based approach and in advance of any such standardisation it will be necessary for extensive research to be conducted in conjunction with industry and the wider public as to the efficacy of icons in this context.”
Free services and notice. Where free services are being provided, “information must be provided prior to, rather than after, sign-up given that Article 13(1) requires the provision of the information ‘at the time when the personal data are obtained’.” The guidance also states, “information provided under the transparency requirements cannot be made conditional upon financial transactions, for example the payment for, or purchase of, services or goods.”
Changing notices. Going forward, “a notification of changes should always be communicated by way of an appropriate modality (e.g., email/ hard copy letter etc.) specifically devoted to those changes (e.g., not together with direct marketing content).” Further, “References in the privacy statement / notice to the effect that the data subject should regularly check the privacy statement /notice for changes or updates are considered not only insufficient but also unfair in the context of Article 5.1(a).” Although the GDPR is silent on timing requirements…

Digital Health Checkup: Key Questions to Consider in the Digital Health Sector

Covington’s global cross-practice Digital Health team has posted an illuminating three-part series on the Covington Digital Health blog that covers key questions entities should be asking as they seek to fit together the regulatory and commercial pieces of the complex digital health puzzle.

In the first part of the series, the Digital Health team answers key regulatory questions about digital health solutions.
In the second part of the series, the Digital Health team considers key commercial questions when contracting for digital health solutions.
In the third part of the series, the Digital Health team answers key regulatory and commercial questions about the Artificial Intelligence (AI), data privacy, and cybersecurity aspects of digital health solutions.

NIST Releases Updated Draft of Cybersecurity Framework

On December 5, 2017, the National Institute of Standards and Technology (“NIST”) announced the publication of a second draft of a proposed update to the Framework for Improving Critical Infrastructure Cybersecurity (“Cybersecurity Framework”), Version 1.1, Draft 2. NIST has also published an updated draft Roadmap to the Cybersecurity Framework, which “details public and private sector efforts related to and supportive of [the] Framework.”

Updates to the Cybersecurity Framework

The second draft of Version 1.1 is largely consistent with Version 1.0. Indeed, the second draft was explicitly designed to maintain compatibility with Version 1.0 so that current users of the Cybersecurity Framework can implement Version 1.1 “with minimal or no disruption.” Nevertheless, there are notable changes between the second draft of Version 1.1 and Version 1.0, including:

Increased emphasis that the Cybersecurity Framework is intended for broad application across all industry sectors and types of organizations. Although the Cybersecurity Framework was originally developed to improve cybersecurity risk management in critical infrastructure sectors, the revisions note that the Cybersecurity Framework “can be used by organizations in any sector or community” and is intended to be useful to companies, government agencies, and nonprofits, “regardless of their focus or size.” As with Version 1.0, users of the Cybersecurity Framework Version 1.1 are “encouraged to customize the Framework to maximize individual organizational value.” This update is consistent with previous updates to NIST’s other publications, which indicate that NIST is attempting to broaden the focus and encourage use of its cybersecurity guidelines by state, local, and tribal governments, as well as private sector organizations.

An explicit acknowledgement of a broader range of cybersecurity threats. As with Version 1.0, NIST intended the Cybersecurity Framework to be technology-neutral. This revision explicitly notes that the Cybersecurity Framework can be used by all organizations, “whether their cybersecurity focus is primarily on information technology (“IT”), cyber-physical systems (“CPS”) or connected devices more generally, including the Internet of Things (“IoT”).” This change is also consistent with previous updates to NIST’s other publications, which have recently been amended to recognize that cybersecurity risk impacts many different types of systems.

Augmented focus on cybersecurity management of the supply chain. The revised draft expanded section 3.3 to emphasize the importance of assessing the cybersecurity risks up and down supply chains. NIST explains that cyber supply chain risk management (“SCRM”) should address both “the cybersecurity effect an organization has on external parties and the cybersecurity effect external parties have on an organization.” The revised draft incorporates these activities into the Cybersecurity Framework Implementation Tiers, which generally categorize organizations based on the maturity of their cybersecurity programs and awareness. For example, organizations in Tier 1, with the least mature or “partial” awareness, are “generally unaware” of the cyber supply chain risks of products and services, while organizations in Tier 4 use “real-time or near real-time information to understand and consistently act upon” cyber supply chain risks and communicate proactively “to develop and maintain strong supply chain relationships.” The revised draft emphasizes that all organizations should consider cyber SCRM when managing cybersecurity risks.

Increased emphasis on cybersecurity measures and metrics. NIST added a new section 4.0 to the Cybersecurity Framework that highlights the benefits of self-assessing cybersecurity risk based on meaningful measurement criteria, and emphasizes “the correlation of business results to cybersecurity risk management.” According to the draft, “metrics” can “facilitate decision making and improve performance and accountability.” For example, an organization can set standards for system availability, and that measurement can be used as a metric for developing appropriate safeguards to evaluate the delivery of services under the Framework’s Protect Function. This revision is consistent with the recently released NIST Special Publication 800-171A, discussed in a previous blog post, which explains the types of cybersecurity assessments that can be used to evaluate compliance with the security controls of NIST Special Publication 800-171.

Future Developments to the Cybersecurity Framework

NIST is soliciting public comments on the draft Cybersecurity Framework and Roadmap through Friday, January 19, 2018. Comments can be emailed to cyberframework@nist.gov.

NIST intends to publish a final Cybersecurity Framework Version 1.1 in early calendar year 2018.

English High Court Finds Supermarket Liable for Data Breach by Employee in First Successful Privacy Class Action

By Joseph Jones and Ruth Scoles Mitchell

On December 1, 2017, the High Court of England and Wales found the fourth-largest supermarket chain in the UK, Wm Morrisons (“Morrisons”), vicariously liable for a data breach caused by the intentional criminal actions of one of its employees, namely the leaking of payroll information online.

The breach affected almost 100,000 Morrisons employees and the action, brought by 5,518 former and current employees, is considered to be the first of its kind in the United Kingdom. The data compromised in the breach included personal data such as names, addresses, and bank account details.

Facts

In March 2014, payroll data relating to almost 100,000 Morrisons employees was disclosed on a file-sharing website by a disgruntled Morrisons employee (“Mr. Skelton”). Mr. Skelton had been entrusted by Morrisons with the data for the purpose of facilitating account auditing. He copied the dataset onto a personal USB drive and posted it to a file-sharing website. He was found to be criminally liable for the breach and was imprisoned for eight years for fraud, securing unauthorized access to data, and disclosing personal data.

A legal action seeking damages on behalf of 5,518 former and current Morrisons employees whose data was leaked was premised on Morrisons being either directly liable or vicariously liable for Mr. Skelton’s acts. The action alleged that Morrisons had committed a breach of statutory duty under the Data Protection Act 1998, among other things.

Direct liability

The High Court held that Morrisons was not directly liable for the breach. The judgment states that where a corporation “is in no sense responsible for authorising or requiring” the breach and the employee is acting against the employer’s wishes in committing the breach, the liability may be vicarious but not direct (para. 49).

Vicarious liability

The High Court ruled that vicarious liability under the Data Protection Act 1998 may be applicable notwithstanding the fact that the Data Protection Act does not expressly refer to it. Citing past case law (Majrowski [2006 UKHL 34]), the High Court held that employers can be vicariously liable for the actions of their employees where an employee commits a breach of statutory obligations, while acting in the course of his employment, unless legislation expressly or impliedly indicates otherwise. Moreover, the High Court reasoned that vicarious liability could further the legislative purpose of the Data Protection Act: to protect the rights of data subjects.

On the facts of the case, the High Court found Mr. Skelton to have been acting “in the course of employment”, adopting a broad interpretation of the scope of employment (consistent with past case law: Bazely v Curry [1999 174 D.L.R. 4th 45], Lister [2001 UKHL 22] and Mohamud [2016 UKSC 11]). Accordingly, Morrisons was held to be vicariously liable.

In addition to the central issue of vicarious liability, the High Court addressed a number of other issues, including:

Security standards. The High Court clarified that the fact that a level of security is available but has not been implemented does not — by itself — amount to a failure to reach an appropriate standard; rather, a balancing test must be applied. The High Court found that Morrisons had violated the security principle of the Data Protection Act 1998 by not having a policy for the deletion of data held outside its normal secure repository. However, that violation neither caused any loss nor enabled Mr. Skelton’s breach. On the facts of the case, therefore, the High Court found that Morrisons did provide “adequate and appropriate [security] controls”.
Employee monitoring. The High Court took the view that routine employee monitoring requires justification on an individual basis. Active monitoring is not the norm in businesses such as Morrisons and may be deemed unnecessary in the context of its business.
Burden of proof. Unhelpfully, the High Court did not resolve the dispute as to the burden of proof. In other words, it remains unclear whether a claimant needs to prove a violation of the Data Protection Act 1998 or whether the defendant needs to prove that its arrangements were appropriate.

Significance

The ruling could have widespread implications for employers and potentially lead to more actions of this kind. The ruling means that employers that may not have directly or actively breached their data protection obligations under UK data protection legislation may nonetheless be held to be vicariously liable for an employee’s acts, notwithstanding that the employee acted independently and that it was not unreasonable for the employer to entrust the employee with the data. Further, this liability is, apparently, not diminished by the fact that the employee’s acts were deliberate and specifically intended to cause harm to the employer (as was the case on the facts for Morrisons and Mr. Skelton).

Interestingly, and at the end of the judgment, the judge indicated that he…

District Court Rejects Consent Revocation Claim Under TCPA

A recent District of New Jersey case emphasizes that while, under the FCC’s 2015 interpretation of the law, a customer has a broad right to revoke consent to receive automated calls and texts under the Telephone Consumer Protection Act (“TCPA”), the manner in which the consumer seeks to revoke his or her consent must be reasonable.

On November 27, 2017, a New Jersey federal judge dismissed a putative class action against Kohl’s, rejecting the plaintiff’s assertion that her sentence-long opt-out replies to automated text message “sales alerts” were reasonable when she was presented with other clear and simple opt-out mechanisms.

The FCC’s rules under the TCPA prohibit a caller from making telemarketing or advertisement calls and texts using an Automatic Telephone Dialing System (“ATDS”) to a mobile telephone number without the “prior express written consent” of the call recipient. In its 2015 Order interpreting the statute, currently on appeal before the D.C. Circuit, the FCC stated that consumers may revoke such consent “through any reasonable means.”

The plaintiff in the New Jersey litigation initially consented to receive automated sales alerts from Kohl’s via text message, but she later attempted to revoke her consent by responding to those messages with messages of her own, including “I’ve changed my mind and don’t want to receive these anymore,” “please do not send any further messages” and “I don’t want these anymore. This is your last warning!” Under the terms and conditions of Kohl’s mobile sales alerts, customers can opt out of receiving future messages by texting back any of the following commands: STOP, CANCEL, QUIT, UNSUBSCRIBE, or END. In response to each of her attempted revocations, the plaintiff received an automated reply that stated in relevant part: “Sorry we don’t understand the request! Reply HELP for help, STOP to cancel.” The plaintiff did not do so.
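As a technical aside for readers who build such systems, the gap between an exact-keyword opt-out handler of the kind described in the opinion and a more forgiving parser can be sketched in a few lines of Python. This is a hypothetical illustration under our own assumptions (the function name, keyword set, and phrase list are ours), not Kohl’s actual implementation:

```python
# Hypothetical SMS opt-out parser. An exact-match system honors only the
# designated keywords; the forgiving fallback below also scans free-text
# replies for those keywords and for common revocation phrases, which is
# one way a caller might reduce the risk of missing a revocation attempt.

OPT_OUT_KEYWORDS = {"STOP", "CANCEL", "QUIT", "UNSUBSCRIBE", "END"}
OPT_OUT_PHRASES = ("do not send", "don't want", "changed my mind")

def is_opt_out(message: str) -> bool:
    text = message.strip().upper()
    # Exact keyword match: the behavior described in the opinion.
    if text in OPT_OUT_KEYWORDS:
        return True
    # Forgiving fallback: a keyword anywhere in the reply...
    words = set(text.replace("!", " ").replace(".", " ").split())
    if words & OPT_OUT_KEYWORDS:
        return True
    # ...or a common natural-language revocation phrase.
    lowered = message.lower()
    return any(phrase in lowered for phrase in OPT_OUT_PHRASES)
```

Under this sketch, both “STOP” and “please do not send any further messages” would register as revocations; the exact-match-only design treats only the former as effective, which is precisely the mismatch at issue in the case.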

Nevertheless, the plaintiff argued that her lengthier responses constituted effective revocation of her consent, and that Kohl’s continued messages violated the TCPA. The plaintiff asserted this claim on behalf of herself and a class she believed to number in the tens of thousands.

In finding that the plaintiff’s actions did not constitute effective revocation of her consent, Judge Brian R. Martinotti cited another portion of the FCC’s 2015 Order:

When assessing whether any particular means of revocation used by a consumer was reasonable, we will look to the totality of the facts and circumstances surrounding that specific situation, including, for example, whether the consumer had a reasonable expectation that he or she could effectively communicate his or her request for revocation to the caller in that circumstance, and whether the caller could have implemented mechanisms to effectuate a requested revocation without incurring undue burdens. We caution that callers may not deliberately design systems or operations in ways that make it difficult or impossible to effectuate revocations.

Rules and Regulations Implementing the Tel. Consumer Prot. Act of 1991, 30 FCC Rcd 7961, 7996 ¶ 64 n.233 (2015). Judge Martinotti concluded that the plaintiff could not have reasonably expected that she could communicate her request for revocation in the manner that she did, given that each time she attempted to do so she received an automated response stating that her message was not understood.

NIST Releases New Draft Publication Designed to Assist Contractors In Assessing Compliance with NIST SP 800-171

By Susan Cassidy, Moriah Daugherty, and Ashden Fein

Ahead of the upcoming December 31, 2017 deadline for federal defense contractors to implement the security controls of National Institute of Standards and Technology (“NIST”) Special Publication 800-171 (“SP 800-171”), NIST has released a new draft publication designed to assist organizations in assessing compliance under SP 800-171, Draft Special Publication 800-171A, Assessing Security Requirements for Controlled Unclassified Information (“CUI”) (“SP 800-171A”).

Currently, there is no regulation or statute that imposes SP 800-171A on contractors. Rather, SP 800-171A is intended as guidance for organizations in developing assessment plans and conducting “efficient, effective, and cost-effective” assessments of the implementation of the security controls required by SP 800-171. Similar to SP 800-171, SP 800-171A does not prescribe specific, required assessment procedures. Instead, it provides a series of “flexible and tailorable” procedures that organizations can use to assess each security control in SP 800-171. SP 800-171A recognizes three distinct assessment methods: examining and interviewing, which facilitate understanding, achieve clarification, or obtain evidence; and testing, which compares actual results with expectations.

Requirements of SP 800-171A:

Following the format of SP 800-171, SP 800-171A groups its assessment procedures by the fourteen families of CUI security control requirements, and highlights how an assessor could examine, interview, or test each particular control at issue. Although SP 800-171A suggests a majority of the controls could be evaluated using all three methods, it does recognize that some of the controls can only be effectively assessed using a subset of the three methods. SP 800-171A also recognizes that organizations may not need to test every control – controls that are not applicable to a particular organization should not be tested in the assessment, but should instead be documented as non-applicable in the organization’s System Security Plan (“SSP”).

Consistent with its recent update to NIST Special Publication 800-53, Security and Privacy Controls for Federal Information Systems and Organizations (“SP 800-53”), in creating this publication, NIST used the term “system” rather than “information system” to reflect that CUI needs to be safeguarded on a broader array of contractor information systems such as industrial and process control systems, cyber-physical systems, and individual devices that are part of the Internet of Things.

Impact on Contractors:

Although there is currently no requirement that defense contractors follow the procedures in SP 800-171A, the draft publication was designed as “a starting point” for organizations to use in developing assessment plans and determining compliance with NIST SP 800-171. In particular, SP 800-171A notes that “[o]rganizations can use the assessment procedures to generate evidence to support the assertion that the security requirements have been satisfied.” Such evidence could be used in a variety of ways, such as the basis for identifying security related weaknesses in a system, as an aid in source selection, or by the Defense Contract Management Agency (“DCMA”) when auditing contractor compliance with Defense Federal Acquisition Regulation Supplement (“DFARS”) clause 252.204-7012.

Attached to SP 800-171A is an appendix that provides supplemental guidance for implementing and assessing the CUI security requirements in SP 800-171. As currently drafted, many of the SP 800-171 security controls are only a sentence or two long. The supplemental guidance is based on the more detailed “security controls in NIST Special Publication 800-53 and is provided to give assessors a better understanding of the mechanisms and procedures used to implement the safeguards employed to protect CUI.” NIST states that this supplemental guidance will be included in the next update to SP 800-171.

As noted in a previous blog post, NIST is in the process of revising SP 800-53, which applies only to federal systems. One of the stated objectives of the revised version, however, is to make SP 800-53’s cybersecurity and privacy standards and guidelines accessible to non-federal and private sector organizations for voluntary use on their systems. As a result, because NIST is incorporating this guidance more explicitly, defense contractors may ultimately see a blurring of some of the requirements of SP 800-171 and SP 800-53.

NIST is seeking comment on draft publication SP 800-171A no later than December 27, 2017. Comments can be emailed to sec-cert@nist.gov.

The Agency Costs of Equal Treatment Clauses

This Essay explores the agency costs associated with equal treatment clauses, which require all share classes to receive equal consideration in the event of an acquisition. Despite these clauses’ benign appearance, they actually create another hurdle to the sale of a controlled company to the potential detriment of minority shareholders.

The Supreme Court Arguments in Carpenter Show that It May Be Time to Redefine the “Third-Party Doctrine”

On Wednesday, the Supreme Court heard oral argument in Carpenter v. United States, a case involving the collection of 127 days of Petitioner Timothy Carpenter’s cell site location information as part of an investigation into several armed robberies. We attended the argument to glean insights into how the Supreme Court may resolve this important case.

The central issue in the appeal is whether the government can access this type and amount of individual location data without a warrant. But an equally important issue is whether the Supreme Court should reevaluate the “third-party doctrine” exception to the Fourth Amendment’s warrant requirement in light of dramatic changes in the way individuals interact with technology in the digital era. The “third-party doctrine” provides that individuals have no expectation of privacy in any information that is voluntarily released to a third party—a mobile-phone provider, cloud service provider, and the like. The Court’s decision will have major implications for technology companies’ ability to protect customer data against warrantless searches by law enforcement officials.

During the extended, 80-minute oral argument, the Justices broadly acknowledged that technology has changed dramatically in the decades since the Court originally recognized the third-party doctrine. Each Justice, however, appeared to place different weight on the import of that change for current legal standards. Justices Kennedy and Alito focused on the information itself, rather than the technology, asking whether location information should be considered more sensitive than the bank records that United States v. Miller permitted law enforcement to access without a warrant, and suggesting that banking information might in fact be the more sensitive of the two.

Justice Breyer, by contrast, described this sort of cellphone location information as “highly personal information,” observing that “on a line . . . it’s somewhat closer to the [protected] diagnostic testing than it is to purely commercial information.” Justice Sotomayor shared this assessment, further noting her discomfort with the broad scope of information cell phones are capable of collecting, and the implications of allowing this sort of information to go unprotected. She stated that she is “not beyond the belief that someday a provider could turn on [her] cell phone and listen to [her] conversations.” Thus, by implication, cell phone information is of a nature that must be protected from law enforcement intrusion; otherwise the government would have open access to monitor individuals’ activities in bed, in the bathroom, and in other intimate, private locations.

Justice Sotomayor repeated her concerns that “[t]he Constitution protects the rights of people to be secure. Isn’t it a fundamental concept . . . that that would include the government searching for information about your location every second of the day?” She noted the desire of the American people to “avoid Big Brother,” and explained that the third party doctrine “was never an absolute rule,” and has been subject to many limitations over the years. Justice Kagan also noted the similarity between this case and United States v. Jones, which found that a warrant was required to put a GPS tracking system on a car (albeit under a trespass theory). She said that the key issue in both is the “reliance on a new technology that allows for 24/7 tracking” and introduces “an altogether new and different thing” to “intrude on people’s expectations of who would be watching them.”

The government contends, however, that these law-enforcement demands are simply asking businesses to provide information about their own transactions with customers, and are not subject to Fourth Amendment protections. The crux of the government’s argument rests on the assumption that individuals voluntarily choose to use cellphones; that cellphones, by their very nature, relay location information to cell towers; and that cell services providers “choose to make their own business records” of that location information without mandate by the government.

Chief Justice Roberts noted that the Court’s decision in Riley v. California has already settled individuals’ lack of meaningful choice in whether to engage with these technologies, but observed that it is an open question whether the information at issue here is sufficiently sensitive to be analogous to protected “content” (requiring a warrant for access) or to non-content “routing information” (which does not require a warrant). Justice Kennedy joked that if he knows that businesses have this cell phone data, everybody does. But other Justices were more reserved in previewing their opinions on whether consumers have a reasonable expectation of privacy in location information. Justice Gorsuch, for example, questioned the government’s conclusion with wariness, saying “it seems like your whole argument boils down to if we get it from a third-party, we’re okay, regardless of property inter..

Natural Rights and the First Amendment

This Article excavates the Founding Era approach to expressive freedom, which was grounded in a multifaceted understanding of natural rights that no longer survives in American constitutional thought. This forgotten history undercuts the Supreme Court’s recent insistence that the axioms of modern doctrine inhere in the Speech Clause itself.