As the Federal Communications Commission repeals the Open Internet Order—more commonly known as the net-neutrality rules—health care consumers and providers have been left wondering how this change will affect their ability to receive and deliver health care using digital health tools. In this On the Subject, we outline how changes in internet access will affect digital health and what the regulatory landscape will look like in the coming months and years.


On October 20, 2016, the United States Department of Justice Antitrust Division (DOJ) and the Federal Trade Commission (FTC) issued joint Antitrust Guidance for Human Resource (HR) Professionals (the Guidance), directed at HR professionals involved in hiring and compensation decisions. The agencies issued the Guidance to educate HR professionals about how the antitrust laws apply in the employment context.



As we reported in May 2014, the Federal Trade Commission (FTC) convened stakeholders to explore whether health-related information collected from and about consumers — known as consumer-generated health information (CHI) — through use of the internet and increasingly popular lifestyle and fitness mobile apps is more sensitive, and in need of more privacy-protective treatment, than other consumer-generated data.

One of the key questions raised during the FTC’s CHI seminar is: “What is consumer health information?”  Information gathered during traditional medical encounters is clearly health-related.  Information gathered from mobile apps designed as sophisticated diagnostic tools is also clearly health-related — and may even be “Protected Health Information,” as defined and regulated by the Health Insurance Portability and Accountability Act (HIPAA), depending on the interplay between the app and the health care provider or payor community.  Other information, such as diet and exercise data, may be viewed by some as wellness or consumer preference data (for example, the types of foods purchased).  Still other information (e.g., shopping habits) may not look like health information but, when aggregated with other information generated by and collected from consumers, may become health-related information.  Information, therefore, may be “health information,” and may be more sensitive as such, depending on (i) the individual from whom it is collected; (ii) the context in which it is initially collected; (iii) the other information with which it is combined; (iv) the purpose for which the information was initially collected; and (v) the downstream uses of the information.

Notably, the FTC is not the only regulatory body struggling with how to define CHI.  On February 5, 2015, the European Union’s Article 29 Working Party (an EU representative body tasked with advising EU Member States on data protection) published a letter in response to a request from the European Commission to clarify the definitional scope of “data concerning health in relation to lifestyle and wellbeing apps.”

The EU’s efforts to define CHI underscore the importance of understanding it.  The EU and U.S. data privacy and security regimes differ fundamentally: the EU regime broadly protects personally identifiable information, while the United States does not currently provide universal protections for such information.  Instead, the U.S. approach varies by jurisdiction and type of information, and does not uniformly regulate the mobile app industry or the CHI captured by such apps.  These differences make the EU’s struggle to define the precise scope of “lifestyle and wellbeing” data (CHI) and to develop best practices all the more striking: even absent such a definition, the EU privacy regime would offer protections.

The Article 29 Working Party letter acknowledges the European Commission’s work to date, including the European Commission’s “Green Paper on Mobile Health,” which emphasized the need for strong privacy and security protections, transparency – particularly with respect to how CHI interoperates with big data – and the need for specific legislation on CHI-related apps or regulatory guidance that will promote “the safety and performance of lifestyle and wellbeing apps.”  But, in its annex to the Article 29 Working Party letter, the Working Party notes: “due to the wide range of personal data that may fall into the category of health data, this category represents one of the most complex areas of sensitive data and …display[s] a great deal of diversity and legal uncertainty.”  Thus, even within the more protective EU data privacy regime, regulators acknowledge the likely need for specific privacy and security protections in light of the consumer-driven nature of CHI, the myriad mechanisms by which such data is collected and aggregated in the digital landscape, and the difficulty of tracing, tracking and predicting how such data will be aggregated, disaggregated and otherwise used.

As a starting point, the annex to the Article 29 Working Party letter presents a three-part framework for determining when personal data are health data:

  1. “The data are inherently/clearly medical data.
  2. The data are raw sensor data that can be used in itself or in combination with other data to draw a conclusion about the actual health status or health risk of a person.
  3. Conclusions are drawn about a person’s health status or health risk (irrespective of whether these conclusions are accurate or inaccurate, legitimate or illegitimate, or otherwise adequate or inadequate).”

The Annex also notes the importance of obtaining “the unambiguous consent of the data subject,” given that many CHI-related mobile apps collect and process location data and data collected through sensors, which, when combined with other data, could identify a person’s health status.

Back in the United States, the FTC continues to signal its interest in mobile applications that collect and analyze CHI.  On February 23, 2015, the FTC released a pair of consent orders about two different mobile applications, alleging that the apps did not perform as advertised.  Although these consent orders do not expressly address the data privacy implications of the apps, they signal that the FTC is monitoring the representations that apps collecting and using CHI are making to consumers.

As mobile apps become more sophisticated and assist patients and providers with the active detection and management of health conditions, we expect the need for clarity and consensus about reasonable data privacy and protection practices with respect to CHI to intensify.  That need, at least, is something on which both U.S. and EU regulators can agree.

Last month, the Federal Trade Commission (FTC) and the Equal Employment Opportunity Commission (EEOC) issued joint guidance addressing the use of background checks in employment decisions.  The guidance does not offer new requirements related to background checks, but rather serves as a reminder to employers of their obligations under federal law when they use background checks, and creates a user-friendly guide to applicants and employees regarding their rights with respect to background checks.

The guidance consists of two documents – one for employers, “Background Checks: What Employers Need to Know,” and one for applicants and employees, “Background Checks: What Job Applicants and Employees Should Know.”  The first document, “What Employers Need to Know,” offers guidance to employers on their existing legal obligations under the Fair Credit Reporting Act (FCRA), a federal law enforced by the FTC, and federal non-discrimination laws enforced by the EEOC.  The document reminds employers that, under the FCRA, they must obtain written permission from job applicants and employees before conducting a background check, and must notify applicants and employees that background reports may be used to make decisions about employment.  In addition, the agencies reaffirm that employers must not discriminate based on a person’s race, color, national origin, sex, religion, age (40 or older) or disability when requesting or using background information for employment.  Finally, the guidance discusses the requirements related to the retention, preservation and disposal of personnel or employment records.

The second document, “What Job Applicants and Employees Should Know,” describes applicants’ and employees’ rights under federal law when an employer conducts background checks. The agencies remind applicants and employees that it is lawful for potential employers to ask about applicants’ or employees’ backgrounds or require a background check, as long as the employer does not unlawfully discriminate.  The guidance also states that employers must not ask for medical information until they offer an applicant a job, and can only ask for genetic information under limited circumstances (for example, when an employer offers health or genetic services as part of a voluntary wellness program, or if the information is required to comply with the Family and Medical Leave Act).  Finally, the guidance explains that when applicants have been turned down for a job or denied a promotion based on information in their background reports, they have the right to review the report for accuracy.

This marks the first time the two agencies have jointly issued guidance, suggesting that both have a vested interest in enforcing the laws related to employer use of background checks and signaling to employers that both agencies consider this topic a priority.  Employers should consider reviewing the new guidance and ensuring that their policies and practices with respect to background checks comply with federal law, as well as applicable state and local law.

by Jorge R. Arciniega, Elisabeth Malis Morgan and Heather Egan Sussman

The U.S. Federal Trade Commission (FTC) released updated guidance on how to make online advertising and marketing disclosures “clear and conspicuous” to avoid consumer deception.  The guidelines affect the structure and format of digital advertisements and of marketing initiatives such as endorsements and testimonials.


by Daniel F Gottlieb, Heather Egan Sussman and Randall J. Ortman

A new Federal Trade Commission report urges mobile app platforms and developers to better inform consumers about their privacy practices. Mobile app platforms and developers should review their privacy policies to ensure accuracy, transparency and an appropriate level of consumer choice.


For more information, please contact Heather Egan Sussman, Daniel F. Gottlieb or Rohan Massey.

Privacy and data protection continue to be a rapidly expanding area of focus for regulators in the United States and beyond.  This Special Report gives in-house counsel and others responsible for privacy and data protection an overview of some of the major developments in this area around the globe in 2012, as well as a prediction of what is to come in 2013.


by Jennifer S. Geetter, Heather Egan Sussman and Carla A. R. Hine

We recently released a Hot Topic that details the Federal Trade Commission’s (FTC) settlement with Spokeo, Inc.  Spokeo collected information about individuals from online and offline sources to create profiles that included contact information, marital status, age range and, in some cases, a person’s hobbies, ethnicity, religion, participation on social networking sites and photos that Spokeo attributed to a particular individual.  Spokeo marketed these profiles to companies in the human resources, background screening and recruiting industries as information to serve as a factor in deciding whether to interview or hire a job candidate.  The FTC concluded that Spokeo acted as a consumer reporting agency and thus violated the Fair Credit Reporting Act (FCRA) by (1) failing to ensure the consumer reports it sold were used for legally permissible purposes; (2) failing to ensure that the information it sold was accurate; and (3) failing to inform users of Spokeo’s consumer reports of their obligations under the FCRA.  Spokeo agreed to pay $800,000 and to comply with the FCRA going forward, among other things.

There is an important message for employers in this settlement: If you receive profile information from data brokers and use that information in making employment decisions, the FCRA applies.  And while this enforcement action focused on the data broker, the FTC could turn next to offending employers.  The FTC has published guidance on how to comply with the FCRA and avoid an enforcement action in these circumstances: Using Consumer Reports: What Employers Need to Know.  Employers should also check applicable state laws, because some states restrict the use of such reports for employment purposes.