
David Zetoony, Co-Chair of the firm's U.S. Data, Privacy and Cybersecurity Practice, focuses on helping businesses navigate data privacy and cybersecurity laws from a practical standpoint. David has helped hundreds of companies establish and maintain ongoing privacy and security programs, and he has defended corporate privacy and security practices in investigations initiated by the Federal Trade Commission and other data privacy and security regulators around the world, as well as in class action litigation.

Under the GDPR, controllers are required to provide individuals with information relating to what personal data is processed and how that processing takes place. Some supervisory authorities have specifically taken the position that organizations that use personal data to train an artificial intelligence (AI) must draft and publish a privacy notice that provides “data subjects …”

Data is typically needed to train and fine-tune modern artificial intelligence (AI) models. AI can use data—including personal information—to recognize patterns and predict results.

The GDPR permits controllers to process personal information if one (or more) of the following six lawful bases for processing applies:[1]

  1. Consent. A company may process personal information if it collects …

Most modern U.S. state data privacy laws exempt “publicly available information” from their definition of personal information. What constitutes publicly available information differs between state privacy laws and may not correspond to the lay definition understood by many businesses and individuals. For example, while some businesses may consider information that is available on the internet …

Most modern U.S. data privacy statutes require companies to allow data subjects to opt out of having their personal information (PI) used for targeted advertising. As the following chart indicates, the term “targeted advertising” is defined consistently between and among most state statutes, with the notable exception of the California Consumer Privacy Act (CCPA) and …

The term “targeted advertising” is defined relatively consistently between and among modern U.S. data privacy statutes, with the notable exception of California, which deviates somewhat in the California Privacy Rights Act’s (CPRA) definition of the similar term “cross-context behavioral advertising” by omitting any reference to tracking a person over time or making predictions about a …

Probably not.

Under the European GDPR, if the personal information that an organization is going to use as part of training an AI has been collected directly from individuals, then those individuals should be provided with a copy of the organization’s privacy notice “at the time when personal data are obtained.”[1] If the personal …

Attorneys familiar with the European GDPR are acquainted with the bifurcation of the world into controllers and processors. For purposes of European data privacy, a “controller” refers to a company that either jointly or alone “determines the purposes and means” of how personal data will be processed.[1] A “processor” refers to a company (or person) that processes personal data on behalf of the controller.

Categorizing data as “sensitive” is a common feature of U.S. state privacy law, as well as of the EU’s GDPR (which uses the term “special category” for similar personal data).[1] Both what is considered sensitive data and the obligations that come with it vary from state to state. Colorado, Connecticut, Florida, Indiana, Montana, Oregon …

Companies across industries are considering whether, and how, to utilize artificial intelligence (AI). Once developed, an AI can be used in a variety of ways.

Foundational models are large-scale AI models trained on vast amounts of unlabeled data that can be used for a wide variety of tasks. Foundational models typically use self-supervised learning to apply learnings …

As part of developing an AI, computers may be provided with large quantities of data from which patterns and associations can be recognized (“training data”). Training data is often …