The term “targeted advertising” is defined relatively consistently among modern U.S. data privacy statutes, with the notable exception of California, which deviates somewhat in the California Privacy Rights Act’s (CPRA) definition of the similar term “cross-context behavioral advertising” by omitting any reference to tracking a person over time or making predictions about a

On Aug. 9, 2023, a tutoring company agreed to pay $365,000 to settle an artificial intelligence (AI) lawsuit with the Equal Employment Opportunity Commission (EEOC). The settlement comes on the heels of multiple EEOC warnings to employers about potential discrimination associated with the use of AI for hiring and workplace decisions.


Data is typically added to an AI to explain a problem, situation, or request (“input data”). Some AI providers, particularly those that provide natural language or large language models, refer to “prompts” as a subset of input data that describes the instructions that have been provided to the AI model (e.g., “please summarize the following

The right of correction (sometimes called the “right of rectification”) refers to a person’s ability to request that a company fix any inaccuracies in the personal data it holds about them.[1] Correction is sometimes referred to as an absolute right in the context of the GDPR, because unlike some other rights conferred by the

Under the GDPR, controllers are required to provide information relating to what personal information they process and how that processing takes place.[1] Data is typically needed to train and fine-tune modern artificial intelligence models. If that training data contains personal information, an organization is required to include a description of that processing in its

Following a California Superior Court’s last-minute ruling that stayed enforcement of the revised California Consumer Privacy Act (CCPA) regulations, as previously discussed on this blog, California’s data privacy regulators have responded in ways that confirm they are more committed than ever to holding businesses accountable for alleged violations

Under the GDPR, controllers are required to provide individuals with information relating to what personal information is processed and how that processing takes place.[1] Some supervisory authorities have specifically taken the position that companies that use personal information to train an artificial intelligence (AI) must draft and publish a privacy notice that provides “data

The term “data minimization” generally refers to two requirements within the GDPR: (1) a company should only collect personal data that is “necessary” in relation to its purpose, and (2) a company should keep data for “no longer than is necessary for [that] purpose[].”[1] Put differently, a company should only collect what it needs

On April 17, 2023, the Washington State Legislature passed the “My Health My Data Act” (WMHMDA or the Act).* Unlike other modern state privacy laws that purport to regulate any collection of “personal data,” WMHMDA confers privacy protections only upon “Consumer Health Data.” That term is defined to include data that is linked (or linkable)