
Europe’s General Data Protection Regulation (GDPR) allows individuals to request that their information be deleted in the following situations:[1]

  1. Companies must delete data upon request if the data was processed based solely on consent. The GDPR recognizes that companies may process data based on six alternative lawful grounds.[2] One of these is where a person has given consent to the processing for a specific purpose.[3] If a company’s sole basis for processing data to train an AI is the consent of individuals, the company typically must honor an erasure request, which for all practical purposes may be viewed as a revocation of that consent. Conversely, if processing is also based on another permissible ground, an erasure request does not necessarily have to be granted.
  2. Companies must delete data upon request if the data was processed based upon the controller’s legitimate interest, and that interest is outweighed by the individual’s rights. One of the other grounds upon which a company can process data is to further the company’s “legitimate interest.” When training an AI is based upon a company’s legitimate interest, an individual has a right to request erasure unless the interest of a controller or a third party is demonstrably “overriding.”[4] 
  3. Companies must delete data upon request if data is being processed unlawfully. The GDPR states that an erasure request must be honored if the processing of personal information is (or has become) unlawful.[5] Here, too, the obligation to honor an erasure request may be redundant with other obligations within the GDPR. Put differently, if a company is complying with the other requirements of the GDPR, its processing would presumably be lawful, and there may be few, if any, situations in which a right to be forgotten request would require that the company take any additional action. Framing this as an individual’s right, however, opens up an additional source of civil liability for the company toward the individual.
  4. Companies must delete data upon request if erasure is already required by law. The GDPR states that a right to be forgotten request must be honored if the data is required to “be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject.”[6] This requirement also appears redundant with other legal obligations. If a company is required to erase data pursuant to another Union or Member State law and is complying with that requirement, there may be few, if any, situations in which a right to be forgotten request would necessitate additional action.
  5. Companies must delete data upon request if it is collected from a child as part of offering an information society service. The GDPR requires the deletion of information when requested where the information was “collected in relation to the offer of information society services” to children under 16.[7]

In the context of AI, some supervisory authorities have suggested that if a company uses publicly sourced data to train an AI (e.g., data scraped from the internet), the only plausible lawful bases would be either (1) the consent of the individuals whose personal information is being processed or (2) the legitimate interest of the controller.[8] As discussed above, if processing is based either on consent or on legitimate interest, individuals must be given a right to request that their information be deleted.

Note that information does not always need to be deleted simply because an erasure request has been made. For example, a company can choose to decline an erasure request if honoring it would interfere with a legal obligation imposed on the company to maintain the data, or if the data is needed to establish, exercise, or defend a legal claim.[9]
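The interaction between the erasure grounds and the exceptions described above can be illustrated with a short sketch. This is purely illustrative, not legal advice: the ground and exception names are hypothetical labels keyed to the Article 17 provisions cited in the notes, and a real assessment (e.g., whether a legitimate interest is “overriding”) requires case-by-case legal analysis that no simple rule can capture.

```python
# Illustrative sketch only: models the Article 17 structure discussed above.
# All identifiers are hypothetical labels, not GDPR terms of art.

# Grounds on which an individual may demand erasure (Art. 17(1)).
ERASURE_GROUNDS = {
    "consent_withdrawn",      # processing based solely on consent, now revoked
    "legitimate_interest",    # interest outweighed by the individual's rights
    "unlawful_processing",    # processing is or has become unlawful
    "legal_obligation",       # erasure already required by Union/Member State law
    "child_iss_data",         # data collected from a child for an online service
}

# Exceptions permitting the company to decline the request (Art. 17(3)).
ERASURE_EXCEPTIONS = {
    "legal_retention_duty",   # a law requires the company to keep the data
    "legal_claims",           # data needed to establish/exercise/defend a claim
}

def must_erase(grounds: set, exceptions: set) -> bool:
    """True if at least one erasure ground applies and no exception does."""
    return bool(grounds & ERASURE_GROUNDS) and not (exceptions & ERASURE_EXCEPTIONS)
```

For example, `must_erase({"consent_withdrawn"}, set())` yields `True`, while the same request fails once `"legal_claims"` appears among the exceptions, mirroring the point that an erasure request does not always have to be granted.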


[1] Requests for deletion are referred to interchangeably as “deletion requests” and “erasure requests.”

[2] GDPR, Article 6(1)(a)-(f).

[3] GDPR, Article 6(1)(a).

[4] GDPR, Article 17(1)(c).

[5] GDPR, Article 17(1)(d).

[6] GDPR, Article 17(1)(e).

[7] GDPR, Article 17(1)(f); Article 8(1).

[8] Garante Per La Protezione Dei Dati Personali, Provision of April 11, 2023[9874702] (English translation).

[9] GDPR, Article 17(3)(b), (e).



David Zetoony, Co-Chair of the firm’s U.S. Data, Privacy and Cybersecurity Practice, focuses on helping businesses navigate data privacy and cyber security laws from a practical standpoint. David has helped hundreds of companies establish and maintain ongoing privacy and security programs, and he has defended corporate privacy and security practices in investigations initiated by the Federal Trade Commission, and other data privacy and security regulatory agencies around the world, as well as in class action litigation.



Carsten Kociok is a partner in the Technology, Financial Services and Data Privacy Practice in Berlin and Co-Head of Greenberg Traurig’s global Fintech Group. He advises national and international clients across all industries, including financial services, information technology, artificial intelligence, ecommerce, media, health care, telecoms, retail and real estate, on a wide variety of complex commercial and regulatory matters.

Carsten is a leading technology lawyer, ranked consistently in Band 1 for Fintech Legal in Germany since 2020. He has in-depth and wide-ranging experience in the areas of privacy and cybersecurity, payments law, financial services, e-money products, blockchain technology, and financial and banking regulation, as well as in artificial intelligence regulation – including compliance with the EU AI Act – and the integration of AI technologies into existing software systems.

Carsten regularly assists clients in licensing projects and audit proceedings with financial regulators and advises on the contractual and regulatory aspects of developing, implementing and operating financial technology products and transactions.

On the data privacy side, Carsten counsels clients on complex data-driven business models and regulatory matters, including on international data transfers, data privacy compliance, monetization of data, artificial intelligence, litigation, cybersecurity and data breach response.

Carsten regularly lectures and publishes on various FinTech and data privacy topics. Prior to joining the firm, Carsten worked at Olswang Germany for eight years and in the Capital Transaction Practice Group of an international law firm in New York.