Italy Says ChatGPT Violates Data Protection Rules

Italian data protection regulator says investigation found OpenAI’s ChatGPT violates EU’s GDPR privacy rules

Italy’s data protection regulator has informed OpenAI that its ChatGPT artificial intelligence (AI) chatbot may violate EU data protection rules.

The Garante regulator said OpenAI has 30 days to reply with defence briefs on the alleged violations, details of which were not disclosed.

OpenAI said it believes its practices are in compliance with EU data protection laws.

The regulator said it would take into account the ongoing work of a special task force set up by the European Union’s European Data Protection Board in April 2023 to monitor ChatGPT. The chatbot has become hugely popular since OpenAI launched it in November 2022.

The Board’s task force brings together data protection authorities from across the bloc.

Privacy concerns

Italy’s notice to OpenAI, called a “notice of objection”, comes after the country temporarily suspended ChatGPT in March 2023, becoming the first Western country to do so.

At the time the Garante cited privacy concerns including the mass collection of users’ data to train the chatbot’s algorithm.

The regulator also expressed concern that younger users could be exposed to inappropriate content, since the chatbot included no age-verification mechanism.

ChatGPT was restored to operation in Italy after about four weeks, with the Garante saying OpenAI had “addressed or clarified” the issues it had raised.

At the time the regulator launched a “fact-finding activity” which has now concluded that ChatGPT may violate the EU’s GDPR data protection regulation.

The Garante said in a statement that it “concluded that the available evidence pointed to the existence of breaches of the provisions contained in the EU GDPR”.

The regulation allows authorities to fine companies up to 4 percent of their global annual turnover.

In April 2023 the Garante said it wanted OpenAI to implement an age-verification system for ChatGPT and wanted Italians to be made more aware of their right to opt out of having their personal data used to train algorithms.

Data protection

“We believe our practices align with GDPR and other privacy laws, and we take additional steps to protect people’s data and privacy,” OpenAI said in a statement.

The company said it “actively” works to reduce personal data in training systems such as ChatGPT, “which also rejects requests for private or sensitive information about people”.

The company said it would continue talks with the Garante.

ChatGPT is also under investigation by the US Federal Trade Commission (FTC) over concerns about the AI’s tendency to generate false results.