
ChatGPT faces legal problems in Europe

Jesus Carames

April 7, 2023 | 6:05 a.m.

ChatGPT, the language model developed by OpenAI, is at the center of a privacy dispute in Europe. The Italian data protection authority has issued a temporary emergency decision requiring OpenAI to stop using the personal information of millions of Italian citizens included in ChatGPT's training data. The case could set a precedent throughout Europe and jeopardize the use of personal information in artificial intelligence models.

Regulatory action in Italy

On March 31, Italy's data regulator, the Garante per la Protezione dei Dati Personali, issued an emergency decision requiring OpenAI to cease using the personal information of millions of Italians in ChatGPT. According to the regulator, OpenAI has no legal right to use that information in its chatbot.

In response, OpenAI has blocked access to ChatGPT in Italy while it answers the officials investigating the case. The Garante's decision is the first action taken against ChatGPT by a Western regulator, and it highlights tensions around privacy in the creation of generative AI models, which are often trained on vast amounts of data scraped from the internet.

Potential impact in Europe

The decision taken in Italy could have repercussions across Europe. Since the investigation was announced, data regulators in France, Germany and Ireland have contacted the Italian Garante for more information on its findings. Tobias Judin, head of the Norwegian Data Protection Authority, argues that if a model is built on data that may have been harvested illegally, this raises questions about whether anyone can legally use tools based on that data.

Increased scrutiny over AI models

The blow to OpenAI in Italy comes at a time when scrutiny of artificial intelligence models is on the rise. On March 29, technology leaders called for a pause in the development of systems like ChatGPT, concerned about their possible future implications. Judin says the Italian decision highlights more immediate concerns: that the development of artificial intelligence may have significant shortcomings in relation to privacy and data protection.

ChatGPT's problems under the GDPR

Europe's General Data Protection Regulation (GDPR) protects the personal information of more than 400 million people on the continent. The Italian authority maintains that ChatGPT has four problems under the GDPR: OpenAI has no age controls to prevent children under 13 from using the text-generation system; the chatbot may provide inaccurate information about individuals; people have not been informed that their data was collected; and, the regulator claims, there is no "legal basis" for collecting the personal information contained in the large volumes of data used to train ChatGPT.
