Italy fines OpenAI €15 million

Regulation & Security

December 23, 2024

Italy's Data Protection Authority (DPA) has fined OpenAI €15 million for privacy violations and a user data leak. The regulator concluded that the company unlawfully used personal data to train its language models, including the popular ChatGPT. OpenAI also failed to meet its transparency and user notification obligations, which formed the basis for the fine.

Data leak in March 2023

The incident occurred in March 2023, but OpenAI did not disclose the breach promptly. The investigation found that the company failed to properly inform the public, which was one of the main factors leading to the sanctions.

Under European legislation, companies are required to comply with strict standards of privacy and transparency when handling personal data.

Fine and obligations

The fine, which could have reached up to 4% of OpenAI's annual revenue, was reduced because the company cooperated with regulators. However, OpenAI is required to run a six-month campaign informing users about how ChatGPT is trained and how their data is used. The initiative will span radio, television, the internet, and print media to raise awareness of generative AI systems.

As a result of these measures, users should understand how their data is used to train AI and how to opt out if they wish. The obligation is intended to increase transparency around AI and data protection.

What does Poland think about this?

Italy is not the only country paying attention to data protection issues involving OpenAI. In September 2023, Poland launched its own investigation into ChatGPT. This confirms that European regulators are becoming increasingly attentive to ethical and privacy concerns in the field of artificial intelligence.