Ireland’s Data Protection Commission Initiates Investigation into X Regarding AI Training Practices

Inquiry into X’s Use of Data for AI Model Training

Background of the Investigation

Ireland's Data Protection Commission (DPC) has launched an inquiry into X, the social media platform owned by Elon Musk. The investigation centers on how X processes the posts of European users to train Grok, its artificial intelligence (AI) model. The DPC aims to determine whether X complies with its obligations under the General Data Protection Regulation (GDPR), with particular attention to the lawfulness and transparency of its data handling practices.

The Impact of the EU’s AI Act

The European Union’s AI Act, which came into force last year, sets out rules and obligations for technology companies and has caused unease among tech executives, reflecting growing concern over how companies use data, particularly personal data. Following the DPC’s initiation of the inquiry, X has temporarily paused the use of this data for Grok training.

What is Grok?

Developed by xAI, Grok comprises a family of AI models, most notably Large Language Models (LLMs). These LLMs power a generative AI querying tool available on the X platform, designed to enhance user interaction through AI-driven responses. The models are trained on a wide range of data sets, some of which include publicly available user posts.

Focus of the DPC Investigation

The DPC’s investigation targets a specific subset of data controlled by X: personal data contained in publicly accessible posts by users located in the European Union and European Economic Area (EU/EEA). The central question is whether using this personal data to train Grok’s LLMs complies with the legal standards set out for data protection.

In April 2025, the Irish DPC formally notified X of the inquiry, which was opened under Section 110 of Ireland's Data Protection Act 2018. The inquiry is significant given the sensitive nature of the personal data involved and the implications for both users and the platform.

Similar Investigations Worldwide

This isn’t the first time X has faced scrutiny over its data practices. In February 2025, Canada’s privacy commissioner opened an investigation into X over its use of data for AI training. The Office of the Privacy Commissioner of Canada is examining whether X violated privacy law by using Canadians’ personal information to train its AI models.

The probe, conducted under the Personal Information Protection and Electronic Documents Act (PIPEDA), seeks to determine whether X followed federal privacy law in its collection, use, and disclosure of Canadians’ personal data.

Summary of Key Points

  • Initiating Body: The Data Protection Commission in Ireland has opened an inquiry into X over data usage for AI training.
  • Legal Framework: The inquiry focuses on compliance with the GDPR, against the backdrop of the EU’s AI Act.
  • Nature of Grok: Grok includes various AI models, particularly LLMs, used within X’s platform.
  • Data Scope: Investigation will scrutinize personal data from publicly accessible posts by EU/EEA users.
  • International Matters: X is also being examined by Canada’s privacy regulator for its data practices.

The ongoing investigations highlight core issues of user privacy, legal compliance, and the broader obligations of technology companies operating in today’s data-driven landscape. As the inquiry progresses, its outcome will affect not only X but could also set important precedents for other technology firms in similar situations.
