
Comparing AI Tools: ChatGPT, Copilot & Gemini put to the GDPR test

The use of AI tools is a huge trend in the workplace – but with innovation comes responsibility: Data protection and GDPR compliance must be top priorities when companies use artificial intelligence. Anyone who uses AI solutions such as ChatGPT, Copilot, or Google Gemini needs to know how these systems handle personal and sensitive data. The key question is: Which of these AI tools meets European data protection requirements – and where do the risks lie? In the following overview, you’ll learn how ChatGPT, Microsoft Copilot, and Google Gemini perform in terms of GDPR. We’ll examine key topics such as data minimization, transparency, controllability, risk assessment, and accountability – and present DeepSeek as a negative example of GDPR violations.

Data minimization as a fundamental principle  

Data minimization is one of the most important principles of the GDPR: only personal data that is actually necessary for the purpose at hand may be processed. AI tools often require large amounts of data to function, so companies face the challenge of limiting the processing of personal data to what is genuinely needed when using AI.

  • ChatGPT (especially the Enterprise or API versions) allows business customers to specifically restrict web access. The use of user input for training purposes can be contractually excluded. However, completely disabling abuse monitoring is generally not possible, as OpenAI – like all providers – remains legally obligated to detect abuse.
  • Microsoft Copilot accesses Microsoft 365 data, but admins can control data access granularly via the admin center. Microsoft documents this in detail in the Compliance Center and the Data Processing Agreement. Separation of business and training data is standard, and training data is not derived from customer data.
  • Google Gemini in Google Workspace is directly linked to several Google services, from Gmail to Docs. According to the Data Protection Terms and Admin Guide, access control is handled via admin policies. However, these policies must be actively customized to ensure privacy-friendly settings; otherwise, the default configuration grants very broad access.

Practical example: When using AI for contract creation, companies should block access to sensitive HR data or emails. This is the only way to truly implement data minimization.
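How this could look in practice: the short Python sketch below redacts common personal data patterns from a prompt before it leaves the company. The patterns and the minimize() helper are purely illustrative assumptions – real deployments would need broader rules (for example name recognition, which simple regexes cannot provide).

```python
import re

# Illustrative redaction rules - extend these to the data categories
# relevant in your organization (these patterns are assumptions).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d /-]{7,}\d"),
}

def minimize(prompt: str) -> str:
    """Replace personal data with placeholders before the prompt is sent to an AI provider."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

text = "Draft a contract amendment for jane.doe@example.com, phone +49 170 1234567."
print(minimize(text))
# -> Draft a contract amendment for [EMAIL], phone [PHONE].
```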

Transparency in AI data processing  

Transparency requires providers and users to disclose how personal data is processed.

  • ChatGPT provides information about data processing procedures in the Trust Portal. In the Enterprise version, companies can configure the retention period and deletion routines. By default, conversations are stored for a limited period of time. The exact duration and deletion options depend on the selected product variant and the contractual agreements.
  • Microsoft Copilot scores with clear documentation and a Data Processing Addendum (DPA), which regulates the conditions of data processing in detail.
  • Google Gemini provides privacy documentation whose level of detail varies depending on the product variant. The Google Controller Data Protection Terms apply to the Workspace version.

Beware of the black box: Despite improved documentation, the automated decision logic of many AI tools remains difficult for users to understand – so it is important to regularly review data flows.
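A written record of each AI request makes such reviews much easier. The following sketch is a hypothetical example (the log file and field names are assumptions, not any provider's API): it appends one JSON line per request documenting which data categories were sent to which provider and for what purpose.

```python
import json
import datetime

AUDIT_LOG = "ai_data_flows.jsonl"  # hypothetical log location

def log_data_flow(provider: str, purpose: str, data_categories: list[str]) -> None:
    """Append one auditable record per AI request: which data went where, and why."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "provider": provider,
        "purpose": purpose,
        "data_categories": data_categories,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Example: record a contract-drafting request before it is sent.
log_data_flow("ChatGPT Enterprise", "contract drafting", ["company name", "contract terms"])
```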

Current legal dispute over data storage at OpenAI:

In the ongoing legal dispute between OpenAI and the New York Times, the NYT is demanding that OpenAI indefinitely retain all user data, API logs, and output that could relate to NYT content. OpenAI is challenging this court order, arguing that such comprehensive and indefinite retention is technically disproportionate, poses massive data protection risks for all users, and violates its own privacy-by-design philosophy. It emphasizes that chat histories are generally not stored unless users actively consent, and that for enterprise customers with Zero Data Retention (ZDR), no data is stored anyway. The case demonstrates that external legal requirements can also influence data protection practices, and that providers like OpenAI are actively committed to protecting user data.

Controllability of AI tools  

GDPR and best practices require that AI-supported decisions always remain controllable by humans.

  • ChatGPT Enterprise offers admin features for managing conversations and setting security levels. However, for privacy reasons, complete visibility into all user data is not provided; instead, features are available for implementing deletions and access restrictions.
  • Microsoft Copilot integrates with Microsoft 365 rights management and offers extensive control and tracking options.
  • Google Gemini allows admins to control app access and permissions granularly – with complexity increasing as the feature set expands.  

Tip: Company policies should stipulate that critical AI decisions require human review (so-called human-in-the-loop).  
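A minimal sketch of such a human-in-the-loop gate, assuming a simple console workflow (the function and the example decision are hypothetical):

```python
def require_human_review(decision: str, rationale: str) -> bool:
    """Block a critical AI decision until a human explicitly approves it."""
    print(f"AI proposes: {decision}")
    print(f"Rationale: {rationale}")
    answer = input("Approve this decision? [y/N] ")
    return answer.strip().lower() == "y"

# Hypothetical AI output that must not take effect without review.
proposal = "Reject supplier application #4711"
if require_human_review(proposal, "Incomplete compliance documentation"):
    print("Decision executed after human approval.")
else:
    print("Decision escalated for manual processing.")
```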

Risk assessment according to GDPR  

According to Art. 35 GDPR, a data protection impact assessment (DPIA) is mandatory whenever there is a high risk to the rights of data subjects, which is practically always the case with modern AI tools. Typical risks include discriminatory effects due to algorithmic bias, data leaks due to incorrect configuration, misuse of data, and non-transparent decision-making.

  • ChatGPT: Risks arise in particular from the processing of personal data on servers outside the EU. This risk can be mitigated through appropriate contractual arrangements (e.g., standard contractual clauses, data processing addendum) and technical measures. However, full GDPR compliance can only be guaranteed with the consistent implementation of all protective measures.
  • Microsoft Copilot: Thanks to the EU-US Data Privacy Framework and Microsoft's proven security architecture, the conditions are good – provided data access is configured restrictively.
  • Google Gemini: The extensive integration and long retention periods for prompts (up to 18 months) increase the risks, especially for companies with sensitive data.  

Recommendation: Conduct a separate DPIA for each tool and document all identified risks and protective measures taken.  
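How such documentation could be kept in structured form: the sketch below models a simple DPIA risk register as Python dataclasses. The field names and the example entry are illustrative assumptions, not a prescribed GDPR format.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """One identified risk with its mitigations, as required by Art. 35 GDPR."""
    description: str
    severity: str        # e.g., "low", "medium", "high"
    mitigations: list[str] = field(default_factory=list)

@dataclass
class DPIARecord:
    tool: str
    processing_purpose: str
    risks: list[Risk] = field(default_factory=list)

# Example entry mirroring one of the risks named above.
record = DPIARecord(
    tool="Google Gemini",
    processing_purpose="document drafting in Google Workspace",
    risks=[Risk(
        description="Prompts retained for up to 18 months",
        severity="high",
        mitigations=["restrict app access via admin policies",
                     "exclude sensitive data categories from prompts"],
    )],
)
print(f"{record.tool}: {len(record.risks)} documented risk(s)")
```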

Accountability as a central GDPR obligation

The accountability principle under Art. 5 (2) GDPR requires companies to be able to demonstrate data protection compliance – from configuration to daily use.

  • ChatGPT: In the Enterprise version, a separation of usage data and training data is possible, provided this has been contractually agreed. EU server locations are also available, so processing can be restricted to the EU. Admins can delete conversations and adjust certain settings, but control over all data flows must be verified from a technical and organizational perspective.
  • Microsoft Copilot provides tools like Microsoft Purview to secure audit and compliance evidence.  
  • Google Gemini offers auditable logs – these vary depending on the product version selected.  

Practical tip: Control mechanisms should be continuously reviewed and documented through audits.  
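One way to make such audits repeatable is to compare the live configuration against a documented baseline. The keys and values in this sketch are illustrative assumptions – in practice they would come from the respective admin-center export and the DPA.

```python
# Hypothetical baseline derived from your DPA and admin-center settings.
BASELINE = {
    "training_on_customer_data": False,
    "eu_data_residency": True,
    "retention_days_max": 30,
}

def audit_configuration(current: dict) -> list[str]:
    """Return one finding per deviation between live settings and the baseline."""
    findings = []
    for key, expected in BASELINE.items():
        actual = current.get(key)
        if actual != expected:
            findings.append(f"{key}: expected {expected!r}, found {actual!r}")
    return findings

# Example values as they might appear in an admin-center export.
live_config = {
    "training_on_customer_data": False,
    "eu_data_residency": True,
    "retention_days_max": 90,
}
for finding in audit_configuration(live_config):
    print("FINDING:", finding)
```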

The GDPR comparison in detail

ChatGPT
  • Data protection: Strict separation of usage and training data possible (Enterprise)
  • Server locations: USA, partly EU (Enterprise)
  • Control: Restricted access and deletion options for conversations
  • Additional measures: DPIA recommended for personal data; DPA available; Zero Data Retention for corporate customers

Microsoft Copilot
  • Data protection: Per the DPA, data remains in the European Economic Area; separation of business and user data; no training data derived from customer data
  • Server locations: EEA/Switzerland, EU
  • Control: Comprehensive admin and compliance tools
  • Additional measures: DPIA recommended for personal data; DPA mandatory; EU Data Boundary optional; compliance reports available

Google Gemini
  • Data protection: Data protection risks due to integration into multiple Workspace services
  • Server locations: Data processing in selectable regions, including the EU (e.g., Belgium, Germany, Finland, Netherlands, Poland, Spain, Zurich)
  • Control: Numerous admin settings, but default access is extensive
  • Additional measures: DPIA recommended for personal data; DPA mandatory

DeepSeek: A negative example of AI tools under the GDPR  

DeepSeek is an AI language model from China that is considered a prime example of tools that are problematic from a data protection perspective:  

  • The tool stores a wealth of user data, including IP addresses, all keystrokes, and uploaded documents – without providing any transparency about storage locations and data flows.  
  • According to the provider, Chinese authorities can access stored user data; there is no adequacy decision between the EU and China.  
  • DeepSeek does not provide data processing agreements (DPAs) and is not willing to sign them – a serious violation of Articles 28 and 32 GDPR.  
  • Information obligations, purpose limitation, and data security are not fulfilled, and massive data leaks have been documented.  
  • Companies that use it face drastic risks – from fines and data breaches to reputational damage and loss of control over sensitive information.

Conclusion: From a GDPR perspective, the use of DeepSeek is currently not recommended.

Companies should rely exclusively on transparent, certified AI solutions that comply with clear data protection requirements.

What are the best practices for GDPR-compliant use of AI tools?  

  • Develop clear privacy policies and usage rules for AI tools.
  • Train your employees on data protection and AI.
  • Conduct regular audits and risk reviews.
  • Always enter into data processing agreements with providers and monitor compliance with them.

Conclusion: Configure AI tools correctly for compliance

The comparison shows that even powerful AI tools like ChatGPT, Copilot, and Gemini are only GDPR-compliant if they are properly configured and controlled. Particular caution is required with non-European tools (like DeepSeek), as they can violate key GDPR principles.

Recommendation for companies: Review tools carefully, consistently paying attention to data minimization, transparency, control options, and regular risk analyses. This is the only way to use AI innovatively – while still complying with data protection regulations.
