Hidden GDPR Risks in AI Tools
A US federal court recently ordered a fraud defendant to disclose 31 documents generated through an AI chatbot, ruling that the conversations carried no privilege protection. The attorney-client privilege question is a US procedural issue and falls outside the CIPP/E exam. But the underlying fact matters for every candidate studying European data protection: AI chat logs are ordinary, discoverable business records, and business records routinely contain personal data. The moment they do, the controller's GDPR obligations follow. For organisations whose staff paste client names, contract details or employee information into public AI tools every day, GDPR compliance for AI tools is not a theoretical exercise. This article walks through the obligations that apply, mapped to the IAPP's Body of Knowledge (BoK) for the CIPP/E certification, which defines every domain and topic the exam covers.
Purpose Limitation and Lawful Basis
Article 5(1)(b) of the GDPR requires that personal data be collected for specified, explicit and legitimate purposes and not processed further in a manner incompatible with those purposes. Article 6 requires a lawful basis for every processing activity. These two provisions sit at the core of BoK Domain III, which covers European data processing principles (III.A.1) and lawful processing bases (III.B.1).
Consider a straightforward scenario. A staff member receives a client's contract summary by email, pastes it into a public AI chatbot to draft a reply and sends the output back to the client. In doing so, the controller has introduced a new processing purpose that was almost certainly not covered by the original privacy notice or the lawful basis under which the data was collected. The personal data in the contract summary is now being processed by a third-party platform for a purpose the data subject was never informed of. If the original lawful basis was contractual necessity, that basis does not extend to AI-assisted drafting on an external platform. If it was legitimate interest, no balancing test was conducted for this new activity.
The exam tests whether candidates can identify this chain of failures. The controller's obligation is not simply to have a lawful basis; it is to ensure that the basis and the stated purpose still hold when processing moves to a new channel such as a public AI tool.
Controller, Processor, or Neither?
Article 28 sets out the requirements for relationships between controllers and processors. BoK section II.B.3 expects candidates to understand effective and responsible vendor management, and IV.B.1 addresses the accountability requirements of controllers and processors under Article 24.
When an employee uses a public AI tool, the employer remains the controller of any personal data entered into the platform. The AI vendor is processing that data. Under Article 28, the controller must have a data processing agreement (DPA) in place with any processor, specifying the subject matter, duration, nature and purpose of processing, the types of personal data and categories of data subjects, and the obligations and rights of the controller. Most consumer-grade AI tools do not offer a DPA that meets Article 28 requirements. Their standard terms of service typically reserve the right to use input data for model training, safety review and legal compliance, none of which are purposes the controller has authorised.
Without a compliant DPA, the controller is processing personal data through an unauthorised processor. That is an accountability failure under Article 24, and exactly the kind of fact pattern the CIPP/E exam is well positioned to test. Candidates should note that the 2024 CIPP/E curriculum updates added emphasis on AI and emerging technologies precisely because this pattern of failure is so widespread.
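The Article 28(3) minimum DPA contents work naturally as a vendor-screening checklist. The sketch below is illustrative only: the clause labels paraphrase Article 28(3)(a)-(h) in our own shorthand, and the "consumer terms" set is a hypothetical vendor, not any real product's terms.

```python
# Sketch: screening a vendor's terms against the Article 28(3) minimum DPA contents.
# Clause labels paraphrase Art. 28(3)(a)-(h); the vendor data is hypothetical.

ARTICLE_28_3_CLAUSES = {
    "process_only_on_documented_instructions",  # Art. 28(3)(a)
    "confidentiality_commitment",               # Art. 28(3)(b)
    "article_32_security_measures",             # Art. 28(3)(c)
    "sub_processor_conditions",                 # Art. 28(3)(d)
    "assist_with_data_subject_rights",          # Art. 28(3)(e)
    "assist_with_security_breach_and_dpia",     # Art. 28(3)(f)
    "delete_or_return_data_at_end",             # Art. 28(3)(g)
    "audit_and_inspection_rights",              # Art. 28(3)(h)
}

def missing_dpa_clauses(vendor_clauses: set[str]) -> set[str]:
    """Return the Article 28(3) items the vendor's terms do not cover."""
    return ARTICLE_28_3_CLAUSES - vendor_clauses

# A hypothetical consumer-tier AI tool covering only two of the eight items:
consumer_terms = {"confidentiality_commitment", "article_32_security_measures"}
gaps = missing_dpa_clauses(consumer_terms)
print(len(gaps))  # 6 items uncovered -> no Article 28-compliant DPA
```

Any non-empty result means the terms cannot serve as a compliant DPA, which is the situation most consumer-grade tools leave the controller in.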
DPIA and Records of Processing
Two obligations catch most organisations off guard when AI tool usage scales beyond a handful of individuals.
When Does AI Tool Usage Trigger a DPIA?
Article 35 requires a data protection impact assessment when processing is likely to result in a high risk to the rights and freedoms of individuals. The EDPB's guidelines identify nine criteria for assessing whether a DPIA is needed; meeting two or more generally triggers the obligation. BoK section IV.B.3 expects candidates to understand both the role of DPIAs and the criteria for conducting them.
Staff use of public AI tools can engage several of these criteria simultaneously. When employees across departments routinely enter client data, employee data and commercial information into the same AI platform, the cumulative processing involves innovative technology, large-scale data collection and possible evaluation or profiling. Each individual chat session may appear low-risk in isolation. Aggregated across dozens of staff and hundreds of daily interactions, the risk profile changes substantially.
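The "two or more criteria" screening rule from the guidelines can be expressed as a simple check. This is an illustrative sketch, not an official tool or legal advice: the criterion names paraphrase the nine criteria from the WP29/EDPB DPIA guidelines, and the example criteria set mirrors the aggregated scenario described above.

```python
# Sketch of the WP29/EDPB "two or more of nine criteria" DPIA screening rule.
# Criterion names paraphrase the guidelines; illustration only, not legal advice.

EDPB_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decisions_with_legal_effect",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_exercise_of_rights",
}

def dpia_likely_required(criteria_met: set[str]) -> bool:
    """Meeting two or more of the nine criteria generally triggers a DPIA."""
    unknown = criteria_met - EDPB_CRITERIA
    if unknown:
        raise ValueError(f"Unrecognised criteria: {unknown}")
    return len(criteria_met) >= 2

# The aggregated AI-tool scenario from the text:
ai_tool_usage = {"innovative_technology", "large_scale_processing", "evaluation_or_scoring"}
print(dpia_likely_required(ai_tool_usage))  # three criteria met -> True
```

Note how the aggregation matters: a single chat session might engage one criterion at most, while the organisation-wide pattern engages three.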
Why the Processing Register Must Include AI
Article 30 requires controllers to maintain records of processing activities. BoK section IV.B.2 covers documentation and cooperation with regulators. If AI tool usage is not recorded in the processing register, the organisation cannot demonstrate compliance when a supervisory authority asks how personal data flows through AI channels. The register must include AI tool usage even when no individual interaction seems significant. The aggregate is what triggers supervisory attention, and GDPR compliance for AI tools depends on having that documentation in place before the question is asked.
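A register entry for AI-assisted drafting might capture the controller-side fields as follows. This is a minimal sketch: the field names are our own shorthand for the items Article 30(1) lists, and every value (tool description, retention note, security measures) is hypothetical.

```python
# Illustrative Article 30(1) record for AI-assisted drafting.
# Field names paraphrase the Art. 30(1) items; all values are hypothetical.
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    purpose: str                        # Art. 30(1)(b) - purposes of processing
    data_subject_categories: list[str]  # Art. 30(1)(c)
    data_categories: list[str]          # Art. 30(1)(c)
    recipients: list[str]               # Art. 30(1)(d)
    third_country_transfers: list[str]  # Art. 30(1)(e)
    retention: str                      # Art. 30(1)(f) - envisaged erasure time limits
    security_measures: list[str]        # Art. 30(1)(g)

ai_drafting = ProcessingRecord(
    purpose="AI-assisted drafting of client correspondence",
    data_subject_categories=["clients", "employees"],
    data_categories=["names", "contract details", "contact data"],
    recipients=["external AI platform vendor (processor)"],
    third_country_transfers=["vendor hosting outside the EEA - to be verified"],
    retention="per vendor terms - review against internal retention policy",
    security_measures=["enterprise tier with DPA", "staff guidance on permitted inputs"],
)
```

The point is not the data structure but the discipline: a single entry like this is what lets the controller answer a supervisory authority's question about AI data flows.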
What Would the Exam Ask Here?
CIPP/E scenario questions test whether candidates can connect GDPR provisions to operational facts. Three plausible question stems from this topic:
- A marketing team uses a public AI chatbot to draft customer emails, pasting client names and purchase history into the tool. Which GDPR principle is most directly at risk under Article 5(1)(b)?
- An organisation's staff use a consumer AI platform that does not offer a data processing agreement meeting Article 28 requirements. What is the controller's primary accountability obligation under Article 24?
- A company's employees use a public AI tool across five departments for routine document drafting. Under EDPB guidance, which combination of criteria is most likely to trigger a DPIA under Article 35?
Each tests whether the candidate can trace a real-world AI scenario back to the correct GDPR provision and the matching BoK domain.
Map Your Own AI Tool Usage
Review how your organisation handles AI tools today. Has the processing register been updated to include AI-assisted activities? Has a DPIA been conducted for systematic AI tool usage across departments? Does a data processing agreement meeting Article 28 standards exist with every AI vendor handling personal data? If any of these questions is open, start there. Explore the CIPP/E study resources at 22academy.com/study and test yourself against the Friday Perplexity prompt on this same topic in the study groups.