Hidden GDPR Test for AI Images

On 7 May 2026 Council and Parliament negotiators reached a provisional deal on the Digital Omnibus amending the AI Act. The headline is a new Article 5 prohibition on AI systems producing non-consensual sexually explicit imagery or child sexual abuse material; compliance is required by 2 December 2026. CIPP/E candidates who stop reading miss the harder exam point: the AI Act ban is narrow, the GDPR is wide. AI-generated images of an identifiable real person sit inside the GDPR whether the AI Act prohibition bites or not.

When AI-generated images become personal data

Article 4 of the GDPR defines personal data as any information relating to an identified or identifiable natural person. That definition does not ask how the information was produced; it asks whether the person can be identified from it. The instant a generative model outputs an image of a recognisable real person, that output is personal data, and every controller obligation under the GDPR attaches to it.

The "synthetic is not personal" trap

A common mistake treats AI-generated images as something other than personal data because the content is "synthetic". Synthetic content can be perfectly identifiable, and identifiability is the test the GDPR sets. The same logic catches lookalikes built from prompts that name an individual, and composites that combine features from several real people into something the public still recognises. CIPP/E scenarios on digital humans turn on this point.

Biometric features and the special-category step

Once a system processes uniquely identifying biometric features, a second GDPR layer kicks in. Article 9 treats biometric data processed to uniquely identify a natural person as a special category, and such processing is prohibited unless one of the conditions in Article 9(2) applies. In commercial generative-image contexts the realistic condition is explicit consent.

The EDPB Guidelines 05/2022 on facial recognition technology sit in a law-enforcement frame, yet their analysis of when biometric data crosses into Article 9 territory is what the CIPP/E exam expects candidates to apply more widely. A provider fine-tuning on someone's face, or using facial-feature extraction to make AI-generated images resemble a target person, processes biometric data for identification. The Article 9(2) gate has to open before Article 6 even matters.

Lawful basis and transparency for AI-generated images

Article 6 sets the lawful bases for ordinary processing. For AI-generated images the realistic options are consent, contract or legitimate interests, but only consent travels cleanly across to Article 9(2) when biometric identification is in play. Legitimate interests is the basis providers love to argue; the right CIPP/E answer is usually to flag why the balancing test fails on intrusiveness and reasonable expectations. Where consent is the chosen basis, candidates should apply the full standard: freely given, specific, informed, unambiguous and demonstrable.

For AI-generated images the exam stacks this as two questions. Is there a valid Article 6 basis? And where biometric identification applies, is there an Article 9(2) condition? An answer option supplying only the first is incomplete by design.
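The two-question stack can be sketched as a minimal decision check. This is an illustrative study aid, not a legal tool: the function, the flag names and the reduction of Article 9(2) to explicit consent (the realistic condition in commercial contexts, per the discussion above) are all simplifying assumptions.

```python
# Hedged sketch of the two-gate check for AI-generated images of a real
# person: an Article 6 basis alone is incomplete by design whenever
# biometric identification is in play.

ART6_BASES = {"consent", "contract", "legitimate_interests"}
ART9_CONDITIONS = {"explicit_consent"}  # simplified to the realistic commercial condition

def lawful_basis_check(art6_basis, biometric_identification, art9_condition=None):
    """Return (lawful, reason). Both gates must open when the system
    processes biometric features to identify the person depicted."""
    if art6_basis not in ART6_BASES:
        return False, "no valid Article 6 basis"
    if biometric_identification and art9_condition not in ART9_CONDITIONS:
        return False, "Article 9(2) condition missing for biometric identification"
    return True, "lawful"

# A provider arguing legitimate interests while extracting facial features
# for identification fails at the second gate:
print(lawful_basis_check("legitimate_interests", biometric_identification=True))
```

An exam answer option that checks only the first gate maps onto the incomplete option this section warns about.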

Transparency, watermarking and the overlap candidates miss

Articles 13 and 14 require controllers to tell data subjects what happens with their personal data, with whom it is shared and on what legal basis. The AI Act adds a separate watermarking duty under Article 50(2): generative-AI providers must mark synthetic content in a machine-readable way. CIPP/E candidates must keep these distinct. Watermarking does not satisfy the GDPR notice. GDPR transparency is owed to the person depicted; AI Act watermarking is owed to the downstream market for synthetic content.

This is where GDPR transparency obligations and the AI Act run on parallel tracks for AI-generated images. An exam option conflating the two duties, or one that lets watermarking discharge the GDPR notice, is the option to rule out. Both rules apply at once when AI-generated images depict a real person.

When a DPIA is required for AI-generated images

Article 35 requires a data protection impact assessment when processing is likely to result in a high risk to data subjects. The criteria sit in Article 35(3); each supervisory authority also maintains a mandatory list. Two limbs trigger routinely for AI-generated images of real people: large-scale systematic processing of biometric or special-category data, and systematic evaluation of personal aspects through automated processing. Candidates should name the limb that fits the scenario rather than reciting the article number alone.
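Naming the limb rather than the article number can be drilled with a small sketch. The two boolean flags are assumptions about how a scenario might be coded, and the limb descriptions paraphrase Article 35(3); this is a revision aid, not a compliance checker.

```python
# Illustrative sketch of the two Article 35 limbs that routinely trigger
# for AI-generated images of real people. A non-empty result means a DPIA
# is required; the returned strings name the limb, as the exam expects.

def dpia_limbs(large_scale_special_category, systematic_automated_evaluation):
    limbs = []
    if large_scale_special_category:
        limbs.append("large-scale processing of special-category/biometric data")
    if systematic_automated_evaluation:
        limbs.append("systematic automated evaluation of personal aspects")
    return limbs

# A generative tool fine-tuned on faces and used to profile how outputs
# resemble real individuals plausibly triggers both limbs:
for limb in dpia_limbs(True, True):
    print("DPIA trigger:", limb)
```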

A CIPP/E scenario showing the GDPR test in action

Picture a marketing agency using a generative tool to produce social-media images that resemble a named EU celebrity, without that person's consent. The AI Act Article 5 prohibition may or may not apply, depending on whether the content meets the threshold of non-consensual sexually explicit imagery. Either way, the GDPR analysis runs independently and survives whatever the answer to that question turns out to be.

That agency processes personal data the moment its output is identifiable. It processes biometric data the moment the model uses facial features for identification. Four downstream duties then stack on top: an Article 6 basis; an Article 9(2) condition where biometric identification applies; an Article 13 or 14 notice to the person depicted; and an Article 35 DPIA where the high-risk limbs trigger. A correct CIPP/E answer names these GDPR obligations on AI-generated images that survive regardless of the AI Act question. It also points to consent as the realistic basis the agency does not have.

This scenario sits alongside the broader pattern of GDPR risks in AI tools CIPP/E candidates need to read fluently. Exam markers reward candidates who hold the AI Act and the GDPR on different shelves and who run both checks every time a real person turns up in synthetic output.

For a one-pager mapping these obligations onto current news, the GDPR test for AI tools on the cheat-sheet shelf and the free CIPP/E assessment at 22academy.com/study cover this material in revision-ready depth.
