Beyond Legal #33: The product manager who said the Supervisory Authority could wait

“We’ll add the data protection layer in v2. The Supervisory Authority can wait.” — Natasha V., former Product Manager

Product teams move fast. That is not a criticism — it is the model, and it produces good products. The problem is that data protection laws like the GDPR do not move fast, do not recognise the concept of a compliance backlog, and have no equivalent of a retrospective fix. Data protection by design is not a layer you add in v2. It is a condition of the processing being lawful in v1. By the time v2 ships, the unlawful processing has already happened.

Natasha’s story #

Natasha is a Product Manager at a health technology start-up operating across six EU member states. She is experienced, commercially focused, and has shipped several successful products under pressure. Her instinct — honed across multiple product cycles — is to launch, learn, and iterate. Data protection is, in her mental model, something the legal team reviews at the end of the process.

The feature in question was a personalised health insights tool that would analyse users’ self-reported health data — including information about chronic conditions, medication, and mental health — to generate behavioural recommendations and flag patterns for follow-up. From a product perspective, it was a strong proposition. From a data protection perspective, it was a high-risk processing activity involving special category data under Article 9 of the GDPR, which required a DPIA before launch under Article 35, a clearly documented lawful basis for processing health data, and data protection embedded into the product architecture from the point of design under Article 25.

The Data Protection Leader — let’s call her Cleo — raised all three concerns in the product review meeting, six weeks before the planned launch date.

Natasha acknowledged the concerns. She committed to addressing them post-launch. She launched on schedule.

What happened when v2 never came #

The feature attracted complaints within weeks of launch. Data subjects who had shared mental health information through the platform raised concerns about how it was being used in the recommendation engine. The Supervisory Authority opened an investigation. When it asked for the DPIA, the Article 9 lawful basis documentation, and evidence that data protection had been considered in the product architecture, there was none.

The investigation found serious violations of Articles 9, 25, and 35. The product was suspended pending remediation. Cleo’s documentation showed she had raised the concerns before launch and had been overruled. Natasha was let go. Cleo remained and was given formal authority over product launch sign-off for any processing involving special category data.

Does a DPIA have to be completed before a product launches under the GDPR? #

Yes — and the timing is not discretionary. Under Article 35 of the GDPR, a DPIA is required “prior to the processing” where a type of processing is “likely to result in a high risk to the rights and freedoms of natural persons”. The EDPB’s guidelines on DPIAs are explicit: the assessment must be carried out before the processing begins, not in parallel with it, and not after launch pending a review. For products that process special category data under Article 9 — including health data, biometric data, and mental health information — the threshold for requiring a DPIA is met in almost all cases. Article 25 similarly requires that data protection by design and by default be implemented “at the time of the determination of the means for processing” — not retrofitted when complaints arrive. A product that launches without these steps in place is not an early-stage product in need of iteration. It is a product that is processing personal data unlawfully from the moment of its first user interaction.

Two enforcement decisions confirm what this looks like in practice. In January 2025, the Finnish Data Protection Authority fined Sambla Group €950,000 after finding that the company’s loan application platforms had processed the financial and personal data of tens of thousands of customers through insecure URL links accessible to third parties, without Data Protection by Design measures under Article 25, and without adequate security under Article 32. The authority found that the violations had continued for a significant period after the risks had become apparent, and ordered the immediate suspension of processing.

In another case from December 2024, the Irish Data Protection Commission fined Meta €251 million for a 2018 breach that exploited a vulnerability in the platform’s “View As” feature — finding that Meta had failed to implement data protection by design and by default under Article 25, had failed to include all required information in its breach notification to the Supervisory Authority under Article 33(3), and had failed to adequately document the facts of the breach and the remedial steps taken under Article 33(5). The DPC’s finding was specific: the breach could have been prevented had data protection by design principles been embedded in the feature at the point of development.

What does data protection by design require from a Product Manager? #

Data Protection by Design under Article 25 of the GDPR requires that technical and organisational measures are implemented to give effect to the data protection principles — data minimisation, purpose limitation, accuracy, storage limitation — at the time of designing the product, not after it ships. For a Product Manager, this means three things that must happen before a feature involving personal data reaches production. First, the lawful basis for every category of personal data the feature will process must be identified and documented before the first line of code is written. Second, if the processing is likely to result in a high risk to data subjects, a DPIA must be completed and its recommendations reflected in the product design before launch. Third, the default configuration of the product must be the most data-protective option available — not the most data-rich. A product that collects more than it needs by default, in the name of future feature development, is not compliant with Article 25 from the moment it ships.

The challenge for today: Walk through the last three features your product team shipped that involved personal data. For each one, find the lawful basis documentation, the DPIA where applicable, and the data protection sign-off. If any of those records are missing, or were completed after launch, you have found the gap — and the processing that occurred in the interval was unlawful regardless of the feature’s commercial success.

For more on how data protection requirements apply from the start of design rather than the end of delivery, see Beyond Legal #21 on what engineers need to know, and Beyond Legal #26 on how design decisions affect consent validity.


Article references: Article 5(1) (data protection principles), Article 5(2) (accountability), Article 9 (special categories of personal data), Article 25 (data protection by design and by default), Article 33 (notification of personal data breach), Article 35 (data protection impact assessment), Article 83(4) and 83(5) (administrative fines).

Series: This is post 13 in the Beyond Legal series — 20 roles, 20 days, real consequences. Natasha, Cleo and the story are fictitious; the two cases are real.

Purpose and Means is a niche data protection and GRC consultancy based in Copenhagen but operating globally. We work with global corporations, providing services with flexibility and a slightly different approach from that of the larger consultancies. We have the agility to adjust and change as your plans change. Take a look at some of our client cases to get a sense of what we do.

Author
Tim Clements

