Beyond Legal #26: The UX researcher who asked what users actually understood

She asked me what data subjects actually understood about consent. No Data Protection Leader had ever done that. — Dara B., UX researcher

The consent mechanism on most platforms I have reviewed in the last five years has one thing in common: it was designed to obtain consent, not to enable it. The legal team reviewed the wording. The product team provided the design. Yet nobody asked the people clicking the buttons what they thought they were agreeing to.

Dara’s story #

Dara is a UX researcher at a mid-sized European e-commerce platform. His job is to understand how users actually experience products — not how designers intended them to, not how the legal sign-off assumed they would, but how they genuinely navigate, read, hesitate, and decide. He runs user testing sessions, interprets behavioural data, and translates findings into design recommendations. He had never worked closely with a Data Protection Leader before.

That changed when the Data Protection Leader — let’s call her Rosa — flagged a concern about the platform’s consent flow during a product review meeting. The platform was using a layered consent mechanism for marketing communications and behavioural analytics. Legally, the team had checked the boxes: an “Accept” button, a “More options” link, a privacy notice linked in the footer. Rosa had a different concern. She wanted to know whether the consent being collected was genuinely informed and freely given under Article 7 of the GDPR, or whether the design itself was creating a bias toward acceptance.

She asked Dara if he could run a user testing session focused specifically on the consent journey.

He could. He did. The findings were not comfortable.

What the research revealed #

In Dara’s moderated sessions, most participants clicked “Accept” within three seconds without reading anything. When asked what they had agreed to, the majority believed they had consented only to functional cookies. None understood they had authorised behavioural analytics processing. Several said they had not noticed the “More options” link at all — it was rendered in a smaller font and a lighter colour than the Accept button, sitting below the fold on mobile.
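Timing this kind of behaviour is trivial to instrument if you want the same signal from your own sessions. A minimal sketch in plain browser TypeScript, assuming hypothetical element IDs for the banner's two paths:

```typescript
// Minimal sketch: log how long a participant takes to act on the consent
// banner, and whether the secondary path is ever used. The element IDs
// ("consent-accept", "consent-more-options") are hypothetical.
const bannerShownAt = performance.now();

document.getElementById("consent-accept")?.addEventListener("click", () => {
  const seconds = (performance.now() - bannerShownAt) / 1000;
  console.log(`Accepted after ${seconds.toFixed(1)}s`); // most of Dara's participants: under 3s
});

document.getElementById("consent-more-options")?.addEventListener("click", () => {
  const seconds = (performance.now() - bannerShownAt) / 1000;
  console.log(`Opened "More options" after ${seconds.toFixed(1)}s`); // rarely fired in testing
});
```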

The consent mechanism was not technically a dark pattern in the most aggressive sense. Nobody had been tricked. But the design had systematically made acceptance easier than refusal, and had communicated nothing meaningful to the people clicking through it. Under the GDPR, consent must be freely given, specific, informed, and unambiguous. Dara’s research demonstrated that it was none of those things in practice.

Dara used the findings to overhaul the consent flow: equal visual weight between Accept and Reject, no pre-selected options, and plain-language explanations of each processing purpose at the point of decision. The redesign temporarily reduced opt-in rates, so Dara and Rosa presented the compliance rationale to the board together. Shortly afterwards, both were promoted into a joint product governance function; the board could see their value immediately.
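None of these changes is exotic to build. A minimal sketch of what the redesigned step could look like, assuming hypothetical purpose labels and class names rather than the platform's actual flow:

```typescript
// Hypothetical sketch of a redesigned consent step: per-purpose,
// plain language, nothing pre-selected, symmetric actions.
type Purpose = { id: string; label: string };

const purposes: Purpose[] = [
  { id: "functional", label: "Keep you logged in and remember your basket" },
  { id: "marketing", label: "Email you about offers and new products" },
  { id: "analytics", label: "Analyse your browsing to build an interest profile" },
];

const form = document.createElement("form");
for (const p of purposes) {
  const row = document.createElement("label");
  const box = document.createElement("input");
  box.type = "checkbox";
  box.name = p.id;
  box.checked = false; // nothing pre-selected: a pre-ticked box is not consent
  row.append(box, ` ${p.label}`);
  form.append(row, document.createElement("br"));
}

// Both actions share one class, so the CSS gives them equal visual weight.
for (const label of ["Accept selected", "Reject all"]) {
  const button = document.createElement("button");
  button.type = "button";
  button.textContent = label;
  button.className = "consent-action";
  form.append(button);
}
document.body.append(form);
```

The substance is not the markup itself; it is that symmetry and per-purpose specificity are ordinary, reviewable front-end decisions.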

Can interface design invalidate consent? #

Yes, directly and enforceably so. Under Articles 4(11) and 7 of the GDPR, valid consent requires a freely given, specific, informed, and unambiguous indication of agreement. Interface design is not decorative in this context — it is part of the consent mechanism itself. The European Data Protection Board's Guidelines 03/2022 on deceptive design patterns in social media platform interfaces, finalised in 2023, explicitly identify asymmetric visual design, buried opt-out options, and confusing language as patterns that invalidate consent, regardless of whether the legal text is technically correct. A cookie banner with legally accurate wording and a design that systematically discourages refusal does not produce valid consent. The design is the consent mechanism, and the consent mechanism must be designed to enable a genuine choice.

Two enforcement decisions make this concrete. In September 2023, the Irish Data Protection Commission fined TikTok €345 million for, among other violations, nudging younger users (under 18, and especially under 16) toward more privacy-intrusive settings during registration: accounts were public by default, and the design made the privacy-protective path harder to find. The DPC found this infringed the fairness principle under Article 5(1)(a). In February 2023, Italy's Garante fined Ediscom €300,000 in one of the first European regulatory decisions to explicitly name dark patterns as GDPR violations, finding that asymmetric button sizes, small-font opt-out links placed outside the pop-up, and additional consent prompts shown only to users who had declined constituted breaches of Articles 5(1)(a), 7(2), and 25. The Garante's finding was unambiguous: the intentional use of unequal design to steer users toward consent is not a grey area. It is a violation.

Dara’s sessions produced exactly the evidence that prevented the platform from becoming the subject of a similar finding.

What does data protection by design require from a UX researcher? #

Data Protection by Design under Article 25 of the GDPR requires that data protection be integrated into processing systems and practices from the point of design, not retrofitted afterward. For consent mechanisms, this means that the interface must be designed to produce genuine informed choice — and that genuine informed choice must be tested with real users, not assumed from a legal review of the button labels. A UX researcher who understands what the GDPR requires from a consent mechanism is a direct contributor to the programme’s accountability under Article 5(2). A Data Protection Leader who has never asked a UX researcher what users actually understand is operating with a significant blind spot.
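Accountability has a concrete shape here too. Article 7(1) requires the controller to be able to demonstrate that consent was given, and a consent record worth that name captures not just the decision but the wording and design the user actually saw. A minimal sketch, with hypothetical field names and version tags:

```typescript
// Hypothetical sketch of a per-purpose consent record: the artefact that
// lets a controller demonstrate consent (and refusal) under Article 7(1).
interface ConsentRecord {
  userId: string;
  purpose: "functional" | "marketing" | "analytics"; // specific, never bundled
  granted: boolean; // refusals are evidence too
  noticeVersion: string; // the wording the user actually saw
  uiVariant: string; // the design the user actually saw
  timestamp: string; // ISO 8601
}

function recordDecision(
  userId: string,
  purpose: ConsentRecord["purpose"],
  granted: boolean,
): ConsentRecord {
  return {
    userId,
    purpose,
    granted,
    noticeVersion: "plain-language-v2", // hypothetical version tags
    uiVariant: "symmetric-v2",
    timestamp: new Date().toISOString(),
  };
}
```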

The challenge for today: Pull up your consent mechanism and ask one question: is the path to decline as visually prominent, as accessible, and as quick as the path to accept? If the answer requires careful thought, you already have the answer.
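For the visual-prominence half of that question, you do not even need a research session. A rough check pasted into the browser console, assuming hypothetical selectors for your own banner's buttons:

```typescript
// Rough symmetry check for a consent banner. Run in the browser console.
// The selectors are hypothetical; point them at your own buttons.
function prominence(selector: string) {
  const el = document.querySelector<HTMLElement>(selector);
  if (!el) return null; // a decline path that doesn't exist fails the test
  const style = getComputedStyle(el);
  const rect = el.getBoundingClientRect();
  return {
    area: Math.round(rect.width * rect.height),
    fontSize: style.fontSize,
    textColor: style.color,
    background: style.backgroundColor,
    belowFold: rect.top > window.innerHeight,
  };
}

console.table({
  accept: prominence("#consent-accept"),
  decline: prominence("#consent-reject"),
});
```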

For more on how non-legal roles shape data protection outcomes, see Beyond Legal #21 on data protection by design in engineering, and Beyond Legal #23 on what happens when lawful basis is treated as a formality.


Article references: Article 4(11) (definition of consent), Article 5(1)(a) (lawfulness, fairness, transparency), Article 5(2) (accountability), Article 7 (conditions for consent), Article 25 (data protection by design and by default), Article 83(5) (fines for consent violations).

Series: This is post 26 in the Beyond Legal series — 20 roles, 20 days, real consequences. Dara, Rosa and their story are fictitious; the two enforcement cases are real.

Author
Tim Clements

