• Data protection by design (DPbD), building on the notion of Privacy by Design, is now a qualified duty in the GDPR: a holistic approach to embedding data protection principles in the technical and organizational measures undertaken by data controllers.

  • Practitioners have seen DPbD less holistically, instead framing it through the confidentiality-focussed lens of privacy enhancing technologies (PETs).

  • We show that some confidentiality-focussed DPbD strategies used by large data controllers leave data reidentifiable by capable adversaries while heavily limiting controllers’ ability to provide data subject rights, such as access, erasure and objection, to manage this risk.

  • Informed by case studies of Apple’s Siri voice assistant and Transport for London’s Wi-Fi analytics, we suggest three main ways to make deployed DPbD more accountable and data subject–centric: building parallel systems to fulfil rights, including dealing with volunteered data; making inevitable trade-offs more explicit and transparent through Data Protection Impact Assessments; and using ex ante and ex post information rights (Articles 13–15), which we argue may require the provision of information concerning DPbD trade-offs.

  • Despite steep technical hurdles, we call both for researchers in PETs to develop rigorous techniques to balance privacy-as-control with privacy-as-confidentiality, and for data protection authorities (DPAs) to consider tailoring guidance and future frameworks to better oversee the trade-offs being made by primarily well-intentioned data controllers employing DPbD.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.