Today’s approach to privacy by design, or data protection by design as it is referred to in the EU General Data Protection Regulation (GDPR), is fundamentally flawed. Privacy by design is the concept that privacy must be “baked” into every stage and aspect of a new product’s design, as well as into a company’s organizational structures.

Privacy by design as a concept has been around for a long time. In 2009, Ann Cavoukian, then Information and Privacy Commissioner for Ontario, Canada, along with her colleagues, came up with “The 7 Foundational Principles” of privacy by design. According to Cavoukian and her colleagues, privacy must be (1) proactive and not reactive; (2) embedded into a product’s design; (3) user-centric; (4) visible and transparent; (5) positive-sum, not zero-sum; (6) protected end to end, across the full lifecycle; and (7) the default setting. Though for many years privacy by design was considered best practice, it was never a formal legal requirement. Until the GDPR. Article 25 of the GDPR introduces a legal requirement to implement privacy by design and “bake” privacy into product development and organizational processes.

So why is our approach to this topic flawed? If you ask anyone who advises on GDPR compliance what is required to comply with Article 25’s requirements, they will probably say the following: When you are designing a new product or launching a new service, carry out a privacy impact assessment (PIA), mitigate any risks identified, and then decide whether to proceed with the new product as is or modify it.

I believe this approach is wrong and does not comply with Article 25’s requirements for three main reasons. The first reason is purely legal/textual. Article 35 of the GDPR already deals with PIAs — when they are necessary, the process for carrying them out, and so on. There is no need for an entirely separate article, Article 25, simply to introduce another instance in which a PIA is needed. The second reason is more practical and takes into account how businesses operate. By the time a PIA is carried out, a product is already conceptualized, planned and, in some instances, a beta is ready to be tested. By that stage, it might already be too late to change the functionality. A PIA is merely a reactive tool, and privacy by design calls for proactivity. The third reason is also practical and has to do with what a PIA is designed to uncover. A PIA is inherently narrow and tailored specifically to the new product that is being developed. It will not generally highlight systemic organizational issues rooted in a company’s structure, such as whether a company’s policies are being adhered to or not. I will elaborate more on this in a minute.

So why is privacy by design so important, and why are we talking about it now? The answer is that this requirement is slowly moving into the spotlight and gaining importance. European authorities are placing greater emphasis on this requirement, as well. On Nov. 13, 2019, the European Data Protection Board published its guidelines on data protection by design and default, the official “handbook” detailing Article 25’s requirements. The EDPB accepted public comments through January 2020.

Perhaps of equal significance is the attention this requirement has received on the national-enforcement stage. The national data protection authorities are beginning to enforce this requirement and fine companies for noncompliance. On June 26, 2019, the Romanian National Supervisory Authority for Personal Data Processing fined Unicredit Bank 130,000 euros for unlawfully publicly disclosing the information of 337,042 individuals, in cases in which these individuals were transferring funds via the bank. On Oct. 7, 2019, the Hellenic Data Protection Authority fined OTE, a telecommunications service provider, 400,000 euros for unlawfully retaining details of subscribers and not properly deleting them in accordance with legal requirements. And, on Oct. 30, 2019, Germany’s Federal Commissioner for Data Protection and Freedom of Information fined a property company, Deutsche Wohnen, a staggering 14,500,000 euros (the fifth-largest GDPR fine to date) for unlawfully retaining customer data. To be clear, the property company had a retention policy in place. However, it failed to implement the proper process to ensure the automation and enforcement of its policy. In all three cases, the authorities cited breaches of Article 25 as one of the reasons for the fines.
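The Deutsche Wohnen case turned on the gap between having a retention policy on paper and automating its enforcement. As a purely illustrative sketch — the record structure, field names, and three-year window below are all hypothetical assumptions, and any real retention period depends on the applicable law and the purpose of processing — automated enforcement might look something like this:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; the lawful period varies by
# jurisdiction and by the purpose for which the data was collected.
RETENTION = timedelta(days=365 * 3)

def purge_expired(records, now=None):
    """Partition customer records into (kept, purged).

    Each record is a dict with a `last_needed` timestamp: the last date
    the data was required for its original purpose. Records whose
    retention window has elapsed are scheduled for erasure.
    """
    now = now or datetime.now(timezone.utc)
    kept, purged = [], []
    for rec in records:
        if now - rec["last_needed"] > RETENTION:
            purged.append(rec)  # hand off to the actual erasure routine
        else:
            kept.append(rec)
    return kept, purged
```

Run on a schedule (for example, a nightly job), a routine like this turns a written policy into an enforced process — exactly the step the regulator found missing.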

What is amazing…
