With a handful of exceptions, the GDPR isn’t – to some people’s surprise – super prescriptive about exactly what types of data companies can collect, how they can use it, and how they can share it. Instead, the regulation forces companies to balance risks, utility, and a number of other considerations that, with any luck, lead to products that collect only as much data as necessary, tailored to specific uses.
And here’s where the regulatory burdens of the GDPR start to take their toll: What the GDPR really demands is that companies be able to show their work; that they can demonstrate they weighed all these variables and calculations as they developed products. And all that gets reflected in DPIAs for the most sensitive types or uses of data.
The problem, of course, is that showing your work takes time and demands the collaboration of many stakeholders if it’s to be done right.
Now, U.S. state laws are starting to include DPIA requirements, a pattern that’s only going to repeat as more states pass privacy laws.
On June 16, TerraTrue Director of Content Strategy Angelique Carson interviewed Odia Kagan about how to conduct DPIAs in accordance with regulators’ expectations. Below is a transcription of that webinar, edited for clarity and length.
Angelique Carson: What are DPIAs and why do they matter?
Odia Kagan: The GDPR basically says: If you have data processing, what is that data processing, and does it have anything to do with personal information? If you crunch it, if you benchmark it, if you compare it, if you enrich it, if you sell it – if you do any of those things that are likely to result in a high risk to the rights and freedoms of people – then you need a DPIA, a data protection impact assessment.
A DPIA is required by the new U.S. laws, every one of them but Utah. So that means the CPRA, Colorado, Virginia, Connecticut and, to some extent, the new American Data Privacy and Protection Act (draft bill).
We’re not talking about risks to the company, which is what we normally do. This is about the risk to people, and we are trying to figure out how to mitigate it. I’m not going to say “avoid it,” but how to mitigate it to a reasonable level.
Carson: Okay, so how do we do that?
Kagan: The European Data Protection Board, which is the regulator in the EU, issued guidance on DPIAs that basically lists out criteria for the situations in which you will likely need to do a DPIA. There are two lists. One we can call a blacklist: if your processing fits under any one of those purposes, you will likely need to do a DPIA. The member states also have whitelists: the situations where you don’t need a DPIA.
Finally, it makes privacy a pre-deployment consideration – versus an afterthought.
Carson: What’s the difference between the GDPR’s DPIA requirements and the states?
Kagan: Generally speaking, the GDPR is a little bit more established: we have guidance, we have sample DPIAs, we actually have European precedent, which is helpful. There were a lot of DPIAs that were made public. So the difference is that with the GDPR we already have four-plus years of experience and tools and mechanisms promulgated by the European regulators, so it’s a lot more established. In the U.S., we basically have to start from scratch.
Carson: How can we anticipate DPIAs shaking out in the CPRA?
Kagan: We don’t really know how it’s going to shake out in the CPRA. Personally, I think it’s going to go somewhat in the European direction, but maybe not with the same level of formality as, say, Germany. What we do know in the CPRA is that if you have processing that presents “heightened risk,” you need to conduct a risk assessment, and the risk assessment needs to include whether you have sensitive information. What are the benefits to the business or the consumer? How do you weigh the benefits against the risks? It’s a fairly vague description that anticipates further regulations. We just don’t know what those details are going to be. But we don’t have to completely reinvent the wheel now to do them.
Carson: From a regulatory standpoint, how should enforcement agencies think about DPIA guidance in the states?
Kagan: If you know that you need to go through this very granular, very detailed analysis, sometimes you’re incapable of doing it because it’s too onerous from a process, cost, stakeholder buy-in perspective.
I always give this example because I’m a recovering perfectionist. When something is too daunting, you just don’t start. If you come in and your sink is overflowing with dishes, you’re like I can’t do this, I’m not starting this now.
But I will not be the first person to put the first glass in the sink after the cleaners come. I’ll put that one in the dishwasher. But if there are like 50 glasses already, I’m like, “I’ll put that one right next to the sink; there’s no room and I’m just gonna leave it.”
If it’s too difficult to even start your DPIA, you’re like, “Whatever.” But if you have something more tailored to what you need, you’re more likely to do it now.
There should be rules and decision trees, and there should be criteria. And the criteria should be granular enough for people to know what to do, but not so granular that people can’t start. This is the approach you see in the NIST Privacy Framework and from the FTC.
What they’re saying is with respect to accountability, you need to be able to demonstrate that you’ve complied with the requirements, and the formality of the documentation will vary by the nature of the data, the scope of the data, the nature of the processing, and the nature of your company.
Carson: What do we see trending in U.S. state privacy laws concerning DPIAs?
Kagan: The lists of things that require a DPIA under Virginia, and I think Colorado too, are already broader than the GDPR’s. In Colorado, you need to do a DPIA if you’re selling or processing personal data. Virginia is similar: processing for targeted advertising, selling personal data, processing sensitive data, or processing for profiling – if there’s a reasonably foreseeable risk of unfair or deceptive treatment, that’s one of the triggers.
Carson: What should we start doing now?
Kagan: The first thing I want to say is what it is: It’s an assessment of the risk to the rights of people, not to the company.
The second big takeaway that I want to give people is that the time to start doing this is right now, even though we don’t have the regs and we don’t have the specifics. The stuff you’re definitely going to need to do is the stuff that mostly you – and not your attorney – can do, and it’s common to every DPIA under every regime. This is the time.
The lengthiest, most time-consuming tasks require administrative time, require stakeholders, require questions. It’s going to take time for you to actually do that. So the first thing you need to look at when you’re initiating a new process is: What is this new process? And think to yourself, “Does it meet the threshold issues we already know may be triggered?” So, is it targeted advertising, is it a sale, is it sensitive information? I would even look at the EU criteria and see if you fall in there: Is it criminal conviction data, is it innovative technology? Because anything that’s innovative technology is more likely to trigger thresholds like biometric or genetic data.
You need to demonstrate that the processing is necessary and proportionate. Then you need to dig into your processing. And this is where it takes the most time.
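The threshold screen Kagan describes can be sketched as a simple checklist. The sketch below is purely illustrative: the trigger names, dictionary structure, and `screen_for_dpia` function are hypothetical assumptions for a pre-DPIA intake step, not a legal standard or a complete list of triggers under any statute.

```python
# Hypothetical pre-DPIA trigger screen, loosely based on the thresholds
# discussed above (targeted advertising, sale, sensitive data, profiling
# risk, plus EU-style criteria). Illustrative only; not legal advice.

DPIA_TRIGGERS = {
    "targeted_advertising": "Processing for targeted advertising",
    "sale_of_personal_data": "Selling personal data",
    "sensitive_data": "Processing sensitive data (e.g., biometric, genetic, health)",
    "high_risk_profiling": "Profiling with foreseeable risk of unfair or deceptive treatment",
    "criminal_conviction_data": "Processing criminal conviction data (EU criterion)",
    "innovative_technology": "Use of innovative technology (EU criterion)",
}

def screen_for_dpia(activity: dict) -> list:
    """Return the descriptions of every trigger a new processing activity
    hits. An empty list means this (simplified) screen flagged nothing,
    not that a DPIA is definitively unnecessary."""
    return [desc for key, desc in DPIA_TRIGGERS.items() if activity.get(key)]

# Example: a new feature that sells data and uses biometric identifiers.
activity = {"sale_of_personal_data": True, "sensitive_data": True}
hits = screen_for_dpia(activity)
print("DPIA likely required:", bool(hits))
for hit in hits:
    print(" -", hit)
```

In practice a screen like this would feed the flagged triggers into the full assessment, where the necessity-and-proportionality analysis Kagan mentions actually happens.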
For more detailed advice on how to kickstart your DPIA process under the GDPR or U.S. state laws, see the program in full here.