Automation in Action

Follow along to see how you can conduct a DPIA threshold analysis.

One of the aims of the Privacy Suite is to drive automation so its users can focus on the aspects of business that require human brain power and experience. Automation can take many forms, though, and below we’ve sketched out how we think technology — and in particular, artificial intelligence — best supports a human user.

DPIA threshold analysis

A data protection impact assessment (DPIA) is a thorough review of a processing activity. However, only processing that poses a high risk to the rights and freedoms of individuals needs to undergo this detailed assessment. The EU General Data Protection Regulation (GDPR), the European Data Protection Board (EDPB), and national data protection authorities (like the German DPAs) have all established rules and criteria to clarify this threshold so it’s clear when a DPIA is needed — and when it’s not. Taken together, though, these sources can appear as a convoluted mess to pick through. So how can you tackle this task with efficiency in mind? Take a look below at how you can conduct a DPIA threshold analysis.

Step 0: The old-fashioned way

Your first option is to read the sources (the GDPR plus the opinions of the EDPB and the DPAs), apply them to the case at hand, and document your findings in a Word document.

That’s what lawyers (myself certainly included) have done for decades. It’s not that this approach is completely without efficiency; at least you can create a template for case two and onwards. There is also a learning curve: Once you’ve read the documents a couple of times, you’re much better equipped to spot certain issues right away.

This system doesn’t count as automation, though. It’s still the human (expert) who assembles the knowledge. Could clients do this analysis on their own without the special know-how? Certainly not as quickly and reliably. That’s why we haven’t counted it as a step towards automation.

Step 1: A static questionnaire

A very typical first move towards automation is to create a questionnaire. People love checklists, and legal requirements can often be broken down into small chunks that are easy to digest. You tick a couple of boxes . . . and out comes a result. This isn’t really automation, as there is still a good degree of manual work involved; but it’s a step towards user-friendliness. And isn’t getting a reliable result in an easy manner ultimately what automation is all about?

Our DPIA threshold analysis has already made it into a questionnaire: The data protection authority of Lower Saxony has created such a questionnaire (in German, obviously), and through niedersachsen.digital we’ve contributed our thoughts during its creation. In addition, we have translated the German checklist into English. It’s 34 pages, so not something you fill in while you wait for the teakettle to boil.

Find the German original of the questionnaire here and our translation here.

Step 2: An interactive questionnaire

Filling in a questionnaire is all well and good, but what if the user could be guided a bit as the questions are asked? For instance, if two of the nine criteria of the EDPB are fulfilled, then there is already an answer: Typically, you would have to conduct a DPIA. In a text-only checklist, this is sometimes overlooked, so users might be uncertain whether they are really done already. Wouldn’t it be better if there were some form of guided automation in which the questionnaire leads the users automatically to the next relevant question?
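To make that branching concrete, here is a minimal Python sketch of how the “two of nine criteria” rule can short-circuit a guided questionnaire. The nine criteria are paraphrased from the EDPB guidelines; the questionnaire flow, prompts, and helper names are illustrative assumptions, not how the Josef bot is built.

```python
# Minimal sketch of a guided DPIA threshold questionnaire.
# The nine criteria are paraphrased from the EDPB guidelines;
# the questionnaire flow itself is purely illustrative.

EDPB_CRITERIA = [
    "evaluation or scoring (including profiling and predicting)",
    "automated decision-making with legal or similarly significant effect",
    "systematic monitoring of data subjects",
    "sensitive data or data of a highly personal nature",
    "data processed on a large scale",
    "matching or combining of datasets",
    "data concerning vulnerable data subjects",
    "innovative use or application of new technological or organisational solutions",
    "processing that prevents data subjects from exercising a right or using a service or contract",
]

DPIA_THRESHOLD = 2  # two or more criteria met -> a DPIA is typically required


def guided_threshold_check(ask) -> bool:
    """Ask one criterion at a time and stop as soon as the outcome is settled.

    `ask` is any callable that takes a criterion and returns True/False,
    e.g. a prompt in a web form or the built-in input() on the console.
    """
    met = 0
    for number, criterion in enumerate(EDPB_CRITERIA, start=1):
        if ask(criterion):
            met += 1
        if met >= DPIA_THRESHOLD:
            print(f"Stopped after question {number}: a DPIA is typically required.")
            return True  # the remaining questions are no longer relevant
    print("Fewer than two criteria are met: typically no DPIA is required.")
    return False


if __name__ == "__main__":
    # Console version of the guided questionnaire.
    guided_threshold_check(
        lambda criterion: input(f"Does the processing involve {criterion}? [y/n] ")
        .strip()
        .lower()
        .startswith("y")
    )
```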

This is where our first approach to building such a system with technology comes in. We’ve transformed the Lower Saxony authority’s questionnaire using the legal automation bot created by our friends at Josef Legal.

You can produce a document as if you had filled in the Word version of the questionnaire.

Step 3: Rule-based automation in the Privacy Suite

A key feature of automation is reusing information that has already been collected. Automating the boring part means not having to enter the same information over and over again. That’s the vision that drives the Privacy Suite as an integrated data protection management system. Why enter data into a data protection agreement, or fill in a questionnaire, if the information already exists in the records of processing? That’s why we believe that the often smiled-at record of processing under Article 30 GDPR is the core foundation of good data protection. In short, if you don’t know what you’re doing, how can you do it better?

Our Privacy Suite thus collects information on a processing activity in a record of processing. This information is then automatically screened to check whether any of the various DPIA criteria from the GDPR, the EDPB, or elsewhere apply. You can even create your own set of criteria if you feel like adding to the mix. We use a rule-based approach: The rules run automatically in the background whenever you change information in the record. No additional user input is required; the Privacy Suite presents you with its findings, and all the users have to do is review them. The same approach is used for the actual DPIA.
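As a rough illustration of what such rule-based screening can look like, here is a small Python sketch in which each rule inspects the record of processing and returns a finding, and the whole rule set re-runs whenever the record changes. The field names, rules, and data model are hypothetical and greatly simplified; they are not the Privacy Suite’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Hypothetical, simplified record of processing; a real data protection
# management system will hold far more fields than this.
@dataclass
class ProcessingRecord:
    name: str
    data_categories: List[str] = field(default_factory=list)
    data_subjects: List[str] = field(default_factory=list)
    large_scale: bool = False
    systematic_monitoring: bool = False

# A rule maps a record to an optional finding (None = rule not triggered).
Rule = Callable[[ProcessingRecord], Optional[str]]

def rule_special_categories(rec: ProcessingRecord) -> Optional[str]:
    special = {"health", "biometric", "religion", "political opinions"}
    if special & {c.lower() for c in rec.data_categories}:
        return "Special categories of data (Art. 9 GDPR) are processed."
    return None

def rule_vulnerable_subjects(rec: ProcessingRecord) -> Optional[str]:
    if {"children", "employees", "patients"} & {s.lower() for s in rec.data_subjects}:
        return "Vulnerable data subjects are affected."
    return None

def rule_large_scale_monitoring(rec: ProcessingRecord) -> Optional[str]:
    if rec.large_scale and rec.systematic_monitoring:
        return "Large-scale systematic monitoring of data subjects."
    return None

RULES: List[Rule] = [
    rule_special_categories,
    rule_vulnerable_subjects,
    rule_large_scale_monitoring,
]

def screen(record: ProcessingRecord) -> List[str]:
    """Run every rule against the record; called whenever the record changes."""
    return [finding for rule in RULES if (finding := rule(record)) is not None]

# Example: screening one (made-up) processing activity.
record = ProcessingRecord(
    name="Occupational health screening",
    data_categories=["health", "contact data"],
    data_subjects=["employees"],
)
for finding in screen(record):
    print("Finding:", finding)
```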

Step 4: AI for Privacy

This rule-based automation does not require additional information, and it already picks up many cases in which a DPIA is needed. It also does an excellent job of detecting hard criteria, such as the involvement of vulnerable data subjects. However, any such approach will struggle with soft criteria, especially a criterion like the innovative use of systems or solutions. Such soft criteria require a lot more judgment (what’s new today will be old tomorrow). It would be good to know what others with similar processing activities think about it.

That’s where our AI for Privacy project comes in, and we have received an innovation grant from the German State of Hesse to develop it. We use artificial intelligence to find similar processing activities in real time in the case base of all entries in the Privacy Suite (obviously only for participating customers). The AI for Privacy engine then makes suggestions as to which risks it deems applicable to the new record in light of all other records.
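To give a flavour of how similarity-based suggestions can work, here is a deliberately simplified Python sketch. It is not the AI for Privacy engine: it assumes plain-text descriptions of processing activities and uses a TF-IDF vectorizer from scikit-learn, whereas the real system works on the live case base with its own models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical case base: a description of each processing activity
# plus the risks its owners have already flagged.
case_base = [
    ("Video surveillance of publicly accessible office entrances",
     ["systematic monitoring", "large scale"]),
    ("AI-based screening of job applications",
     ["evaluation or scoring", "innovative use", "vulnerable data subjects"]),
    ("Employee health management with external occupational physician",
     ["sensitive data", "vulnerable data subjects"]),
]

new_record = "Chatbot that pre-screens candidates in the recruiting process"

descriptions = [text for text, _ in case_base]
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(descriptions + [new_record])

# Compare the new record against every existing record in the case base.
similarities = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

# Suggest the risks attached to the most similar existing records.
suggested = set()
for idx in similarities.argsort()[::-1][:2]:  # two nearest neighbours
    print(f"Similar record ({similarities[idx]:.2f}): {case_base[idx][0]}")
    suggested.update(case_base[idx][1])

print("Suggested risks to review:", sorted(suggested))
```

In practice the case base, the risk taxonomy, and the similarity model would be far richer; the point is simply that the suggestions are derived from what comparable records have already flagged.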

Our AI for Privacy screening process will be launched in October 2021, after which we will work to extend it beyond risk detection to risk assessment.

Summing it all up

Automation can take many forms, and it can include many steps. It is essential that automation does not become an end in itself, but rather keeps the users’ needs at its core. Automation creates an impact only if the users’ lives are made easier — because users can take fewer steps, or because the outcome of the process is more reliable.

For our users, the information they input into a record of processing is not used only to comply with that requirement under GDPR or to screen the processing activity for risks and conduct a DPIA. It also means that users can create an agreement from that record — with the appendices already filled in.

Is that the end of privacy automation? Certainly not. What if users don’t interact in a browser but ask the software for what they need? Or if the software anticipates the next step — and automatically processes it? Many avenues are open to explore if you keep the users centrally in focus — to make privacy happen.