Getting to grips with GDPR: Rights in relation to automated decision making and profiling

Yoti · 5 min read

The eighth blog post in our series on GDPR rights is about the rights in relation to automated decision making and profiling.

So far, our series has covered:

  1. Your right to be informed
  2. Access rights
  3. Correction rights
  4. Deletion rights
  5. Objection rights
  6. Restriction rights
  7. Portability rights


Part 8: Rights in relation to automated decision making and profiling

This is not a new right and the GDPR wording is almost identical to that in the EU Directive it replaces. The aim of this right is to provide safeguards for individuals against the risk that a potentially damaging decision is taken without human intervention.

In the GDPR this provision is part of the section on individual rights and is set out as follows.

Individuals have the right not to be subject to a decision when:

  • it is based solely on automated processing; and
  • it produces a legal or similarly significant effect on the individual.


The right does not apply if the decision:

  • is necessary for entering into, or the performance of, a contract between an organisation and the individual;
  • is authorised by law (such as for the purposes of fraud or tax evasion prevention);
  • is based on explicit consent; or
  • does not have a legal or similarly significant effect on the individual.


If the right applies, organisations must make sure that individuals are able to:

  • get human intervention;
  • express their point of view; and
  • get an explanation of the decision and challenge it.


So what’s new?

The new aspect is that data protection regulators have interpreted this provision as a prohibition rather than a right. Their view is that organisations cannot carry out solely automated decision-making of this kind at all unless it is necessary for a contract, they have your explicit consent, or it is authorised by law. For the UK this is a different interpretation of the same provision in the old law. Some other EU regulators have always treated this provision as a prohibition, but there has never been any guidance or enforcement from them on the topic.

The UK Data Protection Act 2018 adds some extra obligations for organisations carrying out solely automated decision-making on the basis that it is authorised by law. In these cases, organisations must, as soon as reasonably possible, notify you in writing that they have made an automated decision. You then have one month to request that they either reconsider the decision, or take a new decision that is not based solely on automated processing.


What does it mean in practice?

Many organisations use automated decision-making in some way as a core part of their business. For example, when you use a price comparison website to get quotes, the results you get back are generated automatically from the information you provide and the information from the different organisations represented on the site. When your bank contacts you to say it thinks someone has cloned your card or is trying to use your details fraudulently, it knows this because automated processing in the background has spotted unusual activity on your account. When you identify yourself using a pass card or biometrics, an automated system checks whether your credentials match what is registered.
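
To make this more concrete, here is a minimal, purely illustrative sketch in Python of the kind of background fraud check a bank might run. Everything in it (the Transaction type, the thresholds and the automated_fraud_check function) is invented for this post; it does not describe any real bank's system, or Yoti's own checks.

from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float   # transaction value in the account's currency
    country: str    # country where the card was used

def automated_fraud_check(tx: Transaction, usual_country: str,
                          usual_max_amount: float) -> str:
    """Classify a transaction as 'allow' or 'human_review'."""
    unusual_location = tx.country != usual_country
    unusual_amount = tx.amount > 3 * usual_max_amount
    if unusual_location or unusual_amount:
        # Blocking a card could have a significant effect on the customer,
        # so this sketch never blocks automatically: it only flags the case
        # so a trained person can take the final decision (the "human
        # intervention" safeguard described above).
        return "human_review"
    return "allow"

print(automated_fraud_check(Transaction(2500.0, "BR"), "GB", 400.0))  # human_review
print(automated_fraud_check(Transaction(35.0, "GB"), "GB", 400.0))    # allow

The point of the sketch is the last step: the automated part only narrows things down, and anything that could significantly affect you is left for a person to decide.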

It is important to remember though that the GDPR right is about automated decisions that have a legal or similarly significant effect on you. A legal effect might be, for example, a decision that leads to a contract being cancelled, or you being denied a particular social benefit granted by law, such as child or housing benefit. A similarly significant effect is harder to define, but it might be, for example, a decision that denies you an employment opportunity or puts you at a serious disadvantage.

Many automated decisions may not meet this threshold, and while regulators have published some guidance on what they think it means, it may take some time, and possibly court cases, to agree where the lines in the sand are.

As with all the rights, the organisation also has to be able to verify your identity before taking action as a result of your request.


What is Yoti doing?

We don’t make any automated decisions that meet the threshold of this GDPR provision. Our automated decisions relate to the fraud prevention checks we carry out, such as making sure it’s really you when you take certain actions in the app, or checking that a document you add is your own, genuine document. If the automated technology fails when you’re setting up your account, we have trained staff who can intervene and make a manual decision. If you have problems while using the app, you can always contact our Customer Support team for help.

If you have any questions about this right, contact privacy@yoti.com.