
What you need to know about which data breaches are fined, and how you can avoid them


Because of the way the GDPR was written, and because it is still relatively new, there are a lot of open questions. For example, how do we know if our organisation’s guidelines and procedures live up to the GDPR’s requirements? How much will we be fined if there is a data breach?

On January 20, 2021, DLA Piper published a report that summarises GDPR fines and data breaches over the past year (the webpage is in Danish, but you can find the report in English here). This report gives us a better idea of which data breaches are more frequently fined, and what kinds of data breaches will be fined in the future.

This article will help you get a deeper understanding of what types of data breaches you should pay close attention to, and specific strategies you can use that might prevent your company from being fined.

The Biggest Fines Last Year

The overview below shows the three biggest fines handed out last year and illustrates why it is important to take the GDPR seriously.

  1. France’s data protection authority, CNIL, fined Google Inc. €50m for:
    • Failing to adequately explain how it processes data.
    • Failing to have legal grounds to process data for personalised advertising.
  2. The Hamburg data protection supervisory authority fined a global retailer €35.26m for:
    • Failing to have a sufficient legal basis for processing data.
  3. Italy’s data protection supervisory authority, the Garante, fined a telecommunications operator €27.8m for:
    • Failing to adequately explain how it processes data.
    • Failing to have legal grounds to process data.
    • And more.


GDPR fines can be challenged

Within the past year, EU countries reported a 19% increase in breach notifications, an average of 331 notifications per day. Regulators have issued a total of €158.5m in fines since January 28, 2020, but these fines are not necessarily final. Some organisations successfully appealed, which led to large reductions in the anticipated fines; some were cut by 80-90%, in part because of the financial hardship caused by Covid-19.

The main takeaway here: it pays to appeal and challenge the proposed fines.

What kinds of data breaches are fined?

The same rules have different ways of being enforced

Since each country has its own regulatory authority, data breaches and fines vary from country to country. Even though all data protection laws in the EU and the UK originate from the GDPR, the compliance cultures within organisations differ widely.

In addition, the separate data protection supervisory authorities have different interpretations of the legislation and therefore different enforcement practices. This uncertainty is most challenging for multinational organisations that operate in many different countries, as it makes it harder for them to anticipate and implement the different interpretations of the GDPR. Despite this, the fines summarised below were frequently handed out, regardless of supervisory authority.

Failure to communicate clearly and openly about how they process data

Organisations have gotten into trouble when they were not open and clear with users about how they process their personal data. The main way they demonstrate this openness is in their privacy notice. This notice must explain to individual users, in a simple way, how the organisation will use their personal data and how its practices comply with the GDPR.

Organisations have been fined for having overly complicated privacy notices. They have also been fined for notices that were imprecise, inaccurate, or incomplete. For example, if you transfer personal data to a third country (one that is not in the EU) but don’t communicate this, your privacy notice is incomplete.

It’s a fine line…

The challenge with drafting privacy notices is that if you include too much detail, your audience may not be able to understand it, which breaches the GDPR’s transparency principle. On the other hand, if you include too little detail, you risk receiving a fine for providing imprecise or incomplete information.

Possible solution: Take a ‘layered approach’ to privacy notices. This approach presents the privacy notice in three different ways:

  1. A short notice (a brief explanation of why the organisation uses personal data, and where to go for more information),
  2. A medium notice (a graphic that describes how the organisation uses personal data),
  3. A long notice (a complete and thorough notice that covers all the information necessary to comply with the GDPR).

However, this is not a bullet-proof solution either. Organisations have also been fined for this method as, in some cases, users had to read and check off too many different privacy notices. In those cases, the information was presented in a way that was far too complicated for the user.

The bottom line is that it can be extremely difficult to explain the complexity of processing to regular consumers who don’t have a law degree. All in all, the report did not offer a strong solution to this problem. The main takeaway here: proceed with caution when drafting privacy notices and ask for help when in doubt.

Failure to have or show a legal reason to use personal data

There are many different scenarios where your organisation will have a legal reason to process data.

For example: your organisation sells a subscription to a music app and asks the consumer to give consent so you can process their musical preferences in order to suggest other songs and artists that they might like.

Organisations were fined for processing personal data without being legally allowed to do so. They were also fined when a legal ground to process existed, but the organisation had failed to properly demonstrate or document it. Documentation is therefore just as important as the existence of a legal basis to process.

Finally, organisations were fined when processing was based on invalid consent. This is a broad term that covers a wide range of situations: someone might be penalised or miss out on an opportunity if they do not give consent, they might not realise they have given consent, or the organisation might not specifically name itself when asking for consent, among many others.

To make sure that personal data is being used legally, organisations should have:

  1. Good data mapping, documented in thorough and accurate records of processing.
  2. Rules and guidelines that properly protect personal data.
  3. Documentation that shows they can legally use personal data.
  4. Clear privacy notices that connect each processing activity with the law that allows them to use the data.

Possible solution: You can apply a risk-based approach, in which you give more time and attention to higher-risk processing activities. For these activities it is important to carry out a Data Protection Impact Assessment (DPIA). This is a type of risk assessment that can help identify the data protection risks in a given project or plan. Used correctly, this assessment tool helps you demonstrate how you comply with your data protection obligations.
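To make the link between processing activities, legal bases, and risk more concrete, here is a minimal sketch in Python of how a record of processing could be kept in structured form and used to flag activities that call for a DPIA. The field names and risk rules are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass

# Hypothetical record of a single processing activity.
# Field names and risk rules are illustrative, not a prescribed GDPR format.
@dataclass
class ProcessingActivity:
    name: str                    # e.g. "Music recommendations"
    purpose: str                 # why the personal data is used
    legal_basis: str             # e.g. "consent", "contract", "legitimate interest"
    data_categories: list[str]   # e.g. ["listening history", "email"]
    high_risk: bool              # set True for large-scale or sensitive processing

def needs_dpia(activity: ProcessingActivity) -> bool:
    """Flag activities that should get a Data Protection Impact Assessment."""
    sensitive = {"health data", "biometric data", "location data"}
    return activity.high_risk or any(c in sensitive for c in activity.data_categories)

records = [
    ProcessingActivity(
        name="Music recommendations",
        purpose="Suggest songs based on listening preferences",
        legal_basis="consent",
        data_categories=["listening history", "email"],
        high_risk=False,
    ),
]

for activity in records:
    if needs_dpia(activity):
        print(f"{activity.name}: schedule a DPIA and document the outcome")
    else:
        print(f"{activity.name}: document legal basis '{activity.legal_basis}'")
```

Even a simple overview like this makes it easier to show that every processing activity has a documented legal basis.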

Failure to implement appropriate security measures

According to the GDPR, organisations must use ‘appropriate’ measures to ensure a level of security that corresponds to how much risk the processing involves for the individual whose data you are processing. The term ‘appropriate’ is very subjective, which makes it hard for organisations to identify and implement such measures. However, based on the fines that have been given out in the past year, we can get a better sense of what qualifies as appropriate security measures.

The following is a list of security measures that might be necessary to implement in order to avoid a fine.

  • Monitoring of privileged users: A privileged user is someone who has access to systems and data that might be sensitive and need extra protection. It is therefore important to monitor these users by analysing behaviour and providing reports, to ensure that personal data is protected.
  • Server hardening: A process that strengthens a server’s protection, using techniques that prevent access to administrator accounts.
  • Encrypting personal data: Encryption converts clear text into a code that can only be decoded again with a special key. It is particularly useful for protecting sensitive personal data.
  • Multi-factor authentication: An extra security measure that requires the user to provide two or more verification factors to gain access to an application, online account, or other resource. This can be a pre-determined password followed by a code that is sent to you, or one that you have in physical form.
  • Access management: This ensures that users are who they say they are and that they are allowed to access company data. Grant access on a need-to-know basis, and remember to remove a user’s access when it is no longer necessary.
  • Penetration testing: A simulated cyber-attack against your computer system to check where you are vulnerable and what can be exploited. This is not a one-off measure and should be carried out regularly.
  • Avoiding hardcoded passwords: Hardcoded passwords are plain-text passwords embedded in source code. Plain-text passwords can easily be found and abused, so hardcoding them should be avoided.
  • Logging failed access attempts: It is important to keep track of failed access attempts. If a data breach happens, having this log makes it much easier to track and investigate.
  • Code review: The process of reading source code line by line to identify where your organisation might be vulnerable. It can also reveal personal data that is being stored incorrectly. Overall, it can greatly improve an organisation’s security.
  • Using the Payment Card Industry Data Security Standard (PCI DSS): This set of security standards ensures that your organisation accepts, processes, stores, and transmits credit card information securely.
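As a small illustration of the encryption measure above, the following sketch uses the Fernet recipe from the Python cryptography package (a third-party library, installed separately) for symmetric encryption of a single field. Key storage and rotation are left out here and need their own safeguards.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key once and store it securely (e.g. in a secrets manager).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a piece of personal data before storing it.
email = "user@example.com".encode("utf-8")
token = cipher.encrypt(email)

# Only someone holding the key can decrypt it again.
original = cipher.decrypt(token).decode("utf-8")
assert original == "user@example.com"
```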

Possible solution: Organisations should evaluate whether any of these measures are needed in their efforts to protect personal data. They should also consider documenting why they adopted or omitted a particular security measure.
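Two of the measures listed above, avoiding hardcoded passwords and logging failed access attempts, can be sketched in a few lines. In this hypothetical Python example, a secret is read from an environment variable instead of being embedded in source code, and failed logins are written to a log file; the variable name, file name, and helper function are assumptions made for illustration.

```python
import logging
import os

# Read the secret from the environment instead of hardcoding it in source code.
# DB_PASSWORD is an illustrative variable name, set outside the codebase.
db_password = os.environ.get("DB_PASSWORD")
if db_password is None:
    raise RuntimeError("DB_PASSWORD is not set; refusing to fall back to a hardcoded value")

# Keep a log of failed access attempts so a breach can be traced later.
logging.basicConfig(filename="access.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def record_failed_login(username: str, source_ip: str) -> None:
    logging.warning("Failed login for %s from %s", username, source_ip)

record_failed_login("alice", "203.0.113.7")
```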

Breach of the data minimisation and data retention principles

Organisations have been fined when they have processed too much data or saved too much data. In the two decades before the GDPR was passed, there was very little regulation of how personal data could be used, while the amount of personal data processed and stored by organisations grew enormously. Now, these organisations are faced with the time- and resource-consuming task of detangling and sorting this massive amount of data. It requires the following two things:

  1. Finding and classifying all data into two categories: to be deleted (data minimisation) and to be saved.

  2. Deleting personal data stored in old systems and applications that either cannot, or do not easily, support secure deletion.

Possible solutions: To resolve this and prevent harm to data subjects, organisations must create an overview of all their data and delete the data that they no longer need or no longer have a right to process. In addition, it is important to develop guidelines and use systems that can securely process data going forward.
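As a rough sketch of that clean-up, the example below splits a set of customer records into those still within a retention period and those due for secure deletion. The retention period, record format, and field names are assumptions made for illustration; the right values depend on your own legal bases and guidelines.

```python
from datetime import datetime, timedelta

# Illustrative retention period; the right value depends on your legal basis.
RETENTION = timedelta(days=5 * 365)

customer_records = [
    {"id": 1, "email": "old@example.com", "last_activity": datetime(2014, 3, 1)},
    {"id": 2, "email": "new@example.com", "last_activity": datetime(2024, 6, 15)},
]

# Split records into those to keep and those due for secure deletion.
now = datetime.now()
to_delete = [r for r in customer_records if now - r["last_activity"] > RETENTION]
to_keep = [r for r in customer_records if now - r["last_activity"] <= RETENTION]

for record in to_delete:
    print(f"Record {record['id']} has exceeded the retention period; schedule secure deletion")
```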

Transfers of personal data to third countries and international organisations: expected to be fined more frequently in the future

In July 2020, the Schrems II judgement was handed down by Europe’s highest court, demanding, “in addition to fines, the immediate suspension of alleged illegal transfers of personal data from the EU to third countries”. We have yet to see the full fallout from this judgement, and it might take some time before it filters through to enforcement practice.

Possible solutions: We are still waiting for finalised drafts from the Commission regarding how to treat data that is transferred to third countries (countries outside the EU and the UK). Normally, when organisations need to transfer data to third countries, they use “standard contractual clauses” to do so. These clauses are intended to protect personal data, so that when it leaves the EU it remains protected to the GDPR’s standard in countries that do not offer the same level of protection.

Once the Commission has updated this policy, organisations should implement a Transfer Impact Assessment (TIA). This will illustrate which risks are linked with the transfer of personal data out of the EU, and your organisation should revise the standard contractual clauses accordingly.

Securely processing personal data can be a daunting task, and often those responsible for it are not legal experts. You can find more information on our blog, where we explore best practices and processes for safeguarding personal data.
