Amanda Lathia examines the increase in fines under the GDPR in Information Security Buzz

  • February 13, 2020
  • By Amanda Lathia, Associate

This article was originally published in Information Security Buzz.

Reducing The Risk Of ICO Enforcement Notices And Penalties Under The GDPR

Since the GDPR came into force on 25 May 2018, there have been hundreds of thousands of reported breaches, resulting in enforcement action for non-compliance and/or penalties for data protection infringements. With regard to the latter, regulators across Europe have imposed much more severe penalties than previously seen under the Data Protection Act 1998 (DPA 1998), the record being £183m against British Airways for losing over 500,000 customers’ personal data.

It is not only large, global corporates that are penalised under the GDPR. Smaller organisations and even individuals have been penalised for GDPR breaches, for example: estate agents failing to keep tenants’ personal data safe; social workers emailing individuals’ sensitive data to their personal email addresses without authorisation; company directors selling personal data (which can lead to director disqualification for a period of five years); and organisations sending marketing material electronically to customers without their consent. Lack of transparency, the absence of a lawful basis for processing and failure to obtain valid consent all feature in the growing number of complaints to the Information Commissioner’s Office (ICO).

Complaints relating to electronic communication

The DPA 1998 remains in force for the purposes of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“the PECR”), notwithstanding the introduction of the DPA 2018. The PECR give effect to the ePrivacy Directive (2002/58/EC). Section 55 of the DPA 1998 prohibits unlawfully obtaining personal data without the consent of the data controller, and selling or sharing that data. Many complaints to the ICO relate to unsolicited direct marketing messages. Individuals have stated, via the ICO’s online reporting tool, that where they have not given consent to receive marketing messages, they find it “concerning and worrying” how a company has managed to get hold of their personal information, such as a mobile phone number. Many are also troubled that they do not know what other information a company may hold about them.

Minimising consent breaches

Organisations can minimise consent breaches by adopting any or all of the following practices:

  1. Do not present consent ‘yes’ boxes as pre-ticked – under the GDPR, the default must always be ‘no’ – explicitly ask the customer whether they would like to receive marketing information electronically;
  2. Review any data-sharing agreements with third parties and, where you do share data (with the customer’s consent), require the third party to obtain the customer’s consent separately before sending them any marketing material;
  3. Where you receive personal data from a third party, always verify the customer’s consent before sending any marketing material, even if the third party informs you that consent has been obtained; and
  4. Be transparent: where a third party has shared a customer’s data with you, always tell the customer where you obtained their data and what you intend to do with it.

Automated decision-making and profiling

Article 22 of the GDPR imposes even stricter conditions to protect individuals from automated decision-making that has legal or similarly significant effects on them. More specifically, solely automated decisions – i.e. decisions made without human input, or where human review of such decisions is not meaningful – could amount to a breach of Article 22. Examples include an online decision to award a loan, or a recruitment aptitude test that uses pre-programmed algorithms and criteria. Profiling goes further: it collates data on an individual, such as their buying habits or lifestyle, with the aim of predicting their behaviour or making decisions about them.

Organisations using or intending to use artificial intelligence (AI) applications and/or profiling should ensure that human reviewers can override automated decisions and are not penalised for doing so. Where the level of human decision-making is low or not meaningful, the risk of a breach of Article 22 is higher. The ICO has published guidance on what level of human involvement in automated decision-making is meaningful. It remains to be seen whether the increased use of AI will result in an increase in fines under the GDPR.

