Organisations need to prepare for the new provisions on profiling, such as informing individuals in their privacy notices. EU DPAs are now seeking comments on their profiling guidelines. By Nicola Fulford and Krysia Oastler of Kemp Little LLP.

The Article 29 Working Party issued on 3 October 2017 its draft "Guidelines on Automated individual decision-making and Profiling"1 under the GDPR (referred to as "the guidelines" in this article).

The guidelines provide some clarity around what is a puzzling aspect of the GDPR for many. Profiling involving personal data is already part of the day-to-day processing undertaken by many organisations. Existing profiling activities involving personal data are subject to data protection law. The GDPR is an evolution (not a revolution) of the law, so why is there so much attention on the impact of the GDPR on profiling?

There are three reasons. Firstly, the GDPR introduces a definition of profiling that is new and very broadly drafted, so guidance is needed on which activities are caught. Secondly, there are references to profiling throughout the text of the GDPR, suggesting that profiling is, in and of itself, a risky activity, and organisations need to determine what this means for their profiling activities. Thirdly, there are several specific rules in relation to solely automated decision-making which reference profiling, and the text of the GDPR does not make the scope of these rules clear.

In this article, we explore what is meant by profiling, automated decision-making and solely automated decision-making under the GDPR and consider how to navigate the rules applying to these activities.

What is profiling?

Profiling consists of three aspects:

  1. Automated processing (processing using computers);
  2. of personal data2; and
  3. with the aim of evaluating personal aspects relating to a person or group of people (including analysis or prediction)3.

The guidelines make it clear that the definition is very broad and that the processing does not need to involve inference to be caught – "simply assessing or classifying individuals based on characteristics such as their age, sex, and height could be considered profiling, regardless of any predictive purpose"4.

The guidelines describe profiling as having three distinct stages, each of which falls within the GDPR definition of profiling: (1) data collection; (2) automated analysis to identify correlations; and (3) applying the correlation to an individual to identify characteristics of present or future behaviour5.
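
To make the three-part test easier to apply when taking stock of processing activities, it can be reduced to a simple checklist. The sketch below (in Python) is purely illustrative: the ProcessingActivity structure and its field names are our own invention, not terms drawn from the GDPR or the guidelines.

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    name: str
    is_automated: bool                 # aspect 1: automated processing
    uses_personal_data: bool           # aspect 2: personal data (Article 4(1))
    evaluates_personal_aspects: bool   # aspect 3: evaluating personal aspects

def is_profiling(activity: ProcessingActivity) -> bool:
    """All three aspects must be present to fall within the Article 4(4) definition."""
    return (activity.is_automated
            and activity.uses_personal_data
            and activity.evaluates_personal_aspects)

# Per the guidelines, simple classification by characteristics such as age or
# sex can qualify, even without any predictive purpose:
segmentation = ProcessingActivity("customer segmentation by age", True, True, True)
assert is_profiling(segmentation)
```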

Examples of profiling include:

  • Collection and analysis of data to gain insights into behaviours and characteristics (the guidelines include an example of a data broker collecting data from different public and private sources, compiling the data to develop profiles on the individuals, placing the individuals into segments and selling the output information to companies who wish to improve the targeting of their goods and services6);
  • Keeping a record of traffic violations to monitor driving habits of individuals over time to identify repeat offenders (which may have an impact on the sanction)7; and
  • Considering an individual's credit score before granting a mortgage8.

What is meant by solely automated decision-making?

A decision based solely on automated processing is a decision with no human involvement in the decision process9. The guidelines warn that involving a human in the process simply to circumvent the rules on solely automated decision-making would not work, as the human involvement must be meaningful and not just a token gesture. The human reviewer needs to have the authority to change the decision, considering all the information available10.
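
As an illustration of the "meaningful involvement" point, the following minimal sketch shows how a hypothetical decision-logging system might record whether a human review was more than a token gesture; the type and field names are assumptions of ours, not terms from the guidelines.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HumanReview:
    can_change_outcome: bool       # does the reviewer have authority to alter the decision?
    weighed_all_information: bool  # was the review meaningful, not a rubber stamp?

def is_solely_automated(review: Optional[HumanReview]) -> bool:
    # No human in the loop, or token involvement only, leaves the decision
    # "solely automated" on the guidelines' reading.
    if review is None:
        return True
    return not (review.can_change_outcome and review.weighed_all_information)

# A sign-off by someone with no authority to change the outcome does not help:
assert is_solely_automated(HumanReview(can_change_outcome=False, weighed_all_information=True))
```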

Decisions that have a legal effect are those that affect an individual's legal rights (including contractual rights). Examples given in the guidelines include:

  • entitlement or denial of a social benefit granted by law, such as child or housing benefit;
  • increased surveillance by competent authorities; or
  • being automatically disconnected from a mobile phone service because an individual forgot to pay his/her bill before going on holiday.

A decision that has a similarly significant effect "must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned. At its most extreme, the decision may lead to the exclusion or discrimination of individuals."11 The examples given in the GDPR are automatic refusal of an online credit application or e-recruiting practices without any human intervention. The guidelines explain that although online advertising will not generally meet the threshold of having a similarly significant effect, it may do so depending on the intrusiveness of the profiling, the expectations and wishes of the individuals, the way the advert is delivered and the vulnerabilities of the individuals concerned. An example given is an advert for risky financial products targeted at vulnerable individuals.

What should organisations be doing now?

Take stock of profiling activities and any automated decision-making: It will be impossible to comply with the GDPR requirements without first identifying the profiling activities and automated decisions taken by the organisation. Organisations are likely to find it helpful to think about the three stages of profiling12 when identifying their profiling activities.

Where automated decisions are identified, assess whether they are solely automated and, if so, whether they may produce a legal or similarly significant effect on individuals. Organisations should document this analysis as part of the GDPR accountability requirements.
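
By way of illustration only, the sketch below writes a simple assessment register to a CSV file. The record structure, field names and file name are entirely hypothetical; the GDPR prescribes no particular format for this documentation.

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class AutomatedDecisionAssessment:
    decision_name: str
    solely_automated: bool
    legal_or_similarly_significant_effect: bool
    rationale: str

    @property
    def article22_in_scope(self) -> bool:
        # Article 22 is engaged only where both conditions are met.
        return self.solely_automated and self.legal_or_similarly_significant_effect

assessments = [
    AutomatedDecisionAssessment(
        decision_name="online credit application screening",
        solely_automated=True,
        legal_or_similarly_significant_effect=True,
        rationale="Applications are refused by an algorithm with no human review.",
    ),
]

with open("article22_register.csv", "w", newline="") as f:
    columns = [fld.name for fld in fields(AutomatedDecisionAssessment)] + ["article22_in_scope"]
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    for a in assessments:
        writer.writerow({**asdict(a), "article22_in_scope": a.article22_in_scope})
```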

Comply with the data protection principles: Identify an appropriate legal basis for each of your profiling activities and automated decisions. Ensure your activities comply with the data protection principles13.

Tell people about your profiling activities and automated decisions: Organisations need to provide information about profiling and automated decision-making in their privacy notices14. The right to object and, where consent is the legal basis for processing, the right to withdraw consent must be explicitly brought to the attention of individuals and presented clearly and separately from other information. [See section below for specific requirements for Article 22 solely automated decisions that have a legal or similarly significant effect ("Article 22 decisions").]

Have processes to deal with individuals' rights in relation to profiling and automated decision-making: Organisations need to have processes in place to deal with requests from individuals exercising their rights. Consider the right of access and what information individuals will be entitled to receive a copy of.

Individuals have an absolute right to object to direct marketing, including profiling related to direct marketing. Organisations will need a clear view of which of their profiling activities relate to direct marketing in order to fulfil this absolute right. Individuals also have a right to object to processing of personal data necessary for the purposes of the legitimate interests pursued by the controller. Such objections will likely need to be considered on a case-by-case basis by the controller.

Special considerations for Article 22 decisions: There is debate about whether Article 22 is a prohibition (meaning organisations cannot take Article 22 decisions unless one of the exemptions applies) or just a right for individuals not to be subject to Article 22 decisions (meaning individuals only have the right to object to such decisions).

The guidelines clearly state that the controller can only carry out the processing if one of the three exceptions covered in Article 22(2) applies15. Read as a prohibition, organisations are only permitted to take Article 22 decisions where:

  1. the decision is necessary for entering into, or performance of, a contract between the individual and the controller;
  2. the decision is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
  3. the decision is based on the individual's explicit consent.

In each case, the controller must also have implemented suitable measures to safeguard the individual's rights and freedoms and legitimate interests (which include, at least, a means for the individual to obtain human intervention, express his or her point of view and/or contest the decision).

Note that Article 22 decisions must not be based on special categories of personal data unless the controller has the explicit consent of the individual or the automated decision-making is necessary for reasons of substantial public interest and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.
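
The exception analysis described above can be expressed as a simple decision rule. The sketch below reflects the article's "prohibition" reading of Article 22(2) and the special category rule just noted; the enum values and parameter names are our own shorthand, not statutory language.

```python
from enum import Enum
from typing import Optional

class Article22Exception(Enum):
    CONTRACT_NECESSITY = "necessary for a contract between the individual and the controller"
    AUTHORISED_BY_LAW = "authorised by Union or Member State law with suitable safeguards"
    EXPLICIT_CONSENT = "based on the individual's explicit consent"

def article22_decision_permitted(
    exception: Optional[Article22Exception],
    safeguards_in_place: bool,
    uses_special_category_data: bool = False,
    explicit_consent_for_special_data: bool = False,
    substantial_public_interest: bool = False,
) -> bool:
    # Read as a prohibition: no applicable exception, or no suitable
    # safeguards, means the decision cannot be taken at all.
    if exception is None or not safeguards_in_place:
        return False
    # Special categories of personal data need their own gateway on top.
    if uses_special_category_data:
        return explicit_consent_for_special_data or substantial_public_interest
    return True

# Example: a credit refusal based on explicit consent, with safeguards in place.
assert article22_decision_permitted(Article22Exception.EXPLICIT_CONSENT, safeguards_in_place=True)
```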

Regardless of the distinction, when taking Article 22 decisions, organisations must implement documented processes to ensure that:

  • the decisions are lawful;
  • information about the profiling and the Article 22 decisions is easily accessible for individuals and brought to their attention (which includes the rationale behind or the criteria relied on in reaching the decision and the consequences for the individual with tangible examples);
  • details of Article 22 decisions are provided in response to data subject access requests, including meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the individual;
  • suitable measures to safeguard individuals' rights, freedoms and legitimate interests (including as a minimum, a way for the individuals to obtain human intervention, express their point of view, obtain an explanation of the decision reached and/or contest the decision) are implemented.

Potential differences across EU Member States

Member States have discretion to introduce legislation restricting individuals' rights and controllers' obligations in relation to Article 22 decisions. In the UK, Section 13 of the Data Protection Bill sets out safeguards in relation to the Member State derogation provisions on automated decision-making and introduces the concept of a "qualifying significant decision": an automated decision that produces legal effects or significantly affects the data subject, is required or authorised by law, and does not fall within the contract or explicit consent exceptions.

Where a qualifying significant decision exists, the automated decision-making will be exempt from the Article 22 prohibition, subject to the controller notifying the data subject in writing, as soon as reasonably practicable, that the automated decision has been made. The individual then has 21 days to ask the controller to reconsider the decision or to take a new decision that is not based solely on automated processing. If such a request is made, the controller has a further 21 days to comply with the request and to inform the data subject of the steps taken and the outcome.
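
As a worked example of the two 21-day windows (assuming simple calendar days; the Bill's detailed rules for computing time are not addressed here, and the dates are hypothetical):

```python
from datetime import date, timedelta

# Hypothetical dates, for illustration only.
notification_sent = date(2017, 11, 1)    # controller notifies the data subject in writing
request_deadline = notification_sent + timedelta(days=21)
print(request_deadline)                  # 2017-11-22: last day to request reconsideration

request_received = date(2017, 11, 15)    # data subject asks for reconsideration
response_deadline = request_received + timedelta(days=21)
print(response_deadline)                 # 2017-12-06: controller must respond by this date
```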

It is unclear how an automated decision "authorised by law" will be interpreted in each country. It is also unclear whether including details of the automated decision in a privacy notice would satisfy the obligation to notify the individual or whether, more likely, this should be interpreted as an additional requirement.

There is potential for the additional exemptions to create confusion for organisations that are seeking to implement a workable mechanism which can be consistently applied, as the requirements are different from the requirements for the other exemptions (performance of a contract or if explicit consent is obtained). This is an area for organisations to keep under review.

Looking to the future – DPIAs?

Data protection impact assessments (DPIAs) are mandatory in certain circumstances under the GDPR16, and a DPIA is required in the case of Article 22 decisions. Organisations need a process to identify whether a DPIA is required for future profiling and automated decision-making. Even where a DPIA is not legally required, one should be considered as a good practice tool and a way of demonstrating that profiling and automated decision-making comply with the GDPR.

Something to say?

The Article 29 Working Party has requested that comments on the profiling guidelines (and, separately, the data breach notification guidelines) be submitted by 28 November 2017, so time remains in which to submit your views17. The aim is for the guidelines to be finalised by the end of the year.

This article was first published in Privacy Laws & Business UK Report, November 2017, www.privacylaws.com.

Footnotes

1 Available on the Article 29 Working Party website: ec.europa.eu/newsroom/just/document.cfm?doc_id=47963

2 Defined in Article 4(1) of the GDPR as "any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person".

3 Profiling is defined in Article 4(4) of the GDPR as "any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements".

4 Page 7 of the guidelines

5 See reference above

6 See reference above

7 Page 8 of the guidelines

8 See reference above

9 The guidelines illustrate the difference between decision-making based on profiling and solely automated decisions using examples. An example of decision-making based on profiling is where a human decides whether to agree the loan based on a profile produced by purely automated means. An example of a solely automated decision (including profiling) is where an algorithm decides whether the loan is agreed and the decision is automatically delivered to the individual, without any meaningful human input.

10 Page 10 of the guidelines

11 Page 11 of the guidelines

12 Data collection, automated analysis to identify correlations and applying the correlation to an individual to identify characteristics of present or future behaviour

13 Set out in Article 5 of the GDPR

14 See Articles 13 and 14 of the GDPR

15 Page 15 of the guidelines

16 See Article 35 of the GDPR

17 Details on how to provide comments are available on the Article 29 Working Party website ec.europa.eu/newsroom/just/

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.