January 11th, 2018 | Sterling
What Are Profiling and Automated Decision-Making Under the GDPR?

Sterling has produced a 10-part webinar series about the changes to the way personal data is protected in the European Union when the EU General Data Protection Regulation (GDPR) applies on 25 May 2018. The webinars share key steps hiring managers, HR and legal personnel can take now to help ensure full compliance from day one. The first webinar in the series, “What You Need to Know with 12 Months to Go”, introduced the forthcoming changes to data privacy law. The seventh webinar in the series, “Profiling and Automated Decision-Making”, is now available on demand. The webinar, presented by Oran Kiazim, Vice President of Global Privacy, and Beatriz Torets-Ruiz, Privacy Analyst & Legal Researcher at Sterling, explains what is meant by profiling and automated decision-making and how each is treated under the GDPR.
What is Profiling under the GDPR?
Profiling is any form of automated processing of personal data used to evaluate certain personal aspects relating to an individual. The processing analyses or predicts aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, location or movements. Profiling must be understood in the context of automated decision-making.
Automated processing that amounts to profiling is generally uncommon in the background check industry. Third-party screening providers may offer automated solutions, but these data-driven services are still influenced and directed by human beings. One anomaly that the Information Commissioner’s Office (ICO), the UK’s data protection regulator, has highlighted is that the definition of profiling is not used consistently in the GDPR. According to the ICO, there is a risk that the scope of profiling could be stretched to include data processing that is not solely based on automated means.
Since the recording of the webinar, the WP29 has released guidelines on profiling and automated decision-making under the GDPR. The proposed guidelines state that automated decision-making technologies bring increased efficiencies and resource savings and can “better segment markets and tailor services and products to align with individual needs”. Organisations must be transparent and fair when carrying out profiling and automated decision-making. Automated decisions should not deny people access to employment opportunities, credit or insurance; the need to collect and hold personal data must be explained, and safeguards must be put in place to verify that the information is accurate.
What is Automated Decision-Making under the GDPR?
The GDPR provides safeguards for individuals against the risk that a potentially damaging decision is taken without human intervention. These rights are similar to the existing rights under the Data Protection Directive. Article 22 of the GDPR defines an automated decision as a decision made following the processing of personal data conducted solely by automated means, with no human involvement in the decision-making process. The rights of an individual in relation to an automated decision arise only where the decision can have a legal or similarly significant impact on the individual.
There are two elements that must be present for a decision to be considered an automated decision:
- Solely based on automated processing: If there is no real human influence on the outcome of the decision, it is likely to be considered an automated decision.
- Producing “legal or significant effects”: A legal effect is something that adversely impacts an individual’s legal rights or affects their legal status. A significant effect suggests a consequence that is more than trivial and potentially has an unfavourable impact on the individual. Examples would include the automated refusal of an online credit application, e-recruitment practices conducted without human intervention, or a decision relating to the individual’s job performance.
Automated decision-making may not be used where the individual is a child, or where it involves sensitive personal data (such as ethnic origin, political opinions, religious beliefs or medical data), unless the individual has given explicit consent.
How Does Automated Decision-Making Apply to Background Screening?
The GDPR (in Recital 71) cites decisions arising from e-recruitment practices as an example of automated decisions that may produce significant effects on an individual. However, it also makes clear that the objectionable element is the lack of human intervention in this type of recruitment.
Under the current definition of automated decision-making, it could appear that these types of decisions are being made during the background screening process. To avoid such a finding, it is important that a real person is involved in the final recruitment decision and, at the latest, in the assessment of the results of the background checks before a rejection letter is sent to the candidate. If a company uses automated decision-making, including profiling, it needs to make sure it obtains explicit consent (where consent is relied upon), especially for sensitive data, and that any legal authorisation is based on EU or Member State law. If automated decision-making involves children, it is recommended to seek legal advice.
Sterling has been planning since 2016 for the GDPR changes that go into full effect on 25 May 2018. One way to stay up to date on the provisions of the GDPR and make sure your organisation is compliant is to sign up for the Sterling GDPR 10-part webinar series. The on-demand webinars tackle the many aspects of the GDPR, from privacy notices to definitions of automated decision-making, and how the changes will impact the background screening industry. Sign up today for these informative webinars.
Please note: Sterling is not a law firm. The material available in this publication is for informational purposes only and nothing contained in it should be construed as legal advice. We encourage you to consult with your legal counsel to obtain a legal opinion specific to your needs.