Picture this: a candidate is finally invited to a job interview. They expect to meet a staff member from HR, but instead find themselves sitting opposite a robot. During the 30-minute interview, the robot asks the candidate dozens of questions about their motivation, career goals, strengths and weaknesses. Its built-in cameras record not only the interviewee's words, but also their facial expressions and gestures. The robot recognises emotional responses in the candidate and can react spontaneously. The combination of the generated responses and emotions results in a personality profile, which is compared with existing data on successful employees and assigned to one of two categories: suitable or unsuitable.
In today's job market, this scenario is no longer fiction. The US company HireVue, among others, advertises and sells its sophisticated video interview software to large companies for recruitment purposes. While applicants are interviewed from the comfort of their homes, up to 20,000 data points can be collected from this type of interview and analysed instantaneously using algorithms to find the right employee. Further, many larger companies use AI in their global HR recruitment process. For example, in Switzerland, Credit Suisse uses software to review and categorise applicants according to their suitability or unsuitability for the advertised position.
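The scoring step described above can be sketched in miniature. The snippet below is purely illustrative: the feature names, the benchmark values and the similarity threshold are all invented for this example, and real systems compare thousands of data points rather than three.

```python
# Hypothetical sketch: reducing interview-derived features to one of two
# categories by similarity to an averaged profile of "successful employees".
# All feature names and numbers are invented for illustration.
from math import sqrt

# Invented benchmark profile (averaged feature vector of past hires).
BENCHMARK = {"motivation": 0.8, "eye_contact": 0.7, "speech_pace": 0.6}

def distance(profile: dict, benchmark: dict) -> float:
    """Euclidean distance between a candidate profile and the benchmark."""
    return sqrt(sum((profile[k] - benchmark[k]) ** 2 for k in benchmark))

def classify(profile: dict, threshold: float = 0.3) -> str:
    """Assign one of two categories by closeness to the benchmark."""
    return "suitable" if distance(profile, BENCHMARK) <= threshold else "unsuitable"

print(classify({"motivation": 0.75, "eye_contact": 0.72, "speech_pace": 0.55}))
# → suitable (close to the benchmark)
print(classify({"motivation": 0.20, "eye_contact": 0.30, "speech_pace": 0.90}))
# → unsuitable (far from the benchmark)
```

The key design point for the legal discussion that follows is that the benchmark is derived entirely from historical data on past hires: the system has no independent notion of merit.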
Many legal issues have arisen following the introduction of sophisticated video interview software, including with respect to labour law and data protection. Further, the use of this technology in civil and criminal proceedings will likely become a highly debated topic, given its potential to breathe new life into the much-maligned lie detector test by eliminating its shortcomings.
Supporters of the new software believe that algorithms are better able to comply with today's demanding employment law requirements because they make decisions objectively. Supporters also claim that the broader information database improves inclusion and diversity (ie, the larger the database, the greater the accuracy). Further, these kinds of video interviews generate valuable candidate information that is much richer than a CV, including subconscious characteristics which even sophisticated interviewers cannot spot.
However, algorithms must be programmed in such a way that no interview questions contravene labour laws or interfere with protected employee interests. For example, questions about health or financial circumstances may, depending on the advertised position, violate individuals' rights to privacy and informational self-determination. As is widely known, personal data may be processed only if it relates to a candidate's suitability for an employment relationship or is necessary to execute an employment contract.
A key concern relates to the extent of personal data that can be extracted from applicants with their consent. While consent must be given voluntarily, it may be difficult to determine whether consent to an employer's access to sophisticated AI-derived personal data is truly voluntary when a candidate is determined to secure the job at any cost. Further, consenting candidates may be unaware of the extent of disclosure involved in video interview software (eg, the evaluation of non-verbal signals such as heart rate and eye movements). This could lead to candidates answering questions that are inadmissible under labour laws protecting their personality rights.
Care must also be taken to ensure that application software is non-discriminatory. Critics rightly assert that algorithms are designed by individuals and can therefore reflect human prejudices. In other words, they are effectively opinions embedded in codes and thus inherently subjective.
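The critics' point can be made concrete with a small sketch. The example below is hypothetical: the "speech_pitch" feature and all numbers are invented. It shows how, if the benchmark is averaged from a historically homogeneous group of hires, a feature that merely correlates with group membership ends up deciding the outcome, even though it says nothing about job performance.

```python
# Hypothetical sketch of how bias enters through training data. The feature
# names and values are invented for illustration.
historical_hires = [
    {"skill": 0.90, "speech_pitch": 0.20},  # skewed sample: all low-pitch voices
    {"skill": 0.80, "speech_pitch": 0.25},
    {"skill": 0.85, "speech_pitch": 0.22},
]

# Benchmark = per-feature mean of the historical sample.
benchmark = {
    k: sum(h[k] for h in historical_hires) / len(historical_hires)
    for k in historical_hires[0]
}

def score(candidate: dict) -> float:
    """Lower score = closer to the benchmark = ranked 'more suitable'."""
    return sum(abs(candidate[k] - benchmark[k]) for k in benchmark)

# Two candidates with identical skill, differing only in voice pitch.
candidate_a = {"skill": 0.85, "speech_pitch": 0.22}  # resembles the sample
candidate_b = {"skill": 0.85, "speech_pitch": 0.80}  # differs only in pitch

print(score(candidate_a) < score(candidate_b))
# → True: pitch alone decides the ranking
```

No one wrote "prefer low-pitched voices" into the code; the preference is inherited silently from the historical sample, which is precisely why such systems can reflect human prejudice while appearing objective.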
Under the EU General Data Protection Regulation, job applicants have the right not to be subject to a hiring decision based solely on automated processing that produces legal effects concerning them. Any such hiring decision may therefore be contested.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.