In our previous articles on digital compliance, we explained the functionality and various uses of software-based compliance tools. Since the use of these tools also involves the processing of extensive personal data of a company's own employees and customers, it is imperative that the use of such tools complies with data protection requirements.

Principle of data protection law: prohibition with reservation of permission

When a compliance tool evaluates internal processes and checks them for possible violations, personal data of natural persons (employees, customers and other third parties) are regularly included in the analysis.

The first step is to analyse which personal data the compliance tool could access. Once it has been established which personal data could be processed, one needs to check whether a permission under data protection law exists for the processing. The GDPR stipulates that data processing is generally prohibited unless a specific permission criterion is met.

In particular, the legitimate interests of the controlling enterprise come into consideration as a permission criterion (Art. 6 (1) (f) GDPR). However, in order to invoke this permission criterion, the opposing interests of both sides first have to be weighed against each other and this assessment documented.

For the processing of employee data, the permission criterion in Section 26 (1) sentence 2 German Federal Data Protection Act [Bundesdatenschutzgesetz, BDSG] can also apply in individual cases, namely when the data processing is necessary to uncover criminal offences committed by employees. This requires documented factual indications giving rise to the suspicion that the data subject has committed a criminal offence in the employment relationship, and the employee must not have a prevailing legitimate interest in the exclusion of the processing; in particular, the type and scope of the processing must not be disproportionate to the grounds for it.

Involvement of the works council

In addition to the data protection law permission criteria, the works council's right of co-determination must also be taken into account. Pursuant to Section 87 (1) No. 6 of the German Works Constitution Act [Betriebsverfassungsgesetz, BetrVG], the works council has a right of co-determination in the introduction or use of technical equipment intended to monitor the behaviour or performance of employees. Many compliance tools are likely to meet these requirements, which means that the involvement of the works council must in any event be examined before their introduction.

Concluding a works agreement with the works council on the introduction of a compliance tool has a further advantage: in addition to fulfilling the obligation under Section 87 (1) No. 6 BetrVG, the works agreement itself can serve as a permission criterion under data protection law for the processing of employees' personal data. However, one must bear in mind that a works agreement cannot constitute permission to process data of third parties, such as business partners.

Principle of data minimisation (privacy by design)

Compliance tools would work best if they were able to process as much data as possible from as many sources as possible. However, with the principle of data minimisation (Art. 5 (1) (c) GDPR), the GDPR requires that a technical system for data processing be designed in such a way that it can fulfil its function with as little personal data as possible ("privacy by design"). The decisive criterion for the scope of the data to be processed is the purpose of the processing: on the basis of the specific purpose, one must determine which personal data are strictly necessary to achieve it. Technical measures such as anonymisation, pseudonymisation or access restrictions can be used to this end. However, the hurdles for anonymisation and pseudonymisation are high: data only count as anonymised if the data subject can no longer be identified with a justifiable effort in terms of time, cost and manpower on the basis of the data held in the company's entire database, or possibly even in the database of a third party.
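By way of illustration, pseudonymisation can be implemented by replacing direct identifiers with keyed hashes before the records reach the compliance tool, while fields that are not needed for the stated purpose are dropped altogether. The following minimal Python sketch shows this idea under assumptions of our own; the field names, the record structure and the key handling are hypothetical and are not taken from any particular tool:

```python
import hmac
import hashlib

# Hypothetical preprocessing step: replace direct identifiers with keyed
# hashes (pseudonyms) before records are passed to a compliance tool.
# The secret key should be stored separately (e.g. in a key management
# system) so that the tool alone cannot reverse the mapping.
SECRET_KEY = b"replace-with-key-from-a-key-management-system"

def pseudonymise(value: str) -> str:
    """Return a stable, keyed pseudonym for a direct identifier."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def minimise_record(record: dict) -> dict:
    """Keep only the fields needed for the stated compliance purpose
    and pseudonymise the remaining direct identifier."""
    return {
        "employee_pseudonym": pseudonymise(record["employee_id"]),
        "transaction_amount": record["transaction_amount"],
        "approval_level": record["approval_level"],
        # Fields such as name or private contact details are deliberately
        # dropped (data minimisation).
    }

if __name__ == "__main__":
    raw = {
        "employee_id": "E-4711",
        "name": "Jane Doe",
        "transaction_amount": 12500,
        "approval_level": 1,
    }
    print(minimise_record(raw))
```

Note that keyed hashing of this kind is pseudonymisation rather than anonymisation: as long as the key (or the original data) exists, re-identification remains possible, so the records remain personal data within the meaning of the GDPR.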

Principle of transparency

The principle of transparency is intended to ensure that data subjects can always know who knows what about them, when and in what context. Accordingly, data subjects (employees, business partners and, where applicable, other third parties) need to be informed about the use of the compliance tool and the associated data processing (Art. 13, 14 GDPR). Data subjects are also entitled to request information at any time about the processing carried out with the tool (Art. 15 GDPR); such a request must be answered by the controlling enterprise within one month.
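As a purely illustrative sketch of how the one-month response deadline might be operationalised, the following Python snippet keeps a hypothetical register of access requests and computes a calendar-month due date; the request IDs and the month-end clamping rule are our own assumptions, not a statement of how the deadline must be calculated in every case:

```python
import calendar
from datetime import date

def one_month_after(received: date) -> date:
    """Add one calendar month, clamping to the last day of the target month
    (e.g. a request received on 31 January is treated as due by 28/29 February)."""
    year = received.year + (received.month // 12)
    month = received.month % 12 + 1
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

# Hypothetical register of access requests concerning the compliance tool.
requests = [
    {"request_id": "DSAR-001", "received": date(2024, 1, 31)},
    {"request_id": "DSAR-002", "received": date(2024, 3, 5)},
]

for req in requests:
    print(req["request_id"], "response due by", one_month_after(req["received"]))
```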

Compliance tools should not make automated decisions

Article 22 (1) GDPR prohibits decisions based "solely" on automated processing that produce "legal effects" concerning an individual or "similarly significantly" affect an individual. This fundamentally also applies to decisions made with the help of compliance tools. A significant effect can be affirmed, for example, when a job applicant is not hired or a notice of termination is issued. A decision is based "solely" on automated processing if there is no relevant human involvement in the decision-making process; if the automated process only leads to a recommendation and a human considers other factors before making the final decision, the decision is not based "solely" on automated processing.

Accordingly, before implementing compliance tools, companies should consider whether the tool's "decisions" might have a significant effect on employees or other third parties. If so, companies should ensure that a human is involved in the final decision and should document the extent of that human involvement.
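As a simplified illustration of how "recommendation only" output and documented human involvement might be kept apart in practice, the sketch below assumes a hypothetical scoring threshold and review record of our own; it does not describe any particular tool:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Recommendation:
    """Output of the automated check: a recommendation only, never a final decision."""
    subject_pseudonym: str
    risk_score: float
    suggested_action: str

@dataclass
class FinalDecision:
    """Final decision taken by a human reviewer, with the involvement documented."""
    recommendation: Recommendation
    reviewer: str
    reviewer_rationale: str  # factors considered beyond the automated output
    decision: str
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def automated_check(subject_pseudonym: str, risk_score: float) -> Recommendation:
    # Hypothetical rule: above a threshold the tool only *suggests* an escalation.
    action = "escalate_to_compliance_officer" if risk_score >= 0.8 else "no_action"
    return Recommendation(subject_pseudonym, risk_score, action)

def human_review(rec: Recommendation, reviewer: str, rationale: str, decision: str) -> FinalDecision:
    # The human reviewer weighs additional factors and records them, so the
    # decision is demonstrably not based solely on automated processing.
    return FinalDecision(rec, reviewer, rationale, decision)

if __name__ == "__main__":
    rec = automated_check("employee-pseudonym-123", risk_score=0.85)
    final = human_review(
        rec,
        reviewer="compliance.officer@example.com",
        rationale="Interviewed line manager; transaction covered by an approved exception.",
        decision="no_action",
    )
    print(final)
```

Keeping the human review step and its rationale in a separate, persistent record of this kind is one way of evidencing the extent of human involvement that the company should document.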

Is the tool provider a processor or a controller in its own right?

Before introducing a compliance tool, one must check whether the provider of the tool could access the controlling company's personal data; this is usually the case with cloud-based operation. If such access is possible, the provider's role under data protection law needs to be determined. If the provider processes the data solely on the instructions and for the purposes of the controlling company, the provider is a processor within the meaning of data protection law. In this case, a data processing agreement containing the provisions specified in Art. 28 (3) GDPR must be concluded with the provider.

If the provider also wants to process the data for its own purposes (e.g. for testing and further development of its own tool), one needs to check whether the company using the tool and the provider are to be regarded as joint controllers (Art. 26 GDPR).

Examination of the necessity of a data protection impact assessment

In some cases, the use of a compliance tool might require a data protection impact assessment pursuant to Art. 35 (1) GDPR. This is the case, especially when new technologies are used, if the data processing is likely to result in a high risk to the rights and freedoms of natural persons due to the nature, scope, circumstances and purposes of the processing. Whether a data protection impact assessment is necessary in an individual case depends in particular on the specific compliance tool as well as on the personal data to be processed.

A data protection impact assessment must be carried out before the start of the processing operations in question. Especially when using new technologies, in particular AI tools, companies should therefore always check the necessity of a data protection impact assessment before introducing them and must document the outcome of the check.

Consideration of the forthcoming AI Regulation

The European legislator is currently working on harmonised rules for artificial intelligence (the so-called AI Regulation), which is expected to enter into force in mid-2024. The current draft envisages further obligations for providers and users of artificial intelligence systems. We therefore recommend taking the ongoing development of the AI Regulation into consideration before introducing compliance software containing AI components.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.