In the previous three articles of our "Digital Compliance" series, we explained the benefits and risks of using software-based compliance tools from the perspectives of data protection law and sanctions compliance. In this article, we look at the benefits and risks of such tools from a labour law perspective.

In practice, when a whistleblower reports a compliance violation – an occurrence that has increased noticeably since the introduction of the German Whistleblower Protection Act [Hinweisgeberschutzgesetz, HinSchG] and the ensuing establishment of internal reporting offices – employers are regularly obliged to initiate an internal investigation. The extensive investigative work can be carried out far more efficiently and quickly using software-based applications, for example by searching e-mails for specific keywords.

In addition to the questions of admissibility under data protection law already discussed in our third article, this gives rise to labour law questions, in particular as to whether such investigative measures are subject to co-determination.

Co-determination rights of the works council

If special software is used to view and analyse data as part of an internal investigation, the works council has a right of co-determination pursuant to Section 87 (1) No. 6 of the German Works Constitution Act [Betriebsverfassungsgesetz, BetrVG], which covers the introduction and use of technical equipment intended to monitor the behaviour or performance of employees. The mere possibility of monitoring performance and behaviour is sufficient to trigger co-determination; whether the employer actually intends to make use of that possibility is irrelevant.

Further participation rights of the works council exist with regard to the fundamental use of AI as a work resource (Section 90 (1) No. 3 BetrVG) and the influence of AI on the conduct of employees in the establishment (Section 87 (1) No. 1 BetrVG). Furthermore, employers using artificial intelligence should be aware that the works council is entitled to commission an expert without having to demonstrate that such involvement is necessary (Section 80 (3) BetrVG). In addition, Section 95 (2a) BetrVG requires the works council's consent where the employer uses AI to draw up selection guidelines.

It is therefore advisable to involve the relevant works council at an early stage when planning the use of such tools in order to implement the digital transformation of work quickly and effectively. Please note: many existing standard programmes will be supplemented with AI functions in the near future. We therefore strongly recommend reviewing co-determination requirements from a compliance perspective.

Limits set by anti-discrimination law

The German General Equal Treatment Act [Allgemeines Gleichbehandlungsgesetz, AGG] also sets limits on the use of digital compliance tools: not only people but also algorithms must not discriminate in their decisions.

Discrimination is possible, for example, where compliance tools are used in a job application process, or where a self-learning algorithm learns from past data to assume that certain offences are committed more frequently by certain groups of employees.

In the event of discrimination, the person affected is entitled to compensation for material and immaterial damage. In addition, a discriminatory measure taken by the employer is null and void pursuant to Section 134 of the German Civil Code [Bürgerliches Gesetzbuch, BGB] in conjunction with Section 7 AGG.

Outlook

Digital compliance tools offer a wide range of possibilities for modernising working life, but employers must observe the current legal limits of data protection, co-determination and anti-discrimination law when using them. The issues surrounding the use of digital compliance tools will remain a hot legal topic. As already mentioned in the previous articles of this series, in June 2023 the members of the European Parliament agreed on a joint negotiating position on the AI Act, a regulation intended to govern the use of artificial intelligence according to its risks. Above all, so-called "high-risk AI systems" will be subject to special requirements and conformity assessment procedures.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.