This overview highlights selected potential Polish law issues related to liability for damage caused by machine learning algorithms (MLA) which an entrepreneur operating, buying or selling an MLA-based business may face and should consider. Needless to say, the civil law liability touched upon here is not the only liability regime to be borne in mind. For the sake of clarity, however, matters related to sector-specific regulation (e.g. for medical devices), intellectual property rights, the GDPR and criminal liability are not addressed below, although they are equally important for MLA-based businesses.
All these terms...
Machine learning algorithms, artificial intelligence, robots – all these buzzwords reflect the real and growing importance of machines able to make autonomous decisions. There are plenty of competing definitions of these terms (drawing on mathematical as well as psychological or philosophical concepts). This overview focuses on so-called reinforcement learning algorithms, i.e. algorithms that collect data without the constant input of an operator and autonomously make decisions (affecting their surroundings) based on a balance between potential risks and rewards. These algorithms also act without human supervision.
Where we are
The legal aspects of the development and use of MLA are discussed at the international (e.g. at the forum of UN agencies), European Union1 and national levels. At the time of writing this overview (September 2018), no regulation addressing the specific concerns raised about the civil law liability or accountability of MLA (or, more broadly, algorithms and robots) has been implemented or proposed, save for various proposals for codes of ethics that would apply to developers and users/beneficiaries of MLA.2
The lack of specific regulation does not mean that there is no regulation at all. On the contrary, the Polish Civil Code provides a wide range of liability regimes that may be applied to MLA, such as liability for a dangerous product, tort liability (based on the risk principle) or contractual liability based on the agreed contractual terms.
In some circumstances it is debatable which liability regime applies. For instance, it is debated whether and when an MLA constitutes a "product" within the meaning of the provisions on liability for dangerous products,3 depending on the specific features of the given MLA and the circumstances of the damage. But such doubts are not unique to MLA: the same issues may arise in cases involving "traditional" machines.
What is specific to MLA, as opposed to fully human-controlled machines, is their complexity, unpredictability and scale. In other words, it would be difficult in many cases to determine and apply the proper standard of due diligence in designing and using MLA, because the due diligence needed in creating autonomous systems is hard to imagine in detail upfront and not all risks can be foreseen.4 For these reasons, and to address public fears related to the lack of control over MLA, strict liability rules, e.g. based on the risk principle (making it difficult for the accountable user or developer of the MLA to release itself from liability), will likely be applied.
Sometimes, even identifying the developer, user or beneficiary of the MLA's operations will be difficult, if possible at all.5
Where we are heading
The solution is not yet clear. Legislation may basically go in one of two opposite directions:
- the model of liability which now applies to traditional machines will merely be modified. For instance, MLA developers and users will be obliged to follow specific rules of conduct and due diligence, but generally some level of innovation risk will be accepted. Additionally, individuals affected/harmed by the operation of MLA will be able to pursue claims for damages relatively easily, but on the other hand, the liable person/entity developing or using MLA will enjoy viable defences; or
- governments will create an authorisation-like system for employing MLA. For example, to apply an MLA, an entrepreneur will first need to seek governmental approval, or at least take out a high-premium insurance policy, before the given MLA is implemented. Moreover, developers or users will be subject to very strict liability rules (in the worst-case scenario, they will be practically unable to release themselves from liability).6
The adopted approach will most likely lie in between these extremes.
What should be analysed when assessing liability risks in the short and medium term?
When assessing the liability risk of MLA operation (e.g. when drafting a contract for the purchase or sale of a business developing or using MLA), it is worth keeping up to date with the legislation. In particular, stakeholders should bear in mind that:
- the development and implementation or commercialisation of MLA should – at least until specific legislation is implemented – follow the rules currently in force for the implementation of other high-risk products to the furthest possible extent (and with an even higher standard of care in mind). This means that strict liability for any personal injuries inflicted as a result of MLA operation should be borne in mind;
- as long as liability for MLA development and operation is not specifically regulated in Polish law, the legal provisions applicable to standard machinery should be applied, even if there are doubts about whether they are binding in this context;
- due to the complexity and unpredictability of MLA, it is virtually impossible to adopt a one-size-fits-all approach;
- the far-reaching effects of MLA and the lack of control over them can multiply bias embedded in MLA by developers, and hence the damage and the amount of compensation payable;
- the absence of a common approach among potential legislators makes it difficult to predict how the laws affecting MLA will develop, i.e. whether they will be permissive or restrictive. In any case, a company developing or using MLA may be required to disclose the algorithm and perhaps also explain how it makes decisions. This will limit the company's competitive edge, as other companies may try to apply similar MLA (regardless of whether this constitutes a breach of IP laws). Sometimes it will not be possible to easily explain how the "black box" built into the MLA works;
- ongoing changes in sector regulation should be monitored, for example, those concerning road safety if an MLA is driving a car.
1 See, for instance, the European Parliament resolution of 16 February 2017 on civil law rules on robotics with recommendations to the European Commission. See: www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P8-TA-20170051+0+DOC+XML+V0//EN
2 The European Commission has set up an expert group on liability and new technologies that will help prepare guidance for a new EU directive addressing the liability aspects of the creation and use of algorithms and robots. The aim is to issue such guidance by mid-2019. The first meeting of the expert group took place in June 2018. See: ec.europa.eu/growth/single-market/goods/free-movement-sectors/liability-defective-products_en – read on 12 September 2018
3 At the EU legislation level, those provisions are set forth mainly in Directive 85/374/EEC on liability for defective products.
4 That is why there are a number of ongoing projects aimed at establishing basic rules of conduct for MLA developers and users.
5 There are many more potential problems discussed in legal doctrine, for instance, what rules apply if an MLA "acts" in virtual reality only or "contracts" with other MLAs.
6 In October 2017, the European Parliament Research Centre published a first (preliminary) report on the public consultation on robotics and AI (dated 13 July 2017). 90% of the participants opted for regulation in that area and were mainly concerned about industry abuse and data protection issues, while 74% were concerned about liability rules. See: www.europarl.europa.eu/committees/en/juri/newsletters.html?id=20171005CNW05623&fhch=2fa74fd0eeb79c2b4680133f79adde80
This article was up to date as at the date of going to press on 10 December 2018.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.