Malta: AI's Responsibility Gap

Last Updated: 10 October 2019
Article by Michael Camilleri

An autonomous vehicle is heading towards another vehicle with five people inside. It cannot brake in time. It has two options: do nothing and crash into the oncoming vehicle, or divert its trajectory and crash into a wall, almost certainly killing its sole occupant. This is the Artificial Intelligence version of the 'Trolley Problem' - at its core a philosophical and ethical problem which has been tackled from a number of perspectives, not least the utilitarian approach.
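The utilitarian calculus underlying this dilemma can be made concrete. The Python sketch below is purely illustrative - the scenario, occupancy figures and cost function are invented assumptions for this article, not any real AV's control logic:

```python
# Hypothetical utilitarian decision rule for the AV trolley problem.
# Each available action maps to the expected fatalities if taken.
def choose_action(expected_fatalities):
    """Return the action whose expected fatality count is lowest."""
    return min(expected_fatalities, key=expected_fatalities.get)

# Scenario from the text: braking is impossible; continuing hits a
# vehicle with five occupants, swerving kills the AV's sole occupant.
scenario = {
    "continue": 5,  # collide with the oncoming vehicle
    "swerve": 1,    # crash into the wall
}

print(choose_action(scenario))  # a strict utilitarian picks "swerve"
```

The legal difficulty discussed below is precisely that a court must assign responsibility for whichever outcome such a rule selects.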

This moral question, which previously troubled only the mind, now troubles us in an actual, physical and even legal way - in the form of 'embodied AI'. From a legal point of view, the question is: who would be responsible in such a scenario - the AI, the computer programmers, the manufacturers, the vendors or the vehicle's occupant? The reality is that a court faced with an accident involving an AI-powered machine would not revert to philosophical arguments, but would be confined to enforcing the letter of the law - primarily laws equipped to assign responsibility to a human for malicious or negligent acts or omissions. Keeping the status quo would therefore be dangerous, since judges would be confined to black-letter law even where it produced unjust results.

Hence the responsibility gap. Indeed, probably the single largest hurdle before the mass release of AI-powered machines is that of assigning responsibility in the case of an accident. Malta's non-contractual liability regime (tort law) is dictated by fault. The gist of the responsibility problem is that with AI-powered machines - for example, fully autonomous vehicles operating without human oversight - any given accident can hardly be said to be the fault of the owner or the human 'driver', who is now more akin to a passenger or bystander.

As one of the most pervasive, and indeed one of the most immediate, AI-related technologies, autonomous vehicles ('AVs') shall be taken as the main case study for these purposes. AVs combine complex software, in the form of AI, with a corporeal presence which can have an actual physical impact on the world - as opposed to, for example, software which exists only in the virtual world. AVs thus represent what is broadly referred to as embodied AI or, in other words, a robot. A widely accepted definition of a robot is that contained in Mataric's The Robotics Primer, that is: "A robot is an autonomous system which exists in the physical world, can sense its environment, and can act on it to achieve some goals."1 The key terms to take out of this definition are that a robot:

  • is autonomous, broadly meaning that it can act independently of a human;
  • exists in and interacts with the physical world, and is therefore also limited to what is mechanically possible;
  • can sense its environment, meaning that it has sensors (such as vision or sound) through which it can collect data from its surroundings;
  • can act to achieve goals, meaning that it can have a physical impact on the world.
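These four features can be sketched as a minimal sense-plan-act loop. The snippet below is a schematic illustration only - the sensor reading, the safe-distance figure and the braking goal are invented for this example, not any real robot's architecture:

```python
# Schematic sense-plan-act loop mirroring Mataric's definition:
# the robot senses its environment, decides autonomously, and acts
# physically on the world in pursuit of a goal (avoiding a collision).
def sense(environment):
    """Collect data from the surroundings (here, distance to an obstacle)."""
    return environment["obstacle_distance_m"]

def plan(distance, safe_distance_m=5.0):
    """Decide autonomously: brake if an obstacle is too close."""
    return "brake" if distance < safe_distance_m else "drive"

def act(action, environment):
    """Have a physical effect on the world (here, change the speed)."""
    if action == "brake":
        environment["speed"] = 0.0
    return environment

env = {"obstacle_distance_m": 3.0, "speed": 50.0}
env = act(plan(sense(env)), env)
print(env["speed"])  # the robot has braked: 0.0
```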

The above features, combined with AI's ability to adapt to or learn from its surroundings through machine learning algorithms - including from its trials as well as its errors - are the features which most challenge existing legal systems. We are therefore faced with two options: do nothing, keep the status quo and leave the assignment of responsibility/liability in the hands of the courts; or pre-emptively prepare for the mass release of AVs and similar AI technologies through regulation.

Keeping the Status Quo

Is the current legal system equipped to deal with the vast potential impact on responsibility regimes that AI promises to bring? What would happen under the current regime, and which laws would take effect, if a robot were to cause an accident in Malta? All these are questions which need answering in the interest of industry players as well as of the public at large.

The extent to which a party can be found responsible is dictated by the applicable law of choice, that is, for our purposes: contract law, tort law and/or product liability law, which often go hand-in-hand but which may have different consequences for interested parties. 2

In particular, if a party were a victim of an accident involving an AV, this would provoke the following questions of the current legal system:

  • Who is liable for the damages suffered, i.e. Who do I sue?
  • How can I prove that the defendant/s caused the damages suffered, i.e. How can I prove it?

1. Who do I sue?

Can the robot itself be found responsible?

The starting point in assigning responsibility would be to look at the provisions of tort law. However, a glance at the provisions on responsibility in tort immediately suggests that they were drafted with a natural person in mind. For example, Article 1031 of the Civil Code states that "Every person...shall be liable for the damage which occurs through his fault." In other words, every person has a duty of care: to make use of his rights within the proper limits. A product is not a person (so far), and to say that an AI, e.g. an AV, has a duty of care (even if granted legal personality) would be to stretch current interpretations too far. It is therefore clear from the start that fault cannot, at least under the current legal framework, be attributed to the robot itself. Whether we should grant legal personality to robots is another matter altogether.

Could the owners of the car be found responsible?

It could be argued that, by extension of their duty of care, human drivers or owners of autonomous vehicles may bear vicarious liability for such vehicles (under tort law), just as the owner of an animal may, by virtue of their responsibility to the public for something which should be within their control. It would, however, be unreasonable and unwise to attribute such a form of strict or objective liability to drivers or owners of AVs if these are to be adopted on a widespread basis. Firstly, what use are AVs if drivers/owners are required to keep a constant watchful eye on the driving of the vehicle? Secondly, it is probably easier to train a dog than an AV, and the average consumer cannot be expected to understand the workings of an AV, so how can they be expected to care for it? Finally, it is arguable that few would be willing to risk buying AVs if vicarious liability attached to them. It is thus clear, at least to this author, that vicarious liability is not a viable option.

Could the vendor or the producer/manufacturer be made responsible?

The Vendor

When it comes to attribution of liability to the vendor of an AV one may primarily resort to contract law. If we take a B2C scenario, the vendor (e.g. an auto dealer) is bound to carry out two principal obligations, that is to deliver and warrant the product that is sold (Article 1378, Civil Code). In the case of AVs, the thing we are most concerned with is the latter, the warranty against latent defects. This warranty is covered by Article 1424 of the Civil Code (a similar provision exists in the Consumer Affairs Act providing for a remedy in cases of a 'lack of conformity'), describing latent defects as defects which exist at the time the contract was made and which:

  • "render [the product] unfit for the use for which it is intended"; or
  • "diminish its value to such an extent that the buyer would not have bought it or would have tendered a smaller price, if [the buyer] had been aware of them."

In parallel to this, it is necessary to refer to the main features of robots, particularly their ability to learn, fuelled by the machine learning algorithms present in most AI systems. Considering that AVs learn as they drive on the road, how can a vendor be made to answer for latent defects (or a lack of conformity) which AVs develop through learning AFTER the contract was concluded? This question alone complicates the task of attributing responsibility to the vendor under contract law.
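The timing problem can be illustrated with a toy online-learning update. In this hypothetical sketch (the braking-threshold parameter, learning rate and road data are all invented for illustration), the vehicle's behaviour at the moment of sale differs from its behaviour after post-sale updates - so any 'defect' in the later behaviour arguably did not exist at the time the contract was made:

```python
# Toy illustration of a parameter that keeps adapting after sale.
# The braking threshold is nudged toward each observed stopping
# distance, so behaviour at sale time differs from behaviour later.
def online_update(threshold, observation, learning_rate=0.3):
    """Move the threshold a fraction of the way toward the observation."""
    return threshold + learning_rate * (observation - threshold)

threshold_at_sale = 5.0  # metres; the behaviour when the contract is made
threshold = threshold_at_sale
for observed in [8.0, 9.0, 10.0]:  # post-sale driving data
    threshold = online_update(threshold, observed)

# The product sold is no longer the product on the road.
print(threshold > threshold_at_sale)  # True
```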

The Producer/Manufacturer

In a claim for damages caused by a product, interested parties may, besides pursuing an action against the actual front-end vendor of the product concerned, pursue the producer or manufacturer instead - this is made possible by product liability law. The Consumer Affairs Act states that the 'producer' or 'manufacturer' of a product (including manufacturers/producers of the whole, e.g. the chassis of the car, or of a part, e.g. the wheels or software of an AV) shall be liable for damages caused by defective products which they produce or manufacture.

It has been established that it would be unreasonable/impossible to attribute responsibility for damages in an accident caused by an AV to consumers/drivers. This means that the only course of action would be to sue those who are responsible for the development and manufacture of the car, its parts or its software.

However even if, by elimination of other parties, it is the producers or manufacturers of AVs who must be held responsible for an accident caused by AI, how can I prove that they are responsible for the damages suffered?

2. How can I prove it?

Product liability law to the rescue?

Product liability law seeks to establish an equilibrium between the expertise and financial power of the producer/manufacturer and that of the consumer, inter alia by placing strict liability on the producer/manufacturer. Moreover, it protects the interests of consumers by demanding that the products sold to them are safe and conform to their description. Just as you would not expect your microwave to explode upon use, neither would you expect your AV to crash into a wall.

Due to the strict liability introduced by product liability law, injured parties must merely prove:

(a) that damage actually occurred;

(b) that the product was defective; and

(c) that a causal relationship existed between the defect (b) and the resulting damage (a).

It is asserted that an accident caused by an AV will in itself constitute first-hand evidence that damage has occurred and that the product was defective. The only task that remains, therefore, is to prove the causal relationship (c) between (a) and (b). To do so, one would need to prove, inter alia, that the producer/manufacturer failed to properly warn its customers of the possible dangers of the vehicle, or that the AV's safety systems were not up to scratch. An analogous situation arises in aviation, where operation is now a highly automated affair: it has been reported that the lack of manual intervention mechanisms in certain aircraft was a cause of the recent fatal plane crashes in Ethiopia and Indonesia. Fingers are thus being pointed at the aircraft manufacturers for the damage caused.

In light of the cutting-edge technology at the core of our discussion, it is probable that manufacturers/producers faced with a lawsuit will invoke the 'state of the art' defence: they will likely argue that the damage caused by the AV was unforeseeable, in that the scientific and technical knowledge at the time the product was put into circulation did not enable the discovery of the defect which led to the accident.

This does not even take into consideration that the perceived 'defect' may arguably not be a defect at all. The AI might very well have made an informed decision and consciously (a word with implications that veer beyond the scope of this paper) chosen one course of action over another. To return to the example mentioned above, the AI might have decided that crashing into the wall would result in fewer fatalities than allowing the oncoming vehicle to collide with it. Can the heirs or dependants of the deceased passenger successfully argue that there was a 'defect' here? Would the manufacturer not be able to claim that the AI did what it was programmed (or taught itself) to do?

The opacity problem

It has already been highlighted above that AI's ability to learn makes it an arduous task to prove that a vendor is responsible for an allegedly defective AV. In fact, the machine learning algorithms fuelling that ability to learn are sometimes referred to as 'black box algorithms'. In other words, the logic behind the decisions taken by AI systems such as AVs is either intentionally opaque, in that it is protected by trade secrets or intellectual property rights, or unintentionally opaque, in that the code behind such AI systems is so complex that even experts in computer programming are unable to interpret the logic behind its decisions. The latter was the case in a Facebook experiment, where two AI-powered machines were shut down after they started talking to each other in their own invented language, which not even their own developers could understand.
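Unintentional opacity is easy to demonstrate even at toy scale. In the sketch below (the weight values are arbitrary numbers chosen for illustration, not a trained system), a small neural network's decision is nothing but arithmetic over its weights - and inspecting those numbers tells a human nothing about why a given input produces a given outcome:

```python
import math

# A tiny two-layer neural network. Its "logic" consists entirely of
# these weight values - they determine every decision, yet carry no
# human-readable reason for any of them.
W1 = [[0.9, -1.2, 0.4], [-0.7, 0.8, 1.1]]   # illustrative hidden-layer weights
W2 = [1.3, -0.6, 0.9]                        # illustrative output weights

def decide(x1, x2):
    """Map two sensor inputs to a driving decision via opaque arithmetic."""
    hidden = [math.tanh(x1 * W1[0][j] + x2 * W1[1][j]) for j in range(3)]
    score = sum(h * w for h, w in zip(hidden, W2))
    return "swerve" if score > 0 else "continue"

# The output is fully determined, but not explained, by W1 and W2.
print(decide(0.5, -0.2))
```

Scaled up to the millions of parameters in a real AV's perception stack, this is the black box a litigant would have to peer inside.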

As long as the opacity problem persists, and considering the sheer impossibility of peering inside the black box, for any party to prove that the defect was:

(a) discernible to the consumer; and/or

(b) present at the time of conclusion of the contract

could prove to be an insurmountable task.

Logistical problems

Even though product liability law may present an option at law to bridge the responsibility gap, instituting a case against producers/manufacturers presents several financial and logistical problems, in addition to the conceptual issues outlined in the preceding section.

One must keep in mind that the manufacture of ordinary cars is already a highly technical affair, often requiring a wide array of know-how and the involvement of numerous experts in different fields. The specialist nature of AVs means that, besides the need to involve manufacturers of engines, wheels, chassis and so on, producing such a vehicle also requires expertise in AI and robotics. In turn, the development of AI may rely on, or be the end-product of, countless pieces of software from different specialists all around the world.

Under the current framework, bringing evidence in such a lawsuit would be a costly and time-consuming affair for all parties, especially considering the likely need for court experts versed in AI technology and digital evidence, who may need to be flown in from all over the world.

Finally, owing to the 'who do I sue?' dilemma outlined above, how does one even determine which witnesses to call in a suit? What about the AI itself? If the future holds legal principles that may assign liability to AI by way of a legal fiction yet to be determined (similar to what is done with companies), would the AI have any 'rights'? Could the AI (or at least logs of the AI's thought process) be 'heard' by a court?

Conclusion

Considering all the above, although the current framework of product liability law could, in principle, be equipped to deal with AI systems, at a practical level there are too many uncertainties for products such as AVs to be rolled out to the public en masse. It is clear that some sort of action or strategy is needed. It would be reckless to allow the courts to make judgement calls on such delicate matters, involving moral dilemmas such as the AI equivalent of the 'trolley problem'. It is submitted that keeping the status quo would also be detrimental to the development of the market. A vague legal framework does not afford industry players any certainty - manufacturers will not be willing to release their products to the public, and the public will not trust such products enough to use them.

There have been many proposals over the last few years to bridge the responsibility gap. Some proposals, such as the attribution of legal personality to AI, have been criticised as far-fetched or even morally untenable. Others argue that our law of tort is already equipped to deal with damage caused by AI. Yet, as highlighted above, considering the lacunae which AI-powered technology uncovers, inaction may be just as harmful.

Owing to its size, geographical position and weather, Malta offers an ideal landscape to act as an AI test bed, potentially allowing it to become a first mover and innovator in the industry. There is no need to go overboard with extensive changes and legalisms. Perhaps the best solution is a mix of the current arsenal of laws combined with the adaptation of some of the current tools, such as insurance, tools which are to a certain extent already equipped with the ability to soften the blow of potential damages.

This piece merely exposes the responsibility gap. The best path towards bridging the gap requires another discussion altogether.

Footnotes

1. M J Mataric, The Robotics Primer (MIT Press, 2007), chapter 2.

2. For a comprehensive overview of Maltese laws applicable to AI, see: Micallef, TL., 'Civil responsibility for damage caused by artificial intelligence' (University of Malta, 2016).

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
