R2-D2 and I, Robot are no longer just predictions of our future – we are preparing for a world where complex and self-learning robots will be commonplace in our daily lives. Robots will be very different from our tools of the past. Artificial intelligence will equip them to learn from us (and about us), interact with us and make decisions independently of our direction.

A legal framework for robots now being considered by the European Parliament provides a timely insight into some of the interesting legal issues that will arise – issues that Australian lawmakers will soon need to grapple with.

WHY ARE WE TALKING ABOUT ROBOTS?

The science fiction concept of a 'robot' has been around for a long time.1 Technological advances are turning these concepts into reality, giving rise to a wide range of increasingly sophisticated robots.

When we talk about robots, we are really talking about products that are equipped with a form of artificial intelligence or 'AI', enabling them to learn, communicate with each other and make decisions autonomously without our direction or control. Driverless cars are an excellent example of this technology. Road rules around Australia and the world are being reviewed to accommodate this new reality.

However, driverless cars are just one example. Our future life with robots will be characterised by two important qualities:

  • robots will be much more commonplace and varied (everything from everyday appliances and driverless cars to humanoid robots and drones), leading to more human-robot interactions and therefore more reliance by humans on the safe and proper functioning of those robots; and
  • robots will use sophisticated artificial intelligence systems, enabling them to learn and operate autonomously or semi-autonomously.

Robots of the future will be very different from current consumer technology, which is almost entirely subject to human direction and control. In addition, the spectrum of robot autonomy (from semi-autonomous to fully autonomous) blurs the line as to whether the robot, its operator or its creator is the cause of loss or damage.

The ability of a 'product' to learn and make decisions without human control is a fundamental technological shift that raises a number of new and interesting legal issues.

THIS IS HAPPENING NOW

To give you an idea of how seriously some governments are taking these issues, in January this year the European Parliament's Legal Affairs Committee called on the EU Commission (Commission) to put forward a legislative proposal for a robust framework covering robotics-related issues.

In the accompanying press release, Members of the European Parliament stressed that any legislative framework should "fully exploit the economic potential of robotics and artificial intelligence and guarantee a standard level of safety and security."

The 'Report with recommendations to the Commission on Civil Law Rules on Robotics' was tabled in January 2017, and a motion for a resolution was adopted on 16 February 2017 (EU Resolution).2 EU citizens and organisations can contribute to public consultation on the issues raised in the Report until 30 April 2017.

The EU Resolution raises a wide range of legal issues for consideration, including:

  • liability;
  • general principles of development of robotics;
  • research and innovation;
  • ethics;
  • centralised agencies;
  • intellectual property rights and data flows;
  • standardisation, safety and security;
  • autonomous means of transport (including vehicles and drones);
  • care and medical robots;
  • repair and replacement of the human body ('intelligent implants');
  • education and employment;
  • environmental impact; and
  • international aspects.

We discuss some of these below.

LIABILITY (AND ETHICS)

Will your driverless car decide to save you or the pedestrian in its path? Who is writing the code that determines this, when we know that the individual biases of coders can be 'passed on' to AI? And who is liable for the outcome in this scenario – the car owner, or the manufacturer? What about insurance?

These are some of the questions that are already arising from the use of autonomous robots. In the driverless car context, perhaps the key issue is how we can determine what the algorithm was 'thinking' at the time of an accident, given AI is self-learning and develops its own strategies.

Apart from manufacturers who take a proactive approach,3 the legal position isn't so clear.

On this issue, the EU Resolution suggests consideration of a mandatory insurance scheme. Such a scheme may have parallels with the compulsory insurance scheme for motor vehicles used in Australia and many other countries.

The rationale behind such a scheme is to provide certainty that a person is able to obtain compensation for damage or injury where there is either nobody at fault, fault cannot be determined, or fault lies with a non-legal entity – such as a robot.

If the Australian Consumer Law applies to the supply of a robot, the liability of its manufacturer for injury or death caused by the robot is currently based on the imposition of strict liability coupled with the 'state of the art' defence. Manufacturers can escape liability if they can demonstrate that the defect in their product could not have been discovered because of limitations in the state of scientific knowledge at the time the goods were supplied. Australia's approach to liability for products with safety defects is based on the European approach.

Determining the state of scientific knowledge in the complex and rapidly evolving science of AI and robotics may be quite a task for the courts and raises questions as to whether a true strict liability regime (without any state of the art defence) may be more appropriate for autonomous robots.

PRIVACY

Robots are being developed with the ability not just to learn from us but to learn about us, including by sensing, acquiring and interpreting data about people's tone, emotion and expression.

This type of information may not fall within the current definitions of "health information" or "sensitive information" under Australia's privacy laws, raising questions as to how such information will be used.

Drones can capture enormous amounts of visual and other data, including images of our private homes and actions in the course of a single flight. Yet Australia has no recognised tort of invasion of privacy and the collection of personal information (other than particular types of information such as health information) by drones operated by individuals or small businesses is not caught by Australian privacy law. In addition, protections afforded by surveillance device laws and criminal codes are often inadequate to deal with these 'invasion of privacy' issues.4

Current Australian privacy laws may not be equipped to meet community expectations as to privacy and control over personal information acquired and used by robots.

The EU Resolution notes that privacy is a major concern with respect to drones and other autonomous robots and indicates that 'privacy by design'5 principles should be implemented where possible. With drones, robots and other intelligent systems either collecting or accessing personal information, privacy and information security will be paramount given many of these systems will be connected via the internet.

SAFETY

The EU Resolution proposes that:

  • a voluntary ethical code of conduct be created to guide robot manufacturers on the functions they give robots, with a view to improving safety;
  • robots should be designed on the basis of Asimov's 'Laws of Robotics';6 and
  • all robot designs should incorporate a "kill switch" to ensure they can be shut down in emergencies.

The EU Resolution also states that there may be areas of our lives from which it may be appropriate to limit or exclude the use of robots. For example, the EU Resolution notes that human contact is central to fields like aged care. Health industries will need to be mindful of an increased reliance on robotic technology, even if that technology has humanoid features or the ability to imitate emotion.

Further, the EU Resolution urges the Commission to consider creating a centralised agency for robotics and AI that would provide public authorities with technical, ethical and regulatory expertise. Such an agency might also maintain a register of the 'serial number' of each robot and its 'owner', much like motor vehicles are registered today.

WHAT IS AUSTRALIA DOING TO REGULATE ROBOTS?

Australian lawmakers are taking some steps to consider matters related to robot innovation:

  • In December 2017, a Senate committee will hand down its report on a number of regulatory issues for drones. While this inquiry focuses on safety, other issues (including liability and privacy) will also likely be forced into the spotlight.
  • State governments are now testing driverless vehicles and considering the changes that will need to be made to current road rules. Last year, the National Transport Commission issued a discussion paper7 and a policy paper,8 providing recommendations about autonomous vehicles.

These steps have a relatively narrow focus compared to the wide range of issues now being considered by the European Parliament. As yet, no Australian government appears to be openly considering any overarching legislative consultation or review for regulating the range of issues that arise from intelligent robots.

Australia could let other jurisdictions 'take the lead'; however, our legal system and consumer laws differ in important respects from those of overseas jurisdictions. Community values also differ between jurisdictions, particularly with respect to matters such as privacy.

While our common law system has proven its ability to adapt to technological change over time, it is reactive in nature, generally playing 'catch-up' with new technology. In addition, Australian laws regulating the issues most relevant to robots (including privacy and consumer protection) are substantially the domain of legislation, not the common law.

The list of legal issues raised by the European Parliament is long and wide-ranging. Those issues will take time to consult on, work through and develop. It is time for Australian lawmakers to start thinking about how we want our life with robots to be regulated.

Footnotes

1 The word 'robot' was first used in 1920 to denote a fictional humanoid in the play R.U.R. by Czech author Karel Čapek.

2 The EU Resolution adopted the text requesting the Commission to submit a proposal for a directive on civil law rules on robotics, following the recommendations set out in the Annex to the EU Resolution. While the Commission is not obliged to present a directive, it must provide its reasons for choosing not to do so.

3 There are reports that Volvo has announced that it will accept liability for accidents caused while its cars are in autonomous mode.

4 See our thinking article on drones titled 'The Drone Age: not even the sky is the limit for this changing regulatory landscape'.

5 While a rather uncertain concept, and voluntary in nature under current Australian privacy law, 'privacy by design' involves building privacy protections into the design and engineering of a product or system so that the personal information it collects is better protected from misuse.

6 While not without critics, Asimov's Laws of Robotics are: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. (2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. A fourth law (often called the 'Zeroth Law': a robot may not harm humanity, or, by inaction, allow humanity to come to harm) was added by Asimov in later works. The first three laws appeared in Asimov's short story 'Runaround' (1942).

7 National Transport Commission, Regulatory options for automated vehicles – discussion paper (May 2016).

8 National Transport Commission, Regulatory reforms for automated road vehicles – policy paper (November 2016).

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
