When we drive, we face a multitude of moral decisions that affect our behaviour towards other road users. We all respond differently to things like emergencies or overtaking cyclists. With autonomous vehicles (AVs) beginning to emerge, we need to consider the moral decisions we make as humans while driving and which of these we would be comfortable delegating to technology.

Researchers from the MIT Media Lab have released the results of the 'Moral Machine', an online experiment launched in 2016 to understand how consumers would prefer an AV to respond in a number of situations. The analysis took into account over 40 million responses.

The experiment presented a moral dilemma in which participants had to decide who an autonomous vehicle should save if it needed to swerve, choosing between different demographic groups such as the elderly or criminals. The results highlighted that people generally preferred to save as many lives as possible, and to save humans rather than animals.
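The two aggregate preferences reported above can be illustrated with a toy scoring model. This is a minimal sketch, not the study's methodology: the `Group` structure, the weights and the function names are all hypothetical, chosen only to show how "save more lives" and "prefer humans over animals" could be encoded as a simple comparison.

```python
# Illustrative sketch only: a toy model of the two aggregate preferences the
# study reported (save more lives; prefer humans over animals). All names and
# weights are hypothetical and are not drawn from the MIT study itself.

from dataclasses import dataclass


@dataclass
class Group:
    humans: int   # number of human lives at stake in this group
    animals: int  # number of animal lives at stake in this group


def preference_score(group: Group,
                     human_weight: float = 10.0,
                     animal_weight: float = 1.0) -> float:
    """Score a group; a higher score means respondents would rather save it.

    Weighting humans far above animals encodes the 'humans over animals'
    preference; summing across lives encodes 'save as many as possible'.
    """
    return group.humans * human_weight + group.animals * animal_weight


def choose_to_save(a: Group, b: Group) -> str:
    """Pick which of two groups the aggregate preference would save."""
    return "A" if preference_score(a) >= preference_score(b) else "B"


# Two pedestrians vs. one pedestrian and a dog: the larger human group wins.
print(choose_to_save(Group(humans=2, animals=0), Group(humans=1, animals=1)))
```

A real moral algorithm would, of course, be vastly more complex than a two-term weighted sum; the sketch only shows why stating such preferences explicitly forces the weights to be chosen by someone.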

There is general recognition that there are six levels (Levels 0 to 5) on the path to full autonomy, and as we move closer to Level 5, where no driver intervention is needed, the way we travel is set to change. A discussion is needed around the level of autonomy that is possible with AVs and the regulation needed to both support and develop them.

How will autonomous vehicles make roads safer?

Accidents often occur on the road when drivers make the wrong choice. By removing the driver from the decision-making process, AVs should reduce the number of errors made and improve safety. Fewer accidents should also bring economic benefits, as road closures and delays decrease.

To fully leverage the economic and safety benefits that AVs could offer society, an appropriate legal and regulatory framework needs to be adopted. AVs will be developed with algorithms that ensure damage to both human and machine is minimised, but it is interesting to consider what level of decision an AV will be allowed to make.

In our report 'The Moral Algorithm: How to set the moral compass for autonomous vehicles', we highlighted that AVs are different to other forms of transport that have become automated, such as driverless trains and auto-piloted aeroplanes. AVs have a wider scope for decision making because they do not travel pre-determined routes, and they have a more complex environment to contend with. As there are more decisions to make when driving, there is more to consider when developing the level of autonomy the vehicle has.

The algorithms developed for AVs will need to cover more than moral choices made in emergencies; they must also handle scenarios in which it is acceptable to break the rules of the road, depending on the situation. In 'The Moral Algorithm: How to set the moral compass for autonomous vehicles' we identified that this could make AVs a difficult sell to the public.

What needs to be considered when regulating autonomous vehicles?

There is currently little regulation in place for AVs. For the public to feel confident in the adoption and safety of the technology, a legal framework is needed to ensure that consistency is adopted across development, manufacturing and launch.

Trying to apply existing law to new technologies often causes issues during product development. We are already beginning to see vehicles operate at Level 3 of autonomy, with advanced driver assistance offering control of lateral and longitudinal movement in specific use cases. New legislation needs to be put in place before the technology moves forward, to accommodate new products and developments. If this does not occur, products often fail to enter the market, or additional risks are introduced. State regulation is needed to guide manufacturers on the moral algorithm underpinning AVs and to set the parameters within which the technology can be developed.

Making way for autonomous vehicles

If AVs are seen to be a trusted and safe mode of transport, they are set to change how we travel and interact with the world around us. For this change to happen, and for society to enjoy the full range of benefits that AVs can bring, regulations need to be put in place to ensure that legal risks are minimised and safety is ensured.

Our report 'The Moral Algorithm: How to set the moral compass for autonomous vehicles' is recommended to those working in the automotive sector; it explores the issues around ethical decision making and identifies the action that needs to be taken to address them.

Read the original article on GowlingWLG.com.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.