The impetus for change
In response to the recent horrific Christchurch attack, where Facebook's live-streaming service was used to broadcast the shootings in real time, Australia's parliament has passed a controversial law designed to prevent the 'weaponisation' of social media platforms.
The Australian Government says that this is a world-first attempt to force social media platforms to take control of abhorrent content available through their services, including by imposing significant criminal penalties for breach, and that it hopes other regulators around the world will follow suit.
Industry observers are worried that the laws were pushed through hastily, with limited industry consultation, and warn of unintended consequences of, and practical difficulties with, compliance.
Criminal offences for hosting abhorrent material
The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Act) amends the Criminal Code Act 1995 (Cth) by introducing offences for failing to notify of abhorrent content, and for failing to take down abhorrent content.
The key concept is 'abhorrent violent conduct', which is defined as conduct whereby a person:
- engages in a terrorist act;
- murders another person;
- attempts to murder another person;
- tortures another person;
- rapes another person; or
- kidnaps another person.
Any audio-visual material produced by the person carrying out the abhorrent violent conduct, or by a person who was involved in the attempt or otherwise aided or abetted the conduct, is then regulated as 'abhorrent violent material'.
Notification of abhorrent violent material
The Act requires internet service providers, content service providers and hosting service providers to notify the Australian Federal Police (AFP) within a 'reasonable time' if any abhorrent violent material involving abhorrent conduct in Australia is available on the provider's platform, whether or not the provider is in Australia. That is, the obligation applies to all platform providers around the world if they host or make available the relevant material.
A failure to notify can be penalised with a fine of up to $168,000 for individuals and $840,000 for corporations.
Removal of content
If the platform then fails to remove the relevant content 'expeditiously', further penalties apply: up to three years in prison and fines of up to $2.1m for individuals, and for corporations, fines of up to $10.5m or 10% of the platform's annual turnover, whichever is greater. Clearly, for the multinational content platforms that this law is designed to regulate, the potential financial impact is enormous.
Are there exceptions?
Yes, there are exceptions for certain violent or explicit activities, such as boxing, medical procedures and consensual sexual acts. There are also defences to publication for reporting in the public interest, court proceedings, research and artistic works. The federal Attorney-General also has discretion to prevent prosecutions from proceeding.
What are the challenges with the Act?
There are several key issues with the Act that will need to be watched closely.
The most common criticism of the Act is the speed with which it was passed. Because 2019 is an election year in Australia, there were only a limited number of sitting days for Parliament to consider the law, and it was passed in a matter of days. Some argue that this alone means the law was implemented too hastily for its implications to be fully understood. To this end, Communications Minister Mitch Fifield has stated that the Act will be referred to a parliamentary committee in the next term of parliament.
Within the Act itself, there is uncertainty about the timeframes required to comply with the key obligations. Notifications must occur 'within a reasonable time' of the platform becoming aware of the content, and take-downs must occur 'expeditiously'. In both cases, it is unclear what standard would apply, although in public statements Attorney-General Christian Porter suggested that the Christchurch footage being available for an hour was too long. If this is an indication of the relevant standard, then platforms will face significant compliance costs in ensuring awareness of, and action to remove, relevant content.
More fundamentally, there is ongoing debate about whether Facebook and others should be regulated in this way. Social media giants, their traditional media counterparts and community expectations work at cross purposes here.
Until recently, social media platforms have resisted calls for regulation of content on their platforms, all the while employing a huge number of content checkers and increasingly engaging with third-party content verifiers. Despite this, the Christchurch footage was live-streamed for a reported 17 minutes without interruption, and Facebook publicly stated that it removed 1.5 million related videos of the shootings within the following 24 hours. The scale of the task is undeniable, although Mark Zuckerberg's recent open letter to regulators makes it clear that Facebook is open to working on further regulation, and of course, just because something is hard does not mean it should be ignored, particularly an issue of such central social importance.
Finally, in Australia, there is a clear trend emerging towards restrictive digital regulation.
First, there is the ACCC's Digital Platforms Inquiry, investigating the market power of Facebook and Google, which is due to report in July 2019.
Secondly, immediately before Christmas, there was the introduction of the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, 'the encryption law', which requires technology providers to assist law enforcement in accessing encrypted communications. The encryption law, with its well-intentioned but poorly-defined caveat that technology creators are not required to build in 'systemic weaknesses', is still not confidently understood by those who would seek to rely on it, and has been the subject of intense industry criticism. Much like the Act, it was hastily passed, with a post-enactment review promised (although this has not yet occurred).
Thirdly, there is the Act itself. This combination of recent developments demonstrates a marked move in Australia towards the regulation of digital technologies. As Alex Hutchens, Partner at McCullough Robertson, said in the Financial Times1, Australia is increasingly being viewed as a hostile location by the technology industry owing to tough laws on social media and encryption, as well as a government-commissioned inquiry into social media platforms. It will be interesting to see industry's response (the extreme versions of which have been to question whether Australia is a viable market in which to operate), and of course, whether there is a tangible effect on the practical experience of using digital tools (including safety from hacking at a user and national level) and consuming digital content.
1 Jamie Smyth and Alice Woodhouse, 'New Zealand terror suspect to face 50 murder charges', Financial Times, https://on.ft.com/2G0CcHq
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.