Australian Government Pushes Through Expansive New Legislation Targeting Abhorrent Violent Material Online.
23 April 2019
What you need to know
- The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 came into effect on 6 April 2019.
- The Act creates new offences under the Criminal Code that have the effect of requiring social media platforms and other websites to expeditiously remove abhorrent violent material and refer it to the Australian Federal Police.
- The new offences apply to any content reasonably capable of being accessed within Australia regardless of where the platform or website operator is located.
- The maximum penalties for failing to remove abhorrent violent material include fines of up to 10% of annual group turnover for corporations and imprisonment for up to 3 years for individuals who fail to remove or refer content.
- The new offences will not apply to various categories of material including journalistic content reported in the public interest, artistic works, material used for research or material made available for the purpose of advocating social or political change.
What you need to do
- Given the serious consequences of being found liable under the new offences, update your content moderation policies to reflect the new legislation and consider deploying additional technical solutions or allocating additional resources to ensure compliance.
- Monitor the extent to which and how the legislation is enforced and participate in any future review or other law reform processes.
On 5 April 2019, the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Act) received Royal Assent. The Act, which was passed with minimal consultation with the Australian technology industry, commenced on 6 April 2019.
The Act was introduced in the wake of the March 2019 Christchurch terrorist attack to create new offences under the Criminal Code Act 1995 (Cth) (Criminal Code) requiring social media platforms and other websites and providers to expeditiously remove abhorrent violent material and refer such material to the Australian Federal Police (AFP). To do so, the Act adds two new offences to the Criminal Code: failure to refer, and failure to remove, abhorrent violent material.
By introducing the law, the government intends to prevent online platforms being weaponised for the purpose of spreading violent and extreme propaganda by compelling online providers and website hosts to more proactively monitor content on their platforms.
The Act is intended to complement the existing take-down and referral procedures for online content under Schedules 5 and 7 of the Broadcasting Services Act 1992 (Cth).
The Act is, on its face, a direct response to the Christchurch attacks. However, it is also part of a broader global trend to regulate the activities of online platforms, as seen in the ACCC's digital platforms inquiry and the United Kingdom's recently released online harms white paper which proposes various measures to stop the spread of harmful content online.
While the Act passed with the support of the federal Labor Party (which has promised to review and amend the legislation as necessary if it wins the next election), it has been widely criticised by the Australian and international technology communities, particularly given its scope and the lack of consultation before it was enacted.
Who does the law apply to?
The Act applies to content services, hosting services and internet service providers regardless of whether they are located inside or outside Australia.
The removal provisions of the Act apply both to content service providers and hosting service providers, which are defined by reference to the Enhancing Online Safety Act 2015 (Cth).
- A content service means (i) an electronic service aimed at enabling online social interaction that allows users to interact and post content (a social media service); or (ii) a service that delivers to end users, or allows end users to access, material over the internet (a designated internet service). A content service includes most websites but does not include user-to-user messaging services, such as SMS, instant messaging or email, or on-demand catch-up services for subscription and free-to-air television.
- A hosting service means a service that stores material hosted on a social media service or designated internet service (eg, a cloud storage provider).
The referral provisions apply to content service providers, hosting services providers and internet service providers (being a person that supplies an internet carriage service to the public in Australia).
What is abhorrent violent material?
Abhorrent violent material is audio, visual, or audio-visual content produced by the perpetrator(s) of abhorrent violent conduct (or an accomplice) that a reasonable person would consider offensive in the circumstances. Abhorrent violent conduct is defined to mean murder or attempted murder, a terrorist act, torture, rape or kidnapping. There is no requirement that the person be convicted of an offence for their conduct to constitute abhorrent violent conduct.
For the purposes of the Act, it is immaterial whether or not the abhorrent violent material has been altered (for example, through the superimposition of other material).
However, if the material is altered to such an extent that it no longer meets the criteria of abhorrent violent material (eg, through appropriate editing), it will not be captured by the legislation.
The definition of abhorrent violent material is likely to create a number of problems in practice. For example, it is likely to be difficult to discern from a piece of content whether it was produced by a perpetrator of abhorrent violent conduct rather than a victim, bystander or media organisation. Furthermore, without also being aware of the context of particular actions and other relevant circumstances, it may be difficult to determine whether or not particular conduct meets the definition of abhorrent violent conduct under the Act (eg, there may be difficulties in assessing whether violent acts recorded in any content are consensual).
Failure to refer content
Under the Act, it is an offence for an internet service provider, content service or hosting service to fail to refer abhorrent violent material to the AFP where the underlying conduct occurred or is occurring in Australia. The provisions are similar to the existing notification obligations for child pornography under the Criminal Code.
If an internet service provider, content service or hosting service:
- is aware that their service can be used to access particular material;
- has reasonable grounds to believe that the material is abhorrent violent material; and
- the material records or streams abhorrent violent conduct that has occurred or is occurring in Australia,
the provider or service is required to refer details of the material to the AFP within a reasonable time unless it reasonably believes that the AFP would already be aware of such details. This obligation applies regardless of where the provider or service is located.
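The conjunctive test above can be sketched as a simple compliance check. This is a minimal illustration only; the function and parameter names are hypothetical and not drawn from the Act.

```python
def must_refer_to_afp(
    aware_service_can_access_material: bool,
    reasonable_grounds_material_is_avm: bool,
    conduct_occurred_or_occurring_in_australia: bool,
    reasonably_believes_afp_already_aware: bool,
) -> bool:
    """Sketch of the referral test: all three limbs must be satisfied,
    and a reasonable belief that the AFP is already aware of the
    details operates as a carve-out from the obligation."""
    return (
        aware_service_can_access_material
        and reasonable_grounds_material_is_avm
        and conduct_occurred_or_occurring_in_australia
        and not reasonably_believes_afp_already_aware
    )

# All limbs satisfied, no belief the AFP is already aware: referral required
print(must_refer_to_afp(True, True, True, False))  # True

# Conduct occurred outside Australia: no referral obligation under this provision
print(must_refer_to_afp(True, True, False, False))  # False
```

Note that because the limbs are conjunctive, the obligation falls away if any single limb is not made out, which is why context (such as where the conduct occurred) matters as much as the content itself.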
The Act does not prescribe what details the provider or service is required to supply to the AFP in order to avoid contravening the legislation. Conceivably, this could include information such as the relevant URL and even the IP address or identity of the person who posted the content.
The term "reasonable time" is not defined in the Act. However, the Explanatory Memorandum states that this will ultimately be a question for the trier of fact (eg, a jury) and will depend on factors such as the volume of the material (eg, how frequently it was posted and re-posted) and the capacity and resources of the service provider (eg, its technical removal capabilities).
Failure to comply with the obligation to refer content to the AFP can result in fines of up to $168,000 (800 penalty units) for individuals or $840,000 (4,000 penalty units) for corporations.
Failure to remove content
In addition, under the Act it is an offence for a content or hosting service provider to fail to expeditiously remove, from their content or hosting service, abhorrent violent material that is reasonably capable of being accessed in Australia (regardless of where the service itself is located).
The question of whether or not specific content has been "expeditiously removed" is, again, a matter for the trier of fact and will depend on factors such as the type and volume of the material and capabilities and resources of the service provider.
The prosecution is not required to establish that the provider had direct knowledge of the abhorrent violent material in order to prove the commission of an offence. Rather, it would only be required to establish that the provider was reckless (ie, aware of a substantial risk that the relevant material was available or constituted abhorrent violent material).
If an individual is found guilty of failing to remove abhorrent violent material, the maximum penalty is a fine of $2.1 million (10,000 penalty units) or a term of imprisonment of no more than 3 years. For corporations, the maximum penalty is a fine of no more than the greater of $10.5 million or 10% of annual global group turnover.
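The dollar figures quoted in this article follow from the value of a Commonwealth penalty unit at the time the Act commenced ($210 per unit). A minimal sketch of the conversion, assuming that rate (the value is periodically indexed, so it should be checked against the current legislation):

```python
# Commonwealth penalty unit value assumed at $210, the rate in force
# when the Act commenced (the figure is periodically indexed).
PENALTY_UNIT = 210

def fine(units: int) -> int:
    """Convert a number of penalty units to a dollar amount."""
    return units * PENALTY_UNIT

def corporate_max_removal_fine(annual_group_turnover: float) -> float:
    """Maximum corporate fine for failure to remove: the greater of
    50,000 penalty units ($10.5 million) or 10% of annual global
    group turnover."""
    return max(fine(50_000), 0.10 * annual_group_turnover)

print(fine(800))      # referral offence, individual: 168000
print(fine(4_000))    # referral offence, corporation: 840000
print(fine(10_000))   # removal offence, individual: 2100000

# For a large platform, the 10%-of-turnover limb dominates the fixed cap
print(corporate_max_removal_fine(500_000_000))  # 50000000.0
```

The turnover limb is what gives the removal offence its teeth for global platforms: for any group with annual turnover above $105 million, the 10% figure exceeds the fixed $10.5 million cap.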
The Act also creates a new regime which enables the eSafety Commissioner to issue a notice that, at the time the notice was issued, a specified content or hosting service could be used to access abhorrent violent material. A notice may only be issued where the Commissioner is satisfied on reasonable grounds that the service could be used to access the material.
The Explanatory Memorandum to the Act suggests that a content or hosting service provider is unlikely to be prosecuted if, after receiving a notice, it removes the material specified in the notice.
However, it is important to emphasise that receiving a notice is not a necessary element of an offence under the Act and that complying with a notice may not avoid a prosecution (eg, where the offending material has already been available on the service for a period of time). Rather, if the service provider does not remove material after receiving a notice, the Act creates a rebuttable presumption of recklessness for the purpose of any prosecution.
What defences are available?
As would be expected, the Act creates a series of defences, including on public interest grounds. However, at this stage, the scope of these defences is unclear and will ultimately be a matter to be clarified through legislative reform or judicial consideration.
The removal obligations do not apply to material accessed using a content service if:
- the accessibility of the material is necessary for enforcing, or monitoring compliance with, a law of Australia or another country;
- the accessibility of the material is reasonably necessary for conducting scientific, medical, academic or historical research (eg, research into extremist behaviour or responses to terrorist acts);
- the material relates to a news report in the public interest and is made by a person working in a professional capacity as a journalist;
- the accessibility of the material is reasonably necessary in connection with the performance by a public official of their official duties or functions;
- the accessibility of the material is for the purpose of advocating change to any law, policy or practice in Australia or another country and the accessibility is reasonable in the circumstances (eg, footage used to highlight and denounce abuses of power by an authoritarian regime); or
- the accessibility of the material relates to the good faith development, exhibition or distribution of an artistic work (eg, photography depicting scenes from a war zone).
More generally, the new offences do not apply to the extent they would infringe any constitutional doctrine of implied freedom of political communication.
For further information, please contact:
Robert Todd, Partner, Ashurst