Within weeks of the Christchurch mosque shootings in New Zealand, Australia passed a new law placing liability on content and hosting services for the removal of violent material. An Australian citizen attacked two mosques, Masjid al Noor and Linwood Mosque, in the city of Christchurch, New Zealand. Forty-nine people were killed and twenty more were wounded. The terrorist streamed the massacre online and paired it with a rambling 87-page post filled with white supremacist references.

What does this law do?

The attacks occurred on March 15th, the Australian Parliament passed the law on April 4th, and it came into force two days later on the 6th. With the entire process measured in weeks, critics in the tech industry have raised concerns that the law was rushed and reactionary.

The law, named the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, defines “abhorrent violent material” as visual or audio-visual content that depicts a terrorist act, murder, attempted murder, torture, rape, and/or kidnapping. The law applies whether the act was conducted within Australia or in another country, though there are additional penalties for not reporting a violent act occurring in Australia to the Australian Federal Police “within a reasonable time after becoming aware of the existence of the material.”

The law states that the Internet Service Provider (ISP), content service, or hosting service commits an offense if it does not “ensure the expeditious removal of the material” or “expeditiously cease hosting the material”. The penalty for an individual at fault under the law is a sentence of up to 3 years and a fine of up to 10,000 penalty units, or about 1.2 million American dollars. The corporate penalty is up to 50,000 penalty units, approximately 5.7 million American dollars, or 10% of the company’s annual turnover in the 12 months before the offense.
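For a rough sense of how those penalty-unit caps translate into dollar amounts, the short sketch below multiplies each cap by an assumed per-unit value. The USD_PER_PENALTY_UNIT figure is back-derived from the approximate amounts quoted above, not an official rate, so treat the output as illustrative only.

```python
# Rough sketch: converting the Act's penalty-unit caps into dollar figures.
# The per-unit USD value is an assumption back-derived from the approximate
# amounts quoted above, not an official rate; real conversions depend on the
# Commonwealth penalty unit value and the exchange rate at the time.

USD_PER_PENALTY_UNIT = 120  # assumed: roughly $1.2M / 10,000 units

def max_fine_usd(penalty_units: int) -> int:
    """Approximate maximum fine in US dollars for a given penalty-unit cap."""
    return penalty_units * USD_PER_PENALTY_UNIT

print(f"Individual cap (10,000 units): ~${max_fine_usd(10_000):,}")  # ~$1,200,000
print(f"Corporate cap (50,000 units):  ~${max_fine_usd(50_000):,}")  # ~$6,000,000
```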

Who is at fault?

The main target of this law seems to be Facebook, where the stream of the Christchurch attack was removed a little more than an hour after it was posted. A similar German law, the Act to Improve the Enforcement of the Law in Social Networks, gives platforms a full 24 hours to remove hate speech and fake news, which raises questions about the vaguer wording in the Australian law, such as “reasonable time” and “expeditious”.

On sites such as Facebook, Instagram, and YouTube, the content is almost entirely user-generated. Facebook alone has 2.3 billion active users and relies mostly on algorithms and user flagging to detect malicious content. However, the algorithms may have difficulty differentiating between violent material and a movie or video game. If individuals aren’t flagging material, the company may simply be unaware of its existence. No one flagged the Christchurch stream until after it had ended.
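To illustrate why a stream that nobody flags can slip past this kind of moderation, here is a minimal, hypothetical sketch of a flag-driven review queue. The class names, method names, and one-report threshold are invented for illustration and do not reflect any platform’s actual systems.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of flag-driven moderation: an item only reaches human
# review after a user reports it, so unflagged material stays invisible to
# this part of the pipeline. Names and the threshold are invented for
# illustration and do not reflect any platform's real systems.

@dataclass
class ContentItem:
    content_id: str
    flags: int = 0

@dataclass
class ReviewQueue:
    flag_threshold: int = 1                            # reports needed to trigger review
    pending_review: list = field(default_factory=list)

    def report(self, item: ContentItem) -> None:
        """Record a user flag and queue the item once the threshold is met."""
        item.flags += 1
        if item.flags >= self.flag_threshold and item not in self.pending_review:
            self.pending_review.append(item)

queue = ReviewQueue()
stream = ContentItem("live-stream-123")

# A stream no one reports never enters the queue, so moderators have
# nothing to act on until someone finally flags it.
print(len(queue.pending_review))  # 0 -> nothing awaiting review
queue.report(stream)
print(len(queue.pending_review))  # 1 -> only now does it reach reviewers
```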

Criticisms of the Law

Most of the critiques of this new law reflect the speed with which it was created and passed. No time was dedicated to public or expert feedback during the drafting of the law, and most of the Australian government is not intimately familiar with the tech sector. There are also concerns about some of the law’s vaguer provisions, which could be interpreted in more than one way.

The speed of the law’s creation and passage reflects rising fears that social media is playing a role in the spread of radicalization and violent acts. However, the law does not make clear who will be held accountable or how that accountability will be decided. The separate penalty for individuals has some companies concerned that any individual employee could be criminalized for failing to remove abhorrent violent material within an “expeditious” timeframe, whether or not they were aware of the content.

The fact that ISPs are also included in this law raises other concerns. ISPs are the services that allow you to access the internet at all, but they cannot take down individual posts or pieces of content because they do not own the sites themselves. Their only option to avoid punishment may be to block or remove entire sites over what could be a single page of inappropriate material, effectively censoring whole sites.

Most of the details of the law will have to be hashed out in the courtroom once the first person or company is charged. Until then, much about how it will be enforced remains unclear.

Thank you for visiting the blog of Eye See You Now, an Austin SEO company. If you need help getting your business seen online, contact us for a free consultation.