Elon Musk is mad he’s been ordered to remove Sydney church stabbing videos from X. He’d be more furious if he saw our other laws

  • Written by Rob Nicholls, Senior research associate, University of Sydney

Australia’s eSafety Commissioner has ordered[1] social media platform “X” (formerly known as Twitter) to remove from the site graphic videos of last week’s stabbing of Bishop Mar Mari Emmanuel in Sydney. The attack was captured on the church’s livestreamed mass service.

In response to this order, X’s owner, Elon Musk, has branded[2] the commissioner the “Australian censorship commissar”.

X had agreed to part of the take-down. However, it did not agree to remove the material entirely, telling media publications: “X believes that eSafety’s order was not within the scope of Australian law and we complied with the directive pending a legal challenge.”

So what are the laws around this, especially given authorities quickly labelled the church incident a terrorist act[3]? What powers do governments have in this situation?

Read more: Why is the Sydney church stabbing an act of terrorism, but the Bondi tragedy isn't?[4]

Prompt political fallout

The response from politicians has been swift. Labor minister Tanya Plibersek referred to[5] Musk as an “egotistical billionaire”.

Senior Liberal Simon Birmingham said[6]:

They absolutely should be able to quickly and effectively remove content that’s damaging and devastating to the social harmony and fabric of society, particularly images such as terrorist attacks.

Other Labor ministers described[7] X as “a playground for criminals and cranks” or accused the company of thinking it is above the law.

Of course, such damning remarks directed at a much-maligned website and its equally controversial owner are to be expected. What politicians can actually do about it is another matter.

What do federal laws say?

The eSafety Commissioner, Julie Inman-Grant, has the power to require the take-down of material under the Online Safety Act. The power she exercised under part nine of that act was to issue a “removal notice”. The removal notice requires a social media platform to take down material that would be refused classification under the Classification Act.

Australia’s eSafety Commissioner, Julie Inman-Grant, issued X with a removal notice. Mick Tsikas/AAP

The video was circulating online as the New South Wales Commissioner of Police, Karen Webb, announced the attack was a terrorist incident and that the alleged perpetrator would be charged with a terrorist offence[8].

While these are the laws being applied in the case against X, other laws can also come into play.

Australia also has a voluntary code of practice relating to disinformation and misinformation[9]. This is administered by the industry group DiGi[10]. The signatories to this code include Adobe, Apple, Facebook, Google, Microsoft, Redbubble, TikTok, and Twitch.

X had previously adopted the code, but its failure to comply led to its signatory status being withdrawn[11] by DiGi in November 2023.

The government released a draft of a proposed bill[12] to combat misinformation and disinformation in June 2023[13]. The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill would give the Australian Communications and Media Authority power to enforce an industry code, or make one if the industry could not. It is a variation of this bill, reflecting the substantial range of views on the draft, that now has bipartisan support[14].

Would this new law make any difference in this case?

The immediate answer is no. The eSafety Commissioner already has extensive powers. She used only one of those powers in this case, but there are alternative courses of action.

Read more: Yes, Labor's misinformation bill could jeopardise free speech online[15]

What else could be done?

The gruesome images in the Wakeley videos might remind some of the 2019 Christchurch massacre.

In the aftermath of that attack, Telstra, Optus and Vodafone (now part of TPG) cut access[16] to sites such as 4Chan, which were disseminating video of the attack. They did so without any prompting from either the eSafety Commissioner or law enforcement agencies.

Telcos blocked websites like 4Chan in the immediate aftermath of the Christchurch massacre. Shutterstock[17]

The eSafety Commissioner has the power to require telcos to block access. To use it, she would need to be satisfied the material depicts abhorrent violent conduct and that its availability online is likely to cause significant harm to the Australian community.

This means the commissioner could give a blocking notice to telcos, which would then have to block X for as long as the abhorrent material remains available on the platform.

Read more: Terrorist content lurks all over the internet – regulating only 6 major platforms won't be nearly enough[18]

Separately, the telcos have an obligation to do their best “to prevent telecommunications networks and facilities from being used in, or in relation to, the commission of offences against the laws of the Commonwealth or of the States and Territories” under the Telecommunications Act. This requires there to be an offence.

There is potential that sharing the video material could be seen as an act done in preparation for, or planning of, a terrorist act, given police have determined the incident it depicts was an act of terror. This would be a breach of the terrorism prohibitions under the federal Criminal Code.

All this is to say that while Musk may be unhappy with the eSafety Commissioner’s actions, the removal notice is just the tip of the iceberg of laws that could force his site to remove terrorist content.

References

  1. ^ has ordered (www.esafety.gov.au)
  2. ^ has branded (www.news.com.au)
  3. ^ terrorist act (theconversation.com)
  4. ^ Why is the Sydney church stabbing an act of terrorism, but the Bondi tragedy isn't? (theconversation.com)
  5. ^ referred to (www.afr.com)
  6. ^ said (www.aap.com.au)
  7. ^ described (www.news.com.au)
  8. ^ terrorist offence (www.police.nsw.gov.au)
  9. ^ disinformation and misinformation (digi.org.au)
  10. ^ DiGi (digi.org.au)
  11. ^ being withdrawn (digi.org.au)
  12. ^ bill (www.infrastructure.gov.au)
  13. ^ June 2023 (www.infrastructure.gov.au)
  14. ^ bipartisan support (www.theguardian.com)
  15. ^ Yes, Labor's misinformation bill could jeopardise free speech online (theconversation.com)
  16. ^ cut access (www.sbs.com.au)
  17. ^ Shutterstock (www.shutterstock.com)
  18. ^ Terrorist content lurks all over the internet – regulating only 6 major platforms won't be nearly enough (theconversation.com)

Read more https://theconversation.com/elon-musk-is-mad-hes-been-ordered-to-remove-sydney-church-stabbing-videos-from-x-hed-be-more-furious-if-he-saw-our-other-laws-228380

The Conversation
