Australian regulators require social media companies to reveal their counterterrorism initiatives

Australia's internet safety regulator has sent legal notices to social media companies including Facebook, YouTube, X, Telegram, and Reddit, requiring them to provide information on the measures they are taking to remove terrorism-related content.

The e-Safety Commission expressed concern that the platforms were not doing enough to prevent extremists from recruiting users through live-streaming features, algorithms, and recommendation systems.

Since 2022, the regulator has had the authority to compel large digital companies to disclose how much unlawful content they host and what steps they take to stop it from spreading. Failure to comply can bring hefty fines.

According to Commissioner Julie Inman Grant, violent extremist organizations primarily use Telegram as a recruiting and radicalization tool.

Telegram, the Dubai-based messaging service, did not immediately respond to a Reuters request for comment. A 2022 OECD assessment ranked it first among platforms for the frequency of content deemed terrorist and violent extremist.

“We don’t know if they actually have the people and resources in place to even be able to respond to these notices, but we will see,” Inman Grant told Reuters in an interview. “We’re not going to be afraid to take it as far as we need to, to get the answers we need or to find them out of compliance and fine them.”

YouTube, which the same assessment ranked second for violent extremist content, “has the ability to spread propaganda broadly through their brilliant algorithms in a very overt way”, she said.

According to Inman Grant, the terrorism-related content under scrutiny included violent conspiracy theories, reactions to the wars in Gaza and Ukraine, and “misogynistic tropes that spill over into real-world violence against women”.

The regulator has previously sent legal notices to platforms asking how they handle hate speech and child abuse material, but Inman Grant said the anti-terrorism round has been the most complicated because of the variety of content involved and the many ways it can be amplified.

In 2023, Elon Musk’s X received the first e-Safety fine over its response to inquiries about how it handled child abuse content. It is contesting the $386,000 penalty in court.

The wave of legal notices sent on Monday is the first in which the commission has targeted Telegram and Reddit, a discussion forum website. According to the commissioner, a white supremacist who murdered ten Black people in Buffalo, New York, in 2022 said Reddit had contributed to his radicalization.

Alphabet, YouTube’s parent company, and X did not immediately respond to requests for comment.

A spokesperson for Facebook’s parent company Meta said “there is no place on our platforms for terrorism, extremism, or any promotion of violence or hate”, adding that the company is reviewing the commission’s notices.

Terrorist content has no place on Reddit, a spokesperson said, and “we look forward to sharing more information on how we detect, remove, and prevent this and other acts of
