
Social media platforms being regulated as telcos under discussion in Australia

A parliamentary joint committee is currently considering whether social media platforms should be regulated as carriage service providers, given the volume of communications and content sent through them.

Various experts have submitted to the committee that social media platforms like Facebook are of such a significant scale and are so uniquely pertinent to the problem of online child exploitation that they should be subject to additional scrutiny, such as being regulated as carriage service providers.

The considerations are part of the Parliamentary Joint Committee on Law Enforcement’s inquiry into Australia’s law enforcement capabilities in relation to child exploitation.

During a joint parliamentary hearing on Friday, Meta told the committee that it believes Australia’s framework for law enforcement working with social media platforms to detect child abuse material is already sufficient, and that the additional classification could be redundant.

“We’ve set up a dedicated portal, we have a dedicated team to liaise with law enforcement, and then we can disclose what we call basic subscriber information data quite quickly through that process. We obviously have emergency channels if there’s any threat to life; either we proactively disclose or law enforcement can ask us for assistance through those emergency processes,” said Mia Garlick, Meta’s director of public policy for Australia, New Zealand, and the Pacific Islands.

“So I guess from where [Meta] sits in terms of our engagement with law enforcement, we feel that there is already sort of a good way to get there and so it might not be necessary to sort of tinker with definitions in the Telecommunications Act when we’ve got the ability to work constructively through the existing frameworks.”

While the eSafety commissioner said last month that social media platforms have generally done a good job of removing abhorrent violent material, the agency noted in its submission to the committee that detecting and removing child abuse material is a different challenge, in part because this type of content is primarily distributed through private communication channels.

The government agency also said that as more social media platforms move towards encrypted communications, this dynamic could effectively create “digital hiding places”. It shared its worry that platforms may also claim they are absolved of responsibility for safety because they cannot act on what they cannot see.

eSafety online content manager Alex Ash told the committee yesterday afternoon that a drift towards encrypted communications by major social media platforms would make investigations into serious online child sexual abuse and exploitation more difficult. He noted, however, that where eSafety had been able to detect such material on social media platforms, the platforms had been cooperative and quick to act on the flagged material.

To address these concerns about the growing shift toward encrypted communications, the committee on Friday sought views on the merits of exempting communications to and from minors aged 13-18 from encryption, and on whether such a framework would be technically feasible.

Meta’s head of safety, Antigone Davis, said that while it may be possible to create a partial encryption system, she believes it would come at the cost of undermining encryption for other people using the platform. Instead, Davis said, Meta believes protections can be built into an encrypted service through mechanisms such as enabling the blurring of images, preventing people from contacting minors, making it easier for users to report child abuse material, and using non-encrypted information to catch people who distribute child abuse material.

“While they may obfuscate some of what they’re doing, what we do find is that they do leave trails, they do leave what you might think of as prompts. So for example, you may see people [who] have this kind of interest [post] sexualised comments under [photos of] minors, or you may see what will look like an innocuous bringing together of lots of photos of minors that appear innocuous … so there are opportunities to actually use those breadcrumbs,” Davis said.
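Davis’s “breadcrumbs” argument rests on a technical point: even when message contents are end-to-end encrypted, the platform still sees unencrypted account-level metadata, and those signals can be scored to queue suspicious accounts for human review. The Python sketch below illustrates that idea only; every signal name, weight, and threshold in it is a hypothetical placeholder, not a description of Meta’s actual systems.

```python
# Illustrative sketch only: signal names, weights, and thresholds are
# hypothetical, not Meta's real pipeline. The point is that none of these
# inputs require reading encrypted message contents.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    """Unencrypted metadata a platform can still observe under E2E encryption."""
    user_reports: int            # reports filed against this account
    minor_contact_attempts: int  # distinct minor accounts the user messaged first
    minor_photo_saves: int       # photos of minors the account has collected
    account_age_days: int        # how long the account has existed


def review_score(s: AccountSignals) -> float:
    """Combine metadata signals into a single triage score (weights are invented)."""
    score = (
        3.0 * s.user_reports
        + 2.0 * s.minor_contact_attempts
        + 1.5 * s.minor_photo_saves
    )
    # Brand-new accounts showing these patterns are weighted more heavily.
    if s.account_age_days < 30:
        score *= 1.5
    return score


def flag_for_human_review(accounts: dict[str, AccountSignals],
                          threshold: float = 10.0) -> list[str]:
    """Return account IDs whose metadata score crosses the (arbitrary) threshold."""
    return [uid for uid, s in accounts.items() if review_score(s) >= threshold]


if __name__ == "__main__":
    sample = {
        "account_a": AccountSignals(user_reports=0, minor_contact_attempts=0,
                                    minor_photo_saves=1, account_age_days=400),
        "account_b": AccountSignals(user_reports=2, minor_contact_attempts=5,
                                    minor_photo_saves=30, account_age_days=12),
    }
    print(flag_for_human_review(sample))  # -> ['account_b']
```

The design point the sketch tries to capture is that only account-level metadata feeds the score; no message bodies are inspected, which is the kind of signal Davis suggests remains available to platforms under end-to-end encryption.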

Communications Alliance program management director Christiane Gillespie-Jones, who also appeared before the committee, provided a slightly different picture of how encrypted communications could affect law enforcement’s ability to detect child abuse material. 

While Gillespie-Jones agreed with Meta that encrypted communications are important for user privacy, when questioned about their impact on detecting child abuse material she acknowledged that encryption could make certain child abuse material no longer discoverable.

As for how much harder encrypted communications would make detecting such material, Gillespie-Jones said this was currently impossible to quantify.

Related Coverage

  • Online safety and end-to-end encryption can co-exist, says data protection watchdog. But how?

  • Australia's proposed anti-troll laws would require social media platforms to disclose personal user details for defamation lawsuits

  • Social media platforms need complaints schemes to avoid defamation under Aussie anti-troll Bill

  • eSafety thinks online platforms have done well in removing abhorrent violent content so far

  • Digital Rights Watch defends social media platforms’ efforts in removing terrorist content

  • Tech giants, telcos and Digital Rights Watch want clarity on monitoring requirements for online violent abhorrent content

Source: https://www.zdnet.com/article/social-media-platforms-being-regulated-as-telcos-under-discussion-in-australia/#ftag=RSSbaffb68