eSafety Commissioner finds child safety gaps across major platforms

6/8/2025 18:31
Australia’s internet watchdog has said the world’s biggest social media firms are still “turning a blind eye” to online child sex abuse material on their platforms, and said YouTube in particular had been unresponsive to its enquiries.

In a report released on Wednesday, the eSafety Commissioner said YouTube and Apple failed to track the number of user reports they received of child sex abuse appearing on their platforms, and could not say how long it took them to respond to such reports.

The Australian government decided last week to include YouTube in its world-first social media ban for teenagers, following eSafety’s advice to overturn its planned exemption for the video-sharing site owned by Alphabet’s Google.

“When left to their own devices, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services,” eSafety Commissioner Julie Inman Grant said in a statement.

“No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services.”

A Google spokesperson said “eSafety’s comments are rooted in reporting metrics, not online safety performance”, adding that YouTube’s systems proactively removed over 99% of all abuse content before it was flagged or viewed.

“Our focus remains on outcomes and detecting and removing (child sexual exploitation and abuse) on YouTube,” the spokesperson said in a statement.

Meta - owner of Facebook, Instagram and Threads, three of the biggest platforms with more than 3 billion users worldwide - has said it prohibits graphic videos.

The eSafety Commissioner, an office set up to protect internet users, has required Apple, Discord, Google, Meta, Microsoft, Skype, Snap and WhatsApp to report on the measures they take to address child exploitation and abuse material in Australia.

The report on their responses so far found a “range of safety deficiencies on their services which increases the risk that child sexual exploitation and abuse material and activity appear on the services”.

Safety gaps included failures to detect and prevent livestreaming of the material or block links to known child abuse material, as well as inadequate reporting mechanisms.

It said platforms were also not using “hash-matching” technology on all parts of their services to identify images of child sexual abuse by checking them against a database. Google has said before that its anti-abuse measures include hash-matching technology and artificial intelligence.

The Australian regulator said some providers had not made improvements to address these safety gaps on their services despite having been put on notice in previous years.

“In the case of Apple services and Google’s YouTube, they didn’t even answer our questions about how many user reports they received about child sexual abuse on their services or details of how many trust and safety personnel Apple and Google have on-staff,” Inman Grant said.



