The Media Leader Podcast

Platforms' teen safety efforts amount to 'broken promises' — with Andy Burrows and Harriet Kingaby


This year, the Molly Rose Foundation, the charity founded in the memory of Molly Russell, in partnership with online safety researchers, has released a number of studies around the efficacy of social media platforms’ online safety efforts.

In August, it found that, despite platforms' promises to improve on child safety, the likes of TikTok and Instagram are still bombarding young users with what it described as a "tsunami of harmful content" via their recommendation algorithms.

Weeks later, another report found that most of Meta's child safety tools, most notably its Teen Accounts feature, are apparently ineffective.

Andy Burrows is the CEO of the Molly Rose Foundation. Joined by the Conscious Advertising Network's Harriet Kingaby and host Jack Benjamin, he explained the severity of the harms facing children on social platforms, and why their efforts to ameliorate them have been, in his words, "performative".

After this episode was recorded, the Foundation released a third piece of research that found half of girls aged 13-17 saw high-risk suicide, self-harm, depression or eating disorder content in the week before the Online Safety Act took effect this summer.

Responding to the MRF's research on its Teen Accounts, a Meta spokesperson said the report "repeatedly misrepresents our efforts to empower parents and protect teens, misstating how our safety tools work and how millions of parents and teens are using them today.

"Teen Accounts lead the industry because they provide automatic safety protections and straightforward parental controls. The reality is teens who were placed into these protections saw less sensitive content, experienced less unwanted contact, and spent less time on Instagram at night. Parents also have robust tools at their fingertips, from limiting usage to monitoring interactions. We'll continue improving our tools, and we welcome constructive feedback — but this report is not that."

Burrows and Kingaby also discussed why advertisers haven’t yet been moved to apply business pressure on platforms in response, and whether regulation is plausible.

Said Burrows: "This is a commercial decision, and children are paying the price."

Highlights:

2:07: Toplines of the MRF's research and how advertisers have reacted

13:51: Malice, ignorance, or incompetence?

17:01: Is anyone tackling child safety responsibly? Issues of transparency and trust and safety investment

23:16: AI chatbots and child safety

31:33: Is regulation plausible right now?

37:17: How should parents navigate the online world on behalf of their kids?

Related articles:

Molly Russell charity CEO: Social media’s user safety efforts have been ‘performative’

Why planners are investing so heavily in Meta, despite attention metrics

US TikTok sale brings uncertainty for creators amid free speech chill

Meta launches subscription option to allow UK users to avoid ads

---

Thanks to our production partners Trisonic for editing this episode.

Discover how Trisonic can elevate your brand and expand your business by connecting with your ideal audience.

Visit The Media Leader for the most authoritative news analysis and comment on what's happening in commercial media.

LinkedIn: The Media Leader

YouTube: The Media Leader
