Social media firms are offering “a product that’s killing people” and their CEOs have “blood on their hands” for failing to protect children from online predators, a US senator has told five tech executives during a vitriolic hearing in which Meta CEO Mark Zuckerberg was forced to apologise to angry families of victims of online child grooming.
“No one should have to go through the things that your families have suffered,” Zuckerberg said, turning to protestors brandishing photos of children who had been harmed, or had taken their own lives, after child abusers contacted them via Meta’s social media platforms Facebook, Instagram, and WhatsApp.
“This is why we have invested so much, to make sure no one has to go through the types of things that your families have suffered.”
The apology came after Zuckerberg was hectored by US Senate Judiciary Committee member Josh Hawley because “you didn’t take any action, you didn’t fire anybody, you haven’t compensated a single victim” of child sexual abuse perpetrated through Meta’s platforms.
Senators lined up to put Zuckerberg – along with X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew, and Discord CEO Jason Citron – through the wringer as the heads of the world’s largest social media platforms were taken to task for failing to curb the grooming and ‘sextortion’ of children through their platforms.
Social media platforms “are responsible for many of the dangers our children face online,” chairman Dick Durbin said in attacking the platforms’ “design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety [which] have all put our kids and grandkids at risk.”
Texas Senator Ted Cruz showed Zuckerberg a warning screen presented to Instagram users who searched for child abuse material, which provided them with two links: ‘get resources’ or ‘see results anyway’.
“The basic science behind that is when people are searching for something that is problematic,” Zuckerberg offered in a mealy-mouthed reply, “it’s often helpful, rather than just blocking it, to direct them towards something that could be helpful to get help.”
“I understand ‘get resources’,” Cruz spat back. “In what sane universe is there a link for ‘see results anyway’?”
Spiegel – who testified that default contact settings “make it as difficult as possible for people to be contacted by someone they don’t already know” – also apologised, addressing the parents of children who had been able to buy illegal drugs through Snapchat and saying that “we work very hard to… proactively look for and detect drug-related content.”
Stepping up the fight
The bipartisan hearing – which was livestreamed online and included the first ever Congressional testimony by Yaccarino, Spiegel and Citron as well as statements from Zuckerberg and Chew – reflects escalating efforts to stem the flood of child sexual abuse material (CSAM) online.
The US Congress is considering five separate pieces of legislation – including the STOP CSAM Act, REPORT Act, EARN IT Act/COSMA, SHIELD Act and Project Safe Childhood Act – that, the National Center for Missing and Exploited Children (NCMEC) has said, “could move mountains for survivors of CSAM.”
NCMEC’s CyberTipline received more than 83 million reports of such material between 2020 and 2022.
The AFP-led Australian Centre to Counter Child Exploitation (ACCCE) – which receives around 300 reports of sextortion targeting Australian children every month – recently partnered with Meta and Kids Helpline to increase available support.
Automated reporting tools helped Facebook identify and report 21.1 million CSAM instances during 2022 alone, a new Surfshark analysis found, compared to 5 million reports by Instagram, 2.2 million by Google, 1 million by WhatsApp, 551,086 by Snapchat – and just 98,050 reports by Twitter.
Twitter’s tally increased to 850,000 in 2023 after the company was bought by Elon Musk and rebranded as X, Yaccarino said during testimony in which she touted “material changes to protect minors” – including technological improvements that had led to the February 2023 submission of the company’s first fully automated NCMEC report.
“X is an entirely new company,” Yaccarino said, noting that it last year suspended 12.4 million accounts for violating the company’s child sexual exploitation (CSE) policies – up from 2.3 million accounts removed in 2022 under Twitter management.
“We have strengthened our enforcement,” she said, “with more tools and technology to prevent those bad actors from distributing, searching for and engaging with CSE content.”
“It is time for a federal standard to criminalise the sharing of nonconsensual intimate material,” she continued. “We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up.”
The shift away from human CSAM scanning marks a significant change in social media giants’ fight against abuse material – yet some are concerned the technology won’t keep up, given that Musk’s takeover saw the service gut its child safety team and reinstate banned accounts.
X – which was recently kicked off the Digital Industry Group (DIGI) misinformation code and copped criticism from Australian regulators for ignoring a $610,500 fine levied over reporting failures, prompting authorities to file a civil case against the site – saw its reliance on automation come under fire again when its technology proved unable to stop users from finding sexually explicit Taylor Swift deepfakes.
The surge in US regulators’ attention to CSAM follows similar steps in other countries, with the EU recently forcing large pornography sites to follow strict rules.
Australia’s eSafety Commissioner is currently writing new rules to force social media firms to step up after two proposed industry codes were rejected because they “did not contain appropriate community safeguards for users in Australia.”