The chief executives of Meta, TikTok, Snap, X, and Discord are under fire over child safety concerns, landing them in Congress to defend their companies. Lawmakers criticized the heads of some of the biggest social media platforms for doing far too little to protect kids and teens online. The hearing began at 10am on January 31st, with Meta’s Mark Zuckerberg among the witnesses.
On November 20, 2023, Chair Durbin and Ranking Member Graham announced subpoenas compelling the CEOs of Discord, Snap, and X to testify at a full committee hearing after the companies refused to cooperate. On November 29th, Durbin announced that the companies would testify before the committee on January 31st, 2024.
Wednesday’s hearing, titled “Big Tech and the Online Child Sexual Exploitation Crisis,” covered an array of recent and ongoing concerns that highlight how social media platforms fail to protect young people, including grooming, kidnapping, and exploitation facilitated through Discord, drug sellers operating on Snapchat, and the promotion of self-harm content on TikTok. Tech companies are likely to make platform and policy changes following the hearing; Meta, for example, recently updated Instagram to restrict teens from receiving direct messages from unknown users.
Mark Zuckerberg, facing pressure from lawmakers, apologized to parents in the audience, some of whom say their children died as a result of content on Meta’s platforms.
“I’m sorry for everything you’ve all gone through. It’s terrible,” Zuckerberg said. “No one should have to go through the things that your families have suffered.”
Despite the criticism, Zuckerberg said that Meta has “40,000 people overall working on safety and security” and invested $5 billion on these safety efforts in 2023 alone. “We’re committed to protecting young people from abuse on our services, but this is an ongoing challenge,” he said, describing how difficult it is for the company to keep adapting and improving its strategies in response to new threats.
It is clear that these tech companies are under-regulated and have designed platforms that are inherently addictive and harmful to young people. Meta has been hit with numerous lawsuits surrounding child exploitation, including a civil suit filed against the company in December alleging a failure to address how its apps enable sexual predators.
Lawmakers have proposed the Stop CSAM Act as one possible response to child exploitation; the bill would allow victims of child sexual exploitation to sue the platforms that facilitated it. Some lawmakers have also been pushing the Kids Online Safety Act, which would require platforms to offer stronger parental controls.
Any proposed regulation of these companies will take time to pass through Congress and fully come into effect. Some worry that these bills could lead to censorship of content related to reproductive rights and compromise the privacy of minors. Civil society groups including the American Civil Liberties Union and Fight for the Future, along with more than 90 other organizations, warned in a letter that the bill would put users at risk by encouraging unnecessary data collection.