Mark Zuckerberg, the CEO of Meta, recently took the stand in a significant trial concerning social media companies. The trial centers on whether platforms like Instagram adequately protect their users, particularly minors. Under intense questioning, Zuckerberg acknowledged that while Meta has made strides in identifying underage users, he wished the company had acted more quickly. He stated, “I always wish that we could have gotten there sooner.”
When asked about age verification, he noted that some users lie about their age, and his company does remove those users when identified. Plaintiffs’ lawyers challenged this, asking if it’s realistic to expect children to understand the platform’s rules. Zuckerberg’s response was that he found the issue unexpectedly complicated.
He also addressed how a company should care for its users, saying, “I think a reasonable company should try to help the people that use its services.” Acknowledging his public speaking struggles, he humorously admitted, “I think I’m actually well-known to be sort of bad at this.”
The courtroom atmosphere was tense, heightened by the presence of attendees wearing Meta’s AI glasses. The judge warned that anyone caught recording with the devices could face legal consequences.
The trial highlights serious concerns about how Meta’s platforms might affect young people’s mental health. For example, a key plaintiff, referred to as KGM, claims that her addiction to Instagram and YouTube contributed to her severe depression and suicidal thoughts. This case is part of a broader legal push against social media companies, alleging they intentionally design their platforms to be addictive.
In prior testimony, Instagram head Adam Mosseri downplayed addiction, suggesting that high usage can be “problematic” but is not clinically addictive. Psychologists and researchers disagree, finding significant mental health risks associated with heavy social media use.
In the past two years, several families have voiced their frustrations. One parent, John DeMay, shared his heartbreak over the loss of his son to suicide after an online experience. DeMay has been advocating for child safety online, pushing for more accountability from tech giants. He expressed hope that this trial could lead to reform, even if the outcome is uncertain.
Meanwhile, Meta is facing separate challenges in New Mexico, with claims that it failed to disclose risks related to its platforms. The company denied those allegations.
To address safety, Instagram has introduced features aimed at young users, yet a 2025 review found that many of these tools are ineffective: fewer than one in five worked as intended, raising further doubts about the platform’s commitment to protecting kids.
Ex-employees have also raised alarms. Kelly Stonelake, who worked at Meta, claimed that her concerns about child safety led to harassment and retaliation. Her lawsuit highlights issues of data collection on minors without parental consent and exposure to unsafe environments on the platform.
As this trial unfolds, it reflects a growing awareness and worry about the effects of social media, especially on younger audiences. Lawmakers and families are increasingly demanding change and accountability from tech companies, hoping that the courts can step in where regulation has struggled.