
Meta chief Mark Zuckerberg arrives at the Los Angeles Superior Court. (Photo by Jill Connelly/Getty Images)
I’ve started noticing something new in my house recently. My two young children, still many years away from having digital devices of their own, are already drawn to the glowing screens that dominate modern life. They reach for our phones and iPads whenever they get a chance. And they watch, closely, how the adults around them use these digital gateways.
That’s part of the reason why I’ve been paying close attention to a Los Angeles courtroom this week.
There, some of the most powerful executives in Silicon Valley, including Mark Zuckerberg, Instagram head Adam Mosseri, and YouTube chief Neal Mohan, are being forced onto the witness stand to defend the design of their social media platforms and their impact on young users. In fact, it's the first time Zuckerberg has ever faced a jury as head of the technology giant.
The trial centers on a now-20-year-old woman who alleges that her childhood use of social platforms, including Instagram and YouTube, contributed to depression, body dysmorphia, and suicidal thoughts. Her legal team argues the apps were deliberately engineered like "digital casinos" to maximize engagement among young users and keep them on the platforms as long as possible. While TikTok and Snap settled ahead of trial, Meta and YouTube have strongly disputed that characterization, pointing to age restrictions that bar users under 13 and to their parental-control tools.
The suit is just one of some 1,600 similar cases now working their way through the courts. As if to underscore that mounting legal pressure, Zuckerberg appeared to be served with another lawsuit as he was ushered into the courthouse Wednesday. Moments later, the judge overseeing the case, Carolyn Kuhl, admonished Zuckerberg's team for wearing Meta's Ray-Ban-branded A.I. smart glasses, complete with cameras and microphones, as they entered the courtroom.
"This is very serious," Kuhl said.
On the stand, the Meta founder denied that a 2015 email he wrote to staff had set a goal of increasing user time spent on Instagram. The figures were not "goals," Zuckerberg said, but an aspirational "gut check" meant to push managers to keep improving the platform.
While the high-stakes dispute plays out in court, it is clear the case could have an outsized impact on the still-unregulated wild west of social media. After more than a decade of mostly performative Congressional hearings and sternly worded letters, the federal government has imposed no meaningful constraints on the companies behind platforms that hundreds of millions of Americans use, platforms that are rewiring brains, shortening attention spans, and transforming the way the world consumes information.
Part of the problem has been that any serious attempt to regulate the platforms quickly runs into Section 230, the liability shield that has long protected online services from many lawsuits tied to user content. Lawmakers in both parties have criticized the twenty-six-word provision at the statute's core, but they have not come close to agreeing on what should replace it.
While a paralyzed Washington has sat on its hands, other nations have moved ahead with regulations. The European Union has imposed a more muscular set of rules and boundaries on tech platforms, including transparency and safety obligations. In Australia, a world-first ban barring children under 16 from the major social media platforms took effect in December. Britain, Denmark, and France are also considering similar safety rules.
Meanwhile, the U.S. has largely relied on voluntary commitments and pressure campaigns, including intense public pressure from Donald Trump and Republicans to end fact-checking on social media and the practice of deplatforming those who promote dangerous misinformation. Tech platforms have responded by acquiescing to Trump's demands, with Meta replacing global policy chief Nick Clegg with longtime Republican operative Joel Kaplan in a clear signal to the administration that it was willing to play ball.
Over at X, Trump ally Elon Musk has manipulated the former Twitter's powerful algorithm to promote conservative content while downranking posts from traditional media outlets, according to a new study published this week. “Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm,” the researchers wrote. That Grand Canyon-sized gap in transparency is one reason this week's trial in Los Angeles could prove consequential far beyond the particulars of any single plaintiff.
“This is a monumental inflection point in social media,” Matthew Bergman, a lawyer for the Social Media Victims Law Center, told The Associated Press. “When we started doing this four years ago no one said we’d ever get to trial. And here we are trying our case in front of a fair and impartial jury.”
These lawsuits are trying a different route to accountability. Instead of zeroing in on harmful posts—an area where the platforms have long been shielded by Section 230—the cases take aim at the products themselves: the algorithms that keep serving up content, endless scrolling, push notifications, and other design features that researchers say can hook young users.
Whether that theory ultimately succeeds remains an open question. But the pressure on the industry is no longer confined to Capitol Hill. After years of hearings that produced little lasting change, the courtroom may be where real consequences finally take shape. That legal reckoning is placing decisions in the hands of juries empowered to assign liability—not lawmakers seeking to score viral moments of their own for social media.

