Zuckerberg Testifies in Court: Social Media Addiction Lawsuit Reshapes Tech Giants' Responsibilities
February 21, 2026
Zuckerberg Testifies in Court: The Battle Over Tech Giants' Responsibility in Social Media Addiction Lawsuits
On February 19, 2026, in Courtroom 24 of the Los Angeles County Superior Court, Meta CEO Mark Zuckerberg took the witness stand for cross-examination by the plaintiff's attorney. The lawsuit has drawn intense attention from the legal community: a 20-year-old California woman, identified by the pseudonym KGM, is suing Meta's Instagram, Google's YouTube, TikTok, and Snap, alleging that the platforms were deliberately designed with addictive features that led her to develop body dysmorphic disorder, anxiety, and depression. At the core of the case is whether social media companies should bear legal responsibility for psychological harm to teenage users. On the stand, Zuckerberg said the attorney was misreading the company's internal communications and stressed that Meta's mission is to build useful services.
Courtroom Battle: Internal Documents and the "Digital Casino" Allegation
Plaintiff's attorney Mark Lanier presented a series of internal Meta documents during his opening statement. A 2020 product memo revealed that the algorithm behind Instagram's Explore page explicitly aimed to maximize the time users spent on the platform. A 2021 user research report described teenagers' use of Instagram Reels, the platform's short-video feature, as a compulsive usage loop: 62% of users aged 13-17 in the test group reported being unable to control the impulse to keep scrolling, and the average single session exceeded 47 minutes.
Lanier compared the platforms to digital casinos. He cited records from a 2022 internal Meta product strategy meeting at which a product manager proposed introducing variable reward mechanisms, the core psychological principle behind casino slot machine design, to increase engagement frequency among teenage users. Under cross-examination, Zuckerberg acknowledged that the document existed but said it was merely an initial idea that was never adopted. He repeatedly emphasized: "Our goal is to create valuable products that people naturally want to use."
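The trial record does not describe the proposed mechanism in technical terms, but the principle Lanier invoked, the variable-ratio reward schedule, is simple enough to sketch. The short Python simulation below is purely illustrative; the function name and the 15% reward probability are assumptions for demonstration, not figures from Meta's documents.

```python
import random

def variable_ratio_rewards(num_scrolls: int, reward_probability: float = 0.15) -> list[int]:
    """Simulate a variable-ratio schedule: each scroll *might* surface a
    highly engaging item, but the user can never predict which one will.

    The unpredictable spacing of the "hits," not their average rate, is the
    property slot machines exploit to keep players pulling the lever.
    """
    hits = []
    for scroll in range(1, num_scrolls + 1):
        if random.random() < reward_probability:
            hits.append(scroll)  # an unpredictably timed rewarding item
    return hits

random.seed(42)
hits = variable_ratio_rewards(100)
gaps = [b - a for a, b in zip([0] + hits, hits)]
print(f"{len(hits)} rewarding items in 100 scrolls; irregular gaps: {gaps}")
```

The irregular gaps are the point: because the next reward could always be one scroll away, stopping carries a felt cost, which is why the comparison to slot machines has legal traction.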
Meta's legal team adopted a dual defense strategy. They presented KGM's medical records from 2018 to 2023, which indicated that the focus of treatment had consistently been on domestic abuse and emotional trauma; only one out of five psychological evaluations mentioned excessive social media usage. Meanwhile, Dr. David Greenfield, Meta's chief psychologist, testified that there is currently no widely accepted medical diagnostic standard for social media addiction. The American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), only lists internet gaming disorder as a condition requiring further research.
Regulatory Wave: The Global Shift from European Bans to U.S. Litigation
This lawsuit comes amid a rapidly shifting global regulatory environment. In January 2025, the Dutch Data Protection Authority fined TikTok 37.5 million euros for failing to adequately verify user ages, which allowed at least one million children under 13 onto the platform in violation of the rules. In March of the same year, the UK's Online Safety Act formally took effect, requiring social media platforms to apply age-appropriate design for users under 18, including disabling location tracking by default, prohibiting personalized advertising, and providing usage-time reminder tools.
The EU's actions go further still. In November 2025, the European Parliament passed an amendment to the Digital Services Act by 489 votes to 152, explicitly prohibiting major social platforms such as Meta, TikTok, and Snap from serving algorithmically recommended content to minors under 16. The ban takes full effect on January 1, 2027, and non-compliant companies face fines of up to 6% of global annual turnover. At a press conference in Strasbourg, EU Internal Market Commissioner Thierry Breton said: "We cannot allow tech companies to exploit addictive designs to plunder the mental health of the next generation."
Political pressure inside the United States is mounting as well. By February 2026, attorneys general from 34 states had filed a joint lawsuit against Meta, accusing it of violating state consumer protection and unfair trade practices laws. The 127-page complaint filed by the office of California Attorney General Rob Bonta cited research data from an internal Meta project codenamed Project Athena, which tracked the Instagram usage of 5,000 adolescents aged 12-18 and found that users who spent more than two hours per day on the platform were 37% more likely to exhibit depressive symptoms than the control group.
Consensus and Divergence in Academia: Jonathan Haidt's Research Framework
Professor Jonathan Haidt, a social psychologist at New York University's Stern School of Business, did not appear in court, but his research forms an important academic backdrop to the case. In his 2024 book "The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness," Haidt argues that the proliferation of smartphones and social media between 2010 and 2015 coincided almost exactly with the deterioration of adolescent mental health indicators.
Haidt's team analyzed data from the U.S. Centers for Disease Control and Prevention's Youth Risk Behavior Surveillance System covering 2011 to 2021. The proportion of high school girls reporting persistent feelings of sadness or hopelessness rose from 36% in 2011 to 57% in 2021, while the proportion of adolescents engaging in self-harm behaviors rose from 17% to 30% over the same period. Adolescents who used social media for more than three hours per day were 2.3 times more likely to experience anxiety symptoms than those who used it for less than one hour.
Academic opinion, however, is not unanimous. A 2023 meta-analysis from the Oxford Internet Institute, published in *Nature Human Behaviour*, offered a different perspective. Integrating longitudinal data from 72 studies covering 400,000 adolescents, it concluded that the average association between social media use and mental health problems is negligibly small, with a correlation coefficient of only 0.05. Lead researcher Professor Andrew Przybylski told *Science*: "Simply attributing the adolescent mental health crisis to social media may lead us to overlook more fundamental socioeconomic factors, such as increasing educational pressure, weakening community ties, and rising economic uncertainty."
This academic disagreement directly shaped the court's treatment of the evidence. Meta's legal team called Jeff Hancock, a professor of communication at Stanford University, who cited his 2025 research: a two-year study tracking 1,500 adolescents found large individual differences in how social media use affects mental health, with adolescents receiving high levels of family support showing almost no negative effects. The plaintiff's side called Dr. Michael Rich, director of the Digital Wellness Lab at Boston Children's Hospital, who presented a 2024 brain imaging study showing that adolescents who frequently use Instagram exhibited activity patterns in the prefrontal cortex remarkably similar to those of substance addicts when viewing idealized photos of their peers.
The Dilemma of Industry Logic: Balancing Growth Pressure and Safety Responsibilities
During his February 19 testimony, Zuckerberg highlighted the dilemma facing tech companies: "If you create something useful, people will naturally want to use it more. If we do a good job, people will spend more time on our services than on other things. This is a fairly normal business dynamic." Behind this statement lies Meta's financial reality: its fourth-quarter 2025 earnings report shows advertising accounting for 97.8% of total revenue, and the core determinant of ad pricing is precisely user engagement and time spent.
Internal documents reveal that Meta ran an A/B test codenamed "Lighthouse" in 2023. Teenage Instagram accounts in the experimental group were limited to one hour of use per day; the control group faced no restriction. After 30 days, the experimental group's ad click-through rate had fallen 19% and per-user ad revenue had dropped 23%. The report sparked internal debate, and the product team ultimately decided not to roll out the time limit feature for the time being, citing the need for further research to balance user experience with business sustainability.
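Read as an experiment summary, the leaked figures are simple relative changes between the two arms of the test. The sketch below shows that arithmetic; the absolute baseline values are invented for illustration, since the raw numbers behind the "Lighthouse" report are not public.

```python
def relative_change(control: float, treatment: float) -> float:
    """Percentage change of the treatment group relative to the control."""
    return (treatment - control) / control * 100

# Hypothetical baselines chosen only to reproduce the reported deltas.
ctr_control = 0.021                      # control-group ad click-through rate
ctr_limited = ctr_control * (1 - 0.19)   # experimental group, 1-hour daily cap
arpu_control = 4.30                      # control-group ad revenue per user, USD
arpu_limited = arpu_control * (1 - 0.23)

print(f"CTR change:  {relative_change(ctr_control, ctr_limited):+.1f}%")    # -19.0%
print(f"ARPU change: {relative_change(arpu_control, arpu_limited):+.1f}%")  # -23.0%
```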
The deeper conflict lies in the platform governance model. At a congressional hearing in September 2025, Monika Bickert, Meta's global head of safety policy, disclosed that the company's content moderation operation included 1,273 full-time employees dedicated specifically to youth safety, while the platform counted 3.57 billion monthly active users over the same period. On average, each safety employee was responsible for content generated by roughly 2.8 million users. Bickert acknowledged: "We rely on algorithms for initial screening, but the accuracy of our machine learning models in identifying complex situations, such as psychological harm, is currently only 67%."
This lawsuit may establish a new liability paradigm. Writing in the January 2026 issue of The Atlantic, Harvard Law School professor Lawrence Lessig observed: "The pivotal turning point in tobacco litigation, which led to the Master Settlement Agreement in 1998, was internal documents proving that companies knew their products were harmful yet concealed the facts. Now, social media litigation is following a similar trajectory: the question is no longer 'whether it causes harm,' but 'how much the companies knew, when they knew it, and what they did about it.'" Lessig noted that in 1998 the tobacco industry agreed to pay 246 billion dollars over the following 25 years; if social media litigation follows a similar path, it could produce a settlement in the hundreds of billions of dollars in the United States alone.
Future Vision: Technological Remediation, Regulatory Restructuring, and Intergenerational Dialogue
The trial in Los Angeles is expected to run until April 2026. Whatever the verdict, the technology industry has already entered a period of adjustment. In January 2026, Meta announced that Quiet Mode would be enabled by default on Instagram and Facebook, automatically suppressing notifications from 11 PM to 6 AM; the feature is mandatory for users under 18. The same month, TikTok launched Family Safety Mode, which lets parents remotely set content filtering levels and usage time limits for teenage accounts.
More fundamental changes may come at the level of technical architecture. In December 2025, the Center for Humane Technology, founded by former Google design ethicist Tristan Harris, released version 2.0 of its Youth Safety Design Standards, comprising 57 specific technical recommendations. The most disruptive is Article 41: unlimited content streams should not be provided within a single session; natural breakpoints should be set (for example, prompting a break after every 20 pieces of content viewed). The standard has won support from the EU's Digital Services Coordinators and may become the technical benchmark for regulatory review in 2027.
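Article 41 is concrete enough to express directly in code. The minimal Python sketch below inserts the recommended natural breakpoint into an otherwise continuous feed; the 20-item threshold comes from the standard's own example, while the function name and prompt text are invented for illustration.

```python
BREAKPOINT_INTERVAL = 20  # items between prompts, per the standard's example
BREAK_PROMPT = "You've viewed 20 posts. Time for a break?"

def feed_with_breakpoints(items):
    """Yield feed items, but interrupt with a break prompt after every
    BREAKPOINT_INTERVAL pieces of content instead of streaming endlessly."""
    for count, item in enumerate(items, start=1):
        yield item
        if count % BREAKPOINT_INTERVAL == 0:
            yield BREAK_PROMPT

# A 45-item session yields break prompts after items 20 and 40.
session = list(feed_with_breakpoints(f"post-{n}" for n in range(1, 46)))
print(session[19:22])  # ['post-20', "You've viewed 20 posts. ...", 'post-21']
```

The design choice regulators care about is that the interruption is structural, baked into the feed iterator itself, rather than an optional reminder a teenager can dismiss in settings.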
Darrell West, a senior fellow at the Brookings Institution who has long tracked technology policy, wrote in a February 2026 policy brief: "The true value of this lawsuit lies not in the compensation amount, but in forcing the entire industry to open the black box of its design decisions. Just as the automotive industry must publish crash test data, social media platforms may need to disclose addiction test results." He predicted that within the next three years the United States might establish a Digital Product Safety Administration, analogous to the Food and Drug Administration, to conduct pre-market reviews of social media algorithms.
Outside the Los Angeles courthouse, protesters from the organization Design for Children held up signs reading "Our children are not KPIs," sunlight glaring off the building's glass curtain wall. Inside, Zuckerberg was answering questions about the original intent behind the design of the Like button. The exchange had long since outgrown its legal boundaries, becoming an era's collective inquiry into technological ethics: when the logic of growth collides with human well-being, how should we redefine the meaning of progress? The answer will not arrive today, but the question has been permanently etched into the code of Silicon Valley and the record of the courtroom.