Article Summary
In March 2026, a California jury found Meta and YouTube liable for deliberately designing addictive platforms that harmed a young user. The case bypassed Section 230 by targeting platform design, not content. For anyone who works in UX design, builds digital products, or hires teams that do, this verdict changes the conversation about what design is responsible for.
Key Takeaways
- A jury found Meta and YouTube negligent for engineering addiction through design features like infinite scroll, autoplay, and push notifications, awarding $6 million in damages.
- The case bypassed Section 230 by arguing that platform design, not user content, caused harm. That legal distinction has implications far beyond social media.
- Meta's own internal documents showed the company deliberately targeted users under 13, despite its own age policy.
- This is the first time a jury has drawn a direct line between UX design decisions and measurable user harm.
- The verdict is a bellwether for over 2,400 pending lawsuits. Design teams across the tech industry should be paying attention.
- Infinite scroll, autoplay, and notifications are not inherently bad. The ethical question is how they're applied and who they're applied to.
- Australia has already banned under-16s from social media. France, the UK, and the EU are drafting similar legislation. The regulatory environment is shifting fast.
Full Article
A jury just put UX design on the stand
On March 25, 2026, a California jury found Meta and YouTube liable on all counts in a landmark addiction case. The plaintiff, a 20-year-old woman identified as Kaley, started using YouTube at age six and Instagram at nine. By the end of elementary school, she had posted hundreds of videos and was experiencing anxiety, depression, and body dysmorphia. The jury awarded $6 million in damages: $3 million compensatory, $3 million punitive. Meta was held responsible for 70% of the harm. YouTube, 30%. At the core of the case was a simple question: are platforms liable for the time users spend on them when the design is deliberately engineered to prevent them from stopping?
What makes this case different from every other attempt to hold social media companies accountable is what it targeted. Previous lawsuits focused on content: what users posted, what the algorithm surfaced, what moderators failed to catch. Those cases ran straight into Section 230 of the Communications Decency Act, a 1996 law that shields platforms from liability for user-generated content. For 30 years, that shield held.
This case went around it. The plaintiffs argued that the harm didn't come from what was on the platforms. It came from how the platforms were designed. Infinite scroll. Autoplay. Push notifications. Algorithmic recommendations. Beauty filters. These are UX design features. The jury agreed they were deliberately engineered to be addictive, and the companies knew it.
That distinction between design and content opened a crack in Section 230 that 30 years of litigation couldn't.
The internal documents tell the real story
During the trial, Meta's internal documents were entered into evidence. One read: "If we wanna win big with teens, we must bring them in as tweens." Another showed that 11 year olds were four times more likely to return to Instagram than users on competing apps. This was despite Meta's own policy requiring users to be at least 13 years old (1).
Mark Zuckerberg himself testified in February 2026. A former product manager's email, also entered as evidence, stated: "Our overall company goal is total teen time spent." Another noted that Zuckerberg had decided teens were the company's top priority for the first half of 2017.
There's also the language Meta used internally. The company avoided the word "addiction" and instead used phrases like "internet use disorder," a framing choice designed to minimize the severity of what its own research was showing.
This is the part that should concern anyone who works in UX design or makes design decisions for digital products. Meta had the data. They had research showing the harm. And the design teams kept building features that deepened engagement through addictive mechanics anyway.
Dark patterns at an industrial scale
When we talk about dark patterns in UX design, most people think of annoying tricks: a button that's hard to find, an opt-out buried three pages deep, a subscription that's easy to start and impossible to cancel. Those are frustrating. They erode trust. A 2022 European Commission report found that 97% of popular websites and apps deployed at least one dark pattern.
But what the Meta and YouTube verdict exposed is something beyond typical dark patterns. This is behavioral engineering at scale, applied deliberately to the most vulnerable users. Plaintiffs described Instagram and YouTube as a "digital casino." The comparison is accurate. Slot machines use variable reward schedules, the same psychological mechanism that drives infinite scroll. You don't know what's coming next, so you keep pulling. Except the users pulling the lever are children. These aren't just design features that trick users into one extra click. They're systems built to exploit developing brains.
The ethical considerations here are not abstract. They're legal. And they have been quantified by a jury in dollar amounts.
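The slot-machine comparison can be made concrete. Below is a minimal sketch contrasting the two reward schedules; the function names and the 30% reward probability are our own illustration, not figures from the case. Under the variable schedule, the next pull always might pay off, which is why there is no natural stopping point.

```python
import random

def fixed_reward(pull_count: int, interval: int = 3) -> bool:
    """Fixed-ratio schedule: the reward lands predictably every N pulls,
    so a user can collect one and feel done."""
    return pull_count % interval == 0

def variable_reward(rng: random.Random, p: float = 0.3) -> bool:
    """Variable-ratio schedule: the reward lands unpredictably, like a
    slot machine payout or a feed refresh that *might* surface
    something new."""
    return rng.random() < p

rng = random.Random(42)  # seeded so the demo is repeatable
pulls = [variable_reward(rng) for _ in range(20)]
# Some pulls reward, most don't, and no pattern predicts which --
# the uncertainty itself is what keeps the lever getting pulled.
```

The design lesson is in the contrast: a fixed schedule gives users a point of closure, while a variable one deliberately withholds it.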
The tools aren't the problem. The intent is.
This is where the conversation gets muddied in design circles, and it's something we've spent time discussing internally at Tennis, on our podcast, and in our UX resources.
Infinite scroll is not inherently evil. It's genuinely useful when you're navigating a long dataset or browsing a product catalog. Autoplay makes sense in contexts where the user has clearly expressed intent to consume sequential content. Notifications are critical for time sensitive information. These are legitimate design patterns with real utility.
Aza Raskin, who invented infinite scroll in 2006, has expressed regret over what his creation became. He even testified in the New Mexico trial in February 2026. His take: "Optimizing something for ease of use does not mean best for the user or humanity." That's a foundational question in human-computer interaction, and it just got a legal answer.
We agree with Raskin, but we'd add a qualifier. The problem wasn't the feature. The problem was the intent behind its application. When a UX design team takes a tool that helps users navigate content efficiently and warps it into a mechanism that keeps children on a platform for the sake of ad revenue, that's a design choice with ethical consequences. UX designers don't accidentally engineer addiction. That takes deliberate strategy, research, and sustained effort. Ethical UX design requires that practitioners interrogate intent at every stage of the process, and many organizations now look for comprehensive design services that bake that scrutiny into how products are planned and shipped.
The line between designing for engagement and engineering for addiction isn't blurry. It's just one that some companies chose to cross.
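One way to see how intent shows up in implementation is a hypothetical sketch of continuous loading that keeps its convenience but restores a natural stopping point. The class name, the five-page threshold, and the confirmation flow below are our own illustration, not any platform's API.

```python
from dataclasses import dataclass

@dataclass
class FeedSession:
    """Hypothetical feed pager: content still loads without a click,
    but a stopping cue reappears after each batch of pages."""
    pages_loaded: int = 0
    pause_after: int = 5  # ask for explicit intent every N pages

    def next_page(self, user_confirmed: bool = False) -> bool:
        # After `pause_after` pages, loading halts until the user
        # explicitly asks to continue -- a cue, not a wall.
        if self.pages_loaded >= self.pause_after and not user_confirmed:
            return False
        if user_confirmed:
            self.pages_loaded = 0  # confirmation starts a fresh batch
        self.pages_loaded += 1
        return True
```

The feature is the same, and the utility is the same. The difference is that after a batch the interface asks rather than assumes, which is the kind of choice a jury can now put a dollar figure on.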
Big tobacco, same playbook
The comparisons to the 1998 Tobacco Master Settlement Agreement aren't new, but they've never been more relevant. The parallels are specific. Both industries had internal research showing their products caused harm. Both publicly denied it. Both targeted young users to build lifelong engagement. Both eventually faced coordinated legal action from state attorneys general.
The tobacco settlement resulted in $206 billion in payments over 25 years, advertising restrictions that prohibited marketing to minors, and the dismantling of industry trade groups that coordinated public deception. Youth smoking rates dropped from 36% in 1997 to 6% by 2019.
The social media litigation is following a similar trajectory. The day before the California verdict, a New Mexico jury hit Meta with a $375 million penalty for misleading users about safety and failing to protect children from exploitation. As of April 2026, over 2,400 cases are pending in the federal MDL alone. More than 41 state attorneys general have filed or joined suits. Nearly 800 school districts have filed claims.
Meta's stock dropped 8% the day after the verdict. The company lost roughly $300 billion in market value over March 2026.
The regulatory wave is already here
Governments aren't waiting for the courts to finish. Australia banned users under 16 from social media in December 2025, becoming the first country to do so. Within months, 4.7 million accounts had been removed or restricted. France is drafting legislation to restrict under-15s by September 2026. The UK, Denmark, Malaysia, Spain, and Greece are all considering similar restrictions. The EU is working on a bloc-wide framework.
In the US, the Kids Online Safety Act gained renewed momentum after the verdict. Minnesota will become the first state to require mental health warnings on social media platforms starting July 2026.
For companies building digital products, the message is clear: user privacy, age gating, and ethical design practices are becoming legal requirements, not nice-to-haves, and understanding the web development process end to end is now part of staying compliant. The regulatory environment around data collection, user consent, and design transparency is tightening globally, and teams need practical resources for web, product, and marketing work to keep their practices aligned.
What this means if you build things
This verdict matters beyond social media. It established that UX design decisions can be treated as product defects. That's new legal territory, and the regulations following it are moving fast.
If you're a marketing director evaluating a new website or platform build, this is worth asking about. How does your team approach ethical UX design? What's the process for identifying when engagement features cross into manipulative territory? Is anyone auditing your user experience for patterns that prioritize business goals over user well-being? A clear, user-centered website project brief is the first place those expectations should be defined. These are no longer just ethical considerations. They're risk management.
If you're a product manager or CTO building an app, the same questions apply with higher stakes, particularly if your product has users under 18, processes personal data, or relies on behavioral engagement metrics as a success measure. Partnering with product design and development services that understand these risks can help. UX designers on your team should be equipped to flag patterns that could expose the organization to liability, and they need development partners who can implement those decisions through user-centric development services without reintroducing risky shortcuts.
At Tennis, we've always treated the website as a product, not a project, reflecting our broader commitment to a holistic user experience. Our focus on building better digital experiences means design decisions carry weight beyond launch day. They affect users, they affect trust, and now, demonstrably, they affect legal liability. The companies that come out ahead will be the ones helping users make informed decisions, not the ones engineering around them, using modern website design and development practices that center clarity, accessibility, and long-term trust.
Frequently Asked Questions (FAQ)
What are dark patterns in UX design?
Design elements that trick users into actions they didn't intend. The Meta verdict expanded this to include systemic features like infinite scroll and algorithmic recommendations.
What's the difference between designing for engagement and engineering addiction?
Intent and audience. Engagement helps users accomplish goals. Addiction exploits psychology to keep users on-platform regardless of benefit, which is exactly what Meta's internal documents showed.
What is ethical UX design?
Building digital experiences that prioritize user well-being, privacy, and informed decisions alongside business objectives. Not ignoring business goals, but achieving them without exploiting users.
How does the Meta verdict affect companies outside of social media?
The jury treated addictive design as a product defect. That legal framework could apply to any digital product using behavioral engagement mechanics.
What is Section 230 and why did it matter in this case?
A 1996 law protecting platforms from liability for user-posted content. This case bypassed it by targeting platform design instead. That opens the door for over 2,400 pending lawsuits.
What happened to Meta's stock after the verdict?
It dropped roughly 8% the day after. Meta lost approximately $300 billion in market value over March 2026.
Are other countries regulating social media for minors?
Australia banned under-16s in December 2025. France, the UK, Denmark, and several other countries are drafting similar legislation. Minnesota will require mental health warnings on platforms starting July 2026.




