Chemerinsky: Why tech giants shouldn’t be liable for creating addictive platforms

Mercury News

Although social media companies are in many ways villains that have not done nearly enough to protect children on their platforms, they nonetheless should not be held liable based on claims that they are creating addictive and harmful online environments.

Last week, a trial began in Los Angeles Superior Court in a lawsuit brought by a woman, referred to in documents as Kaley G.M., against tech giants YouTube and Instagram. (TikTok previously settled with her). The plaintiff’s claim is that these platforms were built specifically to be addictive to children. Hers is just one of more than 2,500 lawsuits now pending that are based on a variety of legal claims against some of the world’s largest corporations.

    The core of these lawsuits is that internet and social media companies, including those owned by Meta and Google, should be held liable on the same theory famously used against Big Tobacco: that brands knowingly created an addictive product. But the analogy fails for one simple reason. Internet and social media companies are engaged in speech, protected by the First Amendment, while no constitutional right is involved in regulating cigarettes and other tobacco products.

    The suits against the social media companies contend that they design the platforms in a way to keep children engaged for long periods and keep them coming back for hours on end. But you could say that about all forms of media. Books, including those for children, are often written with cliffhangers at the end of each chapter to keep people reading. Television series do the same, encouraging people to keep watching or even “bingeing” as long as they can last. Video games are obviously designed to keep people, including children, playing into the wee hours.

    Algorithms are speech

    Holding any media company liable for the content of its speech raises grave First Amendment issues. The plaintiffs in these suits are claiming that the algorithms are built and tailored towards individual users to keep them hooked. But algorithms are themselves a form of speech and there is no reason to treat this speech any differently from TV scripts or novels or the code that makes video games work. As Supreme Court Justice Elena Kagan wrote in a 2024 opinion, “The First Amendment … does not go on leave when social media are involved.”

    The Supreme Court’s decision in Brown v. Entertainment Merchants Association (2011) is crucial here. The case involved the constitutionality of a California law that made it a crime to sell or rent violent video games to those under 18 without parental consent. The Supreme Court, in an opinion by Justice Antonin Scalia, declared the California law unconstitutional. At the outset, the court expressly rejected the argument that there was lesser constitutional protection because the law was designed to protect children.

    The court instead declared that “minors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them.”

    California argued that playing interactive violent video games has a deleterious effect on children, making them more prone to commit violent acts. However, the court rejected this argument and stressed the heavy burden of proving causation that must be met in regulating speech.

    Scalia, writing for the majority, concluded that, “California cannot meet [strict scrutiny.] At the outset, it acknowledges that it cannot show a direct causal link between violent video games and harm to minors. … The State’s evidence is not compelling. … They show at best some correlation between exposure to violent entertainment and minuscule real-world effects, such as children’s feeling more aggressive or making louder noises in the few minutes after playing a violent game than after playing a nonviolent game.”

    The court concluded that the government could not possibly prove the causation necessary to hold video game companies liable for their content. The same, of course, is true of internet and social media companies, each of which is a unique platform for communication.

    But, as the Supreme Court recognized in Packingham v. North Carolina (2017), social media platforms are “the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge.” The Court forcefully concluded that it “must exercise extreme caution before suggesting that the First Amendment provides scant protection for access to vast networks in that medium.”

    Tech isn’t tobacco

    There are other legal obstacles to holding internet and social media companies liable for creating addictive and harmful online environments for children. Section 230 of the Communications Decency Act provides that these platforms cannot be held liable for the content posted on their sites, whether that involves decisions about what to include or what to take down. The pending lawsuits against internet and social media companies cannot overcome this immunity.

    None of this is to deny how some children are harmed by time spent on social media. There are studies showing that use of the platforms is correlated to depression, low self-esteem and bullying. There are also studies showing that playing violent video games can be linked to anti-social behavior. The solution is not to restrict speech or hold those responsible for it liable. Ultimately, parents need to make more careful choices about when and how to allow their children to engage on social media. Meanwhile, these tech giants should certainly exercise more care in material directed at children.

    Ultimately, it will be for the Supreme Court, not the jury in Los Angeles Superior Court, to decide whether social media companies can be held liable on these grounds. The answer is clear: Social media is speech, tobacco isn’t and that makes all the difference.

    Erwin Chemerinsky is the dean of the UC Berkeley Law School. ©2026 Los Angeles Times. Distributed by Tribune Content Agency.
