As a freshman in high school, Caroline Koziol competed in the Connecticut statewide championship, swimming the 100-yd. butterfly in just over a minute. By senior year, she could barely climb the stairs without seeing black spots. In September 2021, Koziol’s coach had to pull her out of the pool after she nearly passed out during swim practice.
“My coach was like, ‘Just eat a granola bar. You’ll feel better,’” says Koziol, now a college junior. “And I was like, ‘That’s absolutely not going to happen.’”
Back then, Koziol was deep in the grips of an eating disorder that shattered her adolescence. Now, she’s suing the social media giants Meta and TikTok, alleging that the design of their products contributed to her anorexia and made it more difficult for her to recover.
When Koziol was stuck at home during the COVID-19 pandemic, she started looking up at-home workouts on social media to keep herself in shape for swimming, and searched for healthy recipes to make with her mom. Within weeks, her Instagram and TikTok feeds were full of content promoting extreme workouts and disordered eating. “One innocent search turned into this avalanche,” she says, sipping iced coffee at a shop near her parents’ home in Hartford. “It just began to overtake every thought that I had.”
Koziol, now 21, is among more than 1,800 plaintiffs suing the companies behind several leading social media platforms as part of a case that could reshape their role in American society. The plaintiffs include young adults recovering from mental-health problems, the parents of suicide victims, school districts dealing with phone addiction, local governments, and 29 state attorneys general. They’ve joined together as part of a multidistrict litigation (MDL), a type of lawsuit that consolidates similar complaints from around the country into one case to streamline pretrial proceedings. The case is now moving through federal court in the Northern District of California.
The plaintiffs allege that the products created by social media giants are “addictive and dangerous,” that the defendants have “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health,” and that young people like Caroline Koziol are the “direct victims of intentional product design choices made by each defendant.”
The MDL is expected to reach trial next spring. It comes as social media is monopolizing more and more attention from children and teens. The average teenager in the U.S. spends nearly five hours per day on social media, according to a 2023 Gallup poll; those who spend more than three hours on it daily have double the risk of depression and anxiety. One study found that clinical depression among young people was up 60% from 2017 to 2021, which critics attribute partly to rising social media use. Overall, 41% of teens with high social media use rate their mental health as poor or very poor, according to the American Psychological Association. “Social media is associated with significant mental health harms for adolescents,” wrote former surgeon general Vivek Murthy, who argues for putting warning labels on social media platforms as the U.S. does with tobacco.
As some of the most powerful companies in the world work to monetize their attention, kids across the country are struggling with the consequences of social media addiction. “They’re being stalked by these companies, stalked by these really disturbed values and beauty ideas, stalked by the phone in their hand,” says Dr. S. Bryn Austin, a professor of behavioral science at the Harvard School of Public Health. “The algorithm is finding them no matter what they’re doing.” To critics, there’s little question why: Austin co-published research which estimated that in 2022 alone, children under 17 delivered nearly $11 billion in advertising revenue to Facebook, Instagram, Snap, TikTok, Twitter, and YouTube.
Legislative efforts to regulate social media companies have mostly stalled in Congress, and the plaintiffs’ attorneys in the social media MDL say this case represents the best hope of holding these tech companies accountable in court—and getting justice for those who have been harmed. The MDL, says co-lead plaintiffs’ lawyer Previn Warren of Motley Rice, is a “giant coordinated siege by families and AGs unlike anything we’ve seen since the opioid crisis.”
The plaintiffs argue the companies were aware of the ramifications for young users, yet designed their products to maximize addictiveness anyway. Internal documents obtained by the Wall Street Journal in 2021 suggest that Facebook, now Meta, knew that Instagram was harming girls’ mental health: “We make body image issues worse for 1 in 3 teen girls,” said one slide presenting internal research. (Meta disputes this characterization, saying the same internal research shows teen girls were more likely to say Instagram made them feel better than worse.)
“These companies knew that there was a disproportionate effect on young women,” says Koziol’s lawyer, Jessica Carroll of Motley Rice. She represents eating disorder clients as young as 13 years old, including some who have been on social media since they were 10. “I think that they experimented on a generation of young people, and we are seeing the effect of that today.”
Suing social media companies is difficult. Most digital speech is protected by the First Amendment, and social media platforms have long asserted protections under Section 230 of the Communications Decency Act, which grants them immunity from liability for content they host. But the plaintiffs in the MDL aren’t suing over content. They’re suing on product-liability and negligence grounds, alleging that defective design features hook children on the platforms and allow them to evade parental consent, while the recommendation algorithms manipulate kids to keep them on the apps. Koziol isn’t suing Instagram’s and TikTok’s parent companies for hosting content that drove her to an eating disorder. She’s alleging that because the platforms are designed to maximize her engagement, she was drawn deeper and deeper into anorexia content.
“If I saw one or two videos, it wouldn’t have made a difference,” Koziol says. “But when it was all that I was seeing, that’s when it became obsessive.”
Critics say that bombardment is a deliberate feature, not a bug. “They have a machine-learning recommendation system that is oriented towards growth at all costs,” says Carroll, “which means anything to keep eyeballs on it.”
Meta, which owns Instagram, says it has sought to combat the harms experienced by underage users by rolling out new restrictions. As of last year, any Instagram user ages 13 to 18 defaults to what Meta calls a teen account, which is automatically private, limits sensitive content, turns off notifications at night, and doesn’t allow messaging with anyone who isn’t a mutual contact. The teen accounts are meant to address key concerns that parents have about teens on the app, which Meta spokeswoman Liza Crenshaw described as “content, contact, and time.” Meta has also developed “classifiers,” she said—AI programs that scan for problematic content to remove.
“We know parents are worried about having unsafe or inappropriate experiences online, and that’s why we’ve significantly changed the Instagram experience for tens of millions of teens with new teen accounts,” Crenshaw said in a statement to TIME. “These accounts provide teens with built-in protections to automatically limit who’s contacting them and the content they’re seeing, and teens under 16 need a parent’s permission to change those settings. We also give parents oversight over the teens’ use of Instagram, with ways to see who their teens are chatting with and block them from using the app for more than 15 minutes a day, or for certain periods of time, like during school or at night.”
“TikTok proactively restricts unhealthy weight loss content and provides access to supportive resources and experts right in our app to anyone in need,” a company spokesperson said in a statement to TIME. “We’re mindful that content that can be triggering is unique to each individual, which is why we empower people with a range of tools to shape their experience, including tools to manage topics and block keywords, hashtags, and creators in their For You feed.”
Caroline Koziol’s parents thought she was too young for social media as a preteen. But many of her friends in elementary school already had accounts on Instagram and TikTok, and Koziol felt left out. Both platforms have an age limit of 13. Koziol says she signed up when she was 10 or 11. “I was able to lie about my age really easily and make an account,” she recalls. “My parents had no idea.”
Age verification is an “industry-wide challenge,” says Crenshaw, adding that this year Meta started testing a new AI tool designed to proactively find accounts of teens masquerading as adults and transition them to teen accounts. Meta supports federal legislation that requires app stores to get parental approval whenever a teen under 16 downloads an app.
As a competitive swimmer, Koziol had spent much of her life in a bathing suit, and was comfortable in her body. “I never thought twice about other people’s bodies or my own,” she says. What she cared about was her athletic career. Her dream was to swim competitively in college. Then she began spending time on Instagram and TikTok. “I started using social media, and I’m like, ‘Oh, I have really broad shoulders compared to a lot of girls,’” she recalls. “Or, ‘Oh, my thighs touch,’ whereas the fitness influencers, their thighs don’t.”
When the pandemic hit, her school went remote and swim practice was canceled. Koziol knew that even a few days away from the pool could weaken her endurance and make her less competitive. Worried about losing muscle mass, she searched on Instagram for “30 minute at-home workout,” or “cardio workout at home.” She was also filling free time by baking with her mom, and started looking up healthy recipes to make. “My searches were never ‘low calorie, low fat,’” Koziol says. “It was ‘healthy.’”
The dramatic evolution of the content the algorithm served to her was apparent only in retrospect. “It was so nefarious,” she says. Searching for healthy recipes led her to low-calorie recipes. Searching for at-home workouts led her to videos of skinny influencers in workout clothes, “body checking” for the camera. “Something in my algorithm just switched,” she says. “There was nothing I could do.” A healthy teenager who wanted to see memes and funny videos and exchange messages with her friends suddenly had her feed filled with “very thin girls in workout clothing, showing off their bodies.”
Before too long, her daily diet shrank to just a protein bar or two. Sometimes she ate nothing at all until dinner. For months she subsisted mostly on baby puffs and Diet Coke—“Just to fill my stomach”—even as she forced herself to do hours of cardio. Meanwhile, the algorithms supplied images of skinnier and skinnier girls, on more and more extreme diets. Other users chimed in with comments on those posts, Koziol says, suggesting tips and tricks to “feel like you’re eating something.” She kept her car stocked with paper towels, wipes, and extra mascara so that she could reapply makeup after forcing herself to vomit on the side of the road.
By her junior year, “I was addicted to this empty feeling,” Koziol says. Being on social media was like “getting sucked into a dark hole.” She lost 30 lb. in a year. Her eating disorders led to issues with her throat and teeth, and hormonal problems that caused hot flashes and night sweats; her brain felt fuzzy as her short-term memory faltered from malnutrition. “I was like a zombie,” she says. “I was not the same person.”
Her parents realized something was wrong, and Koziol started outpatient treatment for anorexia the summer after junior year. She met with psychiatrists, dieticians, and therapists. Nothing could break the algorithm’s grip. “When my thoughts correlated with what I was seeing on my phone, it just felt normal,” she says. “I really didn’t even see a problem with it.”
Senior year she missed half her classes to attend treatment. She wasn’t able to take the AP course load she’d hoped for, and graduated only because of all the extra credits she’d earned earlier in high school. Instead of going to college, she went to Monte Nido, an inpatient anorexia-treatment facility in Irvington, N.Y.
Dr. Molly Perlman, the facility’s chief medical officer, says that while many eating disorders have an underlying genetic component, they’re triggered by social environments—and photo-based social media platforms are a perfect trigger. Perlman says that not only have those apps contributed to the rise of eating disorders, they’ve also made them far more difficult to treat. “The algorithms are very smart, and they know their users, and they know what they will click on,” Perlman says. “These are malnourished, vulnerable brains that are working towards finding recovery, and yet the algorithms are capitalizing on their disease.”
Toward the end of Koziol’s six-week stay at Monte Nido in the summer of 2022, her therapy group had a “phone cleansing day.” All the patients were asked to either delete social media altogether or to block harmful hashtags. The facility was filled with women of different ages, races, backgrounds, and hometowns. But when everyone opened their phones, Koziol noticed that their Instagram feeds looked just like hers. “We had the same creators,” she says. “The same posts, the exact same content.”
For all their differences, these women were being fed the exact same pro-anorexia content. “It showed me that this algorithm doesn’t care about who you are,” Koziol says. “All they need is that first little search.”
Gabby Cusato’s story started out a lot like Koziol’s. Gabby was a competitive athlete too, a cross-country runner. She was part of a tight-knit family from upstate New York—a quadruplet, with two older siblings and loving parents.
Gabby’s parents got her a phone in seventh grade to help the family coordinate pickups from different sports practices. She had one Instagram account her parents knew about and several others that were secret. “Imagine being 15. You want to be fast, you want to be skinny,” says Gabby’s mother Karen. “And if somebody’s saying to you, here’s how you can do it, here’s how you can lose more weight—it’s just a rabbit hole.”
After Gabby’s weight dropped drastically, Karen Cusato enrolled her in intensive outpatient treatment for anorexia. She went three days a week for four hours a day. But as soon as she got out of therapy, she would be back on her phone. One of her Instagram accounts was devoted to her eating-disorder treatment; Gabby posted pictures of healthy meals she ate during her recovery. Jessica Carroll, who also represents the Cusato family, says this account may have aggravated Gabby’s problem. “By creating this account,” Carroll says, “she’s almost inviting the algorithm to send her more of this stuff.”
One day in November 2019, Karen Cusato confiscated Gabby’s phone after an argument and sent her to her room, thinking that her daughter could cool off and they could patch things up in the morning. The next morning, Gabby’s bed was made, but she was nowhere to be found. At first, Cusato and her husband thought she had run away. “Then we found her in the closet,” Cusato recalls. “The scream from my husband is something that I can never unhear.” Gabby had died by suicide. She was 15.
Most days, Cusato blames herself. But she knows this would not have happened if Gabby had not become so addicted to social media. “If she was born 20 years earlier, this would not have taken place,” she says. If Cusato could do it again, she’d buy her daughter a phone without internet access. “Never in my wildest dreams did I think this was what the phone could turn into.”
“I think she felt less than perfect,” Cusato says. “The algorithms were giving an ideal weight for an ideal height, and she was chasing that number, and, you know, she didn’t get to that number that she thought was going to make her happy.” Anorexia has the highest mortality rate of any psychiatric disorder, not just because of the malnutrition involved, but because it’s often clustered with depression, anxiety, and suicidal ideation. Carroll says that nearly half of the personal-injury cases represented by her firm in the MDL are related to eating disorders.
Cusato has been a high school math teacher for more than two decades. Every day she stands in front of classes of kids who are the same age Gabby was when she died. During the years she’s spent in the classroom, she’s noticed a change take place in her students. “Were kids depressed before? Yes,” she says. “Were they depressed in the numbers they are now? Absolutely not.”
The suits filed by Koziol, the Cusatos, and others represent a new phase in the effort to combat social media harms. “The strategy of these companies is to do anything they can to distort the language of Section 230 to protect them with impunity,” says Mary Graw, a law professor at the Catholic University of America who has written extensively about Section 230. “As a result we have an unregulated industry which is used by most of the globe with the potential to cause massive harm, and no way to look under the hood to see what the industry is doing. It is stunning. No other industry has such avoidance of accountability.”
The social media MDL cleared a major obstacle in November 2023 when a federal judge in California ruled that many of the lawsuits could proceed. While Judge Yvonne Gonzalez Rogers narrowed the scope of the litigation, she ruled that many of the negligence and product liability claims are not barred by either the First Amendment or Section 230.
Multidistrict litigation is often slow and unwieldy, since it contains many distinct cases with different plaintiffs seeking different damages. The school districts, for example, want something different than the attorneys general do. Personal-injury plaintiffs like Koziol and the Cusatos are seeking compensation for medical bills, pain and suffering, and punitive damages for what they say was “reckless, malicious conduct” by the social media companies. They also want injunctive relief—a formal ruling from the judge stating that the platforms are defectively designed, and court orders requiring the companies to remove addictive or dangerous features, add warnings, and strengthen safeguards. The fact-discovery phase of the case, in which both parties exchange documents and testimony, closed in April. The two sides are preparing for a motion for summary judgment in the fall, when the judge will decide if the case will go to trial.
Koziol can’t point to any one piece of content that caused her eating disorder. To her, “it’s the constant bombardment of all of these videos” that led her to the brink of starvation. “No third party is responsible for the algorithm,” she says. “That’s Meta and TikTok alone.” While she’s seeking financial compensation for her eating-disorder treatment as well as other financial damages, Koziol says the purpose of the suit is to hold those corporations accountable. “They knew what they did was wrong,” she says. “What they were doing was harming young girls. And it ruined my life.”
Koziol had been accepted to her dream school, the University of South Carolina. She had imagined joining a sorority and going to SEC football games on Saturdays. She even had a roommate lined up before she decided to take a year off to focus on her recovery. By the time she was ready for college, her therapists warned her that going so far away could trigger a relapse. Instead she enrolled at the University of Hartford, closer to home. Greek life and football aren’t in the cards right now. She’s not on the same track as most of her old friends. She had to quit swimming.
Even after everything she’s been through, Koziol can’t bring herself to delete her social media entirely; Instagram is so ubiquitous that it feels impossible to be a young person without it. She’s blocked certain hashtags from her Instagram, reset her account, and unfollowed harmful creators. Still, the algorithm finds ways to tempt her.
“It’s taken a lot of work to not interact with the disordered content that occasionally comes up,” she says. “I always have to remind myself: the second I click on it, I’m gonna get another post tomorrow, and then another post tomorrow, and then another post tomorrow.”
If you or someone you know needs help, call the National Alliance for Eating Disorders hotline at 866-662-1235, or the National Suicide Prevention Hotline at 988.