The San Juan Daily Star

Meta accused by states of using features to lure children to Instagram and Facebook


The coordinated suit shows states are prioritizing the issue of children and online safety and combining legal resources to fight Meta.

By Cecilia Kang and Natasha Singer


Meta was sued by more than three dozen states earlier this week for knowingly using features on Instagram and Facebook to hook children to its platforms, even as the company said its social media sites were safe for young people.


Colorado and Tennessee led a joint lawsuit filed by 33 states in U.S. District Court for the Northern District of California, saying that Meta — which owns Facebook, Instagram, WhatsApp and Messenger — violated consumer protection laws by unfairly ensnaring children and deceiving users about the safety of its platforms. The District of Columbia and eight other states filed separate lawsuits Tuesday against Meta with most of the same claims.


In their complaint, the states said that Meta had “designed psychologically manipulative product features to induce young users’ compulsive and extended use” of platforms like Instagram. The company’s algorithms were designed to push children and teenagers into rabbit holes of toxic and harmful content, the states said, with features such as “infinite scroll” and persistent alerts used to hook young users. The attorneys general also charged Meta with violating a federal children’s online privacy law, accusing it of unlawfully collecting “the personal data of its youngest users” without their parents’ permission.


“Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens,” the states said in their 233-page lawsuit. “Its motive is profit.”


Meta said it was working to provide a safer environment for teenagers on its apps and had introduced more than 30 tools to support teenagers and families.


“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said in a statement.


Why the case matters


It’s unusual for so many states to come together to sue a tech giant for consumer harms. The coordination shows states are prioritizing the issue of children and online safety and combining legal resources to fight Meta, just as states had previously done for cases against Big Tobacco and Big Pharma companies.


“Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us,” Phil Weiser, Colorado’s attorney general, said in a statement.


Lawmakers around the world have been trying to rein in platforms such as Instagram and TikTok on behalf of children. Over the past few years, Britain, followed by states such as California and Utah, passed laws to require social media platforms to boost privacy and safety protections for minors online. The Utah law, among other things, would require social media apps to turn off notifications by default for minors overnight to reduce interruptions to children’s sleep.


Regulators have also tried to hold social media companies accountable for possible harms to young people. Last year, a coroner in Britain ruled that Instagram had contributed to the death of a teenager who took her own life after seeing thousands of images of self-harm on the platform.


Laws to protect the safety of children online in the United States, however, have stalled in Congress as tech companies lobby against them.


“We’ve been warning about Meta’s manipulation and harming of young people from its start and sadly it has taken years to hold it and other companies like Google accountable,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a consumer advocacy group. “Hopefully justice will be served but this is why it’s so crucial to have regulations.”


How the investigation started


States began investigating Instagram’s potentially harmful effects on young people several years ago as public concerns over cyberbullying and teen mental health mounted.


In early 2021, Facebook announced that it was planning to develop “Instagram Kids,” a version of its popular app that would be aimed at users younger than 13. The news prompted a backlash among concerned lawmakers and children’s groups.


Soon after, a group of attorneys general from more than 40 states wrote a letter to Mark Zuckerberg, the company’s CEO. In it, they said that Facebook had “historically failed to protect the welfare of children on its platforms” and urged the company to abandon its plans for Instagram Kids.


Concerns among the attorneys general intensified in September 2021 after Frances Haugen, a former Facebook employee, leaked company research indicating that the company knew its platforms posed mental health risks to young people. Facebook then announced it was pausing the development of Instagram Kids.


That November, a bipartisan group of attorneys general, including Colorado, Massachusetts and New Hampshire, announced a joint investigation into Instagram’s impact — and potential harmful effects — on young people.


Remedies


Under local and state consumer protection laws, the attorneys general are seeking financial penalties from Meta. The District of Columbia and the states are also asking the court for injunctive relief to force the company to stop using certain tech features that the states contend have harmed young users.


What happens next


Meta is expected to fight to dismiss the case. Weiser, the Colorado attorney general, said in a news conference that he filed the lawsuit because he wasn’t able to reach a settlement with the company. He noted that Meta had filed a motion to dismiss a separate lawsuit brought by consumers, which makes similar allegations of harm to children and teenagers.


Separately, a group of attorneys general from more than 40 states is pursuing an investigation into user engagement practices at TikTok and their possible harmful effects on young people. That investigation, which was announced in 2022, is ongoing.
