Meta Sued Over Features Used To Lure Children Onto Social Media Platforms

Late last month, Meta Platforms, Inc., formerly known as Facebook, Inc., was sued by a bipartisan coalition of more than 40 state attorneys general, including those of California, New York, Georgia, South Carolina, and Illinois, who accuse the social media giant of violating state and federal child privacy and false advertising laws. The states claim Meta designed its business model to maximize youth engagement through manipulative features that have seriously harmed the mental and physical health of the children using its platforms. 

Many other states, including Massachusetts, Tennessee, and Mississippi, filed similar actions. According to the combined lawsuits, Meta uses psychologically manipulative product features to induce young users’ “compulsive and extended use” of platforms like Instagram. The states also contend that the company’s algorithms were designed to push children and teenagers into rabbit holes of toxic and harmful content, and that features like “infinite scroll” and persistent alerts were built to hook young users. 

The attorneys general also charged the social media giant with violating a federal children’s online privacy law, the Children’s Online Privacy Protection Act, or COPPA. The suit claims that Meta unlawfully collects the personal data of children under the age of 13 without their parents’ permission.


Meta Is Aware of the Harm Its Platforms Pose

According to a press release from New York Attorney General Letitia James’ office, Meta is fully aware of the harm its platforms cause to youth: “Meta’s own internal research documents show its awareness that its products harm young users. Indeed, internal studies that Meta commissioned — and kept private until they were leaked by a whistleblower and publicly reported — reveal that Meta has known for years about these serious harms associated with young users’ time spent on its platforms.” 

In 2021, Meta said it was working to make its social apps a safer environment for young people. The company has introduced more than “30 tools” to support teenagers and families; however, many still believe this is not nearly enough to keep children safe. The company’s earlier attempts to make its apps safer to use have also backfired. Later that year, Facebook announced a plan to develop and launch a version of one of its social media apps aimed at users younger than 13, named “Instagram Kids,” but the plan quickly drew backlash from concerned lawmakers and children’s groups.