Attorneys general from 33 states allege in a newly unsealed court filing made public Wednesday that Meta, Instagram's parent company, received more than 1.1 million reports of under-13 users since early 2019 but "disabled only a fraction" of those accounts, while collecting children's personal information, such as their locations and email addresses, without parental consent.

According to a report from The New York Times, the privacy charges are part of the federal lawsuit that California, Colorado, and 31 other states filed against the social media juggernaut. The states seek to compel Meta to stop using certain features that allegedly harm young social media users.

The tech firm, led by Mark Zuckerberg, could face hundreds of millions of dollars in fines for violating the federal Children's Online Privacy Protection Act (COPPA).

Meta was sued last month in the US District Court for the Northern District of California for allegedly unlawfully enticing young users to Instagram and Facebook while suppressing internal research showing harm to users.

(Photo: AFP via Getty Images) A photo taken on March 14, 2022, shows the logo of the US social network Instagram on a smartphone screen in Moscow.

Evidence Unsealed: Here's What to Know

While much of the evidence cited in the initial filing was redacted, the unsealed complaint now provides additional details from the states' lawsuit. The complaint uses excerpts from internal emails, employee chats, and company presentations to assert that Instagram actively pursued underage users.

The complaint contends that Meta "continually failed" to prioritize effective age-checking systems, instead using approaches that allowed users under 13 to falsify their ages when creating Instagram accounts. It further accuses Meta executives of publicly stating in congressional testimony that the age-checking process was effective and that underage accounts were removed when identified, despite allegedly being aware of millions of underage users on Instagram.

Adam Mosseri, the head of Instagram, was cited in an internal company chat in November 2021 as acknowledging that "tweens want access to Instagram, and they lie about their age to get it now." However, in Senate testimony the following month, Mosseri stated, "If a child is under the age of 13, they are not permitted on Instagram."

The complaint hinges on COPPA, a 1998 federal law that requires online services aimed at children to obtain parental consent before collecting personal data from users under 13. Fines could exceed $50,000 per violation. The lawsuit asserts that Meta intentionally avoided building systems to effectively detect and exclude underage users, viewing children as a crucial demographic for future growth.


Meta had indicators of underage users, including an internal chart tracking the percentage of 11- and 12-year-olds who used Instagram daily. The complaint alleges that Meta knew about accounts belonging to specific underage users but "automatically" ignored certain reports of users under 13 if the accounts did not contain a user biography or photos.

Meta: Parents Should Be More Active in Safeguarding Children's Privacy

Meta responded to the allegations in a statement, saying it has spent a decade working to create safe and age-appropriate online experiences for teenagers. The company argues that the states' complaint mischaracterizes its work and relies on selective quotes and cherry-picked documents.

Facebook's parent also stated in a recent news release that it is "working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent. This platform will be similar to work we have done to prevent the non-consensual sharing of intimate images for adults."

Moreover, Meta has called for federal legislation that would place more responsibility on parents for children's app downloads, proposing that parents be required to approve app downloads for kids under 16, according to Engadget.

The latest legal action against Meta follows years of research indicating the negative impact of social media on children, including Facebook's own internal research from 2021, which found that Instagram is harmful to a significant percentage of teens, particularly teenage girls, as per a report from Mashable.

