The Howard County Public School System of Maryland is the latest school district in the United States to take legal action against social media companies for contributing to a mental health crisis among students.

(Photo : Leon Neal/Getty Images) In this photo illustration, a teenage child looks at a screen of age-restricted content on a laptop in London, England, on January 17, 2023.

Taking Legal Action

On Thursday, the Howard County Public School System sued Meta, Google, Snap, and ByteDance, claiming that the platforms operated by these companies contributed to a mental health crisis among students in Maryland. According to a report from The Verge, the lawsuit alleges that these platforms are designed to be addictive and dangerous, shaping the way children think, feel, and behave.

The filing cites a list of issues, including addictive reward mechanisms in each application, such as TikTok's For You Page, which provides an endless stream of suggested content based on user activity, and Facebook and Instagram's recommendation algorithms, features designed to create repetitive and excessive product use to keep users engaged.

The school district also accused each platform of encouraging unhealthy social comparisons that create body image issues, leading to mental and physical disorders in children. The filing further described each app's parental controls as "defective" and pointed to safety gaps that enable child sexual exploitation.

While the lawsuit does not specify the amount of damages, Yahoo reported that it seeks compensatory and punitive damages for the financial burden placed on the county by having to provide more mental health services.

Other Districts

Aside from the Howard County Public School System, WMAR2 reported that Harford and Prince George's Counties in Maryland have also taken legal action against the same companies. School systems in other states, including Washington, California, Florida, New Jersey, Pennsylvania, Tennessee, and Alabama, have filed similar lawsuits.

Social Media Companies' Response

Google denied the allegations in the lawsuit. Spokesperson José Castañeda stated that the company has built age-appropriate experiences for kids and families on YouTube and provides parents with robust controls.

Meanwhile, Snap spokesperson Pete Boogaard said the company vets all content before it can reach a large audience on the platform, which helps protect users against the promotion and discovery of potentially harmful material.

Meta's Head of Safety Antigone Davis stated that the company has invested in technology that finds and removes content related to self-harm and eating disorders. While acknowledging that these are complex issues, the company promised to continue working with parents, experts, and regulators to develop new tools, features, and policies that families need.

Also Read: How Video Games Help Cope With Mental Illness; Best Titles for Better Mental Health

In Kenya, meanwhile, a court found that Meta was the primary employer of content moderators who, as TechCrunch reported, sued the company and its content review partner in Africa, Sama, for unlawful dismissal. The moderators performed the work using Meta's technology and had to adhere to its performance and accuracy metrics.

Related Article: [BEWARE] Smartphone Addiction Linked to Nomophobia; New Study Reveals Other Serious Health Consequences

Written by Inno Flores

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.