Thousands of Facebook Group members and administrators are reporting unexpected bans.
After a surge of account suspensions on Instagram and Facebook, users began reporting that entire Facebook Groups were being removed. Many claim their groups were taken down without violating any clear policy.
Meta Confirms Technical Error Amid Mass Suspensions
Facebook Groups are a lifeline for users who want to ask questions or learn from other people's firsthand experiences. The mass ban spanned communities devoted to science, content creation, gaming, and more.
Based on complaints posted across Reddit and other online forums, the problem affects both U.S.-based and international communities of all sizes. Some groups with more than one million members have received unsubstantiated violation notices.
In response to growing outrage, Meta spokesperson Andy Stone confirmed in a statement to TechCrunch that the company is aware of the issue and is working on a fix.
"We're aware of a technical error that impacted some Facebook Groups. We're fixing things now," Stone told TechCrunch in an emailed statement.
Meta has yet to detail what triggered the error, how many groups were affected, or how long it will take to fix it. Without answers, users blamed AI-driven moderation tools, which many accuse of triggering bans due to buggy detection systems.
Harmless Groups Flagged for Baffling Violations
Group admins have posted screenshots showing their groups suspended for purported violations such as nudity, terrorism, and references to harmful organizations. These claims have raised suspicion, particularly when the flagged groups are devoted to bird photography, Pokémon, or home decor.
One moderator said their family-oriented Pokémon community of nearly 200,000 members was accused of referencing extremist material.
Another reported that their bird photography community of nearly one million members was banned for nudity. Most of these communities enforce strict moderation policies and have never received content strikes before.
Reddit Becomes Support Hub for Frustrated Admins
The r/facebook subreddit has become a crisis hub for group admins trying to understand what is going on. Many reported that several groups they administered were taken down simultaneously, with little recourse and no direct communication from Meta.
Members who tried to appeal the bans say the process yields either an automated response or none at all. Some now advise admins to hold off on appealing, since the bans may be reversed automatically once Meta resolves the technical glitch.
Meta Verified Support Sees Mixed Results
Some users who signed up for Meta Verified, a service that offers priority customer support, were able to recover their groups. Others report, however, that their paid subscription did them little good, particularly as the problem seems systemic and not a case-by-case affair.
Beyond the group bans themselves, users have also called out Meta's lack of reliable support throughout the ordeal.
Automated Moderation Criticized on Other Platforms as Well
Meta is not alone in facing this scrutiny. In recent weeks, mass suspensions have also hit Pinterest and Tumblr, and in those cases, too, many users blamed AI-driven moderation tools.
Pinterest acknowledged its problem as a technical glitch but stated that AI was not involved. Tumblr attributed its suspensions to a new content filtering system without specifying whether AI was used.
As public anger grows, users have started a petition calling on Meta to take responsibility and explain what led to the suspensions.
The petition has already received over 12,000 signatures. Some entrepreneurs, whose Facebook Groups are a central part of their income streams, are now weighing the possibility of taking legal action.
ⓒ 2025 TECHTIMES.com All rights reserved. Do not reproduce without permission.