Apple is Not Doing Enough to Protect Children Against Adult Content on App Store, Watchdog Says

(Photo: Pexels/Pixabay) Apple App Store

Apple is not doing enough to protect minors from downloading adult content in the App Store, according to a non-profit watchdog organization. Apple is also accused of refusing to accept any responsibility for the loopholes.

Apple App Store Not Safe for Children

The Campaign for Accountability, or CfA, is the same group that previously reported on Apple removing App Store games at the behest of the Chinese government, according to Apple Insider.

Now, the group says the store does not do enough to protect children from adult content. The group's Tech Transparency Project, or TTP, claims that the tech giant fails to take even the simplest child safety measures against adult material.

Michelle Kuppersmith, the executive director of CfA, said in a statement that Apple claims it keeps a tight grip on App Store creators and protects users from harmful content, yet the company has not even put up obvious safeguards to keep minors safe.

The group says it created a test account posing as a 14-year-old user and then tried to download apps ranging from dating and gambling services to pornography.

Attempting to download these apps resulted in a pop-up message asking the user to confirm whether they were 17 years old or older.

If the test user claimed to be older, the download would proceed with no further checks, according to The Guardian.

In April, Tinder stated that Google Play and Apple's App Store are the ones who should be responsible for preventing children from downloading dating apps.

CfA also stated that its test Apple ID was set for a 14-year-old and that apps should be aware of that setting. Kuppersmith added that if Apple already knows a user is a minor, that user should not be able to download adult apps in the first place.

Kuppersmith said that the company has instead chosen to leave the job of protecting children to the app developers.

However, while the company has no desire to accept responsibility, it has no issue taking its cut of the profits that arise from age-inappropriate downloads.

It is also not clear whether the group enabled the strict restrictions on age-restricted content when it tested the App Store. By default, the setting allows all apps to be downloaded.

Parents can hard-block age-inappropriate apps from within the Screen Time settings, but this must be done manually.

Watchdog's Report

The report regarding the App Store came after Apple announced that it is working on CSAM detection, a technology that will scan photos uploaded to iCloud accounts for known child sexual abuse material.

In its full report, the group wrote that it downloaded around 75 adult-oriented apps from the App Store. Many of the 75 apps had their own age verification systems, but these were inadequate.

CfA stated that many of the gambling apps were thorough, with some requiring scans of government-issued ID. Other adult apps only displayed age-restriction warnings.

Out of the 75 apps tested, 37 urged users to register for an account, but the group said the apps accepted the Apple ID even though it clearly identified the user as a 14-year-old.

Other apps, such as the chat service Yubo, are listed on the App Store as appropriate for users 17 and older, but in-app, they allow users as young as 13.

Also, 31 of the tested apps offered signups through a Facebook account, and Facebook blocked the registration when it detected that the Apple ID belonged to a 14-year-old.

Apple has not responded to the Campaign for Accountability and its report.

Related Article: Apple's CSAM Catches San Francisco Doctor With Child Exploitative Images on His iCloud

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.