December 28, 2024, 9:51 am | Read time: 3 minutes
Not every application is made for everyone, and children and young people, in particular, should pay close attention to what they download. However, the protection of minors in the Apple App Store apparently leaves a lot to be desired.
There are countless apps for mobile devices for a wide variety of purposes. In addition to everyday helpers, there are, of course, lots of games and other content just waiting to be discovered by interested users. There are also special programs for children and young people that are labeled as such. However, according to a new report, the protection of minors in Apple’s App Store has considerable shortcomings and therefore poses a danger to younger users. TECHBOOK explains the details.
Youth Protection Advocates Issue Warning About Apple App Store
In the USA, two organizations have joined forces to investigate the protection of minors in the Apple App Store. The Heat Initiative is a group of experts specializing in child safety; ParentsTogether Action is a non-profit organization representing more than three million families. Over a 24-hour period, they examined 800 of the roughly two million apps in the App Store.
The results were sobering: although 200 of the apps examined were rated as suitable for children aged four, nine, or twelve, they contained content that was problematic and inappropriate for these age groups. Strikingly, entire app categories carried unsuitable age ratings.
While games and apps for chatting with strangers, for example, were in most cases rated 17+, software for weight loss or unfiltered internet access was mostly rated as suitable from the age of four.
Several Dangerous Categories Identified in the Apple App Store
The report indicates that the problematic apps have been downloaded over 550 million times. They span games as well as apps on topics such as beauty, diet, internet browsing, and chat. Among them are highly sexualized games, programs that encourage users to fast for 20 hours, an AI girlfriend, and a chat service where allegedly only “pedophiles” can be found.
Consequently, ParentsTogether Action and Heat Initiative accuse Apple of disseminating a vast array of risky and inappropriate content to children and adolescents via the App Store. Such content could lead to serious harm, including sexual abuse, low self-esteem, eating disorders, and exposure to sexual and violent content.
The report also accuses Apple of shirking responsibility for this. Although the company promises a secure store, it reportedly shifts all legal liability onto the app developers, who are primarily responsible for the age ratings. There is no independent review body.
How to Improve the Protection of Minors in the Apple App Store
To improve the protection of minors in the Apple App Store, the organizations propose an independent review process, similar to the one used for films and video games. In addition, Apple should make the age rating process transparent and regularly check the accuracy of classifications even after an application has been published.
If a classification turns out to be incorrect, Apple should correct it immediately and actively confront developers attempting to bypass such a system. In addition, stricter protection mechanisms should be introduced to ensure that children can only view and download age-appropriate content.
Of course, the role of parents should not be underestimated in this context. They should pay attention to how often their children use apps and what content they have access to. Apple’s position on the report is unknown, as is whether and how it intends to improve the App Store in this respect. TECHBOOK has submitted a corresponding request and has not yet received a response. Youth protection advocates would likely favor an approach akin to that of the gaming platform Steam, where 23,000 games faced potential removal for lacking proper age ratings.