Apple has released a new set of App Store rules covering age checks, loan apps, branding and privacy. The update arrived last week in the App Review Guidelines. The document states that Apple may reject any app it believes crosses a line, echoing Supreme Court Justice Potter Stewart’s famous remark that he would “know it when I see it.”
The first change concerns creator apps. These must help people spot material that exceeds an app’s age rating and must include a tool that blocks younger users from that material, based on either verified or declared age. Another change targets loan apps: Apple now says they cannot charge more than a 36% APR and cannot require full repayment within sixty days or less. Apple also told developers they cannot use another developer’s icon, name or brand without clear approval from that developer.
Apple also issued fresh wording on mini apps built with HTML5 or JavaScript, which must follow the same rules as full apps. It added that any app delivering software outside its main binary cannot expose native platform tools to that software unless Apple signs off. Those apps must also help users identify material that goes beyond the age rating and must add an age gate based on verified or declared age.
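Apple’s text does not prescribe how an age gate should be built. As a rough illustration only, a declared-age check in Swift might look like the sketch below, where the onboarding flow that records `declaredBirthDate` and the per-content `requiredMinimumAge` threshold are assumptions, not anything Apple specifies.

```swift
import Foundation

/// A minimal declared-age gate. The app records the user's self-declared
/// birth date (for example, during onboarding) and checks it against the
/// minimum age that a piece of content requires.
struct AgeGate {
    let declaredBirthDate: Date   // self-declared; a verified-age flow would set this differently

    /// Returns true when the declared age meets or exceeds the required minimum.
    func permitsContent(requiredMinimumAge: Int, on date: Date = Date()) -> Bool {
        let years = Calendar.current.dateComponents([.year], from: declaredBirthDate, to: date).year ?? 0
        return years >= requiredMinimumAge
    }
}

// Usage: a user who declared a birth date 15 years ago asks for 17+ material.
let birthDate = Calendar.current.date(byAdding: .year, value: -15, to: Date())!
let gate = AgeGate(declaredBirthDate: birthDate)
if gate.permitsContent(requiredMinimumAge: 17) {
    print("Show the content")
} else {
    print("Blocked: declared age is below the content's rating")
}
```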
How Is Apple Dealing With AI And Personal Data?
A major change sits in section 5.1.2(i). Apple now requires apps to tell users where their personal data goes when it is shared with third parties, including any passing of data to third-party AI systems, and to obtain explicit permission from the user before such sharing happens. According to Mashable, this is the first time Apple has mentioned AI in the guidelines.
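In practice, that means nothing should leave the device for an outside AI service until the user has seen a disclosure naming the recipient and has opted in. A minimal SwiftUI sketch of such a flow is below; “Example AI” and the `ThirdPartyAIClient` helper are hypothetical stand-ins, not part of any Apple API.

```swift
import SwiftUI

/// A consent gate for sharing personal data with a third-party AI service.
/// The disclosure names the recipient explicitly, and nothing is sent
/// until the user taps "Allow and send".
struct AIConsentView: View {
    @State private var showingConsent = false
    let userText: String

    var body: some View {
        Button("Improve with AI") { showingConsent = true }
            .confirmationDialog(
                "Your text will be sent to Example AI, a third-party service, to generate suggestions.",
                isPresented: $showingConsent,
                titleVisibility: .visible
            ) {
                Button("Allow and send") {
                    ThirdPartyAIClient.send(userText)   // runs only after explicit consent
                }
                Button("Cancel", role: .cancel) { }
            }
    }
}

/// Hypothetical client for the outside AI service.
enum ThirdPartyAIClient {
    static func send(_ text: String) {
        // POST `text` to the third-party AI endpoint here.
    }
}
```

Tying the network call to the dialog’s affirmative action keeps the disclosure and the consent in one place, which is the behaviour the new wording appears to require.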
Mashable adds that this marks a shift in tone for the company. Under Tim Cook, Apple has often avoided speaking about AI in public talks and keynotes, and Mashable notes that Cook tends to say “machine learning” instead. The new wording may appeal to people who dislike the idea of app data being used to train AI.
Mashable reports that the world of AI training has sparked legal fights. Mashable’s parent company Ziff Davis filed a lawsuit in April claiming that OpenAI used its work without approval, and Apple now finds itself pulled into related complaints. According to Mashable, two neuroscientists and two authors filed lawsuits last month saying Apple used material from so-called “shadow libraries”, collections of pirated books and research papers copied and posted online.
Mashable reports that the legal picture is tough for companies accused of using such material, noting that Anthropic settled a class-action lawsuit in September for $1.5 billion over claims tied to shadow libraries. Those cases now sit in the background as observers wait to see how Apple responds to the two filings.
What Could These Changes Mean For Developers And Users?
Mashable writes that Apple is widely seen as late to the AI race and is rumoured to be working on a version of Siri powered by Google Gemini. The company has said little about this itself, but the talk around it has grown louder over recent months.
The new rules give Apple a chance to present itself as a company that guards user consent inside apps. The sharper language around personal data and AI may help Apple push back against claims that it has been slow to act in areas linked to privacy.
Developers must now rethink how they gather, handle and pass along data, especially where AI tools are concerned. Apps must ask for user approval and must tell users where their data is headed. Apple’s own text makes clear that failing to follow these rules can lead to an instant rejection.
The changes could give users a stronger sense of control. Apple now gives people a more visible path to understand where their data travels and how apps handle it. The company’s line about knowing when an app crosses into unsafe ground sets the tone for a tougher stance on privacy inside the App Store.