Apple's plans to improve app discoverability using AI tagging techniques are now live in the iOS 26 developer beta build.
However, the tags are not yet appearing on the public App Store, nor are they informing the App Store's search algorithm on the public store.
Of course, with every upcoming App Store update, developers speculate about how the changes will affect their apps' search rankings.
A new analysis from app intelligence provider Appfigures, for example, suggests that metadata extracted from an app's screenshots is influencing its ranking.
The firm theorized that Apple is extracting text from the screenshots' captions. Previously, only an app's name, subtitle, and keywords counted toward its search ranking.
The conclusion that screenshots are informing app discovery is accurate, based on what Apple announced at its Worldwide Developers Conference (WWDC 25), but the way Apple extracts this data involves AI, not the OCR techniques Appfigures hinted at.
At its annual developer conference, Apple said that screenshots and other metadata will be used to improve app discoverability. The company explained that it uses AI techniques to extract information that would otherwise be buried in an app's description, its category information, its screenshots, or other metadata. That also means developers should not need to add keywords to their screenshots or take other steps to influence the tags.
This allows Apple to assign tags that better categorize the app. Ultimately, developers will be able to control which of these AI-assigned tags are associated with their apps, the company said.
In addition, Apple assured developers that humans will review the tags before they go live.
Over time, it will be important for developers to understand these tags and learn which ones can help their app get discovered, once the tags reach the App Store's global user base.