Here are the best Apple AI announcements from WWDC 2025

Last year, Apple's main WWDC keynote emphasized the company's ambitious progress in artificial intelligence. This year, the company toned down its emphasis on Apple Intelligence and focused on updates to its operating systems, services, and software, introducing a new design aesthetic it calls "Liquid Glass" along with a new naming convention.

Nevertheless, Apple still tried to appease the crowd with several AI announcements, including an image analysis tool, a workout coach, a live translation feature, and more.

Visual Intelligence

Visual Intelligence is Apple's AI-powered image analysis technology that lets you gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing.

Now the feature will also be able to interact with information on your iPhone screen. For example, if you come across a post in a social media app, Visual Intelligence can run an image search related to what you see while browsing. The tool performs searches using Google, ChatGPT, and similar apps.

To access Visual Intelligence, open Control Center or customize the Action button (the same button typically used to take a screenshot). The feature becomes available with iOS 26 when it launches later this year. Read more.

ChatGPT comes to Image Playground

Apple has integrated ChatGPT into Image Playground, its AI image generation tool. Thanks to ChatGPT, the app can now generate images in new styles such as "anime," "oil painting," and "watercolor." There will also be an option to send a prompt to ChatGPT so it can create additional images. Read more.

Workout Buddy

The new AI-powered workout coach is exactly what it sounds like: a text-to-speech model delivers encouragement during exercise, imitating the voice of a personal trainer. When you start a run, the AI in the Workout app gives a motivational pep talk, highlighting key moments such as your fastest mile and your average heart rate. After you finish, the AI summarizes your average pace, heart rate, and whether you hit any milestones. Read more.

Live Translation

Apple Intelligence powers the new Live Translation feature for FaceTime and phone calls. The technology automatically translates text or spoken words into the user's preferred language in real time. During FaceTime calls, users will see live captions, while on phone calls Apple will translate the conversation aloud. Read more.

AI helps with unknown callers

Apple has introduced two new AI-powered features for phone calls. The first, called Call Screening, automatically answers calls from unknown numbers in the background. This lets users hear the caller's name and the reason for the call before deciding whether to pick up.

The second feature, Hold Assist, automatically detects hold music while you wait for a call center agent. Users can stay on the line while on hold, freeing up the iPhone for other tasks, and a notification will alert them when a live agent becomes available. Read more.

Poll suggestions in Messages

Apple has also introduced a new feature that lets users create polls in the Messages app. The feature uses Apple Intelligence to suggest polls based on the context of your conversations. For example, if people in a group chat are having trouble deciding on something, Apple Intelligence will recommend starting a poll to help settle the decision. Read more.

AI-powered Shortcuts

The Shortcuts app is becoming more useful thanks to Apple Intelligence. The company explained that when building a shortcut, users will be able to choose an AI model to enable capabilities such as AI-generated summaries. Read more.

A more contextually aware Spotlight

Spotlight, the search feature on the Mac, is getting a small update. It will now draw on Apple Intelligence to improve its contextual awareness, surfacing suggestions for actions users typically perform, tailored to their current tasks. Read more.

Foundation Models for developers

Apple is now giving developers access to its AI models, even when offline. The company introduced the Foundation Models framework, which lets developers build more AI capabilities into their third-party apps using Apple's existing on-device models. This will likely encourage more developers to create new AI features as Apple competes with other AI companies. Read more.
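For developers, that means calling the on-device model directly from Swift. Below is a minimal, hypothetical sketch of what that could look like, based on the API names Apple has shown for the Foundation Models framework (SystemLanguageModel, LanguageModelSession, and respond(to:)); exact signatures may differ in the shipping SDK.

    // Hypothetical sketch: summarizing a note with Apple's on-device model.
    import FoundationModels

    func summarize(_ note: String) async throws -> String {
        // Only proceed if Apple Intelligence is enabled and the model is ready;
        // otherwise fall back to returning the original text.
        guard case .available = SystemLanguageModel.default.availability else {
            return note
        }

        // A session holds conversation state; instructions steer the model.
        let session = LanguageModelSession(
            instructions: "Summarize the user's text in one short sentence."
        )

        // The request runs on device, so it also works offline.
        let response = try await session.respond(to: note)
        return response.content
    }

Because the model runs locally, a call like this should work without a network connection, which is the main draw Apple is pitching to third-party developers.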

Apple's AI-powered Siri is still delayed

The most disappointing news from the event was that the long-awaited upgrades for Siri still aren't ready. Attendees were eager to see the promised AI-powered features that were supposed to debut. However, Craig Federighi, Apple's SVP of software engineering, said the company wouldn't have more to share until next year. The delay may raise questions about Apple's strategy for its voice assistant in an increasingly competitive market. Read more.
