Apple Intelligence: Everything you need to know about Apple's AI models and services

If you've recently updated to a newer iPhone, you've probably noticed that Apple Intelligence is showing up in some of your most-used apps, such as Messages, Mail and Notes. Apple Intelligence (yes, also shortened to AI) arrived in the Apple ecosystem in October 2024, and it's here to stay as Apple competes with Google, OpenAI, Anthropic and others to build the best AI tools.

What is Apple Intelligence?

Cupertino's marketing executives have branded Apple Intelligence "AI for the rest of us." The platform is designed to leverage the things generative AI already does well, like text and image generation, to improve existing features. Like other platforms, including ChatGPT and Google Gemini, Apple Intelligence was trained on large information models. These systems use deep learning to form connections, whether text, images, video or music.

The text offering, powered by an LLM, presents itself as Writing Tools. The feature is available across various Apple apps, including Mail, Messages, Pages and Notifications. It can be used to provide summaries of long text, to proofread and even to write messages for you, using content and tone prompts.

Image generation has been integrated in similar fashion, though a bit less seamlessly. Users can prompt Apple Intelligence to generate custom emoji (Genmoji) in a house Apple art style. Image Playground, meanwhile, is a standalone image-generation app that uses prompts to create visual content that can be used in Messages, Keynote or shared via social media.

Apple Intelligence also marks a long-awaited facelift for Siri. The smart assistant was early to the game but has been mostly neglected for the past several years. Siri is now integrated much more deeply into Apple's operating systems; for example, instead of the familiar icon, users see a glowing light around the edge of the iPhone screen when it's doing its thing.

More importantly, the new Siri works across apps. That means, for example, you can ask Siri to edit a photo and then insert it directly into a text message, a kind of frictionless workflow the assistant previously lacked. Onscreen awareness means Siri uses the context of the content you're currently engaged with to provide an appropriate answer.

https://www.youtube.com/watch?v=pugkqput8

Leading up to WWDC 2025, many expected Apple to introduce us to an even more souped-up version of Siri, but we'll have to wait a bit longer.

"As we've shared, we're continuing our work to deliver the features that make Siri even more personal," Apple SVP of Software Engineering Craig Federighi said at WWDC 2025. "This work needed more time to reach our high-quality bar, and we can't wait to share more about it in the coming year."

Once released, the more personalized version of Siri is supposed to be able to understand "personal context," such as your relationships, communication routines and more. But according to a Bloomberg report, the in-development version of this new Siri was too error-ridden to ship, hence its delay.

At WWDC 2025, Apple also unveiled a new AI feature called Visual Intelligence, which helps you perform an image search for things you see while browsing. Apple also showed off a Live Translation feature that can translate conversations in real time in the Messages, FaceTime and Phone apps.

Visual Intelligence and Live Translation are expected to become available later in 2025, when iOS 26 launches to the public.

When was Apple Intelligence unveiled?

After months of speculation, Apple Intelligence took center stage at WWDC 2024. The platform was announced in the wake of a torrent of generative AI news from companies like Google and OpenAI, amid fears that the famously tight-lipped tech giant had missed the boat on the latest tech craze.

Contrary to such speculation, however, Apple had a team in place, working on what proved to be a very Apple approach to artificial intelligence. There was still pizzazz amid the demos (Apple always loves to put on a show), but Apple Intelligence is ultimately a very pragmatic take on the category.

Apple Intelligence isn't a standalone feature. Rather, it's about integrating into existing offerings. While it is a branding exercise in a very real sense, large language model (LLM)-driven technology will operate behind the scenes. As far as the consumer is concerned, the technology will mostly present itself in the form of new features for existing apps.

We learned more during Apple's iPhone 16 event in September 2024. During the event, Apple touted a number of AI-powered features coming to its devices, from translation on the Apple Watch Series 10 to visual search on iPhones and a number of tweaks to Siri's capabilities. The first wave of Apple Intelligence arrived at the end of October, as part of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1.

The features launched first in U.S. English. Apple later added Australian, Canadian, New Zealand, South African and U.K. English localizations. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish and Vietnamese is set to arrive in 2025.

Who gets Apple Intelligence?

iPhone 15 Pro Max in Natural Titanium, showing the back of the phone

The first wave of Apple Intelligence arrived in October 2024 via iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1. Those updates included integrated Writing Tools, image Clean Up, article summaries and a typing input for the redesigned Siri experience. A second wave of features became available as part of iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2. That list includes Genmoji, Image Playground, Visual Intelligence, Image Wand and ChatGPT integration.

These features are free to use, so long as you have one of the following pieces of hardware:

  • All iPhone 16 models
  • iPhone 15 Pro Max (A17 Pro)
  • iPhone 15 Pro (A17 Pro)
  • iPad Pro (M1 and later)
  • iPad Air (M1 and later)
  • iPad mini (A17 or later)
  • MacBook Air (M1 and later)
  • MacBook Pro (M1 and later)
  • iMac (M1 and later)
  • Mac mini (M1 and later)
  • Mac Studio (M1 Max and later)
  • Mac Pro (M2 Ultra)

Notably, only the Pro versions of the iPhone 15 get access, owing to shortcomings of the standard model's chipset. Presumably, however, the entire iPhone 16 line is able to run Apple Intelligence.

How does Apple's AI work without an internet connection?

When you ask GPT or Gemini a question, your query is sent to external servers to generate a response, which requires an internet connection. But Apple has taken a small-model, bespoke approach to training.

The biggest benefit of this approach is that many of these tasks become far less resource-intensive and can be performed on-device. That's because, rather than relying on the kind of kitchen-sink approach that fuels platforms like GPT and Gemini, the company has compiled datasets in-house for specific tasks like, say, composing an email.

That doesn't apply to everything, however. More complex queries will tap into the new Private Cloud Compute offering. The company now operates remote servers running on Apple Silicon, which it claims allows it to offer the same level of privacy as its consumer devices. Whether an action is being performed locally or via the cloud will be invisible to the user, unless their device is offline, at which point remote queries will throw up an error.

Apple Intelligence with third-party apps

OpenAI and ChatGPT logos

A lot of noise was made about Apple's pending partnership with OpenAI ahead of the launch of Apple Intelligence. Ultimately, however, it turned out that the deal was less about powering Apple Intelligence and more about offering an alternative platform for those things it's not really built for. It's a tacit acknowledgment that building a small-model system has its limitations.

Apple Intelligence is free. So, too, is access to ChatGPT. However, those with paid accounts to the latter will have access to premium features that free users don't, including unlimited queries.

ChatGPT integration, which debuted with iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2, has two primary roles: supplementing Siri's knowledge base and adding to the existing Writing Tools options.

With the service enabled, certain questions will prompt the new Siri to ask the user to approve its accessing ChatGPT. Recipes and travel planning are examples of questions that may surface this option. Users can also directly prompt Siri to "ask ChatGPT."

Compose is the other primary ChatGPT feature available through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt. That joins existing writing tools like Style and Summary.

We know for sure that Apple plans to partner with additional generative AI services. The company has all but said that Google Gemini is next on that list.

Can developers build on Apple's AI models?

At WWDC 2025, Apple announced what it calls the Foundation Models framework, which will let developers tap into its AI models while offline.

This makes it more likely that developers will build AI features into their third-party apps that leverage Apple's existing systems.

"For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging," Federighi said at WWDC. "And because it happens using on-device models, this happens without cloud API costs (…) We couldn't be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart, available when you're offline, and that protect your privacy."
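To give a sense of what that looks like in practice, here's a minimal sketch of how a developer might call the on-device model through the Foundation Models framework. The API names (`LanguageModelSession`, `respond(to:)`) follow what Apple previewed at WWDC 2025; treat the exact signatures and the quiz prompt as illustrative rather than definitive.

```swift
import FoundationModels

// Create a session backed by the on-device foundation model.
// The model runs locally, so no network request (or cloud API fee) is involved.
let session = LanguageModelSession(
    instructions: "You are a study helper. Write one short quiz question."
)

// Hypothetical input: notes the user wants turned into a quiz.
let notes = "Photosynthesis converts light energy into chemical energy."

// Ask the model to generate a question from those notes.
let response = try await session.respond(
    to: "Create a quiz question based on these notes: \(notes)"
)

print(response.content)
```

Because the call is asynchronous and can fail (for example, on unsupported hardware), real apps would wrap it in proper error handling and check model availability first.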
