
Apple Intelligence: Everything you need to know about Apple’s take on AI

Type to Siri being used with Apple Intelligence in macOS Sequoia.
Apple

Apple Intelligence is Apple’s take on AI, and it looks to fundamentally change the way we interact with technology, blending advanced machine learning and AI capabilities into everyday devices.


Promising more conversational prose from Siri, automated proofreading and text summarization across apps, and lightning-fast image generation, Apple’s AI ecosystem is designed to enhance user experiences and streamline operations across its product lineup. Here’s everything you need to know about Apple’s transformational new AI.

Apple Intelligence release date and compatibility

Apple Intelligence was originally slated for formal release in September, coinciding with the initial rollout of iOS 18, iPadOS 18, and macOS Sequoia. However, as Bloomberg’s Mark Gurman reported, Apple subsequently decided to slightly delay the release of Apple Intelligence.

It was made available to developers as part of the iOS 18.1 beta release on September 19, and it officially launched alongside the iOS 18.1 rollout in October. However, it wasn’t until the release of iOS 18.2 in December 2024 that many Apple Intelligence features, such as Genmoji, Image Playground, and Visual Intelligence, finally arrived for all users. The company also released a bug fix addressing an issue in which Apple Intelligence caused devices to overheat.

NEW: Apple Intelligence will arrive later than anticipated, coming in iOS 18.1 and iPadOS 18.1 in October and missing the initial September releases. Still, 18.1 will go into beta for developers this week. https://t.co/LqXDvjO6ef

— Mark Gurman (@markgurman) July 28, 2024

These new AI features are available to users on the iPhone 15 Pro and 15 Pro Max, the entire iPhone 16 lineup, and iPads and Macs with M1 or newer chips.

Currently, the features are only available when the user language is set to English, though the company plans to add support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese in an update scheduled for April 2025.

New AI features

Math Notes feature in iPadOS 18.
Apple

No matter what device you’re using Apple Intelligence with, the AI focuses primarily on three functions: writing assistance, image creation and editing, and enhancing Siri’s cognitive capabilities.

Apple Intelligence is designed to span the breadth of the company’s product line. As such, virtually every feature found in the macOS version of Apple Intelligence is mirrored in the iOS and iPadOS versions. That includes Writing Tools, Image Playground, Memories in Photos, and Siri’s improvements.

In addition, iPadOS, when paired with Apple Pencil, unlocks more features. Smart Script in the Notes app, for example, straightens and smooths handwritten text in real time. The new Math Notes calculator will automatically solve written equations in the user’s own handwriting and generate interactive graphs based on those equations with a single tap.

We at Digital Trends took an early version of Apple Intelligence for a spin using the macOS Sequoia beta, but came away rather disappointed with what we saw from the digital agent, a sentiment mirrored by many Apple Intelligence users. For one, only a fraction of the AI tools were actually available to use through the beta release. And the tools we did have access to, including the writing assistant, Siri, and audio transcription, proved buggy and unreliable.

By the time iOS 18.1 was released, Apple had thankfully addressed many of those issues, putting Apple Intelligence on par with more established AI assistants, like Google’s Gemini.

Writing Tools

Apple Intelligence's Writing Tools being used in macOS Sequoia.
Apple

The new Writing Tools feature can proofread the user’s writing and rewrite sections as necessary, as well as summarize text across Apple’s application ecosystem, including Mail, Notes, and Pages. Third-party developers will also be able to leverage Writing Tools in their own apps via API calls.
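
For apps built on the standard system text controls, much of that integration comes along automatically. The snippet below is a minimal Swift sketch of how a third-party app might opt its editor into the feature, assuming the iOS 18-era writingToolsBehavior property on UITextView; treat it as an illustration rather than a definitive integration guide.

```swift
import UIKit

// Minimal sketch: standard UIKit text views pick up Writing Tools automatically
// on Apple Intelligence-capable devices; writingToolsBehavior (iOS 18+) lets an
// app choose how much of the feature to expose in a given view.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        if #available(iOS 18.0, *) {
            // .complete allows inline proofreading and rewrites; .limited keeps
            // suggestions in the Writing Tools panel; .none opts the view out.
            textView.writingToolsBehavior = .complete
        }

        view.addSubview(textView)
    }
}
```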

For example, within the Mail app, Apple Intelligence will provide the user with short summaries of the contents of their inbox, rather than showing them the first couple of lines of the email itself (though if you aren’t a fan of that feature, it’s easy to disable). Smart Reply will suggest responses based on the contents of the message and ensure that the reply addresses all of the questions posed in the original email. The app even moves more timely and pertinent correspondence to the top of the inbox via Priority Messages.

The Notes app has received significant improvements as well. With Apple Intelligence, Notes offers audio transcription and summarization features, as well as an integrated calculator, dubbed Math Notes, that solves equations typed into the body of the note.

Image Playground

The Image Playground being used with Apple Intelligence in macOS Sequoia.
Apple

Image creation and editing functions are handled by the new Image Playground app, wherein users can spin up generated pictures within seconds and in one of three artistic styles: Animation, Illustration, and Sketch. Image Playground is a standalone app, although many of its features and functions have been integrated with other Apple apps like Messages.

Apple Intelligence is also improving your device’s camera roll. The Memories function in the Photos app was already capable of automatically identifying the most significant people, places, and pets in a user’s life, then curating that set of images into a coherent collection set to music. With Apple Intelligence, Memories is getting even better.

The AI can select photos and videos that best match the user’s input prompt (“best friends road trip to LA 2024,” for example), then generate a story line — including chapters based on themes the AI finds in the selected images — and assemble the whole thing into a short film. Photos users also now have access to Clean Up, a tool akin to Google’s Magic Eraser and Samsung’s Object Eraser, and improved Search functions.

Siri

Summoning Siri on an iPhone.
Nadeem Sarwar / Digital Trends

Perhaps the biggest beneficiary of Apple Intelligence’s new capabilities is Siri. Apple’s long-suffering digital assistant has been more deeply integrated into the operating system, with more conversational speech and improved natural language processing. You’ll have to manually enable the feature on your iPhone before you can use it, but doing so is a simple task.

What’s more, Siri’s memory now persists, allowing the agent to remember details from previous conversations, while the user can seamlessly switch between spoken and written prompts. Apple is reportedly working on an even more capable version of Siri, but its release may not come until 2026.

Apple is also expected to roll out a number of new capabilities as part of the iOS 18.4 update scheduled for around April 2025. These include Personal Context, which enables Siri to know both where a piece of content is on the device and how it got there. With it, you’ll be able to ask questions like “When is Mom’s flight landing?” or make requests like “Play that podcast that Jamie recommended.”

Siri will also reportedly gain a better understanding of what is happening on your device’s screen later this spring. If, for example, you’ve been texted an address, you’ll be able to simply say, “Add that to my contacts,” and Siri should be able to complete the action without further clarification from you. Both new features are part of Apple’s larger app integration initiative, which will enable Siri to interact natively with, and send commands to, a wide array of iOS and third-party applications.

Apple Intelligence privacy

A diagram showing Apple's entire setup for AI computing.
Apple

To avoid the costly and embarrassing data leaks that some of its competitors have suffered in recent months, Apple has put privacy front and center in the Apple Intelligence experience, even going so far as to build out its own private and secure AI compute cloud, named Private Cloud Compute (PCC), to handle complex user queries.

Most of Apple Intelligence’s routine operations are handled on-device, using the company’s most recent generations of A17 and M-family processors, said Craig Federighi, Apple’s senior vice president of Software Engineering, during WWDC 2024. “It’s aware of your personal data, without collecting your personal data,” he added.

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi continued. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.” This should drastically reduce the chances of private user data being hacked, intercepted, or otherwise snooped on while in transit between the device and PCC.
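
Conceptually, the routing Federighi describes boils down to a per-request decision made on the device itself. The Swift sketch below is purely illustrative; every type, name, and threshold in it (ComputeTarget, IntelligenceRequest, the context-size cutoff) is a hypothetical stand-in rather than anything Apple has published.

```swift
import Foundation

// Purely illustrative: a toy model of the on-device vs. Private Cloud Compute
// routing described above. None of these names are real Apple APIs.
enum ComputeTarget {
    case onDevice            // handled locally on A17 Pro / M-series silicon
    case privateCloudCompute // offloaded to Apple silicon servers (PCC)
}

struct IntelligenceRequest {
    let prompt: String
    let relevantContext: Data // only the data relevant to this specific task
}

func route(_ request: IntelligenceRequest) -> ComputeTarget {
    // Hypothetical heuristic: routine requests stay on-device; anything that
    // needs greater computational capacity goes to PCC, carrying only the
    // context required to fulfill it.
    let fitsOnDevice = request.prompt.count < 2_000 && request.relevantContext.count < 64_000
    return fitsOnDevice ? .onDevice : .privateCloudCompute
}
```

The real system obviously weighs far more than prompt and context size, but the privacy-relevant point from Federighi’s description is the same: the decision happens on the device, and only task-relevant data ever leaves it.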

“Your data is never stored or made accessible to Apple,” he explained. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.” The company is so confident in its cloud security that it is offering up to a million dollars to anyone able to actually hack it.

Apple Intelligence will defer to ChatGPT on complex queries

ChatGPT and Siri integration on iPhone.
Nadeem Sarwar / Digital Trends

Apple Intelligence isn’t the only cutting-edge generative AI taking up residence in your Apple devices. During WWDC 2024, Apple and OpenAI executives announced a partnership that will see ChatGPT functionality (powered by GPT-4o), including text generation and image analysis, integrated into Siri and Writing Tools. ChatGPT steps in when Siri’s onboard capabilities aren’t sufficient for the user’s query, though in that case the request is sent to OpenAI’s public compute cloud rather than to Apple’s PCC.

Users won’t have to navigate away from the Siri screen when leveraging ChatGPT’s capabilities. OpenAI’s chatbot functions in the background when it is called upon, and Siri will state the answer regardless of which AI handles the query. To ensure at least a semblance of privacy protection, the device will display a confirmation prompt before transmitting the user’s request, along with any documents or images they have attached.
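
The order of operations matters here: Siri tries its own stack first, and nothing leaves for OpenAI until the user approves. The Swift sketch below outlines that consent-gated flow; all of the function names (answerOnDeviceOrViaPCC, requestUserConfirmation, askChatGPT) are hypothetical stand-ins, not real Siri or OpenAI APIs.

```swift
import Foundation

// Illustrative outline of the consent-gated ChatGPT handoff described above.
// Every function here is a hypothetical stand-in, not a real Siri or OpenAI API.

// Stand-in for Siri’s own on-device / Private Cloud Compute answer path.
func answerOnDeviceOrViaPCC(_ query: String) async -> String? {
    query.count < 40 ? "Handled by Apple Intelligence." : nil
}

// Stand-in for the confirmation sheet shown before anything is sent to OpenAI.
func requestUserConfirmation(for query: String, attachments: [Data]) async -> Bool {
    true // a real flow would wait for the user to tap "Use ChatGPT"
}

// Stand-in for the call out to OpenAI’s cloud (not Private Cloud Compute).
func askChatGPT(_ query: String, attachments: [Data]) async -> String {
    "Answer produced by ChatGPT."
}

func handle(query: String, attachments: [Data] = []) async -> String {
    // 1. Siri tries its own capabilities first.
    if let answer = await answerOnDeviceOrViaPCC(query) {
        return answer
    }
    // 2. The user explicitly confirms the handoff, including any attachments.
    guard await requestUserConfirmation(for: query, attachments: attachments) else {
        return "Request was not sent to ChatGPT."
    }
    // 3. Only then does the query leave for OpenAI’s cloud; Siri reads the
    //    answer back regardless of which AI produced it.
    return await askChatGPT(query, attachments: attachments)
}
```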


On Day Five of the 12 Days of OpenAI event in December, OpenAI CEO Sam Altman provided additional details about how the two systems will work together, noting that ChatGPT is accessible directly from the device’s user interface (regardless of whether it’s running iOS, iPadOS, or macOS) and that users will have the option of either logging into their ChatGPT account or using it anonymously. You’ll also be able to invoke ChatGPT directly simply by telling Siri to have ChatGPT handle the task (e.g., “Siri, have ChatGPT assemble a holiday music playlist.”)

Apple Intelligence trained on Google’s Tensor Processing Units

Google's Tensor G2 chip.
Google

A research paper from Apple, published in July, reveals that the company opted to train key components of the Apple Intelligence model using Google’s Tensor Processing Units (TPUs) instead of Nvidia’s highly sought-after GPU-based systems. According to the research team, TPUs provided the computational power needed to train Apple’s enormous LLM, and did so more energy-efficiently than a comparable GPU-based setup would have.

This marks a significant departure from how business is typically done in AI training. Nvidia currently commands an estimated 70% to 95% of the AI chip market, so for Apple to opt instead for a product from one of its direct rivals, and to reveal that fact publicly, is highly unusual, to say the least. It could also be a sign of things to come. Nvidia’s market dominance can’t last forever; we’re already seeing today’s hyperscalers making moves into proprietary chip production.

Beyond Google’s ongoing TPU efforts, Amazon has announced that it’s working on its own AI chip line, one it claims will outperform Nvidia’s current offerings by 50% while consuming half as much power. Microsoft, meanwhile, announced in May 2024 that it would use AMD’s family of AI chips.

Andrew Tarantola
Former Digital Trends Contributor