Apple Intelligence is integrated into iOS 18, iPadOS 18 and macOS Sequoia.
Apple has introduced Apple Intelligence, a personal intelligence system for iPhone, iPad and Mac, which combines the power of generative models with personal context.
The personal intelligence system was unveiled yesterday at the 2024 Apple Worldwide Developers Conference at Apple Park in Cupertino, where the company laid out its artificial intelligence (AI) roadmap.
The ongoing event highlights the latest iOS, iPadOS, macOS, watchOS, tvOS and visionOS advancements for developers.
Apple Intelligence is integrated into iOS 18, iPadOS 18 and macOS Sequoia, says the company. It adds that the system harnesses Apple silicon to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks.
“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products – and what our products can do for our users,” said Apple CEO Tim Cook.
“Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them. This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”
According to the tech giant, Apple Intelligence unlocks new ways for users to enhance their writing and communicate more effectively. With new writing tools built into iOS 18, iPadOS 18 and macOS Sequoia, users can rewrite, proofread and summarise text in Mail, Notes, Pages and third-party apps, says Apple.
It points out that with Rewrite, Apple Intelligence allows users to choose from different versions of what they have written, adjusting the tone to suit the audience and task at hand.
With Priority Messages, it adds, a new section at the top of the inbox shows the most urgent e-mails, while Smart Reply provides suggestions for a quick response and can identify questions in an e-mail.
Priority Notifications appear at the top of the stack to surface what’s most important, and summaries help users scan long or stacked notifications to show key details on the Lock Screen, such as when a group chat is particularly active.
In the Notes and Phone apps, users can now record, transcribe and summarise audio. Apple explains that when a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.
The company notes that, powered by Apple Intelligence, Siri has become more deeply integrated into the system experience. With richer language-understanding capabilities, it says Siri is now more natural, contextually relevant and personal, with the ability to simplify and accelerate everyday tasks.
Apple explains that Siri can follow along if users stumble over words and maintain context from one request to the next. Additionally, users can type to Siri and switch between text and voice as they communicate with it.
Siri can now give users device support everywhere they go, and answer thousands of questions about how to do something on iPhone, iPad and Mac.
At the event, Apple also announced it is integrating ChatGPT access into iOS 18, iPadOS 18 and macOS Sequoia, allowing users to access ChatGPT's expertise − as well as its image- and document-understanding capabilities − without needing to jump between tools.
According to the company, Siri can tap into ChatGPT's expertise when helpful. Users are asked for permission before any questions, documents or photos are sent to ChatGPT, and Siri then presents the answer directly.