During WWDC 2024, Apple poured a big vat of artificial intelligence onto expectant viewers, leaving us drenched in new AI features under the banner of Apple Intelligence. But how do all these features work?
Apple just held one of its most groundbreaking Worldwide Developers Conference (WWDC) keynotes, announcing major artificial intelligence (AI) features coming to its iPhone, iPad, and Mac operating systems later this year.
While we weren't expecting the company to unveil a slew of flashy generative AI features, the announcements were still enough to impress spectators at the event.
Also: Apple unveils an on-device AI image generator for iPhone, iPad, and Mac
Apple was expected to focus on incorporating AI into its apps to simplify users' daily tasks, and it grouped these features under the name "Apple Intelligence." We didn't miss the wordplay.
Apple Intelligence focuses on broad-appeal AI features rather than advanced image and video generation. To do this, the company developed in-house AI models and partnered with OpenAI to power a chatbot that will integrate ChatGPT into iOS, iPadOS, and macOS.
Some of the biggest AI features from Apple Intelligence include:
- A redesigned, more conversational Siri with on-screen awareness
- Clean Up, an AI photo editing tool for removing unwanted objects
- Natural language search and AI-generated Memory videos in Photos
- Systemwide Writing Tools for rewriting, proofreading, and summarizing text
- Image Playground, an on-device AI image generator
- ChatGPT integration with Siri and Writing Tools
Aside from these AI features, iOS 18 will include new customizable icons and interface updates for Control Center, Settings, and Messages. Apple is also launching a new Passwords app to replace the iCloud Keychain and give users a more user-friendly option, similar to 1Password and LastPass.
Also: Forget LastPass: Apple unveils 'Passwords' manager app at WWDC 2024
Siri is getting an upgrade, inside and out. The virtual assistant has a new look and a new list of features for users. The Siri logo has been redesigned, and instead of the familiar Siri bubble, a glowing, colorful light that wraps around the edges of the display will appear when users talk to the assistant.
Looks aside, Siri will now understand and respond more naturally, making interactions feel more human-like. The AI assistant will also be able to maintain context from one request to the next to answer follow-up questions accurately.
Also: Every iPhone model that will get Apple's iOS 18 (and which ones won't)
With on-screen awareness and context from your iPhone, iPad, or Mac, Siri can also draw on information in your email, photos, and messages. Apple shared an example of Siri adding an address to a contact card after a friend shared it in a text message.
The voice assistant will also be able to perform hundreds of new actions across Apple and third-party apps, like opening articles from a Reading List or looking up a specific photo in your library.
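Actions like these build on Apple's existing App Intents framework, which lets apps describe what they can do to the system. Below is a minimal sketch of how a third-party reading app might expose an "open a saved article" action; the intent, parameter, and ArticleLibrary helper are illustrative examples, not Apple's actual code.

```swift
import AppIntents

// Hypothetical store standing in for a reading app's own data layer.
final class ArticleLibrary {
    static let shared = ArticleLibrary()
    func open(matching title: String) {
        // App-specific lookup and navigation would go here.
        print("Opening saved article matching: \(title)")
    }
}

// A minimal App Intent a reading app might declare so Siri can handle
// requests like "open that article I saved about WWDC."
struct OpenSavedArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Saved Article"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Article Title")
    var articleTitle: String

    func perform() async throws -> some IntentResult {
        ArticleLibrary.shared.open(matching: articleTitle)
        return .result()
    }
}
```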
Apple is also upgrading Siri to understand text, allowing users to type or speak to Siri as needed.
Like the Google Pixel's Magic Eraser, Apple is giving its devices a new AI-powered photo editing feature called Clean Up. The tool identifies distracting objects in the background, removes them, and regenerates the background behind them, making it easy to wipe out photobombers and other unwanted elements.
Also: 'Clean up' is the iPhone's new AI editing tool to wipe out photobombers
Apple is adding natural language photo and video search to its Photos app. This will let users enter a prompt to search for a picture, video, or even a segment of a video. Users can say, "Find a photo of that starfish we found at the beach last summer," and have their iPhone pull it up without scrolling through thousands of images.
In Photos, users will also be able to type a prompt to create Memory videos: compilations of videos and photos that match the description, set to music suggestions from Apple Music.
Apple is offering new systemwide writing tools for the iPhone, iPad, and Mac that will help users rewrite, proofread, and summarize text using AI. You can quickly generate a reply or summarize a long email to see what it's about.
The same feature can help users make major edits to text, such as changing the tone to make it more empathetic or proofreading to catch mistakes.
Also: You can finally schedule messages on the iPhone. Here's what to know
Apple devices will also get AI-powered recording and transcription upgrades in the Voice Memos and Phone apps, letting users quickly transcribe calls and voice notes.
Apple will let users generate images using AI in Messages, Notes, Keynote, Freeform, Pages, and third-party apps with Image Playground. Images can be generated in Animation, Illustration, and Sketch styles.
In the Notes app, for example, a new tool called Image Wand will let you circle a rough sketch and have on-device AI turn it into a polished image, with Image Playground pulling context from the surrounding notes.
Image Playground will be available in beta this fall, and some features will be released over the next year.
While Apple was busy drenching us in AI updates, it also confirmed the rumors of a new partnership with OpenAI, the company behind ChatGPT. The partnership brings ChatGPT integration to Siri and the systemwide Writing Tools.
When users make a Siri request that the voice assistant determines would be better handled by ChatGPT, the virtual assistant asks them if it's okay to share their prompt with the AI chatbot. Then, it shows the ChatGPT response in the same window.
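Apple has not published how Siri decides when to hand a request off, but the flow it described (route, ask permission, then show the response in place) can be sketched roughly as below. Every type and function here is hypothetical; none of this is part of Apple's SDKs.

```swift
import Foundation

// Hypothetical sketch of the consent-before-handoff flow described above.
enum Handler { case siri, chatGPT }

struct SiriRequestRouter {
    // Guess whether a prompt is better served locally or by ChatGPT
    // (e.g., open-ended world knowledge or creative writing requests).
    func preferredHandler(for prompt: String) -> Handler {
        let openEndedKeywords = ["write", "brainstorm", "explain", "recipe"]
        let isOpenEnded = openEndedKeywords.contains { prompt.lowercased().contains($0) }
        return isOpenEnded ? .chatGPT : .siri
    }

    func handle(_ prompt: String,
                askUserConsent: () -> Bool,
                sendToChatGPT: (String) -> String,
                answerLocally: (String) -> String) -> String {
        guard preferredHandler(for: prompt) == .chatGPT else {
            return answerLocally(prompt)
        }
        // Siri asks permission before sharing the prompt with ChatGPT.
        guard askUserConsent() else {
            return answerLocally(prompt)
        }
        // The ChatGPT response is then shown in the same Siri window.
        return sendToChatGPT(prompt)
    }
}
```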
Also: Apple finally gave us the iPad app we've waited 14 years for at WWDC 2024
Apple says its ChatGPT integration has privacy protections in place, with obscured IP addresses and no request storage by OpenAI. ChatGPT on iPhone, iPad, and Mac will be free to use, provided the device is compatible with the latest operating system. Paying ChatGPT users will get access to additional Plus features.
For months, Apple was rumored to be working on different ways to keep its AI running strictly on device for security and privacy. However, Apple Intelligence is expected to rely on the cloud for at least some tasks, though the company is prioritizing on-device processing for enhanced privacy. Whether a specific task will be processed on-device or in the cloud will depend on task complexity, resource availability, data privacy considerations, and latency requirements.
Essentially, if a task is simple enough to run within the device's processing power and battery budget, and needs immediate results, it is more likely to be handled on-device. Tasks involving sensitive data also favor on-device processing, given Apple's emphasis on data privacy.
Also: Here's every iPhone model that will support Apple's new AI features (for now)
Cloud-based AI processing, in turn, sends data from the device to remote servers that can handle complex or computationally heavy work. In Apple's case, that could include tasks requiring large amounts of data or larger, more up-to-date models, such as intricate analysis and advanced generative AI requests.
Apple is leveraging what it calls Private Cloud Compute for complex tasks that require cloud servers. These processes draw on larger server-based models while protecting user privacy. The servers are built on Apple Silicon, and the data is never saved in the cloud.
Depending on its complexity and system requirements, an algorithm will determine whether an AI task should be processed on-device or offloaded to the cloud. Simpler jobs, like a basic Siri request or other lightweight natural language processing, can run on-device. More complex tasks, like generating a detailed summary of a large document, will be sent to the cloud, where more robust processing can occur.
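Apple has not detailed that routing algorithm, but the factors it cites (complexity, resource availability, data sensitivity, and latency) suggest a heuristic along these lines. The sketch below is purely illustrative, not Apple's actual orchestration logic.

```swift
// Hypothetical routing heuristic based on the factors Apple described.
enum ProcessingTarget { case onDevice, privateCloudCompute }

struct AITask {
    let complexity: Int          // e.g., 1 (simple Siri request) to 10 (long-document summary)
    let needsImmediateResult: Bool
    let involvesSensitiveData: Bool
}

func route(_ task: AITask, deviceHasCapacity: Bool) -> ProcessingTarget {
    // Sensitive or latency-critical work stays local when the device can handle it.
    if (task.involvesSensitiveData || task.needsImmediateResult),
       task.complexity <= 5, deviceHasCapacity {
        return .onDevice
    }
    // Otherwise, simple tasks stay on-device; heavy ones go to Private Cloud Compute.
    return (task.complexity <= 5 && deviceHasCapacity) ? .onDevice : .privateCloudCompute
}
```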
"Apple Intelligence has groundbreaking privacy protection," said Craig Federighi, Apple's senior vice president of Software Engineering, during the WWDC 2024 keynote. Apple strongly emphasized consumer privacy when it announced Apple Intelligence, with Federighi saying its "architecture is built with privacy at the core."
The company is striving to keep the AI features secure through several mechanisms, including:
- Prioritizing on-device processing, especially for tasks involving sensitive data
- Private Cloud Compute, which runs complex tasks on Apple Silicon servers without saving user data in the cloud
- Obscured IP addresses and no request storage by OpenAI for the ChatGPT integration
Apple's new AI features will be compatible with the latest Apple devices, including iPhone 15 Pro or newer models, which run on an A17 Pro chip, and iPads and Macs with an M1 chip or newer.
While these AI features may help drive sales of new iPhones and Macs, as a current iPhone 14 Pro Max owner, I hope that at least some will trickle down to older iPhone models.