All the AI Features Apple Is Planning to Roll Out This Year

As the rest of the tech world scrambles to add as many AI features as possible to every product and service imaginable, Apple has kept quiet. In the eighteen months since OpenAI changed the game with the launch of ChatGPT, Apple has yet to roll out any substantial AI features to the iPhone, iPad, or Mac, even as Microsoft, Google, and Meta have seemed focused on nothing else.

But if the rumors are to be believed, that’s changing this year, as Apple is expected to release new AI features for iOS 18 and macOS 15 at WWDC in June. There have been murmurs about the trillion-dollar company’s AI plans for months now, and the hints keep on coming. According to Bloomberg’s Mark Gurman, who has a solid track record reporting on Apple rumors, the company is planning an AI approach that isn’t quite as flashy as some of the competition’s efforts. Instead, Apple will roll out AI features that integrate with the apps iPhone and Mac users already know and use, such as Photos, Notes, and Safari. The initiative is known as “Project Greymatter.”

As a bit of an AI skeptic, I like this plan. It allows Apple to enter into the AI space while offering AI-powered features people might actually use, rather than wasting resources on “revolutionary” AI capabilities that most users will ignore once the novelty wears off.

How Apple plans to power its AI features

Whatever Apple ends up announcing this year, its AI features will need to be powered by…something. Your iPhone or Mac likely already has an NPU (neural processing unit), a piece of hardware specifically designed for running AI tasks, which Apple brands the “Neural Engine.” (Apple already ships some AI features, including Live Text, that use the NPU to handle processing.) The company made a big deal about the NPU in the M4 chip in the new iPad Pros, and once that chip comes to the new line of Macs, it will likely power many of Apple’s upcoming AI features.
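For a concrete sense of what on-device processing looks like, here’s a minimal Swift sketch of Live Text–style recognition using Apple’s public Vision framework. This isn’t Apple’s internal implementation, just the developer-facing API for the same kind of task; whether any given request actually runs on the NPU is up to the system.

```swift
import Vision

// A minimal sketch of on-device text recognition, the same kind of task
// Live Text performs. VNRecognizeTextRequest runs locally; the system
// decides whether the work lands on the NPU, GPU, or CPU.
func recognizeText(in imageURL: URL) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate // slower, but better results

    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])

    // Each observation carries ranked candidates; keep the top string.
    return (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
}
```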

However, not all features are a good fit for on-device processing, especially on older iPhones and Macs. Gurman predicts that most of Apple’s features will be able to run locally on devices made in roughly the last year. If your iPhone, iPad, or Mac is pretty new, the hardware should keep up; for older devices, or for features that are particularly power hungry, Apple plans to offload that processing to the cloud.

Apple has reportedly been in talks with both OpenAI and Google to lease those companies’ cloud-based AI processing to run some of its new features, but it’s not clear whether (or when) those deals will materialize. Gurman says Apple plans to run some cloud-based features from server farms with M2 Ultra chips. If Apple can handle its cloud processing on its own, I’m sure it would prefer that to signing a deal with a competitor. We’ll likely see how Apple’s grand AI plan is coming together at WWDC.

Speaking of plans, here are the AI features Apple is rumored to be revealing in June:

Generative AI emojis

Emojis are a huge part of any iOS or macOS update (we may have already seen a handful of new emojis heading for a future version of iOS 18 and macOS 15). However, Gurman suggests Apple is working on a feature that will create a unique emoji with generative AI based on what you’re currently typing. That sounds genuinely fun and useful, if done well. There are already thousands of emoji to choose from, but if none fits your particular mood, an icon generated from whatever you’re actively discussing with a friend might be a better choice. Over on Android, users have had “Emoji Kitchen,” which lets you combine certain emoji to create something brand new, for a few years now. Apple seems poised to offer an effective iteration on that idea.

Siri, powered by AI

I don’t know about you, but I’ve never been too thrilled with Siri. The smart assistant frequently fails to follow through on my requests, whether because it misunderstands my query, or just ignores me altogether. For some reason, it’s especially bad on macOS, to the point where I don’t bother trying to use Siri at all on my MacBook. If Apple can find a way to supercharge Siri with AI, or at least make it reliable, that sounds great to me.

We learned about the possibility of Apple integrating AI with Siri earlier this month, based on info leaked to The New York Times by an unnamed source. Gurman’s recent report suggests Apple is planning on making interactions with Siri “more natural-sounding,” and while it’s possible the company will outsource that work to Google or OpenAI, Gurman says the company wants to use its own in-house large language models (LLMs). There may even be a specialized AI Siri in the works for the Apple Watch.

That doesn’t mean Apple is turning Siri into a chatbot—at least not according to Gurman, who reports that Apple wants to find a partner that can provide a chatbot for the company’s platforms in time for WWDC, without Apple having to build one itself. Right now, it appears the company has sided with OpenAI over Google, so we may see ChatGPT on the iPhone this year.

Intelligent Search in Safari

Earlier this month, we learned Apple has at least one AI feature planned for Safari: Intelligent Search. This feature reportedly scans any given web page and highlights keywords and phrases in order to build a generative AI summary of the page. While Gurman says Apple is working on improved Safari web search, we don’t know too much more about how Intelligent Search will work just yet.
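We can only guess at the pipeline, but the keyword-extraction step is the kind of thing Apple’s public NaturalLanguage framework already handles. The Swift sketch below is a toy illustration, not Apple’s implementation: it pulls out the nouns a summarizer might weigh before generating a recap of a page.

```swift
import NaturalLanguage

// A toy illustration only, not Apple's implementation. It uses the
// public NaturalLanguage framework to pull out the nouns a summarizer
// might weigh when building a recap of a page.
func extractKeywords(from pageText: String) -> [String] {
    let tagger = NLTagger(tagSchemes: [.lexicalClass])
    tagger.string = pageText

    var keywords: [String] = []
    tagger.enumerateTags(in: pageText.startIndex..<pageText.endIndex,
                         unit: .word,
                         scheme: .lexicalClass,
                         options: [.omitPunctuation, .omitWhitespace]) { tag, range in
        // A real system would also weight terms by frequency and position.
        if tag == .noun {
            keywords.append(String(pageText[range]))
        }
        return true // keep scanning
    }
    return keywords
}
```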

AI in Spotlight search

Speaking of better search, Apple may use AI to make on-device searches in Spotlight more useful. The feature has always given me mixed results, so I’d love it if generative AI could not only speed up my searches, but also return more relevant ones.

AI-powered accessibility features

Apple recently announced a handful of new accessibility features coming “later this year,” which is almost certainly code for “coming in iOS 18 and macOS 15.” While not all of these features are powered by AI, at least two of the ones that are seem really interesting. First, Apple is using AI to let users control their iPhone and iPad with just their eyes, no external hardware required. There’s also “Listen for Atypical Speech,” which uses on-device AI to learn and identify your particular speech patterns.

Automatic replies to messages

As of last year, iOS and macOS suggest words and phrases as you type, to help you finish your sentences faster. (This feature is not to be confused with the three predictive text options in Messages that have been around for years now.) With iOS 18 and macOS 15, Apple may roll out automatic replies as well: when you receive an email or a text message, the app may suggest a full reply based on whatever you’re responding to. Soon, we may all be communicating with single button taps. (Hmm, do I want to tap the response that says “Sounds good: I’ll meet you there,” or “Sorry, I can’t. Raincheck!”)

Voice Memo transcription

This one would be a great use of AI: Apple already transcribes voice messages sent in the Messages app (quite quickly, in my experience), so using AI to transcribe recordings you make in Voice Memos only makes sense.
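In fact, the building blocks are already public. Here’s a hedged Swift sketch of what an on-device transcription pass could look like using Apple’s Speech framework; whatever Apple actually ships in Voice Memos may work quite differently.

```swift
import Speech

// A hedged sketch using Apple's public Speech framework; whatever ships
// in Voice Memos may work differently. Assumes the user has already
// granted permission via SFSpeechRecognizer.requestAuthorization.
func transcribe(memoURL: URL, completion: @escaping (String?) -> Void) {
    guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else {
        completion(nil)
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: memoURL)
    // Keep the audio on the device when the model supports it.
    request.requiresOnDeviceRecognition = recognizer.supportsOnDeviceRecognition

    recognizer.recognitionTask(with: request) { result, error in
        // Partial results stream in; report only the final transcription.
        guard let result = result, result.isFinal else {
            if error != nil { completion(nil) }
            return
        }
        completion(result.bestTranscription.formattedString)
    }
}
```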

Smart recaps

Gurman says one of Apple’s big focuses is “smart recaps,” or AI-generated summaries of information you may have missed while not looking at your iPhone, iPad, or Mac, including notifications, messages, web pages, articles, notes, and more. iOS already has a notification summary feature, but these smart recaps sound more involved.

Photo retouching

We don’t know too much about this feature, but Gurman also says Apple is working on ways to use AI for photo retouching. The company has already built an AI image editor that takes in natural language prompts to perform edits; it’s possible Apple will incorporate some of those capabilities into an AI photo “enhancer” for the Photos app on iOS and macOS.



by Lifehacker