What’s especially interesting here is how this upends the usual Android assistant narrative. Google has been baking Gemini into Android with increasing urgency, and now we have Android hardware makers cutting deals with a third-party assistant — one that’s not Google. That’s a big deal. It says something about how fast the AI landscape is shifting.
And then there’s Apple.
Just imagine a future where Samsung phones have Perplexity, Google phones have Gemini, and Apple…still has Siri.
Yikes.
Now, I get it. This is very on-brand for Apple. They like to own the whole stack, and they’re not the type to go shopping for outside solutions. And to be fair, there have been increasing signals that Apple is pouring serious resources into AI lately. Hopefully that’s not just for show.
Part of me can’t help but wonder if one of the smartest moves Apple could have made was to simply buy Perplexity. But that ship may have already sailed.
It’s not just about having an assistant; it’s about having one that people want to use. The AI space is sprinting, and it feels as if Apple is still stretching on the sidelines.
For years now I’ve been writing and talking about the trouble with Siri. The problem became even more acute for Apple with the rise of large language models (LLMs) and the world’s collective realization of just how useful a smart artificial intelligence can be.
Last June, it seemed as if Apple finally found religion about making Siri better. At WWDC 2024, they outlined an “Apple Intelligence” strategy that made a lot of sense. While I never expected Apple to build something on par with one of the frontier models, like ChatGPT, I continue to think they don’t need to. If Apple’s AI could remain private and access all my data, that alone would make it more useful than most artificial intelligence. Moreover, as the platform owner, a smart Siri could act as an AI traffic cop, sending more complex requests to the appropriate outside models.
So I think Apple has the right vision, but I’m starting to question their ability to execute on it. Apple has yet to release even a beta of the iOS 18 version with, as one Apple employee explained to me, the “Siri Brain Transplant.” Indeed, Apple recently announced that the advanced Siri features won’t ship in iOS 18 after all. So the brain transplant has been postponed.
Late last year, there was a rumor that Apple was separately working on an LLM-based Siri for iOS 19 that would really show how good Siri can be. The fact that there is already a rumor of a new thing when we don’t yet have the improved old thing doesn’t inspire confidence.
It gets worse, though. Mark Gurman, a reliable source, now reports that the new LLM Siri is also behind and that its conversational features may not reach consumers until 2027. Ugh. If true, Apple’s failure to deliver on Siri is an epic miss on the order of the Apple Maps and MobileMe launches.
The current LLM leaders are evolving weekly. Can you imagine how good they are going to be by 2027? I honestly can’t.
If these rumors are true, Apple is in trouble. It’s not the 1995, Apple-will-they-go-out-of-business kind of trouble, but it is trouble nonetheless. M.G. Siegler suggests that if Apple truly is this far behind, they should just default to ChatGPT until they can get their act together. That would be incredibly embarrassing for Apple, but this whole situation is exactly that. It looks like Apple’s AI initiative has a long way to go. Back when the MobileMe launch failed so miserably, people joked that Steve Jobs walked the hallways at Cupertino with a flamethrower strapped to his back, asking everyone he met, “Do you work on MobileMe?” When it comes to AI, I think Apple is approaching a flamethrower moment. John Gruber agrees.
It is worth noting that in January 2025, 13 years after it was first released, Siri continues to be a joke. Every time I try to use Siri in front of a member of my family, there’s a 50–50 chance of success. When it fails, they laugh at it, and at me.
Recently, I was going to bed and told Siri to turn off the bedroom lights. Instead, it turned off all the lights downstairs on the rest of my family. (For bonus points, the bedroom lights stayed on.)
Paul Kafasis and John Gruber have both recently documented Siri’s failings. I can’t help but feel we are at a crossroads. Apple made many promises in June about Siri improving with the arrival of Apple Intelligence, and we haven’t seen it in the public betas yet. I hope we get a peek with the next beta cycle, iOS 18.4.
I am not encouraged by the rumor that yet another Siri brain transplant is coming somewhere down the road. I find it hard to believe that Apple lacks the competency to substantially improve Siri, which makes me wonder when they’ll find the resolve to do so. I sure hope it’s this year.
It’s easy to quarterback from the outside, but if I were in charge, I’d start a “boil the ocean” project to fix Siri and I’d seriously consider rebranding it to something else so that normal folks (like my family) don’t immediately associate it with a joke.
Mark Gurman’s got another AI/Siri report and it’s a doozy. According to the latest rumors, Apple is cooking up an LLM-powered Siri for iOS 19 and macOS 16.
The idea is that this would be yet another Siri reboot, but this time built on Apple’s own AI models. Think ChatGPT or Google’s Gemini but with that special Apple sauce (and privacy-focused access to your on-device data).
Here’s where I get a bit twitchy, though. Apple has been tight-lipped about the details of its AI strategy, and it’s starting to wear thin. If this massive LLM overhaul is really coming next year, what exactly are we getting with the current “Apple Intelligence” features that are supposed to land this year?
If, after all the WWDC and iPhone release hype, we get through all these betas only to find that Siri is still struggling with basic tasks, and Apple then says, “But wait until next year, we’ve got this whole new system that will finally fix everything,” well, that will be just a little heartbreaking for me.
Last week Ryan Christoffel over at 9to5Mac quoted the latest Mark Gurman report about Apple developing an additional AI personality. Gurman reports that Apple is working on “[…]another human-like interface based on generative AI.” Like Ryan, I am confused by this.
For too long, Apple let Siri linger. It’s been the butt of jokes in tech circles for years. We’re told that this year will be different and Siri will truly get the brain transplant it deserves. But if so, why is Apple working on an entirely different human-like interface? Does this signal that the Siri update isn’t all it should be?
It’s too early for any of us on the outside to tell. There are some Siri updates in 18.1, but they are largely cosmetic. We’re still waiting for the other shoe to drop on Siri updates in later betas.
However, the idea that Apple is already working on the next thing before they fix the current shipping thing does make me a little nervous. I realize that at this point, we’re all just reading tea leaves, and I could be completely off the mark here, but I sincerely hope that the updates to Siri this year get all of the effort that Apple can muster.
Yesterday, Apple announced its new name for artificial intelligence tools on its platform: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there with the announcement of Apple Intelligence.
The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service, but to a group of intelligence-related features:
Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the Internet and in many apps on our Macs. Apple had to do this.
They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.
Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.
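For developers, the lift looks small: standard system text views are supposed to pick up these Writing Tools automatically, with a single knob to control how much of the experience a view opts into. Here’s a minimal sketch, assuming the iOS 18 UIKit property for this (writingToolsBehavior) works the way I understand it:

```swift
import UIKit

// A plain UITextView is expected to get the system Writing Tools
// (proofread, rewrite, summarize) for free on devices that support
// Apple Intelligence; the app only chooses how much to allow.
final class NotesViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        // Opt in to the full experience; use .limited or .none to
        // restrict or disable Writing Tools for this view.
        textView.writingToolsBehavior = .complete

        view.addSubview(textView)
    }
}
```

If that holds up, “little or no work” is about right: most apps ship the feature simply by using the standard text controls.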
LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was: would Apple ship something inferior, work with one of the big players, or leave an LLM out of this initiative entirely? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.
This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.
If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.
Private AI
This was the most important component of Apple Intelligence, and it was underplayed in the keynote. Using the built-in Neural Engine on Apple silicon, Apple Intelligence is meant to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.
That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:
“Move this note into my Inactive Projects folder,” requiring access to Apple Notes.
“Email this presentation to Zara,” requiring access to Keynote and Apple Mail.
“Play the podcast that my wife sent the other day,” requiring access to data in the Podcasts and Messages apps.
While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.
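On the developer side, the plumbing for this is presumably the App Intents framework that Shortcuts already uses: apps expose actions, and the system decides when to invoke them. Here’s a rough sketch of the kind of intent a notes app might offer to satisfy the first request above. The type names and the NotesStore helper are hypothetical, for illustration only:

```swift
import AppIntents

// Hypothetical stand-in for the app's own data layer.
final class NotesStore {
    static let shared = NotesStore()
    func move(noteTitled title: String, to folder: String) async throws {
        // A real app would update its database here.
    }
}

// An intent the assistant could invoke to satisfy
// "Move this note into my Inactive Projects folder."
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note to Folder"

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folder: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try await NotesStore.shared.move(noteTitled: noteTitle, to: folder)
        return .result(dialog: "Moved “\(noteTitle)” to \(folder).")
    }
}
```

The interesting part isn’t the intent itself; it’s the system knowing which note “this note” refers to. That’s exactly the personal-context piece Apple is promising.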
“AI for the Rest of Us”
During the WWDC keynote, I heard the term “artificial intelligence” only once, at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach, and I agree with the philosophy.
I’m convinced that Apple has thought about AI in a way that makes sense to me and that results in something I’d actually want to use. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.
Apple’s challenge is the way it let Siri linger for so long. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up with it, and Siri never fulfilled its potential.
Looking at the Siri example, I should be skeptical of Apple Intelligence and Apple’s commitment to it. Yet I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.
This week, we got the news that the $4.99 per month Apple Music Voice Plan has been shuttered. I’m not surprised. When it was announced, it seemed off-brand for Apple; I don’t think many people want a voice-only music service. Combined with Siri’s spotty performance, the plan only put a spotlight on Siri’s shortcomings.
The Information has an article by Wayne Ma reporting Apple is spending “millions of dollars a day” on Artificial Intelligence initiatives. The article is pay-walled, but The Verge summarizes it nicely.
Apple has multiple teams working on different AI initiatives throughout the company, including Large Language Models (LLMs), image generation, and multi-modal AI, which can recognize and produce “images or video as well as text”.
The Information article reports that Apple’s Ajax GPT has more than 200 billion parameters and is more potent than GPT-3.5.
I have a few points on this.
First, this should be no surprise.
I’m sure folks will start writing about how Apple is now desperately playing catch-up. However, I’ve seen no evidence that Apple got caught with its pants down on AI. They’ve been working on artificial intelligence for years. Apple’s head of AI, John Giannandrea, came from Google and has been with Apple for years. You’d think people would know by now that just because Apple doesn’t talk about things doesn’t mean they aren’t working on them.
Second, this should dovetail with Siri and Apple automation.
If I were driving at Apple, I’d make the Siri, Shortcuts, and AI teams all share the same workspace at Apple Park. Thus far, AI has been smoke and mirrors for most people. If Apple could implement it in a way that directly impacts our lives, people would notice.
Shortcuts, with its Actions, gives Apple an easy way to pull this off. Example: you leave 20 minutes late for work. When you connect to CarPlay, Siri asks, “I see you are running late for work. Do you want me to text Tom?” That seems doable with an AI and Shortcuts. The trick would be for it to self-generate: it shouldn’t require me to already have an “I’m running late” shortcut; it should build one dynamically as needed. As reported by 9to5Mac, Apple wants to incorporate language models to generate automated tasks; there’s a rough sketch of the idea below.
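None of this exists today, so take the following purely as a thought experiment rather than any real API: a toy Swift sketch of what a “self-generating” automation might look like, with a stand-in function playing the part of the on-device model that decides whether to propose anything at all.

```swift
import Foundation

// Toy model of context signals the system already has on hand.
struct ContextSignal {
    let calendarEventTitle: String
    let minutesLate: Int
    let connectedToCarPlay: Bool
}

// A proposed one-off automation, generated on the fly rather than
// pulled from a user-built shortcut.
struct ProposedAutomation {
    let prompt: String      // what Siri would ask the user
    let messageBody: String // what it would send on confirmation
    let recipient: String
}

// Stand-in for an on-device model deciding whether an automation is
// worth proposing. A real system would rank many candidate actions.
func proposeAutomation(for signal: ContextSignal, contact: String) -> ProposedAutomation? {
    guard signal.connectedToCarPlay, signal.minutesLate >= 15 else { return nil }
    return ProposedAutomation(
        prompt: "I see you are running late for \(signal.calendarEventTitle). Do you want me to text \(contact)?",
        messageBody: "Running about \(signal.minutesLate) minutes late, see you soon.",
        recipient: contact
    )
}

// Example: leaving 20 minutes late and plugging into CarPlay.
let signal = ContextSignal(calendarEventTitle: "work", minutesLate: 20, connectedToCarPlay: true)
if let automation = proposeAutomation(for: signal, contact: "Tom") {
    print(automation.prompt)
}
```

The hard part isn’t the plumbing; it’s the judgment about when a suggestion is helpful rather than annoying, which is exactly where an on-device model earns its keep.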
Similarly, this technology could result in a massive improvement to Siri if done right. Back in reality, however, Siri still fumbles simple requests routinely. There hasn’t been the kind of improvement that users (myself included) want. Could it be that all this behind-the-scenes AI research is Apple’s ultimate answer on improving Siri? I sure hope so.
I’ve spent too much time complaining about Siri lately. That’s not because Siri doesn’t deserve criticism (it does!) but because my job is to help you get the most out of this stuff, even when the underlying tech could be better…
There’s a lot of news lately about Apple staffing up Siri. First, we heard that they are adding something like 100 additional engineers to the product. Now The New York Times is reporting that Apple hired Google’s former artificial intelligence chief, John Giannandrea, to oversee Apple’s machine learning and artificial intelligence efforts. Reportedly, Giannandrea will report directly to Tim Cook.
Speaking at John Gruber’s Daring Fireball party a few years ago, Apple’s Craig Federighi and Phil Schiller both explained that Apple can still make Siri smart without looking at all of its users’ data the way Google does. I don’t remember the exact example, but they said something like they don’t need to look at your pictures of mountains to teach a computer what a mountain looks like. Nevertheless, Siri does lag behind competing virtual assistants. I found their confidence uplifting because I want Apple both to protect my privacy and to make Siri smarter.
It looks like Apple is going to try to make Siri better by adding engineering resources while maintaining its position on user privacy. I hope this makes a difference because Google and Amazon certainly aren’t standing still.
Regardless, don’t expect results immediately. I think Siri improvements will come gradually, over time, similar to the way Apple has improved its cloud services. iCloud has come a long way over the past few years, but that would be easy to miss if you weren’t paying attention.