Good Grief, Siri!

Bloomberg reports that the new Gemini-powered Siri features planned for iOS 26.4 are being pushed back. Some to iOS 26.5 in May. Some to iOS 27 in September.

Have you ever felt like Charlie Brown running to kick the football, only to notice, at the last minute, that Lucy is holding it?

This Is the Year Apple Has to Deliver on Siri

Mark Gurman delivered more Siri news this week, and I’m left with the same feeling I’ve had for over a year now: equal parts hope and frustration.

Here’s the picture as it currently stands. Apple is planning two separate Siri overhauls, releasing months apart.

The Spring Update: iOS 26.4

The first update arrives with iOS 26.4, expected around March or April. This is the non-chatbot version built on a custom Google Gemini model running on Apple’s Private Cloud Compute servers. The goal here seems straightforward: finally cash all those checks Apple wrote at WWDC 2024.

Remember those promises? Siri that understands personal context. Siri that can find the book recommendation your mom texted you. Siri that works across apps instead of being confined to one at a time. Features that were supposed to ship with iOS 18, then got pushed to “later,” then pushed again to 2026.

The Fall Overhaul: iOS 27

Then, just a few months later at WWDC 2026, Apple plans to announce an entirely different approach. This one is codenamed “Campos,” and it’s a full chatbot experience. Think Claude or ChatGPT, but baked directly into your iPhone, iPad, and Mac. Voice and text inputs. Persistent conversations you can return to. The works.

Craig Federighi has previously expressed skepticism about chatbot interfaces, preferring AI that’s “integrated into everything you do, not a bolt-on chatbot on the side.” But competitive pressure from OpenAI and others seems to have changed the calculus.

Why Two Versions?

This is where I get frustrated. Releasing two fundamentally different versions of Siri months apart doesn’t inspire confidence. The first version sounds like something they cobbled together just to say they kept their promises. I’d almost prefer they skipped it entirely and focused all their energy on the chatbot.

Why This Matters

I’ve been critical of Siri over the last decade. Every year Apple makes promises it can’t keep. Every WWDC brings demos of features that arrive late, broken, or not at all.

And yet I continue to believe that a smart model on our Apple devices, with access to our local data, where everything stays local and private or runs through Private Cloud Compute, could be one of the best implementations of AI we’ve seen.

Think about who this could help. I spent 30 years practicing law. I know firsthand how many professionals are locked out of these AI tools because the privacy story isn’t good enough. Apple could change that.

And for the rest of us? We’re not particularly excited about sharing our personal information with giant AI companies either. A truly private assistant that actually knows your life without selling it to advertisers? That’s the dream.

Apple is uniquely positioned to deliver this. They have the hardware. They have the ecosystem integration. They have the privacy infrastructure. They have over 2 billion devices that could benefit.

But they have yet to prove they can actually ship it.

Where I Am Right Now

I currently use Siri where I can, but that’s very limited. I get far more use out of Claude than I do Siri at this point. (Claude’s recent Cowork feature is shockingly impressive.) That’s not where I want to be. I want the assistant built into my devices to be the one I reach for first.

The Bottom Line

All of this feels like it’s coming to a boiling point. We’ve all been patient with Apple for years now. It’s time for them to prove whether or not they can pull this off.

Let’s hope that in six months, Apple has finally answered the call.

Gemini and Siri

We’ve now got official confirmation that Apple and Google are working together on what I’ve been calling the “Siri brain transplant.” Instead of Apple’s own LLM technology, it appears they’ll be using some version of Gemini. There are many unanswered questions here.

For instance, is the Gemini-based model going to be both local and in Apple’s Private Cloud Compute space? I like the idea of a Gemini-caliber PCC engine I can access privately from my Apple devices.

The original sin here isn’t Apple’s failure to build its own LLM over the last few years, but rather its lack of attention toward Siri throughout its lifespan. Anybody who cares about this stuff has experienced frustration as Siri never evolved the way it should have. Indeed, Siri has been the butt of popular culture jokes for at least a decade.

I’ll always believe that if Apple made Siri a priority ten years ago, they wouldn’t be in this position today.

Nevertheless, all of this feels more like a black eye than a knockout. There is still time for Apple to evolve, and we’re all still figuring out what LLM-based AI means for all of us.

If I were going to look for a silver lining in this, at least Apple had the maturity to swallow their pride and go get help.

But let’s not forget this is all still just news. I still haven’t seen Siri work the way I’d have expected in 2026.

The Hypothetical Siri Brain Transplant

We’ve been hearing for a few weeks now that Apple is considering using a third-party LLM for Siri, particularly Claude. I understand this is not usually the Apple way of solving a problem like this, but at least for the short term, it seems like the right move.

I have no doubt that Apple will eventually have its own perfectly acceptable model to drive Siri, but at this point we just need Siri to work flawlessly. And if using someone else’s model for a year or two is how we get that, then I say get out the checkbook.

Perplexity Moves into Mobile

News broke this week that Perplexity AI is in talks to integrate its assistant directly into Samsung and Motorola phones.

What’s especially interesting here is how this upends the usual Android assistant narrative. Google has been baking Gemini into Android with increasing urgency, and now we have Android hardware makers cutting deals with a third-party assistant — one that’s not Google. That’s a big deal. It says something about how fast the AI landscape is shifting.

And then there’s Apple.

Just imagine a future where Samsung phones have Perplexity, Google phones have Gemini, and Apple…still has Siri.

Yikes.

Now, I get it. This is very on-brand for Apple. They like to own the whole stack, and they’re not the type to go shopping for outside solutions. And to be fair, there have been increasing signals that Apple is pouring serious resources into AI lately. Hopefully that’s not just for show.

Part of me can’t help but wonder if one of the smartest moves Apple could have made was to simply buy Perplexity. But that ship may have already sailed.

It’s not just about having an assistant; it’s about having one that people want to use. The AI space is sprinting, and it feels as if Apple is still stretching on the sidelines.

Apple’s AI Woes

For years now I’ve been writing and talking about the trouble with Siri. This problem became even more acute for Apple with the rise of large language models (LLMs) and the world’s collective realization of just how useful a smart artificial intelligence can be.

Last June, it seemed as if Apple finally found religion about making Siri better. At WWDC 2024 they outlined the “Apple Intelligence” strategy that made a lot of sense. While I never expected Apple to build something on par with one of the frontier models, like ChatGPT, I continue to think they don’t need to. If Apple’s AI could remain private and access all my data, that alone would make it more useful than most artificial intelligence. Moreover, as the platform owner, a smart Siri could act as an AI traffic cop, sending more complex requests to the appropriate outside models.
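That traffic-cop idea can be sketched in a few lines. To be clear, everything here is hypothetical (the keyword list, the function name, the labels); nothing below is an Apple API. It’s only meant to show the shape of the routing decision: requests touching personal data stay on device, everything else goes out to a bigger model.

```python
# Toy sketch of an "AI traffic cop" router. All names are illustrative
# assumptions, not anything Apple has shipped or announced.

# Words that suggest the request involves personal, on-device context.
PERSONAL_KEYWORDS = {"my", "mom", "wife", "note", "email", "podcast"}

def route_request(prompt: str) -> str:
    """Return which engine should handle the prompt."""
    words = set(prompt.lower().split())
    if words & PERSONAL_KEYWORDS:
        return "on-device"      # personal context stays local and private
    return "external-llm"       # general/complex requests go to an outside model

print(route_request("Play the podcast my wife sent"))   # → on-device
print(route_request("Explain quantum entanglement"))    # → external-llm
```

A real router would obviously need genuine intent understanding rather than keyword matching, but the design point stands: only the platform owner can make this split, because only it can see the local data in the first place.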

So I think Apple has the right vision, but I’m starting to question their ability to execute on it. Apple has yet to release even a beta of the iOS 18 version with, as one Apple employee explained to me, the “Siri Brain Transplant.” Indeed, Apple recently announced that the advanced Siri features won’t ship in iOS 18 after all. So the brain transplant has been postponed.

Late last year, there was a rumor that Apple is working separately on an LLM-Siri for iOS 19 that will really show how good Siri can be. The fact that there is already a rumor of a new thing when we don’t yet have the improved old thing doesn’t inspire confidence.

It gets worse, though. Mark Gurman, a reliable source, now reports the new LLM Siri is also behind and its conversational features may not reach consumers until 2027. Ugh. If true, Apple’s failure to deliver on Siri is epic, on the level of the Apple Maps and MobileMe launches.

The current LLM leaders are evolving weekly. Can you imagine how good they are going to be by 2027? I honestly can’t.

If these rumors are true, Apple is in trouble. It’s not the 1995 Apple-will-they-go-out-of-business trouble, but it is trouble nonetheless. M.G. Siegler suggests that if Apple truly is this far behind, they should just default to ChatGPT until they can get their act together. That would be incredibly embarrassing for Apple, but this whole situation is exactly that. It looks like Apple’s AI initiative has a long way to go. Back in the day, when the MobileMe launch failed so miserably, people joked that Steve Jobs was walking through the hallways at Cupertino with a flamethrower strapped to his back, asking everyone he met, “Do you work on MobileMe?” When it comes to AI, I think Apple is approaching a flamethrower moment. John Gruber agrees.

Will Siri Ever Get Smart?

It is worth noting that in January 2025, 13 years after it was first released, Siri continues to be a joke. Every time I try to use Siri in front of a member of my family, there’s a 50–50 chance of success. When it fails, they laugh at it, and at me.

Recently, I was going to bed and told Siri to turn off the bedroom lights. Instead, it turned off all the lights downstairs on the rest of my family. (For bonus points, the bedroom lights stayed on.)

Paul Kafasis and John Gruber most recently documented Siri’s failings. I can’t help but feel we are at a crossroads. Apple made many promises in June about Siri improving with the arrival of Apple Intelligence, and we haven’t seen it in the public betas yet. I hope we get a peek with the next beta cycle, iOS 18.4.

I am not encouraged by the rumor that yet another Siri brain transplant is coming in the future. I find it hard to believe that Apple lacks the competency to substantially improve Siri, which makes me wonder when they’ll find the resolve to do so. I sure hope it’s this year.

It’s easy to quarterback from the outside, but if I were in charge, I’d start a “boil the ocean” project to fix Siri and I’d seriously consider rebranding it to something else so that normal folks (like my family) don’t immediately associate it with a joke.

Another Siri?

Mark Gurman’s got another AI/Siri report and it’s a doozy. According to the latest rumors, Apple is cooking up an LLM-powered Siri for iOS 19 and macOS 16.

The idea is that this would be yet another Siri reboot, but this time built on Apple’s own AI models. Think ChatGPT or Google’s Gemini but with that special Apple sauce (and privacy-focused access to your on-device data).

Here’s where I get a bit twitchy, though. Apple has been tight-lipped about the details of its AI strategy, and it’s starting to wear thin. If this massive LLM overhaul is really coming next year, what exactly are we getting with the current “Apple Intelligence” features that are supposed to land this year?

If, after all the WWDC and iPhone release hype, we get through all these betas only to find that Siri is still struggling with basic tasks, and Apple then says, “But wait until next year, we’ve got this whole new system that will finally fix everything!” If that happens, it will be just a little heartbreaking for me.

Siri Concerns

Last week Ryan Christoffel over at 9to5Mac quoted the latest Mark Gurman report about Apple developing an additional AI personality. Gurman reports that Apple is working on “[…] another human-like interface based on generative AI.” Like Ryan, I am confused by this.

Official Siri icon currently in use in 2024

For too long, Apple let Siri linger. It’s been the butt of jokes in tech circles for years. We’re told that this year will be different and Siri will truly get the brain transplant it deserves. But if so, why is Apple working on an entirely different human-like interface? Does this signal that the Siri update isn’t all it should be?

It’s too early for any of us to tell from the outside. There are some Siri updates in 18.1, but they are largely cosmetic. We’re still waiting for the other shoe to drop on Siri updates with later betas.

However, the idea that Apple is already working on the next thing before they fix the current shipping thing does make me a little nervous. I realize that at this point, we’re all just reading tea leaves, and I could be completely off the mark here, but I sincerely hope that the updates to Siri this year get all of the effort that Apple can muster.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platform: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there.

Official Apple Intelligence text and iPhone image from Apple’s website after the June 10, 2024 announcement

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service, but a group of intelligence-related features:

Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the Internet and in many apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was: is Apple going to ship something inferior, work with one of the big players, or not include an LLM as part of this initiative? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence and was underplayed in the keynote. Using the built-in Neural Engine on Apple silicon combined with Apple Intelligence, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

“Move this note into my Inactive Projects folder,” requiring access to Apple Notes. “Email this presentation to Zara,” requiring access to Keynote and Apple Mail. “Play the podcast that my wife sent the other day,” which requires access to data in the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.

“AI for the Rest of Us”

During the WWDC keynote, I heard the term “artificial intelligence” only once, at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach. I agree with the philosophy.

I’m convinced that Apple has considered AI in a way that makes sense to me and that I’d like to use it. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.

Apple’s challenge is the way Siri lingered so long. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up with it, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple Intelligence and Apple’s commitment to it. Yet, I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.