ChatGPT Would Understand
A simple framework to measure whether Siri actually gets better in 2026.
Myke Hurley, on the latest episode of Upgrade:
I use the app Due a lot and I’m a big fan of it. And it was reminding me about something. I think it was to wash [the baby’s] bottles or whatever before I left for the office.
And I wanted to be able to just say to my phone, “snooze that notification for 30 minutes,” which is just not a thing that I can do right now. But my computer should understand what I want. It’s not a complicated thing to ask.
And I feel like I was thinking about it. Like, I feel like if I asked ChatGPT that, it would understand it and would do it, right? Like, if I had some kind of task or reminder in ChatGPT, and I was like, tell me about this in 30 minutes, I feel like it would understand it well enough to be able to execute on that command, right?
Yeah.
My iPhone can’t do that.
This reminded me of a conversation I had with Siri this morning:
Me: Siri, add paper towels to my Costco order list.
Siri: You don’t have a Costco order list. Would you like me to create one?
Me: (sigh) Siri, add paper towels to my Costco list.
Siri: Okay, I’ve added paper towels to your Costco - Bulk Shopping list.
Siri should have understood what I meant the first time. But because I included the word “order” in my initial request, Siri completely botched it.
This is exactly what Myke is getting at. ChatGPT would have understood what I meant the first time.
In the age of LLMs, we’ve grown accustomed to AI systems almost always understanding our intent. A chatbot is now as good as, if not better than, a human intern, precisely because it can abstract meaning and intent the way a real human can.
If my intern responded to me as literally as Siri still does, I would fire them.
But Siri can’t. And that’s what’s most frustrating about the current state of Apple’s AI: other systems can do this. Siri still can’t.
Jason Snell, later in the same episode:
We just want [Siri] to be good, or at least better. My prediction that I made on a podcast recently was, I think by the end of this year, Siri will be better, but we will still be dissatisfied with it.
Very typical Jason kind of prediction, which is like, it will be better and you’ll still be unsatisfied, which I think is probably the most likely scenario, right? They will not not try to make it better.
But they will also probably not totally succeed where everybody’s like, yeah, “Siri’s the best now.” That seems very unlikely just because of human beings, not just because of Apple.
This feels like the most we can reasonably expect from Siri in 2026. It will likely get better than it currently is, but it will not be the best. ChatGPT will still interpret meaning and intent better than Siri.
But what does “better” actually mean? I think Myke is on to a good framework for measuring this: if we ask Siri to do something in 2026 and it can’t, but ChatGPT would have understood, that is not better.
If Siri can’t at least understand what ChatGPT can, then it will not have gotten better. And even that is a low bar, because ChatGPT not only understands our intent, it can often act on it.
I’m not even asking Siri to go the extra mile in 2026. I just don’t want to have to say, “ChatGPT would have understood.”
Will Siri get there in 2026? And what does it say about Apple if it can’t?

