For something that was supposed to be a virtual no-show at WWDC 2025, Apple Intelligence wound up having a fairly big presence during the keynote with several new features announced for Apple’s various platforms.
We’re getting Live Translation in iOS 26 across a number of apps, improved Visual Intelligence that can now read your screen, Call Screening and Hold Assist in the Phone app, and an AI-supercharged Shortcuts app.

But there’s something big still missing: the new Siri. Yes, Apple continues to work on promised features like understanding your personal context, on-screen awareness and in-app actions. And we have confirmation that “in the coming year” means 2026 — in other words, after iOS 26 launches this fall.
Along with Lance Ulanoff from TechRadar, I sat down with Craig Federighi, Apple’s senior vice president of software engineering, and Greg Joswiak, the senior vice president of worldwide marketing, to get a clearer picture of Siri’s future. We also discussed Apple’s overall approach to AI and how it’s fundamentally different from those of OpenAI and Google Gemini.
Check out the full video interview above, where we dig into the new Liquid Glass design, iPadOS taking on Mac features and much more.
Seriously, what’s up with Siri?
Apple did deliver a new Siri with iOS 18, with a number of enhancements, including a more conversational experience, the ability to maintain context and Type to Siri. But some of the most exciting promised features have been delayed. The question is: why?
We found that the limitations of the V1 architecture weren’t getting us to the quality level that we knew our customers needed and expected … if we tried to push that out in the state it was going to be in, it would not meet our customer expectations or Apple standards, and we had to move to the V2 architecture.
— Craig Federighi, Apple
“We found that when we were developing this feature that we had, really, two phases, two versions of the ultimate architecture that we were going to create,” said Federighi. “Version one we had working here at the time that we were getting close to the conference, and had, at the time, high confidence that we could deliver it.
“We thought by December, and if not, we figured by spring, until we announced it as part of WWDC. Because we knew the world wanted a really complete picture of, ‘What’s Apple thinking about the implications of Apple Intelligence and where is it going?'”
As it turns out, Apple was simultaneously working on two versions of the underlying Siri architecture. V1 was used to build the initial Siri demos, but V2 was needed to deliver a complete solution to customers.
“We set about for months, making it work better and better across more app intents, better and better for doing search,” said Federighi. “But fundamentally, we found that the limitations of the V1 architecture weren’t getting us to the quality level that we knew our customers needed and expected.
“We realized that, with the V1 architecture, we could push and push and put in more time, but if we tried to push that out in the state it was going to be in, it would not meet our customer expectations or Apple standards, and that we had to move to the V2 architecture.
“As soon as we realized that, and that was during the spring, we let the world know that we weren’t going to be able to put that out, and we were going to keep working on really shifting to the new architecture and releasing something.”
So what’s the timetable now? It’s not clear, and Apple won’t announce a date until the updated Siri is fully baked.
“We will announce the date when we’re ready to seed it, and you’re all ready to be able to experience it,” said Federighi.
What about live voice assistants for life advice — and therapy?
Millions of people now chat with ChatGPT’s voice mode and Gemini Live every day, whether it’s to get answers to everyday questions, help with DIY projects or even life advice.
OpenAI’s Sam Altman has said that people regularly use ChatGPT to make life decisions, because the chatbot has the full context of their lives and everything they’ve discussed with it. Many are even starting to use these voice chatbots as stand-in therapists.
Federighi isn’t down on the concept, but it doesn’t sound like Siri will be your next life coach anytime soon.
“As a therapist, it’s a reasonable thing to do,” said Federighi. “I know a lot of people find it to be a real powerful way to gather their thoughts, you know, brainstorm, do all kinds of things. And so, sure, these are great things, but are they the most important thing for Apple to develop well?
“You know, time will tell where we go there, but that’s not the main thing we’ve set out to do at this time.”
Apple’s AI difference: Why it’s not building a chatbot
For Apple, the main message with its AI strategy is that it doesn’t want to build a chatbot. Instead, it wants to “meet people where they are” with AI.
That means delivering Apple Intelligence features inside apps that are designed to make your life easier or more fun, such as with the new Call Screening and Hold Assist features in the Phone app and Live Translation in Messages, Phone and FaceTime.
So, for example, if you’re in the Messages app and someone sends you a message in a language you don’t speak, Live Translation will ask if you want it to start translating for you.
“It’s integrated, so it’s there within reach whenever you need it, in the way you need it, with it being contextually relevant and having access to the tools necessary to accomplish what you want to accomplish at that moment,” said Federighi.
“Apple’s job is to figure out the right experiences that make sense in the context of what we offer to customers and to make that technology,” said Joswiak. “The features that you’re seeing in Apple Intelligence isn’t a destination for us. There’s no app on intelligence. [It’s about] making all the things you do every day better.”
What’s next for Apple Intelligence
For now, Apple seems focused on delivering AI features that will make an impact as part of its new suite of software rolling out this fall, including iOS 26, iPadOS 26 and macOS Tahoe 26, and even the Apple Watch with watchOS 26 and the new Workout Buddy feature.
The features that you’re seeing in Apple Intelligence isn’t a destination for us. There’s no app on intelligence. [It’s about] making all the things you do every day better.
— Greg Joswiak, Apple
And Apple is also opening up its large language models to third-party developers, so they can tap into Apple Intelligence’s power on device.
A good example of Apple’s AI evolution is Visual Intelligence. The upgrade coming with iOS 26 will let you identify an object on your screen and then instantly buy it on Etsy, for example.
“And it’s not limited to just that. We let developers adopt an API, the intents API, to plug into that experience. So if you have, I don’t know, an app for collecting wine and you want to look something up in your wine collector app, you can easily add it to your collection,” said Federighi.
Bottom line
Apple Intelligence is still very much a work in progress, but Apple seems focused on delivering the Siri that was promised along with a wide range of AI-powered features that make its ecosystem stickier — even as the competition heats up.
“In the end, people buy products, right? They buy experiences,” said Joswiak. “We’re very proud of the fact that across each of our hero product categories, we’re number one in customer satisfaction, right?
“There’s a reason for that, and we’re trying to make those product experiences better and better, and make those products better and better. And that’s what customers care about.”