This Essential Apple Maps Feature Got A Big Update With iOS 26
The iOS 26 release brought a slew of updates to Apple's mobile interface, and one app that received a nice bit of attention is Apple Maps. A new Visited Places feature lets you quickly pinpoint the restaurants, shops, and other venues you return to most often.
There's also a new feature called Preferred Routes that learns the roads you travel most often and can alert you to conditions along your daily drives. But one of the most significant changes to how users interact with Apple Maps is a natural language search function built directly into the app.
Powered by Apple Intelligence, natural language search makes your Apple Maps queries that much easier. You don't have to worry about digging into settings to turn this feature on, as it's baked right into Apple Maps.
Those who have updated to iOS 26 will see a pop-up when launching Maps that reads "Search The Way You Talk." This opens the door to more dynamic and conversational ways of searching, with prompts like "Where's the best Chinese food near me that's open late?" or "Find a café with free Wi-Fi on my way to work."
Natural language search makes Maps feel more human
Apple Intelligence is one of the standout forces behind the inner workings of iOS 26, and the improved natural language tool has been integrated into numerous first- and third-party apps. But in terms of how the enhancement directly affects Apple Maps, users will no longer be confined to typing stilted search queries, which was especially frustrating when trying to search for things while driving.
When combined with the Siri voice assistant, the latest version of Apple Maps should deliver a seamless user experience, from your initial search to the delivered results. And thanks to AI, you'll be able to continue your Maps searches with contextual follow-ups. For example, once Maps returns results for "Where's the best Chinese food near me that's open late," you can follow up with "Show me the fastest way there" or "Does it close in less than 10 minutes?"
Apple's Foundation Models framework is what makes this new age of natural language tools possible. Acting as the behind-the-scenes intelligence for conversational searches, Apple's AI can understand more than just the words you say; it also grasps the overarching intent behind them. This allows software like Apple Maps to deliver a more human experience when returning search results — so you'll feel more like you're conferring with a friend or family member than a smartphone.
A more comprehensive Apple Maps experience
Imagine a world where all the new features and refinements of iOS 26 Maps work in unison: You could be driving down the highway after a grueling day at the office and say, "Take me to my usual bar." Courtesy of the Visited Places feature, Apple Intelligence will know precisely which watering hole you're talking about, and Preferred Routes will ensure you get there as quickly as possible.
But let's say there's an accident somewhere along your route, and traffic starts building up. Thanks to natural language search, you'll be able to say, "Get me there using backroads," and the Maps app should generate a fresh set of directions without you having to tap or swipe.
Let's face it: We've all reached past our steering wheel to type a word or phrase into the Maps search field. Not only does this new feature make for a more cohesive and interactive Maps app for all iOS 26 users, but it also provides a safer driving experience. With Apple Intelligence behind the wheel of our iPhones, that's one less reason to take our eyes off the road.