Sweet vindication at last.
When Apple announced the iPhone 16E yesterday, it also confirmed that the new budget phone will get Apple Intelligence’s “Visual Intelligence” feature, marking the first time the AI trick will come to a phone without a “Camera Control” button. While the other iPhone 16 series phones use their Camera Control buttons to access Visual Intelligence, the iPhone 16E can instead map it to its Action Button, a simple change that raises the question: why not the iPhone 15 Pro, too?
Personally, as an iPhone 15 Pro owner, I’ve been asking that question for months now. I’ve long suspected my phone’s internals were capable of it—it can run every other Apple Intelligence feature without issue. Instead, it seemed like Apple was arbitrarily holding the feature back because it wanted to tie it to a specific button press I didn’t have. Well, with the iPhone 16E adopting the Action Button workaround, it seems like Apple’s finally listening. Apple representatives have now confirmed that Visual Intelligence will be coming to the iPhone 15 Pro as well, using the same strategy.
Speaking to Daring Fireball’s John Gruber, an Apple spokesperson said that the iPhone 15 Pro will indeed get Visual Intelligence “in a future software update,” and that users will be able to map it to the Action Button. Sweet vindication.
There’s no word on when exactly that software update will come, and to be honest, I’m not sure if I’ll use Visual Intelligence much, but it’s encouraging to see my phone’s software not get held back by an arbitrary push for hardware cohesion anymore.
For the uninitiated, Visual Intelligence brings AI to your iPhone’s camera. You can point your camera at a foreign-language menu, for instance, to get a translation; point it at a book to get a summary of what’s on the page; or point it at a dog to try to find out what breed it is. It can also surface information about businesses simply by looking at their storefronts or signage (in the United States only), and it works with Google and ChatGPT for extended search queries. In other words, it’s similar to Google Lens, but puts AI first and is built into your operating system. Again, I’ve been prevented from playing around with it much, but hey, at least I’ll now have the option.
Michelle Ehrhardt
Associate Tech Editor
Michelle Ehrhardt is Lifehacker’s Associate Tech Editor. She has been writing about tech and pop culture since 2014 and has edited for outlets including Gizmodo and Tom’s Hardware.