Top iPhone Engineer Sheds Light On iPhone 17's Revamped Selfie Camera

Apple's iPhone 17 lineup is arguably the strongest we've seen in years, thanks in large part to several significant upgrades to its camera system. While some of the more eye-catching camera improvements are exclusive to the iPhone 17 Pro, the entry-level iPhone 17 also includes a number of compelling enhancements. First and foremost, the iPhone 17 features an 18-megapixel Center Stage front camera that can capture landscape selfies even while the phone is held vertically. What's more, in a testament to Apple's attention to detail, the camera will automatically widen its field of view when it detects additional people entering the frame.

There are several other notable iPhone 17 camera improvements, but first, it's worth highlighting a recent interview with Jon McCormack, the head of camera software at Apple. Speaking to BusinessWorld, McCormack shed some light on how the Center Stage feature came to be, along with Apple's thinking behind its design. Naturally, Apple's obsession with the user experience helped shape the feature's overall design and implementation.

Behind the iPhone 17 selfie camera

Interestingly, McCormack pushes back on the notion that Apple simply rolls out new camera features with no rhyme or reason. On the contrary, the company studies how iPhone owners actually use their devices, then looks for ways to improve the user experience and address any limitations.

This methodology, McCormack says, is how work on the Center Stage camera began. Apple observed that iPhone users would routinely employ a range of techniques to take good group selfies. Some of these hacks involved selfie sticks, switching to the 0.5x ultra-wide camera, and, in a scenario I've personally seen play out dozens of times, simply handing the phone to whoever in the group has the longest arms.

"What's going on here," McCormack said, "is that our users are trying to make the camera work for them, but we knew that we could do better... what if the camera could just understand what you're trying to capture and then make those adjustments for you?"

Of course, to get the camera to make such adjustments on the fly, the entire camera system had to be rethought. The end result was a brand-new square sensor. That, however, was only step one. Apple also had to account for additional factors such as memory bandwidth, heat dissipation, and wide-angle distortion, an effect that often rears its ugly head when group photos are taken with the ultra-wide camera. Apple's Megan Nash adds that, to sidestep these pitfalls, the camera and sensor were designed together.

"With the new Center Stage camera, we grew the sensor to almost double the size of the previous sensor to match pixel-for-pixel sharpness," Nash explains. "The result is a wider field of view that fits more people or background in the frame, along with excellent image quality—really the best of both worlds."

All told, getting the system to work just right required months of testing. Apple tested extensively to ensure the feature behaved as intended and wouldn't, for example, try to include strangers in the background when framing a selfie among a group of friends.
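For developers curious how this kind of automatic framing is surfaced programmatically, AVFoundation has exposed Center Stage controls since iOS 14.5. The Swift sketch below is purely illustrative (the function name is hypothetical), and whether the iPhone 17's new front camera relies on these same properties under the hood is an assumption, not something Apple has confirmed.

import AVFoundation

// A minimal sketch assuming the standard AVFoundation Center Stage controls
// (iOS 14.5 and later); hypothetical helper, not Apple's implementation.
func enableCenterStageIfAvailable() {
    // Hand control of the toggle to the app rather than the user in Control Center.
    AVCaptureDevice.centerStageControlMode = .app
    AVCaptureDevice.isCenterStageEnabled = true

    guard let frontCamera = AVCaptureDevice.default(
        .builtInWideAngleCamera, for: .video, position: .front
    ) else { return }

    // isCenterStageActive reports whether automatic framing is currently
    // being applied on the device's active format.
    print("Center Stage active: \(frontCamera.isCenterStageActive)")
}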

Other notable iPhone 17 camera features

Beyond Center Stage, the iPhone 17 camera has a few other notable features. For example, the dual capture video feature lets users simultaneously record video from both the front and rear cameras. It's great for capturing a parent's reaction to their kids playing, or a fan's reaction at a live sporting event. While iOS apps have offered this functionality before, Apple's implementation is naturally more streamlined. Another welcome iPhone 17 camera upgrade is improved stabilization, which smooths out what would otherwise be shaky video footage.
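For context on how those earlier third-party apps pulled off simultaneous recording, AVFoundation has offered AVCaptureMultiCamSession since iOS 13. The Swift sketch below is a bare-bones illustration of wiring the front and rear cameras into one session; it is not Apple's own dual capture implementation, and the function name is hypothetical.

import AVFoundation

// A minimal sketch of simultaneous front + rear capture using
// AVCaptureMultiCamSession; hypothetical helper, error handling omitted.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    for position in [AVCaptureDevice.Position.front, .back] {
        guard let camera = AVCaptureDevice.default(
                  .builtInWideAngleCamera, for: .video, position: position),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return nil }

        // Add inputs/outputs without implicit connections, then wire each
        // camera to its own video data output explicitly.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutputWithNoConnections(output)

        guard let videoPort = input.ports(for: .video,
                                          sourceDeviceType: camera.deviceType,
                                          sourceDevicePosition: camera.position).first
        else { return nil }

        let connection = AVCaptureConnection(inputPorts: [videoPort], output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
    }
    return session
}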

One improvement to the iPhone 17 camera system that isn't often mentioned involves Apple's use of microphones to better capture relevant audio when shooting video. To this point, McCormack explains that Apple uses advanced software to determine the primary source of audio. From there, the device uses "machine learning to pull it apart into separate tracks, which we can then remix" to deliver better audio without distracting background noise.

The entire interview with McCormack is well worth a read and provides a lot of information about the technical underpinnings of Apple's latest camera features. As a final point, it's worth mentioning that iPhone 17 demand is not only stronger than anticipated, but stronger than we've seen in years. Apple hasn't commented on iPhone sales yet, but we'll have a better idea of how well the iPhone 17 is selling when the company releases its earnings report for the September quarter on October 30.
