It’s the beginning of iPhone speculation season!
We don’t know exactly when Apple plans to hold its fall iPhone event, but traditionally it’s happened in either late September or early October. So we have about eight weeks, realistically speaking, before we know the real details about what Apple’s new smartphone flagship is all about.
Armchair Apple watchers like myself can only speculate about what features are going to be in the new phone. However, we already have a bunch of clues based on numerous leaks that have originated from Cupertino’s manufacturing partners in Asia as well as some other notable news items from the company’s partners.
Many of the iPhone’s upgrades in the last two years have been iterative, especially since the introduction of the FaceID facial recognition technology that has been part of the line since the launch of the iPhone X; it’s now in every current-generation phone the company sells: the XR, the XS, and the XS Max.
The iPhone X was a radical departure from the iPhone 7 and iPhone 8: the device was virtually bezel-less, with no hardware button on the display area. However, the XR, XS, and XS Max were not substantially different from the iPhone X in terms of overall industrial design. Apple introduced a larger form factor with the XS Max, and the XR dropped 3D Touch and used a lower-quality display at a lower price point, along with the usual component upgrades in the A-series SoCs.
TouchID to return because FaceID stinks?
However, it appears that the FaceID technology has not been as successful as one might have expected. It has been rumored that TouchID, the fingerprint recognition technology that was initially introduced on the iPhone 5s and is now only available on the iPhone 7 and iPhone 8, may be making a comeback within the next year, albeit concealed underneath the screen (à la Huawei and OnePlus) rather than as a dedicated hardware button. Apple’s new version of TouchID purportedly works better than the Android in-display alternatives, as the sensor is supposed to function over a much larger screen area, so the finger doesn’t have to align precisely with the sensor.
One of the upshots of reintroducing TouchID under the screen is that it may also allow for a somewhat improved display. The rumors are that the already nearly invisible bezel of the iPhone XS will be nonexistent on the 5.8-inch version of the iPhone 11, like on the Samsung Galaxy S10 or the Huawei P30 Pro. The larger “Max” is also said to have a less prominent bezel. There’s no word on the elimination of the controversial and much-maligned “notch.”
More cameras! USB-C! Joy?
The other significant hardware upgrade? It appears that Apple is ready to throw in the towel on the proprietary Lightning connector and may be preparing for a move to USB-C charging and accessory connectivity, just as it has done with the iPad Pro. While I welcome the possibility of Apple moving to USB-C (and the reported 4,000 mAh battery on the iPhone 11, which would be the largest it has ever had), I am not sure someone with an iPhone 8 could justify moving to this device based on the data and charging connector alone.
The not-so-big news? It sounds like the next iPhone is getting a third camera. Also like Huawei. Yawn. More cameras means more better! It also means more ugly, if the renderings of the phones and the leaked case designs, with their big camera bulge on the back, hold water.
Quantum dots? Sorry.
The upgrade I was truly looking forward to is not going to make an appearance: the “quantum dot” sensor technology that Apple was reportedly licensing from Nanoco, a British technology firm, has been dropped from the iPhone 11.
Unlike CMOS-based camera sensor technology, which can only process about 25 percent of the light it is exposed to, a quantum dot sensor is much more sensitive, allowing for complete conversion of photons to electrons as they hit the sensor matrix, yielding a 100 percent fill factor and an increase in sensitivity by a factor of four. In English? It means you could build smartphone cameras with smaller lenses and a smaller overall sensor area but with sensitivity equivalent to large-format professional SLR and mirrorless cameras, which have physically much larger CMOS sensors.
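The back-of-the-envelope math behind that factor-of-four claim is straightforward. Here’s a quick sketch using the fill-factor figures above (illustrative numbers only, not Nanoco’s actual specifications):

```python
# Back-of-the-envelope comparison of sensor fill factors,
# using the figures from the article (illustrative, not official specs).

CMOS_FILL_FACTOR = 0.25         # ~25% of incident light converted by CMOS
QUANTUM_DOT_FILL_FACTOR = 1.00  # ~100% fill factor claimed for quantum dots

# Sensitivity gain from switching technologies at the same sensor size
gain = QUANTUM_DOT_FILL_FACTOR / CMOS_FILL_FACTOR
print(f"Sensitivity gain: {gain:.0f}x")  # prints "Sensitivity gain: 4x"

# Flip it around: a quantum dot sensor needs only a quarter of the area
# to gather the same usable light as a CMOS sensor.
area_ratio = CMOS_FILL_FACTOR / QUANTUM_DOT_FILL_FACTOR
print(f"Relative sensor area needed: {area_ratio:.2f}")  # prints "0.25"
```

That area ratio is why the same sensitivity could, in principle, fit behind a much smaller lens.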
However, that isn’t coming this year. Had it made it, it would have been an absolute no-brainer upgrade for many people from iPhone X or iPhone 8, including myself.
Could the iPhone 11 still move to higher pixel counts? It’s possible, especially if Apple goes to a third 12-megapixel sensor on the “Max” and uses software tricks to combine them. I’m not necessarily expecting a superior CMOS sensor like the one the Huawei P30 Pro uses, which has a primary 40-megapixel sensor with an f/1.6 lens, a second ultra-wide 20-megapixel unit, and a third telephoto 8-megapixel sensor for zooming. The iPhone 11 “Max” is said to have an ultra-wide configuration similar to the Huawei P30 Pro’s, but the CMOS capabilities of the camera sensors on the yet-to-be-released device are so far unknown.
Improved image processing? Nope!
What else isn’t coming? Well, we haven’t seen any indication that Apple has made significant investments in image processing technology like what Google has done for the Pixel 3 or what Huawei has done with its P30 Pro: on-device image processing with cloud assist, using a combination of dedicated hardware and software algorithms for features such as night and portrait modes.
While the iPhone’s conventional CMOS sensors and camera optics are as good as everyone else’s, Apple lags in the software enhancements that would put its cameras on par with those of its competitors’ devices.
I own both a Pixel 3 and an iPhone XS Max, and the quality of the close-up food photographs I take for the restaurant reviews I do for other outlets is vastly superior on the Pixel 3. The results of portrait mode on the iPhone are distorted and weird-looking, and taking photographs in dimly lit environments is a significant challenge without additional spot lighting. That’s not the case with the Pixel 3 or the Huawei P30 Pro, which have onboard image processing capabilities.
Cloud improvements? Nah.
Apple’s image processing deficiencies are an area that, quite frankly, needs to be addressed in the iOS software and back-end cloud infrastructure itself. Part of the problem is that Cupertino does not have the cloud-based infrastructure dedicated to image processing that Google does. Apple reportedly spends $30 million per month at Amazon, but it’s unknown to what extent it uses AWS for machine learning and other processing offload features.
That’s one of the reasons why I use Google Photos on the iPhone instead of the native Photos application. It’s a better experience, even though exposures taken with the iPhone aren’t given the same favored status in Google Photos that is granted to the Pixel 3 and other Android phones, which get full-resolution storage and manipulation of images in Google’s cloud from the device or your personal computer. Unlike the Pixel 3’s, iPhone images in Google Photos are stored at “High quality” (compressed) rather than “Original” (uncompressed) size.
So what does this all mean? Well, it looks like the iPhone 11 is shaping up to be one of the most boring upgrades in recent memory. All of these small, iterative improvements are nice, but they aren’t particularly compelling, especially if you have an XS or an XR. Moreover, while the 11 “Max” is going to be a powerhouse with a giant, beautiful screen — just like the current XS Max is — there appears to be nothing coming for those folks who still want a smaller phone, like the iPhone SE.
Am I going to upgrade? Yes, because I am on the Upgrade Program. It means I am just going to continue the same lease payment, probably for only a few dollars a month difference. However, it feels just like bringing your Lexus into the dealership after a two-year lease and being handed the keys to almost the same model of next year’s version of your current car with no substantive improvements in it — routine and unexciting.
Will you be upgrading to iPhone 11? Talk Back and Let Me Know.