Here’s why Apple’s 'Portrait Mode' feature only works on some iPhones and not others (AAPL)
When Apple introduced the iPhone 7 Plus last year, it included a new camera feature that quickly became one of the most talked about, and most copied, in Apple's lineup: Portrait Mode.
Portrait Mode uses the phone's dual cameras and Apple's software to mimic the look of a DSLR photo: the subject stays in sharp focus while the background is softly blurred.
The feature originally launched in beta as an exclusive for the iPhone 7 Plus. But now, a year later, Portrait Mode is live on the new iPhone 8 Plus (and will be available on the new iPhone X launching this fall).
Here's how Portrait Mode works, how you can use it, and why it's only available on some iPhones and not others:
Portrait Mode is only available on Apple's recent dual-camera iPhones, the iPhone 7 Plus, iPhone 8 Plus, and the upcoming iPhone X, for a simple reason: Apple's version of Portrait Mode requires two rear cameras.
Soon after Apple introduced Portrait Mode in 2016, similar features started popping up on other flagship phones, including Samsung's Galaxy Note 8 (where it's called Live Focus) and the Google Pixel (Lens Blur).
In the case of the Pixel, which has only one rear camera, Google relies on software to achieve a similar effect. Apple's iPhones require two lenses to make it happen, at least for now. So if you buy the new iPhone 8, for instance, it will not be able to take Portrait Mode photos.
Apple's Portrait Mode requires two lenses because each lens is different: One is a 12-megapixel wide-angle lens, while the other is a 12-megapixel telephoto lens.
When taking a Portrait Mode photo, the two lenses serve different purposes.
The telephoto lens is what actually captures the image. While it's doing that, the wide-angle lens gathers data about how far away everything in the scene is, which the phone then uses to build a nine-layer depth map.
That depth map is crucial to the end result, because it helps Apple's image signal processor figure out what should stay sharp and what should be blurred.
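Apple hasn't published the internals of its depth pipeline, but the general idea of a layered depth map is easy to sketch. The Python snippet below is only an illustration (the `depth_to_layers` helper and the toy depth array are made up, not anything from Apple's software): it buckets a per-pixel depth estimate, such as one derived from the disparity between two camera views, into nine layers from nearest to farthest.

```python
import numpy as np

def depth_to_layers(depth_map: np.ndarray, num_layers: int = 9) -> np.ndarray:
    """Bucket a per-pixel depth estimate into a fixed number of layers.

    `depth_map` holds relative distances (smaller = closer to the camera).
    Returns an integer array of the same shape with values 0..num_layers-1,
    where layer 0 is nearest the camera.
    """
    near, far = depth_map.min(), depth_map.max()
    # Normalize to [0, 1], then split the range into equal-width bins.
    normalized = (depth_map - near) / max(far - near, 1e-9)
    layers = np.floor(normalized * num_layers).astype(int)
    return np.clip(layers, 0, num_layers - 1)

# Toy example: a 4x4 "depth map" with the subject (small values) in the center.
toy_depth = np.array([
    [9.0, 8.0, 8.0, 9.0],
    [7.0, 1.0, 1.0, 7.0],
    [7.0, 1.0, 1.0, 7.0],
    [9.0, 8.0, 8.0, 9.0],
])
print(depth_to_layers(toy_depth))  # 0s for the subject, higher numbers farther out
```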
The image above demonstrates what a standard iPhone photo looks like (left) and what a Portrait Mode photo looks like (right). At a quick glance, the image on the right simply looks like it has a uniformly blurry background, but this is where the depth map comes into play.
To make the photo look natural and as close to a real DSLR photo as possible, Apple's image processor goes through the layers one by one and blurs each by a different amount, producing the out-of-focus effect known as "bokeh."
The layers closest to the subject stay slightly sharper than the layers farthest away, and if you look closely at the above photo of my colleague Melia Robinson, you can tell: the stuff that's close to her in the photo, like the long grass and the slab of wood on the ground, is a lot easier to make out than the cliff in the distance, which is just a dark, blurry form.
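As a rough illustration of that layer-by-layer blurring (and not Apple's actual algorithm), the sketch below takes a grayscale image plus the layer map from the previous snippet, keeps layer 0 sharp, and applies progressively stronger blur to farther layers before compositing the result. SciPy's `gaussian_filter` stands in for whatever blur kernel the image signal processor really uses.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image: np.ndarray, layers: np.ndarray, max_sigma: float = 8.0) -> np.ndarray:
    """Blur each depth layer by a different amount and composite the pieces.

    `image` is a 2-D grayscale float array, `layers` an integer array of the
    same shape (0 = nearest the camera), e.g. the output of depth_to_layers.
    """
    num_layers = int(layers.max()) + 1
    result = np.zeros_like(image, dtype=float)
    for layer in range(num_layers):
        # Blur strength grows with distance from the subject layer.
        sigma = max_sigma * layer / max(num_layers - 1, 1)
        blurred = image if sigma == 0 else gaussian_filter(image, sigma=sigma)
        mask = layers == layer
        result[mask] = blurred[mask]
    return result
```

A production pipeline would also blend the boundaries between layers rather than cutting them hard, which is part of why this is only a sketch.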
See the rest of the story at Business Insider
Contributor: Tech Insider http://ift.tt/2zaxnH2