Apple to Bring 3D Laser Autofocus to iPhone Cameras, Report Says
Apple is reportedly adding a rear-facing 3D laser system to upcoming iPhones as part of the hardware improvements needed to fully utilize its recently announced augmented reality development kit (ARKit). In addition to depth sensing for augmented reality applications, the laser system will improve autofocus for iPhone photographers.
Since the iPhone 6, iPhones have used what Apple calls “Focus Pixels,” its term for phase detection autofocus (AF). Fast Company reports that this system will be replaced with laser autofocus, possibly as soon as the next iPhone, which is set to debut this fall.
Laser autofocus works by emitting low-intensity infrared beams at a subject and calculating the distance from the time it takes the light to travel to the subject and back. Compared to phase detection AF, this is especially useful in low-light situations. Apple would likely use both AF technologies together, as Google does in its Pixel line of phones.
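To illustrate the time-of-flight principle behind laser autofocus, here is a minimal sketch in Swift; the function name and the numbers are illustrative only and are not part of any Apple API.

import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Estimate subject distance from a time-of-flight measurement.
// The light travels to the subject and back, so the one-way
// distance is half the total path covered in the round trip.
func subjectDistance(roundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a round trip of roughly 6.67 nanoseconds corresponds
// to a subject about 1 meter away.
let distance = subjectDistance(roundTripTime: 6.67e-9)
print(String(format: "Estimated distance: %.2f m", distance))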
The technology would serve a dual purpose, also enabling better depth perception with the built-in camera for augmented reality apps. ARKit rolls out with iOS 11 this fall, so it would make sense to include the VCSEL laser system in the phone launching at the same time.
According to Fast Company’s source, the system comprises “a source (the VCSEL laser), a lens, detector (sensor), and a processor.” The whole combination would reportedly cost about $2 per phone.
Contributor: PetaPixel http://ift.tt/2uFLTab