
Apple Introduces LiDAR Technology to iPhones

Apple has stepped up its game and gone full LiDAR! The technology is still very new to Apple; the iPhone 12 Pro and iPhone 12 Pro Max were the first products to give us a glimpse of it, and the newest iPad Pro has the same feature.

If you look closely at these iPhone or iPad models, you will see a small black dot near the camera, roughly the same size as the flash: that is the LiDAR sensor. It is a new depth-sensing technology that makes a huge difference when collecting data.

What is LiDAR?

For those of you wondering, “What does LiDAR even mean?” we have the answer for you. LiDAR stands for light detection and ranging. Basically, it uses lasers that ping off objects and return to the source, measuring distance by timing the travel, or flight, of each light pulse.
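The timing math behind this is straightforward: a pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and the sample timing are illustrative, not taken from Apple's hardware):

```python
# Time-of-flight ranging: the pulse goes out and comes back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to an object given the round-trip time of a light pulse."""
    return C * round_trip_s / 2

# A pulse returning after ~33.4 nanoseconds bounced off something ~5 m away.
print(tof_distance_m(33.356e-9))  # roughly 5.0
```

Note how short the timescales are: at room scale, round trips take tens of nanoseconds, which is why the sensor needs very fast timing electronics.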

Some smartphones measure depth with a single light pulse. Phones that have LiDAR technology are able to send waves of light pulses and measure each one with its sensor. This creates a field of points that can map out distances and dimensions of a space plus the objects within it.
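That “field of points” can be pictured as a grid of round-trip times, each converted to a distance. A hedged sketch, assuming a simple 2-D grid of pulse timings (the values are made up for illustration):

```python
C = 299_792_458  # speed of light in m/s

def depth_map(round_trip_times):
    """Turn a 2-D grid of round-trip pulse times (seconds) into a grid
    of distances (meters): one depth point per returned pulse."""
    return [[C * t / 2 for t in row] for row in round_trip_times]

# Hypothetical 2x2 grid of returns: closer surfaces reflect sooner.
times = [[6.7e-9, 6.7e-9],    # top of the scene, about 1 m away
         [13.3e-9, 13.3e-9]]  # bottom of the scene, about 2 m away
depths = depth_map(times)
```

A real sensor fires many more pulses, but the principle is the same: many per-pulse distances together describe the shape of the space and the objects in it.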

Better Camera & Augmented Reality

Apple promises better focus in low light, with autofocus up to six times faster in dim conditions, and the LiDAR depth sensing is also used to improve night portrait mode effects. LiDAR cameras on smartphones improve focus accuracy and speed, and the new iPhone 12 Pro definitely delivers on that.

A lot of Apple’s AR updates in iOS 14 take advantage of LiDAR to hide virtual objects behind real ones (called occlusion) and to place virtual objects within more complicated room mappings, like on a table or chair. LiDAR also lets these iPhone models open and start augmented reality apps more quickly.

LiDAR is spreading rapidly across multiple industries: self-driving cars, aerial drones, robotics, virtual reality, and more.

Apple is not the first company to explore this type of technology on cell phones either. Google had this same idea in mind when Project Tango — an early AR platform that was only on two phones — was created.

Tango's advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google’s Tango-equipped phones were short-lived, replaced by computer-vision algorithms that estimate depth from camera images without needing the same hardware. But Apple’s iPhone 12 Pro looks like a significantly more advanced successor, with LiDAR possibilities that extend into cars, AR headsets, and much more.