Apple has released an iOS 14.2 update that adds a new LiDAR-based People Detection feature to the iPhone 12 Pro. The feature is aimed at those with visual impairments, but it may also prove useful in the pandemic era, when maintaining physical distance is important for health. Because it relies on LiDAR, it is only available on iPhone models with the scanner hardware built in.

LiDAR is an acronym for Light Detection And Ranging, borrowing the pattern from the similar technology radar. Whereas radar uses radio waves to measure distance to an object, LiDAR uses light or lasers. The light is at a wavelength that can’t be seen and, most importantly, operates at an intensity that is safe for the human eye. The 2020 12.9-inch and 11-inch iPad Pro were the first Apple devices to gain the hardware necessary for LiDAR scanning, but it was mostly used by developers. Now that the iPhone 12 Pro and Pro Max are equipped with LiDAR, the technology can be expected to see much wider use.

Related: iPad Pro 2020 LiDAR Scanner vs 2018: What Does Depth Sensing Add?

Apple’s latest update, iOS 14.2, adds the ability to detect people and measure how close they are. LiDAR’s range is limited to five meters, which is a little over sixteen feet, so judging the 6-foot distance recommended during the pandemic should be quite easy. While LiDAR itself can operate in total darkness, this object-detection feature uses machine learning to distinguish humans from trees or bookcases and needs input from the camera to work best, so security uses that depend on operating in the dark are not currently practical. The feature lives in the Magnifier, though Apple has not yet updated its support article to detail it.

How To Use LiDAR People Detection


In order to use the People Detection feature, an Apple device that includes a LiDAR scanner is necessary. This means the 2020 12.9-inch or 11-inch iPad Pro, or the iPhone 12 Pro and Pro Max. In the future, ARKit may be able to provide similar information using standard AR techniques, though the accuracy is unlikely to ever match what the distance-measuring capability of LiDAR can achieve. Those who own or have access to such a device can try the feature by opening the Magnifier. TechCrunch's Matthew Panzarino previously demonstrated how the feature works using the beta iOS version.
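For developers curious how something like this could be approximated in an app, a rough sketch might pair ARKit's LiDAR scene-depth data with Vision's human detector. This is not Apple's Magnifier implementation; the class name and structure below are illustrative assumptions, and it requires a LiDAR-equipped device to run.

```swift
import ARKit
import Vision

// Illustrative sketch: estimate how far away a detected person is by
// combining ARKit's LiDAR depth map with Vision's human detection.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth requires LiDAR hardware (iPhone 12 Pro / 2020 iPad Pro).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let request = VNDetectHumanRectanglesRequest { req, _ in
            guard let person = (req.results as? [VNHumanObservation])?.first else { return }
            // Sample the depth map at the center of the detected person.
            // (Vision's normalized coordinates have a bottom-left origin;
            // the y-flip is omitted here for brevity.)
            let center = CGPoint(x: person.boundingBox.midX, y: person.boundingBox.midY)
            let meters = Self.depth(at: center, in: depthMap)
            print(String(format: "Person is roughly %.1f m away", meters))
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage).perform([request])
    }

    // Read one Float32 depth value (in meters) from the LiDAR depth buffer.
    static func depth(at point: CGPoint, in depthMap: CVPixelBuffer) -> Float {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let x = min(Int(point.x * CGFloat(width)), width - 1)
        let y = min(Int(point.y * CGFloat(height)), height - 1)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let base = CVPixelBufferGetBaseAddress(depthMap)!
        return base.advanced(by: y * rowBytes + x * MemoryLayout<Float32>.stride)
            .assumingMemoryBound(to: Float32.self).pointee
    }
}
```

Apple's own feature presumably does considerably more, including the audio, haptic, and speech feedback Magnifier provides, but the depth-plus-detection pairing above reflects the general approach the article describes.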

Magnifier is available in Control Center, which is opened by swiping down from the top-right corner of the screen, but it is not there by default; it can be added in the Settings app under the Control Center options. In iOS 14, Magnifier is also available as an app, though this requires enabling it within the Accessibility settings. Once enabled, the Magnifier app can be found in the App Library and can be added to the Home Screen if desired. Still, Control Center is the easiest way to try out the iPhone 12 Pro’s new People Detection feature.

Next: iPhone 12 Camera Features & Upgrades: LiDAR, Dolby Vision, & More Explained

Source: Apple