iPhone 12 Pro People Detection feature helps visually impaired users 'see' via LiDAR
Aside from the iPhone 12 Pro and iPhone 12 Pro Max, the 2020 iPad Pro is getting the People Detection feature as well
Now that the iPhone 12 lineup has been fully unveiled, potential buyers might want to know what the flagship smartphones bring to the table. Based on Apple's "Hi, Speed" presentation, the Pro models pack the most features iOS users will love. They run on the latest A14 Bionic chipset, which surpasses the performance of Qualcomm's Snapdragon 865 silicon. And now that the handsets ship with a LiDAR sensor, the newly added People Detection feature reportedly helps blind and low-vision users scan for others around them.
The LiDAR scanner first made its debut on the 2020 iPad Pro models and was mainly used for improved imaging capabilities. Other applications include augmented reality (AR) and taking measurements without traditional tools. Although its accuracy has been called into question in the past, improvements have been made to expand its functionality. It appears Apple now plans to use it to help people diagnosed with low vision or deemed legally blind by their healthcare provider.
With the help of advanced software algorithms and the LiDAR (Light Detection and Ranging) module, Apple seeks to introduce a new accessibility option for its supported devices. Aside from the iPhone 12 Pro and iPhone 12 Pro Max, the 2020 iPad Pro is getting the People Detection feature as well. CNET notes that it helps users with vision problems become aware of individuals in their immediate vicinity.
In theory, it should allow them to navigate around a crowded area, find seats on public transportation, and even follow social distancing guidelines with the help of their smartphones. Under its settings, visually impaired users can choose from four types of feedback: an audio readout, haptics, a visual readout, and a configurable threshold distance.
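To make those options more concrete, the sketch below shows one hypothetical way an app could model the feedback modes and threshold distance using Apple's standard speech and haptics APIs. The type and function names (FeedbackMode, PeopleDetectionSettings, deliverFeedback) are invented for illustration and are not Apple's implementation.

```swift
import UIKit
import AVFoundation

// Illustrative only: a minimal model of the feedback options described above.
enum FeedbackMode {
    case audioReadout   // spoken distance
    case haptics        // vibration pulses
    case visualReadout  // on-screen distance (not shown in this sketch)
}

struct PeopleDetectionSettings {
    var modes: Set<FeedbackMode> = [.audioReadout, .haptics]
    var thresholdMeters: Double = 2.0   // alert when a person is closer than this
}

let speech = AVSpeechSynthesizer()
let haptics = UIImpactFeedbackGenerator(style: .heavy)

/// Report the distance to the nearest detected person using the chosen modes.
func deliverFeedback(distanceMeters: Double, settings: PeopleDetectionSettings) {
    if settings.modes.contains(.audioReadout) {
        let utterance = AVSpeechUtterance(string: String(format: "%.1f meters", distanceMeters))
        speech.speak(utterance)
    }
    if settings.modes.contains(.haptics), distanceMeters < settings.thresholdMeters {
        // A real implementation might pulse faster as the person gets closer.
        haptics.impactOccurred()
    }
}
```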
Depending on their preference, the iPhone 12 Pro/Pro Max's LiDAR will scan the area up to 5 metres away and will also single out the person closest to the device's user. For now, the feature is still in beta and is available for developers only. It should be part of the iOS 14.2 update, which could go live as early as next week at Apple's "One More Thing" event. Depending on how well it works in actual field tests, it's likely other manufacturers will soon copy the feature for their compatible smartphones.
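Apple has not published how People Detection is built, but the public ARKit and Vision frameworks on LiDAR-equipped devices expose the raw ingredients. The following Swift sketch, with an invented class name and callback, shows one plausible way to estimate the distance to the nearest person by combining the LiDAR scene-depth map with Vision's human-rectangle detection; it is an assumption-laden example, not the Magnifier app's actual code.

```swift
import ARKit
import Vision

/// Hypothetical sketch: estimate the distance to the nearest person by sampling
/// the LiDAR depth map at the center of each detected human bounding box.
final class NearestPersonEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()
    var onDistance: ((Float) -> Void)?   // called with the nearest distance in meters

    func start() {
        // LiDAR scene depth is only supported on iPhone 12 Pro / 2020 iPad Pro class hardware.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Find people in the camera image with Vision.
        let request = VNDetectHumanRectanglesRequest()
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
        guard let people = request.results as? [VNHumanObservation], !people.isEmpty else { return }

        // Sample the LiDAR depth at the center of each bounding box and keep the closest.
        let distances = people.compactMap { depthAt(center: $0.boundingBox, in: depthMap) }
        if let nearest = distances.min() {
            onDistance?(nearest)   // could feed the feedback sketch shown earlier
        }
    }

    /// Reads one Float32 depth sample (in meters) at the center of a normalized
    /// Vision bounding box. Vision uses a bottom-left origin, so the y axis is flipped.
    private func depthAt(center box: CGRect, in depthMap: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let x = Int(box.midX * CGFloat(width))
        let y = Int((1.0 - box.midY) * CGFloat(height))
        guard x >= 0, x < width, y >= 0, y < height,
              let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let depth = base.advanced(by: y * rowBytes + x * MemoryLayout<Float32>.stride)
            .assumingMemoryBound(to: Float32.self).pointee
        return depth.isFinite && depth > 0 ? depth : nil
    }
}
```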