What is Apple’s Door Detection feature, how does it work, and how do you use it?

Apple has unveiled a set of new software features that, combined with the hardware capabilities of some of its high-end devices, will help users with certain physical disabilities.

These features include Door Detection on the iPhone and iPad, Live Captions on iPhone, iPad, and Mac, and more. Apple said they will be available later this year through software updates on its devices.

The iPhone has long offered one of the strongest accessibility feature sets in the industry, especially for people with disabilities. Year after year, the tech giant has paired new software features with the hardware capabilities of its high-end devices to help users with certain physical disabilities.

People with little or no vision, in particular, have been at the center of these accessibility developments.

One such recent development is door detection, a new feature that informs people who are blind or have low vision of the attributes of a door and how to operate it.

What is door detection?

One of the biggest challenges people with low vision face in a new environment is navigating doors.


This feature can help blind or visually impaired users locate a door when they arrive at a new destination, understand how far they are from it, and learn the attributes of the door, including whether it is open or closed and, when closed, whether it can be opened by pushing, turning a knob, or pulling a handle.

It can even read signs and symbols on the door. All of this makes exploring an unfamiliar area much easier for a visually impaired person.

How does door detection work?

Apple’s Door Detection works using an array of cameras and sensors found in the latest generation of high-end iPhone models.

In particular, it uses the LiDAR (Light Detection and Ranging) sensor to measure how far an object, in this case a door, is from the user. It also uses the cameras, in conjunction with the LiDAR sensor and the phone’s on-device machine learning, to read and interpret a live scene.
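To give a concrete sense of the sensor access involved, here is a minimal Swift sketch, not Apple’s implementation, showing how a third-party app can read the LiDAR depth map through ARKit and sample the distance to whatever sits at the center of the frame. The DepthReader class and the center-pixel sampling are made up for illustration.

```swift
import ARKit

// Minimal sketch: read the LiDAR depth map via ARKit and sample the distance
// to whatever is at the center of the frame (for example, a detected door).
// This illustrates the kind of sensor access involved, not Apple's own code.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth) // requires a LiDAR-equipped device
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ARDepthData.depthMap is a CVPixelBuffer of per-pixel distances in meters.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

        // Sample the depth value at the center pixel of the frame.
        let rowPointer = base.advanced(by: (height / 2) * rowBytes)
        let distance = rowPointer.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Object at frame center is roughly %.2f m away", distance))
    }
}
```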

How do you use door detection?

Although the door detection feature will only become available after a major software update is released, the idea is that a visually impaired person will pull out their LiDAR-enabled iPhone and scan their immediate surroundings using an app, or the camera itself.

The device will then read the scene, analyze the various elements in it, calculate where they are and how far they are from the user, and then give audio cues guiding the user to the door.
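As a rough sketch of the “analyze, then guide with audio” step, the snippet below speaks a distance-and-state cue using AVSpeechSynthesizer. The announceDoor function and its parameters are hypothetical stand-ins for the output of Apple’s detection pipeline, which is far richer than this.

```swift
import AVFoundation

// Hypothetical sketch of the audio-cue step: once scene analysis has produced
// a distance and some door attributes, speak them aloud to the user.
let synthesizer = AVSpeechSynthesizer()

func announceDoor(distanceInMeters: Float, isOpen: Bool, opensBy: String) {
    let state = isOpen ? "open" : "closed"
    let message = String(
        format: "Door ahead, about %.1f meters away. It is %@. It opens by %@.",
        distanceInMeters, state, opensBy
    )
    synthesizer.speak(AVSpeechUtterance(string: message))
}

// Example cue, driven here by placeholder values rather than real scene analysis.
announceDoor(distanceInMeters: 2.4, isOpen: false, opensBy: "pulling a handle")
```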

If scanned correctly, it will also be able to tell users how to open the door, whether to push or pull it, and many other attributes that make dealing with the door much easier. Keep in mind that for this to work, the device needs a LiDAR sensor, which is only found in Apple’s recent high-end iPhone and iPad models.
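Because the feature depends on the LiDAR sensor, an app taking a similar approach would typically check for scene-depth support first. Here is a small sketch of that check, under the assumption that ARKit’s scene depth is the capability in question:

```swift
import ARKit

// Check whether this device can supply LiDAR scene depth at all.
// Only recent high-end (Pro) iPhone and iPad models pass this check.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    print("LiDAR scene depth is available; door-detection-style features can work here.")
} else {
    print("No LiDAR scene depth on this device.")
}
```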

Apple will also release a few other features aimed at improving accessibility. For example, it will add a slew of new features that help users with disabilities better control their Apple Watch using their iPhone, and vice versa.

It will also add live captions to its accessibility features, allowing people with hearing loss to follow audio content such as phone calls or FaceTime meetings using real-time captions.

All of these feature sets are currently being tested by Apple and will be available to general users after an upcoming major update.

NYXDevices is a new media outlet; do not hesitate to share our article on social networks to give us a solid boost. 🙂
