My favourite Apple feature – Screen distance

Nidhi Dubey
3 min read · May 28, 2024


High screen time and its impact on eye health is one of the major health concerns for working professionals today, especially for those working from home and using screens extensively. Problems like dry eyes, irritation, fatigue, blurred vision, and headaches are grouped under the term computer vision syndrome (CVS), which some researchers estimate affects 50–90% of people who work at a screen.

I do not find these numbers surprising, as I too have observed a rise in CVS and in awareness of eye care among my peers and friends. While finding my own ways of maintaining eye health, I was pleasantly surprised by Apple's Screen Distance feature, which alerts the user if the phone is held too close to their eyes. Turning it on made me realise how often I keep my phone too close, and the deliberately intrusive alert has forced me to change this habit for the better.

Features like these feel like a warm hug, as if the product actually cares about your well-being. It also makes one ponder whether "care" could be an important lever in making a product lovable!

Let’s decode this very useful feature:

Once enabled, the device starts measuring the distance between the user's eyes and the screen using the TrueDepth camera. If the distance is less than 30 cm, an alert is triggered that covers the entire screen. To dismiss the alert, one needs to hold the phone farther away and tap Continue.
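Apple does not publish how Screen Distance is implemented, but a rough sketch of the same check is possible with ARKit's public face-tracking API, which also runs on the TrueDepth camera. The class name below is hypothetical, and the 30 cm threshold is simply the one mentioned above:

```swift
import ARKit

// Hypothetical sketch: approximating the Screen Distance check with
// ARKit face tracking. Apple's actual implementation is not public.
final class ScreenDistanceMonitor: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let thresholdMetres: Float = 0.30  // ~30 cm, per the feature's alert

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              let camera = session.currentFrame?.camera else { return }

        // World-space positions of the tracked face and the device camera.
        let facePos = simd_make_float3(face.transform.columns.3)
        let cameraPos = simd_make_float3(camera.transform.columns.3)
        let distance = simd_distance(facePos, cameraPos)

        if distance < thresholdMetres {
            // The real feature takes over the whole screen here.
            print("Screen too close: ~\(Int(distance * 100)) cm")
        }
    }
}
```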

But what is a TrueDepth camera?

TrueDepth is Apple's sensor system in the front-facing camera, used primarily for facial recognition. It works by projecting thousands of small, invisible infrared dots onto the subject, i.e. the user's face. A dedicated infrared camera then reads how this dot pattern lands on and is distorted by the contours of the face, and from that a fairly accurate 3D depth map is computed.
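Apps can read that depth map directly through AVFoundation's TrueDepth support. A minimal sketch (error handling, preview layers, and threading details omitted):

```swift
import AVFoundation

// Minimal sketch: streaming per-pixel depth maps from the TrueDepth camera.
final class DepthStreamer: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each AVDepthData frame is a depth (or disparity) map computed
        // from the projected infrared dot pattern.
        let map = depthData.depthDataMap
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```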

But it would be very inefficient to keep projecting infrared dots at the user's face all the time. Hence, my guess is that a proximity sensor is most likely used to decide whether the user is close enough before the distance measurement starts.
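Whether Screen Distance actually gates the TrueDepth scan this way is speculation on my part, but iOS does expose a basic hardware proximity sensor through UIDevice:

```swift
import UIKit

// Observing the hardware proximity sensor. Whether Screen Distance
// actually uses it as a gate is a guess; this only shows the public API.
func observeProximity() {
    let device = UIDevice.current
    device.isProximityMonitoringEnabled = true

    NotificationCenter.default.addObserver(
        forName: UIDevice.proximityStateDidChangeNotification,
        object: device,
        queue: .main
    ) { _ in
        // proximityState is true while an object is close to the sensor.
        print("Near the screen: \(device.proximityState)")
    }
}
```

One caveat: enabling proximity monitoring through this public API also blanks the display when the sensor triggers, so a system feature like Screen Distance would presumably use a private, lower-level path instead.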

Apart from the Screen Distance feature, this 3D model of the user's face (or of whatever subject is scanned) can also power e-commerce, augmented reality, and virtual reality applications.

If you wish to read more about this technology and Face ID, its largest use case, you might enjoy this article that I came across.
