Apple AR headset will apparently be able to track your hands
The forthcoming Apple AR headset, also known as Apple Glass, will apparently be capable of tracking and recognising hand gestures.
That’s the claim being made by Ming-Chi Kuo, the established analyst and all-round fount of informed Apple supply chain rumours. In his latest research note to investors (via MacRumors), Kuo revealed that Apple’s AR headset will contain 3D sensing modules for detecting positional changes and even hand gestures.
Kuo draws a parallel with Apple’s Face ID sensor technology, which is already capable of picking up on relatively subtle changes in your facial expressions, as anyone who’s ever spent five minutes making a unicorn wink will attest.
The headset’s implementation would involve four sets of 3D sensors of an even higher grade than those Face ID sensors. The analyst says Apple will use the technology to power “a more intuitive and vivid human-machine UI”. This interface could see you letting go of a virtual balloon by unclenching your fist, for example.
Besides gesture control, the Apple AR headset’s vast suite of sophisticated sensors will apparently enable object detection, eye tracking, iris recognition, voice control, skin detection, expression detection, and spatial detection.
Apple has been working on object recognition for some time. In 2017, the company unveiled ARKit, a developer tool for building augmented reality apps and games. The toolkit uses an iPhone’s cameras and sensors to track movement and to place 3D models in the real world with uncanny accuracy.
According to recent reports, Apple is already looking to refine its first AR headset, with a lighter and faster device tipped for 2024.