Apple’s augmented reality headset will likely include a revolutionary user interface for recognizing user commands

In developing and presenting devices based on augmented or virtual reality, a key challenge is immersing the user in the imaginary world of these products while still letting them interact with the real world. The head-mounted device (HMD) must therefore recognize the user's commands well and execute them precisely within its environment, and Apple is reportedly developing control mechanisms for receiving those commands.

Ming-Chi Kuo, the well-known Apple analyst, says the Cupertino company's augmented reality headset will be equipped with state-of-the-art sensors to detect the user's environment and commands, which he describes as a turning point in how users interact with virtual and augmented reality headsets. Kuo writes:

We anticipate that the augmented reality headset's structured light can detect not only changes in the position of the user's hand and of other objects in front of the user's eyes, but also, much like Face ID, the dynamic detail changes of the hand. Capturing the details of hand movements can take the human-machine interface to another level.

Kuo believes the headset's ability to detect detailed hand movements will lead to a far more immersive experience, in which the user can, for example, open their hand to release a virtual balloon. To achieve this, Apple is expected to fit the headset with four sets of 3D sensors of higher quality and specification than those used in iPhones. The analyst sees the quality of the human-machine interface as key to the future success of Apple's AR headset, noting that its capabilities include gesture control, object detection, eye tracking, iris recognition, voice control, skin detection, expression detection and spatial detection.
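
To make the idea concrete, here is a minimal sketch of how an open-hand gesture might be mapped to a virtual action such as releasing a balloon. The headset's actual interface is not public, so this uses Apple's existing Vision hand-pose API (VNDetectHumanHandPoseRequest) on a camera frame as a stand-in; the function name, confidence cut-off and distance threshold are illustrative assumptions.

```swift
import Vision
import CoreVideo
import CoreGraphics

// Sketch only: detect an "open hand" in a camera frame and fire a callback
// (e.g. release a virtual balloon). The real headset API is not public;
// the Vision hand-pose request is used here as a stand-in.
func handleFrame(_ pixelBuffer: CVPixelBuffer, onOpenHand: () -> Void) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    guard (try? handler.perform([request])) != nil,
          let hand = request.results?.first,
          let joints = try? hand.recognizedPoints(.all),
          let wrist = joints[.wrist] else { return }

    // Treat the hand as "open" when every fingertip is confidently tracked
    // and lies far enough from the wrist in normalized image coordinates.
    let tips: [VNHumanHandPoseObservation.JointName] =
        [.thumbTip, .indexTip, .middleTip, .ringTip, .littleTip]
    let isOpen = tips.allSatisfy { name in
        guard let tip = joints[name], tip.confidence > 0.3 else { return false }
        let dx = tip.location.x - wrist.location.x
        let dy = tip.location.y - wrist.location.y
        return (dx * dx + dy * dy).squareRoot() > 0.25  // heuristic threshold
    }

    if isOpen { onOpenHand() }   // hand opened: let the virtual balloon go
}
```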

Beyond that, Apple's recent patents show the company is pursuing a product that detects where commands are coming from and executes them accurately on the headset. The company is increasingly using ultra-wideband (UWB) technology to help iPhones locate one another, and it has repeatedly tested systems with a self-mixing interferometry (SMI) sensor for similar purposes, so the headset's controller is expected to use that technology as well. It is not yet clear whether these accessories will ship with the device or be sold separately, but some prototypes appear to have a touch surface for detecting gestures or a button called the Digital Crown for recording commands.

Earlier this year, Apple published a patent entitled “Devices, Methods, and Graphical User Interfaces for Interacting with 3D Environments,” which describes something very similar to what Ming-Chi Kuo outlines. In the patent, Apple argues that the different movements and positions of micro-gestures, along with other motion parameters, can be used to detect commands performed in a three-dimensional environment. Using cameras to capture micro-gestures for interacting with the 3D environment lets the user move freely in the physical space without being encumbered by physical peripherals, allowing them to explore the 3D environment more naturally and efficiently.

