What is the EyeSight feature of Apple’s Vision Pro headset and how does it work?

When Vision Pro detects that you're viewing content or using an app, a pattern appears on the EyeSight display to let others know you're focused on the virtual world. But when someone approaches, Vision Pro shows your eyes instead.
To render your eyes as realistically as possible, EyeSight uses data from the built-in infrared cameras that track eye movements and facial expressions. This data is fed to a machine learning model that combines it with the digital Persona created during setup with the headset's front-facing TrueDepth camera. In other words, people around you don't see your real eyes, but a rendered representation of them.
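As a rough illustration of that pipeline, the sketch below models the data flow in Swift. It is purely conceptual: every type and function in it is a hypothetical placeholder, not part of any Apple framework.

```swift
// Purely conceptual sketch of the EyeSight flow described above.
// Every type and function here is a hypothetical placeholder, not an Apple API.

/// Hypothetical: gaze and expression data captured by the internal infrared cameras.
struct EyeTrackingFrame {
    var gazeDirection: SIMD3<Float>
    var expressionWeights: [String: Float]
}

/// Hypothetical: the digital Persona captured with the front TrueDepth camera during setup.
struct Persona {
    var eyeTexture: [UInt8]
}

/// Hypothetical: a frame destined for the curved OLED panel on the front of the headset.
struct ExternalDisplayFrame {
    var pixels: [UInt8]
}

/// Hypothetical pipeline step: a learned model would combine the live eye data with
/// the stored Persona, so bystanders see a rendered version of the wearer's eyes
/// rather than the eyes themselves.
func renderEyeSight(frame: EyeTrackingFrame, persona: Persona) -> ExternalDisplayFrame {
    // Placeholder for the machine learning model mentioned above.
    _ = frame.gazeDirection // the rendered eyes would follow the wearer's actual gaze
    return ExternalDisplayFrame(pixels: persona.eyeTexture)
}
```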
Different EyeSight modes in Vision Pro
The EyeSight feature uses several visual cues to let others know what you're doing in the augmented reality world, such as when you're fully immersed or capturing photos and video.
Transparent Mode
Transparent mode lets people see the user's eyes. When someone approaches you and the headset's sensors detect their presence, Vision Pro temporarily brings them into your AR/VR view so you know another person is nearby.
At the same time, EyeSight shows your eyes to that person so they know you're aware of their presence. When no one else is in the room, EyeSight turns off and won't display your eyes or a background pattern.
Full Immersion Mode
When you're fully focused on content in the AR or VR world, a colorful pattern is displayed on the headset's external screen. By seeing this animation, people around you know that you're occupied. Depending on how immersed you are in the virtual world, EyeSight may still show your eyes semi-transparently behind the pattern.
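For developers, the level of immersion that drives this behavior is requested by the app itself. Below is a minimal sketch using the public visionOS SwiftUI APIs (ImmersiveSpace, immersionStyle, and openImmersiveSpace); the scene identifier and view names are placeholders, and the system alone decides what EyeSight actually shows on the external display.

```swift
import SwiftUI

// Minimal sketch of a visionOS app that can move between mixed and full immersion.
// The immersion style the app requests is what EyeSight's external display reacts to,
// but the system, not the app, decides what is actually shown there.
// The scene id "Immersive" and the view names are placeholders.
@main
struct ImmersionDemoApp: App {
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        // A regular window to launch the experience from.
        WindowGroup {
            LauncherView()
        }

        // The immersive scene; presenting it with the .full style corresponds to
        // the "full immersion" state described above.
        ImmersiveSpace(id: "Immersive") {
            ImmersiveContentView()
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}

struct LauncherView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter fully immersive scene") {
            Task {
                // Opening the immersive space moves the wearer into the state
                // the article calls full immersion.
                await openImmersiveSpace(id: "Immersive")
            }
        }
    }
}

struct ImmersiveContentView: View {
    var body: some View {
        // Placeholder content; a real app would typically build this with RealityView.
        Text("Fully immersive content")
    }
}
```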
Capture Mode
When you take photos or record video with the headset, EyeSight enters Capture mode and displays a white, fog-like pattern on the external screen. Apple added this mode so that people around you know when you're recording; in that sense, it serves a purpose similar to the shutter sound of the iPhone camera.
Apple CEO Tim Cook said during WWDC 2023: “The introduction of Vision Pro marks the beginning of a new era for computing. Just as the Mac introduced us to personal computing and the iPhone introduced us to mobile computing, Vision Pro introduces us to spatial computing.” In other words, Apple believes its new product will transform the technology world.
As a feature unique to Vision Pro, EyeSight is one of Apple's biggest differentiators from competing AR/VR headsets. It keeps the headset from acting as a barrier between the real and virtual worlds, so the user isn't completely isolated.
Of course, it remains to be seen whether we'll get used to people wearing Vision Pro headsets on their faces with their digital eyes projected through an OLED panel. What do you think about the EyeSight feature of Vision Pro, Zomit readers?