Because there are screens in the way? The choice was either to not show the wearer’s eyes at all, or to use a screen to display eyes (not even necessarily real eyes, you can supposedly have cat eyes, for example). Considering the device is meant to be AR (augmented reality) and not VR, it kinda makes sense to show the user’s eyes since they’re still “connected” to the outside world. Otherwise you’d have a bunch of blank visors walking around and people couldn’t tell if you’re looking at them or at your furry waifu.
You know how Microsoft solved this problem?
With glass.
And it sucked: the FOV of the augmented area was tiny, the projected images were see-through, and you still couldn’t really see the person’s eyes because of the tinted glass. VR headsets with cameras are currently by far the best way to do AR.
That was Google…
Google had Glass. Windows Mixed Reality used glass. The material. Like a window.
Microsoft HoloLens (glass and a transparent screen) and Google Glass (a tiny screen)
Then go and buy Microsoft’s product. Nobody is forcing you to get a Vision Pro.