From what I understand, the thing isn’t see-through and the eyes are actually projected outside. Can somebody explain why they had to add tech to do it?
Because there are screens in the way? The choice was to either not have the wearer’s eyes be visible, or use a screen to display eyes (not even real eyes; you can supposedly have cat eyes, for example). Considering the device is meant to be AR (augmented reality) and not VR, it kinda makes sense to show the user’s eyes since they’re still “connected” to the outside world. Otherwise you’d have a bunch of blank visors walking around and then people can’t tell if you’re looking at them or your furry waifu.
You know how Microsoft solved this problem?
With glass.
And it sucked: the FOV of the augmented area was tiny, the projected images were see-through, and you still couldn’t really see the person’s eyes because of the tinted glass. VR headsets with cameras are currently by far the best way to do AR.
That was Google…
Microsoft HoloLens (glass and a transparent screen) and Google Glass (a tiny screen)
Google had Glass. Windows Mixed Reality used glass. The material. Like a window.
Then go and buy Microsoft’s product. Nobody forces you to get a Vision Pro.
You actually remove your eyes before inserting the optical couplers into your sockets. You put your eyes in the storage compartment on the front giving the appearance that you’re looking out through the device.
Achieving realistic, fast camera passthrough on both sides is harder than you think.
Yes, that’s my point. Why? Why make it even more complicated and more expensive for no good benefit?
Heavier, too. It’s about as heavy as the competitors despite having a separate battery.
It’s not necessary to have the external screen.
The Quest has passthrough cameras to allow you to see the world with stuff displayed over it too, but Apple has decided that simulating eye contact is important.
It’s Apple’s unique selling point here, but they’d have what sounds like a high-quality headset without it.
To allow eye contact for social interactions. If you want ubiquitous AR in real life, that is what you need. This is an attempt to achieve it with current technology, and it “almost” works: a near miss that fails spectacularly.
For no good benefit? Try comparing the display to a HoloLens 2. There’s no current display technology that’s cheaper and lets you see through it while projecting light at the same intensity. You can look it up.
I think they’re asking why eyes need to be projected on the outside.
Or anything for that matter.
All I’d want is “Go away. Gaming.” But a Post-it would do just fine. Hell, I’d prefer googly eyes over a projection of my own; that’d be way cooler and more useful.
I accomplished everything I need by taping a piece of paper with Sharpie eyes to my Quest 2, and it cost me $0 to do so!
I’ll point to someone further down this thread about eye contact in that case. It’s not like it cost much, though; reviewers have noted that EyeSight’s display quality is quite poor, and it seems like all this feature added was a small screen.
That’s why we’ve been stuck with windows for centuries.
Have you tried Linux? /s
So they could have stopped at many points, but decided humanity must suffer.
I mean, if the price tag isn’t going to dissuade you…
Maybe they think it makes you look less stupid.
Then they’ve had Tesla-truck levels of success with that.
I think this is kind of a temporary workaround. In Apple’s ideal world, the Vision Pro would actually be transparent and you could see the user’s eyes for real, but the tech isn’t ready to project what Apple is doing onto glasses. So they settled for a VR headset and put eyes on the outside. Eventually, in however many years it takes, they will actually use glasses and won’t need the screen on the outside. They must believe that being able to see a Vision Pro user’s eyes is integral to the product, or at least important to the product being accepted by everyone.
What do you mean? They added the outside screen to a VR headset to try to make it more acceptable to wear around others.
Imagine you’re sitting in a restaurant waiting for the waiter while doing some work on your Vision Pro. The waiter shows up and says ‘sir…’. You look at him and… there are two options:
it’s just a black screen, so it’s not clear if you’re actually looking at him. Are you paying attention? Or are you still ‘inside’ and can’t hear/see anyone?
you have these fake eyes indicating that you’re actually looking at him
It’s a really stupid “solution” to a huge problem all VR/AR has. The actual solution? Don’t buy it.
So they can sell you custom eyes like cats and aliens and shit.
And the eyes are not the wearer’s eyes. They are just digital eyes.
Meme or actually?
Actually