Why are there no better screens in VR headsets?

I did some research and saw that these are the resolutions causing the screen-door effect:
Oculus: 2160 x 1200. HTC Vive: 2160 x 1200.
Maybe some or most people will say "there is no screen-door effect, *** are you talking about?", but I just want an answer to my question.
I know we can produce 4K screens for mobile phones, and maybe we can even produce 6-8K, who knows.
So why are we not using 4K or higher resolution screens in these VR headsets?
Thank you for all answers.
I'm not English, so sorry for my mistakes in writing.
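To put a rough number on the screen-door complaint, here's a quick back-of-the-envelope sketch comparing the angular pixel density of a 2160 x 1200 headset with the roughly 60 pixels per degree the fovea is often said to resolve. The ~110° field of view and the 60 ppd figure are assumptions for illustration, not specs quoted anywhere in this thread.

```python
# Rough sketch only: average angular pixel density of a first-gen headset
# versus the often-quoted resolving power of the fovea. The ~110 degree FOV
# and 60 ppd figures are assumptions for illustration, not measured specs.

def pixels_per_degree(pixels_across, fov_degrees):
    """Average pixels per degree across the lens's field of view."""
    return pixels_across / fov_degrees

# Vive / Rift: 2160 x 1200 combined, i.e. 1080 x 1200 per eye
headset_ppd = pixels_per_degree(1080, 110)   # assumed ~110 deg horizontal FOV
fovea_ppd = 60                                # commonly cited foveal limit

print(f"Headset: ~{headset_ppd:.1f} pixels per degree")
print(f"Fovea:   ~{fovea_ppd} pixels per degree")
print(f"Gap:     ~{fovea_ppd / headset_ppd:.1f}x short of 'retina' density")
```

On those (assumed) numbers, even a 4K-per-eye panel spread over a 110° field of view would still sit well short of 60 pixels per degree, which is why some screen-door is still visible on current hardware.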
 
Yeah, someone on Reddit recently sent that to me, but it still doesn't explain why the well-known headsets don't use 4K or higher resolution screens.
 
Actually, I thought about it and it's completely possible, just like Pimax is doing. If the big companies had wanted to do it, it would have been done already.
 
Ok here's my take on it. VR is just taking off, some would say it still has limited appeal, certainly not worth the risk to R&D, design, build and market something that realistically is going to cost £2k for the headset alone. Then you have the issue of compute power to actually drive graphics on each screen. I guess you'd be looking at a PC of £3-5k to get there. All of a sudden your market has gone from small to ******* minute. Hence why no one's done it.

Next step should be headsets with higher pixel density in the middle, reducing towards the edges.
 
Your peripheral vision doesn't need the same density. You tend to move your head to look, not your eyes.
https://www.roadtovr.com/understand...and-why-its-important-for-vr-and-ar-headsets/

Current headsets force you to move your head more than you normally would because they have such a small sweet spot; that's not how you would ideally behave. So no, a mixed-resolution display would not be better than one with eye tracking that could actually move the high-density area to wherever you are actually looking.
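As a concrete illustration of that idea, here's a minimal sketch of eye-tracked foveation: a hypothetical renderer asks, for each screen tile, how far it sits from the tracked gaze point and gets back a resolution multiplier. The function name, the tile/gaze representation and the falloff thresholds are all made up for the example; they're not from any shipping SDK.

```python
import math

# Minimal, made-up sketch of gaze-driven foveation: full resolution near the
# tracked gaze point, progressively coarser shading further out. Thresholds
# and the 100-degree field of view are illustrative assumptions.

def foveation_scale(tile_center, gaze_point, fov_degrees=100.0):
    """Return a render-resolution multiplier (1.0 = full res) for a tile,
    given its angular distance from the gaze point. Both points are
    normalised screen coordinates in [0, 1]."""
    dx = (tile_center[0] - gaze_point[0]) * fov_degrees
    dy = (tile_center[1] - gaze_point[1]) * fov_degrees
    eccentricity = math.hypot(dx, dy)      # rough degrees away from gaze
    if eccentricity < 10:
        return 1.0                         # foveal region: full resolution
    if eccentricity < 30:
        return 0.5                         # mid-periphery: half resolution
    return 0.25                            # far periphery: quarter resolution

# Example: the user looks slightly left of centre, so the high-density
# region follows the eyes rather than sitting fixed in the middle.
gaze = (0.35, 0.5)
for tile in [(0.35, 0.5), (0.6, 0.5), (0.9, 0.1)]:
    print(tile, "->", foveation_scale(tile, gaze))
```

The point of the sketch is the contrast with a fixed mixed-resolution panel: here the full-resolution region moves with the eyes, so it stays useful even when the head doesn't turn.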
 
There are lots of reasons that VR headsets haven't adopted higher resolution screens quickly:
  • Contrast and brightness are tricky in VR headsets, especially how black the blacks are. Many screens employ very clever tricks to get black blacks, but this usually comes at a cost in latency.
  • Latency and persistence need to be much lower than on consumer displays, particularly phone displays, which are generally slow to respond.
  • Data rates for high resolutions + frame rates - it's quite hard to push very high data rates down long cables, and DisplayPort 1.4 is limited to roughly 4K at 120Hz (there's a rough calculation after this post).
  • Pixel calibration is a real issue; Valve spent a long time sorting out the Vive screens to ensure consistent(ish) brightness and colour reproduction between pixels.
  • Moiré is another issue Valve spent a long time combating with consumer displays. It's been described as looking at the screen through a very thin white cloth.
  • Standard display controllers (in the HMD) introduce a lot of latency.
You've also got the issue of the lenses, and I gather the Vive was developed to be in an economic sweet spot for the combination of lenses and displays. It may be (not that I know) that the lenses to match a quality 4k sq. per eye screen are prohibitively expensive...
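On the data-rate point above, here's the rough arithmetic, assuming 24-bit colour and a ~20% fudge factor for blanking and protocol overhead; the DisplayPort 1.4 figure is the usual ~25.9 Gbit/s of HBR3 payload after encoding overhead.

```python
# Rough arithmetic only: uncompressed video data rate versus what a
# DisplayPort 1.4 cable can actually carry. The 20% blanking/overhead
# multiplier and 24-bit colour are assumptions for illustration.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24, overhead=1.2):
    """Approximate uncompressed data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * overhead / 1e9

DP_1_4_PAYLOAD_GBPS = 25.92   # HBR3, four lanes, after 8b/10b encoding

for name, w, h, hz in [
    ("Vive / Rift, 2160 x 1200 @ 90 Hz", 2160, 1200, 90),
    ("'4K per eye', 7680 x 2160 @ 90 Hz", 7680, 2160, 90),
]:
    rate = data_rate_gbps(w, h, hz)
    verdict = "fits" if rate <= DP_1_4_PAYLOAD_GBPS else "exceeds"
    print(f"{name}: ~{rate:.0f} Gbit/s ({verdict} DP 1.4)")
```

On those rough numbers a dual-4K, 90Hz headset wants something like 43 Gbit/s uncompressed, well past a single DP 1.4 link, so you'd need compression, multiple cables or lower refresh rates to drive it.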
 
Ok here's my take on it. VR is just taking off, some would say it still has limited appeal, certainly not worth the risk to R&D, design, build and market something that realistically is going to cost £2k for the headset alone.
Our water industry company is always trying to appeal to us by running wellness campaigns for employees. They recently went on a massive mental health kick and got us on loads of training courses, one of which was about what it's like to have a mental illness. The 2-day course included several hours with each participant wearing HTC Vives - We, the company who won't spend a few hundred to fix a NAS box with £2mil of asset data stuck on it, actually went out and bought about 30 complete VR rigs just for a flippin' mental health awareness course for some of its employees....!!
Granted, they're entertaining the idea of giving us VR for use in remote asset surveys, but they'll also be bought brand new.

The better you make these things, the more people will support them in their products and the bigger the market will be.

Your peripheral vision doesn't need the same density. You tend to move your head to look, not your eyes.
The truth is the complete opposite.

The head may move (though not always), but the eyes move faster and so they move first. Basic biological programming and survival mechanism. If you force yourself to do it the other way, you will get dizzy.

People watching 3D movies complain a lot. One of the biggest complaints is blurred images giving them headaches. The most common cause of blurry images is where the film-makers have forced focus on the centre of the screen (which works as a 2D technique but not in 3D) but still have things moving in the periphery. Your eyes will be drawn to that movement and you'll see blurry things moving around, resulting in the nausea... as well as showing up poor film-craft. It's such a glaring error, I can even point you to exact scenes in Avatar where they make this very obvious mistake.
The general rule in 3D filming is that everything in-frame must be presented in perfect focus and in 3D, even if it doesn't look so to you in real life.

Have you been to the flight simulators at RAF Brize Norton?
I have. Got a go on one, too.
The surround screens show a basic low-resolution image, with just an oval at full hi-res. That oval moves around according to where the pilot looks. On the front of the pilot's helmet is a tracker facing inward toward his face. It monitors exactly where his eyes are looking and moves the oval accordingly.... not his head, his eyes.

When I drive or ride, my head moves very minimally. My mirrors are set up so I can just move my eyes and my head motion is minimal. Wearing a cheap, heavy bike helmet with high speed winds buffeting your bonce, you notice head movement a lot more.

I have spent many years shooting firearms. When I play VR shooting games, I naturally drop my head down, which means I'm looking through the 'peripheral' of the fresnel lens. It's blurry.

So yeah - Eyes move first, eyes move more and peripheral resolution is essential.
 
How often do you push your eyes right to the top, bottom and to the sides, the limit of their movement? Hardly ever I'd bet.
Bet away...

Bottom - Most of the time I type, as I don't touch-type.
Left and Right - Whenever I drive or ride.
Top - Whenever I shoot a gun or a bow, and whenever I go boxing.

Do your eyes not move, then?
 
I haven't got time to Google it, but I'm sure there was a video of, IIRC, Colin McRae driving a rally stage with eye tracking dots superimposed on the film. It was quite incredible how much his eyes were shooting all over the place.

The video was actually trying to explain that professional sportspeople don't have much better reactions; it's more that they take in more information more quickly and then process it more quickly, so the decision to do something happens faster than for a "normal person". The difference in lag between decision and action isn't actually that pronounced.
 
I haven't got time to Google it, but I'm sure there was a video of, IIRC, Colin McRae driving a rally stage with eye tracking dots superimposed on the film. It was quite incredible how much his eyes were shooting all over the place.
That's even part of advanced driver/rider courses, training you to scan your eyes all over the environment instead of just looking at the arse of the vehicle in front.
 