I have just returned a faulty 3070 Ti and like the look of a Sapphire Toxic 6900 XT for use with VR on a Quest 2.
Does anyone use a 6700/6800/6900 XT for VR? Do you have any problems, and what's the performance like, please?
If you are using Oculus Link it doesn't matter what GPU you have.
Don't get an AMD GPU for the Quest 2. The 3xxx series are better if this is your use case.
EDIT: This was explained to you in the Virtual Reality section of the forum. It's because of the codecs that the Quest 2 uses for PCVR.
I know it was explained but I was wanting both sides to make a decision
So I just had a go on Walking Dead: Saints and Sinners in Virtual Desktop. With the Ultra setting, H.264 encoding, 90 Mbps video streaming and 90 fps set, it plays really smoothly and the textures looked great. However, it's probably not a very demanding VR game by the look of it. The recent comments I read on here were regarding Skyrim VR, which is probably more demanding; I'm downloading it now and will see what it's like if I get the chance. Interestingly, when I played S&S on the Quest 1 I couldn't play it for long due to motion sickness, as it doesn't have a teleport option. Playing it on the Quest 2, with the movement being smooth, I didn't feel that bad at all. My 6900 XT copes with it no problem.
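As a rough sanity check on those streaming numbers: at 90 Mbps and 90 fps, each frame gets about 1 Mbit of encoded data. The 3664x1920 transport frame below is an assumption for illustration (it matches the encode width quoted later in the thread, with an assumed height), not a figure from the post itself:

```python
# Back-of-envelope: how hard the codec must compress each frame
# at the quoted 90 Mbps / 90 fps Virtual Desktop settings.
bitrate_bps = 90 * 1_000_000   # 90 Mbps stream (from the post)
fps = 90                       # 90 fps (from the post)
width, height = 3664, 1920     # assumed transport frame, both eyes
bits_per_pixel_raw = 24        # uncompressed 8-bit RGB

budget_per_frame = bitrate_bps / fps                  # encoded bits per frame
raw_per_frame = width * height * bits_per_pixel_raw   # uncompressed bits per frame
ratio = raw_per_frame / budget_per_frame

print(f"encoded budget per frame: {budget_per_frame / 1e6:.2f} Mbit")
print(f"raw frame size:           {raw_per_frame / 1e6:.1f} Mbit")
print(f"compression needed:      ~{ratio:.0f}:1")
```

Under those assumptions the encoder needs roughly a 169:1 compression ratio per frame, which is why encoder quality (and the codec choice mentioned above) matters so much for how clean the stream looks.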
Perhaps pop1 would be a test also
Thanks for posting, and I would love to know how you get on so I can make my mind up.
Also, I believe that if you have an AMD CPU it gives the GPU a boost.
I don't have Population One, but would you not just get the Quest 2 version anyway?
Yes, I have it and it's cross-platform so you get both, and the PCVR version has better graphics.
Just tried Skyrim VR, and with Virtual Desktop it again works fine: I'm getting 90 fps with 120 Hz set, according to SteamVR. I turned up all the settings in Skyrim and rode in the cart at the beginning to the village. Textures further away, like on the houses, do look a little blurred, but the wooden construction of the cart and the characters in it looked pretty good. It was all smooth and I didn't get any lag or stutter. Unfortunately I don't have any other point of reference, as I couldn't get Air Link to connect and the USB cable for it is in my son's room and he's gone to bed.
It's your house, storm that bedroom and claim what's rightfully yours!
This made me chuckle.
I know it was explained but I was wanting both sides to make a decision
While framerate and resolution will be fine on an AMD card, you will get more compression artifacts and increased latency compared to an Nvidia card.
I'm using Oculus Link at 500 Mbps and I can still see a lot of compression, so I don't know how people are using this stuff with less than a 100 Mbps stream. I guess it's the same people who find cloud gaming acceptable.
I'm also using a Quest 2 with a 5408x2736 render resolution, a 3664 encode resolution width, and distortion curvature set to low.
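For anyone wondering how aggressive that render resolution is, here is a quick pixel-count comparison. The 5408x2736 render target is the figure quoted in the post; 1832x1920 per eye is the Quest 2's native panel resolution:

```python
# How much supersampling the quoted settings ask of the GPU.
render_px = 5408 * 2736          # render target from the post, both eyes
native_px = (1832 * 2) * 1920    # Quest 2 native panels, both eyes = 3664x1920

oversample = render_px / native_px
print(f"render target is ~{oversample:.1f}x the native pixel count")
```

So the GPU is shading roughly twice as many pixels as the panels can display before the frame is even handed to the encoder, which helps explain why compression and load look different from one setup to another.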
Depends on the game though. I played Saints and Sinners last night and saw no compression artifacts or latency issues. Plus, your resolution is higher than the standard 100%, so the GPU will be working harder, with more content being encoded, etc.
You are right, it does depend on the game, and a lot of VR games have very simple graphics with no detail in the textures. Try playing Skyrim or Half-Life: Alyx and you will see loss of detail, banding and the other typical downsides of compression.
While framerate and resolution will be fine on an AMD card, you will get more compression artifacts and increased latency compared to an Nvidia card.
It should be noted that this guy is a bit of an Nvidia shill, so his 'opinion' will be skewed somewhat.
It's not just my 'opinion'; it is suggested by a lot of Quest/Quest 2 users.