Is it worth upgrading to use AV1 encoding?

Soldato · Joined 10 Jan 2011 · Posts 3,241
Hey all,


Not long got into VR gaming. I've got a Quest 3, and my PC plays games fine now I have it set up with both VD and Airlink. But is it worth upgrading from a 6800 XT to a 40** or 70** series card to use AV1 encoding? The current gen doesn't seem like a massive upgrade from last gen in terms of actual PC gaming, and AV1 seems to be the only thing going for it.

So, from people's experience, would you upgrade just for AV1? From videos I've seen it doesn't look any different, but I guess it's one of those things you have to be there to see.
 
To me it doesn't look any better in VD than HEVC at the same bitrate.
It seems to be completely pointless, at least the implementation in VD.
 
Thanks for replying,

This is kind of what I thought, tbh. If you stream games to YouTube/Twitch then I get it, but for VR it's maybe not worth spending out for small or no gains. Maybe AV1 will get better with drivers or next-gen GPUs. :S
 

As Z10M says, there's no point upgrading just for AV1. It's a good codec in low-bitrate situations, but it has higher latency.

However, it would be worth upgrading from what you have to an Nvidia 4xxx GPU for VR. I have a 6900 and the only thing stopping me upgrading is that I refuse to pay this generation's prices from either company.
 
A few thoughts:
- As I understand it, AV1 is more efficient and requires less bandwidth. If your connection is limited, that's probably very useful. With the Pico 4, I thought the picture quality suffered from compression noise. For whatever reason my Quest Pro has significantly higher bandwidth over a USB cable, and I can't see any obvious compression noise. Maybe that additional bandwidth capability is more useful with a wireless link?
- I did wonder if x264 was using GPU resources. Apparently my GPU has a dedicated encoder, and when I checked it was only a third loaded. The implication is that encoding is not taking resources away from rendering.
- Back to AV1: if it doesn't need the bandwidth and my GPU can already happily encode x264, then the only thing that seems to be left is the time to encode and decode, which will influence headset lag. I've no idea if AV1 is significantly faster or not. Anyone know?
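To put rough numbers on the lag question, here's a back-of-the-envelope sketch of where per-frame time goes in a streamed VR pipeline. Every figure here (link capacity, encode/decode times) is an illustrative assumption, not a measurement of any real headset or GPU:

```python
# Back-of-the-envelope per-frame latency for streamed VR.
# All inputs are illustrative assumptions, not measurements.

def per_frame_latency_ms(stream_mbps: float, link_mbps: float, fps: float,
                         encode_ms: float, decode_ms: float) -> float:
    """Encode time + link airtime for one frame + decode time, in ms."""
    bits_per_frame = stream_mbps * 1e6 / fps          # average frame payload
    transmit_ms = bits_per_frame / (link_mbps * 1e6) * 1000.0
    return encode_ms + transmit_ms + decode_ms

# Example: 200 Mbps stream at 90 fps over a (hypothetical) 1200 Mbps link,
# with made-up 5 ms encode and 7 ms decode times.
print(f"{per_frame_latency_ms(200, 1200, 90, 5, 7):.2f} ms")
```

The point of the sketch is that at high link rates the airtime per frame is small (under 2 ms in this example), so the encode and decode times the poster asks about dominate the codec's latency contribution.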
 
I think there are two camps here:
  1. You have an NVidia GPU - don't bother upgrading, because HEVC and H264+ are excellent in both quality and latency on NV GPUs, regardless of which GPU you have.
  2. You have an AMD GPU (before the 7000 series) - upgrading is reasonable, because AMD GPUs have poor HEVC and H264+ quality and latency compared to NV, and it will impact wireless VR significantly.
AMD 7000 series cards have a great AV1 (and HEVC/H264) encoder, arguably better than the NV 4000 series. All other AMD encoders are poor, regardless of generation.

In terms of what AV1 provides: better image quality at ~30% less bitrate than HEVC. There are anecdotes (from the VD developer) that AV1 is "smoother" due to how the encoder works and how frames are paced, but I'm unsure if this is true.
Lower bitrates are hugely useful when network congestion becomes a problem, or your network link isn't "perfect".

From https://www.gumlet.com/learn/av1-vs-hevc/: AV1 demonstrates a 30% better compression ratio compared to HEVC and VP9. At higher resolutions like UHD 2160p, AV1 can outperform HEVC by a significant margin of 43.90%.
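As a quick sanity check of what that ~30% figure means for VR bitrates, a one-liner (the saving percentage is the article's claim; the bitrates are just example numbers, not VD presets):

```python
# Equivalent-quality AV1 bitrate given a claimed ~30% saving over HEVC.
# The 0.30 default is the article's figure, not something I've measured.
def av1_equivalent_mbps(hevc_mbps: float, saving: float = 0.30) -> float:
    """Bitrate AV1 would need for roughly the same quality as HEVC."""
    return hevc_mbps * (1.0 - saving)

print(av1_equivalent_mbps(200))  # 200 Mbps HEVC -> ~140 Mbps AV1
```

So at similar quality, AV1 leaves roughly 60 Mbps of headroom on a link that could only sustain 200 Mbps of HEVC, which is where the "useful on imperfect networks" argument comes from.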

From https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested

VMAF scores showing AMD 5000/6000 is _slightly_ worse than NV in HEVC: https://cdn.mos.cms.futurecdn.net/72bv9us7KDh6ofbARbJHoT-970-80.png.webp
FPS scores showing AMD 6000/5000 is significantly slower than NV in HEVC: https://cdn.mos.cms.futurecdn.net/VnnNQswsCeaPTohJMXezhL-970-80.png.webp

VMAF scores showing AMD 5000/6000 is _slightly_ worse than NV in H264: https://cdn.mos.cms.futurecdn.net/72bv9us7KDh6ofbARbJHoT-970-80.png.webp
FPS scores showing AMD 6000/5000 is significantly slower than NV in H264: https://cdn.mos.cms.futurecdn.net/VnnNQswsCeaPTohJMXezhL-970-80.png.webp

VMAF scores showing AMD 7000 is _slightly_ worse than NV in AV1: https://cdn.mos.cms.futurecdn.net/wnwyPXdyKzS883r96y7RCU-970-80.png.webp
FPS scores showing AMD 7000 is significantly faster than NV in AV1: https://cdn.mos.cms.futurecdn.net/5EjomBiNL3Y7kHGNpsUgbM-970-80.png.webp
 
For me it made quite a big difference. In fact, despite not liking the headset overall, the Quest 3 was the first headset where I actually thought the wireless PCVR implementation was very good. That was with AV1.
You might not see much difference side by side whilst stationary, but you do (at least I did) once moving. The latest version of VD sets AV1 to 10-bit by default, and I think this 10-bit is why I saw such an improvement. The colour banding is pretty much gone, whereas before it was quite bad.

I tried Oculus Link with it set to H264 at 960Mbps over a high-quality cable, and I still preferred AV1 over VD.

What I didn't try was HEVC 10-bit, as I saw no reason to (I wish I had, just for comparison), but I was really happy with AV1.

You need, IIRC, a 4070 or above to benefit from the dual AV1 encoder chips.
 
I’ve no idea if AV1 is significantly faster or not. Anyone know?

It's more efficient but harder to decode than HEVC or H.264. It will have higher latency than both as well, especially the closer you get to 200Mbps.

Now, I have no way of testing this myself. But the Virtual Desktop developer says that for fast-moving content, H.264 at 400Mbps is better than AV1 at 200Mbps, whereas AV1 is better in slow-moving, darker games.

This doesn't take into account that on Airlink you can go higher than 400Mbps. You can probably go to 600Mbps and still get less latency than using AV1 at 200Mbps, and have better picture quality.
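One piece of that trade-off is easy to put numbers on: the link airtime per frame at each bitrate. The snippet below does only that (the 1200 Mbps link capacity is a made-up example; real Wi-Fi throughput varies widely, and codec decode time is deliberately not modelled):

```python
# Link airtime per frame at various stream bitrates, 90 fps.
# The 1200 Mbps link capacity is a made-up example value.
def airtime_ms(stream_mbps: float, link_mbps: float = 1200, fps: float = 90) -> float:
    """Time one average frame's payload occupies the link, in ms."""
    bits_per_frame = stream_mbps * 1e6 / fps
    return bits_per_frame / (link_mbps * 1e6) * 1000.0

for mbps in (200, 400, 600):
    print(f"{mbps} Mbps -> {airtime_ms(mbps):.2f} ms per frame")
```

Under these assumptions, going from 200 to 600 Mbps only adds a few milliseconds of airtime per frame, so if AV1 decode really is several milliseconds slower than H.264, high-bitrate H.264 could indeed come out ahead on total latency, as the post suggests.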
 
Never realised the hardware-based encoder (NVENC) is actually a totally separate bit of hardware from the GPU doing all the normal rendering.
NVENC Application Note

Always assumed having to encode for sending over USB would mean less 'grunt' for the actual game.
 
Why would dual encoders matter? The limit would be the decoder on the headset, and that can just about handle 200Mbps AV1.
I had heard an anecdote that Virtual Desktop will use both encoder blocks, but NVidia suggests this is not the case:
 