
NVIDIA 4000 Series

Tried a bunch, even tried VRAM only at +1000, and the Cyberpunk bench was 1fps lower than the 70% PL posted above lol.

I think I will leave the clocks at stock and save myself the faff, and just run 70% PL. There isn't a game out that won't run at 100fps now anyway (assuming it's optimised of course lmao), so I'm sorted for a rather long time I'd say!
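
For anyone who'd rather script the power limit than drag a slider in Afterburner, here's a minimal sketch using NVIDIA's NVML bindings (the nvidia-ml-py / pynvml package). The 0.70 factor and GPU index 0 are assumptions to match the 70% PL above, and setting the limit needs admin/root rights:

    # Minimal sketch: cap the GPU at 70% of its default power limit via NVML.
    # Assumes the nvidia-ml-py (pynvml) package and GPU index 0; needs admin/root.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
    target_mw = int(default_mw * 0.70)  # 70% PL, e.g. 450W -> 315W on a 4090

    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f}W")

    pynvml.nvmlShutdown()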
 
Until the RTX5090 is out!
 

NVIDIA has released a still from a video comparison featuring the AMD RX 7900 XTX, Intel Arc A770 and its own RTX 4080 in a 4K, 12 Mbps AV1 encoding comparison. The company claims its encoder produces higher-quality images at the same bitrate:

DvrpYLm.jpg
 

Is the 288GB/s bus really going to be good enough?

Seems like it's going to fail hard at some games/resolutions.
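
For context, 288GB/s is what you get from the rumoured 128-bit bus paired with 18Gbps GDDR6 (both figures assumed from the rumour mill, not confirmed). Quick back-of-the-envelope check:

    # Back-of-the-envelope memory bandwidth check.
    # Assumed spec: 128-bit bus, 18 Gbps GDDR6 per pin (rumoured, not confirmed).
    bus_width_bits = 128
    data_rate_gbps = 18  # transfer rate per pin

    bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps  # bits -> bytes, so divide by 8
    print(f"{bandwidth_gbs:.0f} GB/s")  # -> 288 GB/s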
 

Don't worry, the RX76....sorry RX7700 16GB will be enough! :cry:


Not surprised, NVENC has been getting better each generation.

The A380 6GB is only £130, and the Arc A750 8GB is only £199.99, so for pure AV1 encoding it makes more sense to buy an Intel card because of the cost. AFAIK, their IGPs might support it too.

Edit!!

Tom's Hardware tested it a few weeks ago:

After running and rerunning the various encodes multiple times, we've done our best to try and level the playing field, but it's still quite bumpy in places. Maybe there are some options that can improve quality without sacrificing speed that we're not familiar with, but if you're not planning on putting a ton of time into figuring such things out, these results should provide a good baseline of what to expect.
For all the hype about AV1 encoding, in practice it really doesn't look or feel that different from HEVC. The only real advantage is that AV1 is supposed to be royalty free (there are some lawsuits in progress contesting this), but if you're archiving your own movies you can certainly stick with using HEVC and you won't miss out on much if anything. Maybe AV1 will take over going forward, just like H.264 became the de facto video standard for the past decade or more. Certainly it has some big names behind it, and now all three of the major PC GPU manufacturers have accelerated encoding support.
From an overall quality and performance perspective, Nvidia's latest Ada Lovelace NVENC hardware comes out as the winner with AV1 as the codec of choice, but right now it's only available with GPUs that start at $799 — for the RTX 4070 Ti. It should eventually arrive in 4060 and 4050 variants, and those are already shipping in laptops, though there's an important caveat: the 40-series cards with 12GB or more VRAM have dual NVENC blocks, while the other models will only have a single encoder block, which could mean about half the performance compared to our tests here.
Right behind Nvidia in terms of quality and performance, at least as far as video encoding is concerned, Intel's Arc GPUs are also great for streaming purposes. They actually have higher quality results with HEVC than AV1, basically matching Nvidia's 40-series. You can also use them for archiving, sure, but that's likely not the key draw. Nvidia definitely supports more options for tuning, however, and seems to be getting more software support as well.
AMD's GPUs meanwhile continue to lag behind their competition. The RDNA 3-based RX 7900 cards deliver the highest-quality encodes we've seen from an AMD GPU to date, but that's not saying a lot. In fact, at least with the current version of ffmpeg, quality and performance are about on par with what you could get from a GTX 10-series GPU back in 2016 — except without AV1 support, naturally, since that wasn't a thing back then.

We suspect very few people are going to buy a graphics card purely for its video encoding prowess, so check our GPU benchmarks and our Stable Diffusion tests to see how the various cards stack up in other areas. Next up, we need to run updated numbers for Stable Diffusion, and we're looking at some other AI benchmarks, but that's a story for another day.


Basically the encoder setup in the RTX4060 and RTX4050 series is cut down compared to the RTX4070 series and above (a single NVENC block instead of two), so it could be that Intel actually is the better buy under £500.
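
If anyone wants to try this themselves, recent ffmpeg builds expose each vendor's AV1 hardware encoder: av1_nvenc (NVIDIA Ada), av1_qsv (Intel Arc) and av1_amf (AMD RDNA 3). A rough sketch at the 12Mbps bitrate NVIDIA used in its comparison, with the file names obviously made up:

    # Rough sketch: hardware AV1 encode at 12 Mbps via ffmpeg.
    # Encoder names from recent ffmpeg builds: av1_nvenc (NVIDIA Ada),
    # av1_qsv (Intel Arc), av1_amf (AMD RDNA 3). File names are made up.
    import subprocess

    def encode_av1(encoder: str, src: str, dst: str, bitrate: str = "12M") -> None:
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-b:v", bitrate, dst],
            check=True,
        )

    encode_av1("av1_qsv", "input.mp4", "arc_av1.mkv")  # e.g. on an A380/A750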
 

Always finding ways to cut costs... sorry, save power @CAT-THE-FIFTH :cry:
 

Encodes will take longer so more power is consumed.

Rodney! It's all going according to plan!

luster-chandelier.gif
 
Out of interest, what clock speeds do other people's 4090s boost to and then settle at? I know the norm is typically to boost to a higher clock speed than the rated max boost, then after a few minutes drop to a stable boost maximum for thermals, but I'm finding that 2700MHz is the lowest mine settles to, which is still far higher than the rated 2520MHz boost clock. It starts off at about 2760MHz, then settles to 2700 if the temp is closer to 80, or 2715 if the temp is closer to the 70 end. Zotac's Trinity cooler seems to be very good. The 4090 FE reviews I read showed that the FE boost typically settles to 26xxMHz.

Cyberpunk at 2715 for example:

w1AdnVb.jpg
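
If anyone wants to log how their card settles without staring at the Afterburner overlay, here's a minimal sketch with the pynvml bindings (GPU index 0 and the 5-second interval are just assumptions):

    # Minimal sketch: log core clock and temperature every few seconds via NVML.
    # Assumes the nvidia-ml-py (pynvml) package and GPU index 0.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    try:
        while True:
            clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            print(f"core: {clock_mhz} MHz  temp: {temp_c} C")
            time.sleep(5)
    finally:
        pynvml.nvmlShutdown()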

I get around 2775 in Cyberpunk with path tracing turned on and all that other jazz; that's with a 4090 FE at stock. Core temp is around 65°C, and power draw 375W give or take.

c7UbRvF.jpg

By the looks of Afterburner in my screengrab, the card is only running at 84% power, and the fans are at 43%, which is quiet enough for me :cool:

EDIT:

Adding +100 on the core and 110% power:

NZn2PNA.jpg

Bumping the res up to 4K and adding frame gen, at +100 core and 110% power:

wzfBEUJ.jpg

The clocks stay pretty stable, and so does the temperature (gotta love the FE cooler), but the wattage certainly goes up.
 
Do the FEs have dual BIOS? I am on the quiet fan mode on my Zotac; in normal mode the temps are in the 60s too, so I imagine the clocks stay at a higher boost, but quiet mode gives a more enjoyable gaming experience with how quiet it is, even though normal mode is still much quieter than the 3080 Ti FE was lol.
 
FEs only run a single BIOS, although the fan curve can be tweaked through Afterburner. I did find myself tweaking the 3090 FE fan curve, but the 4090 FE is just left at stock, as the cooler does an excellent job without having to run the fans that high. In fact, I don't think I've seen the fans go above 50%.

The power limit can be upped to 133% on the FE, but (just like boosting the core) the gains in performance aren't really worth bothering with, so I'm quite happy to leave the card at stock.
I've not even bothered with an undervolt, as the 4090 does a really good job of regulating itself both up and down. I'm currently playing through Resident Evil, which barely stresses the 4090, so it stays at a lower power as it's not being fully utilised, even though the clocks remain high and so do the fps.
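
For what it's worth, you can sanity-check that sort of behaviour (fans under 50%, card dropping power when underutilised) with a quick NVML read; a sketch, again assuming the pynvml package and GPU index 0:

    # Quick NVML snapshot of fan speed, power draw and GPU utilisation.
    # Assumes the nvidia-ml-py (pynvml) package and GPU index 0.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    fan_pct = pynvml.nvmlDeviceGetFanSpeed(handle)           # percent
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
    util_pct = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu

    print(f"fan: {fan_pct}%  power: {power_w:.0f}W  GPU util: {util_pct}%")
    pynvml.nvmlShutdown()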

There's no denying the 4090 is a superb card all round :)
 