
NVIDIA 4000 Series

What's the TL;DR for the new Nvidia cards? It looks like the 4080 and 4090 are releasing soon; is the performance a mystery until the NDA lifts?

The price is high, and performance looks good but not stunning in current-gen games; Nvidia are banking on their DLSS and ray tracing technology being embraced to make these cards really shine in the future. People are annoyed over the naming scheme, especially the 16GB and 12GB 4080s sharing a name when the 4080 12GB is a very different card.

I see Nvidia are advertising a 4080 for £950. What is the likely performance uplift over my 6800XT - or, is this unknown at the moment?

Small: probably 5-20%, varying by game, and unknown until these cards get properly benchmarked by independent parties. However, if future games embrace DLSS and go hard on ray tracing, the difference will be much larger.

For my money? Not worth upgrading. Wait for the next generation and see what happens with the games you want to play.
 
The real performance is currently unknown; all the impressive examples use DLSS 3 tricks.

We've got little indication, but it's a non-zero amount; this guy did the maths already: https://www.reddit.com/r/nvidia/comments/xoufer/40_series_performance_cost_analysis_based_on/

All in all, if you're on a 3090 or below then the raster uplift is decent, and I imagine the raytracing uplift will be better (probably upwards of 80%?). Is it £1,600-£2,200 good? No, absolutely not. Nvidia and Jensen can sook wan if they think it's a fair price.
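For anyone wanting to redo that sort of price-per-performance maths themselves, here is a minimal sketch of the calculation. All prices and performance-index numbers below are placeholders for illustration, not real benchmark data; swap in figures from independent reviews once the NDA lifts.

```python
# Rough sketch of a £-per-performance comparison, in the spirit of the
# linked analysis. Every number here is a PLACEHOLDER, not measured data.
cards = {
    # name: (UK price in £, relative raster performance index)
    "RTX 3080":      (700, 100),
    "RTX 4080 12GB": (950, 120),
    "RTX 4090":      (1600, 170),
}

def pounds_per_perf(price, perf_index):
    """Lower is better: pounds paid per point of relative performance."""
    return price / perf_index

for name, (price, perf) in cards.items():
    print(f"{name}: £{pounds_per_perf(price, perf):.2f} per perf point")
```

The interesting part is that a big headline uplift can still be a worse deal: if the price rises faster than the performance index, the £-per-point figure goes up, which is the core complaint about this generation.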
 
The 4090 looks fine in those tables, and the card can be designed to guzzle more power than the FE variant. Jensen suggested that there's a large overclocking headroom not tapped by the FE.
 
Ok, I'll try again...

I see Nvidia are advertising a 4080 for £950. What is the likely performance uplift over my 6800XT - or, is this unknown at the moment?
Sounds like you are talking about the 4070 4080 12GB. That doesn't have an FE edition; it is AIB only, so prices will be higher than MSRP.
 
Sounds like you are talking about the 4070 4080 12GB. That doesn't have an FE edition; it is AIB only, so prices will be higher than MSRP.

I have seen it reported that, this time, Nvidia have set their MSRPs to allow AIB partners to make cards at that price and return a normal profit, rather than undercutting their own partners with their FE editions. Not sure if that will turn out to be true.
 
Small: probably 5-20%, varying by game, and unknown until these cards get properly benchmarked by independent parties. However, if future games embrace DLSS and go hard on ray tracing, the difference will be much larger.

For my money? Not worth upgrading. Wait for the next generation and see what happens with the games you want to play.
Wow, that is beyond underwhelming for the price. Thanks for the info, and I agree regarding it not being worth it.
Sounds like you are talking about the 4070 4080 12GB. That doesn't have an FE edition; it is AIB only, so prices will be higher than MSRP.
It's the 4080 12GB edition I was referring to, but according to Nvidia it does have an FE version?

 
But isn't it the only one with two 12VHPWR outlets? I am sure we'd be needing both in the next 5 years. Seasonic has nothing similar in the new lineup.

Also, how would the card know that it's pulling more than the rated power through those dongles? It can only monitor the aggregate power delivered at the connector.
Not on the 1200W or 1300W at least, so only on the massively expensive 1600W version.
 
Wow, that is beyond underwhelming for the price. Thanks for the info, and I agree regarding it not being worth it.

Mostly, it's that the 6800XT is already a stonkingly fast card. With Nvidia deciding that they're not going to just pursue higher frame rates with current rendering approaches, there just isn't much difference for games that aren't tilting in the direction Nvidia is going. If you're playing games that do a lot of ray tracing, you'll see a big boost, but if you're not you won't. I'm guessing RT is going to be much more important in the games we see coming out over the next couple of years, but you can always upgrade later if they do.
 
Wow, that is beyond underwhelming for the price. Thanks for the info, and I agree regarding it not being worth it.

It's the 4080 12GB edition I was referring to, but according to Nvidia it does have an FE version?


It's clever: as the AIBs will likely have to price them higher, people will get tempted by the 16GB FE MSRP :)

 
Mostly, it's that the 6800XT is already a stonkingly fast card. With Nvidia deciding that they're not going to just pursue higher frame rates with current rendering approaches, there just isn't much difference for games that aren't tilting in the direction Nvidia is going. If you're playing games that do a lot of ray tracing, you'll see a big boost, but if you're not you won't. I'm guessing RT is going to be much more important in the games we see coming out over the next couple of years, but you can always upgrade later if they do.
Agreed.

I actually had a 3080 for 18 months and I never bothered with ray tracing as it's a total gimmick IMO so I definitely don't miss it and don't expect to. The shift to OLED and HDR was far more impressive to me.

It's interesting that they're focused on RT over FPS as it must mean they're going to invest heavily by partnering with game developers to advance it.

I'm 100% happy to skip this generation unless AMD pull a rabbit out the hat.
 
It's interesting that they're focused on RT over FPS as it must mean they're going to invest heavily by partnering with game developers to advance it.

It's a long time since I worked for any major game company, but back in the day, Nvidia made it rain graphics cards and developers are always keen to play with the latest innovations. I doubt it has changed much.
 
Agreed.

I actually had a 3080 for 18 months and I never bothered with ray tracing as it's a total gimmick IMO so I definitely don't miss it and don't expect to. The shift to OLED and HDR was far more impressive to me.

It's interesting that they're focused on RT over FPS as it must mean they're going to invest heavily by partnering with game developers to advance it.

I'm 100% happy to skip this generation unless AMD pull a rabbit out the hat.

+1

I got a 3080 FE last year and haven't had any problems with any game so far at 1440p. RT and DLSS are not essential features for me anyway, so I will disable them if more fps is needed. I also find that DLSS seems to introduce a bit of motion blur/ghosting, so I usually disable it. I mainly play League of Legends, so the card is overkill as it is; I will be passing on the next gen until I move to a 4K 144Hz monitor.
 
Agreed.

I actually had a 3080 for 18 months and I never bothered with ray tracing as it's a total gimmick IMO so I definitely don't miss it and don't expect to. The shift to OLED and HDR was far more impressive to me.

It's interesting that they're focused on RT over FPS as it must mean they're going to invest heavily by partnering with game developers to advance it.

I'm 100% happy to skip this generation unless AMD pull a rabbit out the hat.
I've got to agree that for someone who games in the dark, moving to an OLED screen is the biggest gfx upgrade I've had in some time. I'd rather have a last-gen GPU + OLED than a current GPU with an LCD.
 
I have seen it reported that, this time, Nvidia have set their MSRPs to allow AIB partners to make cards at that price and return a normal profit, rather than undercutting their own partners with their FE editions. Not sure if that will turn out to be true.
If it were true, I don't think EVGA would have pulled out. Then again, AIBs didn't know the prices until GTC.
 
I will always turn RT on max where a game makes use of RT. The differences might vary from big to small depending on game, but there are differences. And yeah whilst OLED and HDR make a nice difference (HDR not so much on some games), the added lighting bounce from RT lighting adds a bit of extra fidelity and especially in RT reflections, something that non-RT reflections can't ever manage. Look at games like Spiderman and Cyberpunk where RT reflections adds extra depth to the world. Turning such things on might shave off 20-30fps, but on the flipside, DLSS reclaims that back without sacrifice to image quality (though that does depend on the game...).

Saw this on pcmr, lolled:

mgPFwV9.png
 
Agreed.

I actually had a 3080 for 18 months and I never bothered with ray tracing as it's a total gimmick IMO so I definitely don't miss it and don't expect to. The shift to OLED and HDR was far more impressive to me.

It's interesting that they're focused on RT over FPS as it must mean they're going to invest heavily by partnering with game developers to advance it.

I'm 100% happy to skip this generation unless AMD pull a rabbit out the hat.

I never get the comparison to OLED/HDR; they are completely different technologies setting out to achieve completely different things.

Ray tracing gives us better visuals in the lighting, shadows and reflections department, but the main benefit is to developers, which is why we're seeing so many games adopting it where they can now. No doubt Nvidia will provide some extra persuasion to developers to include heavier/more RT effects, but most developers are choosing to use it themselves; even most of AMD's sponsored games have RT now, and all 3 GPU brands have provided their own tools and documents on best practices for implementing RT. We're at the end of the road for rasterization methods; the next step is implementing RT and gradually phasing out raster, and someone had/has to get the ball rolling.

For just 1 frame/scene, this is the workload difference it makes:

XMbGFnt.png

And whilst not gaming, here is a recent statement from the first film (outside of Pixar films) to use it, which shows the benefit it brings:


The use of the technology not only helped the actors as the film team was shooting, but it also reduced the post-production time by more than 300 days.

The one thing that people always say as well, is they want more dynamic environments and environments to react to how your surroundings change etc. RT allows this to be achieved.
 
I forgot about that^. I remember watching it and was amazed at the workload time saved for the devs. Granted, they still have to do the manual-labour version too, because the game has to have the option to use RT or not (for now). But in the near future, when all GFX cards can do RT at some reasonable speed, whether via FSR/DLSS/XeSS or whatever, we will see RT used with no option to turn it off, and games can be released faster too from all that massive time saved not having to bake in lighting.

Although Lumen in UE5 kind of flips that on its head: not true RT lighting, but just as good and in real time?
 