Nvidia gimmicks / features pushing up Graphics card prices?

DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS)
Huge amounts of VRAM
G.Sync
'Low latency' modes

Ray tracing is the logical next step for game graphics, and it requires a hell of a lot of computational horsepower. This is where DLSS 2 and then DLSS 3 come into play.

DLSS 3 includes DLSS 2 but adds frame generation, with Reflex/low-latency mode included to mitigate the increased latency. Tensor cores and the large floating-point throughput are needed for these operations.

Large amounts of VRAM are needed, especially if you play at 4K, which is slowly becoming the standard. In Dying Light 2, for example, I easily see 13GB used. Have you ever been in a situation where you run out of VRAM? It creates a stuttery mess.

Larger amounts of VRAM also let devs push texture resolution, which is needed if you want detailed worlds.
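
If you want to see how close you are to that limit yourself, here's a minimal sketch for polling the card's VRAM usage while a game runs - this assumes the nvidia-ml-py / pynvml package is installed and a single Nvidia GPU at index 0, so treat it as a rough illustration rather than anything official:

```python
# Minimal sketch: poll current VRAM usage via NVML (pip install nvidia-ml-py).
# Assumes a single Nvidia GPU at index 0; figures reported in whole MiB.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(10):  # sample roughly once a second for 10 seconds
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_mib = mem.used / (1024 ** 2)
        total_mib = mem.total / (1024 ** 2)
        print(f"VRAM used: {used_mib:,.0f} / {total_mib:,.0f} MiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run it alongside something like Dying Light 2 at 4K and you can watch the usage creep towards the card's limit, which is when the stutter tends to start.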
 
DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS)
Huge amounts of VRAM
G.Sync
'Low latency' modes
Support for very high framerates / refresh rates
Fancy new power connectors :D
Ampere +++

Do these things seem familiar?

None of these things are direct indicators of the performance of the GPU itself. They don't indicate a graphics card with more processing cores, or higher pixel rates / texture rates. Nor do they indicate the overall 3D graphics (rasterising) performance of the card.

We see these features on the box and decide they are must-have features, and it pushes prices up. I admit that DLSS 2 is a very nice thing to have, if you have a 1440p/4K monitor. But - AMD is now competitive in terms of their resolution upscaling and detail enhancement technologies, so, I think we should basically all consider buying AMD this time around...

Of course, I may end up being a big hypocrite if the RTX 4070 / 4070 Ti prices seem affordable. Otherwise, based on the prices of the RTX 4080, Nvidia can do one. They have reverted to type, and we are seeing prices similar to the RTX 2080 Ti (or even higher) again. And endless product variations and re-releases.

Huge amounts of VRAM?

LOL they were cheaping out for years.

They had to be dragged by AMD to give cards 12GB+.
 
This is generally just a weird thread. Since I first got into PC gaming just before Wing Commander came out (and I remember the horrified whispers that a 286 wouldn't really cut it for that), developers have been pushing for ever more advanced graphics and gamers have been rewarding them with sales. It's just a fact of life of gaming.

With diminishing returns from traditional rasterisation, Nvidia have pushed different technologies to continue improving image quality and framerates. Nothing to see here, move along.

There's definitely a problem with the pricing of some of both Nvidia's and AMD's latest GPU offerings, but I suspect that has a lot more to do with two years of shortages and scalpers convincing the companies that they could get away with charging more.
 
I agree generally, there's no point paying for stuff you don't need, and RTX is something everyone is paying for but hardly anyone needs (because the fps cost is too big to use it). AI could be done without too.

It would be nice if they sold a card like the 1080 that just performs well in games, and is house trained in terms of power/heat/noise.
 
Does it though??

The launch price of 4090 was $1,599 - the launch price of 3090 was $1,499.

Inflation for 2021 was around 7% and 2022 around 7.1% in the US, according to the figures I could find quickly. With the drop in GBP, the effective hit to buying power is noticeably higher for the UK, thanks to the petro-dollar affecting most supply chains.

So 3090 launch price in 2022 would be $1,499 x 1.07 x 1.071 = $1,717.81
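
If anyone wants to check the maths, it's a couple of lines of Python - figures as quoted above, so treat it as a rough back-of-envelope number rather than anything precise:

```python
# Back-of-envelope: compound the 3090's $1,499 launch price by the quoted
# US inflation figures (~7% for 2021 and ~7.1% for 2022).
launch_3090 = 1_499
adjusted_2022 = launch_3090 * 1.07 * 1.071
print(f"3090 launch price in 2022 dollars: ${adjusted_2022:,.2f}")  # ~$1,717.81
print("4090 launch price:                  $1,599.00")
```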

Soooo... the 4090 is cheaper than the 3090.

Perhaps you're mistaking the change in cost in GBP for something Nvidia has control over?

So many people seem to have little to no grasp of macro-economic factors and their influence on local pricing / buying power.
The 3080 FE was $699, but the 4080 FE is $1,199. I am an 80-class buyer and I am being priced out. Just because the already ludicrously priced 90-class card looks like decent value and a big performance increase compared to the previous generation, that doesn't mean that the lower models do.
 
I agree generally, there's no point paying for stuff you don't need, and RTX is something everyone is paying for but hardly anyone needs (because the fps cost is too big to use it). AI could be done without too.

It would be nice if they sold a card like the 1080 that just performs well in games, and is house trained in terms of power/heat/noise.
Except the tech is out there and developers are increasingly going to lean on it. DLSS and FSR are almost ubiquitous in AAA games now. RT isn't quite as widespread, but it's still getting a lot more common.
 
@Dicehunter

Forgot you bought the 4090. How you finding frame generation?

Honestly great. I've only used it extensively in Spider-Man Remastered, but it really makes the game feel nice and smooth. The Witcher 3 though seems off - a little stuttery and odd.

Since when do people think they can decide for everyone else what things are worth?

X thing - ‘Not worth it’. For you maybe.

I used to hear this from my old social circle all the time: "It's not worth it". That was from people who drank, smoked, spent a fortune on that tracksuit crap that makes anyone look like a chav scumbag, and went out drinking on the weekends spending a fortune... but that was apparently different.
 
The 3080 FE was $699, but the 4080 FE is $1,199. I am an 80-class buyer and I am being priced out. Just because the already ludicrously priced 90-class card looks like decent value and a big performance increase compared to the previous generation, that doesn't mean that the lower models do.
The 3080 was more premium too, with a 102-class die and performance closer to the 90.

The 4080 is a joke with its 70-class die size and absurd pricing. Even those who paid £1,200 for a 3080 during the mining boom got a better deal than this, as at least you could have mined some of the money back.
 
You're forgetting that the 4090 is actually an xx80-class chip...
Everyone seems to forget this: the whole stack has shifted up one tier in terms of pricing. You can see this just by looking at the current pricing of the 4080, which is a 70-class card but priced one tier up, except it's also been gouged a lot higher out of greed.
 
Honestly great. I've only used it extensively in Spider-Man Remastered, but it really makes the game feel nice and smooth. The Witcher 3 though seems off - a little stuttery and odd.

Another happy customer with frame generation then :)

I have read some reports of The Witcher 3 being stuttery with it though; apparently people are saying it's an issue if you're running an AMD CPU? Some potential fixes in here to try:





DLSS, frame generation, RT are perks I would pay extra for (within reason)

Also, another silly point with the OP: AMD also advertise their equivalents, but they don't hold the same weight as Nvidia's options imo.
 
It's another strange thread by the OP. He's almost as bad as that Sewerino guy.
You could do what most sensible people do when they read a point of view they disagree with - ignore it :p

The level of debate from some people is basically - I have lots of money and I will do what I want. This is fine, but why subject other people to this bias?

I do not believe I've said anything particularly controversial - if you consider what I've written to be a bit lacking in depth, perhaps that is so. But I'm really looking at it from the point of view of someone who is either trying to buy a new card, or just trying to get into PC gaming in general.

I will buy Nvidia or AMD. Maybe even Intel after a couple more generations... It makes no difference; the only thing that counts is value. Reviewers have measured this almost entirely based on average framerate per dollar, or, in some cases, the minimum framerate, measured across several games.
 
Ray tracing is the logical next step for game graphics, and it requires a hell of a lot of computational horsepower. This is where DLSS 2 and then DLSS 3 come into play.

DLSS 3 includes DLSS 2 but adds frame generation, with Reflex/low-latency mode included to mitigate the increased latency. Tensor cores and the large floating-point throughput are needed for these operations.

Large amounts of VRAM are needed, especially if you play at 4K, which is slowly becoming the standard. In Dying Light 2, for example, I easily see 13GB used. Have you ever been in a situation where you run out of VRAM? It creates a stuttery mess.

Larger amounts of VRAM also let devs push texture resolution, which is needed if you want detailed worlds.
On the bolded part - 4K slowly becoming the standard?

No. Not any time in the next 5 years or so.
 
Another happy customer with frame generation then :)

I have read some reports of The Witcher 3 being stuttery with it though; apparently people are saying it's an issue if you're running an AMD CPU? Some potential fixes in here to try:





DLSS, frame generation, RT are perks I would pay extra for (within reason)

Also, another silly point with the OP: AMD also advertise their equivalents, but they don't hold the same weight as Nvidia's options imo.
All the weight is from Jensen's big black jacket.
 