
Nvidia gimmicks / features pushing up Graphics card prices?

Soldato - Joined 30 Jun 2019 - Posts 8,022
DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS)
Huge amounts of VRAM
G.Sync
'Low latency' modes
Support for very high framerates / refresh rates
Fancy new power connectors :D
Ampere +++

Do these things seem familiar?

None of these things are direct indicators of the performance of the GPU itself. They don't indicate a graphics card with more processing cores, or higher pixel rates / texture rates. Nor do they indicate the overall 3D graphics (rasterising) performance of the card.

We see these features on the box and decide they are must-have features, and it pushes prices up. I admit that DLSS 2 is a very nice thing to have, if you have a 1440p/4K monitor. But - AMD is now competitive in terms of their resolution upscaling and detail enhancement technologies, so I think we should basically all consider buying AMD this time around...

Of course, I may end up being a big hypocrite if the RTX 4070 / 4070 Ti prices seem affordable. Otherwise, based on the prices of the RTX 4080, Nvidia can do one. They have reverted to type, and we are seeing similar prices to the RTX 2080 Ti (or even higher) again. And endless product variations and re-releases.
 
it pushes prices up.

Does it though??

The launch price of 4090 was $1,599 - the launch price of 3090 was $1,499.

Inflation for 2021 was around 7% and for 2022 around 7.1% in the US, according to the figures I could find quickly. With the drop in GBP, the effective hit to buying power is noticeably bigger for the UK, since most supply chains are priced in dollars.

So the 3090's launch price in 2022 dollars would be $1,499 × 1.07 × 1.071 = $1,717.81

Soooo... the 4090 is cheaper than the 3090 in real terms.
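
If anyone wants to plug their own numbers in, here's a quick Python sketch of that compounding (the rates are just the rough US CPI figures quoted above, nothing official):

```python
# Rough real-terms comparison: compound the yearly inflation rates onto the 3090's launch price.
# Rates are the approximate US CPI figures mentioned above (~7% for 2021, ~7.1% for 2022).
launch_price_3090 = 1499.00
launch_price_4090 = 1599.00
yearly_inflation = [0.07, 0.071]

adjusted = launch_price_3090
for rate in yearly_inflation:
    adjusted *= 1 + rate

print(f"3090 launch price in 2022 dollars: ${adjusted:,.2f}")   # ~$1,717.81
print(f"4090 launch price (nominal):       ${launch_price_4090:,.2f}")
```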

Perhaps you're mistaking the change in cost in GBP for something Nvidia has control over?

So many people seem to have little to no grasp of macro-economic factors and their influence on local pricing / buying power.
 
Soooo... the 4090 is cheaper than the 3090 in real terms.
Even if true, the RTX 3090 was only a bit better than the RTX 3080 10GB and was stupidly expensive anyway.

Then some people that bought the RTX 3090 were undoubtedly annoyed when the RTX 3090 Ti came along and was even more stupidly expensive.

The same thing will happen with the RTX 4090.
 
Even if true, the RTX 3090 was only a bit better than the RTX 3080 10GB and was stupidly expensive anyway.

Then some people that bought the RTX 3090 were undoubtedly annoyed when the RTX 3090 Ti came along and was even more stupidly expensive.

The same thing will happen with the RTX 4090.
I imagine 3090 owners were more annoyed at the fact that the Ti model actually got the VRAM temps sorted out, and the price was crazy because Nvidia just made the new SKU during the crypto boom and gave it a massive MSRP, since it would have sold regardless. Back to the topic though: I don't think the features are pushing the prices up. That's just corporate greed doing what it does, hoping the crypto-era prices can be sustained forever.
 
I imagine 3090 owners were more annoyed at the fact that the Ti model actually got the VRAM temps sorted out, and the price was crazy because Nvidia just made the new SKU during the crypto boom and gave it a massive MSRP, since it would have sold regardless. Back to the topic though: I don't think the features are pushing the prices up. That's just corporate greed doing what it does, hoping the crypto-era prices can be sustained forever.
The gimmicks play a big part. Lots of people will now only buy an Nvidia RTX GPU because of DLSS 2/3. The other one that is a must-have for some is G-Sync.
 
The gimmicks play a big part. Lots of people will now only buy an Nvidia RTX GPU because of DLSS 2/3. The other one that is a must-have for some is G-Sync.
G-Sync and FreeSync are both pretty common now, at least. Yeah, as I have said about the upscaling tech: if you go Nvidia you get to choose from all three (where they're an option), and with AMD/Intel you only have two, so that could impact choice.
 
They're not gimmicks, with the possible exception of the 12VHPWR connector (which I doubt did anything significant to price). Throwing more and more power behind traditional rasterisation was always going to hit a point of diminishing returns. Most of what the OP mentions is just a sensible means of working around that, making continued advances in performance and image quality.

And the "very high framerates" thing is bizarre. Competitive gamers in particular have been pushing those high framerates for a long time now - certainly going back well before the last couple of GPU generations.
 
Does it though??

The launch price of 4090 was $1,599 - the launch price of 3090 was $1,499.

Inflation for 2021 was around 7% and for 2022 around 7.1% in the US, according to the figures I could find quickly. With the drop in GBP, the effective hit to buying power is noticeably bigger for the UK, since most supply chains are priced in dollars.

So the 3090's launch price in 2022 dollars would be $1,499 × 1.07 × 1.071 = $1,717.81

Soooo... the 4090 is cheaper than the 3090 in real terms.

Perhaps you're mistaking the change in cost in GBP for something Nvidia has control over?

So many people seem to have little to no grasp of macro-economic factors and their influence on local pricing / buying power.
Except the pricing of the 3090 was BS to begin with and should not have been normalised.

Also the "xx90" is essentially same as what the xx80ti class was (i.e. 980ti, 1080ti, 2080ti), but got rebranded to "xx90" and with the pricing shot up.
 
Except the pricing of the 3090 was BS to begin with and should not have been normalised.

Also the "xx90" is essentially same as what the xx80ti class was (i.e. 980ti, 1080ti, 2080ti), but got rebranded to "xx90" and with the pricing shot up.

You're forgetting that the 4090 is actually an xx80-class chip...
 
Throwing more and more power behind traditional rasterisation was always going to hit a point of diminishing returns.
I think you've picked up on my main point. If we think back a few years, most gamers just wanted higher framerates in 3D graphics. That (mostly) was what a graphics card was for.

And steady performance (decent minimum framerates).

Not ray tracing, nor the ability to upscale to high resolutions.

Anti-aliasing technology was generally advancing enough to deal with most of the unpleasant artifacts we see from running at lower resolutions.

Because games have these presets (Ultra, Ultra RT, etc.), the PC community has now decided that these are things we should all want.

We didn't get a new architecture for the RTX 4000 series, but we pay top whack for it anyway.
 
OP, when you buy a car, do you only look at its horsepower, or do the interior and equipment matter as well?
Also, things like ray tracing are normal progress in graphics. If new graphics features were not added every few generations, we would still be playing games with simple graphics, just at ultra-high resolutions and frame rates.
Weird thread in my opinion.
 
DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS)
Huge amounts of VRAM
G.Sync
'Low latency' modes
Support for very high framerates / refresh rates
Fancy new power connectors :D
Ampere +++

Do these things seem familiar?

None of these things are direct indicators of the performance of the GPU itself. They don't indicate a graphics card with more processing cores, or higher pixel rates / texture rates. Nor do they indicate the overall 3D graphics (rasterising) performance of the card.

We see these features on the box and decide they are must-have features, and it pushes prices up. I admit that DLSS 2 is a very nice thing to have, if you have a 1440p/4K monitor. But - AMD is now competitive in terms of their resolution upscaling and detail enhancement technologies, so I think we should basically all consider buying AMD this time around...

Of course, I may end up being a big hypocrite if the RTX 4070 / 4070 Ti prices seem affordable. Otherwise, based on the prices of the RTX 4080, Nvidia can do one. They have reverted to type, and we are seeing similar prices to the RTX 2080 Ti (or even higher) again. And endless product variations and re-releases.

I get that, really - raw raster performance is still the only real way to compare cards apples to apples, without one of these technologies swaying performance fairly or unfairly.

But added technologies have been around forever - going all the way back to anti-aliasing.

The difference these days is that all those old techs predominantly helped make potato resolutions look better. All these new technologies are designed to increase 'apparent' or perceived performance to the eye without too much loss of IQ; it just depends how well they are implemented in the game. What some of these technologies do is give your card some extra legs in longevity when games become too much for decent native frame rates - or, in reality, let you play the alpha versions that AAA games seem to be sent out as on release. Implemented well, unless you go pixel peeping, you can barely tell the difference. I don't think anyone would argue that they'd prefer just chugging along at a lower native fps over higher fps coupled with high resolution and working VRR.

AMD has FSR - and it's a great tech for getting some extra legs from your GPU, as is DLSS, when a game makes it chug (CP2077).
Ray tracing - a visual enhancement, but not worth the extra cost when comparing the 4090 and the XTX.
VRAM - very useful if you play with mods that enhance a game graphically further; allows smoother play in alpha-quality games with memory leaks, or poorly optimised games on release that people love paying MSRP for.
Support for very high framerates - why wouldn't you want this? Even those still on 60Hz panels can't say they don't want 120Hz+. A high-refresh panel coupled with VRR and a GPU to run it is a game-changer in visual comfort and clarity.
Low latency - why would you want high latency between the card and the panel as a gamer?

Power connectors - yes, well, some cards already had three PCIe connectors and four is ridiculous. OK, the 12VHPWR isn't a great design, but something other than running 3+ PCIe cables from the PSU into a GPU board is needed. Blame the industry design, not Nvidia.

Where a technology is better from one manufacturer than the other, it's a nice-to-have - RT done well in a built environment really does add immersion. Again, developer implementation varies, as does the game environment. In a smooth, shiny built environment RT will be more apparent than in a natural one. There's more opportunity in CP2077 and Control than in, say, God of War, Horizon Zero Dawn or The Witcher 3.

If I was going for a new card now, it'd be the XTX - most of the 4090's raster performance for a good chunk less. There isn't really a game that the 4090 can't do at 100+ fps with RT on, except CP2077 with its insane RT modes. The XTX is better value for money. I haven't needed to use DLSS 3 yet as the 4090 performs so well, but it's there if needed, as FSR from AMD would be, to give some extra legs.

DLSS 3, even on a 4090, will be useful on the next AAA game at release to get the best performance out of the alpha/beta build they want you to pay MSRP for. DLSS and FSR are very useful for lower-tier cards if you want to join the hype as an AAA alpha tester, or to give them some longevity.

As with all visual enhancement technologies, they are only as good as their implementation and programming in the game.
 