Nvidia gimmicks / features pushing up Graphics card prices?

Radeon RT off.

[image]
After alcohol and now RT on.

 
We know what AMD are releasing: a 7700 XT renamed as a 7800 XT, since the real one got renamed to a 7900 XT and price-jacked. So don't expect it to be much faster than a 6800 XT, as it will likely have just 60 CUs.

You are out by one tier: the 7800 becomes the 7800 XT, and so on.
 
Everyone seems to forget this: the whole stack has shifted up one tier in terms of pricing. You can see this just by looking at the current pricing of the 4080, which is a 70-class card priced one tier up, except it's also been gouged a lot higher out of greed.

Yup... second time they've done this.

Last time was the 680, which was a mid-tier chip they released as high end because AMD had fallen so far behind. They later released the 780, which was the real high-end chip.

If only we had real competition in the market... hopefully AMD will catch up.

But seeing as AMD's and Nvidia's CEOs are from the same family, I have a funny feeling there may be some verbal agreements in the background stopping it.

I want it for the performance boost. In theory we're now two generations behind what Nvidia are actually capable of, but it's possible they scaled back R&D to cut costs.
 
Yup... second time they've done this.

Last time was the 680, which was a mid-tier chip they released as high end because AMD had fallen so far behind. They later released the 780, which was the real high-end chip.

If only we had real competition in the market... hopefully AMD will catch up.

But seeing as AMD's and Nvidia's CEOs are from the same family, I have a funny feeling there may be some verbal agreements in the background stopping it.

I want it for the performance boost. In theory we're now two generations behind what Nvidia are actually capable of, but it's possible they scaled back R&D to cut costs.
Other than the 4090, AMD have caught up. Raster performance is strong, FSR competes with DLSS and RT is usable. Not sure what else people expect them to do?
 
Yup... second time they've done this.

Last time was the 680, which was a mid-tier chip they released as high end because AMD had fallen so far behind. They later released the 780, which was the real high-end chip.

If only we had real competition in the market... hopefully AMD will catch up.

But seeing as AMD's and Nvidia's CEOs are from the same family, I have a funny feeling there may be some verbal agreements in the background stopping it.

I want it for the performance boost. In theory we're now two generations behind what Nvidia are actually capable of, but it's possible they scaled back R&D to cut costs.
Price fixing and market manipulation, who would have thought it could happen!
 
Other than the 4090, AMD have caught up. Raster performance is strong, FSR competes with DLSS and RT is usable. Not sure what else people expect them to do?

They have caught up with two-year-old Ampere RT, but they are still behind in RT. They're worse in VR and in power efficiency/consumption.

FSR is hit and miss too, although it's definitely less of an issue now since uptake is quicker/better.

They don't have a frame generation competitor yet.

Etc. etc.

It's basically like RDNA 2 vs Ampere all over again, i.e. if you want to save some money and only care about raster/CoD then go for AMD.
 
They have caught up with two-year-old Ampere RT, but they are still behind in RT. They're worse in VR and in power efficiency/consumption.

FSR is hit and miss too, although it's definitely less of an issue now since uptake is quicker/better.

They don't have a frame generation competitor yet.

Etc. etc.

It's basically like RDNA 2 vs Ampere all over again, i.e. if you want to save some money and only care about raster then go for AMD.
I love that Nvidia buyers are excited about fake frames. Spending thousands on hardware that comes with software to make it feel faster. Like someone else said, Nvidia are great at inventing problems so they can sell you the solution.
 
I love that Nvidia buyers are excited about fake frames. Spending thousands on hardware that comes with software to make it feel faster. Like someone else said, Nvidia are great at inventing problems so they can sell you the solution.

And if it improves smoothness/motion, i.e. one of the main benefits of high FPS, without many cons/issues for high refresh rate and/or high res gamers, why does it matter how it is achieved? Same way, if DLSS/FSR can look as good as native or better and provide double the performance, why does it matter?

Also, AMD are providing a similar solution at some point next year and so are Intel, so again it won't just be an "Nvidia" thing; it has also been a thing in VR for a while too.

And yup, damn Nvidia for giving us/pushing new solutions that will improve some people's gaming experiences! Also, how dare a company worth billions charge for their products/features... In the words of Greta, "how dare they!" :cry:
 
I love that Nvidia buyers are excited about fake frames. Spending thousands on hardware that comes with software to make it feel faster. Like someone else said, Nvidia are great at inventing problems so they can sell you the solution.
The fake frames are only an issue when you don't have enough base frames to work with; 60+ base becoming 120+ works well, not so much on the lower end.
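
To put rough numbers on that (just a back-of-envelope sketch; the "roughly one real frame held back" assumption is a simplification of how interpolation-based frame generation behaves, not a measured DLSS 3 figure):

```python
# Back-of-envelope only: interpolation-based frame generation roughly doubles
# the frames presented, but input is still sampled at the base rate, and
# building the in-between frame means holding back roughly one real frame.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    presented = base_fps * 2              # one generated frame per real frame
    base_ft = frame_time_ms(base_fps)     # cadence at which the game reads input
    print(f"base {base_fps:>3} fps -> ~{presented} fps presented, "
          f"input cadence {base_ft:.1f} ms, ~{base_ft:.1f} ms extra hold-back")
```

The smoothing scales fine either way, but the latency cost is far more noticeable when the base frame time is already 33 ms than when it's 8 ms.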
 
The fake frames are only an issue when you don't have enough base frames to work with; 60+ base becoming 120+ works well, not so much on the lower end.

Speaking of FG/DLSS 3, good timing: HUB were overly critical/negative in their initial review, but there's a more "balanced" viewpoint from them here now:


So as we all already knew, it will come entirely down to the base fps and what the individual finds "playable/good", at least in terms of base latency. For me, that is definitely 60 fps, ideally 70/80.
 
The fake frames are only an issue when you don't have enough base frames to work with; 60+ base becoming 120+ works well, not so much on the lower end.
It works best in the place it's needed least. I'm not completely dismissing it and I can see benefits, but currently the main one appears to be skewing bar charts at the high end.
 
It works best in the place it's needed least. I'm not completely dismissing it and I can see benefits, but currently the main one appears to be skewing bar charts at the high end.

Worth watching the HUB video above; they give a pretty well balanced view of when/where it is needed, what tier of GPU it would work best for, and obviously the two main things: what resolution and refresh rate you are playing at. So the 4080 and maybe the 4070/Ti would probably be best suited for FG.
 
I can watch all the videos I'd like, but it doesn't replace trying it where latency is a potential issue. Reserving judgement until I try it myself, which at these prices will be when I'm old, grey, and my vision needs upscaling, not my GPU!

If the 4080 and 4070 Ti are best suited for FG, they should be running things at a high base frame rate anyway, so FG conversely becomes less important, theoretically. It's like DLSS or FSR: no need to turn it on if you're already getting huge frame rates.
 
DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS)
Huge amounts of VRAM
G.Sync
'Low latency' modes
Support for very high framerates/refresh rates
Fancy new power connectors :D
Ampere +++

Do these things seem familiar?

None of these things are direct indicators of the performance of the GPU itself. They don't indicate a graphics card with more processing cores, or higher pixel rates/texture rates. Nor do they indicate the overall 3D graphics (rasterising) performance of the card.
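
As an aside, even the headline TFLOPS figure is derived arithmetic rather than anything measured. A rough sketch of the usual shader count x 2 (an FMA counts as two operations) x boost clock calculation, with the 9728-shader / 2.505 GHz figures assumed purely for illustration:

```python
# A spec-sheet TFLOPS number is usually just shader_count * 2 (an FMA counts
# as two operations) * boost clock in GHz, divided by 1000 -- not a benchmark.
def peak_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    return shader_cores * 2 * boost_clock_ghz / 1000.0

# Illustrative figures only: 9728 shaders at 2.505 GHz comes out at ~48.74,
# which is where a headline number like "48.74 TFLOPS" tends to come from.
print(f"{peak_tflops(9728, 2.505):.2f} TFLOPS")
```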

We see these features on the box and decide they are must-have features, and it pushes prices up. I admit that DLSS 2 is a very nice thing to have if you have a 1440p/4K monitor. But AMD is now competitive in terms of their resolution upscaling and detail enhancement technologies, so I think we should basically all consider buying AMD this time around...

Of course, I may end up being a big hypocrite if the RTX 4070/4070 Ti prices seem affordable. Otherwise, based on the prices of the RTX 4080, Nvidia can do one. They have reverted to type, and we are seeing similar prices to the RTX 2080 Ti (or even higher) again. And endless product variations and re-releases.

I don't decide they are "must have" features; I look at reviews and tech videos from the likes of Digital Foundry to see if it's marketing or a genuine feature.

I've been gaming on a 4K screen since late 2014; that was on an AMD 295X2 GPU, and performance wasn't great back then.

By 2015 I had upgraded to Titan X (Maxwell) GPUs in SLI, and suddenly, thanks to G-Sync, I was able to smooth out the variable frame rates when they couldn't do a locked 60fps in a lot of games.

Then the 2080 Ti came along with ray tracing, and it wasn't really up to the task of running ray tracing at 4K with decent frame rates, but then Nvidia introduced DLSS. Version 1.0 wasn't great, but within months we had DLSS 2.0, and that was a bit of a game-changer for playing stuff like 'Control'.

The 3090 then came along and gave you a genuine locked 60fps with DLSS 2.

Now the 4090 is here and you can easily get a locked 120fps most of the time.

So anyway, that was a lot of waffle but here is what I do use:

DLSS 2
Ray tracing.
Tensor cores for AI processing (DLSS uses these)
Huge amounts of VRAM - useful for 4K/Ultra textures. Remember the 3080 with 10GB VRAM stuttering in DOOM Eternal?
G.Sync
'Low latency'
Support for very high framerates/refresh rates - I use 144Hz on my Alienware OLED and 120Hz on my G2 OLED.

These aren't gimmicks, they are genuinely useful features.

DLSS 3 is a gimmick to me at the moment, but might improve over time.
 
I can watch all the videos I'd like, but it doesn't replace trying it where latency is a potential issue. Reserving judgement until I try it myself, which at these prices will be when I'm old, grey, and my vision needs upscaling, not my GPU!

If the 4080 and 4070 Ti are best suited for FG, they should be running things at a high base frame rate anyway, so FG conversely becomes less important, theoretically. It's like DLSS or FSR: no need to turn it on if you're already getting huge frame rates.

And that is exactly why you should watch this one, as they cover the latency stuff very well :p i.e. it all comes down to the individual and what "base" fps they find good/acceptable, e.g.

- some find 30 fps perfectly playable
- some find 60 fps perfectly playable
- some only find 100 fps acceptable

Essentially it isn't a case of "don't use FG because of latency", but rather "only use FG if your base/native fps is at an acceptable level".

And the two main factors will come down to your resolution and refresh rate, i.e. 1080p 144Hz and 4K/60 won't see any need/point for FG, but for 4K 144Hz, 3440x1440 175Hz or 1440p 240+Hz, this is where FG really will be very beneficial, i.e. to get the best motion/smoothness out of your display, especially if a game is CPU-bottlenecked, where no matter how powerful the GPU is, you won't get higher fps without FG, e.g. MSFS.
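
Rough numbers for that point (a simple sketch; the base frame rates and the flat 2x presentation factor are assumed purely for illustration):

```python
# Simple sketch: frame generation only helps where the display has refresh
# headroom above what the game can already render (e.g. a CPU-limited sim).
# The base frame rates below are assumed purely for illustration.
def presented_fps(base_fps: float, refresh_hz: float, fg_factor: float = 2.0) -> float:
    return min(base_fps * fg_factor, refresh_hz)

scenarios = [
    ("1080p @ 144Hz, already near the cap", 140, 144),
    ("4K @ 60Hz",                            60,  60),
    ("4K @ 144Hz, CPU-limited sim",          70, 144),  # MSFS-style bottleneck
    ("3440x1440 @ 175Hz",                    90, 175),
]
for name, base, hz in scenarios:
    print(f"{name}: {min(base, hz):.0f} fps native vs ~{presented_fps(base, hz):.0f} fps with FG")
```

In the first two cases FG buys you next to nothing; in the last two it's the only way to get anywhere near the panel's refresh rate.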

Even if there's no need for the extra performance of DLSS/FSR, I still use them over native because image quality is simply better in my experience: better temporal stability, i.e. less shimmering, less aliasing, fewer jaggies and sometimes even better clarity/detail. Until we get better AA methods or TAA improves, this won't change any time soon. Also, there's less power usage when using DLSS/FSR, so the card runs cooler and quieter, which are nice bonuses :)

I don't decide they are "must have" features; I look at reviews and tech videos from the likes of Digital Foundry to see if it's marketing or a genuine feature.

I've been gaming on a 4K screen since late 2014; that was on an AMD 295X2 GPU, and performance wasn't great back then.

By 2015 I had upgraded to Titan X (Maxwell) GPUs in SLI, and suddenly, thanks to G-Sync, I was able to smooth out the variable frame rates when they couldn't do a locked 60fps in a lot of games.

Then the 2080 Ti came along with ray tracing, and it wasn't really up to the task of running ray tracing at 4K with decent frame rates, but then Nvidia introduced DLSS. Version 1.0 wasn't great, but within months we had DLSS 2.0, and that was a bit of a game-changer for playing stuff like 'Control'.

The 3090 then came along and gave you a genuine locked 60fps with DLSS 2.

Now the 4090 is here and you can easily get a locked 120fps most of the time.

So anyway, that was a lot of waffle but here is what I do use:

DLSS 2
Ray tracing.
Tensor cores for AI processing (DLSS uses these)
Huge amounts of VRAM - useful for 4K/Ultra textures. Remember the 3080 with 10GB VRAM stuttering in DOOM Eternal?
G.Sync
'Low latency'
Support for very high framerates/refresh rates - I use 144Hz on my Alienware OLED and 120Hz on my G2 OLED.

These aren't gimmicks, they are genuinely useful features.

DLSS 3 is a gimmick to me at the moment, but might improve over time.

On the topic of Doom Eternal and "stuttering on the 3080": Matt mentioned this as well a while back, so I tested it myself and didn't really notice any severe issue. IIRC there is one frame spike, which was down to my second display changing wallpaper:



I remember pointing out some hitches/stutters on Jansn's 6800 XT though :p





As for what features I value/use from the Nvidia side, it's only DLSS. I do use/enable Reflex when it is in a game, but it wouldn't be a must-have feature for me. I do have a G-Sync Ultimate monitor, but with it being OLED the module doesn't really matter quite as much, although it seems HDR performance/accuracy is better than on the FreeSync/non-module version; that might be fixed with future firmware and be specific to the AW QD-OLED model...

Ray tracing is not an "Nvidia" thing either...
 