
NVIDIA 4000 Series

Yeah it's "up to" 450 watts, that doesn't mean it's using 450 watts even at 100% utilisation depending on the game.

Just double checked in Cyberpunk with PT enabled and went into HWINFO:

[Screenshot: HWiNFO power readings for the 4090 in Cyberpunk with PT]


A 400-watt max peak; obviously the average is much lower, 237 watts. The 3080 Ti, on the other hand, would be over 300 watts on average.

Edit*
3080 Ti with the same settings: 314 watts, but at 31fps, whereas the 4090 is at over 100fps :cry:
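For anyone wondering where the average and peak figures come from, here's a rough sketch of how you could pull them out of a HWiNFO CSV log. This assumes logging was enabled and the power column is literally named "GPU Power [W]"; the real column name varies by card and sensor layout, and the file path is just a placeholder.

```python
import csv

# Rough sketch: pull average and peak GPU power out of a HWiNFO CSV log.
# Assumptions (adjust to your setup): logging was enabled, the log was exported
# as CSV, and the power column is named "GPU Power [W]". Both the path and the
# column name are placeholders, not anything official.
LOG_PATH = "hwinfo_log.csv"
POWER_COLUMN = "GPU Power [W]"

samples = []
with open(LOG_PATH, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        raw = (row.get(POWER_COLUMN) or "").strip()
        if not raw:
            continue
        try:
            samples.append(float(raw))
        except ValueError:
            # Skip anything that isn't a plain number.
            continue

if samples:
    print(f"samples : {len(samples)}")
    print(f"average : {sum(samples) / len(samples):.0f} W")
    print(f"peak    : {max(samples):.0f} W")
else:
    print("No power samples found - check the column name.")
```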

[Image: power draw comparison]



100w more power use than a 3090.
 

[Image: power draw comparison]



100w more power use than a 3090.
That's not what my metrics reflect in the games I play. As per my screenshot above, average wattage is below 300; the 3080 Ti FE was over 300. The 30 series have to work harder to maintain their frames, while the 4090 doesn't have to work as hard at the same settings and resolution, so logically it would have a lower average draw, which is exactly what I am seeing.
 
Last edited:
Shows what a lack of real competition does.

Nvidia have an effective semi-monopoly, with something like 80%+ of the market to the competition's sub-20%.

And they're totally happy screwing us with it.
AMD has the means to compete, but doesn't seem too keen to actually do it. The split could be 50/50 and it wouldn't matter, since they don't really compete with each other.
To be fair, the 4090 is twice the price of the 7900 XT!

Pretty massive difference in affordability there.
Very true, but meaningless if the 7900 XT doesn't cover the need that the 4090 does. The same applies to the 7900 XTX vs the 7900 XT, or any other similar example. The card first and foremost has to cover the buyer's need; only then can we talk about price/performance and other metrics.
 
Last edited:
What a world we live in...

$399 for a 1.15x improvement over the previous generation

[Image: benchmark slide, ~1.15x over the previous generation]




Is the extra 8GB of VRAM worth the $100 as well?

The only reason I'm considering Nvidia is that it works better with PyTorch for AI stuff, and I have a G-Sync monitor.
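If PyTorch is the deciding factor, a quick sanity check that the card is actually being picked up might look like this. Just a sketch, assuming a CUDA build of torch is installed; nothing here is specific to any particular card.

```python
import torch

# Quick sanity check that PyTorch actually sees the GPU.
# Assumes a CUDA build of torch is installed (pip gives a CPU-only build by
# default on some platforms), so treat this as a sketch rather than a guarantee.
if torch.cuda.is_available():
    device = torch.device("cuda")
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    device = torch.device("cpu")
    print("CUDA not available - falling back to CPU")

# Tiny smoke test: a matmul on whichever device was selected.
x = torch.randn(1024, 1024, device=device)
print((x @ x).shape, x.device)
```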
 
Last edited:
:cry:




Hey, great article... Nvidia even pitches 16GB cards as 1080p cards now... the 4060 Ti...

The article is nothing more than them explaining how they are ripping people off with VRAM, pretending software tricks will make up for it, and touting the on-silicon cache that their competitor has been using for ages. They basically copied it, found they could make weaker cards perform better with it, and now use it as an excuse for why their cards don't need as much VRAM as their competitor's, which have more VRAM and more cache... :cry:


:rolleyes:

Oh, and also don't forget DLSS 3, AKA the fake-frame generator... there just to make those FPS tools read high fake numbers. :cry:

Yeah this is getting to Bud Light levels of marketing incompetency
 
I also have to add: the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk frame generation makes a big difference. Going from 64fps to 112 or whatever it is isn't just feeding higher numbers to fps overlays; you /do/ feel the difference and see it in action.

I do wonder if people slating it have actually used it.
 
Last edited:
I also have to add: the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk frame generation makes a big difference. Going from 64fps to 112 or whatever it is isn't just feeding higher numbers to fps overlays; you /do/ feel the difference and see it in action.

I do wonder if people slating it have actually used it.
Yeah, and it's the same with RT/PT.
 
I also have to add: the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk frame generation makes a big difference. Going from 64fps to 112 or whatever it is isn't just feeding higher numbers to fps overlays; you /do/ feel the difference and see it in action.

I do wonder if people slating it have actually used it.
Personally I don't mind the tech as long as it genuinely makes the game smoother to play. However, I don't like how Nvidia uses it in comparisons with older-gen cards that don't support the tech, just to inflate their numbers. Not every game supports it, so I prefer seeing the base numbers for a more realistic comparison.
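To put that in numbers, here's a rough sketch of what I mean by "base numbers". Purely illustrative: it assumes FG inserts one generated frame per rendered frame and ignores FG's own overhead, and the fps figures are just the ones quoted earlier in this thread.

```python
# Purely illustrative: normalise an FG-inflated fps figure back to an estimated
# "rendered" fps so cards with and without frame generation can be compared on
# the same basis. Assumes DLSS 3 FG inserts one generated frame per rendered
# frame, so the presented rate is roughly 2x the rendered rate. It ignores FG's
# own overhead, which is why the card's actual FG-off fps is usually a bit
# higher than this estimate.
def estimated_rendered_fps(presented_fps: float, fg_enabled: bool) -> float:
    return presented_fps / 2 if fg_enabled else presented_fps

# Figures quoted earlier in this thread (Cyberpunk with PT):
print(estimated_rendered_fps(112, True))   # 4090 with FG on -> ~56 rendered fps (64 measured with FG off)
print(estimated_rendered_fps(31, False))   # 3080 Ti, no FG  -> 31 fps as reported
```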
 
Can smell Jensen's desperation from here. Let's see who will shill his crap.



 
I also have to add: the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk frame generation makes a big difference. Going from 64fps to 112 or whatever it is isn't just feeding higher numbers to fps overlays; you /do/ feel the difference and see it in action.

I do wonder if people slating it have actually used it.
DLSS and FG are game changers, but usually it's AMD proponents that go around forums talking smack. It's been that way for the last 15 years on every damn forum. I still remember the AMD crowd going nuts about how insanely good the FX-8150 was. People never change...
 
Yeah it's "up to" 450 watts, that doesn't mean it's using 450 watts even at 100% utilisation depending on the game.

Just double checked in Cyberpunk with PT enabled and went into HWINFO:

[Screenshot: HWiNFO power readings for the 4090 in Cyberpunk with PT]


A 400-watt max peak; obviously the average is much lower, 237 watts. The 3080 Ti, on the other hand, would be over 300 watts on average.

Edit*
3080 Ti with the same settings: 314 watts, but at 31fps, whereas the 4090 is at over 100fps :cry:


31fps to 100fps is not comparing apples to apples... So the 3080 Ti was running native and the 4090 was running DLSS 3? Is that a fair comparison now?

Come on mate, I thought better of you of all people on this forum.

I also have to add: the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk frame generation makes a big difference. Going from 64fps to 112 or whatever it is isn't just feeding higher numbers to fps overlays; you /do/ feel the difference and see it in action.

I do wonder if people slating it have actually used it.

I used DLSS 3 in MS Flight Sim and hated it, as it causes a lot of artifacts and weird issues with the text on the panels and elsewhere in the game.


I'm not falling for Nvidia's tricks and games... I buy hardware from them, so I expect the hardware to give me the uplift I'm paying for, and paying more for every generation, not software tricks that actually make the image worse in the titles I use and add other strange things. I'm not buying software from Nvidia apart from the drivers, and none of these tricks should be needed if the hardware was up to it in the first place; instead they now use tricks to show an uplift over previous gens rather than hardware.

Again, I pay Nvidia for hardware, so I expect these uplifts at native settings, as we have always measured uplift from gen to gen. Or should we go back to the tricks Nvidia and ATI/AMD played by reducing visual quality in benchmarks to make their hardware look better? Oh wait, we are doing that now and accepting it as the norm. Sorry, not falling for that and never will. Also, I use my GPU for more than gaming, so I don't need gaming tricks added; I need real hardware uplifts for real work (soon DLSS 3 for work apps... half the data goes missing but twice as fast... :rolleyes: ).






 
Last edited:
Can smell Jensen's desperation from here. Let's see who will shill his crap.




When you've over-committed and have to give them away :cry:
 
31fps to 100fps is not comparing apples to apples... So the 3080 Ti was running native and the 4090 was running DLSS 3? Is that a fair comparison now?

Come on mate, I thought better of you of all people on this forum.



I used DLSS 3 in MS Flight Sim and hated it, as it causes a lot of artifacts and weird issues with the text on the panels and elsewhere in the game.


I'm not falling for Nvidia's tricks and games... I buy hardware from them, so I expect the hardware to give me the uplift I'm paying for, and paying more for every generation, not software tricks that actually make the image worse in the titles I use and add other strange things. I'm not buying software from Nvidia apart from the drivers, and none of these tricks should be needed if the hardware was up to it in the first place; instead they now use tricks to show an uplift over previous gens rather than hardware.

Again, I pay Nvidia for hardware, so I expect these uplifts at native settings, as we have always measured uplift from gen to gen. Or should we go back to the tricks Nvidia and ATI/AMD played by reducing visual quality in benchmarks to make their hardware look better? Oh wait, we are doing that now and accepting it as the norm. Sorry, not falling for that and never will. Also, I use my GPU for more than gaming, so I don't need gaming tricks added; I need real hardware uplifts for real work (soon DLSS 3 for work apps... half the data goes missing but twice as fast... :rolleyes: ).







No, the 3080 Ti was running PT with DLSS Quality, and the 4090 is running the same but with FG enabled. With FG disabled it's 64fps, so still double that of the 3080 Ti, and the GPU utilisation doesn't change, so it makes no difference to power draw. So my metrics are accurate and compared directly against the 3080 Ti with the same settings. You can post articles all day long, but they won't change the fact that I am seeing what I am seeing and recording what I am recording: a direct comparison between both cards on the same system, with the same games and the same driver versions.
 
Last edited: