Most people can't afford the 7900 XT either, but that wasn't my point. It's not a relevant card to the vast majority because of its very high price.
Most can't afford the RTX 4080 either.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Yeah, it's "up to" 450 watts; that doesn't mean it's actually drawing 450 watts even at 100% utilisation, it depends on the game.
Just double-checked in Cyberpunk with PT enabled and looked at HWiNFO:
[HWiNFO screenshot: 4090 power readings]
Peak was 400 watts; the average is obviously much lower, 237 watts. The 3080 Ti, on the other hand, would be over 300 watts average.
Edit:
3080 Ti at the same settings: 314 watts, but at 31 fps, whereas the 4090 is at over 100 fps.
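For anyone who wants to reproduce the average/peak figures rather than eyeball the overlay, here's a minimal sketch assuming a CSV sensor log (the file name and the column names "GPU Power [W]" and "Framerate [FPS]" are assumptions; match whatever your HWiNFO/RTSS export actually uses):

```python
import csv

def summarise_log(path, power_col="GPU Power [W]", fps_col="Framerate [FPS]"):
    """Average/peak board power and fps-per-watt from a CSV sensor log.
    Column names are assumptions -- rename to match your export."""
    power, fps = [], []
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            try:
                power.append(float(row[power_col]))
                fps.append(float(row[fps_col]))
            except (KeyError, ValueError):
                continue  # skip footer rows / malformed samples
    if not power or not fps:
        raise ValueError("no usable samples found; check the column names")
    avg_w, peak_w = sum(power) / len(power), max(power)
    avg_fps = sum(fps) / len(fps)
    return avg_w, peak_w, avg_fps / avg_w

# Hypothetical log file name, for illustration only:
avg_w, peak_w, fps_per_watt = summarise_log("cyberpunk_4090.csv")
print(f"avg {avg_w:.0f} W, peak {peak_w:.0f} W, {fps_per_watt:.2f} fps/W")
```

Plugging in the figures above, that works out to roughly 100/237 ≈ 0.42 fps per watt for the 4090 versus 31/314 ≈ 0.10 for the 3080 Ti, which is the efficiency gap being described.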
You said there are no power savings going from 3090 to 4090, which is just not true.
Again, rated card power from Nvidia: 3090 350 W, 4090 450 W. Not talking about anything else here.
That's not what my metrics reflect in the games I play; as per my screenshot above, the average wattage is below 300. The 3080 Ti FE was over 300. The 30-series cards have to work harder to maintain their frames; the 4090 doesn't have to work as hard at the same settings and resolution, so logically it ends up with a lower average draw, which is exactly what I'm seeing.
100w more power use than a 3090.
AMD has the means to compete, but doesn't seem too keen to actually do it. Could be 50/50; since they don't compete with each other, it doesn't matter.
Shows what a lack of real competition does.
Nvidia have an effective semi-monopoly, an 80-plus/20-minus percent market split.
And they're totally happy screwing us with it.
Very true, but meaningless if the 7900 XT doesn't cover the need that the 4090 does. The same goes for the 7900 XTX vs the 7900 XT, or any other similar example. The card first and foremost has to cover the buyer's need, and only then can we talk about price/performance and other metrics.
To be fair, the 4090 is twice the price of the 7900 XT!
Pretty massive difference in affordability there.
Is an extra 8GB of RAM worth the $100 as well? What a world we live in...
$399 for a 1.15x improvement over the previous generation
NVIDIA GeForce RTX 4060 Ti and RTX 4060 announced - Pricing, Specifications, and Performance
VideoCardz has obtained significant insights into NVIDIA's forthcoming GeForce RTX 4060 Ti and RTX 4060 "Ada" graphics cards, gleaning this information from the NVIDIA press-deck. ... (www.guru3d.com)
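Taking the figures quoted above at face value, a back-of-the-envelope perf-per-dollar comparison is easy to run. The 1.15x uplift and the $399/$499 prices come from the post and press material; treating the previous generation as a $399 baseline is an assumption, so this is illustrative only:

```python
# Back-of-the-envelope perf-per-dollar using the figures quoted above.
# The "previous gen at $399" baseline is an assumption, not a measurement.
cards = {
    "previous gen ($399)": {"price": 399, "relative_perf": 1.00},
    "RTX 4060 Ti 8GB":     {"price": 399, "relative_perf": 1.15},
    "RTX 4060 Ti 16GB":    {"price": 499, "relative_perf": 1.15},  # +$100 for the extra 8GB
}

baseline = cards["previous gen ($399)"]["relative_perf"] / cards["previous gen ($399)"]["price"]
for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"]
    print(f"{name:22s} {perf_per_dollar / baseline:.2f}x perf/$ vs previous gen")
```

By that arithmetic the 8GB card is a 1.15x perf-per-dollar improvement, while the 16GB card lands around 0.92x, i.e. worse value than the previous generation unless the extra VRAM actually changes performance in the games you play.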
Hey, great article.. Nvidia even lists 16GB cards for 1080p now ... the 4060 Ti....
The article is nothing more than them explaining how they are ripping people off on VRAM. They pretend software tricks and on-silicon cache will make up for it, a cache approach their competitor has been using for ages and which they have basically copied, having found it lets weaker cards perform better and gives them an excuse for shipping less VRAM than a competitor that offers both more VRAM and more cache...
Oh, and don't forget DLSS 3, AKA the fake-frame generator... just there to make the FPS tools read high fake numbers.
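On the cache point, the "effective bandwidth" figures vendors quote typically come from a simple hit-rate model, and it's worth seeing that arithmetic spelled out, because it is what gets used to justify a narrower memory bus. A minimal sketch; the bandwidth and hit-rate numbers below are placeholders, not any card's published spec:

```python
def effective_bandwidth(dram_bw_gbs: float, cache_hit_rate: float) -> float:
    """Simple hit-rate model behind 'effective bandwidth' marketing:
    if a fraction of memory requests is served from on-die cache, only
    (1 - hit_rate) of the traffic reaches DRAM, so the DRAM bandwidth
    'looks' proportionally larger."""
    if not 0 <= cache_hit_rate < 1:
        raise ValueError("hit rate must be in [0, 1)")
    return dram_bw_gbs / (1.0 - cache_hit_rate)

# Placeholder numbers: a narrow bus doing ~288 GB/s with an assumed
# 50% L2 hit rate, versus a worse 30% hit rate.
print(effective_bandwidth(288, 0.50))  # 576.0 -> looks great on a slide
print(effective_bandwidth(288, 0.30))  # ~411  -> falls off fast as data spills
```

The catch, as the post says, is that the hit rate depends entirely on the working set: a big cache does nothing for assets that simply don't fit in VRAM, so "effective bandwidth" is no substitute for capacity.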
Yeah, and it's the same with RT/PT.
I also have to add that the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk the generated frames make a big difference, from 64 fps to 112 or whatever it is. It isn't just giving higher numbers to fps overlays; you /do/ feel the difference and see it in action.
I do wonder if people slating it have actually used it.
Personally I don't mind the tech as long as it genuinely makes the game smoother to play. However, I don't like how Nvidia uses it to compare against older-gen cards that don't support the tech, just to inflate their numbers. Not every game supports it, so I prefer seeing the base numbers for a more realistic comparison.
I also have to add that the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk the generated frames make a big difference, from 64 fps to 112 or whatever it is. It isn't just giving higher numbers to fps overlays; you /do/ feel the difference and see it in action.
I do wonder if people slating it have actually used it.
DLSS and FG are game changers, but it's usually the AMD proponents who go around forums talking smack. It's been that way for the last 15 years on every damn forum. I still remember the AMD crowd going nuts about how insanely good the FX-8150 was. People never change...
I also have to add that the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk the generated frames make a big difference, from 64 fps to 112 or whatever it is. It isn't just giving higher numbers to fps overlays; you /do/ feel the difference and see it in action.
I do wonder if people slating it have actually used it.
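Since the 64 to 112 fps example keeps coming up, it helps to separate what frame generation changes from what it doesn't: the presented frame rate roughly doubles, but input is still sampled at the rendered rate, and the interpolated frame can't be shown until the next real frame exists. The sketch below is a toy model, not how DLSS 3 is actually implemented, and the 3 ms generation cost is a made-up figure:

```python
def with_frame_generation(rendered_fps: float, gen_cost_ms: float = 3.0):
    """Toy model, not a DLSS 3 measurement: one interpolated frame is
    inserted after every rendered frame, so the presented rate roughly
    doubles, while input response still tracks the (slightly slower)
    rendered rate. gen_cost_ms is a made-up per-frame generation cost."""
    frame_ms = 1000.0 / rendered_fps + gen_cost_ms  # generation steals some GPU time
    new_rendered_fps = 1000.0 / frame_ms
    presented_fps = 2 * new_rendered_fps
    # Interpolation needs the *next* real frame before showing the generated
    # one, so perceived latency grows by roughly one rendered frame time.
    added_latency_ms = frame_ms
    return presented_fps, new_rendered_fps, added_latency_ms

presented, rendered, extra_lag = with_frame_generation(64)
print(f"~{presented:.0f} fps presented, ~{rendered:.0f} fps rendered, "
      f"~+{extra_lag:.0f} ms latency (toy numbers)")
```

With 64 fps going in, this lands in the same ballpark as the 64 to ~112 figure quoted above, which is why both sides of the argument can be partly right: motion genuinely looks smoother, but the game doesn't respond the way a native 110+ fps title would.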
Can smell Jensen's desperation from here. Let's see who will shill his crap.
NVIDIA to give away 460 GeForce RTX 4060 Ti and RTX 4060 graphics cards - VideoCardz.com
NVIDIA announces "Summer of RTX", hundreds of RTX 40 cards to be given away. As part of today's introduction of the GeForce RTX 4060 series, NVIDIA announced it will give away hundreds of graphics cards to gamers and influencers during its 'Summer of RTX' campaign. The RTX 4060 is not the... (videocardz.com)
31 fps to 100 fps is not comparing apples to apples. So the 3080 Ti was running native and the 4090 was running DLSS 3? Is that a fair comparison now?
Come on mate, I thought better of you of all people on this forum.
I used DLSS 3 in MS Flight Simulator and hated it; it causes a lot of artifacts and weird issues with the text on the panels and in the game.
I'm not falling for Nvidia's tricks and games. I buy hardware from them, so I expect the hardware to give me the uplift I'm paying for, and paying more for, every generation, not software tricks that actually make the image worse in the titles I use and add other strange things. I'm not buying software from Nvidia apart from the drivers, and all these tricks should not be needed if the hardware was up to it in the first place; instead they now use tricks to show an uplift over previous gens rather than hardware.
Again, I pay Nvidia for hardware, so I expect these uplifts at native settings, which is how we have always measured uplift from gen to gen. Or should we go back to the tricks Nvidia and ATI/AMD played by reducing visual quality in benchmarks to make their hardware look better? Oh wait, we are doing that now and accepting it as the norm. Sorry, not falling for that and never will. Also, I use my GPU for more than gaming, so I don't need gaming tricks added; I need real hardware uplifts for real work (soon DLSS 3 for work apps... half the data goes missing but twice as fast...).
Nvidia accused of cheating in big-data performance test by benchmark's umpires: Workloads 'tweaked' to beat rivals in TPCx-BB
GPU giant says it'll play ball soon (www.theregister.com)
Nvidia accused of cheating in 3DMark 03
Futuremark, the maker of 3DMark 03, releases a patch for the graphics benchmark designed to correct the way Nvidia drivers manipulate results in the popular test. (www.gamespot.com)
Nvidia Points Finger at AMD's Image Quality Cheat
Are there shenanigans under the hood? (www.tomshardware.com)
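On the apples-to-apples point above: if you collate benchmark numbers yourself, one way to keep comparisons honest is to tag every result with how it was produced and only compare like with like. A minimal sketch of that idea; the data structure and the example entries are hypothetical, not taken from any review:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BenchResult:
    card: str
    fps: float
    upscaler: Optional[str] = None  # e.g. "DLSS Quality"; None means native
    frame_gen: bool = False

def comparable(a: BenchResult, b: BenchResult) -> bool:
    """Treat two results as comparable only if they were produced the same
    way: same upscaler setting and same frame-generation state."""
    return a.upscaler == b.upscaler and a.frame_gen == b.frame_gen

# Hypothetical entries, only to illustrate the tagging idea:
old_native = BenchResult("3080 Ti", 31)
new_fg     = BenchResult("4090", 112, upscaler="DLSS Quality", frame_gen=True)
new_native = BenchResult("4090", 64)

print(comparable(old_native, new_fg))      # False -> not apples to apples
print(comparable(old_native, new_native))  # True  -> a fair native comparison
```

Nothing stops you publishing the upscaled and frame-generated numbers too; the point is simply that they belong in their own column rather than next to native results.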