There was a 160W power difference between the 5080 and 7900 XTX though.
That's wild that a 7900XTX in raster can mostly match a 5080.
I'd be ashamed if I were an engineer at Nvidia: 2+ years, hundreds of millions, and basically just more AI junk.
There was a 160W power difference between the 5080 and 7900 XTX though.
Dying Light 2 had RT Ultra enabled in the video.
Different levels of RT depending on the game; lite RT isn't as taxing.
Other than in DL2, the minimum FPS is higher on the 7900 XTX in all of them, despite the averages being higher on the 5080. Perhaps that's because the 7900 XTX caches more data in VRAM rather than swapping files out to keep VRAM usage down, eh?
Because it turns out that even 16 GB graphics cards can quickly run into problems. Specifically, this applies to Ultra HD, whether ray tracing is activated or not: Radeon graphics cards with 16 GB of VRAM do not deliver maximum performance in Spider-Man 2, and the frame pacing becomes uneven. Only from 20 GB upwards does the game run with maximum graphics (without RT) free of problems. GeForce models have superior memory management, as usual; there, 16 GB is sufficient for maximum performance.
With only 12 GB, problems can also occur on a GeForce: in Ultra HD the Nvidia GPUs collapse with and without RT, so 16 GB is simply necessary here. For UWQHD, 12 GB is sufficient, but only as long as ray tracing remains switched off. Spider-Man 2 is playable with rays enabled, but full performance isn't there either. On a Radeon, 12 GB is not enough in UWQHD, regardless of whether ray tracing is active or not. In WQHD, 12 GB is sufficient on an AMD card.
In WQHD, GeForce graphics cards still manage a decent gaming experience with only 8 GB. The frame rate is slightly reduced, but the game flow remains okay. With a Radeon graphics card, 8 GB is no longer sufficient, at least not when the PCIe interface has also been cut down, as on the Radeon RX 7600.
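Condensing that quote into a quick lookup, here's a rough sketch; the thresholds are ComputerBase's, but the structure and names are just my own illustration (and it glosses over their RT-on/RT-off caveats, which are noted in the comments):

```python
# Rough summary of the Spider-Man 2 max-settings VRAM thresholds described
# in the ComputerBase quote above. Figures are theirs; the dict and function
# are illustrative only, and RT-specific caveats are simplified.

VRAM_NEEDED_GB = {
    # (vendor, resolution): minimum VRAM for trouble-free max settings
    ("GeForce", "UHD"): 16,    # 12 GB collapses with or without RT
    ("GeForce", "UWQHD"): 12,  # per the quote, enough only with RT off
    ("GeForce", "WQHD"): 8,    # slightly slowed, but still playable
    ("Radeon", "UHD"): 20,     # 16 GB still shows frame-pacing issues
    ("Radeon", "UWQHD"): 16,   # 12 GB not enough, RT on or off
    ("Radeon", "WQHD"): 12,    # 8 GB fails, esp. with a cut-down PCIe bus
}

def enough_vram(vendor: str, resolution: str, vram_gb: int) -> bool:
    """True if the card clears the quoted threshold for that scenario."""
    return vram_gb >= VRAM_NEEDED_GB[(vendor, resolution)]

print(enough_vram("Radeon", "UHD", 16))   # False: the quote says 20 GB needed
print(enough_vram("GeForce", "UHD", 16))  # True: Nvidia's memory management copes
```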
What, Cyberpunk isn't everything?? There are RT games outside of Cyberpunk? I thought all this stuff that wasn't Cyberpunk RT was fake RT?
You miss the point though. I agree the card doesn't have any punch compared to something like a 7900 XTX or 4080/5080, but still, you can play games on it just fine at 70-80+ fps at 1440p and decent settings, unless it's a POS-optimized game. So my point stands: gaming isn't dead, far from it. Our interest in cool, powerful hardware, though, is certainly under pressure.
Intel B580? Pretty damn meh of a card.
Or...
Spider-Man 2 im Benchmark-Test: Benchmarks (WQHD, UWQHD & UHD), Frametimes und VRAM (www.computerbase.de)
Spider-Man 2 Performance Benchmark
The 12GB and the 16GB cards (4070, 4080 etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM. I don't see the point of having a truckload of VRAM when the actual arch does not let you run demanding RT or PT. And I repeat that more and more games... (www.techpowerup.com)
There was a 160W power difference between the 5080 and 7900 XTX though.
I honestly find it hilarious that people go on about power draw.
Buys a £1000 graphics card but cries about spending an extra £20 a year on electricity. If you can't afford to power the card, maybe you shouldn't be buying cards in that tier in the first place (not directed at you, just a general thing).
https://www.theenergyshop.com/guides/electricity-cost-calculator lets you work it out easily: 160 watts is about £260 a year, ASSUMING you spend 100% of the time at full power draw, every hour, every second, every millisecond.
That's just not going to happen.
Now let's say someone plays 4 hours a day, i.e. 28 hours a week (I should point out the average game time per week is 8 hours). That's £46.37 a year, again assuming you draw 100% power for 100% of that time... which, again, does not happen in a gaming session; it depends on the game, what you're doing, and the environment.
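If you want to check the sums yourself, here's a minimal sketch, assuming a unit rate of about 19p/kWh (the figures above imply somewhere in the 18.5-20p range; swap in the rate from your own bill):

```python
# Back-of-envelope check of the figures above. The ~19p/kWh unit rate is an
# assumption inferred from the quoted numbers, not an official tariff.

EXTRA_WATTS = 160          # the power difference being argued about
RATE_GBP_PER_KWH = 0.19    # assumed UK unit rate; use your own bill's rate

def annual_cost(watts: float, hours_per_day: float, rate_gbp_per_kwh: float) -> float:
    """GBP per year for drawing `watts` for `hours_per_day`, every day."""
    kwh_per_year = (watts / 1000.0) * hours_per_day * 365
    return kwh_per_year * rate_gbp_per_kwh

# Worst case: pinned at full draw 24/7 (never happens in practice).
print(f"24/7 at full draw:   £{annual_cost(EXTRA_WATTS, 24, RATE_GBP_PER_KWH):.2f}")
# 4 hours of gaming a day, still generously assuming 100% draw throughout.
print(f"4 h/day at full draw: £{annual_cost(EXTRA_WATTS, 4, RATE_GBP_PER_KWH):.2f}")
```

That prints roughly £266 and £44 at the assumed rate, in line with the £260 and £46.37 figures above.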
A generous assumption is that it costs £20 more a year to run, which is not even worth talking about when you're spending £1000 on a card. If you can't afford £20 of electricity, buy a cheaper card and set your priorities right.
It's just that at the level we're talking about, this shouldn't even be part of the conversation. The cost difference is negligible to non-existent if you are a light gamer, and barely noticeable if you are a medium gamer. Even as a heavy gamer you would need to be clocking 8 hours a day for the difference to matter, and EVEN THEN you would have to keep the card for at least 4 years or more before the running costs offset the difference in purchase price.
This is all based on UK electricity costs, and we have some of the highest; in America and Canada costs are lower, so the difference is even narrower.
I wish people would move away from the focus on power; performance should be the selling point.
1) I wouldn't spend £1000 on a GPU
2) If I were to get a high power consumption GPU I'd need to buy a new PSU as well.
3) You can pay my leccy bill if you want
Power = heat.
Heat = noise for cooling.
That should be the discussion point (beyond ‘meltageddon’).
I don’t think anyone that’s seriously considering spanking £££££ on one of these cards is being turned off by leccy billz. For the target audience, it’s such a non-point.
I'd want to keep GPU power consumption to a sensible amount; I'm talking efficiency.