
*** The AMD RDNA 4 Rumour Mill ***

That's wild that a 7900 XTX in raster can mostly match a 5080.

I'd be ashamed if I were an engineer at Nvidia: 2+ years, hundreds of millions spent, and basically just more AI junk.

Other than in DL2, the minimum FPS are higher on the 7900 XTX in all of them despite the averages being higher on the 5080. Perhaps that's because the 7900 XTX is caching more data in VRAM to avoid swapping, rather than keeping VRAM usage down, eh? ;):D
 
There was a 160W power difference between the 5080 and the 7900 XTX though.

There is, and not to take anything away from that, but the 7900 XTX is on TSMC 5nm while the 5080 is on an enhanced 4nm node.

According to TSMC, the 4nm node Nvidia are using is 22% more efficient than the 5nm node RDNA 3 is using. 22%. :)
 
It is a bit of an eye opener that with Ultra RT settings the median FPS are 103 vs 89 in the 5080's favour; that's +16%, a very real difference, but considering how 'bad' RDNA 3 is commonly held to be at RT, it's impressive.
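For anyone wanting to sanity-check the maths, a quick sketch (the helper name is just illustrative) that reproduces the roughly +16% from those two medians:

```python
# Sanity check of the +16% figure from the two quoted median FPS values.
def percent_faster(a: float, b: float) -> float:
    """How much faster a is than b, as a percentage."""
    return (a / b - 1) * 100

median_fps_5080 = 103      # median FPS quoted for the 5080 at Ultra RT
median_fps_7900xtx = 89    # median FPS quoted for the 7900 XTX at Ultra RT

print(f"5080 lead: {percent_faster(median_fps_5080, median_fps_7900xtx):.1f}%")
# Prints "5080 lead: 15.7%", i.e. the roughly +16% mentioned above.
```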
 
I've been messing with my new 7900 XTX for the last week. TBH, at 4K everything I've tried runs native 60+ without frame gen or scaling. I'm using Lossless Scaling with it to do 2x/3x frame gen (no scaling) to hit 4K 144Hz and it's amazing; this even works well in Cyberpunk with most ray tracing settings on (although I have to use FSR/XeSS Quality).
 
Other than in DL2, the minimum FPS are higher on the 7900 XTX in all of them despite the averages being higher on the 5080. Perhaps that's because the 7900 XTX is caching more data in VRAM to avoid swapping, rather than keeping VRAM usage down, eh? ;):D

Or...



Because it turns out that even with 16 GB graphics cards, problems can still happen quickly. More precisely, this applies to Ultra HD, whether ray tracing is activated or not: Radeon graphics cards with 16 GB of VRAM do not deliver their maximum performance in Spider-Man 2, and the frame pacing becomes inconsistent. Only from 20 GB does the game run with maximum graphics (without RT) without any problems. GeForce models are, as usual, better at memory management; there 16 GB is sufficient for maximum performance.

With only 12 GB, problems can also occur on a GeForce. In Ultra HD the Nvidia GPUs drop off sharply both with and without RT; 16 GB is simply necessary here. For UWQHD, 12 GB is sufficient, but only as long as ray tracing remains switched off. Spider-Man 2 is playable with the rays enabled, but full performance is not there either. On a Radeon, 12 GB is not enough in UWQHD, regardless of whether ray tracing is active or not. In WQHD, 12 GB is sufficient on an AMD product.

In WQHD, GeForce graphics cards still manage a decent gaming experience with only 8 GB. The frame rate is slightly reduced, but the game flow remains okay. With a Radeon graphics card, on the other hand, 8 GB is no longer sufficient, at least not if the PCIe interface has been cut down at the same time, as on the Radeon RX 7600.
 
Intel B580? Pretty damn meh of a card.
You miss the point though. I agree the card doesn't have any punch compared to something like a 7900 XTX or 4080/5080, but still, you can play games on it just fine at 70-80+ fps at 1440p and decent settings unless it's a POS-optimised game. So my point stands: gaming isn't dead, far from it. Our interest in cool, powerful hardware, though, is certainly under pressure.
 
You miss the point though. I agree the card doesn't have any punch compared to something like a 7900 XTX or 4080/5080, but still, you can play games on it just fine at 70-80+ fps at 1440p and decent settings unless it's a POS-optimised game. So my point stands: gaming isn't dead, far from it. Our interest in cool, powerful hardware, though, is certainly under pressure.

I don't expect it to be 7800 XT standard, but it's around 4060 performance. Worse, because of the CPU overhead/drivers/DX9.

It's not good enough for the asking price. Sort the CPU overhead, increase performance by ~20%, and then I'll consider it.
 
Or...



I think it was HUB who investigated something quite similar to this. What happens with Nvidia GPUs when they run out of VRAM is that the texture resolution is much reduced; the textures are visibly blurry in comparison to a GPU with more VRAM. So it doesn't necessarily manifest itself on the performance charts, and it can look like the GPU is using less VRAM, because it is, since it's running at a reduced texture quality. You would actually have to compare the visuals of the game to see it's not as simple as 'number lower'.
Just looking at the numbers and concluding "Nvidia better at managing VRAM" is, in a sense, true. The question is the side effects of that, and whether it's a reasonable argument versus just having a larger VRAM pool. I would argue it isn't, but Nvidia always find a way of giving you less and then somehow have their fans die on the hill of advocating that less is better.
It's quite amazing actually.
 
The thing is @TNA, I have a 16GB GPU; I don't worry about any of this. I'm blissful in my ignorance that my GPU might not be as good as a 4070 at managing VRAM... because it doesn't ******* need to. I don't need excuses for it!
 
There was a 160W power difference between the 5080 and the 7900 XTX though.

I honestly find it hilarious that people go on about power draw.

Buys a £1000 graphics card but cries about spending an extra £20 a year on electricity. If you can't afford to power the card, maybe you shouldn't be buying cards in that tier in the first place (not directed at you, just a general thing).

https://www.theenergyshop.com/guides/electricity-cost-calculator - you can work it out easily. 160 watts is £260 a year ASSUMING you spend 100% of the time at full power draw: every hour, every second, every millisecond at 100% power draw at all times.

That's just not going to happen.

Now let's say someone plays 4 hours a day (28 hours a week; I should point out the average game time per week is 8 hours). That's £46.37 a year, again assuming that 100% of the time you are drawing 100% power... which, again, does not happen in a gaming session; it depends on the game, what you're doing, the environment.
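To make the sums explicit, here's a rough back-of-the-envelope sketch. The ~20p/kWh unit price is my own assumption (roughly a UK tariff), so the totals won't exactly match the figures above; plug in your actual rate:

```python
# Back-of-the-envelope annual cost of an extra 160 W of GPU power draw.
# The unit price below is an assumption (~20p/kWh, roughly a UK tariff);
# plug in your own rate from your bill or the calculator linked above.
PRICE_PER_KWH = 0.20  # £ per kWh (assumed)

def annual_cost(extra_watts: float, hours_per_day: float) -> float:
    """Yearly cost in £ of drawing extra_watts for hours_per_day, every day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"24/7 at full draw:      £{annual_cost(160, 24):.2f} per year")  # worst case, never happens
print(f"4 hours/day, full draw: £{annual_cost(160, 4):.2f} per year")
# Roughly £280 and £47 respectively at the assumed rate, and real gaming
# load rarely sits at 100% power for a whole session, so the true figure
# for the 4-hour case is lower still.
```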

A generous assumption is that it costs £20 more a year to run, which is not worth even talking about when you're spending £1000 on a card. If you can't afford £20 on electricity, buy a cheaper card and set your priorities right.

It's just that when you're talking about the level we are, this shouldn't even be part of the conversation. The cost difference is negligible to non-existent if you are a light gamer and barely noticeable if you are a medium gamer. Even if you are a heavy gamer, you would need to be clocking 8 hours a day to warrant the difference, and EVEN THEN you would have to keep the card for at least 4 years or more for the difference in card prices to work out as a saving.

This is all based on UK electricity costs, and we have some of the higher costs; go to America or Canada and their costs are even lower, so the difference is even narrower.

I wish people would move away from the focus on power; performance should be the selling point.
 
I honestly find it hilarious that people go on about power draw.

Buys a £1000 graphics card but cries about spending an extra £20 a year on electricity. If you can't afford to power the card, maybe you shouldn't be buying cards in that tier in the first place (not directed at you, just a general thing).

https://www.theenergyshop.com/guides/electricity-cost-calculator - you can work it out easily. 160 watts is £260 a year ASSUMING you spend 100% of the time at full power draw: every hour, every second, every millisecond at 100% power draw at all times.

That's just not going to happen.

Now let's say someone plays 4 hours a day (28 hours a week; I should point out the average game time per week is 8 hours). That's £46.37 a year, again assuming that 100% of the time you are drawing 100% power... which, again, does not happen in a gaming session; it depends on the game, what you're doing, the environment.

A generous assumption is that it costs £20 more a year to run, which is not worth even talking about when you're spending £1000 on a card. If you can't afford £20 on electricity, buy a cheaper card and set your priorities right.

It's just that when you're talking about the level we are, this shouldn't even be part of the conversation. The cost difference is negligible to non-existent if you are a light gamer and barely noticeable if you are a medium gamer. Even if you are a heavy gamer, you would need to be clocking 8 hours a day to warrant the difference, and EVEN THEN you would have to keep the card for at least 4 years or more for the difference in card prices to work out as a saving.

This is all based on UK electricity costs, and we have some of the higher costs; go to America or Canada and their costs are even lower, so the difference is even narrower.

I wish people would move away from the focus on power; performance should be the selling point.

1) I wouldn't spend £1000 on a GPU
2) If I were to get a high power consumption GPU I'd need to buy a new PSU as well.
3) You can pay my leccy bill if you want
 
1) I wouldn't spend £1000 on a GPU
2) If I were to get a high power consumption GPU I'd need to buy a new PSU as well.
3) You can pay my leccy bill if you want

Then the power draw of this tier of cards has no relevance to you; like I said, it's not aimed at you directly, just in general.

If you're worried about your bills, I'd advise turning things off at the wall; phantom power adds up ;)
 
Power = heat.

Heat = noise for cooling.

That should be the discussion point (beyond ‘meltageddon’).

I don’t think anyone that’s seriously considering spanking £££££ on one of these cards is being turned off by leccy billz. For the target audience, it’s such a non-point.
 
Power = heat.

Heat = noise for cooling.

That should be the discussion point (beyond ‘meltageddon’).

I don’t think anyone that’s seriously considering spanking £££££ on one of these cards is being turned off by leccy billz. For the target audience, it’s such a non-point.

I'd want to keep GPU power consumption to a sensible amount; I'm talking efficiency.
 