
Alder Lake + 6900XT and RTX 3090, heaven or hell?

Thought I would try it out as it is on U+

My 5950X hasn't got enough grunt to load the 3080 Ti; it sits at ~70% GPU utilisation with a 93 min, 133 avg and 183 max. I'd imagine if I had a similarly tuned ADL system it'd be right up there, but I'm on W10 and perhaps W11 is still a bit sucky for AMD.
 
Sure, they're testing the CPU, but how relevant is it for most users?

Exactly this.

I cannot for the life of me see how this is in any way relevant to ANYONE at either end of the spectrum, or anywhere else.

If you have a 3090 or 6900xt or 12900K (or anything else in that chart above) you will not be doing ANY gaming at 720p. And this is a gaming test/benchmark result, nothing else, just gaming.

If you are in the minority that games at 720p (then you're probably on a laptop), then surely this is irrelevant to you anyway.

Could someone please explain to me who would care about this and why? And who with top-tier hardware of any sort wants to game at 720p?
Don't get me wrong, I can see the point that's being made, I just cannot see how it's of any use :confused:

It's like saying I've got a Ferrari that's really poo in Morrisons car park, but the Porsche is much better because it has a tighter turning circle... Neither car owner is going to give a monkey's about Morrisons car park! :D
 
It is an odd one. At 720p, with my 3080 Ti overclocked so it is in its least efficient state, the card consumes ~190 W because the CPU can't feed the GPU; set it to 4K, making the GPU earn its living doing what I bought it for, and surprise surprise it is using ~390 W.
 

I want to know how much future GPU horsepower I can buy before I need to upgrade a given CPU.

If they had a time machine, they could just go into the future, grab a 4090, 5090, 6090, 7900XT, 8900XT, and a 9900XT, then come back and show me which *current CPUs* hold back which *future GPUs* and by how much.

Or they could just simulate future GPU horsepower by running current-gen GPUs at 720p.
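Put roughly (a toy bottleneck model, not anyone's actual benchmark): a frame can't be delivered faster than the slower of the CPU and GPU stages, so shrinking the GPU's share of the work at 720p exposes the same CPU ceiling that a much faster future GPU would run into at 4K. A minimal sketch with made-up numbers:

```python
# Toy model: fps is roughly capped by whichever of the CPU or GPU takes
# longer per frame. All numbers below are hypothetical, for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frame rate when the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0            # CPU needs 5 ms per frame -> a 200 fps ceiling
gpu_4k_ms = 10.0        # today's GPU at 4K
gpu_720p_ms = 2.5       # the same GPU at 720p
gpu_future_4k_ms = 2.5  # a hypothetical future GPU at 4K

print(fps(cpu_ms, gpu_4k_ms))         # 100 fps: GPU-bound today at 4K
print(fps(cpu_ms, gpu_720p_ms))       # 200 fps: the CPU ceiling, exposed at 720p
print(fps(cpu_ms, gpu_future_4k_ms))  # 200 fps: same ceiling with a faster GPU at 4K
```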
 

And this 720p testing doesn't answer that.

The sort of people that game on top-tier hardware aren't going to be bothered by the cost of a CPU that costs a fraction of high-end GPUs in, say, five years' time. And even if they are, the fps shown in the charts are pretty high anyway.
 

That statement is a bit wrong for me. I buy high-end GPUs and could keep the same CPU for years, because typically you are using a good GPU to push pixels at high res, where the CPU only affects your minimums in a very small set of scenarios. The gaming performance improvement going from Ryzen gen 1 to gen 5 is almost non-existent at 4K, so putting the best one in at the start would mean you could validly keep it going for years, as a new CPU really gives you nothing.

I have gone from a 1900X -> 2920X -> 5950X for non-gaming reasons, all with 1080 Tis, and there was negligible gaming improvement, whereas dropping in a new GPU is a massive bump. Running SLI on the 1080 Tis was more cost-effective for me than a CPU upgrade from a gaming perspective, even if it didn't work in everything.

At lower resolutions, where people are shooting for high fps, the CPU becomes important because you have to overcome stuff like Nvidia driver overheads by brute-forcing it. That's where tests like this help show which CPU is strongest in actual game engines rather than Cinebench.
 
You could be right, and certainly it's not an apples-to-apples comparison; my problem is I don't have the GPU muscle to make it CPU-bound at Ultra settings, even at 720p.

720p Ultra, but this is almost entirely GPU-bound.

[Screenshot: 720p Ultra benchmark result]
The game isn't optimised as well for Intel CPUs. If you watch several YT videos, the 10900K loses very badly to the 5900X, by a huge margin.
 

Wait, what? Really? Far Cry has always been bad on AMD vs Intel. I'm not questioning it, I'm just genuinely surprised at that :)
 
RDNA2 + 8 Alder Lake P-cores with Hyper-Threading looks pretty decent for gaming. I wouldn't be surprised to see Intel release a 12800K without the E-cores.
 