A note about RT: "+50%" puts it right about here, around 46 FPS, and Cyberpunk is one of the worst titles for AMD.
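For what it's worth, a minimal sketch of that arithmetic, assuming the 46 FPS figure really is "+50%" on top of the current card's Cyberpunk RT result (the ~31 FPS baseline is inferred by working backwards, not a published number):

```python
# Quick sanity check of the "+50% RT" claim for Cyberpunk.
# Assumption: ~31 FPS baseline for the current card's RT result,
# inferred by working backwards from the quoted 46 FPS figure.
baseline_fps = 31
uplift = 0.50                        # the claimed "+50%" RT gain
projected_fps = baseline_fps * (1 + uplift)
print(f"{baseline_fps} FPS +50% -> {projected_fps:.1f} FPS")  # ~46.5 FPS
```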
"Depending on price that might actually be worth upgrading from an RX 7800 XT."
Yep, I have the 6900 XT which I use at 1440p, so the numbers are looking positive. I'll still hold off for the reviews and some proper figures. Not long to go.
Nice that they upped RT performance, but I'm disappointed they didn't shoot a little higher than the 7900 GRE. I was hoping for a little competition for my monies, but I guess it's all eyes on Nvidia.
"The train can only crash if they price it wrong. If they get that right it will sell well and gain them market share."
Intel have shown that this is exactly right.
"Depending on price that might actually be worth upgrading from an RX 7800 XT."
All depends on price and how much you'll be able to sell your current card for in Jan/Feb, but it's a reasonable upgrade path for those on a 7800 XT or 4070 -- my guess is this is the market AMD is targeting most. ~20% raster improvement, much better RT.
I think with every release we always say "if priced right".
This is key.
I don't. The 6800 XT has 4608 shaders vs 3840 on the 7800 XT, and the latter is 5% faster despite the former having 20% more shaders; that's about a 25% difference per shader, and they are clocked about the same. Actually the 7800 XT does have 625 GB/s of memory bandwidth vs 512 GB/s on the 6800 XT, but I'm not sure that matters much at 1440p where I'm measuring this.
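A rough sketch of that per-shader sum, plugging in the shader counts and the ~5% gap quoted above (these are the figures from the post, not fresh benchmarks):

```python
# Per-shader comparison: RX 6800 XT vs RX 7800 XT, using the figures above.
shaders_6800xt = 4608
shaders_7800xt = 3840
perf_ratio = 1.05                                # 7800 XT ~5% faster overall (as quoted)

shader_ratio = shaders_6800xt / shaders_7800xt   # 1.20 -> 20% more shaders
per_shader_gain = perf_ratio * shader_ratio - 1  # ~0.26
print(f"6800 XT has {shader_ratio - 1:.0%} more shaders")
print(f"7800 XT does roughly {per_shader_gain:.0%} more work per shader")
```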
With 7% more shaders (64 CUs vs 60) and 3% more memory bandwidth (645 GB/s vs 625 GB/s), about 10% higher performance looks like it's on the nose. I agree, and I'm not saying that's wrong; all I'm saying is you can't compare one shader to another. The difference between RDNA 1, RDNA 2 and RDNA 3 shaders proves that: RDNA 3 is 2x the performance of RDNA 1 with only 50% more shaders.
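And the same kind of back-of-the-envelope scaling for the rumoured 64 CU part vs the 7800 XT, using the CU counts, bandwidth figures and ~10% performance claim quoted above (all thread numbers, nothing measured here):

```python
# Spec-scaling sketch: rumoured 64 CU part vs RX 7800 XT (60 CUs).
# Bandwidth figures and the ~10% performance claim are from the posts above.
cu_new, cu_old = 64, 60
bw_new, bw_old = 645, 625            # GB/s

cu_gain = cu_new / cu_old - 1        # ~6.7% more CUs
bw_gain = bw_new / bw_old - 1        # ~3.2% more bandwidth
print(f"CUs: +{cu_gain:.1%}, bandwidth: +{bw_gain:.1%}")

# If the overall gain is ~10%, the extra CUs alone (~7%) don't cover it,
# so a small per-CU (clock/architecture) uplift is implied:
implied_per_cu = 1.10 / (1 + cu_gain) - 1
print(f"implied per-CU uplift at +10% overall: ~{implied_per_cu:.1%}")

# The RDNA 1 -> RDNA 3 point above: ~2x the performance with ~1.5x the
# shaders is ~33% more work per shader, hence "you can't compare one
# shader to another" across architectures.
rdna3_per_shader = 2.0 / 1.5 - 1
print(f"RDNA 3 vs RDNA 1 per-shader: ~+{rdna3_per_shader:.0%}")
```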
If it's a 7900 GRE with better ray tracing, it needs to be under $450.
And every release AMD overprices its GPUs at launch. Will this time be any different?
"And every release AMD overprices its GPUs at launch. Will this time be any different?"
Tbf, the GRE was priced reasonably well.
Clock speeds. They are definitely not clocked the same.
"Tbf, the GRE was priced reasonably well."
After a price drop. I remember when it launched in China, no-one here wanted it.
I remember thinking the same with the 7900 XT. I was hoping for £700 at most. AMD got greedy… or stupid.
"After a price drop. I remember when it launched in China, no-one here wanted it."
Yeah, I remember initial responses were a bit meh, but that changed quite quickly.