
*** The AMD RDNA 4 Rumour Mill ***

I hate long rambling posts that are full of half-truths; it's very tiring replying to them properly, so I'm going to ignore all of it and just say you misunderstand my agenda and reasoning. Radeon are the walking dead, there is no winning for them no matter what they do, and here is the kicker: I'm at the stage now where I think it would be quite interesting if AMD bowed out, if Nvidia were on their own without the token competition to help them pretend there is some.

When I talk about the positives of AMD's GPUs I do that because I believe it, not because I'm trying to make a loser look like a winner. They are losers; I can see that without people emphasising it whenever they see someone say anything even slightly positive about AMD's GPUs.

Edit: all that is a bit uncalled for on my part, sorry. I am tired, and I honestly don't care about GPU wars at all; there isn't one any more, Nvidia have won outright, so I just don't care now...

I'll buy whoever's GPU for as long as I think it's worth it, which might not be for much longer. The 4060 Ti 16GB is a joke, so is the 4070 still at £500, and the 4070 S even more so... Why? Because it gets 24 FPS instead of 17 in Silent Hill 2? You're posting that slide with a serious face. Eh... **** it.
 
Last edited:
I did say there that 24 FPS turns into 60+ FPS much more easily, be that with RT or not, and it's much easier/better on the competition. Surely you see the benefit of that in something like SC/SQ42?

They are using AMD's white paper on RT, and AMD GPUs seem to handle the volumetric clouds much better than Nvidia's.

IDK, but they do talk about the importance of people being able to actually run RT, or what's the point?
 
Any ideas when the 7800 XT and 7700 XT will be reduced in price?

They already have; I mean, I paid £480 for one of these, and it's £420 now.


7700 XT is now £360, down from about £430.


If you're looking for further reductions, I don't know, but I can't see them coming down by much more, if at all. They are selling quite well now, and when the replacement GPUs land next year they are going to be £550, maybe £600...

I would buy now. When stock runs low they might go back up in price.
 
I mentioned it in the context of AMD GPU development, as what the console does gives us the best insight into what we can expect from the next RDNA GPUs, so let's not go off-topic into PC vs consoles. But let's be clear: a PC would wipe the floor with a console from a value perspective, and it's not even close (and without a single used component, because then it would just be plain unfair for the poor console). The only advantage for a console is simplicity of use, nothing more.

Funny how you forgot to mention the £50/year you have to pay for basic functions on consoles. So if we assume you keep it for 4 years, until the PS6, that's another £200 you have to add to the cost of the PS5P.
Feel free to add another £30 for M/KB + an OS key (even though it can be got for free, legally) to the PC. Still lower than the console, never mind over a longer timeframe.
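To make the running-cost comparison above concrete, here's a minimal sketch. The £50/year subscription, 4-year lifetime, £700 console price and £30 for M/KB + OS key are the figures assumed in this post; the PC build price is a hypothetical placeholder, not a real quote:

```python
# Rough 4-year cost-of-ownership comparison using the figures assumed above.
SUB_PER_YEAR = 50     # £/year for online play on the console (assumed)
YEARS = 4             # assumed time until the next console generation
CONSOLE_PRICE = 700   # assumed PS5 Pro price in £
PC_PRICE = 850        # hypothetical PC build price in £ (placeholder)
PC_EXTRAS = 30        # mouse/keyboard + OS key (assumed)

console_total = CONSOLE_PRICE + SUB_PER_YEAR * YEARS
pc_total = PC_PRICE + PC_EXTRAS

print(f"Console over {YEARS} years: £{console_total}")
print(f"PC over {YEARS} years:      £{pc_total}")
```

The point being made is that the subscription closes most of the sticker-price gap over the console's lifetime.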

[image: 86hDXNo.jpg]

You built a good system there for that money, but that 4060 Ti is not going to cut it vs the PS5 Pro. Linus made the same argument and built a system for about the same money, with a used case and an RX 7800 XT; even with that he struggled to keep up with the PS5 Pro's performance just in raster, and RT blew it out of the water. In raster the RX 7800 XT is a solid 40% faster than the 4060 Ti, and it's even 15% better in RT... So yeah, if an RX 7800 XT can't match a PS5 Pro, a 4060 Ti is way out of it.

You need a 4070 S, at least. That's £550 minimum just for the GPU.
 
Actually, it's about on par with the 2080 Ti. Realistically it's even worse than that, because on PC you can tweak it better, and that doesn't include the RT differential or the DLSS quality advantage. So I'm being generous with the 4060 Ti, in fact (I'd have chosen the 4060 if not for the 8 GB VRAM). After January it's going to be even worse for the PS5P.

[image: lsvdLjZ.jpg]


Source

I'll be honest with you: for some time now I haven't trusted anything DF say when it comes to anything involving Nvidia. Take this, for example: they are basing their entire argument on a single game. I think it's probably the only game they could find that suits the argument they wanted to make.

And you're parroting their argument, based on a single game.
 
It doesn't sound like either of you watched the video. They are looking at one game, and it's established that Alan Wake 2 is one of the worst PS5 Pro games so far; the PS5 Pro is underperforming in this game for some reason. The image quality of PSSR in this game is poor compared to other games that use it. Remedy dropped the ball on this one; this is more of a developer issue than a PS5 Pro issue.


That being said, my personal opinion of the machine has not changed: it's way overpriced, and if you are a PC gamer whose PC needs an upgrade, spend that money on a new GPU instead of a PS5 Pro, because a $700 GPU will smash the console.
You're right, I don't :)
 
People have been over this a few times. When it comes to wafer supply over the last few years, AMD's priority has been:
1.) CPUs
2.) Consoles
3.) Enterprise dGPUs
4.) Consumer dGPUs

With Nvidia:
1.) Enterprise dGPUs
2.) Consumer dGPUs
3.) Tegra

AMD simply do not produce enough dGPUs, and it's obvious they target more of the DIY market. The last time AMD had a truly mass-market dGPU was Polaris, which was produced at GlobalFoundries.

When it comes to prebuilt PCs, even in the UK, which has a good supply of AMD cards, it's cheaper to specify Nvidia cards.

You can get an RTX 4070 Super for the price of an RX 7800 XT, or an RTX 4060 for the price of an RX 7600, etc., if you configure builds. It's pointless choosing an AMD card when Nvidia is better value. But many companies don't even offer AMD options, and with laptops it's even worse; IIRC, Nvidia has well over 90% of that market. Compare that to the CPU market. So unless AMD bother to allocate more wafers to dGPUs, Nvidia will win by default.

If anything, with Strix Halo it seems they are more interested in building a faster IGP now.

Not entirely true, this: for the last decade AMD have had a problem with accumulated EOL dGPU stock. They are making more than they can sell.

Also....

£420 vs £530: £110 more.

 
I have that 7800 XT I linked; in terms of build quality and cooler performance it's the best card I have ever owned. The previous card, an MSI 2070 S, had a cooler twice the size and weight with much larger fans, on a 220-watt GPU vs this one's 240 watts, and that cooler was less effective and louder.

With a few clicks I can get +15% actual performance out of it, easily 24/7 usable and still cool and quiet, which puts it on par with an RTX 4070 Ti, still a £700 GPU. Out of the box, at stock, it's also 15% faster in RT than the 4060 Ti 16GB, which costs the same, and 40% faster in raster.

The 4070 S has no business being over £500, the 4060 Ti 16GB has no business being over £400, and the 4070 Ti has no business being over £700, but they are... because everyone thinks AMD cards are not worth anything, and perpetuating that mindset will change nothing.
 
People don't buy AMD dGPUs from OEMs just as they don't in retail. A 7800 XT-specced OEM box costing the same as a 4070 S one is not necessarily AMD's doing; do you think it is? I'll be honest, I don't. Why would they do that?

I think AMD would rather be competitive with Nvidia; I think they prioritise consoles because they can't sell dGPUs. It's necessity rather than choice. As I said, AMD are constantly left with EOL dGPUs they can't sell, regularly writing them off as a loss.

IMO they should have given up a long time ago and redirected those resources to one of the many things they are good at, but they keep trying. For generations, I think, they have been spending far more on GPU R&D than they get back in net sales; they are pulling money from profitable parts of the business to prop up dGPUs. Does that sound like a company that cares least about dGPUs?

How do they turn this round? By matching Nvidia in RT and features. Do I believe they can do that? No... I think AMD will catch up in RT and upscaling tech, but Nvidia have 10X the R&D budget to draw on, and by the time AMD catch up with Nvidia as they are today, Nvidia will have something entirely new that every tech journo will tell everyone is the reason not to buy AMD.
It's why I think AMD should throw in the towel and redirect that money.

What are they actually going to do? That ^^^, and concentrate on a narrower, more focused part of the market. Good luck...
 
Nvidia already have that. ^^^^

Exactly why I think it is a ludicrous idea that Nvidia would deliberately let AMD have access to markets Nvidia worked hard to win.

I agree AMD should stand on their own merits, but also realise that not even AMD can work miracles as the underdog. The price of our GPUs is only on one trajectory, up, and that is a choice made by Nvidia.
 
What they need to do is stop trying to compete on Nvidia's own terms; the DIY market is fickle. AMD sell far more CPU platforms in desktops and laptops now. Make some good OEM-focused products and integrate them tightly with their CPU products as a bundle. Ask what the DIY PC builders want; just look at the handheld gaming PC APUs AMD have made and how popular they are.

Even go to a company such as Samsung, which has spare capacity, to make these cards, and use the platform-integration advantages AMD will have over Nvidia. Trying to go after the high end only makes sense if AMD can compete in performance and feature set.

Nvidia isn't standing still in the CPU department either.

AMD were attempting something different with Strix Halo, but it seems to have been delayed by nearly a year. That is something else they need to fix; they have far more resources now as a company. I am surprised they didn't make something like Strix Halo earlier, especially with chiplets and 3D V-Cache. They could have done this during the Zen 2 days.

AMD are soon to bring out APUs as fast as mid-level dGPUs; that is a strategy, and yes, Nvidia have already seen it coming. Never mind AMD being a mortal threat to Intel, and Qualcomm being an emerging threat to x86: Nvidia are a mortal threat to all of it. They wish to be the ones monopolising the entire space, and in time they will. Congratulations, we got exactly what we asked for, whether we were aware of it or not.

@gpuerrilla you're now paying £750 for a ##70 class card. Let that sit in your mind for a while, and then question why everyone is talking as though AMD are the only ones who can do something about that.
 
Nvidia systematically destroyed what was a very rich and vibrant GPU industry; I'm not exaggerating, look at the history. We cheered them on... AMD are the sole survivors of Nvidia's efforts, and now Nvidia allow AMD to exist to give the impression of competition.

AMD can't do anything about Nvidia.
 
Intel are the walking dead, and AMD will not survive an onslaught by Nvidia, K15 or no K15. AMD think their best strategy is to team up with Intel on x86; the problem for them with that is that AMD need to back off a bit, or they will inadvertently kill Intel off.
 
The problem with being labelled "the budget brand" is that, no matter how unintentional, it is derogatory. AMD hate that label because the truth is that for a lot of people being seen as "budget" is a huge put-off; no one wants to be seen owning a "budget" anything. This is why AMD rage against it, and people like Steve Walton would do well to employ some critical thinking once in a while before opening his stupid mouth; he's not doing the stereotypical view of Australians a good service.
 
The only response I can give you is that, funnily enough, AMD are held to account to do something about the prices, yet all you have to do is frequent here to see that even if the product is decent, people will buy the same old Nvidia equivalent anyway. It's why the joke always was "...so I can get my Nvidia card cheaper"! :)

It is up to the consumer to educate themselves and work it out. You won't be able to win over many folk and shouldn't expect to; the danger, though, is that they dip out. The 7800 now being £430 is OK, so in the scheme of things it is offering something to people who would otherwise buy the ##70 class card?

My mate, who bought a PC at the start of Covid, is upgrading right now, and because he already had an AMD GPU in there he asked me if the 7800 XT was his fit. This alone has good merit, and judging by your own experience of the 7800 XT, it's really a viable alternative to the 60/70s.

It's what, 20% cheaper than the 4070 S... For all that it lacks vs that card, I think it works out as the better option. If you really want or need the faster RT and DLSS, then pay 20% more; if not, it's a really good card.

The problem with expecting AMD to always be much cheaper than Nvidia, no matter what, is that this is exactly how Nvidia killed ATI. If Steve Walton thinks AMD should always be 40% cheaper, then pretty soon you don't have an AMD, because at that point you're selling at a loss; and if Nvidia chose to really make that hurt AMD, Steve Walton has given them the tools to do it, because the man is a blunt idiot with no critical thinking ability.
 
What I gleaned from your feedback was really insane. Its RT is actually beating the 60s and on par with the 70? I mean, when that's the case there is little point in entertaining the competitor unless you're really committed to the "features". :)

I didn't say on par with the 70 class; I actually made the point in the post you quoted that if you want better RT, get the 70 class card.

As for the 60, yes. A lot of people find that hard to believe, because everyone thinks AMD are just so bad at RT it's not worth anything, but yes :)

[image: uyCmypU.png]
 
Those relative RT performance charts are very misleading. In any game with even moderate levels of RT, AMD cards get hit much harder.

They do OK, and to be fair even the mid-range Nvidia cards suffer. But still, AMD need at least a 20% relative improvement in RT performance for the 8800 XT. So if it matches a 7900 XTX in raster, it needs to be at least 20%, and closer to 30%, faster in RT.

Agreed, and looking through the results at 4K, the 12 GB cards don't work at all with RT at 4K in some games; that explains the large discrepancy between 1440p and 4K with the 7800 XT / 7900 GRE and the 70 class cards.

But I still think at 1440p the 7800 XT / 7900 GRE are very usable RT cards. I do push back against the idea, held by some (not necessarily anyone here), that these cards are not proper RT cards. In fact I would go further than that: the slides below are the two extremes, so the 4070 at 37 FPS vs the 7900 GRE at 30 FPS puts the 4070 23% ahead. Impressive, but no one in their right mind is going to argue that either of these cards is usable in Cyberpunk at these settings. And while I can't find the video now, one of these YouTubers took the very brave step of seeing how much difference there actually is between these GPUs when you tune the game to run at 60 FPS on one of them and then transfer those settings to the other. The answer: at the same settings, in that scenario, there is very little if any difference between them; they both run at 60 FPS.

So while yes, the 4070 is technically better, it's not usably better.
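For the record, the "23% ahead" figure is just the ratio of the two frame rates quoted above; a quick sketch of that arithmetic:

```python
def percent_ahead(a_fps: float, b_fps: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

# 4070 at 37 FPS vs 7900 GRE at 30 FPS, the figures quoted above
print(round(percent_ahead(37, 30), 1))  # 37/30 - 1 ~ 23.3%
```

Note the asymmetry of relative percentages: 37 FPS is ~23% more than 30, but 30 is only ~19% less than 37, which is one way these charts can mislead depending on which card is the baseline.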

[image: rt-cyberpunk-2077-2560-1440.png]
[image: rt-far-cry-6-2560-1440.png]
 
This has always been my outlook on it: the lower down the stack you go, the less impressive or even viable RT is, so that cohort just won't be able to benefit. So far it's been the higher end each gen that samples it to see how far it's come.

You're right: if it's 4080 vs 7900 XTX with RT, the 4080 is just better, no argument.

Lower down the stack, Nvidia vs AMD is much more grey. If you play Cyberpunk, you're not doing it at 37 FPS; you're going to tune the game to run at at least 60, and if you're doing that, then actually the 7900 GRE is just as good. Why? Because the Nvidia cards are only better when you literally choke them to death with OTT RT settings; if you don't do that, RDNA 3 doesn't suffer nearly as much.

Recently people talk a lot about low-resolution CPU testing for games, complaining it's not realistic. Is blasting your GPU with RT to the point of being unplayable realistic? It seems to me that's the only way Nvidia win that argument, which is perhaps why every tech tuber does it like that. Why is no one pushing back against that? The RT charts would look very different if they used settings people would actually run the game at.
 