*** The AMD RDNA 4 Rumour Mill ***

To match the cooler performance of my £480 card you're probably looking at this one.... it's £630.
This is the problem with Nvidia's mid-level cards these days: they start out overpriced, and if you actually want one you can live with, the price is stupid! I considered all this. You don't go from being nearly a decade on Nvidia to AMD unless you feel something is seriously wrong, and there is.

 
Nope. He said he paid to play native rather than with upscaling. Do you really think the 7800 XT will keep up at native while 4070s will need upscaling in raster? Given the difference between them I'd say no. The 4070 could get its a** kicked due to VRAM, but that's different.

Anyway, hope he enjoys it. For me it isn't that different from the Nvidia alternative, regardless of RTX or upscaling - so pure raster. I do hope the offering is better with the next gen.


GTX 970
GTX 1070
RTX 2070S

I buy in the mid-range, for 1440p, but upgrade every generation.

That only stopped when Jensen vomited up what he called the RTX 3070. I still didn't buy AMD, struggled with the 2070S for another year and still didn't, then Jensen vomited again, I still didn't act, but now I was getting itchy and increasingly peed off.

Until I decided, and I had to decide as my GPU's third year was now long behind it. I did a lot of soul-searching and a lot of research and realised, among other things, that for what it would cost me to get a good 4070-class card, one that didn't annoy me, one that I actually liked and could take pride in owning, I would be getting near the price of an RX 7900 XT; the price difference to move up to that card was less than the gap down to the Sapphire RX 7800 XT Pulse, which I had been eyeing up for months knowing it was a good, solid version of that GPU.

So I bought it.

Had the 4070, vanilla, been £480, maybe £500, for a good one, I would have bought it. What I really wanted was 16GB, which the 7800 XT also had, but I would have settled for the 12GB with DLSS; it's just silly money for one with a cooler that's as good as the one that's on mine.

And you know what, in the last 2 years quite a few people in my Discord have ditched Nvidia for AMD, I was pretty much the last one still holding out...
 
Do you have a high-end GPU?

I hear ya, but I also know from experience that some people buy high-end GPUs because they want to max out all the settings, and if you're not doing that then what's the point of a high-end GPU, right?

Maybe it's all about framerate now; things have certainly changed in the gaming landscape. I come from a time when I first started PC gaming where people with high-end GPUs could get an advantage in multiplayer games due to the vast differences between low and high settings - it was almost like two different games back then. Now low and high look almost the same in just about every game.

He's in the market for one, currently with Nvidia; I forget exactly what it is, if he even told me :D
 
I am on a 4090 and I still use upscaling to drive my 240Hz OLED monitor, because I cannot see the difference in quality in motion and the motion clarity of 150-200 FPS on an OLED is insane. I actually subscribed to PureDark's Patreon for his DLSS mods for non-supported games.

I would rather use a slightly less sharp image from DLSS and layer RT on top than use native TAA. Most games' TAA is just horrible. Just look at Forbidden West's TAA flickering on vegetation. Even DLSS at 58% render scale makes it look better.

The higher FPS is just a bonus. To me, losing the options and customisation of DLSS (enabling the use of DLDSR and RT) for the prospect of being able to run textures just a notch higher is not attractive enough.

DLSS is better than FSR, way better. I don't know how many times I have to say that before people get it into their heads that I'm not saying it isn't true; I know it is, I had an RTX card for more than 3 years....

My point is and always has been...

DLSS is a reason not to buy AMD; it is not a reason to jack the price up to silly levels.


I have joined the red team, purely for WoW and some Star Citizen now and then :)

Another one... :)

AMD won against Intel partly because of Intel's own missteps, which gave AMD time to get its act together. Nvidia is not Intel and won't give them that option.

Rumors are the 5090 is 70% faster than the 4090, and with AMD languishing at 7900 XTX levels of performance, Nvidia will leave them in the dust before long.

True. Intel got complacent. Having said that, for about as long as these two have been competing, AMD have been making better CPUs than Intel for at least as long as it's been the other way round; AMD started making better CPUs in the late 1980s and right through the 1990s. Intel actually use a lot of AMD's IP in their CPUs, and have done for decades.
There is no 64-bit without AMD.

Yep, I'm curious if they also go the chiplet route with their GPUs too, but instead of having them spaced out like Ryzen, having them connected at the silicon level like Apple does with their top-end silicon.

This was meant to be RDNA 4, they shelved it, for now.

AMD are IMO the best semiconductor architects. RDNA 3 is architecturally more advanced than Ada, just as Zen is more advanced than what Intel have. What AMD are not is software engineers, never have been; Nvidia are by far the best software engineers.
You know what, if Nvidia and AMD got together to work on a GPU it would be ______ mind-blowing....

pJytwl1.jpg
 
That was another eye-opener for me: the 24/7 OC I run on my 7800 XT makes it almost identical to a 4070 Ti - it's 98 or 99% as fast in raster as an £800 GPU. It is fast. At that it still runs very cool and very quiet.

Yes, the 4070 Ti overclocks too, by 4 or 5%.
 
Also because on an OLED, the difference between 80 FPS and 120 FPS is huge, unlike on an LCD. At 150 or more FPS, the motion looks almost like real life. If you want to take advantage of this, you need DLSS, period.

High-end users typically don't care about power consumption. The 4090 was originally supposed to consume around 600W in the engineering samples but was toned down to 450W when Nvidia realised AMD could not compete. I would have been happy with a potential 500W XTX that competed with the 4090.

It will not scale like that in RT. The Nvidia card is 70% or more faster. No amount of overclocking will change that.

Also, even if it did have good RT, you are stuck with FSR, and IMO that just looks very bad in motion. I turned it on in Forbidden West and while the image seemed sharper than DLSS, in motion things just completely break down due to shimmering in foliage. It's very distracting. Thanks to AMD's insistence on locking down FSR, you cannot even customise the render scale from 67%. It really needs to be at 80% or so.

The fact that you can customise the DLSS render scale on Nvidia cards in itself makes their cards a better buy. I use DLAA in RDR2, a game which hasn't been updated in over 2 years, just because DLSS is so customisable.
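
For anyone wanting to put those render-scale percentages into pixel terms, here's a rough back-of-the-envelope sketch. The 3840x2160 output is an assumption, and the preset mappings in the comments are only approximate:

```python
# Rough internal render resolutions for a given output resolution and
# upscaler render scale. Illustrative only - exact figures vary per game.
def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

OUT_W, OUT_H = 3840, 2160  # assumed 4K output

for label, scale in [
    ("~Balanced preset (58%)", 0.58),
    ("~Quality preset (67%)", 0.67),
    ("custom 80%", 0.80),
    ("DLAA / native (100%)", 1.00),
]:
    w, h = internal_resolution(OUT_W, OUT_H, scale)
    print(f"{label}: {w} x {h}")
```

Going from 67% to 80% is roughly 43% more pixels per frame, which is presumably why the shimmering is so much less objectionable at the higher scale.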

AMD are working on a much improved version of FSR.

1080p Ultra Performance, so worst case...

FSR 2.2:

B2qSGmx.gif

FSR 3.1:

zo3Ed1w.gif


 
Does make me wonder - everyone is certain AMD won't have a high-end part for a while. But these rumours were from the same leakers who got trolled on Twitter with fake slides. It will be interesting to see what the reality is and whether RDNA 5 gets pulled forward.

It's not as if they don't have a precedent for it.

Since the 780 Ti vs R9 290X there has been one generation where AMD fielded a high-end competitor, arguably two if you include RDNA 3.

At this point AMD have spent more generations not competing in the high end than competing in it.
 
Nvidia apparently has increased the price of the RTX 4060, RTX 3050, etc. by 10% - so £300+ RTX 4060 cards:

It appears in the sub-£500 area they CBA anymore.

"With AIB's to benefit"

Just reading that i already know what it is...

That's Nvidia conceding AIB's don't have enough margins after Nvidia take their cut, but instead of Nvidia reducing their margins they are allowing AIB's to jack the prices up 10%.
 
Been helping friends with budget builds, and I could get an RX 7700 XT or RX 6800 XT for similar to RTX 4060 Ti 8GB money, or an RX 6750 XT for RTX 4060 8GB money. Last month there were RX 6750 XT 12GB cards for under £300. The RX 6750 XT is around RTX 4060 Ti 8GB level performance. That would place the RX 6750 XT around 25% to 30% faster than an RTX 4060, with 50% more VRAM.

And for £60 to £70 more you could get an RX 7700 XT 12GB/RX 6800 16GB for around the same price as an RTX 4060 Ti, which is easily 20% faster with 50% more VRAM.

Even if AMD could do better, the reality is the sub-£500 Nvidia offerings are poor.

And I honestly don't think AMD squeeze their AIBs as much as Nvidia do; AIBs have a larger slice of the Radeon pie.

That's why it really _______ galls me when I see low-IQ tech journos say things like "oH AMD jUst miNuS 10%, aMDEE fAult Nvida, BuY Nvidia, DlSS"
 

RDNA 4 GPUs RX 8600 and RX 8700 will not feature GDDR7 but will use the same 18 Gbps GDDR6 memory featured on the RDNA 3 GPUs RX 7600 and RX 7700.

Nvidia's Blackwell RTX 5000 series will be the only one using next-generation GDDR7 memory.

18 Gbps? I get sticking with GDDR6 for now, but 18 Gbps? Up it to 21 if not 23; even the 7800 XT comes with 20 Gbps ICs.
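
For a rough sense of what those per-pin speeds mean, peak memory bandwidth is just speed times bus width. A minimal sketch below; the 128-bit and 192-bit bus widths are assumptions for illustration (borrowed from the current x600/x700-class cards), since nothing about RDNA 4 bus widths is confirmed:

```python
# Peak memory bandwidth in GB/s = per-pin speed (Gbps) * bus width (bits) / 8.
def bandwidth_gbs(speed_gbps: float, bus_bits: int) -> float:
    return speed_gbps * bus_bits / 8

# Assumed configurations for illustration only.
for label, speed, bus in [
    ("18 Gbps on a 128-bit bus", 18, 128),
    ("21 Gbps on a 128-bit bus", 21, 128),
    ("18 Gbps on a 192-bit bus", 18, 192),
    ("21 Gbps on a 192-bit bus", 21, 192),
]:
    print(f"{label}: {bandwidth_gbs(speed, bus):.0f} GB/s")
```

At the same bus width, going from 18 to 21 Gbps is roughly a 17% bandwidth uplift, which is the gap being complained about here.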
 
AMD says it knows why it lags Nvidia and it wants to change; as such, AMD is going to become a software-focused company like Nvidia. AMD says it now knows that just trying to sell hardware doesn't work anymore. AMD says it is increasing its software team size by 300%.


Welcome to 2004, AMD. You're not just making Athlon 64 processors and then Microsoft goes "oh... that's neat, let's build an OS around this technology", followed by every software developer going "we had better make 64-bit applications so we don't get left behind".

For example, Radeon GPUs have had tessellation capabilities at least two generations ahead of NVIDIA, which was only exploited by developers after Microsoft standardized it in the DirectX 11 API, the same happened with Mantle and DirectX 12. In both cases, the X-factor NVIDIA enjoys is a software-first approach, the way it engages with developers, and more importantly, the install-base (over 75% of the discrete GPU market-share). There have been several such examples of AMD silicon packing exotic accelerators across its hardware stack that haven't been properly exploited by the software community. The reasons are usually the same—AMD has been a hardware-first company.

Or yes, to use another example.

PS: that was DX 10, not DX 11.

Why is Tesla a hotter stock than General Motors? Because General Motors is an automobile company that happens to use some technology in its vehicles; whereas Tesla is a tech company that happens to know automobile engineering. Tesla vehicles are software-defined devices that can transport you around. Tesla's approach to transportation has been to understand what consumers want or might need from a technology standpoint, and then building the hardware to achieve it. In the end, you know Tesla for its savvy cars, much in the same way that you know NVIDIA for GPUs that "just work," and like Tesla, NVIDIA's revenues are overwhelmingly made up of hardware sales—despite them being software-first, or experience-first. Another example of exactly this would be Apple who have built a huge ecosystem of software and services that is designed to work extremely well together, but also locks people in their "walled garden," enabling huge profits for the company in the process.

There it is, you finally realised it. I'm not mocking, well done. No one cares how good your hardware is, no one cares that you are first with hardware features; by the time tessellation was mainstream in games Nvidia had built a software stack around it and supported it better than you did, and by the time it was mainstream it ran better on Nvidia hardware than it did on your own. That's unforgivable: you invented one of the most important pieces of graphics IP in the history of graphics, and by the time it mattered you let your competitor beat you.
This is arguably when it all started; you didn't learn from that and you didn't learn anything thereafter...

This is not to say that AMD has neglected software at all—far from it, the company has played nice-guy by keeping much of its software base open-source, through initiatives such as GPUOpen and ROCm, which are great resources for software developers, and we definitely love the support for open source. It's just that AMD has not treated software as its main product, that makes people buy their hardware and bring in the revenues. AMD is aware of this and wants "to create a unified architecture across our CPU and RDNA, which will let us simplify [the] software." This looks like an approach similar to Intel's OneAPI, which makes a lot of sense, but it will be a challenging project. NVIDIA's advantage here is that they have just one kind of accelerator—the GPU, which runs CUDA—a single API for all developers to learn, which enables them to solve a huge range of computing challenges on hardware ranging from $200 to $30,000

ROCm was the first time I have ever seen you take creating your own software IP stack seriously, an actual competitor to CUDA, wow.... But then you stopped in-house development of it and shoved it off to the open-source enthusiasts to 'play about with'. What? Why? You should have 100 people on this with an annual budget of $300 million developing it 24/7; that's the least Nvidia did with CUDA.
If you want to collaborate with Intel that's great, but what you should be doing is helping each other tailor-make it for your respective hardware, like CUDA with Nvidia. Like you said, that is their advantage; it doesn't need to run on some random hardware that you have nothing to do with, that's not your problem.

There is an Nvidia accelerator, an Intel accelerator and an AMD accelerator; the end user should be able to choose one and the software should just work for that hardware. They shouldn't have to go to some open-source repository for it and ask some spotty teenage genius why this aspect of the software doesn't work, because he's the only one who understands it, given he made it while stuffing his face with pizza one night.

In-house, everything in-house; it's how all the successful tech companies do it, they do it themselves for themselves.

I'm looking forward to all sorts of cool new stuff from AMD over these next few years. Now that you get it, don't make a mess of it: hire the very best software developers, you can afford it now, you are a $280 billion company, literally double that of Intel. You're no longer the underdog, start behaving like the top dog.
 
While this is welcome news it will surely take years to bring benefits, and who knows what the GPU landscape will look like by then. I hope I'm wrong.

We could start to see the benefits of it sooner than you might think, if AMD have any urgency about it. You can get the best software developers on it if you pay them enough, and if you get a lot of them on it you can get it done quickly.

AMD have a vast war chest to tap into: the investment money. That's what it's for, that's the whole idea of it - I give you money to help you grow, and when you do, that money has earned its interest.
In 2016 AMD was worth about $450 million; they are now worth $280 billion, $279.5 billion more than in 2016. Use some of it.....

AMD's fear is that if they use it and something goes wrong and they lose it, that money becomes debt. AMD's catastrophic fall from grace between 2009 and 2016 meant they lost their investors; they used some of that investment money, which then became debt that crippled them. Their final act in a bid to survive long enough to get Zen 1 out was to sell their original 1969 home, which is now a car park. That hurt, that really hurt. Scarred-for-life kind of hurt.

Today AMD have functionally 0 debt.

jtylG7G.jpeg
 
It makes no sense.

If AMD had a card in hand that was double the 7900 XTX's performance then why not release it? It would have been significantly above the 4090. Unless of course it's either BS, performance that only exists in a GPU emulation engine, or it had a stupid 800-watt power draw.
Why would it have an 800-watt power draw? The 7900 XTX is less than half of that. If the 5090 is 50% faster than the 4090, no one is going to say that's impossible because it would need 700 watts of power draw.

The rumour is it got cancelled because it would be too expensive.
 
Come to think of it, I distinctly remember when the 6900 XT was rumoured to be 2X the 5700 XT; people said the same thing about that - "Oh, that's impossible, it would have 500-watt power draw." Well, it is 2X the 5700 XT and it's 300 watts power draw.
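
For what it's worth, the arithmetic on that one roughly checks out. A quick sketch using the reference board-power figures as far as I recall them (around 225W TBP for the 5700 XT and 300W for the 6900 XT):

```python
# Rough performance-per-watt comparison: how much more efficient a part has
# to be to deliver 2x the performance at the quoted board powers.
OLD_PERF, OLD_POWER = 1.0, 225   # 5700 XT baseline, ~225W TBP
NEW_PERF, NEW_POWER = 2.0, 300   # 6900 XT, ~2x performance at ~300W TBP

perf_per_watt_gain = (NEW_PERF / NEW_POWER) / (OLD_PERF / OLD_POWER)
print(f"Power increase: {NEW_POWER / OLD_POWER:.2f}x")       # ~1.33x
print(f"Perf/W improvement needed: {perf_per_watt_gain:.2f}x")  # ~1.5x
```

In other words, doubling performance only needed about a 1.5x improvement in performance per watt, not double the power - which is the same argument being made about the 5090 above.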
 