*** The AMD RDNA 4 Rumour Mill ***

But AMD was actually in talks with Samsung, and TBH I doubt Instinct has the volume to actually impact Radeon's costs.
AMD is not in an easy position, TBH: either they find a way to provide at least 80% of the value for 50% of the price, or they won't be able to expand their market share.

Look at what's happening in the smartphone market.
MediaTek is basically playing the AMD role against Qualcomm, but they're leaving the budget segment unguarded, and Unisoc is starting to eat their lunch there.
Intel could potentially do the same if they can make Battlemage viable, so AMD has to try something disruptive or they will find themselves Matroxed into Instinct and semi-custom.
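To put a rough number on that "80% of the value for 50% of the price" idea, here's a minimal sketch; the percentages are the ones from the post above and the second case is purely illustrative, not benchmark data.

```python
# Rough perf-per-price check for the "80% of the value at 50% of the price" idea.
# Input figures are illustrative, taken from the post above, not benchmarks.

def value_ratio(rel_perf: float, rel_price: float) -> float:
    """Performance-per-pound relative to the incumbent (1.0 = parity)."""
    return rel_perf / rel_price

# 80% of the performance at half the price -> 1.6x better value than the incumbent.
print(value_ratio(0.80, 0.50))  # 1.6

# A more typical "slightly cheaper, slightly slower" position barely moves the needle.
print(value_ratio(0.90, 0.80))  # 1.125
```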
 
I didn't know AMD was in talks with Samsung; it's an interesting one to watch.

The way I see it, the thing about workstation GPUs is that the segment is growing. More and more people want them, and not just in huge batches: more small outfits and even individuals are looking for them, and for the latter it is about cost. Instead of selling them for $2,000 a pop, what if they sold them for half that or even less? AMD could carve out a new market for itself. It's probably not very politically correct to make this segment so affordable, and Nvidia will hate them for it, but so what? AMD would be opening up that market to more people.

AMD did this first with HEDT, where a Ryzen 1800X was half the price of a 6900K and just as good if not better. They did it again in the server space: Intel was charging up to $50,000 per chip, and AMD brought that down to $18,000 with better chips. It's why Intel are in trouble; running fabs is hugely expensive, and you can only do that on your own by selling chips for $50,000 a pop to pay for it all.

They could disrupt the workstation GPU segment in the same way.

Honestly I don't follow the smartphone market; other than the one Samsung collaboration, I didn't think AMD had any presence there at all?

Intel have all but given up on the dGPU market; the rumour is Battlemage consists of one small SKU for laptops. I don't blame them: they thought they could just walk in and take AMD's market share, but quickly found out just how brutal this segment is.
 
Here's something on the fab side from a reliable source, just a few months old: https://www.tomshardware.com/tech-i...s-it-looks-to-dual-source-future-chips-report
 
Agreed, and looking through the results at 4K, the 12 GB cards don't work at all with RT in some games at that resolution, which explains the large discrepancy between 1440p and 4K for the 7800 XT / 7900 GRE versus the 70-class cards.

But I still think that at 1440p the 7800 XT / 7900 GRE are very usable RT cards. I do push back against the idea among some, not necessarily anyone here, that these cards are not proper RT cards. In fact I would go even further than that: the slides below are the two extremes. The 4070 at 37 FPS vs the 7900 GRE at 30 FPS puts the 4070 23% ahead, which sounds impressive, but no one in their right mind is going to argue that either of these cards is usable in Cyberpunk at these settings. And while I can't find the video now, one of these YouTubers took the very brave step of seeing how much difference there actually is between these GPUs when you tune the game to run at 60 FPS on one of them and then transfer those settings to the other. The answer is that at the same settings there is very little if any difference between them: they both run at 60 FPS.

So while, yes, the 4070 is technically better, it's not usably better.

[Charts: Cyberpunk 2077 RT and Far Cry 6 RT performance at 2560x1440]
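To make the arithmetic in the post above explicit, here's a minimal sketch; the FPS figures are the ones quoted from the charts, and the 60 FPS observation is the claim from the referenced video, not something computed here.

```python
# Relative-performance check for the chart figures quoted above.

def percent_ahead(fps_a: float, fps_b: float) -> float:
    """How far ahead card A is of card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

rtx_4070_fps = 37      # Cyberpunk 2077 RT at 1440p, as quoted above
rx_7900_gre_fps = 30

print(f"4070 ahead by {percent_ahead(rtx_4070_fps, rx_7900_gre_fps):.0f}%")  # ~23%

# Neither 37 nor 30 FPS is a playable target, though. Per the video referenced
# above, once settings are tuned so one card holds 60 FPS, the other ends up at
# roughly 60 FPS too - the "technically better, not usably better" point.
```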
It's straight out of Nvidia's playbook. I would postulate that most people only look at what the setting is called (low/high/ultra etc.) and not what it actually does. We've seen before how ultra or max settings just crank something up to 11 with no real visual benefit, which leads to horrible performance; it's always that one sneaky setting among the rest. We saw it in The Witcher 3, where both AMD and Nvidia's Kepler cards got slaughtered. We saw it with Crysis (can't remember which one, but it's a classic case). It doesn't really matter much to Nvidia in these partnership games if the actual framerate is playable on their newest gen either, as long as the Nvidia number is bigger than their competitor's.
 
As per Hardware Unboxed's latest poll, which received over 30 thousand replies, 80% of gamers who have a 4K screen don't play at native; they use DLSS or FSR.
 
Yes, I can believe it. I am really interested in RDNA 4 and hope that it performs well at 1080p with Infinity Cache, and that FSR 4 is well supported. I just got a 9800X3D, so I will play as many games as possible with upscaling.
 
Makes me laugh that so many people bought a 4K screen to not play at 4K.

It tells you that these 80% don't know jack about what it means to game on PC; I bet if you show them the options menu in a PC game they will crumble.

I will bet that 100% of that 80% have never heard of anisotropic filtering, for example. PC gamers have been a joke for a long time now.

Did anyone see the LTT video roasting people's setups not long ago, viewers' setups that is, and how horrid they were? PC master race, that.
 
4K is a luxury tier and has been artificially pushed since the Polaris/Pascal generation.
I don't see the point of it unless you're playing on a 40" monitor close to your face; it was just an excuse to push GPU prices up.
If you look at performance at the same price point as a GTX 1080 Ti, you'll realise we're at 1440p for what used to be an elite-priced card, and still firmly at 1080p for what used to be entry tier but is now at mid-range prices.
 
IMHO I don't see anything wrong with having a 4K screen and gaming at what is effectively 1440p using an upscaler. As much as I dislike upscalers, due to the laziness they seem to introduce, including the for-the-most-part false "better than native" claim, they are now at a point where a 4K image upscaled from 1440p looks better than native 1440p. You could argue that 4K and above is where upscalers start to make the most sense.

And what does it mean to "game on a PC"? I'm genuinely curious :)

Did see that LTT video, and yes, some setups are shocking to look at. Even gross and yucky in some cases. Why? How? WHY?! Clean it up!!!
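For reference on the "effectively 1440p" point, here's a rough sketch of the internal render resolutions at 4K output; the per-axis scale factors are the commonly documented values for DLSS 2 / FSR 2 style quality presets and should be treated as approximate rather than exact for every title.

```python
# Approximate internal render resolutions for common upscaler presets at 4K output.
# Per-axis scale factors are the commonly documented DLSS 2 / FSR 2 style values;
# treat them as approximate.

OUTPUT = (3840, 2160)  # 4K output resolution
PRESETS = {
    "Quality":     1 / 1.5,  # ~0.667 per axis
    "Balanced":    1 / 1.7,  # ~0.588 per axis
    "Performance": 1 / 2.0,  # 0.5 per axis
}

for name, scale in PRESETS.items():
    w, h = (round(d * scale) for d in OUTPUT)
    print(f"{name:<12} renders internally at ~{w}x{h}")

# "Quality" at 4K works out to ~2560x1440, i.e. the "4K screen, effectively
# 1440p rendering" case discussed above.
```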
 
A new rumour says RDNA 4 is the last RDNA generation for AMD. After RDNA 4, AMD will switch to a new architecture called UDNA; this architecture will feature a GCN-like ALU, and the same architecture will be used for both consumer and data centre.


This is more interesting than a continuation of RDNA, as it continues to struggle in the marketplace.

My realistic expectation for RDNA 4 is really more of the same, and it'll not be a breakthrough product. Sales will mainly come down to how Nvidia prices their lineup, more so than what AMD offers.

They need a breakthrough product, and hopefully UDNA is more of that.
 
I guess it just means they don't want to duplicate R&D investments... more of a cost optimisation initiative.
Nvidia is owning AMD in both the enterprise (workstation and AI accelerators) and consumer spheres, so I guess AMD will be pivoting to handhelds/consoles/APUs etc.
Nvidia also seems to be serious about Arm CPUs, so AMD's handheld advantage might be short-lived as well.
 

To be fair, an RDNA 3 7900 XT or XTX with ~30% better RT and a decent price of ~$500 would be a breakthrough moment.

I know some in here would be saying "that's just a 4070 Ti or 4080 and we have had those for years". Yeah, but never anywhere near the $500 segment, which ironically is where they used to be.
 
I guess it just means they don't want to duplicate R&D investments... more of a cost optimisation initiative.
Nvidia is owning AMD in both the enterprise (workstation and AI accelerators) and consumer spheres, so I guess AMD will be pivoting to handhelds/consoles/APUs etc.
Nvidia also seems to be serious about Arm CPUs, so AMD's handheld advantage might be short-lived as well.

I think this is the case as well. A unified architecture will improve margins and give AMD more profit in the GPU division. However, whether it will negatively impact gaming performance like GCN did remains to be seen.
 
Loved my Vega 64. However, it took AMD a while to get the drivers singing.
 
Makes me laugh that so many people bought a 4K screen to not play at 4K.

I think it's already been said, but a lot of folks do other things on their 4K screens: 4K video content, productivity, etc.

I think nowadays it seems normal to have a better monitor relative to the GPU, especially given that there are amazing deals on good-spec monitors while decent GPUs are too pricey for folks to afford. I.e. folks with 1440p high-refresh monitors won't have GPUs that can run games at 100+ fps at 1440p, and similarly for 4K monitors, the GPUs most of those folks have wouldn't be able to run games at native 4K at decent framerates. I don't bother playing games on my 1440p monitor, since the old GTX 970 wouldn't be able to maintain even 60 fps, let alone 100+. A relative of mine recently bought a couple of decent-spec 1440p high-refresh IPS monitors from OCUK for £300, and that's both together, not one each (and their GPU is an old RX 480). I don't see amazing deals on GPUs like that nowadays.
 
Yeah, that's not what the poll I was referring to covers at all.

You can bet the people who took the poll are predominantly gamers who don't have a clue what to do in the options menu.

Let's not make excuses now for dumb fake PC gamers who embrace the PC master race.
 
4K is a luxury tier and has been artificially pushed since the Polaris/Pascal generation.
I don't see the point of it unless you're playing on a 40" monitor close to your face; it was just an excuse to push GPU prices up.
If you look at performance at the same price point as a GTX 1080 Ti, you'll realise we're at 1440p for what used to be an elite-priced card, and still firmly at 1080p for what used to be entry tier but is now at mid-range prices.

Agreed, 1440p on a 27" screen looks very decent, and now with some pretty amazing deals on 144Hz+ QHD gaming monitors, it's that ideal balance between 4K and better-than-1080p clarity, whilst still being achievable on mid-range and up hardware in modern titles. It's part of the reason the pro consoles of this and the last generation internally targeted approximately 1440-1600p rather than true native UHD in most cases, as it offers a noticeable clarity and detail boost (I realise not as much as 4K) without requiring excessive horsepower to drive it well.

I think AMD, for all their bad decisions, did make a fairly good decision targeting 1080p and 1440p more than 4K. 4K is more marketing halo buzzwords, and most people probably wouldn't even be able to tell you which was better if you had a 1440p game running at decent settings, framerate and refresh rate next to 4K running with low settings and framerate; they might notice the sharpness, but if the visual settings were notably better on the QHD screen, they'd likely think that was the 4K. The number of people who don't read or care is massive compared to the relative niche that care about such things, and even amongst those of us who do, there are vast differences in the level of 'pixel peeping' people are willing to put in, especially as some performance sinks are barely visually noticeable beyond a certain level, even if they do enhance the overall presentation.

As mentioned on this page as well, deals on good quality monitors are really moving forwards in a way that they just aren't in the GPU space right now.

A few years ago you'd have been lucky to get a half-decent QHD screen with a not-awful IPS panel that went over 120Hz for less than £300, and QHD screens above 60Hz without awful panels were prohibitively expensive.
By contrast, a few weeks ago I paid £225 on offer for an IPS 180Hz panel with a MiniLED backlight that also hits around 1250-1300 nits brightness. It's not perfect by a long shot: I wish the zones on it were more granular/smaller (it's nowhere near the granularity that OLED offers), that more control was allowed when it's in HDR mode, and that it went a little brighter at base brightness for HDR. But my god, once you dial it in (as Windows HDR is still kinda ****), it's closer to OLED or a high-end, very high zone-count screen than any other cheap LED I've seen, and it doesn't have the burn-in risks that'd come with a fancy OLED. Frankly, a little light haloing on a black or monotone screen can be ignored, as it looks fantastic when gaming or in motion; at least until considerably superior screens reach a decent price in maybe 5+ years.

I mean, for lack of a better word, what you can get in monitors at a reasonable and acceptable price range has moved forward at such a pace in the last 5-10 years that it's left GPU tech behind big time.
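On the 1440p vs 4K point, the raw pixel counts show why the jump in GPU load is so steep; this is plain arithmetic on the standard resolutions, nothing assumed beyond that.

```python
# Pixel counts behind the 1080p / 1440p / 4K comparison.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x 1080p)")

# 1440p is ~1.78x the pixels of 1080p; 4K is 4x 1080p and 2.25x 1440p,
# which is roughly the extra per-frame shading work before any upscaling.
```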
 