RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
For people who want to play the latest "big" games, perhaps. Most of what I play on PC simply isn't available on console, e.g. modded games, older games, emulators and WoW. But then none of that stuff requires the latest and greatest GPU, or anything close to it in most cases. There's plenty more to do than play the latest AAA slop. Persona 5 Royal will be my next obsession in a week's time, and the recommended specs for that are a GTX 760 or HD 7870. Seems once again PC gaming's death has been greatly exaggerated.
Tbh most of the big games now will run perfectly fine on a 3080 or 3090, or a 6700 XT, 6800 XT etc., still with stupid FPS. It's only 4K gaming that needs that extra oomph; otherwise most gamers really don't need it. I haven't come across anything that my 3090 can't push at high FPS, and it will be quite some time until it starts to struggle.
 
No chance of me buying a console. Love having all my games on Steam. Can play whatever I want when I want. Plus by the time you buy a console you might as well add another hundred or so quid on top and buy a 3080 Ti. That's what I did. 3090 performance for £575 :D
Steam Deck is your other option
 
Had that a couple of weeks after launch. It is nice, but not quite there yet. Steam Deck 2 will be, I reckon, for a handheld anyway. Sold it for £500 profit in the end after having my fun with it :cry:
I sold mine at cost, couldn't do it. Had fun with it for 2 weeks as well :cry:
 
Yeah absolutely. Tbh if it wasn't for the fact that I mainly only play DCS with a bit of Post Scriptum from time to time, honestly I would already have done so. Shudders at the thought :)

But the issue is that as these price rises get worse, at some point the PC parts need to be replaced or upgraded. Think what happens if we have multiple generations of these sorts of high prices.

As the top end goes up, it starts to destroy the value proposition of mainstream parts. I generally buy only mainstream parts, like 95% of the gamers I know, and the stagnation is very real, especially compared with what you used to see 10 years ago. I know very few people who spend £500+ on a dGPU, or that much on a new CPU.

It's sheer madness that a 300 mm² die-area Ada part starts at over £900! The GA106 in the RTX 3060 is almost the same size as AD104.

Many on here throw money at the hobby with unlimited budgets, so they see nice gains. It's not the case if you are more of a normal gamer.

It also causes the cost of secondhand parts to go up too. The mainstream dGPUs are getting more and more expensive, as are all the other parts, like needing a better CPU, etc. So you can hold off upgrading for years and still end up paying a lot for a relatively mediocre PC.

For my non-gaming stuff and oldish games, a decentish laptop would do the job perfectly fine. The only reason I upgrade, or even have a desktop, is mostly for running newer games. So if the costs of mediocre parts (which can't even run most of the effects well) keep getting worse and worse, then what is the point of paying so much for a mainstream PC?

A console will do the job, especially as most people have only a limited amount of gaming time, so even if the games cost more, it's still cheaper long term. Consoles are increasingly supporting mods, etc. too.

In any normal situation, companies seeing billions in lost revenue, tanking share prices, etc. would think reducing prices to get more sales makes sense. But not greedy tech companies such as Nvidia, Intel and AMD, who feel the need to jack pricing up even more because of the availability of cheap credit. They seem to be banking on PCMR whales. I hope the financial situation gets worse for them, because the era of cheap credit is over.
 

Not to mention almost all console exclusives are only timed exclusives these days and eventually hit the PC.

You are making one very flawed assumption: that they have a capable desktop to put the dGPU in.

Many people now use laptops, or have laptops from work, so the gaming desktop is a separate PC.

So you need to factor the cost of the desktop into it too. Most people have TVs already, and the reality is that most will only play a limited number of games per year due to time constraints. So even if console games cost more, the overall cost still makes more sense in an era of a £900 RTX 4080 12GB with its sub-300 mm² dGPU (the size of the dGPU in an RTX 3060).

The price hike is worse than Turing when you look at what they have done with the RTX 4080 12GB. An RTX 2080 had a chip almost twice the size and cost less. The GTX 1070 had a bigger chip. Now imagine if a GTX 1070 or GTX 1080 cost £900. Nvidia has become worse than peak Apple (and AMD is following close behind).
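The die-size argument above can be put in rough numbers as price per mm² of silicon. A quick sketch: the AD104/GA106 die sizes and the £900 figure are the ones quoted in this thread, while the older cards' die sizes and UK launch prices are approximate, assumed figures for illustration only, not authoritative data.

```python
# Rough GBP-per-mm^2 comparison across Nvidia generations.
# AD104/GA106 die areas and the GBP 900 price are the figures quoted
# in this thread; the Pascal/Turing die areas and UK launch prices
# are approximate assumptions for illustration only.
cards = {
    # name: (die area in mm^2, approx UK launch price in GBP)
    "GTX 1080 (GP104)":      (314, 619),
    "RTX 2080 (TU104)":      (545, 715),
    "RTX 3060 (GA106)":      (272, 299),
    "RTX 4080 12GB (AD104)": (292, 900),
}

for name, (area, price) in cards.items():
    print(f"{name}: {price / area:.2f} GBP/mm^2")
```

On these numbers the rumoured card works out at roughly £3.08 per mm², versus about £1.31 for the TU104-based RTX 2080 — the "price hike worse than Turing" point in numeric form.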
 

I was referring to you and humbug, who mentioned getting a console instead. Not every Tom, Dick and Harry :cry:
 
It applies to me too. The mainstream dGPUs are being cut down to x8 PCI-E 5.0, according to the rumours. Because I am on an Asus B450I mini-ITX motherboard (my previous system had problems, so I had to upgrade before Zen 2 was out), I would need to spend a decent amount on a B550/X570 mini-ITX motherboard to get PCI-E 4.0, or do an entire platform upgrade to get PCI-E 5.0! Plus I need to be wary of heat due to the form factor of my case. So it's not even a simple upgrade path for me! :(
 

There was a comment made by one of the top guys at ASML that post-high-NA fabrication is going to be prohibitively expensive, which I would interpret as exponentially more expensive than high-NA. So post-2032 is going to be a real challenge for consumer markets, which might start stagnating. Maybe we will have super-small chips or a movement to cloud computing.
 

But did I not specifically say 3080 Ti? It’s like you need to go read the post again bro :p

I guess heat might be an issue. Though I do undervolt and overclock mine and never see 300 W. As an example, in CP2077 I see roughly 280 W, and in God of War at a 90 fps cap I see much less. If I cap that to console-level 60 fps it is under 200 W :)
 

The issue is that, on top of that, these companies want ever-increasing margins every year. The stock market is just expecting way too much now.

That is far more than I spent on any dGPU. I am a mainstream buyer, not a person who buys £600+ dGPUs. I have an RTX 3060 Ti. So now think what I feel when I see the RTX 3060 replacement (or, if I am being kind to Nvidia, the RTX 3070 replacement) being priced at £900.

Plus I have a mini-ITX system, like most of my mates (because many of us don't really want huge boxes anymore), so again the power consumption of these higher-end dGPUs produces too much heat. It's not like I have hidden that I have been a buyer of mainstream dGPUs for the last decade, or that I have had an SFF system since 2005.

You have a Ryzen 9 5900X and an RTX 3080 Ti, which are high-end parts, and certainly better than what almost all the gamers I know in the real world have.

So many of my friends are looking at the cost of PC parts and going "meh!", only doing minimal upgrades to their desktops now and downsizing to smaller boxes. Some are replacing them with laptops. Many have bought consoles. I have a feeling that with this next round of massive price increases, unless greedy Nvidia/AMD/Intel change tack, the desktops won't be upgraded. They are going to be replaced by laptops and a console of some sort.
 

But again, my post said if you are going to buy a console (aren't disc PS5s like £450?), add a hundred or so on top and get a 3080 Ti like I did. Your main concern could be heat, but I think you would be fine even with that, to be honest. Just remove the case door or something if you don't want to undervolt or frame cap :)
 
I have an NCase M1 V5, so the RTX 3070 is the most I want to deal with, because I would have to spend lots on water cooling otherwise. Plus an RTX 3080 Ti is well over my budget, and the budget of almost everyone I know who games. None of them are really enthusiasts.

The RTX 4080 12GB is a 192-bit bus for £900. This is basically a 60-series-class die and memory interface pushed up four tiers. This makes Turing look reasonable in comparison.

Plus I am limited to PCI-E 3.0 on my system (Ryzen 7 5700X); my motherboard supports PCI-E 4.0, but AMD removed support in later AGESAs. What do you think is going to be available to replace my current RTX 3060 Ti at £500 and under? Yes, a dGPU with 8GB of RAM, a 96-bit/128-bit bus and an x8 PCI-E 5.0 interface. Those dGPUs have the spec of sub-£200 50-series dGPUs, but the price escalation at the top is pushing this crap right up the board. It is going to need a modern system to perform properly, or I will have to purchase something over £600 with a full PCI-E interface.

The AD104 is a 292 mm² dGPU. The GA106 was 272 mm², so basically Nvidia has not only renamed the dGPU (it should be AD106) but essentially jacked up the price by 3X. Please don't defend this, as it is a disaster for mainstream buyers like me. Then you have Zen 4 starting at over £300 for a six-core CPU, and the mini-ITX motherboards look silly money.

Nobody I know has looked at the recent releases and felt any excitement. No wonder Zen 4 sales are rubbish.

The consoles are going to get refreshed in the next year or two. Even if they end up costing £550, it will still be cheaper than buying a whole new system. My current system runs the older games fine. So unless these companies reduce these prices, I am pretty much priced out of running newer games as time progresses.

But the greedy arses, as their revenue and margins crash, are doing an Apple and trying to jack up margins. This is the same accountant-led nonsense which meant western companies ceded market share to Chinese companies, and now they are all running scared because of their greed (basically willing to drop large amounts of sales for margins).

I am not going to spend £500 on some POS "RTX 4060" or AMD equivalent which is a rebranded 50- or 40-series-type card that will probably perform poorly.
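The x8 PCI-E worry in the post above is easy to quantify. A rough sketch, assuming the commonly cited per-lane throughput figures after encoding overhead (~0.985/1.969/3.938 GB/s for PCI-E 3.0/4.0/5.0): a card wired for x8 only negotiates the host's generation, so on a PCI-E 3.0 board it gets about a quarter of the bandwidth of a full x16 5.0 link.

```python
# Per-lane one-way throughput in GB/s, after encoding overhead,
# for PCI-E generations 3.0, 4.0 and 5.0 (commonly cited figures).
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Usable one-way bandwidth of a PCI-E link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card has plenty of bandwidth on the platform it was built for...
print(f"x8  PCI-E 5.0: {link_bandwidth(5, 8):.1f} GB/s")
# ...but dropped into a PCI-E 3.0 slot it only runs at 3.0 speeds,
# well below even an x16 3.0 link.
print(f"x8  PCI-E 3.0: {link_bandwidth(3, 8):.1f} GB/s")
print(f"x16 PCI-E 3.0: {link_bandwidth(3, 16):.1f} GB/s")
```

Whether that bandwidth loss actually costs frames depends on the game and on VRAM pressure, but it is why a narrow-link card is an awkward fit for older B450-era boards.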
 

I would say don't worry, AMD will have something for you, but I am not so sure. They are a premium brand these days, sadly :(

Used to love AMD before all this premium brand crap. They were priced competitively and I would almost always go AMD. But they have lost the plot the last few gens as far as I am concerned.
 

All AMD has to do in my eyes is stop playing catch-up with Nvidia:

- Get FSR to match DLSS quality. 2.1 is improved and much better, but still falls short in overall temporal stability, which is noticeable based on my experience in Spider-Man. And most importantly, get it in more games! Given it's open source, easy to integrate, etc., uptake of FSR 2 and even FSR 2.1 is still very poor at the moment.

- Ideally a frame-generation competitor. Obviously Nvidia still have some work to do on theirs, but if I go RDNA 3, I don't want to be waiting 1-2 years, especially if their solution is going to be subpar for a further several months or longer once released...

- Get RT to be usable. Matching Ampere would be nice, but it needs to be better than Ampere going forward.

And lastly, priced appropriately. It doesn't have to match the 4090's sheer power, it just needs to be priced right.

- Bringing 3090/Ti performance in a 7800 XT for £800 = fail imo.
- Bringing 4080 16GB performance in a 7800 XT for £700/800 = win (well, still a bit too high for my liking, but it's the way of the world now, things aren't getting cheaper...).
 

Agreed. But the way you said it reminded me of 4K8K.

He used to come out with stuff like that. All AMD need to do is x, y and z. He would make it sound all so simple, like only he could see it and the people working at AMD were dumb (well, their marketing team are not the sharpest, to be fair).

Thing is, these things are designed years in advance, and whatever they did now would again take years to appear in products, at which point Nvidia might be on to something else, at which point someone on here will be saying all AMD need to do is... :p

So apart from that I agree with your post :cry:
 

True that! :cry:

Thing is as well, Nvidia will know exactly what AMD have (the same way AMD would know what Nvidia have and are working on before release date; I don't think it's quite so hush-hush behind locked doors as many think), so it's not like it will be a surprise for Nvidia on launch day if AMD do stomp all over them :p Jensen has probably already got a number of moves ready should AMD come out swinging...

Personally I think we are going to see a return to the glory days of the Ti models. Nvidia have made sure of this with the performance gap between the 4080 and 4090 this time round, as the Ampere line-up was utterly ridiculous, with the performance gap between the 3080/Ti and 3090/Ti models considering what the price difference was.
 
I got a B450, a Ryzen 3600 and an RTX 3080 for £940; now a B650 + 7600X and a 4080 16GB (which is really only a 4070 Ti) costs £1,800, so nearly double in just a few years.
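As a quick sanity check of the "nearly double" claim, using the poster's own figures rather than market data:

```python
# Build cost then vs now, per the figures in the post above (GBP).
old_build = 940    # B450 + Ryzen 3600 + RTX 3080
new_build = 1800   # B650 + 7600X + RTX 4080 16GB

increase = new_build / old_build
print(f"{increase:.2f}x the old build cost")  # 1.91x, i.e. nearly double
```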
 