The CPU determines GPU performance. On Nvidia anyway.

It didn't push up the price of Zen 2, though, even though that was also on 7nm, and you would think that as the process matures and yields improve, the cost to manufacture the CPUs would come down.



The 3600 is faster than a 1600AF.

The 5800X is miles faster than both. 7nm is significantly more expensive than 12nm. Nvidia didn't like the price.
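To put rough numbers on the node cost argument, here's a toy per-die cost model. All wafer prices, die counts and yields below are illustrative placeholders, not actual TSMC figures:

```python
# Toy model: cost per good die = wafer price / (dies per wafer * yield).
# All numbers are illustrative placeholders, not actual TSMC pricing.

def cost_per_good_die(wafer_price: float, dies_per_wafer: int, yield_rate: float) -> float:
    return wafer_price / (dies_per_wafer * yield_rate)

# Hypothetical wafer prices and yields for a small CPU die:
print(f"12nm: ${cost_per_good_die(4000, 700, 0.90):.2f} per good die")
print(f"7nm:  ${cost_per_good_die(9000, 700, 0.80):.2f} per good die")
```

Even with the same die count per wafer, a pricier wafer and lower early yields can more than double the per-die cost.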
 
The 1600AF is 12nm. Shrugs shoulders.
Both the 3600 and 5600X are on 7nm, yet going by the prices they released at, there was roughly a 50% price increase for the latter.
Sure, the 1600AF was cheaper, but it was a re-released 2600 to use up some silicon AMD had sitting around, and it also came out after the 3600 had been released, so it's not like they were going to charge much for it. For reference, the 2600 and 3600 launched at $199 while the 5600X launched at $299.
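A quick sanity check on that percentage, using the launch MSRPs quoted above:

```python
# Launch MSRPs quoted above (USD).
ryzen_3600 = 199   # Ryzen 5 3600 (Zen 2)
ryzen_5600x = 299  # Ryzen 5 5600X (Zen 3)

increase = (ryzen_5600x - ryzen_3600) / ryzen_3600 * 100
print(f"5600X vs 3600 launch price: +{increase:.0f}%")  # +50%
```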
 
Even going from a 3600 to a 5800X with a 3080 at 1440p was only about 10% in the best case, and certainly not worth the 400 quid I paid. On the other hand, the £650 I paid for the 3080 was about a 120% boost over my old GTX 1070 Ti, so well worth the cash.
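For what it's worth, the rough "pounds per percent of uplift" maths behind that, using the figures above (the uplift percentages are estimates from this one build, not benchmark data):

```python
# Rough value-for-money comparison of the two upgrades described above.
# Uplift percentages are the poster's own estimates, not benchmark data.

cpu_cost, cpu_uplift = 400, 10    # GBP, ~% faster at 1440p (3600 -> 5800X)
gpu_cost, gpu_uplift = 650, 120   # GBP, ~% faster (GTX 1070 Ti -> RTX 3080)

print(f"CPU upgrade: £{cpu_cost / cpu_uplift:.2f} per 1% gained")  # £40.00
print(f"GPU upgrade: £{gpu_cost / gpu_uplift:.2f} per 1% gained")  # £5.42
```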

Yeah, that sounds about in line with what the tech review roundups were reporting. As you say, paying that premium for such a small uplift (not needed as I don't game at 1080p high fps) means it's better to save the cash for a substantial upgrade later down the line.
 

TSMC's pricing has been ever increasing. Google "TSMC 7nm price increase". You should inform yourself about topics you want to argue.

Edit: Just 4 days ago TSMC announced another 20% price increase on 7nm and under.
 
While that is the case, I think come Alder Lake the 5600X will be under £200, which will show just how much profit AMD was gobbling up.
 
What you think is wrong. AMD's prices have tracked TSMC's price increases. AMD are probably moving to 3D V-Cache now to avoid the latest TSMC price increase.

I hope AMD are making a healthy profit. They have the better products.
 
As long as it trickles down in price I can wait. Charging high prices at launch isn't too bad, as sometimes the herd rushes because they have to have it, so let them pay to be first. What we don't want to see is prices staying the same or increasing; 10%-type gains are of no interest to some earlier-gen users.

A 10% performance uplift across a number of cores is the difference between a loss and a profit in some tasks.

In this case (gaming) you gain 20-30% by choosing RDNA2 over Ampere, even with an Intel chip. So if bang for buck is critical, buy a lower-tier RDNA2 card and a lower-priced CPU.
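As a sketch of the bang-for-buck argument, with made-up prices and frame rates purely for illustration (none of these figures come from benchmarks):

```python
# Illustrative fps-per-pound comparison; all prices and frame rates
# are hypothetical placeholders, not real benchmark numbers.

def fps_per_pound(avg_fps: float, price_gbp: float) -> float:
    """Frames per second delivered per pound spent."""
    return avg_fps / price_gbp

builds = {
    "lower-tier RDNA2 card + cheaper CPU": (100, 700),    # (avg fps, total GBP)
    "higher-tier Ampere card + faster CPU": (115, 1100),
}

for name, (fps, price) in builds.items():
    print(f"{name}: {fps_per_pound(fps, price):.3f} fps/£")
```

On these made-up numbers the cheaper build delivers more frames per pound even though the dearer one has the higher absolute fps, which is the whole bang-for-buck point.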
 
I think it depends on your selection of build components in the first place. Then it depends on what resolution you are gaming at, and whether you prefer frame rates locked within your sync range or need max fps/Hz at all costs.
 
The RTX processing overhead is across the board; that is just how the architecture has been designed. Either way, you can get more done with less on an all-AMD system. In no way should I be seeing less performance when upgrading to an RTX 3080 from a Vega 64 or 5700 XT, regardless of what machine I'm fitting it in.
 
That's just the way a CPU bottleneck works if you try to run a high-end card on a low-end machine; it just affects Nvidia more than AMD.

Besides, cards like the 3080 are designed for 4K, so if you run them below that, and on an old CPU, then you can't expect the full potential. It's the same with AMD, just to a lesser degree.
 
They are not designed for 4K; that would be the 3090. I've seen over 9GB of VRAM use on the 3080 at 4K.
 
If people are buying and using 3060 Tis/6700s for 1440p with good results, I think it is fair to say (even Jensen's kitchen launch championed it) that a 3080 is indeed a 4K card. What do you think DLSS is also for?
 
You can't rely on DLSS to offset a memory limitation. It's marketing fluff. The 3080 should have more memory. Even the 3060 ships with 12GB. The 3060 and 3090 have more VRAM than is needed, and the 3080 is struggling.
 
If the VRAM were an issue it would have shown up in reviews, yet the 3080 beats AMD's 16GB 6800 XT in the majority of titles at 4K.
 
Well, it is. Do you want some screenshots?

The RX 6800 XT is a better product than the RTX 3080. I've used both and RDNA2 is just a better experience.
 