
RDNA2 Refresh for 2022 6x50XT cards

It's really not. Nvidia already have 85% market share; they don't care about you so much, and they can't really grow their market by selling more cards to you and me, which is why they are going after miners so aggressively.

AMD, on the other hand, want a bigger share of our custom, because mining is not a long-term thing. Right now, more than Nvidia, they want you and me to buy their cards.

If AMD wanted more market share they would have priced their cards lower than Nvidia instead of sliding them into the stack. AMD have been on a massive cash grab with CPUs and GPUs this gen with their 'we are the best so we can charge more' mentality.
 

Given that both Nvidia and AMD are selling every card they make, I don't see how pricing them lower would make any difference to market share.
 
18Gbps VRAM....

Obviously AMD don't feel the current VRAM speed is enough....

:cry:


Even faster is coming for next-gen GPUs: Samsung is already sampling 24Gbps GDDR6 VRAM, and SK Hynix announced today that it has started sampling its 27Gbps GDDR6 VRAM.

Next-gen GPUs later in 2022 will have well over 1TB/s of bandwidth - which they will need for 8K gaming.
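(If you want to sanity-check the 1TB/s figure, here's a quick back-of-the-envelope Python sketch; the bus widths below are illustrative assumptions on my part, not confirmed specs for any announced card.)

```python
# Rough GDDR6 bandwidth check: peak bandwidth = per-pin speed * bus width / 8.
# Bus widths here are assumptions for illustration, not confirmed specs.

def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return pin_speed_gbps * bus_width_bits / 8

print(bandwidth_gb_s(18, 256))  # 576.0  - 18Gbps on a 256-bit bus (6x50XT class)
print(bandwidth_gb_s(24, 384))  # 1152.0 - 24Gbps on a 384-bit bus: well over 1TB/s
```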
 

You can say that in hindsight, but when the cards were released the price-to-performance was set by the MSRP. You can't use the retail prices in the current market as a yardstick; they are anomalous.
 

You're talking about market share though. AMD wouldn't have got more market share by pricing lower. They'd have still sold all their cards, just made a lot less money.

Would a few people have potentially not bought Nvidia to try and get some impossible-to-source AMD card? Maybe, but given the cards were pretty much impossible to get hold of anyway, the impact of this seems unlikely to be significant.
 

I think you missed the whole point of my post and took it too seriously.... :cry: I'm using the same logic that some have used with regards to Nvidia releasing a 3080 12GB, i.e. "said manufacturer does not feel that said spec is good enough, hence why they release said refreshed product" :cry:
 


Ok boomer
 

Agree about Nvidia, strongly disagree on AMD. If they wanted a bigger share of our custom, why no MSRP cards in the UK?

They don't want a bigger share of our custom here in the UK; they earn enough from the EU, it would appear, otherwise we'd actually have a choice here.

But no way was I going to pay AIB prices for AMD cards just because they can't be arsed to sell at MSRP here.
 

They did price them lower than Nvidia:

6700XT: $479
3070: $499

6800: $579

6800XT: $649
3080: $699

6900XT: $999
3090: $1199
 
It's actually kind of amazing when you look at the MSRPs for the medium/high-end range (ignoring the scalping/pricing situation for a moment): for an extra £30/50 or so, you get so much more with Nvidia imo:

- DLSS + DLAA (sadly DLAA isn't in many games yet though)
- access to AMD's FSR
- NIS, essentially AMD's FSR but global for any game you want
- eventually access to Intel's version of DLSS, although it is looking to be a damp squib with the limited/niche games it will be in...
- superior ray tracing performance
- DLDSR (given it makes use of the tensor cores for AI upscaling, I don't think AMD will be able to do anything similar with RDNA 2...)
- being able to inject the RT ReShade filter via GeForce Experience in select games (still not a patch on properly implemented ray tracing, but with time and a lot more work it could be quite the game changer)

And other things like a better VR experience, better streaming, etc.

Nvidia are just providing a lot more value for money at the most popular tiers these days imo. Gone are the days where AMD were really toe to toe on "all" fronts and considerably cheaper; they need to bring back the game bundles they used to have, those added a ton of value and were a good incentive.
 
Well, I always consider the amount of VRAM to be a great indicator of longevity.

In which case, whether 8GB 3070 vs 12GB 6700 XT, or 10GB 3080 vs 16GB 6800 XT, the Nvidia card looked the worse value to me.

Of course, plenty of buyers don't care about longevity, especially on forums like this.

But all those cards were too expensive for me anyway at launch, and since then things have got totally crazy. For me, even if prices were back at MSRP, there's little value this generation.
 
At the higher end AMD isn't too bad, but it's at the lower end where they fall down: the 6600 XT at $379 vs the 3060 Ti at $399, with the 3060 Ti being 15% faster and having better features. Then the 6600 at $329, which is slower, has less VRAM and a worse feature set vs the 3060 at $329. And soon the 6500 XT, which despite its $199 MSRP has half the VRAM and half the memory bus width, and runs at only PCIe x4 vs the full PCIe x16 of the 3050 8GB at $249.
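(To put that x4 link in numbers, here's a rough sketch of theoretical PCIe link bandwidth; the per-lane figures assume 128b/130b encoding for gens 3 and 4, and real-world throughput is somewhat lower.)

```python
# Theoretical peak PCIe link bandwidth, GB/s per lane after 128b/130b encoding.
PER_LANE_GB_S = {3: 0.985, 4: 1.969}

def link_bw_gb_s(gen: int, lanes: int) -> float:
    """Peak one-direction bandwidth for a PCIe link."""
    return PER_LANE_GB_S[gen] * lanes

print(link_bw_gb_s(4, 4))   # ~7.9 GB/s  - 6500 XT on a PCIe 4.0 board
print(link_bw_gb_s(3, 4))   # ~3.9 GB/s  - the same card in a PCIe 3.0 slot
print(link_bw_gb_s(4, 16))  # ~31.5 GB/s - a full x16 card like the 3050
```

The x4 limit bites hardest on PCIe 3.0 boards, where the already-narrow link is halved again - and a 4GB card is exactly the kind that spills over the bus to system RAM.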
 

8GB VRAM defo is not enough imo, but I don't think the 3070 cuts it for grunt anyway (defo not at 4K), so settings need to be reduced, which in turn reduces VRAM usage as well. And if you have to turn down 1-2 settings to avoid issues from the lack of VRAM, is it really any different to having to drop settings because of a lack of grunt? The card is still going to do incredibly well if you know what its limitations are and what to do to avoid those issues, i.e. don't be like the guy on here who tried running Cyberpunk maxed on everything with a 3070 and complained about it..... bearing in mind even a 3090 can't do max settings in Cyberpunk with an acceptable fps experience....

10GB-wise, we're all still waiting, 1 year 5 months on, to see any real evidence of issues here. Will it be an issue at some point? Perhaps, and then the question is how many games we'll be talking about. I'll be surprised if we see more than a handful, if that.....

Things like DLSS (which in turn also reduces VRAM usage) are only going to become more important as games push graphical effects, especially on the ray tracing front going forward.
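(A rough sketch of why the upscalers cut the load: the per-axis render-scale factors below are the standard published DLSS mode ratios, though the actual VRAM saving varies from game to game.)

```python
# Per-axis render scale for DLSS's standard quality modes.
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58,
              "Performance": 0.5, "Ultra Performance": 0.333}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple:
    """Resolution the GPU actually renders before the upscale to output res."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

w, h = internal_res(3840, 2160, "Quality")
print(w, h, f"{(w * h) / (3840 * 2160):.0%} of native 4K pixels")
# -> 2561 1441, ~44% of the pixels shaded each frame
```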

Then you have DirectStorage: in theory this will change how VRAM management works entirely, and VRAM capacity should not be as much of an issue.

Factor in that Nvidia are regarded as being better with their VRAM management than AMD, i.e. they use less VRAM, whereas AMD generally gobble up as much as they can; whether this is a design decision or an optimisation problem is another question.

In terms of AMD's longevity, given that ray tracing is being adopted in the vast majority of games now, and given that AMD are considerably slower at it (even when the effects are limited, and in their own sponsored games too) as well as unable to handle as complex RT scenes, there is no doubt AMD will suffer in the long run here imo, especially since we have already seen, in several RT titles, a 3070 matching/besting a 6900 XT because of the lack of RT ability.

TL;DR: either way, with time, every card is going to hit some limitation. The question to ask is why people will be upgrading in a year or so:

Upgrading for better ray tracing performance?
Upgrading for better rasterisation performance?
Upgrading for more VRAM?

No doubt the next gen of GPUs will be far more interesting, and at this stage you might as well wait; however, they are likely to be even more expensive.
 

Longevity also requires GPU power. As games get more demanding, you’ll need to turn settings down, which will reduce VRAM requirements.

Also consider features, where the RDNA2 cards are weak on RT and don't have spare capacity for DL. Having DLSS is great for longevity, and I'm sure AMD are working on an equivalent, which is probably catered for in RDNA3.
 


I think the whole lineup is overpriced.

The 3060 Ti gives the 6700 XT a run for its money and smashes it in anything heavily ray traced. The 5700 XT was £400.. roughly what the 6700 XT should cost to be competitive. Why is it 20% more?

The 6800 sat in no man's land because it should be competing with the 3070 at £500.

The 6800 XT is fine if you disregard ray tracing and production work, but £600 would have definitely given it a lot more appeal.

The 6900 XT.. whatever lol. The only reason the 3090 gets to sit up there is its money-making potential (production work and content creation).

Cash grab imo, same as Zen 3... the only difference being that Zen 3 was actually a superior product.

I reckon Nvidia screwed the pooch on VRAM though; the 3070 and 3080 should have been 12GB. And I'm not agreeing with the pricing of the RTX 3000 series either - it's a disgrace what you have to pay for a half-decent card at MSRP.. £400 for a 60-class card.. £1200 for a 'gaming' GPU.. if my PC wasn't so central to my existence I'd be telling them both to go ****** themselves.
 

The 3070 does fine for 1440p, and for last-generation games and older at 4K - but I'd personally agree that 8GB is a limitation, especially at 4K.

CP2077 played fine on my rig with everything on ultra at 1440p, thanks to DLSS Quality - but 4K drops down to ~40FPS and, if you aren't careful, can hit up to 10GB VRAM utilisation unless you drop DLSS down to Balanced, never mind no DLSS. (At which point, on my 43" 4K display, it is very noticeably lower quality than native 4K and you're kind of not getting any advantage over 1440p.)

Without DLSS, IIRC I was averaging about 27FPS at 1440p with everything on ultra - the couple of settings that went to Psycho would bring it down to about 19, but IIRC they don't change much noticeably aside from in some small areas of the game.

The 3080 definitely should have been a 12GB card minimum - I don't think it makes much odds on the 3070, as you start to run out of GPU horsepower about the time VRAM utilisation gets around the 8GB mark in most [demanding] games/applications.
 