PC Gamer: RX 6500 XT looks worse on paper than AMD's $199 GPU from six years ago

Associate
Joined
31 Dec 2008
Posts
2,261
As @RavenXXX2 pointed out, HUB most likely already tested the card and knows it will suffer from limited PCIe bandwidth; otherwise they would not release that video.
The titles they chose are the same ones they previously found to suffer from this, but this is not a review.
Obviously “some” people will say “it has Infinity Cache”, but that doesn’t seem to reduce VRAM usage on their other cards and is there to help with memory bandwidth.
But what do I know, I’m just an “irritating nvidia shill” after all, right?
 
Soldato
OP
Joined
15 Jan 2006
Posts
7,768
Location
Derbyshire
I miss the days when mid-range graphics cards were slightly cut-down high-end things. I seem to recall the 9500 Pro was a $199 card and basically you messed with a resistor, flashed the BIOS, and a 9700 (or was it the Pro?) was yours.

Oh yeah, and the fastest graphics card you could buy at the time was $399. OK, it's a long time ago, and inflation, and whatnot...
 
Last edited:
Associate
Joined
22 May 2015
Posts
1,937
Location
Manchester
I miss the days when mid-range graphics cards were slightly cut-down high-end things. I seem to recall the 9500 Pro was a $199 card and basically you messed with a resistor, flashed the BIOS, and a 9700 (or was it the Pro?) was yours.

Oh yeah, and the fastest graphics card you could buy at the time was $399. OK, it's a long time ago, and inflation, and whatnot...

Yeah I remember this sort of thing, unlocking pixel pipes on certain models of card. I think the one I remember best was turning an X800 GTO into an X800 PRO with a vBIOS flash that unlocked 33% more performance.

Imagine that now :eek:
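For what it's worth, the 33% figure lines up with a straight pipeline-count unlock. A quick sanity check, assuming (from memory, so treat the pipe counts as a guess) a 12-pipe card unlocking to the full 16 pipes:

```python
# Rough sanity check: performance uplift from a pixel-pipeline unlock.
# Pipe counts are assumptions from memory, not verified specs.
locked_pipes = 12    # assumed shipping configuration
unlocked_pipes = 16  # assumed full-die configuration after the vBIOS flash

uplift = unlocked_pipes / locked_pipes - 1
print(f"Theoretical fill-rate uplift: {uplift:.0%}")  # -> 33%
```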
 
Soldato
Joined
6 Feb 2019
Posts
17,468
As @RavenXXX2 pointed out, HUB most likely already tested the card and knows it will suffer from limited PCIe bandwidth; otherwise they would not release that video.
The titles they chose are the same ones they previously found to suffer from this, but this is not a review.
Obviously “some” people will say “it has Infinity Cache”, but that doesn’t seem to reduce VRAM usage on their other cards and is there to help with memory bandwidth.
But what do I know, I’m just an “irritating nvidia shill” after all, right?


Infinity Cache won't fix the issue. You can see this on other cards that already use Infinity Cache when you compare them to non-Infinity-Cache AMD GPUs - it doesn't reduce VRAM requirements and it doesn't prevent using system RAM as a memory pool across the PCIe link. If Infinity Cache did any of this, AMD would not have given its other cards 16GB of VRAM - that capacity would be useless, because Infinity Cache could handle it.


AMD's argument will be that even on gen 3 x4 the 6500 XT still offers "decent" 1080p performance, and they're not wrong, but reviewers will tear it up because it's leaving a lot of performance on the table, especially when the actual price in stores for the card is $400 USD.
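To put a rough number on the bandwidth point: a cache in front of VRAM raises effective bandwidth in proportion to how often it hits, but it does nothing for capacity. A crude sketch - every figure here is an illustrative assumption, not an AMD spec:

```python
# Crude model: a cache helps effective bandwidth but not capacity.
# All figures are illustrative assumptions, not measured values.
vram_bw = 144.0    # GB/s - assumed narrow 64-bit GDDR6 memory bus
cache_bw = 1000.0  # GB/s - assumed on-die Infinity Cache bandwidth
hit_rate = 0.5     # assumed fraction of accesses served by the cache

# Hits are served at cache speed; misses still pay the VRAM rate.
effective_bw = 1.0 / (hit_rate / cache_bw + (1.0 - hit_rate) / vram_bw)
print(f"Effective bandwidth: {effective_bw:.0f} GB/s")  # ~252 GB/s here

# A 16 MB cache doesn't shrink a >4 GB working set, so VRAM pressure
# (and spills over the PCIe link) are unaffected.
```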
 
Last edited:
Associate
Joined
31 Dec 2008
Posts
2,261
Yeah I remember this sort of thing, unlocking pixel pipes on certain models of card. I think the one I remember best was turning an X800 GTO into an X800 PRO with a vBIOS flash that unlocked 33% more performance.

Imagine that now :eek:

It was still happening not that long ago.
The 4GB RX 480 I bought on release day for £175 I managed to unlock to an 8GB card by flashing the BIOS, less than a week after owning it, from what I remember.
Funny thing is that even the box stated 8GB and only had a sticker put over it saying 4GB. :D
 
Associate
Joined
11 Jun 2021
Posts
1,024
Location
Earth
I don't see why a 4GB RX 5500 should behave any differently than a 4GB RX 6500 when on PCI-E 3.0 x4. It's the same bottleneck. If your game wants to use system memory to compensate for the lack of VRAM, your limit is the PCIe bus. 4GB means you hit that bottleneck more frequently.

This is another RX 5500 test. https://www.techspot.com/review/2396-pcie-bandwidth-test/

If you're on PCI-E 3.0 it's a major limitation, and there is no reason whatsoever why the RX 6500 is going to be any different. The combination of 4GB and PCIe 3.0 x4 knackers it in a way that wouldn't happen if you had either an x8 bus, a PCIe 4.0 bus, or 8GB of VRAM.



Magic ?
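For anyone who wants the raw numbers behind that, here's a quick sketch of theoretical link bandwidth (spec transfer rates with line-encoding overhead only; real-world throughput is lower):

```python
# Theoretical one-way PCIe bandwidth: lanes x per-lane rate.
# Gen 3: 8 GT/s, 128b/130b encoding -> ~0.985 GB/s per lane.
# Gen 4: 16 GT/s, same encoding     -> ~1.969 GB/s per lane.
PER_LANE_GBPS = {3: 8 * 128 / 130 / 8, 4: 16 * 128 / 130 / 8}

def link_bw(gen: int, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe link."""
    return lanes * PER_LANE_GBPS[gen]

for gen, lanes in [(3, 16), (4, 4), (3, 4)]:
    print(f"PCIe {gen}.0 x{lanes}: {link_bw(gen, lanes):5.1f} GB/s")
# PCIe 3.0 x16:  15.8 GB/s
# PCIe 4.0 x4:    7.9 GB/s
# PCIe 3.0 x4:    3.9 GB/s  <- what an x4 card gets in a gen 3 board
```

Once a game starts paging textures over a ~3.9 GB/s link, frame times suffer, which is presumably what the TechSpot scaling test above is showing.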
 
Soldato
Joined
6 Oct 2007
Posts
22,261
Location
North West
Soldato
OP
Joined
15 Jan 2006
Posts
7,768
Location
Derbyshire
I think part of the issue I have here is that I have trouble believing that this cost anywhere near $199 to manufacture. Everything about this card screams entry-level and cost-cutting to me.

The complication is that the GPU is on a 6nm node. TBH I have no idea what that means for manufacturing costs, because there's not much else to compare it to at the moment.

The GPU core self-reports to GPU-Z as PCI-E 4.0 x16 (before it's corrected). It's not quite clear to me from what TechPowerUp are reporting exactly where on the card the x4 bottleneck physically is. Do they mean it's on the chip (but not in its core), or is it just a limitation of the card's PCB?
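That said, a rough dies-per-wafer sketch suggests why even a small 6nm die shouldn't cost much to make. Every input here is a guess on my part (the wafer price especially), just to show the shape of the maths:

```python
import math

# Back-of-the-envelope die cost. All inputs are assumptions:
wafer_diameter = 300.0  # mm, standard wafer
die_area = 107.0        # mm^2, Navi 24 is reported at roughly this size
wafer_cost = 10_000.0   # USD, hypothetical N6 wafer price - pure guesswork
yield_rate = 0.85       # hypothetical fraction of good dies

# Classic dies-per-wafer approximation; second term accounts for edge losses.
r = wafer_diameter / 2
dies = (math.pi * r**2 / die_area
        - math.pi * wafer_diameter / math.sqrt(2 * die_area))

good = dies * yield_rate
print(f"~{good:.0f} good dies per wafer -> ~${wafer_cost / good:.0f} per die")
```

Even if the real wafer price were double my guess, the die itself would still be a small slice of the $199 before memory, PCB, cooler, and margins.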
 
Soldato
Joined
1 Nov 2002
Posts
10,156
Location
Sussex
The x4 bottleneck is on the card's PCI-E slot. It physically doesn't have the additional lanes.

Why would it cost $199 to make, isn't that the MSRP? It probably costs about 50 quid to make. It's a tiny chip.
 
Soldato
OP
Joined
15 Jan 2006
Posts
7,768
Location
Derbyshire
Yes - and I appreciate that AMD need to make their money back on design fees etc. However, compared to recent years, if you showed me those specs and asked me to guess a price I'd have it somewhere in the region of £100. AMD themselves are comparing it to the GTX 1050 and GTX 1650. Those cards launched at an RRP not much more than that.
 
OcUK Staff
Joined
17 Oct 2002
Posts
38,206
Location
OcUK HQ
Yes - and I appreciate that AMD need to make their money back on design fees etc. However, compared to recent years, if you showed me those specs and asked me to guess a price I'd have it somewhere in the region of £100. AMD themselves are comparing it to the GTX 1050 and GTX 1650. Those cards launched at an RRP not much more than that.


A 1050 Ti is £200 and a 1650 is £250 currently; what cards launched at is irrelevant, times have changed. The 1650 sells in huge volumes at current prices.

If AMD can comfortably outperform those cards at the same money in the current climate, that is what matters. Prices from yesteryear don't count, as those kinds of prices no longer exist unfortunately.
 
Soldato
Joined
9 Nov 2009
Posts
24,771
Location
Planet Earth
Yes - and I appreciate that AMD need to make their money back on design fees etc. However, compared to recent years, if you showed me those specs and asked me to guess a price I'd have it somewhere in the region of £100. AMD themselves are comparing it to the GTX 1050 and GTX 1650. Those cards launched at an RRP not much more than that.

The AMD/Nvidia cartel is basically depending on the ignorance/desperation of PCMR to throw money at these turds. Basically PCMR made fun of Mac fans, but Mac fans are having the last laugh.

However, with Blizzard being bought up by MS, I can see even more focus on consoles, and game streaming to connected devices. PC literally only made up 20% to 25% of all gaming revenue in 2020, and if games such as WoW are now ported to consoles and streaming services, the landscape might look very different over the next 10 years!
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,656
Location
Uk
If AMD can comfortably outperform those cards at the same money in the current climate, that is what matters
Those cards will run fine on older PCIe gen 3.0 boards/CPUs, but it looks like this could be an issue for the 6500 XT, which means it may not comfortably outperform them.
 