PC Gamer: RX 6500 XT looks worse on paper than AMD's $199 GPU from six years ago

Soldato
Joined
7 Dec 2010
Posts
8,222
Location
Leeds
It's a mobile GPU: well done AMD, well done.

Want something funny to read? Then read this thread: https://forums.overclockers.co.uk/threads/rx-7900xt-15-360-cores-mcm-tapeout-q4.18933812/

The 7900 XT is supposedly going to be 3x the 6900 XT in performance... Read it, it makes for funny reading, and some people clearly don't understand how much power a GPU can really use. Oh, and 3x a 6900 XT at a real price of £1,500 would for sure make it a £4,500 real price, and then scalped on top of that... to £5k+.


AMD the saviour of gamers... they give you a card with yet another fake MSRP, a card no better than 5-year-old tech, and actually worse, as it is missing basic hardware that earlier cards had.

I think people are confusing AMD the CPU company with AMD the graphics company; they are two very different companies, and the graphics side has always been a mess for many reasons. I really wish ATI hadn't sold up to AMD and had gone to another company instead.
 
Soldato
Joined
17 Jun 2004
Posts
7,587
Location
Eastbourne , East Sussex.
Polaris 21 XT, which was used in the RX 560, was 123mm2 and used a 128-bit memory controller. The RX 6500 has a smaller GPU, a 64-bit memory controller, and worse media decode/encode capability.

Media decode is the same (it's the encode which is different). Navi 14 XTX was 158mm2 with a 128-bit bus.

AMD are banking on clocks 1GHz+ higher than the 5500 XT to carry it through.
 
Don
Joined
19 May 2012
Posts
17,062
Location
Spalding, Lincolnshire
It has a 64-bit memory interface. No card in history with that memory interface has been a 1080p card without severe compromise.
No other 64-bit card has ever had anything like Infinity Cache implemented :)
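For a sense of what that interface costs in raw bandwidth, here's a quick back-of-envelope sketch. The memory speeds are assumptions based on the published specs (18 Gbps GDDR6 for the 6500 XT, 14 Gbps for the 5500 XT), not measured figures:

```python
# Back-of-envelope peak bandwidth: (bus width in bits / 8) bytes * data rate.
# Memory speeds are assumptions from published specs, not measured numbers.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak GDDR bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(64, 18.0))   # RX 6500 XT (64-bit, 18 Gbps): 144 GB/s
print(peak_bandwidth_gbs(128, 14.0))  # RX 5500 XT (128-bit, 14 Gbps): 224 GB/s
```

So even with much faster memory, the 64-bit bus starts a long way behind the old 128-bit cards; the Infinity Cache has to make up that gap.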

When all is said and done, no matter if this card boosts to infinity and beyond, it only has 4GB of memory, and that wasn't enough five years ago, never mind now.
4GB is absolutely fine for the target market - Counter-Strike, Fortnite and the like still run fine at 1080p on 2GB cards from years ago.

There have to be some new low-end cards launched at some point - you can't keep recommending that people on a tight budget buy RX 480s and the like that have been mined on and are now unreliable.
 
Soldato
Joined
9 Nov 2009
Posts
24,771
Location
Planet Earth
Media decode is the same (it's the encode which is different). Navi 14 XTX was 158mm2 with a 128-bit bus.

AMD are banking on clocks 1GHz+ higher than the 5500 XT to carry it through.

Problem is the RX 5500 XT 4GB had problems with the PCI-E x4 interface, and this might have the same issue too. And if the street price is nearly £300, it really is not great value.

No other 64-bit card has ever had anything like Infinity Cache implemented :)


4GB is absolutely fine for the target market - Counter-Strike, Fortnite and the like still run fine at 1080p on 2GB cards from years ago.

There have to be some new low-end cards launched at some point - you can't keep recommending that people on a tight budget buy RX 480s and the like that have been mined on and are now unreliable.

£150~£350 is not low end - it's mainstream. The problem is that the street price seems closer to £300.

It's literally the price range in which 95% of the gamers I know buy dGPUs, myself included, and most of them game at 1080p or QHD.

Plus a lot of those games you talk about are fine on an older dGPU because of their cartoony graphics, and this card is going to be way too weak for newer games going forward.

On top of this, the interface is only PCI-E x4, which was an issue for the RX 5500 XT 4GB once it ran out of VRAM - and if you are on an older system with no PCI-E 4.0, the PCI-E bandwidth is halved, which covers a huge number of AMD and Intel systems. Polaris is going to have fewer issues in this regard, sadly.
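Rough numbers for that halving, as a sketch using the standard Gen3/Gen4 per-lane rates after 128b/130b encoding:

```python
# Approximate usable PCI-E throughput per lane after 128b/130b encoding,
# scaled to the x4 link the RX 6500 XT uses.

LANE_GBS = {"3.0": 8 * 128 / 130 / 8,   # ~0.985 GB/s per lane
            "4.0": 16 * 128 / 130 / 8}  # ~1.969 GB/s per lane

for gen, per_lane in LANE_GBS.items():
    print(f"PCI-E {gen} x4: {4 * per_lane:.2f} GB/s")
# PCI-E 3.0 x4: 3.94 GB/s  <- what older boards get
# PCI-E 4.0 x4: 7.88 GB/s
```

That 3.94 GB/s is what the card has to lean on whenever the 4GB VRAM overflows on an older board.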
[AMD chart: Infinity Cache hit rate by cache size and resolution]

If you look at the AMD chart, the ideal amount of Infinity Cache for 4K is 128MB (which the RX 6800/RX 6900 series has), 96MB is ideal for QHD (which the RX 6700 series has), and the RX 6600 series ideally needed 64MB for 1080p. However, it only has 32MB, which is why it falls behind at QHD and even at 1080p at times (once you start enabling AA) - that is also my experience from comparing cards like the RTX 3060, RTX 3060 Ti, RX 6600 and RX 6600 XT, as my gaming mates have these cards.

The RX 6500 XT only has 16MB of cache, which even by AMD's own figures is not enough. 16MB will have nearly half the cache hit rate that 32MB does.
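A first-order sketch of why the hit rate matters so much: only misses have to touch the GDDR6, so effective bandwidth scales roughly with 1/(1 - hit rate). The hit rates below are hypothetical placeholders for illustration, not AMD's published figures:

```python
# First-order effective-bandwidth model: cache hits are (near) free, so the
# GDDR6 bus only serves misses. Hit rates here are HYPOTHETICAL placeholders.

def effective_bandwidth_gbs(raw_gbs: float, hit_rate: float) -> float:
    """Effective bandwidth if only misses consume the raw DRAM bandwidth."""
    return raw_gbs / (1.0 - hit_rate)

raw = 144.0  # 64-bit, 18 Gbps GDDR6
print(effective_bandwidth_gbs(raw, 0.50))  # 50% hit rate -> 288 GB/s effective
print(effective_bandwidth_gbs(raw, 0.25))  # halve the hit rate -> 192 GB/s
```

So halving the hit rate doesn't halve the benefit, but it takes a big bite out of an already skinny bus.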

In a power-limited laptop scenario it might be OK, given the lowish clock speeds paired with mobile CPUs. However, in a desktop scenario I can see it having bigger issues.

Honestly I would be more inclined to wait and see how Intel Arc pans out.
 
Soldato
Joined
9 Nov 2009
Posts
24,771
Location
Planet Earth
Nah, it's low end in 2022; the days of cheap cards are now long gone. With 7nm wafers over $20,000 (Ian Cutress from 3 days ago), nothing will be cheap anymore.

Only because people justify the prices on tech forums. It's a 107mm2 salvaged die - plenty of phones use similar-sized SoCs on 7nm/6nm and literally cost less as a whole package than a £300 RX 6500 XT.
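For a sense of scale on the wafer maths, here's a rough sketch using the classic dies-per-wafer approximation, ignoring yield, scribe lines and edge exclusion, with the ~$20,000 wafer price quoted above:

```python
import math

# Candidate dies on a 300mm wafer via the classic approximation:
# wafer area / die area, minus an edge-loss term. Ignores yield entirely.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

n = dies_per_wafer(107)                 # ~596 candidate dies at 107mm2
print(n, f"${20_000 / n:.0f} per die")  # ~$34 per unyielded die
```

Even allowing for yield, packaging, board and margins, the bare silicon is clearly a small slice of a £300 retail price.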

The Xbox Series S has a nearly 250mm2 7nm SoC, and I have seen them for nearly £200.

You can get 6-core Zen 2/Zen 3 based laptops with an RTX 3060 for £900~£1,000 relatively easily. I can find prebuilt systems for £900~£1,100 with RTX 3060/RTX 3060 Ti dGPUs relatively easily. It's literally cheaper to buy a prebuilt gaming system, which is why I have told people to go that way.

The reality is that I told people years ago to stop justifying the price increases by spouting "muh wafers cost £10 million" stuff - it was the same stuff people said about 28nm when the £1,000 Titan dropped, and Nvidia margins started skyrocketing after that. It's in their own interests to push this information out to justify jacking up prices, while their own margins massively go up each year, which is indicative of what they are doing.

It's straight out of the Intel/Nvidia playbook.

Companies realised gamers would just pay any price, and they look at us as very high-margin whales. Gamers got what they deserved, because they could simply have stopped buying every bloody turd AMD/Intel/Nvidia launched at massively above RRP. Literally everyone I knew who got an RTX 3060/RTX 3060 Ti/RX 6600/RX 6600 XT didn't pay massively over RRP, and many just stayed put.

How did they manage this feat? By being patient, and if they couldn't get a card at or close to RRP, by sticking with what they had, buying a console or... shock... horror... doing something else instead of gaming!
 
Soldato
Joined
9 Nov 2009
Posts
24,771
Location
Planet Earth
It's a case of wait and see with the 6500; the raw numbers don't help (and neither does Asus, with their "**** you AMD, we're going in at double what you want" pricing).

At £200 it's merely OK; at nearly £300, what is the point? It needs a relatively new system to get the most out of the GPU anyway. The RTX 3050 looks a better bet, but with no FE version the RRP will be another con right there.

It's like someone selling narcotics - they are banking on the desperation of gamers to buy this. But if RTX 3060 based laptops have not been that hard to find under £1,000, how long before you can find a Ryzen based laptop with this dGPU for, say, £600~£700 as an entire system? If you are on a budget, you might as well buy this GPU in a complete system. Buying it as an individual part means AIB partners, suppliers, etc. all add their rip-off margin to it.

If companies can put these dGPUs in loads of systems and the prices are OKish, it shows you what they think of DIY PC builders.
 
Soldato
Joined
21 Oct 2002
Posts
7,424
Location
Bexhill on sea
Well, here's a bit of info no one wanted to hear! I just stumbled across a YouTube video from a channel called "Son of a Tech", who deals with MINING! It appears the high boost clocks of the 6500 XT, and possibly the 6400, are of interest for mining "Flux" or summat to do with that. Whatever it is, even this waste-of-time card is now popping up on miners' radar, and that's after all the effort AMD made to stop that from happening.
 
Associate
Joined
27 Mar 2010
Posts
1,463
Location
Denmark
So the question is how much the 6500 XT's newer architecture, faster clock speeds and 16MB of Infinity Cache help alleviate a possible bandwidth issue, especially for PCs on PCI-E 3.0.
Just 2 days left until the card releases and reviews land all over the place!
 
Soldato
Joined
17 Jun 2004
Posts
7,587
Location
Eastbourne , East Sussex.
I doubt the tiny 16MB cache is going to recover the 50% performance loss seen in some titles.

If you watch the MLID video on the same subject, he has a very valid point: the performance is all over the place. PCI-E x4 has a greater effect in some titles than others (and it's not a 1440p-anything card, so why test at that?). He also said the cache does help; however, the 6500 XT will be a cherry picker's dream (as will the RTX 3050) - both excel in some titles, far faster than the spec suggests, and fall on their asses in others. However, the suggestion is that a possible 8GB card could well be in the works.
 