PC Gamer: RX 6500 XT looks worse on paper than AMD's $199 GPU from six years ago

Associate
Joined
11 Jun 2021
Posts
1,024
Location
Earth
I see what you mean by the curve now. I just go +100 on the GPU and +400 on the memory; I have had it running +150 / +600 but had memory artefacts, so I just left it stock. It doesn't seem worth it to me, though I have never tried the curve tuning, I might have to :)

Good score, 11% up on me :)

Would have been a great generational price/performance leap if they had actually cost £369 with a basic cooler!
 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
Would have been a great generational price/performance leap if they had actually cost £369 with a basic cooler!

Yes, it's not an upgrade for someone like me though. According to TPU the 3060 Ti is 10% faster than my card, which ties in with what we see here, and the 3070 is 15% faster again, so 25 to 30% overall. That's not completely horrible for a price-to-price (MSRP) upgrade, but it's also not enough, and it has the same memory architecture. I don't trust a 256-bit 8GB memory system for 3 years of game advancement; I feel like it's under-performing a bit and has some pretty aggressive obsolescence built in.
 
Soldato
Joined
15 Oct 2019
Posts
11,689
Location
Uk
I think people fail to realise that Ethereum isn't the only coin being mined, either.

Do people actually think mining is going to disappear just because ETH goes PoS?

And I'm not saying that prices won't drop a bit (they might not, either), but if/when crypto sees higher levels again, don't be surprised when GPU prices go back up if they do drop.
You'll be lucky to make 20p a day when ETH goes PoS: the enormous hash power that will switch over to other coins will cause the difficulty to rocket, and couple that with sky-rocketing energy prices and it just won't be worth buying cards. Those with cards would be better off selling up and investing in ETH staking.
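Just to put a rough number on that point, here is a back-of-envelope profitability sketch; every figure in it (hash rates, block reward, coin price, power draw, electricity cost) is a made-up placeholder rather than a real post-merge number:

```python
# Rough daily mining profitability sketch; every figure here is a hypothetical placeholder.
def daily_profit_gbp(my_hashrate, network_hashrate, block_reward, blocks_per_day,
                     coin_price_gbp, power_watts, electricity_gbp_per_kwh):
    """Expected daily profit in GBP for a small proof-of-work miner."""
    my_share = my_hashrate / network_hashrate                 # fraction of total network hash power
    coins_per_day = my_share * block_reward * blocks_per_day  # expected coins earned per day
    revenue = coins_per_day * coin_price_gbp
    power_cost = (power_watts / 1000) * 24 * electricity_gbp_per_kwh
    return revenue - power_cost

# One GPU after a flood of ex-ETH hash power has pushed the network difficulty up:
print(daily_profit_gbp(my_hashrate=60e6,          # 60 MH/s from one card
                       network_hashrate=300e12,   # 300 TH/s network (assumed influx)
                       block_reward=2.0,
                       blocks_per_day=6500,
                       coin_price_gbp=1.50,
                       power_watts=130,
                       electricity_gbp_per_kwh=0.30))   # prints a negative number
```

With assumptions like these the power bill alone outweighs the coin revenue, which is the "not worth buying cards" argument in numbers.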
 
Associate
Joined
11 Jun 2021
Posts
1,024
Location
Earth
Yes, it's not an upgrade for someone like me though. According to TPU the 3060 Ti is 10% faster than my card, which ties in with what we see here, and the 3070 is 15% faster again, so 25 to 30% overall. That's not completely horrible for a price-to-price (MSRP) upgrade, but it's also not enough, and it has the same memory architecture. I don't trust a 256-bit 8GB memory system for 3 years of game advancement; I feel like it's under-performing a bit and has some pretty aggressive obsolescence built in.

Agreed, I look for at least a 50% performance increase and a bump in VRAM when it's upgrade time. 8GB at 1440p could certainly be pushing it in the next couple of years for AAA titles.
 
Associate
Joined
29 Aug 2004
Posts
2,381
Location
Alpha centauri
I see what you mean by the curve now. I just go +100 on the GPU and +400 on the memory; I have had it running +150 / +600 but had memory artefacts, so I just left it stock. It doesn't seem worth it to me, though I have never tried the curve tuning, I might have to :)

Good score, 11% up on me :)

It's a strange one with the RTX 3000 cards that use plain GDDR6 (non-X): some will do +1000 on the memory without artefacts, others less than +600. But yes, +400 is all you need; any more will not make any difference.
 
Soldato
OP
Joined
15 Jan 2006
Posts
7,768
Location
Derbyshire
I note that these were designed to go alongside mobile Ryzen 6000 series APUs. I wonder what the chances are that they could be run in some sort of CrossFire-like configuration, where the RDNA 2 based graphics of the APU could be used alongside this GPU to improve rendering performance.
 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
It's a strange one with the RTX 3000 cards that use plain GDDR6 (non-X): some will do +1000 on the memory without artefacts, others less than +600. But yes, +400 is all you need; any more will not make any difference.

Ampere is running ECC memory (error-correcting code), so within limits you can run it above the point where it's actually stable.

I note that these were designed to go alongside mobile Ryzen 6000 series APUs. I wonder what the chances are that they could be run in some sort of CrossFire-like configuration, where the RDNA 2 based graphics of the APU could be used alongside this GPU to improve rendering performance.

With all the heterogeneous compute stuff AMD are doing these days it wouldn't surprise me if AMD came up with something like this.

AMD will be the first to crack multi-chip gaming GPUs; if they can do that they might have the "glue" to make a dGPU and an iGPU work as one...
 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
@CAT-THE-FIFTH

Just for a minute imagine that it is true that the cost of making GPUs has gone up significantly, imagine for a minute that the inflation we all know is here is real, and imagine there is some truth to what AMD say about it now being difficult to make $200 GPUs.

With that in mind AMD did try. The result is compromised, and the compromise is there to get the costs down to where it's doable. While that is worthy of criticism, it is a perfectly good GPU if you treat it like a mid image-quality settings GPU, and it's cheaper brand new with its 3-year warranty than an equivalent 3+ year old used card on FleaBay.

It is reasonable for reviewers to be critical of it for its faults, as PC-World did, but is it reasonable to write hyperbolic headlines, put the "don't buy it, it's the worst card ever" conclusion at the beginning of the video before the review, and then deliberately treat it like an RTX 3080 in that review to show it in its worst possible light?

Do you think AMD will look at that and say "oh OK, we will reduce the price to $140 then"? These reviewers who do this don't think that. If AMD are going to act on this at all it will be to stop making $200 GPUs altogether, because they don't want that kind of press, and if AMD aren't going to do it, who will? Nvidia? What they are likely to do is label something at $200 to fool people like Steve Walton and then sell 97% directly to miners at $400.

All I'm saying is it's a crap situation, I hate it, but it is real, and this sort of crap https://forums.overclockers.co.uk/posts/35385372/ is no help to anyone other than Steve Walton and his view count, and that's the only reason for it.
 
Soldato
Joined
14 Aug 2009
Posts
2,764
@CAT-THE-FIFTH

Just for a minute imagine that it is true that the cost of making GPUs has gone up significantly, imagine for a minute that the inflation we all know is here is real, and imagine there is some truth to what AMD say about it now being difficult to make $200 GPUs.

They're a premium company, of course it is! :rolleyes:

They'll say everything that needs to be said in order to bump the prices even higher for next gens.
 
Permabanned
Joined
7 Oct 2018
Posts
2,170
Location
Behind Pluto
@Alan McFall Last one, +125 / +500. I beat your stock score :) I ran the fans at 100%... yes, that's cheating :D


+100 core / +1001 memory. :D

Considering I am running this card balls to the wall in an M-ITX case, I am OK with 62°C max at 100% fan speed, but the Inno3D was never a top-tier card.

Your card is running strong, man! Nice result for a 2070 Super. :cool:

 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
+100 core / +1001 memory. :D

Considering I am running this card balls to the wall in an M-ITX case, I am OK with 62°C max at 100% fan speed, but the Inno3D was never a top-tier card.

Your card is running strong, man! Nice result for a 2070 Super. :cool:


Thanks, +23% over mine.

I'm pretty happy with the Gaming X: massive cooler and fans. During normal operation it's whisper quiet at <70°C; the fans come on but they just idle. Even with the fans at 100% it's not that loud, and it runs at about 50°C like that.

I count myself lucky all things considered: I paid £480 for it brand new, and used ones now go for £50 on top of that.

It's this one: https://www.techpowerup.com/gpu-specs/msi-rtx-2070-super-gaming-x.b7142
 
Permabanned
Joined
7 Oct 2018
Posts
2,170
Location
Behind Pluto
Thanks, +23% over mine.

I'm pretty happy with the Gaming X: massive cooler and fans. During normal operation it's whisper quiet at <70°C; the fans come on but they just idle. Even with the fans at 100% it's not that loud, and it runs at about 50°C like that.

I count myself lucky all things considered: I paid £480 for it brand new, and used ones now go for £50 on top of that.

It's this one: https://www.techpowerup.com/gpu-specs/msi-rtx-2070-super-gaming-x.b7142

Lovely card mate.

This is my daily hair dryer.

https://www.techpowerup.com/gpu-specs/inno3d-ichill-rtx-3070-x4-lhr.b8978
 
Soldato
Joined
15 Oct 2019
Posts
11,689
Location
Uk
@CAT-THE-FIFTH

Just for a minute imagine that it is true that the cost of making GPUs has gone up significantly, imagine for a minute that the inflation we all know is here is real, and imagine there is some truth to what AMD say about it now being difficult to make $200 GPUs.

With that in mind AMD did try. The result is compromised, and the compromise is there to get the costs down to where it's doable. While that is worthy of criticism, it is a perfectly good GPU if you treat it like a mid image-quality settings GPU, and it's cheaper brand new with its 3-year warranty than an equivalent 3+ year old used card on FleaBay.

It is reasonable for reviewers to be critical of it for its faults, as PC-World did, but is it reasonable to write hyperbolic headlines, put the "don't buy it, it's the worst card ever" conclusion at the beginning of the video before the review, and then deliberately treat it like an RTX 3080 in that review to show it in its worst possible light?

Do you think AMD will look at that and say "oh OK, we will reduce the price to $140 then"? These reviewers who do this don't think that. If AMD are going to act on this at all it will be to stop making $200 GPUs altogether, because they don't want that kind of press, and if AMD aren't going to do it, who will? Nvidia? What they are likely to do is label something at $200 to fool people like Steve Walton and then sell 97% directly to miners at $400.

All I'm saying is it's a crap situation, I hate it, but it is real, and this sort of crap https://forums.overclockers.co.uk/posts/35385372/ is no help to anyone other than Steve Walton and his view count, and that's the only reason for it.
You have to remember Sony turn a profit on the PS5 disc version, which uses a die that's 3x the size of the 6500 XT's and which they have to buy from AMD, the middleman who no doubt takes a cut. They also use 16GB of GDDR6, a larger and more complex PCB and cooling solution, more IO with WiFi etc., an 825GB Gen 4 NVMe SSD, a Blu-ray drive and an integrated power supply, all housed in a case, and it also comes with a top-notch controller. Then there is the box size and weight, which adds to shipping costs, so I don't buy that a GPU of this calibre can cost anywhere near $200 to produce, and I would guess the margins are close to 100%.
 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
You have to remember Sony turn a profit on the PS5 disc version, which uses a die that's 3x the size of the 6500 XT's and which they have to buy from AMD, the middleman who no doubt takes a cut. They also use 16GB of GDDR6, a larger and more complex PCB and cooling solution, more IO with WiFi etc., an 825GB Gen 4 NVMe SSD, a Blu-ray drive and an integrated power supply, all housed in a case, and it also comes with a top-notch controller. Then there is the box size and weight, which adds to shipping costs, so I don't buy that a GPU of this calibre can cost anywhere near $200 to produce, and I would guess the margins are close to 100%.

I can't find prices for 6nm as it's relatively new, but the price of a 300mm 5nm wafer was $17,000 in 2020 and it's gone up at least 10% since then, so about $19,000.

That's 5nm; I think 6nm would be at least $15,000, so let's go with that.

At 107mm² per die on a 300mm $15,000 wafer, that works out to about $27 per die.

On top of that you have 2x 2GB GDDR6 memory modules. I wish I could find how much they cost now in bulk, but even in 2019, 2GB GDDR6 memory ICs were $15 to $20 each, and they are more than that now.

So before you even get to the PCB, the high- and low-side MOSFETs ($2 each), a bunch of chokes ($1 each), caps ($1 for a few), a voltage controller ($5 to $10), fans, heatsink, shroud, assembly costs, shipping costs... before all of that you're already at $70.

Then you have to sell it into a supply chain, and they take a slice before it goes to the retailers, who take their slice before it's at your door.
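For what it's worth, here is that arithmetic as a quick sketch in code; the wafer price, yield and component prices are just the rough figures assumed above, not confirmed numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer using the standard edge-loss formula."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer_cost = 15_000                           # assumed 6nm wafer price, USD
die_area = 107                                # Navi 24 die size, mm^2
good_dies = dies_per_wafer(die_area) * 0.9    # assume ~90% yield on a small die
cost_per_die = wafer_cost / good_dies

memory = 2 * 20                               # 2x 2GB GDDR6 modules at ~$20 each (assumed)
vrm_and_passives = 2 * 2 + 3 * 1 + 1 + 8      # MOSFETs, chokes, caps, voltage controller

print(f"~${cost_per_die:.0f} per die, "
      f"~${cost_per_die + memory:.0f} with memory, "
      f"~${cost_per_die + memory + vrm_and_passives:.0f} with VRM parts, "
      "before PCB, cooler, assembly, shipping and channel margins")
```

With those assumptions it lands at roughly $28 per die and around $70 once the memory is added, which is where the $70 figure above comes from.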
 
Soldato
Joined
8 Nov 2006
Posts
22,979
Location
London
@CAT-THE-FIFTH

Just for a minute imagine that it is true that the cost of making GPUs has gone up significantly, imagine for a minute that the inflation we all know is here is real, and imagine there is some truth to what AMD say about it now being difficult to make $200 GPUs.

With that in mind AMD did try. The result is compromised, and the compromise is there to get the costs down to where it's doable. While that is worthy of criticism, it is a perfectly good GPU if you treat it like a mid image-quality settings GPU, and it's cheaper brand new with its 3-year warranty than an equivalent 3+ year old used card on FleaBay.

It is reasonable for reviewers to be critical of it for its faults, as PC-World did, but is it reasonable to write hyperbolic headlines, put the "don't buy it, it's the worst card ever" conclusion at the beginning of the video before the review, and then deliberately treat it like an RTX 3080 in that review to show it in its worst possible light?

Do you think AMD will look at that and say "oh OK, we will reduce the price to $140 then"? These reviewers who do this don't think that. If AMD are going to act on this at all it will be to stop making $200 GPUs altogether, because they don't want that kind of press, and if AMD aren't going to do it, who will? Nvidia? What they are likely to do is label something at $200 to fool people like Steve Walton and then sell 97% directly to miners at $400.

All I'm saying is it's a crap situation, I hate it, but it is real, and this sort of crap https://forums.overclockers.co.uk/posts/35385372/ is no help to anyone other than Steve Walton and his view count, and that's the only reason for it.

Wow, what an AMD defence.

Your argument seems to be on the basis of "let's assume AMD are correct, and let's not do anything to upset our overlords by giving their cards a bad review".

Never in a million years would you write such a post in defence of Nvidia or Intel. In fact you decide to hypothesise some random BS about Nvidia in the very same post to make them sound bad, when this thread isn't even about anything they have done.

There have been very few cards released in the last 15 years (as far back as my memory as an enthusiast goes) that have been worse than the 6500 XT. It is worth mentioning.
 