• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

Look at how much the R9 290/290X cost back in the day, or the Fury/Fury X/Nano and Vega 56/64 (expensive cards with new, expensive memory and tech!), and how much AMD is charging now. AMD is not doing me any favours when they have the opportunity to do so, so I won't do them any either. I'm certainly not going to pay more for a lot less, lol.

Edit: Humbug is fine with the 7900 XT being $749 at launch. In the past, the second-best card, the R9 290, was $399; Vega 56 was $399 as well. If he's fine with almost double the price, do you think AMD will ever come back to those prices? :) And it's the same train of thought I've seen from other posters: "if only it had been $200 cheaper", or thereabouts. So forget the doubling in price; $200 cheaper and it's fine. No, prices won't go down, because people are happy to pay more (sometimes for less).

£400 in 2013 is £520 today, and that doesn't reflect the far more significant rise in shipping, component, and manufacturing costs.

Add to that the size and weight of modern GPUs, the power increase, and the quality of the boards and coolers, and an equivalent price today is probably around £650.
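As a rough sketch, the inflation adjustment above works out like this. The ~2.7% flat average annual rate is my assumption for illustration; actual UK CPI varied year to year, and the helper name is just made up for the example:

```python
# Back-of-the-envelope inflation adjustment for the £400-in-2013 figure.
# A flat 2.7% average annual rate is an assumption for illustration only;
# real CPI figures differ year to year.
def adjust_for_inflation(price, annual_rate, years):
    """Compound a price forward by a flat annual inflation rate."""
    return price * (1 + annual_rate) ** years

adjusted = adjust_for_inflation(400, 0.027, 10)
print(f"£400 in 2013 is roughly £{adjusted:.0f} a decade later")
```

That lands close to the £520 figure quoted above, before any allowance for the bigger rises in shipping and component costs.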

No one wants to go from this handsome quality thing......

[image: Kr092CS.jpg]

back to this.

[image: mSaBgrA.jpg]
 
Yep, I was all against the price increases until my 4080 arrived, but I understood at least part of the rise when I picked it up. It weighs nearly twice what my 3080 did, and that was itself huge compared to all my previous cards. If TSMC raised chip prices by 30% and build costs had to rise to make these behemoths, then I could understand a 50% increase in the end product (i.e. £700 + £350 = £1050 for a 4080).


Sadly they are still even higher than that, which is where I think they are just being greedy, trying to force sell-through of the remaining 3000-series cards by making them seem better value until stock runs out.

Yeah, I mean, take nothing away from Nvidia's FE cooler; it's a fantastic piece of design and engineering, and in that sense better than AMD's more traditional reference coolers. That said, AMD's is also very good; I think it's the best of the traditional reference coolers so far. Nvidia's coolers are the foundation of the new age of coolers.

The thing is, this is what we want: that quality and design. Nvidia understood this far sooner than AMD, who only just caught on with RDNA2. Before then, AMD's thinking was to keep costs low by gluing the most basic crap to the card and letting AIBs do all the heavy lifting. The 290's cooler was trash, and AMD was like, "yes, we know, it makes the card cheap, that's what you want, isn't it?" No... no, not really.

As for TSMC, they have doubled prices in the last two years, shipping costs are up 30% in the same time, and an RX 7900 XT weighs twice as much as an R9 290.

Costs have gone up, things are more expensive; I think we have to be reasonable about that and also allow for our high quality expectations.
 
Last edited:
I hate these driver arguments. They are stupid.

Since I got my 165 Hz screen, my GPU can't decide if it wants to idle at 300 MHz or 1600 MHz and use a lot more power. If I wake the screen from sleep, it takes a while to come on and then waits a while longer before realising "I'm connected to a G-Sync screen, let me turn that off and on again for you..." right in the middle of writing an angry OCUK post...

If you're an Unreal Engine hobbyist, every driver update has you spending an hour testing to see what's broken now...

I could go on. I have friends with AMD GPUs, and they don't suffer any more frequent or more serious problems than I do...
 
My problem with AMD isn't the drivers; it's the lack of performance. The 7900 XT launched at a price that made it a worse option even for raster. It lacks on so many fronts (power consumption, RT, upscaling), and yet it even has worse raster per dollar than the 4070 Ti. Like... WHAT??

They are both $799; one is 10% faster in raster, the other 8% faster in RT; one has the absolute bare minimum of 12GB, the other 20GB.

I've said over and over again the 7900XT should have launched at $749.
 
That's the price the XTX and 4080 should have launched at; the 7900 XT and 4070 Ti should have been around $500-600.

I see your game here, making me be that guy :p

The 7900 XTX is 47% faster than the 6900 XT it replaced, for the same launch MSRP; the 4090 is 22% faster for 60% more money.

It's £150 cheaper than the 4080 while being 10% faster, and has 24GB vs a not-unreasonable 16GB.

It's perfectly OK.
 
As @CAT-THE-FIFTH pointed out Nvidia have $5 Billion worth of inventory collecting dust.

I don't know what it is with Nvidia, but despite that ^^^^ they still insist on selling last gen for above MSRP. They are losing sales, so much so that they have cut TSMC orders by more than a third.

They are incredibly pig-headed about making us like this new normal of very expensive cards. AMD have slashed last-gen prices to below MSRP now, in some cases by a huge chunk.
They have already bought up some of Nvidia's no-longer-needed capacity and have recently announced they will be buying up a lot more.
 

I can't wait for the 7800 XT to cost less than the 3080 (currently £750) and kill it stone dead, so Nvidia are forced to either write off $5 billion or fire-sale the 3000 series, at long last.
 
If they price the 7600 XT, 7700 XT, and 7800 XT just right, with plenty of VRAM, they could put a nice dent in Nvidia's sales/pricing. Do they have the volume, though? I'd sell my 3060 Ti for a 7800 XT if they price it right.

Don't know; I'm thinking 16GB and performance equal to the 6900 XT for $649 for the 7800 XT.
AMD bought up massive amounts of 5nm capacity, and Zen 4 isn't selling as well as Zen 3 did, so capacity should be good; they are also buying up Nvidia's cancelled 4nm.
 
They were misquoted on an original statement they made regarding future plans, so they responded with a statement of what their future plans actually are.

All but confirming they don't want to compete in that price segment. 11:30 in the video of the tweet :)

Ah ok thanks :)

I've heard it said before that internally AMD just don't want to go above $1000 for retail GPUs; it's too much of a risk for them, and they don't want their partners designing $1400 GPUs that then have to be reduced to $1000 because no one wants them.
 
Totally worth it for not having to use FSR alone.

I'm not going to complain about the price of the 4090; it's a halo, money-is-no-object product, and a good one.

My problem is with Nvidia trying to push people who don't have unlimited resources into much higher price bands, or you get nothing.
Yes, people may point at the 7900 XT, but at least AMD quietly admitted they got that wrong and corrected it; Nvidia just give you the finger.
 
With the 4090 you're getting 28% better raster and 72% better RT, based on a summary of multiple reviews, for 53% more money as a UK buyer, which is actually very good scaling for a halo card.
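Putting rough numbers on that claim: a toy perf-per-money calculation using the in-thread percentages (the `value_ratio` helper is purely illustrative, not from any review site):

```python
# Perf-per-unit-money relative to a baseline card.
# perf_gain and price_gain are fractional increases over the baseline,
# e.g. 0.28 means "28% faster" and 0.53 means "53% more expensive".
def value_ratio(perf_gain, price_gain):
    """Ratio > 1.0 means better perf-per-money than the baseline."""
    return (1 + perf_gain) / (1 + price_gain)

print(f"Raster value vs baseline: {value_ratio(0.28, 0.53):.2f}")
print(f"RT value vs baseline:     {value_ratio(0.72, 0.53):.2f}")
```

With those numbers, raster perf-per-pound drops below the baseline while RT perf-per-pound actually comes out ahead, which is unusually good scaling for a halo card.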


AMD haven't corrected the 7900 XT; all they have done is drop the price to match the terribly priced 4070 Ti. Both those cards should be around £600.

I'm looking at TPU's 4K results, as has been my habit for many years. I really don't know why you feel the need to correct me in this way and big the 4090 up more; all sites vary from each other, and I've said the 4090 is a good card. I have no issue with it.
 
Why do textures in old games look great without consuming too much VRAM? If developers want 1,000 different rock textures in their games, that's great, but they also need an option with 100 types for lower-end hardware, because most people aren't going to notice the difference. The reality is they just don't want to spend more time on optimisation; they want to churn out crappy console ports as fast as possible with minimal effort, and they want the end user to pay for it.

The consoles have use of 12GB of VRAM, which is 50% more than 8. :D

When you say old, how far back are we talking? There was a time when textures looked awful, but I guess you're not talking about that.

For most games texture resolution hasn't changed much; BF3 may have used 1024 textures where BF5 probably uses 2048. The real difference is BF3 had two, maybe three layers of textures: an albedo or colour map, a bump map, and maybe a dirt or detail layer.
These days games are expected to have more advanced shading and lighting and more detail, so they have grey-scale maps, specular maps, metallic maps, and various shading detail maps, and depending on the asset (a rock, for example) possibly terrain-blending layers too.
So there is vastly more data and certainly a lot more weight on VRAM resources. The simple fact is that these days, if you're trying to fit the same texture resolution for the same assets, you use more VRAM than you used to; you may even have to reduce texture resolution, and then things start to look muddy.
Ideally, for the best look, you would run all of your texture data at 4K resolution, but that would require a leap forward in VRAM capacity.
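A back-of-the-envelope sketch of why more layers weigh so heavily on VRAM. The layer names follow the post; the channel counts and uncompressed RGBA8/R8 byte sizes are my assumptions for illustration (real games use block-compressed formats, which shrink these numbers by roughly 4-8x):

```python
# Rough VRAM cost of one material's full texture set.
def texture_bytes(resolution, channels, mipmaps=True):
    """Bytes for one square texture; a full mip chain adds ~1/3 on top."""
    base = resolution * resolution * channels
    return int(base * 4 / 3) if mipmaps else base

def material_vram_mb(resolution):
    # channels per layer: 4 = RGBA, 1 = single grey-scale channel
    layers = {
        "albedo":   4,
        "normal":   4,  # bump/normal map
        "specular": 1,
        "metallic": 1,
        "detail":   1,
    }
    total = sum(texture_bytes(resolution, c) for c in layers.values())
    return total / (1024 * 1024)

for res in (1024, 2048, 4096):
    print(f"{res}x{res} material: {material_vram_mb(res):.0f} MB")
```

Each doubling of resolution quadruples the cost, which is why running every texture at 4K across a whole scene would need a big jump in VRAM capacity.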
 
My personal and uninformed opinion is that they like to have a stalking horse in the race because they want to keep in touch with the cutting edge of GPU technology. Once you drop out of the race it's very hard to get back in; it has taken years to become competitive again, or at least within sight of the leader. If they concentrate purely on CPUs and consoles, they risk getting left behind and becoming stagnant, like Intel did with CPUs. The discrete dGPU sales probably aren't important; it's keeping an iron in the fire, and the trickle-down, that matters.

Personally, though, part of me wishes they would quit, simply to see the amount of whinging and bellyaching when Nvidia inevitably raises prices across the board as a consequence. It'd be glorious.

As to what on earth Jim is on about in that video, gawd only knows. In one of his own earlier videos, on how the GPU war was won by Nvidia, he himself spells out exactly why they're in the position they're in now: even back when ATI were ahead on performance and price, Nvidia still outsold them 10:1, or whatever the exact ratio was. Nvidia's bank balance increased exactly as ATI's was decreasing, and ATI simply didn't have the R&D budget to compete again, so they threw in the towel and sold up. So quite what the point of that unfocused rant was, I have no idea. I do wonder about the state of his mental health, honestly, but that's for another thread.
You make a good point as to the reasons for sticking with it.

On ATI: many of an older generation, myself included, look back on ATI and think of those as the glory days; as good as it was post-3dfx, anyway...

The truth is some of those cards, like the famed HD 4870, were already AMD GPUs under ATI branding, and the earlier ones came at R&D costs that ATI never recovered. The fact of the matter is, if you spend more on R&D than you bring in as net revenue, eventually you're going to run out of money. That's exactly what happened: ATI was flat broke by the time the HD 4870 came round; AMD developed that card. ATI's last gasp, the Radeon X1950 XTX, was not a great card.

There were once quite a few GPU vendors, and Nvidia dispensed with all of them, at least if you mourn the passing of ATI; quite an impressive feat. But Nvidia didn't do that on their own: aside from 3dfx ignoring the rise of DirectX, people falling for an 'Apple' marketing strategy helped, and that's even truer now than it ever was. Have to have those mythical drivers and DLSS; anything else is a poor imitation, and I'll pay the price to get the genuine thing.

Well, keep paying then...
 
Thing is, fat margins attract competition.
That's why Intel smelled easy cash; if prices stay this high, there is at least one other dormant player that might wake up.
Remember when Environmental Bump Mapping was the ray tracing of its era?

Intel tried to flog a card slower than a $300 card for $450.

They were sniffing something, all right...
 