• Competitor rules

    Please remember that any mention of competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form, and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

7900XTX and 7900XT.

Makes that 9 cards in the current gen and last gen. Also for sure more to come this gen, with the 4090 Ti, probably some Super updates from Nvidia, and AMD's refresh cards; the 7800 cards will for sure have more than 12GB.

So from what we know, the 4090 Ti, 7800XTX, 7800XT and 7800 are yet to be added to that list too.

So a total of 13 (unlucky for some) cards this year for sure with over 12GB, or 12 for sure without the 7800XTX.

Ok, so still basically **** all of the market has GPUs with more than 12GB VRAM then..... :p

Most likely the next gen of cards is when a good chunk will upgrade, so we'll have a ton of VRAM, but the problem we'll still have is **** ports, i.e. the main benefit/purpose of having all that VRAM is just to avoid issues/poor optimisation rather than providing a real worthwhile benefit over consoles.
 
What percentage of gamers own those GPUs, 10%?
 

But there will soon be 12+ cards available, and 9 currently, with over 12GB. So I'm guessing people will soon be moving to one if they game at 4K or use high texture settings. I really don't know the percentage of gamers that own cards with large VRAM amounts, as I mostly worry about work apps and VRAM; gaming I rarely consider anymore, as I normally have a GPU with more than enough VRAM for work apps.
 
Honestly I don't know why AMD don't just throw in the towel; I think if it wasn't for the consoles they would, ironically, but the fact that they are still in this game despite everything perhaps indicates that on some level they do care.

My personal and uninformed opinion would say they like to have a stalking horse in the race because they like to keep in touch with the cutting edge of GPU technology. Once you drop out of the race it's very hard to get back into it; it's taken years to become competitive again, or at least within sight of the leader. If they concentrate purely on CPUs and consoles they risk getting left behind and becoming stagnant, like Intel did with CPUs. The discrete sales of dGPUs probably aren't important; it's keeping an iron in the fire, and the trickle-down, that matters.

Personally though, part of me wishes they would quit simply to see the amount of whinging and bellyaching when Nvidia inevitably raises prices across the board as a consequence. It'd be glorious.

As to what on earth Jim is on about in that video, gawd only knows. He himself, in one of his own earlier videos on how the GPU war was won by Nvidia, spells out exactly why they're in the position they're in now: even back when ATI were ahead on performance and price, Nvidia still outsold them 10:1, or whatever the exact ratio was. Their bank balance increased exactly as ATI's was decreasing, and ATI simply didn't have the R&D budget to compete again, so they threw in the towel and sold up. So quite what the point of that unfocused rant was, I have no idea. I do wonder about the state of his mental health honestly, but that's for another thread.
 
You make a good point as to the reasons for sticking with it.

On ATI: many of an older generation, myself included, look back on ATI and think of that as the glory days, as good as it was post-3DFX anyway...

The truth is some of those cards, like the famed HD 4870, were already AMD GPUs under the ATI branding; earlier ones came at R&D costs that ATI never recovered. The fact of the matter is, if you spend more on R&D than you're bringing in as net revenue, eventually you're going to run out of money, and that's exactly what happened. ATI was flat broke by the time it came round to the HD 4870; AMD developed that card. ATI's last gasp, the Radeon X1950 XTX, was not a great card.

There were once quite a few GPU vendors; Nvidia dispensed with all of them, at least they did if you mourn the passing of ATI. Quite an impressive feat, but Nvidia didn't do that on their own. Aside from 3DFX ignoring the rise of DirectX, people falling for an 'Apple' marketing strategy helped too, and that's even more true now than it ever was: have to have those mythical drivers and DLSS, anything else is a poor imitation and I'll pay the price to get the genuine thing.

Well, keep paying then...
 
Thing is, fat margins attract competition.
That's why Intel smelled easy cash; if prices stay this high, there is at least one other dormant player that might wake up.
Remember when Environmental Bump Mapping was the ray tracing of its era?
 

Intel tried to flog a card slower than a $300 card for $450.

They were sniffing something alright....
 
RTX ON, 20th century style: https://www.anandtech.com/show/298/5
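For the youngsters: EMBM's whole trick was reading a per-pixel (du, dv) offset from a bump texture and using it to perturb the lookup into an environment map, faking bumpy reflections long before per-pixel lighting was common. A toy Python sketch of just the data flow (random stand-in textures, not any real API):

```python
import numpy as np

# Environment bump mapping (EMBM), roughly: a signed (du, dv) offset from a
# bump texture perturbs where we sample the environment map, so a flat
# surface picks up wobbly, bump-like reflections. Both textures here are
# random stand-ins purely to show the data flow.
env_map = np.random.rand(256, 256, 3)                   # RGB environment texture
bump_duv = (np.random.rand(128, 128, 2) - 0.5) * 0.1    # signed (du, dv) per texel

def embm_sample(u, v, strength=1.0):
    """Sample the environment map at (u, v) after bump perturbation."""
    bh, bw, _ = bump_duv.shape
    du, dv = bump_duv[int(v * (bh - 1)), int(u * (bw - 1))] * strength
    eh, ew, _ = env_map.shape
    pu, pv = (u + du) % 1.0, (v + dv) % 1.0  # wrap back into [0, 1)
    return env_map[int(pv * (eh - 1)), int(pu * (ew - 1))]

print(embm_sample(0.3, 0.7))
```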

Every time price is bumped by the major players, they are unconsciously invoking the video card zombie army.
First Intel, then Innosilicon; who will be next?
S3/VIA?
Matrox?
PowerVR?

Luckily for AMD, they do have an extremely scalable architecture which would allow them to quickly plug any holes in the lower mid-range market; however, burn enough goodwill and people will turn to someone else.
Even the cult of Steve is faltering in sales this year, so it's not impossible that even leather jackets will soon go out of fashion...
 
Well, that, and they set their price from their own costs and expected margins, as if consumers care that Intel made a part which costs too much to make.
Fat margins attracting competition is exactly why companies are so fond of patents!

Creating a GPU without stepping on patents must be really hard, as some of them are so broad.
The problem with the way they did it, and their continued MO in the decades since, is that Nvidia play dirty. Really dirty. Some say all corporations are the same, but while their mission statement is "make as much money for the shareholders", there are many ways to do so without having to descend to Nvidia's level.

(Shareholder companies, with their obsession with short-term profits, are the one thing which might kill Western capitalism. There is a reason some East Asian companies are run differently. Even a lot of Mittelstand firms in the German-speaking world aren't ordinary shareholder companies and think a bit more long-term.)
 
I named a few dormant holders of significant IP in the area... And that is without considering players that care little about patents, especially if the current geopolitical trend doesn't subside.
There are also unserved niches that may or may not be interesting (think hardware voxel rendering!), especially as we're deep into diminishing returns on improving traditional polygons and there will be a need for something new after ray tracing.
 

That's an interesting read, thanks.

Bump maps are now just one of many layers in the texturing data stack. I had no idea how the technique came about or where it came from; it was before my time, and I'd never really thought about it.

It's not all that different to a much newer technique, Parallax Occlusion Mapping.
While CIG didn't invent it, like many things they are pioneering a more developed use case for it.
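For anyone curious what POM actually does, here's a rough CPU-side sketch of the core idea (toy heightfield and made-up parameters; the real thing runs per-pixel in a shader, and sign conventions vary): march the view ray through a heightfield in small steps and shade with the UV where the ray first dips below the surface.

```python
import numpy as np

def parallax_occlusion_uv(height_at, uv, view_dir_ts, scale=0.05, steps=32):
    """Return the UV where a view ray first dips below the heightfield.

    height_at(u, v) -> height in [0, 1], where 1.0 is the surface top;
    view_dir_ts is the view ray in tangent space with z < 0 (into the surface).
    """
    view = np.asarray(view_dir_ts, dtype=float)
    view /= np.linalg.norm(view)
    # Lateral UV shift per unit of depth travelled into the surface.
    shift_per_depth = view[:2] / -view[2] * scale
    layer = 1.0 / steps
    cur_uv = np.array(uv, dtype=float)
    ray_h = 1.0  # start at the surface top
    for _ in range(steps):
        if height_at(*cur_uv) >= ray_h:
            return cur_uv  # the ray has entered the surface here
        cur_uv += shift_per_depth * layer
        ray_h -= layer
    return cur_uv

# Toy heightfield: a single bump in the middle of the tile.
bump = lambda u, v: float(np.exp(-((u - 0.5) ** 2 + (v - 0.5) ** 2) / 0.02))
print(parallax_occlusion_uv(bump, (0.40, 0.45), (0.5, 0.3, -0.8)))
```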

Video about it here from Digital Foundry, along with other things, for example decoupling particle spawners from screen space, etc...

 

It was just an example of how certain features in the past were a big FPS hit, so what we're seeing now is nothing that new.
RT will probably become mainstream in 3 generations; after that, something else will come.
 

Anything and everything has a performance penalty.

My point is you can use some of these techniques as a way to increase fidelity while at the same time reducing the performance cost; the trade-off is high skill and workload. That video goes into some of it.
 
Optimization is the key, of course; sadly, very few devs can do it properly nowadays...
 
I just had a look at the GPU best sellers: the 7900XT is #16.

However, the highest-ranked 4000 series card is the 4090 at.............. #48 :cry: and the 4080 is at #66 :cry:
:eek::D:cry:

Yet the 4090 gains more market share every month than the 7900XT, or any RX 7000 GPU for that matter.
 
And yet that doesn't explain how Plague Tale uses 4.5 to 5.5GB of VRAM at 4K ultra while looking way, way better than TLOU, Forspoken, Godfall and the rest of the AMD-sponsored armada of games that use three times as much VRAM. Really makes you wonder, doesn't it?

A Plague Tale reuses a lot of assets, for one thing, which is very common in indie games. It helps when you design one wall and then copy-paste it 100 times through the game: it saves so much space, as you only need to store one asset. So yeah, go play Plague Tale, pay attention to the world and you'll see the same trees, same walls, same foliage, same windmills, same houses, same torches, and the list goes on; you see the same assets all over the world.
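To put rough numbers on it, a toy sketch (plain Python, not any engine's actual API) of why that keeps VRAM low: the heavy mesh data is stored once, and every placement is just a reference plus a transform.

```python
# Toy illustration of asset reuse/instancing: one wall mesh lives in memory
# once, and each of the 100 placements is only a reference plus a transform,
# so the memory cost is ~1 mesh, not 100.
class WallMesh:
    def __init__(self, vertex_count):
        # Stand-in for the vertex/texture data that would occupy VRAM.
        self.vertices = [(0.0, 0.0, 0.0)] * vertex_count

class Instance:
    def __init__(self, mesh, position):
        self.mesh = mesh          # shared reference, not a copy
        self.position = position  # per-instance data: just a placement

wall = WallMesh(vertex_count=10_000)
scene = [Instance(wall, (x * 5.0, 0.0, 0.0)) for x in range(100)]

# Every instance points at the same mesh object, so the heavy data exists once.
assert all(inst.mesh is wall for inst in scene)
print(f"meshes stored: 1, instances: {len(scene)}")
```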
 
I get all that, but does it matter? At the end of the day it looks way, way better than e.g. Forspoken, while using less than half the VRAM. Plague Tale is one of the few games that makes me go "wow, that looks amazing"; the first few chapters especially were jaw-dropping. So I don't see the point of having to pay more for cards with extra VRAM, which in turn will consume more power, just so I'll end up playing games that just look... worse. What's the benefit I get out of all of this?
 