• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

Also, why do people keep posting German-language sites as more authoritative sources?
Germans are too serious to be wrong. I am married to one and live 3 days per week in Munich with her.

If Germans made a mistake they would be immediately ostracised by all their family, friends and neighbours for like a month :p
 
I'm looking at TPU's 4K results, as has been my habit for many years. I really don't know why you feel the need to correct me in this way and big the 4090 up more. All sites show some variation from each other, and I've said the 4090 is a good card; I have no issue with it.
TPU tested the 4090 with a 5800X, though, while the 7900 XTX was tested with a 13900K, so it's not really an apples-to-apples comparison, which is why it's better to take an average across many reviews.
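For what it's worth, the usual way meta-reviews aggregate results like that is a geometric mean of each card's relative performance per review, so a single outlier site can't skew the figure. A minimal sketch, using made-up placeholder ratios rather than real benchmark data:

```python
from math import prod

# Hypothetical relative-performance ratios (7900 XTX vs 4090) from several
# imaginary reviews; placeholder values, not real benchmark data.
ratios = [0.82, 0.78, 0.85, 0.80]

# Geometric mean is the standard way to aggregate performance ratios,
# because it treats "10% faster" and "10% slower" symmetrically.
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Aggregate: 7900 XTX at {geomean:.0%} of the 4090")
```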
 
It's important to acknowledge that AMD plays a crucial role in keeping healthy competition in the market. While it may be difficult for me to admit, we even need Intel, particularly for graphics cards. Although I was tempted to post a thread about whether 16GB of VRAM is sufficient for the next four years, I sense that such a post could lead to a ban.
 
It's important to acknowledge that AMD plays a crucial role in keeping healthy competition in the market. While it may be difficult for me to admit, we even need Intel, particularly for graphics cards. Although I was tempted to post a thread about whether 16GB of VRAM is sufficient for the next four years, I sense that such a post could lead to a ban.

It will be until the narrative no longer suits ;)

But alas, 16GB is enough, yet as predicted, GPUs that came out 2-3 years ago are lacking the grunt now, so settings get reduced across the board and/or higher upscaling presets are required.
 
Don't know; I think 16GB and performance equal to the 6900 XT for $649 for the 7800 XT.
AMD bought up massive amounts of 5nm capacity, and Zen 4 isn't selling as well as Zen 3 did, so capacity should be good; they are also buying up Nvidia's cancelled 4nm allocation.
At that price I could go for either a 6950 XT or a 7800 XT. I'd probably go for the newer card.
 
TPU tested the 4090 with a 5800X, though, while the 7900 XTX was tested with a 13900K, so it's not really an apples-to-apples comparison, which is why it's better to take an average across many reviews.
If that’s the case then it’s a very bad move on their part as even at native 4K my 5800X with tuned memory struggles to keep up with my 4090 in many games.
 
It's important to acknowledge that AMD plays a crucial role in keeping healthy competition in the market. While it may be difficult for me to admit, we even need Intel, particularly for graphics cards. Although I was tempted to post a thread about whether 16GB of VRAM is sufficient for the next four years, I sense that such a post could lead to a ban.
In four years even 24GB won't be enough if lazy developers get their way with not wanting to spend any time on optimization.
 
In four years even 24GB won't be enough if lazy developers get their way with not wanting to spend any time on optimization.

Get out of here! Didn't you get the memo? People would rather spend their hard-earned money on unnecessary hardware so they can avoid/brute-force their way through issues/poor optimisation.

This sort of comment comes from an obvious place of ignorance.

Witcher 3 - running much better with the latest patches, especially on lower-end hardware
Forspoken - running much better with the latest patches, especially on lower-end hardware
TLOU - running much better with the latest patches, especially on lower-end hardware
Hogwarts - running much better with the latest patches, especially on lower-end hardware

What do you call this?
 
Now pay attention.... :D

Why do textures in old games look great without consuming too much VRAM? If they want 1000 different types of rock textures in their games that's great, but they also need an option with 100 types for lower-end hardware, because most people aren't going to notice the difference. The reality is they just don't want to spend more time on optimization; they want to churn out crappy console ports as fast as possible with minimal effort, and they want the end user to pay for it.
 
Why do textures in old games look great without consuming too much VRAM? If they want 1000 different types of rock textures in their games that's great, but they also need an option with 100 types for lower-end hardware, because most people aren't going to notice the difference. The reality is they just don't want to spend more time on optimization; they want to churn out crappy console ports as fast as possible with minimal effort, and they want the end user to pay for it.

The consoles can use 12GB of VRAM, which is 50% more than 8. :D

When you say old, how far back are we talking? There was a time when textures looked awful, but I guess you're not talking about that.

For most games texture resolution hasn't changed much; BF3 may have used 1024 textures whereas BF5 probably uses 2048. The real difference is that BF3 had 2, maybe 3, texture layers: an albedo or colour map, a bump map and maybe a dirt or detail layer.
These days games are expected to have more advanced shading, lighting and detail, so they have greyscale maps, specular maps, metallic maps and various shading detail maps, and depending on what the asset is (a rock, for example) it could also have terrain blending layers.
So there is vastly more data and certainly a lot more weight on VRAM resources. The simple fact is that these days, if you're trying to fit the same amount of texture resolution for the same assets, you use more VRAM than you used to; you may even have to reduce the texture resolution, and then things start to look muddy.
Ideally, for the best look, you would run all of your texture data at 4K resolution, but that would require a leap forward in VRAM capacity.
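To put rough numbers on that, here's a back-of-the-envelope sketch. The assumptions are mine for illustration: uncompressed RGBA8 at 4 bytes per texel, every map stored at the same resolution, and a full mip chain adding roughly a third. Real engines use block compression (BC1-BC7), which cuts these figures by 4-8x, but the scaling with resolution and map count is the same.

```python
# Rough VRAM estimate for one material's texture set.
# Assumptions (for illustration only): uncompressed RGBA8 at 4 bytes per
# texel, every map at the same resolution, full mip chain adding ~1/3.

def material_vram_mb(resolution: int, num_maps: int, bytes_per_texel: int = 4) -> float:
    base = resolution * resolution * bytes_per_texel * num_maps
    with_mips = base * 4 / 3  # mip chain adds roughly one third on top
    return with_mips / (1024 ** 2)

# BF3-era material: 1024x1024 with ~3 maps (albedo/colour, bump, dirt/detail)
print(f"1K, 3 maps: {material_vram_mb(1024, 3):6.1f} MB")
# Modern material: 2048x2048 with ~6 maps (colour, normal, specular,
# metallic, greyscale/detail masks, blend layer)
print(f"2K, 6 maps: {material_vram_mb(2048, 6):6.1f} MB")
# Everything at 4K texture resolution, as suggested above for the best look
print(f"4K, 6 maps: {material_vram_mb(4096, 6):6.1f} MB")
```

Even with BC7 at roughly 1 byte per texel, the 4K set would still be in the region of 128MB for a single material, which is why texture resolution ends up tied so closely to VRAM capacity.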
 
Why do textures in old games look great without consuming too much VRAM? If they want 1000 different types of rock textures in their games that's great, but they also need an option with 100 types for lower-end hardware, because most people aren't going to notice the difference. The reality is they just don't want to spend more time on optimization; they want to churn out crappy console ports as fast as possible with minimal effort, and they want the end user to pay for it.

This is the problem....

How many GPUs have more than 8, 10 or 12GB of VRAM?

4090/4080
6950 XT/6900 XT/6800 XT
3090/3090 Ti

And that is an incredibly small proportion of PC users.

What we really need is DirectStorage properly implemented; as has been shown, it is holding back the PC compared to consoles now, and in theory it would minimise or at least reduce the issues that are present.
 
The Master gave me a right ticking off again about it and told me not to do it again or he'd start touching me in the bad places like before...

:D

 
This is the problem....

How many GPUs have more than 8, 10 or 12GB of VRAM?

4090/4080
6950 XT/6900 XT/6800 XT
3090/3090 Ti

And that is an incredibly small proportion of PC users.

What we really need is DirectStorage properly implemented; as has been shown, it is holding back the PC compared to consoles now, and in theory it would minimise or at least reduce the issues that are present.

7900 XTX and 7900 XT.

That makes 9 cards across the current gen and the last gen. There are also more to come this gen for sure, with a 4090 Ti and probably some Super updates from Nvidia, plus AMD's refresh cards, and the 7800 cards will for sure have more than 12GB.

So from what we know, the 4090 Ti, 7800 XTX, 7800 XT and 7800 are yet to be added to that list too.

So that's a total of 13 (unlucky for some) cards this year for sure with over 12GB, or 12 for sure without the 7800 XTX.
 