
Do AMD provide any benefit to the retail GPU segment?

I get all that, but does it matter? At the end of the day it looks way, way better than e.g. Forspoken while using less than half the VRAM. Plague Tale is one of the few games that make me go "wow, that looks amazing"; the first few chapters especially were jaw-dropping. So I don't see the point of having to pay more for cards with extra VRAM, which in turn consume more power, just so I end up playing games that look... worse. What's the benefit I get out of all of this?

You get to come here and argue endlessly. Seems to be something you enjoy, surely a benefit?
 
Plague Tale reuses a lot of assets, for one thing, which is very common in indie games. It helps: when you design one wall and then copy-paste it 100 times through the game, it saves a lot of space, because you only need to store the one asset. So go play Plague Tale and pay attention to the world; you'll see the same trees, same walls, same foliage, same windmills, same houses, same torches, and the list goes on. You see the same assets all over the world.
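To put rough numbers on that, here's a sketch (the asset size, instance count and per-instance cost are made up for illustration, not measured from Plague Tale or any real engine):

```python
# Sketch: one shared asset reused via instancing vs. a unique copy
# stored per placement. All sizes here are made-up illustrations.
MB = 1024 * 1024

def unique_copies(placements, asset_bytes):
    # Every placement carries its own full copy of mesh + textures.
    return placements * asset_bytes

def instanced(placements, asset_bytes, transform_bytes=64):
    # One shared copy in VRAM plus a 4x4 float transform matrix
    # (64 bytes) per placement saying where each copy goes.
    return asset_bytes + placements * transform_bytes

wall = 8 * MB   # hypothetical wall asset (mesh + textures)
n = 100         # the "copy-paste it 100 times" case

print(f"unique copies: {unique_copies(n, wall) / MB:7.2f} MB")  # 800.00 MB
print(f"instanced:     {instanced(n, wall) / MB:7.2f} MB")      #   8.01 MB
```

Reuse is nearly free in VRAM terms, which is exactly how a small team can make a game this size look that good in 8GB.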
And do those VRAM-hungry games make each rock, each tree, each NPC, each item unique?
 
For what it's worth, I think the best-looking games right now are PS5 exclusives. We're really at that sweet spot in the console cycle. If you get a 4090 you can push for higher settings, but generally in games that look worse to begin with.
 
@Bencher You're right but also wrong. It's quite possible that with more work and better techniques some of these titles could reduce their VRAM footprint.

But as a studio you don't always have the money to hire the very best devs, or a huge number of them, to get what would in that case be a huge increase in workload done in good time. CIG have over 900 of some of the best devs in the industry and development on their game is still very slow. And yes, that vast game fits inside 8GB of VRAM, just... while still looking incredible.
 
Plague Tale was basically made by a small indie studio, though. The games that actually hog VRAM like crazy, on the other hand, were not, and they're asking full AAA prices.
 
You can't compare any game to any other game.

In the same way, you could stick the 65 HP engine from a Mk1 VW Golf into a Mk7 VW Golf; it just wouldn't move, and you wouldn't complain that the Mk7 is "unoptimised". It's an entirely different car with about 600KG of stuff on it that the Mk1 Golf doesn't have.

The attitude is: my card is 8GB, so I should expect 2017 levels of VRAM weight in a 2023 AAA game, and it's all your fault if you can't manage that, because I paid $500 for this card.
 
All I'm saying is, 8GB of VRAM is enough to play one of the best-looking games, if not the best-looking, at 4K Ultra in all its glory. Actually, even 6GB is enough. That much is a fact.

Of course 8GB is not enough to play every game; some games require 10GB even at 720p while looking much worse.

So if said card can run the best-looking game but can't run a bad-looking one, I have to conclude that the problem lies with the game, not with the card.
 
I'll put it another way...

2019: $500 for a high-performance 8GB card.

PC Gamers: That's not enough VRAM; that's going to cause problems soon.
Cultists: That's plenty. Remember, 4 cores were enough to last for 10 years.

A couple of years later... the PC Gamers' prediction comes true.
Cultists: No... it's because games are unoptimised.
 
Again, you are missing the point. I agree it has caused problems, but that's not because of a lack of VRAM. If that were the problem, you wouldn't have better-looking games using half or less of the VRAM of horrible-looking games. No matter how much VRAM Nvidia adds, there will always be AMD-sponsored games that ask for more than that. The only thing Nvidia can do to stop these AMD schemes is to add more VRAM than AMD's cards have. Otherwise they're screwed; it will keep on happening.
 
To paraphrase a good comment that I saw elsewhere: "some PC ports might be ****, and when they are, AMD has you covered but Nvidia doesn't".

The thing is, after a lot of searching I found some pricing for memory ICs.

It's $32 for 16GB of Micron GDDR6 at 21Gbps, and that's including sales tax, at retail pricing, if you buy 1250 units.

GPU vendors will get those for a chunk less than $32 per 16GB kit when they're buying not 1250 units but a million.

I can't post the site for forum rules reasons.

But there you are: $32 for a 16GB kit of some of the fastest GDDR6 memory available, proper high-end stuff at 21Gbps, at retail pricing.

It has nothing to do with cost, and that's why AMD are able to put 16GB on their $500 GPU; it just doesn't cost much to do.
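In per-gigabyte terms (taking that $32 per 16GB retail figure at face value; the volume discount below is just my guess, not data):

```python
# Back-of-the-envelope from the quoted figure: $32 retail for a 16GB
# GDDR6 kit. VOLUME_DISCOUNT is a hypothetical placeholder, not data.
KIT_PRICE_USD = 32.0
KIT_SIZE_GB = 16
VOLUME_DISCOUNT = 0.7   # guess: vendors pay ~70% of retail at volume

per_gb = KIT_PRICE_USD / KIT_SIZE_GB
print(f"retail price:        ${per_gb:.2f}/GB")          # $2.00/GB

extra_gb = 8  # e.g. turning an 8GB card into a 16GB one
print(f"8GB extra at retail: ${extra_gb * per_gb:.2f}")  # $16.00
print(f"8GB extra at volume: ${extra_gb * per_gb * VOLUME_DISCOUNT:.2f}")  # $11.20
```

Even at full retail, doubling 8GB to 16GB is a rounding error on a $500 card.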
 
For the money, the 4070 Ti ought to have 24GB on it.
 
16GB would be fine; the problem is that's not possible, because Nvidia are trying to keep costs down by using these small memory buses. It's 192-bit, 6× 32-bit channels, so the options are 6GB or 12GB. There are no 4GB memory ICs, so they would have to use twelve 2GB ICs.

The 3070, however, could easily have been 16GB, but they used 1GB ICs, not 2GB like AMD did. They did that for an entirely different reason: Nvidia's market dominance is such that they can get away with shenanigans, and as a bonus AMD get the blame for it.
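The arithmetic behind those options, if anyone wants to check it (one GDDR6 IC per 32-bit channel is the standard arrangement; 1GB and 2GB are the densities discussed above):

```python
# Each GDDR6 IC occupies its own 32-bit channel, so IC count is
# bus_width / 32 and capacity is IC count x per-IC density in GB.
def capacity_options(bus_width_bits, densities_gb=(1, 2)):
    ics = bus_width_bits // 32   # one IC per 32-bit channel
    return {f"{d}GB ICs": ics * d for d in densities_gb}

# 192-bit bus (4070 Ti): 6 ICs -> 6GB or 12GB. No 4GB IC exists,
# so 24GB isn't reachable without doubling up the ICs.
print("192-bit:", capacity_options(192))  # {'1GB ICs': 6, '2GB ICs': 12}

# 256-bit bus (3070): 8 ICs -> 8GB or 16GB. Nvidia shipped the 1GB option.
print("256-bit:", capacity_options(256))  # {'1GB ICs': 8, '2GB ICs': 16}
```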
 
So you don't think there is any correlation between a game being AMD-sponsored and requiring 28 petabytes of VRAM? None whatsoever? Take a look at this list.


6 out of 7 VRAM-hogging games are on that list, and the one that isn't runs just fine on 8GB cards unless you activate RT. So... I guess it's a coincidence.
 

You can add Unreal Engine to that list; there's been a lot of collaboration between Epic Games and AMD in the last 3 years.

I think AMD are sponsoring a lot of modern games.
 
Clearly you do, so let's consider exactly how it would have happened if Nvidia were sponsoring those titles.

If Nvidia were involved, would there have been instructions given to the studio not to have settings that exceed the memory limits of Nvidia cards? "4 cores are what we make, so limit the game to 4 cores" would be the exact equivalent.

Just drop the settings if you're bouncing off the VRAM limit of an Nvidia card, obviously. People using AMD cards have to do exactly the same when they hit settings that favour Nvidia hardware.
 
If Nvidia were involved, the game would be one of the best-looking, if not the best-looking, games of the generation while consuming half or a third of the VRAM of Forspoken, Godfall and TLOU. For example, Plague Tale :D
 
Sponsoring games to push your line-up's advantages is Nvidia 101. If they cheaped out on a cheap part, presumably to screw their own consumers, and AMD is pressing that advantage, then I say fair play to them; Nvidia's had it coming.
 