
Close this thread please, it's old and tired :)

8GB was and still is enough on my Vega 64. It's criminal for cards around twice as fast and much newer to be rocking the same amount. As for Hogwarts Legacy, I bought it for PS5 as I still can't bring myself to support greedy Nvidia/AMD. I grew up in the 80s playing an Atari 2600, so maybe that's why gameplay matters more to me than visuals. The difference between the PS5 and the 4090 ain't that big if you just try to enjoy the gameplay. Gameplay is where it's at for me, and my bank account still loves me for it.
 
Nvidia are blessing us with 8GB again in 2023 with the 4060s; an absolute joke as they will come in at £400+.

More like £500, mark my words.

Stop buying this junk. The moment people stop buying it and stop defending it is the moment this insanity will stop.

From the 6700 XT (a $479 GPU) and up, they all had 2GB memory ICs on them. What, AMD can but Nvidia can't?

The RTX 3060, with its 192-bit bus, had 12GB, more memory than everything up to and including the RTX 3080 (LOL). They put 2GB memory ICs on it because the alternative was the same 1GB memory ICs they put on all their other GPUs, which would have meant 6GB, and they were ridiculed for the 6GB RTX 2060 not having enough, so they didn't do that again.

Tech tubers who are either stupid or far too 'loyal' to Nvidia are at fault here for saying cards like the 3070 had enough, when clearly they don't and never did.
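For anyone who wants the arithmetic behind the bus-width point, here's a minimal sketch, assuming the usual layout of one memory chip per 32-bit channel (the function and figures are illustrative, not from any vendor datasheet):

```python
# Back-of-the-envelope VRAM arithmetic, assuming each GDDR6/GDDR6X chip
# sits on its own 32-bit slice of the memory bus (single-sided layout).

def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    """Total VRAM = number of 32-bit channels * capacity per chip."""
    return (bus_width_bits // 32) * chip_gb

print(vram_gb(192, 2))   # 12 GB - RTX 3060 with 2GB ICs (as shipped)
print(vram_gb(192, 1))   # 6 GB  - the 2060-style 1GB-IC alternative
print(vram_gb(256, 1))   # 8 GB  - RTX 3070 as shipped
print(vram_gb(256, 2))   # 16 GB - a 3070 had 2GB ICs been used
```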
 
But surely, since the 4090, 7900 XTX, 3090 and RDNA 2 GPUs have all that VRAM, they should be getting better performance, especially at 1440p? Or wait... Just maybe, they have run out of grunt and thus require reducing settings and/or using a lower-quality DLSS/FSR preset, but no, instead let's make a big song and dance about how a £450, 2-3 year old mid-range GPU is struggling because of VRAM..... Never mind a £1,400+ new GPU ******** the bed too :cry:

If you can't see that point then there's no helping you, but then again, if you have spent £1k+ to also get a **** experience then of course people will be in denial over that, hence the lack of responses :)

All he seems to see is "I have a 4090, na na, naa na, naa naaa" :cry:

Why is this thread still open anyway? :D
 
More like £500, mark my words.

Stop buying this junk. The moment people stop buying it and stop defending it is the moment this insanity will stop.

From the 6700 XT (a $479 GPU) and up, they all had 2GB memory ICs on them. What, AMD can but Nvidia can't?

The RTX 3060, with its 192-bit bus, had 12GB, more memory than everything up to and including the RTX 3080 (LOL). They put 2GB memory ICs on it because the alternative was the same 1GB memory ICs they put on all their other GPUs, which would have meant 6GB, and they were ridiculed for the 6GB RTX 2060 not having enough, so they didn't do that again.

Tech tubers who are either stupid or far too 'loyal' to Nvidia are at fault here for saying cards like the 3070 had enough, when clearly they don't and never did.
When the 3080 came out, 2GB GDDR6X modules were not available, so Nvidia's only options were to use a 384-bit bus with 12GB (which would have made it even closer to the 3090), to put 16GB or 20GB on it with chips on both sides of the card, or to use 16GB of 2GB GDDR6, which would have meant a large bandwidth cut and resulted in the card losing to a 6800 XT by a 10% margin.
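A rough sketch of the bandwidth trade-off being described: the 320-bit / 19 Gbps GDDR6X numbers are the 3080's actual spec, while the 256-bit / 16 Gbps GDDR6 line is my assumption of what "16GB of 2GB GDDR6" would have implied.

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8.
# The GDDR6 alternative config below is an assumption, not an Nvidia spec.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(320, 19))   # 760.0 GB/s - 10GB GDDR6X 3080 as shipped
print(bandwidth_gb_s(256, 16))   # 512.0 GB/s - hypothetical 16GB GDDR6 card
print(bandwidth_gb_s(384, 19))   # 912.0 GB/s - a 384-bit 12GB GDDR6X variant
```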
 
Looks like @tommybhoy was right after all; he reported quirks with his 3070 when you get near full memory utilisation. I saw similar on my 5700 XT in a few games too, when I had one.
Yep, 16GB would have been perfect for the 70, but I keep saying it: the 3070's got the body of 8pack with Mr Muscle's legs...

 
When the 3080 came out, 2GB GDDR6X modules were not available, so Nvidia's only options were to use a 384-bit bus with 12GB (which would have made it even closer to the 3090), to put 16GB or 20GB on it with chips on both sides of the card, or to use 16GB of 2GB GDDR6, which would have meant a large bandwidth cut and resulted in the card losing to a 6800 XT by a 10% margin.

The 3070 used GDDR6, not GDDR6X.


The 3080 released on September 17, 2020, and the 3090 on September 24, 2020, one week later. If 2GB GDDR6X wasn't out until a week later, delay it a week and make it a 20GB card.

No, you can't make the 3070 a 16GB card, because that would make the 3080 a 20GB card, and if the $699 3080 is a 20GB card then what's the point of the 24GB $1,499 3090?

There it is....

The 3080Ti was still a 12GB card.
 
The 3080 released on September 17, 2020, and the 3090 on September 24, 2020, one week later. If 2GB GDDR6X wasn't out until a week later, delay it a week and make it a 20GB card.

No, you can't make the 3070 a 16GB card, because that would make the 3080 a 20GB card, and if the $699 3080 is a 20GB card then what's the point of the 24GB $1,500 3090?

There it is....

The 3080Ti was still a 12GB card.
The 3090 has chips on both sides of the board. It wasn't until a year or so after release that 2GB GDDR6X chips were available, so putting chips on both sides of the 3080 and doubling the VRAM would have meant a card closer to £1,000 rather than £650.

IMO they should have used a 320-bit bus on the 3070 and made it a 10GB card, just with GDDR6.
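To make the board layouts concrete, here's a small sketch under the same one-chip-per-32-bit-channel assumption, where "clamshell" means two chips per channel, one on each side of the PCB; the 320-bit 10GB 3070 is the poster's suggestion, not a real product.

```python
# VRAM capacity for normal vs clamshell (chips on both sides) layouts.

def vram_gb(bus_width_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    chips = (bus_width_bits // 32) * (2 if clamshell else 1)
    return chips * chip_gb

print(vram_gb(384, 1, clamshell=True))   # 24 GB - 3090, 1GB chips on both sides
print(vram_gb(320, 1, clamshell=True))   # 20 GB - a clamshell 3080 before 2GB GDDR6X existed
print(vram_gb(320, 1))                   # 10 GB - the suggested 320-bit GDDR6 3070
```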
 
The 3090 has chips on both sides of the board. It wasn't until a year or so after release that 2GB GDDR6X chips were available, so putting chips on both sides of the 3080 and doubling the VRAM would have meant a card closer to £1,000 rather than £650.

IMO they should have used a 320-bit bus on the 3070 and made it a 10GB card, just with GDDR6.

So put chips on both sides of the 3080.....
 
I think lots of cheaper GDDR6 ought to be possible on the majority of cards, e.g. 12GB, especially because in 2023 there is reportedly an oversupply of DRAM.

I don't see a massive need for GDDR6X, except to increase framerates a little. It will seem like only a small improvement compared to GDDR7, which we are very likely to see with the next generation of cards in 2024.
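For a sense of scale on that point, a small sketch of per-pin rates: the GDDR6 and GDDR6X speeds are typical shipping rates, while the GDDR7 figure is an assumed launch speed, since the 2024 parts weren't out when this was written.

```python
# Bandwidth on a fixed 256-bit bus at different per-pin data rates.
# The GDDR7 rate is an assumption for illustration only.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

for name, gbps in [("GDDR6  @ 16 Gbps", 16),
                   ("GDDR6X @ 21 Gbps", 21),
                   ("GDDR7  @ 28 Gbps (assumed)", 28)]:
    print(f"{name}: {bandwidth_gb_s(256, gbps):.0f} GB/s on a 256-bit bus")
```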
 
But surely, since the 4090, 7900 XTX, 3090 and RDNA 2 GPUs have all that VRAM, they should be getting better performance, especially at 1440p? Or wait... Just maybe, they have run out of grunt and thus require reducing settings and/or using a lower-quality DLSS/FSR preset, but no, instead let's make a big song and dance about how a £450, 2-3 year old mid-range GPU is struggling because of VRAM..... Never mind a £1,400+ new GPU ******** the bed too :cry:

If you can't see that point then there's no helping you, but then again, if you have spent £1k+ to also get a **** experience then of course people will be in denial over that, hence the lack of responses :)

Resident Evil 4 runs great on my 4090, and no, it doesn't crash like it does on 8GB cards. But that isn't what you wanted to hear. It's a waste of my time even typing this response, since you're just going to go off on some tangent again. People aren't responding because they don't want to read your massive monologues about nothing. You should try making YouTube content, it's a better format; people will watch 10-minute videos before they read your 10-minute posts.
 
Resident Evil 4 runs great on my 4090, and no, it doesn't crash like it does on 8GB cards. But that isn't what you wanted to hear. It's a waste of my time even typing this response, since you're just going to go off on some tangent again. People aren't responding because they don't want to read your massive monologues about nothing. You should try making YouTube content, it's a better format; people will watch 10-minute videos before they read your 10-minute posts.

VincentHanna did have it crash and he has a 4090. A bunch of others on Steam with a 4090 have too.

Yeah, but let's ignore those users and believe the one who has an e-peen VRAM agenda :p
 
More like £500, mark my words.

Stop buying this junk. The moment people stop buying it and stop defending it is the moment this insanity will stop.

From the 6700 XT (a $479 GPU) and up, they all had 2GB memory ICs on them. What, AMD can but Nvidia can't?

The RTX 3060, with its 192-bit bus, had 12GB, more memory than everything up to and including the RTX 3080 (LOL). They put 2GB memory ICs on it because the alternative was the same 1GB memory ICs they put on all their other GPUs, which would have meant 6GB, and they were ridiculed for the 6GB RTX 2060 not having enough, so they didn't do that again.

Tech tubers who are either stupid or far too 'loyal' to Nvidia are at fault here for saying cards like the 3070 had enough, when clearly they don't and never did.

Not buying this garbage is not going to work. NV etc. are already rolling out their next business model with subscription access to 4080/4090 GPUs via GeForce Now. This is obviously where they want us to be, and it is way easier to generate profits when they can blame lag or low graphics fidelity on your own ISP.

Pricing GPUs out of the reach of the average consumer is a win-win for NV, and AMD do not care either because they win on console sales... I may be drunk and this may be the weirdest thing I say in my life, but "Help me, Intel, only you can save me now..." We need real competition in the market, not just the good old boys running a monopoly to maximise profits.

I, you, or Dave next door may be able to afford the most up-to-date hardware, but the sales figures and the Steam hardware surveys give a clear indication of slower uptake of the most modern GPUs. What this means for the progress of new games is unknown: if a games developer is making a game for under 10% of its target audience, is that a good idea?
 

I don't play it and do not really care, but is Hogwarts Legacy a Lumen software RT game? If it is, then AMD GPUs do very well, and it is not until hardware RT is enabled that NV gets its mojo back.

This could just be the first of many instances of games producers aiming for the lower-tiered products to maximise sales. Why sell to the 1% and ignore the other 99%? Or it could just be a lazy Unreal Engine 5 game that uses software RT and gives an advantage to RDNA 2.

If anyone was daft enough to think that an 8GB 3070 was going to be a better long-term card than a 16GB 6800, then I despair. The AMD fine wine shizzle is true: if you want performance for 2 years buy NV, if for 5+ buy AMD, imho.
 