NVIDIA 4000 Series

It's silly, but all self-inflicted. Looking forward to those 2k GPUs :rolleyes:.

Yes, I don't believe the 4090 will be the same price as the 3090; I think it will be $2,000 for the 4090 and $2,500 for the 4090 Ti, if the performance the rumours are claiming is true. Nvidia will be charging by the frame rate compared to last gen: "hey, it's X amount faster, so look at this price". The "more you buy, the more you save" line will be used a lot this upcoming gen for sure. Then they will gloss over the power use, and when the 4080 comes out it will be the same as the 3080: they will gloss over the VRAM, or some other thing they've neutered to make sure people upgrade again when the 5000 series comes out. The 3080 this gen was a disgrace regarding VRAM and really should have been 12GB from the start, with 16GB for the 3080 Ti, and 24GB/48GB versions of the 3090 and 3090 Ti (the 48GB for pro users), but that option only came in the A-series cards (the Quadros of the past).
 

:cry:
 
I've never seen my 3080 use all its VRAM, or seen any issues or loss of performance, at 3440x1440 UW at 100Hz in any game, even Cyberpunk with everything on. Perhaps at 4K?
 
Nvidia can't currently shift 3090s for £1,400, so what makes you think they'll increase the price of the 4090 by another 25% in the middle of a recession and a crypto bear market? They might shift a few of the first batches to people with more money than sense, but without miners, or the ability of gamers to mine on the side to offset costs, those willing to pay such amounts will soon dry up.

Nvidia could have put 12GB on the 3080, but then what would have been the point of the 3080 Ti? Also, if Nvidia had put 16GB on the 3080 Ti, it would have been slower than a 10GB/12GB 3080, because a 16GB configuration means a narrower 256-bit bus and therefore handicapped bandwidth.
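For anyone who wants the maths behind that: peak memory bandwidth is just bus width times per-pin data rate, so capacity and bus width trade off against each other. A minimal sketch (Python; the 19 Gbps figure assumes the 3080's GDDR6X is kept, and the 16GB card is hypothetical):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(320, 19.0))  # actual 10GB 3080, 320-bit bus -> 760.0 GB/s
print(bandwidth_gbs(256, 19.0))  # hypothetical 16GB card, 256-bit bus -> 608.0 GB/s
```

So the hypothetical 16GB card would give up roughly 20% of its bandwidth in exchange for the extra capacity.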
 
Damn that pitiful 3080 and its measly 10GB VRAM! I feel shafted having paid £650 and encountering all these VRAM issues that are holding back the 3080... Oh wait... :cry:



In all seriousness though, can't wait for the 4070/4080 and the extra RT grunt they'll bring :cool: Hopefully in time for the Avatar game!


I play at both 4K and 3440x1440 and have never encountered VRAM issues, except in Cyberpunk when I added several 4K-8K texture packs.
 
Cos the 4090 goes brrrrrr; there will always be a group that will pay anything to get the latest and greatest. It is what it is.
 
To be fair, it's not an issue for rich people like you ;) more the users who buy a card and run it for quite a few years. Often the cards are still capable, but they start to run out of VRAM as games get more demanding. Almost as if by design...
 
Same way people have to turn down settings due to lack of grunt, as is happening right now, isn't it? Plenty of 3090 (Ti) users are having to reduce settings in several games as of today, and this is only going to get worse as more RT effects get dialled up/added; not to mention, new engines such as UE5 will definitely require more powerful GPUs.

My last few GPUs (AMD) had more VRAM than the competition/norm, but it wasn't of any use since settings had to be turned down due to lack of grunt anyway.
 
Correct me if I'm wrong, but there have been a few cards in the recent past which had two different amounts of VRAM on the same GPU. As far as I was aware, the ones with more VRAM aged better? I.e. still usable, and even with lower settings they could still perform better than the lower-VRAM versions, which would have to lower settings even further.
 
Okay, so were the VRAM defenders consistent back when the 1080 Ti came out, and complained that its 11GB was far too much and a waste? :p

Anyway, while modded Skyrim, Flight with tons of mods, etc. might be niche, there are a few people here who are interested in that niche.

As for the power usage: it would be nice if Nvidia were ridiculed like with Fermi or Hawaii, but I suspect 99% of reviewers will gloss over it, if for no other reason than to not jeopardise future reviews.

Nail on the head. I've never seen Nvidia/AMD give away that much performance before. If they did manage those perf gains, there would be some hand-rubbing and the opportunity to stretch it over a generation or two, drip-feeding the performance whilst making more money.

If it is true, it would be a win-win for consumers. The biggest perf jump in recent times came from the 1080 Ti; I can't see the 4000 series offering up what people are saying now (a 4070 beating a 3090, etc.) for cheap. I hope it's true, I just can't see it.

If there was zero competition, I think this is what we would see: Kepler spanning most of the 600 and 700 series, or Hawaii going from the 200 to the 300 series, etc.

Having said all that, there is no doubt that nodes are getting scarcer and the time between new ones is drawing out. If we were still on the 90s or 00s model of a new node bringing twice the transistors for less power at a similar price every 18 months, then I don't think we would be seeing the next gen top cards heading over 400W.

Point being, with node progression stalling, companies almost have to do a bit of drip-feeding. If 4nm came in 2023 and 2nm in 2024, but the next node after that wasn't until 2030, then the whole industry would have to change.
 
Nowadays Nvidia are under more pressure from AMD. That must have an impact on performance/pricing in this round of releases.
 
Correct me if I'm wrong, but surely the VRAM's mostly just holding textures? Increasing texture quality barely affects performance in my experience. My six-year-old 8GB RX 480 is still playing almost anything on HIGH with ULTRA textures for this reason.

Vram "issues" is just edge cases from edge lords.

o.....k..... those edgy kids with their... memory capacity superiority complexes.
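Back-of-envelope on why textures are the big VRAM consumer (a minimal sketch; sizes assume a full mip chain at roughly 1.33x the base level, and the BC7 figure assumes typical block compression at 1 byte per texel):

```python
# Approximate VRAM footprint of a single square texture, including mipmaps.
def texture_mb(side_px: int, bytes_per_texel: float) -> float:
    return side_px * side_px * bytes_per_texel * 1.33 / 2**20

print(f"4K RGBA8 (4 bytes/texel): {texture_mb(4096, 4):.0f} MB")  # ~85 MB
print(f"4K BC7   (1 byte/texel):  {texture_mb(4096, 1):.0f} MB")  # ~21 MB
print(f"8K BC7   (1 byte/texel):  {texture_mb(8192, 1):.0f} MB")  # ~85 MB
```

Which is also why a pile of 4K-8K texture-pack replacements can blow past a 10GB budget when stock settings never do.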
 
As soon as the 2080 Ti sold out at £1,400, they realised that there was no figurative £1,000 barrier and enthusiasts would buy no matter what the price.
It's a different world now. The new GPUs will cost more to produce, pretty much like everything else, but unless Nvidia and/or AMD are just after low-volume, high-margin sales, they'll have to price accordingly.
 
Of course, if you're comparing exactly the same models with the only difference being VRAM, then the higher-VRAM model will age better over the long run. The question is how long it takes for the benefits of having more VRAM to "really" show, though. E.g. look at the 290s (I had the 4GB model): it took a long time (years) for the 8GB model to show a real benefit over the 4GB one, and whilst the 4GB model had to sacrifice more settings (primarily textures), the 8GB model's performance wasn't exactly anything to shout home about either and it still had to drop just as many settings to get acceptable fps. By that time you had far better and cheaper GPUs available; any PC gamer, even a budget one, will more than likely upgrade after 3-4 years, when even a low-end GPU beats a 3-4 year old high-end one.

Then the other consideration is price: how much extra is more VRAM worth? Would you say the extra £600+ for 2GB more over the 3080 is worth it? Bearing in mind you'll probably be able to pick up a 4070, which will match/beat a 3090, for £500-600....

The key is buying a card with the right balance and upgrading at the right time to get a significant leap in perf. I.e. I think 8GB is too little for 1440p and above going forward (especially where AMD-sponsored titles are concerned), but 10GB should be fine for the majority of things. Look at the 2080 Ti: it has more VRAM than the 3080 yet still loses in pretty much every gaming scenario. As I always say, why do people upgrade their GPUs? Is it solely for more VRAM and nothing else? Or is it for the performance they output?

Not to mention, for all we know, once DirectStorage gets properly used in games (hopefully sooner rather than later), it might completely change the VRAM situation, i.e. the VRAM amount (within reason) might not be as much of a consideration because of the way assets get streamed in (at least that's the theory of it, according to Nvidia and the LinusTechTips video on it).
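Hand-wavy sketch of that streaming idea (just the concept, nothing to do with the actual DirectStorage API; the class, asset names, and sizes are all made up):

```python
from collections import OrderedDict

# Toy model of asset streaming: only the current working set of assets needs to
# be resident in VRAM; everything else is pulled from the SSD on demand, so the
# required VRAM tracks the scene, not the whole game install.
class VramCache:
    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB, in LRU order
        self.used_mb = 0

    def request(self, asset: str, size_mb: int) -> str:
        if asset in self.resident:
            self.resident.move_to_end(asset)  # already resident, cheap to reuse
            return "hit"
        while self.used_mb + size_mb > self.budget_mb:
            _, evicted_mb = self.resident.popitem(last=False)  # evict LRU asset
            self.used_mb -= evicted_mb
        self.resident[asset] = size_mb  # "stream in" from disk
        self.used_mb += size_mb
        return "streamed in"

cache = VramCache(budget_mb=10_000)                 # e.g. a 10GB card
print(cache.request("city_block_textures", 3_000))  # streamed in
print(cache.request("city_block_textures", 3_000))  # hit
```

If the streaming is fast enough, the capacity question becomes "is the working set under budget?" rather than "does everything fit?".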


On the modding niche: I heavily modded both Skyrim and Fallout 4 not long back, as well as other games like GTA 5 Redux, with no issues at all. Only Cyberpunk with the several 4K-8K texture packs has had issues with VRAM.


More or less, it is; see the bit above in response to Troezar. Although reducing other settings can also reduce VRAM usage (not to mention using DLSS or FSR), obviously not as much as the texture setting though.
 
I mean, I sort of see your point, except I've made it six years and it's still just about good enough, whereas a 4GB card maybe not. I honestly couldn't tell you when 4GB stopped being enough, though, or whether it was worth it overall. Also, I expect I'm going to upgrade this year or early next, depending on what either party puts out.
 