NVIDIA 4000 Series

To be fair it's not an issue for rich people like you ;) more for the users who buy a card and run it for quite a few years. Often the cards are still capable, but they start to run out of VRAM as the games get more demanding. Almost as if by design...
Prior to the 3000 launch, I remember posts on here from people with a 2080 Ti saying that their card was running out of "grunt" at 4K. Obviously, settings can be turned down to improve frame rates and/or to reduce VRAM usage, but some prefer the full bells-and-whistles experience.
 
Hey old internet buds. All good here but not been gaming at all and moved on to drones. Starting to miss it a bit now, so want to get back into it (not in this heat though). Rusty I have no idea about in truth, but I think he found girls :D :D

Good to see you back Gregster. Hope you stick around this time. Always did enjoy debates and banter/windups with you :cry:

You should grab the most expensive GPU next gen and if people start making lube jokes etc, just tell them: it's my money, I worked hard for it and I will spend it as I see fit! :p:D;):cry:
 
To be fair it's not an issue for rich people like you ;) more for the users who buy a card and run it for quite a few years. Often the cards are still capable, but they start to run out of VRAM as the games get more demanding. Almost as if by design...

Okay, so were the VRAM defenders consistent back when the 1080 Ti came out, and did they complain that its 11GB was far too much and a waste? :p

Anyway, while modded Skyrim, Flight with tons of mods, etc. might be niche, there are a few people here who are interested in that niche.
 
Not really - Nvidia's 3000 series outsold AMD's 6000 series on a monstrous scale.

Some sources: https://webtribunal.net/blog/gpu-market-share/#:~:text=The market share of Nvidia's,remaining 17% going to AMD.&text=In 2020, 80% of the,less than a year later!

83% discrete market share for Nvidia vs 17% for AMD. Note AMD have lost market share since 2020, according to the Jon Peddie Research cited.
I'm talking about performance. AMD have caught up with Nvidia on raster performance. If they improve ray tracing significantly, then the difference could just be price. Do you think Nvidia will release another 80 card for £650?
 
Yesterday my Quest 2 HUD showed that 20 out of 24GB VRAM was reserved/used when playing my modded SkyrimVR with lots of 8k and 16k textures. :eek:
Well, there goes my idea to buy a used 16GB card once they come down in price, to play Skyrim Wabbajack modlists without any VRAM issues. Have no intention of trying VR though, so maybe I will be fine with only 16GB.
 
Well, there goes my idea to buy a used 16GB card once they come down in price, to play Skyrim Wabbajack modlists without any VRAM issues. Have no intention of trying VR though, so maybe I will be fine with only 16GB.
You will be fine with 4K textures I am sure. With VR, not only is the resolution higher, but lower-resolution textures look significantly worse to me close up. I do need to do some optimizing though with texture sizes and packs to get things in a nice sweet spot!
 
Not really - Nvidia's 3000 series outsold AMD's 6000 series on a monstrous scale.

Some sources: https://webtribunal.net/blog/gpu-market-share/#:~:text=The market share of Nvidia's,remaining 17% going to AMD.&text=In 2020, 80% of the,less than a year later!

83% discrete market share for Nvidia vs 17% for AMD. Note AMD have lost market share since 2020, according to the Jon Peddie Research cited.
I think Nvidia was better at mining plus RT, so... it's only logical that they came out on top.
 
I'm talking about performance. AMD have caught up with Nvidia on raster performance. If they improve ray tracing significantly, then the difference could just be price. Do you think Nvidia will release another 80 card for £650?
Based on what AMD have told us, I wouldn't hold out much hope on the RT front. IIRC their words were more or less "RT for RDNA 3 is more advanced than RDNA 2", which doesn't exactly fill me with confidence, as literally anything would be better than RDNA 2 RT performance... If they had come out and said "RT will exceed Ampere" then we would be talking.
 
I think Nvidia was better at mining plus RT, so... it's only logical that they came out on top.
You also had DLSS being a big perk for a vast number of people; AMD had nothing to compete with this.

Would say one of the main reasons was also down to stock/supply. AMD stock was pitiful but no surprise given they were supplying 80% of their stock to consoles.

Also, not having their store available in as many countries as Nvidia didn't help matters; a lot of people had zero chance of grabbing an AMD card at MSRP.
 
Yesterday my Quest 2 HUD showed that 20 out of 24GB VRAM was reserved/used when playing my modded SkyrimVR with lots of 8k and 16k textures. :eek:

I'm pretty sure God of War also used over 20GB of VRAM on my 3090 when I was playing it a few months back at 4K.

Probably just a case of "if it's there, use it, if not, don't".
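
For anyone curious what those overlay numbers actually measure, here is a minimal sketch (Python, assuming the optional pynvml module from the nvidia-ml-py package is installed) that reads the same figure the driver reports. The "used" value is memory allocated on the card, not memory the game strictly needs, which is why an engine can happily fill 20GB of a 24GB card.

```python
# Minimal sketch: read reported VRAM usage via NVML (assumes `pip install nvidia-ml-py`,
# which provides the pynvml module). The "used" figure is memory allocated on the
# device, not memory the running game strictly needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # .total / .free / .used in bytes

gib = 1024 ** 3
print(f"VRAM used:  {mem.used / gib:5.1f} GiB")
print(f"VRAM free:  {mem.free / gib:5.1f} GiB")
print(f"VRAM total: {mem.total / gib:5.1f} GiB")

pynvml.nvmlShutdown()
```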
 
I'm pretty sure God of War also used over 20GB of VRAM on my 3090 when I was playing it a few months back at 4K.

Probably just a case of "if it's there, use it, if not, don't".
Mostly, but have a gander at this Wabbajack modlist recommendation:
Aldrnari is meant to use every single inch of my computer, and here are my specs: i7-7700K, Zotac 1080 Ti (11GB of VRAM), 32GB of 3200MHz DDR4 RAM. Full PC Part Picker setup is here. I would recommend at least 8GB+ of VRAM for 1080p, and for 1440p you will need a minimum of 10GB of VRAM, although more is highly recommended. There are tweaks at the bottom of this readme for 6GB of VRAM users, but they void support. I do not have stable 60FPS at 1440p everywhere with my setup because, frankly, I do not care about framerate if combat is fluid and I can take sexy screenshots.
Now that is extreme, but like I said before, a studio would strike a balance in terms of effects, models and textures, while most modders go texture crazy.
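
To put some rough numbers on the "texture crazy" point, here's a quick back-of-the-envelope sketch (Python; the bytes-per-texel figures are the standard ones for compressed vs uncompressed formats, and the ~33% mipmap overhead is the usual rule of thumb, so treat the output as ballpark only) showing why a load order full of 8K/16K textures chews through VRAM so much faster than the assets a studio would typically ship.

```python
# Rough sketch: approximate VRAM footprint of a single square texture.
# Bytes per texel: 4 for uncompressed RGBA8, 1 for BC7/BC3 block compression.
# The 4/3 factor approximates the extra cost of a full mipmap chain.
MIP_OVERHEAD = 4 / 3

def texture_mib(size_px: int, bytes_per_texel: float) -> float:
    """Approximate size in MiB of a square texture including mips."""
    return size_px * size_px * bytes_per_texel * MIP_OVERHEAD / 2**20

for size in (1024, 2048, 4096, 8192, 16384):
    bc7 = texture_mib(size, 1)    # block-compressed (typical shipped asset)
    rgba = texture_mib(size, 4)   # uncompressed (what some texture mods ship)
    print(f"{size:>5}px: ~{bc7:7.1f} MiB compressed, ~{rgba:7.1f} MiB uncompressed")

# A 16K uncompressed texture lands around 1365 MiB on its own, so a handful of
# those on top of the base game already explains a 20GB "reserved/used" reading.
```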
 
I'm pretty sure God of War also used over 20GB of VRAM on my 3090 when I was playing it a few months back at 4K.

Probably just a case of "if it's there, use it, if not, don't".
That's generally the case for most of these supposed "VRAM heavy" scenarios.

Sounds like you had a memory leak problem with GOW, although I do recall a patch to fix this, as quite a few people supposedly had their VRAM and/or RAM being eaten up. Personally I didn't have any issues with my 3080 at 4K using the launch day version.
 
You also had DLSS being a big perk for a vast number of people; AMD had nothing to compete with this.

Would say one of the main reasons was also down to stock/supply. AMD stock was pitiful but no surprise given they were supplying 80% of their stock to consoles.

Also, not having their store available in as many countries as Nvidia didn't help matters; a lot of people had zero chance of grabbing an AMD card at MSRP.

I would probably have tried a 6800 XT but couldn't get one from the AMD store; eventually got a 3080.
 
I got the 3080 FE from the drop. Even if AMD had had a UK store I would still have gone with Nvidia because of the increased RT performance - even though I hardly used it on anything, just having the option - and also DLSS.

But AMD have closed the gap; hopefully they can continue doing that.

Edit: forgot to add it was better at mining too, literally paid for itself and then some hehe
 
That's generally the case for most of these supposed "VRAM heavy" scenarios

Agreed, but at the end of the day it's down to how well the developer optimises the engine for the various hardware out there.

I had no issue with GOW using 20GB+, it was just strange to see :D And performance never faltered.

Although I don't agree with the "3080 10GB is enough" argument/thread, I did on one occasion in Cyberpunk have the fps tank when the engine hit the 10GB ceiling, in one location I found (when I owned a 3080).

My view is that the engine should not have got to that limit and caused a massive performance loss. I feel there was no reason to have all 10GB populated for what was happening on screen. IMO ;)
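
On the "fps tanks when the ceiling is hit" point, the usual explanation is that once VRAM is full the driver starts paging resources over PCIe into system RAM, which is an order of magnitude slower than on-card memory. A rough comparison below, using published ballpark bandwidth figures (3080 GDDR6X around 760 GB/s, PCIe 4.0 x16 around 32 GB/s), not measurements:

```python
# Back-of-the-envelope: why spilling past the VRAM limit hurts so much.
# Bandwidth figures are published ballpark numbers, not measurements.
GDDR6X_3080_GBPS = 760.0   # RTX 3080 on-card memory bandwidth, GB/s
PCIE4_X16_GBPS = 32.0      # PCIe 4.0 x16 link, GB/s (theoretical peak)

ratio = GDDR6X_3080_GBPS / PCIE4_X16_GBPS
print(f"On-card memory is roughly {ratio:.0f}x faster than the PCIe link,")
print("so any asset the engine has to fetch from system RAM mid-frame")
print("stalls the GPU far longer than a normal VRAM read would.")
```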
 