
NVIDIA 4000 Series

Yeah there's no way those awful medium textures should be the best an 8GB VRAM GPU can hope for. At some point we may have to suck it up though and accept that VRAM management is not a priority and we just need to get MOAR. :)
 
This is what I said on the YouTube video (the review of TLOU Part 1 on the HUB channel). I think Hardware Unboxed clickbaited PC gamers with the assertion that 8GB of VRAM in 2023 is no longer good enough, just to get views.

I made the point that their review covers only one game of many, and that this game was known to be extremely badly programmed for PC, its source code originating from a decade-old game dressed up with flashy new textures.

I will look at the Digital Foundry article. I just wish HUB would retract their flawed review.

100%

I said it a bit further back too: their core fanbase/viewers are rabid AMD fanboys, and these are the ones who are the most vocal online, hence why HUB play to them; their clickbait titles and views get more online activity, i.e. "nvidia bad, amd good". You just have to witness what happens on forums like this: AMD issues (and similar issues too) get hardly any activity/posts, while Nvidia or Intel issues (even very minor ones) make threads explode. Again, I don't blame them; they make a living, and controversy is what gets a **** ton of money.

Don't get me wrong, HUB provide great "product" reviews, but that is it. For any kind of in-depth technical analysis, DF can't be beat; they are the only ones to point out what causes the problems (and, as shown by the Twitter replies, people are backing up those statements with their own evidence). It is because of them that developers have taken note and added things like shader pre-compilation and addressed other issues. Meanwhile, HUB's answer is "said manufacturers just need to provide more hardware specs", which requires you to spend even more ££££ to bypass/brute-force the issues :cry:
 
I'm not sure why people expect to be able to run newly released AAA games at Ultra settings on a high-res monitor at blinding framerates without a bleeding-edge card. Games will inevitably outpace 8GB cards; all you need to do is turn the settings down. If that weren't the case, the question you should be asking is why those AAA games aren't making the most of the power available in those bleeding-edge cards.
 
This is what I said on the YouTube video (the review of TLOU Part 1 on the HUB channel). I think Hardware Unboxed clickbaited PC gamers with the assertion that 8GB of VRAM in 2023 is no longer good enough, just to get views.

I made the point that their review covers only one game of many, and that this game was known to be extremely badly programmed for PC, its source code originating from a decade-old game dressed up with flashy new textures.

I will look at the Digital Foundry article. I just wish HUB would retract their flawed review.
If you run the game long enough, it gets to nearly 17-18GB of VRAM usage after 4-5 hours of gameplay at 4K. Basically nothing apart from the 7900 XT, 7900 XTX and 4090 can run this at 4K, and that's on top of the awful stutters to sub-60 fps even on 4090s. I'm not even getting into the 32GB of recommended system RAM and the 100% CPU usage.

Any professional review outlet would have stopped testing this game at that point, but HUB soldiered on to make it fit their agenda.
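For anyone wanting to check this kind of VRAM creep on their own system, the growth rate can be worked out from periodic memory samples. A minimal sketch with invented figures; in practice you could collect real samples with `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits -l 60`:

```python
# Sketch: estimate VRAM "creep" from a log of periodic memory samples.
# The log below is hypothetical, chosen to mirror the ~8GB -> ~17GB claim above.

def vram_growth_mb_per_hour(samples_mb, interval_s=60):
    """Average growth rate across the whole log, in MB per hour."""
    if len(samples_mb) < 2:
        return 0.0
    elapsed_h = (len(samples_mb) - 1) * interval_s / 3600
    return (samples_mb[-1] - samples_mb[0]) / elapsed_h

# Hypothetical session: one sample per hour, climbing from 8GB to 17GB.
log = [8000, 10500, 13000, 15200, 17000]
print(vram_growth_mb_per_hour(log, interval_s=3600))  # 2250.0 MB/h
```

At that rate an 8GB card would be out of headroom almost immediately, which is consistent with the complaint above.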
 
Digital Shilleries - an Nvidia PR mouthpiece attacking another channel that actually calls out how ***** they are? Gasp, I am surprised.
The thing I think is daft is that the game shows, IIRC, 1.6GB of VRAM used for "OS & Apps" at 1440p using DLSS, and almost 5GB at 4K. HUB saw this but didn't mention it; DF did mention it, but that's about it. If the game isn't using that VRAM but is reserving it for some reason, then it might be possible to reduce it, making 8GB cards usable, if still not ideal. While I agree 8GB was a **** decision for the 3070, and maybe even the 3060 Ti (which I have), if the game is doing stupid stuff with the VRAM then it's not really the card's fault.
 
I'm not sure why people expect to be able to run newly released AAA games at Ultra settings on a high-res monitor at blinding framerates without a bleeding-edge card. Games will inevitably outpace 8GB cards; all you need to do is turn the settings down. If that weren't the case, the question you should be asking is why those AAA games aren't making the most of the power available in those bleeding-edge cards.

Read this post:


It's nothing to do with expecting to be able to max a game out; rather, the game is fundamentally flawed in its port to PC.

The only benefit all that extra VRAM is providing is to bypass/avoid the core issues; it is not actually being utilised for any worthwhile benefit over GPUs with less VRAM. Essentially people are paying all those ££££ to avoid such issues, and even then they still aren't avoiding the core issues, as shown in this thread and in the DF video etc.

So far, after all this time, the only real benefit of large VRAM pools is still high-res mods/texture packs and/or 4K gaming where you refuse to use DLSS/FSR.
 
If you run the game long enough, it gets to nearly 17-18GB of VRAM usage after 4-5 hours of gameplay at 4K. Basically nothing apart from the 7900 XT, 7900 XTX and 4090 can run this at 4K, and that's on top of the awful stutters to sub-60 fps even on 4090s. I'm not even getting into the 32GB of recommended system RAM and the 100% CPU usage.

Any professional review outlet would have stopped testing this game at that point, but HUB soldiered on to make it fit their agenda.
A couple of years back, HUB stated that 1080p was no longer relevant for testing, and that only 1440p and 4K were pertinent for GPU testing. I again disagreed with them, because 1080p is ideal for high-refresh competitive gaming and for identifying CPU bottlenecking of your GPU.

I think The Last of Us is yet another poorly optimised port that fails as a PC game.

God of War, a PS4 game, runs my RTX 3070 into the ground when I play it. I think this is down to lazy optimisation or programming for the PC platform. However, compared to TLOU, God of War is a good Sony PC re-release.
 
Maybe they really are using the VRAM as a place to store things, like they could on a unified-memory console? Seems strange, but possible.

Thing is, rather than moaning about "bad console ports", people do have to ask themselves: what drives gaming, and which platform gets developed for first?

Yes, some things are due to sloppy ports, but a game may have been in development for a couple of years and made all kinds of assumptions about the system (VRAM amount, core count, decompression hardware etc.), with the PC port an afterthought...

In an ideal world, the PC port would have been in the thinking at the design stages; no ports would be rushed etc.

However, as things stand... This is only going to get worse.

It might even be that matching a PC to 8 Zen 2 cores is futile, and the nature of PC ports and the lack of disk-decompression hardware mean that 12+ Zen 3/Zen 4 cores will be required to max things out.
 
The higher/highest presets are used by reviewers/tech sites to compare the performance of different GPUs. If a GPU drops below its expected place in the list because of a lack of VRAM, it's going to get called out.

However, in the real world, if you find you don't have enough VRAM then you simply lower some settings to get back within your limit. If a game runs like **** with RT enabled, then turn it off (like most people do). It's fairly basic stuff.

And if the devs optimise to better manage VRAM usage, then you can try again at higher settings and ask the review sites to look at the GPU's performance again.


Please send me some eggs to suck.........
 
It's funny how all the hate is channeled against Nvidia when the problem lies primarily with the game(s). And for HUB to go and present higher VRAM as the "answer" is... amateur hour.

Star Citizen, with its gargantuan worlds and assets, fits in 8GB with no problems, but a linear, basic game does not. First thought? Nvidia skimped on the VRAM! :rolleyes:
 
It's funny how all the hate is channeled against Nvidia when the problem lies primarily with the game(s). And for HUB to go and present higher VRAM as the "answer" is... amateur hour.

Star Citizen, with its gargantuan worlds and assets, fits in 8GB with no problems, but a linear, basic game does not. First thought? Nvidia skimped on the VRAM! :rolleyes:

Star Citizen is a PC game from the ground up.

Every other AAA game developer considers the PC an afterthought. It's just that people who've spent almost £2k on a GPU feel they are owed something in return, lol.
 
New steam hardware survey out.


When looking at all video cards

RTX 4090 0.31%
RTX 4080 0.20%
RTX 4070 Ti 0.18%

So about 0.7% of all video cards are Ada.

Steam has in excess of 100 million users.
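Those shares can be turned into a rough card count. A back-of-the-envelope sketch using the figures quoted above (the 100 million user base is the thread's own estimate, not an official Valve number):

```python
# Rough estimate of how many Ada cards the survey shares above imply.
shares = {"RTX 4090": 0.31, "RTX 4080": 0.20, "RTX 4070 Ti": 0.18}  # percent

total_share = sum(shares.values())    # ~0.69% of surveyed cards
steam_users = 100_000_000             # "in excess of 100m users" (thread's figure)

estimated_cards = steam_users * total_share / 100
print(f"{total_share:.2f}% -> roughly {estimated_cards:,.0f} Ada cards")
# -> 0.69% -> roughly 690,000 Ada cards
```

This assumes one surveyed GPU per user, so it is only an order-of-magnitude figure, but it shows why 0.7% of a 100m base is still several hundred thousand cards.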


March is out.

Something has caused a massive jump in RTX 3070, RTX 3060 Ti, RTX 3060, RTX 2060 and GTX 1060 cards.

Can't really see any particular game that could have caused it. These were all good mining cards, though; other mining cards lower down the list have also jumped.

[Screenshot: Steam hardware survey results, 3 April 2023]


Ignoring that for a second,

RTX 4090 has dropped to 0.25%
RTX 4080 has dropped to 0.19%
RTX 4070 Ti has increased to 0.23%

These shares have obviously been affected by the flood of older cards in the survey.
 
Star Citizen is a PC game from the ground up.

Every other AAA game developer considers the PC an afterthought. It's just that people who've spent almost £2k on a GPU feel they are owed something in return, lol.
There are a handful of games that adopt a shady coding mentality, not the industry as a whole. These need to be called out. Blaming Nvidia will only get you so far.
 
These shares have obviously been affected by the flood of older cards in the survey.
Going off the GPUs listed, it coincides with the 40-series laptop launch.

Last-gen, and even more so older-gen, laptops would have been reduced to shift stock.

You can buy a 4080 FE and every other current GPU any day of the week, bar a 4090 FE, which took three days to "sell out" :o last drop; they're not selling any more.

Curious now for next month's steam survey.
 