• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Close this thread please, it's old and tired :)

No surprise, and yet we had lots of threads which got locked due to all the denial.

The threads were the 10GB one and the other troll/bait 12GB thread; we didn't have any 8GB threads, as most people, myself included, said 8GB defo wasn't enough for 4K going forward.

I haven't watched the full video, but it seems the titles he has picked to showcase these issues are widely regarded as terrible for optimisation and/or their RT implementations specifically. Sadly, the uninformed, or rather people without a background in the development industry, will just put such "optimisation" issues down to a lack of hardware spec. :o

What I find funny though is yes, VRAM has run out in the case of the 8GB GPUs in those poorly optimised games with RT on, but grunt has also run out on GPUs with 16/24GB+ of VRAM, i.e. settings also need reducing and/or a more aggressive FSR/DLSS preset, which reduces VRAM usage anyway.
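
For context on why dropping to a more aggressive upscaling preset eases VRAM pressure, here's a rough sketch of the internal render resolutions at 4K output. The scale factors are the commonly cited per-axis values for DLSS 2 / FSR 2, not pulled from any specific game, so treat them as ballpark figures:

```python
# Approximate internal render resolutions for a 4K output at the usual upscaler presets.
# Per-axis scale factors are the commonly cited DLSS 2 / FSR 2 values; games may tweak them.
OUTPUT = (3840, 2160)

presets = {
    "Quality":           1 / 1.5,   # ~0.667
    "Balanced":          1 / 1.7,   # ~0.588
    "Performance":       1 / 2.0,   # 0.5
    "Ultra Performance": 1 / 3.0,   # ~0.333
}

for name, scale in presets.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"{name:<18} renders at ~{w}x{h}")
```

Smaller internal render targets mean less VRAM spent on the frame itself, which is why the big-VRAM cards end up leaning on the same presets once the grunt runs out.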

[Attached benchmark screenshots: bzj4Bt2.png, rb7fIsB.png, e54c8Hy.png]

"is 16gb enough or is 24gb maybe enough? Discuss...."

:cry:

PS. Pretty sure there have been 4090 users also reporting crashes in RE 4 Remake when RT is enabled, so I'm not sure how this is down to VRAM alone.

But see this point again:

Sadly, the uninformed, or rather people without a background in the development industry, will just put such "optimisation" issues down to a lack of hardware spec. :o
 
The problem here is the progression. We went from 1.5GB/2GB (GTX 580/HD 6970) at the high end in 2011 to 11GB/12GB in 2016 with the GTX 1080 Ti/Titan Xp. The same high end has definitely slowed down, but the RTX 3090 Ti/RTX 4090 have 24GB, so it has doubled.

The sub-£350 graphics cards went from 512MB/1GB in 2011 to 6GB/8GB by 2016, and from 256MB/512MB to 2GB/4GB at the entry level.

So roughly a 6x to 8x increase in VRAM capacity at most tiers. But since then only one £300ish card has gone past 8GB, the RTX 3060 12GB, and the RX 6700 XT was above £400 at launch, so it is pushing into lower-end enthusiast pricing (the RX 6700 10GB was barely available). Mainstream cards should all be at 12GB~16GB of VRAM by now, and entry-level ones under £200 at 4GB~8GB.

So basically there has been a big stagnation at mainstream. Plus memory bandwidth isn't increasing as much, which adds to the problems.
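
A quick back-of-the-envelope check on those multipliers, using the capacities listed above (the exact ratio depends on which end of each range you pair up, so it straddles the 6x to 8x figure):

```python
# Sanity check on the 2011 -> 2016 VRAM multipliers quoted above (capacities in GB).
tiers = {
    "high end":    ((1.5, 2.0),  (11.0, 12.0)),  # GTX 580 / HD 6970 -> GTX 1080 Ti / Titan Xp
    "sub-£350":    ((0.5, 1.0),  (6.0, 8.0)),
    "entry level": ((0.25, 0.5), (2.0, 4.0)),
}

for tier, ((old_lo, old_hi), (new_lo, new_hi)) in tiers.items():
    ratios = (new_lo / old_lo, new_hi / old_hi)
    print(f"{tier}: {min(ratios):.0f}x to {max(ratios):.0f}x")
```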
 
So basically there has been a big stagnation at mainstream. Plus memory bandwidth isn't increasing as much, which adds to the problems.

I think we can safely sum up the gpu market right now as being **** on all fronts regardless of which vendor you prefer.
 
Lol, when you create a second '8GB enough' thread after there was already an '8GB enough' thread, and then have the brains to complain that the 12GB thread was baiting... comedy gold :cry:


 
I take back what I said; it looks like 8GB of VRAM isn't quite enough (short by around 400MB in WD: Legion).

While playing Watch Dogs: Legion, I noticed that exceeding the VRAM budget by playing at 4K (any DLSS setting) with Ultra textures (optional texture pack installed) and ray tracing on medium causes prolonged freezing in game.

It's very weird when it happens: your character just freezes and nothing happens for ~30 seconds, but the game doesn't crash. Afterwards, gameplay resumes as normal.

I don't mind too much, as I'm sure turning the texture setting down to High will fix the over-budget VRAM issue.
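
If anyone wants to confirm they're actually going over budget rather than guessing, here's a minimal sketch that polls dedicated VRAM usage while the game runs. It assumes an NVIDIA card and the nvidia-ml-py (pynvml) package; AMD owners would need a different tool:

```python
# Minimal VRAM poller - assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Note this reports total allocation on the card rather than what the game strictly needs, but if it's pinned at the full 8GB whenever the freezes hit, that's a pretty strong hint.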

There's a TechRadar article here that discusses the problem:
https://www.techradar.com/uk/news/n...6gb-vram-runs-watch-dogs-legion-more-smoothly

As a workaround for now, players can reduce the texture resolution setting from Ultra to any other setting, as the difference between Low and Ultra is about 4.2GB of VRAM.

Overall though, framerates seem much better with the 8-core CPU and new RAM: I can get 50-60 FPS at 4K DLSS Balanced, and a solid 60 on DLSS Performance.

Guess I'll be upgrading the GPU in 2-3 years if this happens in more than a few games.

So much for the nonsense peddled by some that you'll always run out of grunt before VRAM. 8GB was suitable VRAM for 2017, on cards costing £150, not on cards costing 4x that five years later...

People have been running out of VRAM for years!


:D
 
Well, that issue I talked about in that comment I wrote (quite a while ago, I think) was actually caused by a faulty Crucial BX500 2TB SSD (hundreds of thousands of read errors reported each day during gameplay after checking the SMART stats). It was quite a tough issue to track down, actually.

I think there's still a small problem with running out of VRAM on some cards in WD: Legion too, but it ended up being two separate things.
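
For anyone chasing a similar ghost, the SMART attributes are quick to pull. A minimal sketch, assuming smartmontools is installed and the drive sits at /dev/sda (a placeholder path, substitute your own):

```python
# Dump the SMART attribute table for a drive - assumes smartmontools is installed.
# "/dev/sda" is a placeholder; point it at the suspect drive.
import subprocess

result = subprocess.run(
    ["smartctl", "-A", "/dev/sda"],  # -A prints the vendor attribute table
    capture_output=True, text=True, check=False,
)
print(result.stdout)

# Attributes along the lines of Raw_Read_Error_Rate or Reported_Uncorrectable_Errors
# climbing by the hundreds of thousands per day are a strong sign the drive is on its way out.
```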
 
I've only briefly tried the RE4 demo but it's quite apparent that 8GB is not enough to enable all the bells and whistles. However, this comes as no surprise, of course.
 
What I find funny though is yes, VRAM has run out in the case of the 8GB GPUs in those poorly optimised games with RT on, but grunt has also run out on GPUs with 16/24GB+ of VRAM, i.e. settings also need reducing and/or a more aggressive FSR/DLSS preset, which reduces VRAM usage anyway.

Still waiting for an answer........

"is 16gb enough or is 24gb maybe enough? Discuss...."

:cry:
 
My 2080 only has 8GB of VRAM and I get warnings all the time about running out of VRAM; Deathloop and Guardians of the Galaxy were two recent ones. I just have to keep lowering settings or resolution until it works.
 
The way it's meant to be played....... so the recommendation is that this is what we should just accept from modern games....... I wonder if console games will do the same.

Source: TechPowerUp Hogwarts Legacy review

Just like in most other recent releases, shader compiling and stuttering is a problem in Hogwarts Legacy. Right on startup, before the main menu, the game presents you a "compiling shaders" screen that lasts for three to five minutes. Unfortunately the shaders that get compiled at this time are only the most essential ones. As you progress through the game you'll encounter serious drops in framerate (to unplayable levels) for about 30 seconds. Just stop and wait for the shader compiler to finish—this breaks the immersion of course and I wonder if this can't be solved more elegantly. I investigated a bit further and the game is compiling shaders in the background even during normal stutter-free gameplay, too, without affecting the game much. We've seen such problems in many games recently and I keep hearing how these will get fixed in the "day one" patch, "soon" after launch, maybe with the next big patch, or with the next DLC. For nearly all games these issues never get fixed, and there's never any noteworthy performance improvements, so /doubt. My recommendation is to just live with it, stop playing for a few seconds when you get hit by the stutter, take a short break, and resume when things are smooth again. Just to clarify, for 98% of my play time in the first few hours there was no stutter and no frame drop issues, everything was super smooth.


Even without shader trouble, performance requirements of Hogwarts Legacy are very high. In order to reach 60 FPS at the 1080p Full HD resolution you need a GeForce RTX 2080 Ti, RTX 3070, or a Radeon RX 6700 XT. For 1440p gaming, an RTX 3080 or Radeon RX 6800 XT is required to achieve 60 FPS and beyond. If you're gaming at 4K then you'll need an RTX 4090 or RX 7900 XTX. Don't believe any benchmarks that are run inside the castle. Here there isn't much to see and your view distance is low. We've done our benchmark runs in the open world areas and FPS are much lower here. I picked a test scene that's demanding, but not worst case. Such high hardware requirements are fairly similar to what we saw in Dead Space and Forspoken, and I'm starting to wonder if we'll even see any well-optimized games this year. Lowering the settings is possible, but the range is rather small. Going from "Ultra" to "Low" you'll gain 10 to 20%, which isn't much. Sure, the game looks great on lowest settings, but I don't think that's the point of lower settings, rather they should offer a way for the game to perform well on lower-end PCs, even if it means compromising on the graphics quality—the gameplay will still be good and enjoyable.
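
The background compilation the review describes is essentially a shader cache being filled lazily. A toy sketch of the general idea (not how the game actually does it; compile_shader() and the pipeline names are made-up placeholders):

```python
# Toy illustration of lazy background shader compilation - not the game's actual code.
# compile_shader() and the pipeline names are hypothetical placeholders.
import threading
import time

cache = {}              # pipeline name -> compiled shader "binary"
lock = threading.Lock()

def compile_shader(name):
    time.sleep(0.5)     # stand-in for an expensive driver-side compile
    return f"binary::{name}"

def background_compile(names):
    # Fill the cache while the game is otherwise running smoothly.
    for name in names:
        binary = compile_shader(name)
        with lock:
            cache[name] = binary

def get_shader(name):
    with lock:
        if name in cache:          # fast path: already compiled, no hitch
            return cache[name]
    binary = compile_shader(name)  # slow path: compile on demand - this is the stutter
    with lock:
        cache[name] = binary
    return binary

threading.Thread(target=background_compile,
                 args=(["castle_essentials", "open_world_foliage"],),
                 daemon=True).start()

print(get_shader("open_world_foliage"))  # stutters if the background thread hasn't got to it yet
```

The on-demand path is the frame-time spike you feel in-game; the "compiling shaders" screen at startup is just the essential slice of that cache being filled up front.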
 
Just like in most other recent releases, shader compiling and stuttering is a problem in Hogwarts Legacy. [...] Even without shader trouble, performance requirements of Hogwarts Legacy are very high. [...] If you're gaming at 4K then you'll need an RTX 4090 or RX 7900 XTX.

Sure, just spend £1000+ on a GPU and £900+ on a CPU, only to still encounter the same stutters as people who spent <£700 on a GPU and <£300 on a CPU :cry:
 
Sure, just spend £1000+ on a GPU and £900+ on a CPU, only to still encounter the same stutters as people who spent <£700 on a GPU and <£300 on a CPU :cry:


Exactly mate, it's the hardware, not the game.

I'd recommend to those that spend time stuck in slow rush-hour traffic: getting a Bugatti Chiron will get you home faster...
 
Still waiting for an answer........



:cry:

If you want to ask stupid questions, you'll get stupid answers or no answers. We're here to discuss the 8GB cards, not rant off on a tangent about other cards. Also, quoting yourself to increase your post count doesn't help your argument either.
 
Ask stupid questions, get stupid answers or no answers. We're here to discuss the 8GB cards, not rant off on a tangent about other cards

But surely, since the 4090, 7900 XTX, 3090 and RDNA 2 GPUs have all that VRAM, they should be getting better performance, especially at 1440p? Or wait... just maybe they have run out of grunt, thus requiring reduced settings and/or a lower-quality DLSS/FSR preset. But no, instead let's make a big song and dance about how a £450, 2-3 year old mid-range GPU is struggling because of VRAM..... never mind a £1400+ new GPU ******** the bed too :cry:

If you can't see that point then there's no helping you, but then again, if you have spent £1k+ only to also get a **** experience, then of course people will be in denial over that, hence the lack of responses :)
 
Never really thought it would be an issue for me, but now with upscaled 4K - oh damn. Arkham Knight, now 8 years old, was pulling not far off 7GB. The 3070 still has plenty of life in it.
 
This argument has always bamboozled me. IMO it is the ultimate in 'defending your favourite brand at any and all costs': some people argue for stagnation, built-in obsolescence, and even overpricing, when at a certain price point one should expect a level of design and engineering befitting that price. Everyone expects that for everything else, just not Nvidia GPUs.

Insanity.
 