
Is 16GB of VRAM the Standard for High-End Graphics Cards?

Status
Not open for further replies.
Really...... :cry:

4k has only really taken off in the last 2-3 years, and from a PC gaming perspective make that the last 1-2 years, mostly since RDNA 2 and Ampere, along with FSR/DLSS, made 4k gaming somewhat more of a reality (if it weren't for DLSS/FSR, we would still be chasing 4k60 with max/high settings...). 8k is not going to be a thing any time soon, bearing in mind that 4k was first announced/released in 2012 iirc. Heck, it could be argued that 4k is still pretty niche for PC gamers:
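Just to put numbers on how big each resolution jump actually is, here's a quick back-of-the-envelope pixel-count comparison (plain arithmetic, nothing from any particular benchmark):

```python
# Pixel counts of common gaming resolutions, relative to 1080p,
# to show why each step up is such a large jump in GPU load.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3440x1440": (3440, 1440),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

def pixel_count(name):
    w, h = RESOLUTIONS[name]
    return w * h

base = pixel_count("1080p")
for name in RESOLUTIONS:
    px = pixel_count(name)
    print(f"{name:>10}: {px:>10,} pixels ({px / base:.1f}x 1080p)")
```

8k is sixteen times the pixels of 1080p and four times 4k, which is why rendering it natively is so far off.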



And either way, as you said yourself, 8k won't even be a serious thing until the next consoles or even the consoles after that....




Since getting my QD-OLED 3440x1440 display yesterday, I'll be gaming more on this than my 4k LG OLED going forward, as despite the lower resolution, the overall IQ is superior and the experience more enjoyable. Still going to upgrade to a 40xx-series card for the much better RT grunt ;)

EDIT:

According to the Steam hardware survey for VRAM:

Remember the current-gen consoles are sold as 8k ready, and the mid-gen refreshes (PS5 Pro and Xbox Series X Pro) will be out for sure once many more 8k TVs are available, just as happened with the PS4 Pro: by its year of release, 4k screens had become the norm to buy and there were hardly any 1080p TVs left. The same will happen again this time. To me, 8k needs a screen of over 100 inches to even be worth using at normal viewing distance, and I really think 150 inches will be the sweet spot for 8k in the living room, but UK homes don't have the space for them. Even 75-inch and 85-inch sets mean sacrificing where you put them in most homes, and over the fireplace doesn't count, lol. People who do that learn the hard way later: the middle of the screen should be at normal seated height, with your eyes level with the centre of the screen. Any higher or lower is not right and means neck ache or a very poor experience.
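For anyone curious, the screen-size claim can be sanity-checked with a rough pixels-per-degree (PPD) calculation. This is just a sketch: the 2.5 m viewing distance and the ~60 PPD acuity threshold are illustrative assumptions, not measurements:

```python
import math

# Pixels per degree at a given screen diagonal and viewing distance.
# Above roughly 60 PPD you exceed typical 20/20 visual acuity, at which
# point extra resolution is hard to perceive.
def ppd(diagonal_inches, distance_m, horizontal_pixels, aspect=(16, 9)):
    aw, ah = aspect
    width_in = diagonal_inches * aw / math.hypot(aw, ah)  # screen width from diagonal
    width_m = width_in * 0.0254
    fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return horizontal_pixels / fov_deg

for size in (55, 75, 100, 150):
    print(f'{size}" @ 2.5 m -> 4K: {ppd(size, 2.5, 3840):.0f} PPD, '
          f'8K: {ppd(size, 2.5, 7680):.0f} PPD')
```

On these assumptions, 4k only drops below the ~60 PPD threshold around the 150-inch mark at 2.5 m, which lines up with the "8k needs a very large screen" argument.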
 

They might be 8k ready, but it isn't going to be a good gaming experience at all unless these upscalers get even better... Digital Foundry have already shown very well how various console games sacrifice graphical settings, use adaptive resolution, or drop RT settings entirely. PC-wise, we are already seeing the 3090 barely able to do 4k60 with RT, even with DLSS.

I would say we are at least 2 console gens, and maybe 3 desktop GPU gens, away from current graphical fidelity (including "some" RT) at 8k60 with DLSS/FSR.

I hear you on TV placements! Never get how people can put TVs up that high, awful placement :o
 

You're spot on about 8K adoption. Loads of people are still on 1080p TVs, and most media is still 1080p. High-end GPUs struggle with high-refresh-rate 4K at maxed settings, and very few people can afford them. 8K, like 4K, will be niche for many years to come. It's exacerbated by the fact that 4K is already very good, so even fewer people will be motivated to buy into 8K for a less obvious improvement in quality at a high cost. I'll look at it again around 2030 ;)
 


There are already one or two 8k games on the PS5, but they are games that can easily run at 8k on PC too.

 
I thought it was just the one game?



While 10 GB can be a problem in the odd game (or is down to system resources), more than 16 GB doesn't help in any game released thus far.
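To put a rough number on why resolution alone doesn't blow past 16 GB: the raw render-target cost scales with pixel count, but it's small next to textures and assets. A simplified sketch (RGBA8 and a single buffer are assumptions; real engines keep several targets plus depth/G-buffers, and assets dominate):

```python
# Rough estimate of raw render-target memory per resolution.
# Ignores textures, geometry and driver overhead, which make up
# the bulk of real-world VRAM use.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=1):
    """MiB for `buffers` render targets at the given resolution."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per RGBA8 target")
```

Even an 8k RGBA8 target is only on the order of 127 MiB, so running out of VRAM is really about assets and how the game streams them, not the framebuffer itself.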

Also fixed.
 
Wasn't sure that was noticed:p

I am uneducated remember.. :rolleyes:

TPU are not on the trusted sources list sadly gents, along with anyone else that highlights the issue.

So what we are saying is it never got debunked? Just distorted and the usual narrative to suit agenda then. Glad we cleared that one up! :cry: #debunkingownbs

LOL @LtMatt forgot about them trusted sources! They're the same sources, though, that have mentioned VRAM issues a few times now, and we are seeing some creep from one game to many games.

Some of these guys are upgrading their panels, I noticed. It will be pleasant to see (or maybe not, where denial is present) how the 'bottleneck', as TPU put it, on the VRAM handles the rest of 2022/23 when MOAR games come out, eh?

#notmyproblemasiupgradedtoa40seriescardbutwonttelltheguyzitwasbecausevramlol
 
I don't doubt TPU's results/comments, as they are a good, reputable reviewer, but the problem, as per usual, is that if comments are made, at least post some evidence to show what the issue is. Why is that so hard to do and, more importantly, to understand?

Not going into FC 6 debate as no point since it has been done to death and people are still unable to debunk my points or provide correct answers to my questions :cry:


As for Doom Eternal, I played it through on my 4k OLED and there were no issues with performance at all, at least nothing that immediately made me go "oh, VRAM shortage"; it looks like the game properly manages its VRAM allocation.

Having a quick look at YouTube for footage (surprisingly not many 3080 vs 6800 XT videos for Doom Eternal?), I'm not really seeing the "VRAM bottleneck" in 3080 footage:


Also, 6800 XT performance from what I can see looks lower on the whole, and note those fps latency spikes... Must be VRAM...


Currently uploading footage from my 3080 (both with DLSS on and off); will do a longer playthrough at the weekend. But it looks to match the above, certainly no worse than the 6800 XT.

Once again, it seems to be another case of people picking select quotes to suit their narrative or piggybacking on other people's comments with no evidence whatsoever, and best of all coming from people who don't even own the GPU... :cry: (that bit is obviously not referring to you, tommy, since you do have the GPU yourself; speaking of which, have you noticed any issues with Doom Eternal? If so, got some footage?)

I hope they aren't just going by VRAM "consumption" and assuming "oh dear, using all the VRAM, obviously need more"... :o If they are, it seems they also need some education on that, since allocated VRAM is not the same as what the game actually needs.

TPU are not on the trusted sources list sadly gents, along with anyone else that highlights the issue.

Weren't you crying about them not using SAM not so long ago and saying their results can't be trusted? :cry: ;) :p :D
 
LOL @LtMatt forgot about them trusted sources! They're the same sources, though, that have mentioned VRAM issues a few times now, and we are seeing some creep from one game to many games.

Some of these guys are upgrading their panels, I noticed. It will be pleasant to see (or maybe not, where denial is present) how the 'bottleneck', as TPU put it, on the VRAM handles the rest of 2022/23 when MOAR games come out, eh?

#notmyproblemasiupgradedtoa40seriescardbutwonttelltheguyzitwasbecausevramlol

Funny thing is, it's you who keeps piggybacking on other people's comments and making out like it is an issue in loads of games, yet we have several 3080 owners on this forum saying they have had no VRAM issues at all. So why do you keep insisting there are issues when there is still no solid evidence to show it? Perhaps starting to feel buyer's remorse over the 3090's price-to-performance, eh?


The only game to date which is "potentially" showing issues is FC 6, but again, given the whole thread on that, it's not exactly a clear-cut case of VRAM given all the points and "facts" raised, which you and two others keep ignoring because it doesn't suit the narrative.

PS.

Upgrading panels: guessing you are referring to me (which is odd, as supposedly I am on ignore...). If you understood what resolution is and how it impacts performance and VRAM, you would realise that while many of us are upgrading to a better "panel", the resolution is staying the same for a lot of us, i.e. we're staying at 3440x1440 or 4k (both of which have been my main resolutions for a long time now).

@LtMatt @gpuillible:p

On the topic of how the 3080's VRAM and RT cause a bottleneck according to W1zzard's findings, it instantly turns into a "but but 6800 XT vs 3080 performance" deflection. :cry:

It's not a deflection; it's a simple question: he has made that comment, but where has it come from, and where are the findings? Has he shown this in any reviews or posts, or explained why he came to that conclusion? Again, why is it so hard for people to post things to back up their points? It's like all the times people piggyback on some random Reddit comment and go "oh look, believe this!", i.e. the HZD fiasco, which was quickly debunked...

I raised the 6800 XT vs 3080 because he clearly states, as per the comment above, that the 3080 is running out of VRAM in Doom Eternal, yet based on my own experience I haven't seen this causing issues. Hence why I bring up the 6800 XT: it offers comparable performance (especially since the RT is light, so it doesn't buckle) but with 16GB of VRAM. In theory, if VRAM were causing issues, the 3080 would be dropping in fps, performing worse than a 6800 XT, or showing stuttering. Right?

Again, let's please use some logic; it's not hard.
 
Can you all give it a rest? Please have a read of the GPU section rules; debate is good, but continual childish remarks are not. Also, putting smilies after comments will not save you.
 