
Is 16GB of VRAM the Standard for High-End Graphics Cards?

My Doom Eternal videos, as mentioned. I'll upload longer footage when I get more time; they are still processing, so the HD/4K versions probably won't be finished until later tonight:



I count one area where the FPS drops to 57, with no frame-latency spike/stutter at that point, which would be unusual if the VRAM had actually run out; i.e. nothing like what we see in the 6800 XT video above:

[screenshot: frame-time graph]
 
They saw the error of their ways and now enable SAM/ReBAR, I believe; just HUB to go now. :D

HUB are also enabling SAM/ReBAR now, at least they were for their last comparison of the 3070 vs the 6700 XT.

Edit:

Which is not exactly a good thing if forcing it on Nvidia GPUs harms performance, as I experienced with FC 6 at 4K with no FSR.
 
My YouTube videos are now in HD. I might do a different area later, but I think that is the most demanding one? Or would inside the metal/glass building be better, as IIRC there is more RT happening there?

It's actually quite interesting seeing the allocated vs dedicated VRAM there: dedicated VRAM is 8.9+ GB with no issues in frame latency/drops at all (nothing serious standing out compared to other GPUs?). Now that is what you call optimisation :)
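As an aside, the frame-latency check above can be done numerically from a capture tool's frame-time export (CapFrameX, PresentMon, etc.). A minimal sketch, assuming frame times in milliseconds; the 2x-median threshold is an arbitrary illustrative choice, not any tool's official stutter definition:

```python
def find_stutters(frame_times_ms, factor=2.0):
    """Return indices of frames whose frame time exceeds `factor` x median.

    A sustained run of flagged frames is the classic symptom of VRAM
    oversubscription; an isolated FPS dip with flat frame times
    (like the drop to 57 above) usually is not.
    """
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2
              else (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    threshold = factor * median
    return [i for i, t in enumerate(frame_times_ms) if t > threshold]

# ~60 FPS baseline (~16.7 ms per frame) with one big hitch at index 3
log = [16.7, 16.6, 16.8, 95.0, 16.7, 17.5]
print(find_stutters(log))  # [3]
```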
 
8K content? Come on guys... We won't see any adoption of 8K apart from some niche showcases. It's hard enough getting 4K content in movies, never mind games. PC gaming will always be constrained by the console generations, which are running something like a 2070 Super. As the console generation continues to age, they won't even be able to keep up with a hard 60 FPS lock. Watch: in 2-3 years the graphics will improve slightly, but they will be tied to 30 FPS.
 
Should have got the 6700 XT @TNA; clearly you are missing out on all the benefits of that extra VRAM and GOTY FC 6... oh wait :D

[screenshot: FC 6 benchmark]


SAM enabled too so no excuses ;)


Also, thanks for bumping the thread and reminding me about this gem regarding Doom Eternal and the "so-called" issues at 4K with 10GB of VRAM... Computerbase.de won't be a trusted source now... ;) :p :cry:

[screenshot: Computerbase Doom Eternal benchmark]

Look at those "MASSIVE" gains of the 3090 too @TNA, money well spent... ;)


It's already happening, and has been for a good few months now; just watch DF to see how many graphical settings are being reduced, how RT settings are being sacrificed, and how adaptive resolution is used, where the majority of the time they aren't locked to 4K.
 
 

"True" 8K, sure, we won't see it, but we are right on the cusp of 8K + DLSS/TSR/FSR, which next-gen cards will absolutely be able to handle, and that's exactly the sort of scenario where compute power will outrun VRAM (looking at today's sub-16 GB cards).

Also, don't count the consoles out just yet; there's a high chance they do the same thing for their "Pro" console releases in 2-3 years, because 8K is a huge marketing selling point and the TV market is starting to push it more and more as well. For example, TCL, now a big seller approaching the #1 spot in the TV market, has changed its mainstream best-selling "6 series" from 4K with decent FALD to 8K, instead of just keeping it 4K and making slight adjustments to the dimming zones and algorithm. Why? Easy: 99% of the people buying their displays are completely clueless about them, but they will absolutely pick up on 4K vs 8K.

FSR 2 Ultra Performance could absolutely do a good job scaling to 8K from just 1440p, so there's no need to go over the top on specs to achieve that. Watch this:
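For reference, the 8K-from-1440p claim lines up with FSR 2's published per-axis scale factors. A quick sketch; the mode table below is taken from AMD's FSR 2 documentation, so treat the exact factors as an assumption if they change between versions:

```python
# Per-axis scale factors for FSR 2 quality modes (per AMD's FSR 2 docs)
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output size and FSR 2 mode."""
    scale = FSR2_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

print(render_resolution(7680, 4320, "Ultra Performance"))  # (2560, 1440)
print(render_resolution(7680, 4320, "Quality"))            # (5120, 2880)
```

So Ultra Performance at an 8K output really does render internally at 1440p.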

 
Sony, LG, Panasonic and Samsung can sell 8K-capable displays all day long; I'm willing to bet there will be 1% of content available to watch on them.

TV sales face the same dilemma as mobile phones: they push the tech as their only selling point, but no one is making content to fill it. Apple and Samsung just add a better lens to the camera and call it a day.
 


Is anyone streaming in 8K yet? The only 8K content I can think of at the moment is the odd YouTube video.
 
12GB is enough at 4K in current titles from what I've seen.
8GB is enough at 1080p from what I've experienced :D

Yep. Here's what Steve had to say:

Oh boy! Did Far Cry 6 stir the pot in our last benchmark?!

I mentioned how AMD fans didn't like us comparing the 6700 XT with the RTX 3070. Well, boy oh boy, did the Nvidia radicals not like us enabling the HD texture pack in Far Cry 6!

Despite the fact that the 6700 XT can still deliver over 60 FPS at 4K with the HD textures enabled, we were told it's highly unrealistic and that we only did this to make the GeForce GPU look bad. The truth is, though, we've always tested games that offer HD texture packs with them enabled, especially when testing high- to mid-range hardware, so this was nothing new. HD textures shouldn't impact performance, or at least shouldn't if you have enough VRAM. It's a free visual upgrade, and for most people it's well worth using; again, if you have enough VRAM, which you very much should when spending $500-plus on a graphics card!

Those of you gaming at 1080p with either the 6700 XT or the RTX 3060 Ti really need not worry about VRAM in this title, even with the HD texture pack enabled, as both worked really well, with the Radeon GPU offering up to 10% more performance.

Rather, it's the 4K data that's a problem for Nvidia, as here the 3060 Ti was unable to deliver playable performance while the 6700 XT sailed along quite nicely without any issues, thanks to that larger 12GB VRAM buffer, which in our opinion should be the bare minimum at this price point. Bizarrely, Nvidia's slower non-Ti RTX 3060 does pack a 12GB VRAM buffer, so go figure.
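For a rough sense of why an HD texture pack eats VRAM, here is the standard mip-chain size calculation. A sketch with illustrative numbers, not FC 6's actual asset sizes; 1 byte per texel corresponds to BC7 block compression (128 bits per 4x4 block), the usual format for colour textures:

```python
def texture_vram_bytes(width, height, bytes_per_texel, mips=True):
    """VRAM footprint of one 2D texture, optionally with a full mip chain."""
    total = 0
    w, h = width, height
    while True:
        total += w * h * bytes_per_texel
        if not mips or (w == 1 and h == 1):
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return total

# One 4K (4096x4096) BC7 texture with mips: ~21.3 MiB.
# Uncompressed RGBA8 (4 bytes/texel) would be ~85.3 MiB.
size = texture_vram_bytes(4096, 4096, 1)
print(f"{size / 2**20:.1f} MiB")  # 21.3 MiB
```

Multiply by the thousands of textures an HD pack swaps in and the extra gigabytes of demand add up quickly.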
 
You guys do realise that the debate in the original FC 6 thread was originally about the 3080 and 10GB of VRAM (something HUB did not report issues with in their recent 3080 vs 6800 XT video)? Talk about moving the goalposts again :cry: It just cements my points even more :D

Amazing how no other game shows similar FPS drops/issues at 4K even when using all the VRAM (games which arguably look better and have more going on, etc.), except for one AMD-sponsored title, which on release just happened to "require" a magic number of 16GB of VRAM. But nope, let's not use logic or reasoning, as obviously this isn't the right place for it.


Also, on the video at hand: UK pricing actually puts the 3070 in the same bracket as the 6700 XT, not to mention that's the card AMD themselves compared it to on announcement/release. The only reason Steve compared the 3060 Ti to the 6700 XT is that the fanboys were crying about it, hence the above video, and he still recommends the 3060 Ti over the 6700 XT :cry: But nope, ignore all that :cry:



Also, is no one going to debate my Doom Eternal debunking above? Or, as per usual, is there nothing of proper substance to come back with?

 

+1 on the 8GB @ 1080p.

Must be a local system issue with HUB's test system @gpuerrilla @tommybhoy. :cry:

Basically anything but the VRAM. Other classics are "it runs out of horsepower way before you saturate the VRAM" and "it needs more ray tracing".

Steve flagged Ghost Recon as heavy on the VRAM in that vid too! :eek:

Won't be long until the next "wahaaaay, it uses less than 10GB" vid; bricking it, we are! :p
 