10GB VRAM enough for the 3080? Discuss..

My stance has been that neither of the titles in question shows graphical improvements that justify the VRAM usage, and that targeting an amount greater than Nvidia's flagship offers was akin to Nvidia's tessellation scam back in the day; a good move by AMD, as I've said before. Sure, if you want to use more VRAM for something such as RT caching then go ahead, but gimping titles simply by following sponsor money is not good for us as consumers.

Visual fidelity is an odd one. DLSS vs FSR, Nvidia wins. RT, Nvidia wins. Newer engines with optimised streaming and cache, it's a draw. It's only when we see the VRAM issue forced that AMD wins.

Remember the consoles currently consider 10GB of VRAM to be enough, while game engines tend to have hardware-vendor-specific paths. It's also worth looking at DirectStorage and Sampler Feedback, as these should also help keep VRAM requirements down.
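Rough toy illustration of why Sampler Feedback helps with that (all numbers made up, not real engine code): the engine keeps resident only the mip levels the GPU actually reported sampling, rather than a texture's whole mip chain:

```python
# Toy illustration of sampler-feedback-driven texture streaming.
# All numbers are hypothetical; real engines work per-tile, not per-texture.

def mip_size_mb(base_mb, level):
    # Each mip level is a quarter the size of the level above it.
    return base_mb / (4 ** level)

def resident_cost_mb(base_mb, levels):
    return sum(mip_size_mb(base_mb, lvl) for lvl in levels)

BASE_MB = 64            # hypothetical 4K texture at mip 0
FULL_CHAIN = range(8)   # naive streaming: whole mip chain resident
SAMPLED = [3, 4, 5]     # feedback says only distant mips were sampled

naive = resident_cost_mb(BASE_MB, FULL_CHAIN)
feedback = resident_cost_mb(BASE_MB, SAMPLED)
print(f"full chain: {naive:.1f} MB, sampled mips only: {feedback:.2f} MB")
# full chain: 85.3 MB, sampled mips only: 1.31 MB
```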

I purchased my 3080 for just one generation, as I understood how quickly hardware RT requirements would accelerate; as such it will end up on the shelf for lack of horsepower while its 10GB was still plenty.
I did say earlier that these settings are somewhat academic as they make little difference, but for academically testing hardware it is best to use the highest settings. HUB seem not to have followed their usual trend of maximum settings, and only for FC6?
 
Godfall was debunked; it turns out that when RT was turned on, the 6900 XT got absolutely hammered in the low FPS :cry: Will look for the videos later on.

The marketing job was done by then. BTW you can pick up a Steam key for £10 on cdkeys ATM.

What I found more cringeworthy at the time was AMD's attempt to cover up terrible RT performance with the lack of RT support in CP2077 at launch. Nvidia publicly commented that no proprietary code was used for RT, it was just DXR. CP2077 also managed to highlight the importance of DLSS, so no surprise AMD made that decision.
 

I enjoyed early AdoredTV as Jim went into great detail. These days I look forward to PCPer's podcast, including burger of the week. It's hard to find good tech tubers these days, as most, including HUB, RGT and MLID, appear to push clickbait more than anything. I'll admit HUB do decent monitor reviews though.
 
For the 3080 10GB? Only two sites, that's hardly multiple :p :D :cry: And so far only two users; well, three if you factor in me when rebar is enabled with no FSR.

Don't you see the flaw in the system-issue argument there..... You're saying it might be a system issue because you don't see issues on your end, yet here we are: I don't see issues on my end, hence why I am saying it's a possible system issue....

And nope, whilst I enjoyed it somewhat, no way am I paying more than £5 for it, especially since it will only be launched in order to record and upload footage for 10 minutes :cry:



Okay, I'll play Devil's Advocate using your example. Can you provide two tech sites (with frame time variance graphs like Computerbase/PCGamesHardware) showing frame time variance issues with Doom Eternal on the 6900 XT?
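For anyone wondering what those frame time variance graphs actually boil down to, here's a rough sketch of the maths (the trace below is made up; real logs come from tools like PresentMon or CapFrameX):

```python
# Rough sketch of the metrics behind frame time variance graphs.
# The trace below is hypothetical; real logs come from PresentMon/CapFrameX.
import statistics

def frametime_metrics(frametimes_ms):
    ordered = sorted(frametimes_ms)
    n = len(ordered)
    avg_fps = 1000 / statistics.mean(frametimes_ms)
    # "1% low" style figure: FPS at the 99th-percentile frame time,
    # so the slowest 1% of frames dominate this number.
    p99 = ordered[min(n - 1, int(n * 0.99))]
    return {
        "avg_fps": round(avg_fps, 1),
        "1%_low_fps": round(1000 / p99, 1),
        "worst_frame_ms": round(ordered[-1], 2),
    }

# A smooth ~60fps run with a single 200ms stutter spike:
trace = [16.7] * 99 + [200.0]
print(frametime_metrics(trace))
# {'avg_fps': 54.0, '1%_low_fps': 5.0, 'worst_frame_ms': 200.0}
```

The point being: an average FPS figure barely moves on a single bad spike, while the 1% low collapses, which is exactly why the frame time variance graphs matter.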

I did also say it could be a driver/game issue as well as a local system issue. Those videos you referenced are over a year old, so who knows what has changed in all the game and driver updates since then. It was a guess and my post is pretty clear on that.

Only 10 minutes! Afraid you might run out of video memory if you play for 60 minutes from the start of the game? Better get closing down those windows and disabling services in Windows, you'll need every spare GB you can get your hands on. :p

If you do the above, your claims that the game runs fine with 10GB at 4K max settings will hold some serious weight IMO. As it stands, the jury is against you by 9-1. :D

@Bill Turnip You simply have to test this now. :D
 

I will, but not until the middle of the week as I'm not home. @Nexus18 is there anything specific you want me to check/look for? I'll take my undervolt off and run stock, just for a clearer baseline.

Before I do this: my position is that I still maintain 10GB is generally enough, and I'm happy with it. However, there will be exceptional cases where it isn't, and I think this will be one of them.
 
I said that back at the start of this thread; it didn't do much good though. :cry:
 
Ladies and Gentlemen

Hi Paul!!!

I am calling BS then, Matt...

The HD textures are intentionally disabled for the benchmarks, as graphics cards with too little VRAM don't just run a little slower, they collapse completely. If HD textures are not possible on the hardware, this is marked accordingly in the charts.

/popcorn

10GB's plenty :p

Microsoft and Sony know better!


If this is what people are resorting to in order to win an internet argument then... oh wait, I'm not surprised.
 

The 6800 XT footage isn't that bad, i.e. probably not bad enough to be picked up on by those sites, despite it being shown to happen in several videos. That's why choc raised it: he has a 3090 and was wondering why he was getting spikes like the ones shown in those AMD videos.

If someone gives me their account I'll do 20 minutes at most; the game is boring and I'd much rather play something else with the small bit of free time I get at the moment. Certainly no way I am playing for hours :cry: I already have 29 hours in the game, which is too much for what it was.... And as stated, the only time I ever encountered the FPS drop to single digits was with rebar enabled and no FSR (and, as stated, when I played at 3440x1440 I didn't use FSR due to it harming IQ too much for my liking).


Not really, just play as you would, I suppose. The only things to note from my end are my specs and setup:

- 5600X UV -30 across all cores, dual-rank 2x8GB sticks with XMP bringing them to 3200MHz
- 3080 undervolted, 0.825V @ 1815MHz
- Windows 11 game mode enabled
- don't enable rebar (in my testing it caused FPS drops to single digits at 4K, and with FSR the FPS dropped after a while for no reason [in the same area])
- my Nvidia drivers at the time of recording and gameplay were installed with NVCleanstall using the minimal setup
- background windows: I let whatever starts up with Windows run in the background, though obviously I keep browser windows like Chrome closed due to them being RAM hogs

You can see what settings I used in my previous videos: all max except I disable things like motion blur, lens effects etc. You'll need to check whether I enabled CAS; I can't remember, I think I might have, but I'm not overly keen on it due to over-sharpening.
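If it helps, here's a minimal sketch of how you could log VRAM usage while playing so the numbers back the claim up. It just polls nvidia-smi once a second using its standard query flags; bear in mind nvidia-smi reports allocated VRAM, not what the game is actively using:

```python
# Minimal VRAM logger: poll nvidia-smi once a second while playing so
# you can see whether the game actually approaches the 10GB limit.
# Assumes nvidia-smi is on PATH and a single GPU; note it reports
# allocated VRAM, not what the game is actively touching.
import subprocess
import time

def sample_vram_mb():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = sample_vram_mb()
        print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
        time.sleep(1)
```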
 
I have since found a sweet spot for my 3070: a mix of High-Ultra settings, no HD textures and all RT effects on, but I am CPU bottlenecked.. unbelievable.

Further on in the opening part of the game, GPU usage drops to like 70%.

Yes, the game is a bit CPU limited once FPS gets much over 100. Even I see that at 4K Ultra settings if I try to use FSR Ultra Quality. It's not needed though, so I keep it off, as FPS is 75-95 anyway.

This game likes an Intel 12900K. I am excited to see what a 5800X3D can do here; I am expecting good things. :)

Thing is though, HUB ran the game just fine on the 6700 XT, and it would likely run well on a 3060 (which has 12GB VRAM).
 
Well, in a few months the 3080 will become a 4060/4070-tier card, and I wonder how much VRAM that will have. Be a bit of a slap in the face if the 4060/4070-tier card comes with more VRAM :D

Everyone would've sold their 3080s by then and all this would've been forgotten about.... maybe... :cry:
 
Not really the right place for this, but @Tired9, I had a quick look there to see RAM usage for other people with a 3070 at 1440p and, as I suspected, your RAM usage looks a bit off.


Based on history, a 4070 will match a 3090 and beat it comfortably in RT for about half the price :cry: :D The 4060/Ti will be the equivalent of the 3080.

I would expect the 4070 to have at least 10GB and the 4080 to have 12 or 16GB.

Personally I'll be getting a 4070 or 4080, depending on how much they cost and what I can get for my 3080, as my 3080 has already run out of grunt (at least it isn't hitting the FPS I want) in quite a few RT titles now (both at 4K and 3440x1440). Of course we could pick up a 3090 for about £500 and a half-eaten sandwich upon the 4070 announcement, but given a 3090 is also struggling in those same titles, it wouldn't make much sense... :cry:
 
 
Would it really though? I mean, you can get a 3060 with more VRAM than the 2080 Ti and I don't consider that a slap in the face; VRAM typically just increases with later generations, and having more VRAM isn't much use if you don't have the grunt to go with it.
 
Whatever we conclude in this thread, sadly it does seem that for the foreseeable future DF will remain the only reviewer to check for things like texture stream quality and the frequency of flushes and restreams. The rest of the industry seems happy enough to just check FPS.
 