10GB VRAM enough for the 3080? Discuss...

I’m playing HZD at the moment so I’ll take a look at that this evening.

One thing I did notice last night was that, with FPS capped for G-Sync and everything maxed out at 4K, my lowly 3090 FE only got up to 230W even with a hefty OC, with a max temp of 60C.

Thought I’d read somewhere that this game was taxing on systems but it’s clearly not.

Not sure it could manage 120Hz but I might try it. Only played the Doom games at 120Hz so far.
 
I forget who posted it, but they referenced a Reddit thread from a 5700 XT user and noted that the game was changing his texture quality, as shown by different screen captures, regardless of what was set in the video options.

Had a quick look through this thread but can't find it now. Hopefully whoever it was sees this post and pipes up.
 
Thing is, until it is proven either by a trusted source or reproduced multiple times, anyone can say X or Y on Reddit etc., and that does not make it so.

@LtMatt do you agree that DLSS looks better than native? :D
 
It will be a long time before review sites start testing for it, I suspect. It'll just be checking the dedicated/allocated usage and calling it a day. It is what it is, but that testing methodology is not sufficient anymore.
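
To illustrate what I mean, something like this rough Python sketch is basically all that methodology amounts to (assuming an Nvidia card with nvidia-smi on the PATH); it only tells you how much memory has been allocated, not how much the game actually needs:

Code:
import subprocess
import time

def vram_used_mib() -> int:
    # nvidia-smi reports the currently allocated ("dedicated used") memory in MiB
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used = vram_used_mib()
            peak = max(peak, used)
            print(f"used: {used} MiB (peak so far: {peak} MiB)")
            time.sleep(1)
    except KeyboardInterrupt:
        print(f"peak allocation this session: {peak} MiB")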

If you've experienced video memory saturation you'll know it's not the best, and I certainly did in multiple games with 8GB.

The old saying goes, though: better to have something and not need it than to need something and not have it. That certainly applies to video memory for me, having come from an 8GB card.

I'll take native any day, but YMMV.
 
Yeah, I agree it's better to have it. But when prices are what they are right now, things change a little. Also, as I say, if you know for a fact that you will be selling and moving on to next gen rather than keeping it long term, it is even less of an issue.


I'll take native any day, but that's just me.
Agreed. I have been saying this from the start. Some did not like it :p

But the point is that many believe this to be true. That does not mean it is.

That said, I do like DLSS 2, and with some more improvements I do think it will become a must-have feature in the future.
 
A little off topic, but given recent posts from @Smiffy-UK: the GDDR6 used on the MBA 6800 XT and 6900 XT has a maximum temperature of 100C.

Once you hit this temperature the GPU will throttle core and memory speed until temperatures drop, or you alt-tab out to reapply a GPU tuning profile.

The temperatures below are from a 3-hour session of Red Dead Redemption at Ultra settings at 4K, in a 23C room, with the GPU fans effectively silent at 1500RPM, which is the default 6900 XT fan speed.

The maximum temperatures are on the right and are peak values; in-game temperatures are typically a couple of degrees lower.

The 6900 XT memory junction temperature peaks at 94C, 6C away from throttling temperature. The core junction temperature peaks at 104C, also 6C away from throttling temperature.

The difference between edge and core junction temperature is 14C at peak, which indicates excellent contact between the heatsink, thermal pad and GPU die.
[Screenshot: temperature readings from the session]


People worry about temperatures, but as long as you're below the maximum temperature it will not affect the life span of the GPU. Voltage will kill your GPU faster than temperature will.
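
If anyone wants to sanity-check their own logs against those limits, this is a rough Python sketch of how I'd do it: it reads a HWiNFO-style CSV export and reports peak temperatures against the throttle points discussed above (100C memory junction, and 110C core junction going by 104C being 6C away). The filename and column names are placeholders, so rename them to match whatever your logging tool actually writes out.

Code:
import csv

# Assumed throttle limits from the discussion above; the edge temperature has no
# limit applied here. Column names are placeholders for a HWiNFO-style export.
LIMITS_C = {
    "GPU Temperature [C]": None,           # edge
    "GPU Hot Spot Temperature [C]": 110,   # core junction
    "GPU Memory Junction Temperature [C]": 100,
}

def peak_temps(path):
    peaks = {name: None for name in LIMITS_C}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for name in LIMITS_C:
                try:
                    value = float(row[name])
                except (KeyError, ValueError):
                    continue  # column missing or non-numeric sample
                if peaks[name] is None or value > peaks[name]:
                    peaks[name] = value
    return peaks

if __name__ == "__main__":
    for name, peak in peak_temps("hwinfo_log.csv").items():
        if peak is None:
            continue  # sensor not present in this log
        limit = LIMITS_C[name]
        headroom = f", {limit - peak:.0f}C below throttle" if limit else ""
        print(f"{name}: peak {peak:.0f}C{headroom}")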

Is your memory overclocked? And are you using stock or high power limit?
 
2400MHz core clock, 2112MHz memory, 0% power limit. No increased power limits; these are my 24/7 gaming clocks. I favour low RPM fan speeds over low temperatures. My 420mm AIO runs at a peak of 1000RPM.
 
Can you try to increase the power limit and check again?
I'm curious if the memory temps will move into throttling range.
 
No, the core junction is the first to throttle if I increase voltage and don't increase fan speed to keep things in check. I've never seen the memory junction go past 95C, AFAIK.
 
Doesn't look like that on a 3080 though, so it would need more testing than some randomer on Reddit. Does it happen with other 8GB cards like the 2070S, 3070 etc., or is it a 5700 XT issue?
 
It will get worse over time, not better. As new games are released that are developed for the new consoles, VRAM required will inevitably increase. 3080 is a ticking time bomb @ 4k.

On the one hand, consoles will only use up to 10GB of VRAM for textures and other graphical content. On the other hand, deep down I know you are right: straight-up ports without enhanced texture packs may be OK, but games enhanced for PC with PC texture packs will, I fear, make the 3080 feel obsolete.

Look at FF15, a game made in 2016: when using its high-res texture pack, the 1080 Ti and the Pascal Titan were the only Pascal cards able to handle it properly, and even then you had to disable Nvidia Turf Effects.

I am in agreement with Hardware Unboxed in regards to RT, meaning I think that if only AMD could meet MSRP, the AMD cards would be the better buy this gen.

We all also know deep down, though, that a 3000 series card with 16GB+ of VRAM and the horsepower of the 3080 is going to be nowhere near £650, so personally I would rather have my 10GB 3080 at this price than a higher-VRAM option which would likely be out of my budget.
 
Are there any other games that show similar issues to HZD? HZD was/is notorious for texture pop-in.

Doesn't look like that on a 3080 though, so it would need more testing than some randomer on Reddit. Does it happen with other 8GB cards like the 2070S, 3070 etc., or is it a 5700 XT issue?

This
 
We all also know deep down, though, that a 3000 series card with 16GB+ of VRAM and the horsepower of the 3080 is going to be nowhere near £650, so personally I would rather have my 10GB 3080 at this price than a higher-VRAM option which would likely be out of my budget.
Exactly this. The performance is more important than the VRAM right now. By next year we will be rocking next-gen cards with more VRAM anyway, so I am well happy they went with 10GB to give us all a lower-cost option.


That was it, good find. That is how you test it: you need two or more GPUs with varying amounts of video memory.
That just seems like confirmation bias. For all we know, that could be 4K8K. You do remember him, don't you? He kept going on about AMD having much better image quality than Nvidia, then when we dug deeper it turned out he was using a decade-old or older laptop with an Nvidia card as the comparison. Lol.

Until we get many people here to confirm it, or a reputable website/YouTuber, it is as credible as 4K8K's scientific paper :p

Funnily enough, is he not rocking a 3080 these days? Haha.
 
I don't believe that user is a member here, and his postings seem sensible to me, so I don't think his findings should be ignored just because they go against a narrative.

I definitely agree that it needs to be tested, but I very much doubt there is any appetite to do it. True VRAM requirement testing needs measurement of 0.1%/1% lows and frametimes, and now looks to need image quality comparisons and pop-in tests comparing different GPUs with differing memory amounts, as well as the obligatory dedicated/allocated VRAM usage. It's a lot of work.
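
For what it's worth, the frametime part of that is easy enough to script once you have a capture. This is a rough sketch of what I mean (the filename is a placeholder, and it assumes one frametime in milliseconds per line, e.g. reduced from a CapFrameX/OCAT/PresentMon export). Bear in mind tools differ in how they define the lows; this uses one common definition, the average FPS of the slowest 1% / 0.1% of frames.

Code:
import statistics

def fps(frametime_ms):
    return 1000.0 / frametime_ms

def low(frametimes_ms, fraction):
    # average FPS of the slowest <fraction> of frames (one common "lows" definition)
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    return fps(statistics.mean(worst[:n]))

if __name__ == "__main__":
    with open("frametimes_ms.txt") as f:
        samples = [float(line) for line in f if line.strip()]

    print(f"average: {fps(statistics.mean(samples)):.1f} FPS")
    print(f"1% low: {low(samples, 0.01):.1f} FPS")
    print(f"0.1% low: {low(samples, 0.001):.1f} FPS")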

I had an 8GB and a 16GB GPU in the 5700 XT and Radeon VII, but Horizon Zero Dawn/Black Ops were not released back then. What I can say is that the 5700 XT had worse 0.1%/1% lows and hitching in some games where VRAM usage was close to 8GB, whereas the Radeon VII didn't.

If you don't have two cards with different memory sizes, you can't really test it. I didn't think the 5700 XT had a problem until I tried a GPU with double the memory; then I began to see and measure differences. People can be blissfully ignorant sometimes.

It's easy to use one GPU and say there's no issue, and maybe there isn't, but sometimes you don't notice these things until you try something different - that is my point, I guess.

At the time I did not know that games were coming out that would adjust image quality automatically regardless of what was set in the video options. COD BO even mentions this in the video options, I believe.

I wish I still had a 5700 XT as I'd love to do some comparisons with the Radeon VII or the 6900 XT, but I sold it and there's no chance of me getting another at a sensible price anytime soon.

I very much doubt this is just a problem with the 5700 XT, by the way, as has been mentioned in this thread several times.

I'm not sure if @tommybhoy will want to jump in here as this was a private conversation offline, but he has mentioned that frame times on the 3070 can be all over the place once you get near to 8GB utilisation, which was similar to what I saw with the 5700 XT in a few games that used lots of video memory.
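
If anyone fancies digging into that, this is roughly the sort of check I mean: flag frames that take several times the median frametime and note whether VRAM usage was near the card's limit when they happened. It is only a sketch; the CSV layout, the "frametime_ms" and "vram_used_mb" column names, the filename and the thresholds are all placeholder assumptions, so adapt them to however you have logged the data.

Code:
import csv
import statistics

VRAM_LIMIT_MB = 8192        # e.g. a 3070 / 5700 XT class card
NEAR_LIMIT_FRACTION = 0.95  # "near to 8GB utilisation"
SPIKE_FACTOR = 3.0          # a frame taking 3x the median counts as a hitch

def find_hitches(path):
    with open(path, newline="") as f:
        rows = [(float(r["frametime_ms"]), float(r["vram_used_mb"]))
                for r in csv.DictReader(f)]
    median_ft = statistics.median(ft for ft, _ in rows)
    for i, (ft, vram) in enumerate(rows):
        if ft > SPIKE_FACTOR * median_ft:
            tag = ("near VRAM limit"
                   if vram >= NEAR_LIMIT_FRACTION * VRAM_LIMIT_MB
                   else "VRAM headroom left")
            print(f"frame {i}: {ft:.1f} ms (median {median_ft:.1f} ms), "
                  f"{vram:.0f} MB in use - {tag}")

if __name__ == "__main__":
    find_hitches("capture_with_vram.csv")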
 
Until someone else posts a video/image showing the exact same thing with a different 8GB GPU, you can't say for definite that it is because of the smaller VRAM though.

Looking at other comparisons, I can't see any difference, but then YouTube compression...


Remember gregster posted a video comparing his AMD and Nvidia cards in games, where his Nvidia card had worse IQ; turns out that was down to his Nvidia drivers and/or settings and/or not restarting the game after applying the settings, or something.
 