10GB vram enough for the 3080? Discuss..

It's still a great card. Just thought that may have been the problem; I've got thermal pads coming this week and I intend to fix my 3080.

I was playing Star Trek Online with my friend for about six hours the other day. When we finished, I checked my thermals and saw that my card had been thermal throttling while playing. That's bad for a card that cost me £950; Gigabyte haven't done well this time. I won't be going back to them.

I would RMA it, but I doubt I'd get the card back or a replacement. I just wish the UK had a right to repair.

Have you tried a custom fan curve?
 
They didn't test image quality

Some new games downgrade image quality dynamically based on the VRAM buffer, so the 6800 XT can end up with better image quality than the RTX 3080. Until reviewers add image quality to their testing, it's not a like-for-like comparison.

Though I suppose things like texture quality and texture pop-in don't have much effect on performance, so if you only care about how many frames you get, then so be it.
Your good point got missed in all the banter.

Testing VRAM requirements just got a whole lot more complicated due to games now starting to automatically lower image quality and reduce LOD depending on the amount of video memory available.
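Here's a rough Python sketch of the kind of heuristic an engine might use (the tiers, pool sizes and headroom are made-up numbers, not any real game's logic):

[code]
# Hypothetical VRAM-aware quality scaling; all numbers are illustrative.
TEXTURE_TIERS = [
    ("ultra", 9.5),   # approx. GB of texture pool each tier wants
    ("high", 7.0),
    ("medium", 5.0),
    ("low", 3.0),
]

def pick_texture_tier(free_vram_gb, headroom_gb=1.5):
    """Return the highest tier whose pool plus headroom fits in free VRAM."""
    for name, pool_gb in TEXTURE_TIERS:
        if pool_gb + headroom_gb <= free_vram_gb:
            return name
    return "low"

# A 10GB card with ~9GB free silently drops to "high" while a 16GB card
# keeps "ultra" - same settings menu, different output, same benchmark bar.
print(pick_texture_tier(9.0))   # -> high
print(pick_texture_tier(15.0))  # -> ultra
[/code]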
 
I was playing Control on my 6800 XT at 3440x1440 and noticed the VRAM was at 12GB. On the face of it, it seems that 10GB isn't enough for the resolutions these cards are aimed at.

Doesn't mean it's using those 12GB. Not saying that it can't, but memory isn't micro-managed, especially if it has more to play with.
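If anyone wants to check this on an NVIDIA card, here's a quick sketch using the NVML bindings (the nvidia-ml-py package); bear in mind the "used" figure it reports is allocated memory, not what the frame actually touches, which is exactly why an overlay reading 12GB proves little:

[code]
# Report VRAM allocation on the first NVIDIA GPU via NVML.
# 'used' here is *allocated* memory, not the active working set.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {mem.total / 2**30:.1f} GiB")
print(f"allocated: {mem.used / 2**30:.1f} GiB")  # what overlays call 'usage'
print(f"free:      {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
[/code]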
 
Doesn't mean it's using those 12GB. Not saying that it can't, but memory isn't micro-managed, especially if it has more to play with.

The more memory it has to play with, the more it has to manage. Housekeeping takes longer the more memory is in use, while doing it less often means each pass takes longer, resulting in stutter.
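As a toy illustration (all numbers invented): if each housekeeping pass costs time proportional to the memory it has to walk, then running it less often just means each pass is bigger, and it's the single-pass time that shows up as a stutter.

[code]
# Toy model of the housekeeping trade-off; all numbers are invented.
COST_MS_PER_GB = 2.0        # assumed cost to scan/evict 1GB of assets
STALE_GB_PER_SECOND = 1.0   # assumed rate at which assets go stale

def worst_pause_ms(passes_per_second):
    """Fewer passes per second -> more stale data per pass -> longer pause."""
    gb_per_pass = STALE_GB_PER_SECOND / passes_per_second
    return gb_per_pass * COST_MS_PER_GB

print(worst_pause_ms(10))   # frequent small passes: 0.2 ms each
print(worst_pause_ms(0.5))  # one big pass every 2s: 4.0 ms spike (stutter)
[/code]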
 
The 10GB of VRAM on the 3080 is only going to be an issue in extreme cases. I have a 3080, and in some modern games at 4K I have to compromise on settings just to keep the lows over 40FPS. So the 3080 will run out of GPU grunt long before the 10GB becomes a problem.

Now the 3070, on the other hand, is a decent 1440p card, but 4K is a push considering the limited VRAM buffer.
 
The 10GB of VRAM on the 3080 is only going to be an issue in extreme cases. I have a 3080, and in some modern games at 4K I have to compromise on settings just to keep the lows over 40FPS. So the 3080 will run out of GPU grunt long before the 10GB becomes a problem.

Now the 3070, on the other hand, is a decent 1440p card, but 4K is a push considering the limited VRAM buffer.

Blimey, what games are you playing to see those lows?
 
Have you considered upgrading to a 3060 Ti? :p
You know, I actually have thought about it. I was thinking that if I can pick one up on the cheap at some point, it could end up being my temp card while waiting for AMD's and Nvidia's next-gen offerings :)

But before that, I am still waiting on Dying Light 2 and Vampire: The Masquerade - Bloodlines 2 at the very least before selling my 3080 on. Until then I will enjoy all the games that require the most grunt :D

Your good point got missed in all the banter.

Testing VRAM requirements just got a whole lot more complicated due to games now starting to automatically lower image quality and reduce LOD depending on the amount of video memory available.
I know right. You should consider upgrading to a 3090 ;)

Sorry, could not help it :p:D
 
The more memory it has to play with, the more it has to manage. Housekeeping takes longer the more memory is in use, while doing it less often means each pass takes longer, resulting in stutter.

Sure, but it'll allocate based on what's available and not what it actively needs, like how COD holds 20GB of VRAM on the 3090. Even when it comes to usage, the code might be lazy and not actively manage assets until it meets certain thresholds.

You'd also have to consider throughput to the VRAM. The 6800 XT has a narrower memory interface and slower memory, so it can use the extra capacity to store assets for upcoming scenes.

The only time that I see VRAM ever being an issue is if the scene demand is larger than what the VRAM can hold at a single point in time, and even then it's not a straightforward calculation.
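On the throughput point, the theoretical peaks are easy to work out from the published specs (these are peak figures only; real-world throughput differs, and the 6800 XT's 128MB Infinity Cache muddies a straight comparison):

[code]
# Theoretical peak VRAM bandwidth = effective data rate * bus width / 8.
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(19, 320))  # RTX 3080, 19Gbps GDDR6X, 320-bit: 760.0 GB/s
print(bandwidth_gbs(16, 256))  # RX 6800 XT, 16Gbps GDDR6, 256-bit: 512.0 GB/s
[/code]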
 
You didn't read this thread? The next-gen consoles only have a total of 16GB of RAM. The Xbox splits it into 6GB of slower and 10GB of faster RAM. Now if you take away the OS, housekeeping such as recording, and game code and data, you will be left with ~10GB of RAM for graphics.
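Back-of-envelope version of that estimate (Microsoft's published OS reservation on the Series X is 2.5GB; the code/data figure is an assumption):

[code]
# Xbox Series X memory budget, back-of-envelope.
total_gb = 16.0           # 10GB fast "GPU optimal" + 6GB slower
os_reserved_gb = 2.5      # Microsoft's published OS reservation
game_code_data_gb = 3.5   # assumed code/data/recording overhead

print(total_gb - os_reserved_gb - game_code_data_gb)  # ~10GB for graphics
[/code]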



Yep, I don't see Ampere or RDNA2 having much life expectancy, due to the GPUs already being pushed beyond their capabilities.



Made me chuckle. I'm sure he's not happy about having to say that, though he's still trying to downplay RT.



Yes, because PC users use exactly the same settings as the consoles. SMH.
 
More VRAM does not guarantee the ability to max out all games.

My 3080 can't do it now on my Reverb, and it has nothing to do with VRAM.

The promise of max-settings future-proofing is a mirage.
 
Yeah, GPU fans at 100%, case fans at 100%, case open now, dust filters off. It's currently mining like that and every day I'm having to limit power. Today the power limit is set to 59% because I woke up to it thermal throttling; overnight the PC restarted itself.
My Gigabyte 3080 Vision OC thermal throttles. I checked HWiNFO and the GPU memory junction temperature goes to 110C when mining, and close to 100C when just playing games (though it doesn't appear to be throttling in games, yet).
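If you want to log it yourself on an NVIDIA card, a rough NVML polling sketch is below; note that standard NVML on consumer cards only exposes the core sensor, so the memory junction temperature still needs HWiNFO or similar:

[code]
# Poll the GPU core temperature via NVML and flag when it is running hot.
# Memory junction temperature is not exposed here on consumer cards.
import time
import pynvml

WARN_C = 95  # arbitrary warning threshold for this sketch

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        t = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        flag = "  <-- running hot" if t >= WARN_C else ""
        print(f"GPU core: {t}C{flag}")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
[/code]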
 
A little off topic, but given recent posts from @Smiffy-UK: the GDDR6 used on the MBA 6800 XT and 6900 XT has a maximum temperature of 100C.

Once you hit this temperature the GPU will throttle core and memory speeds until temperatures drop, or until you alt-tab out to reapply a GPU tuning profile.

The temperatures below are from a three-hour session of Red Dead Redemption at Ultra settings at 4K, in a 23C room, with the GPU fan speed completely silent at 1500RPM, which is the default 6900 XT fan speed.

The maximum temperatures are on the right and are peak values; typical in-game temperatures are a couple of degrees lower.

The 6900 XT memory junction temperature peaks at 94C, 6C away from the throttling temperature. The core junction temperature peaks at 104C, also 6C away from its throttling temperature.

The difference between edge and core junction temperature is 14C at peak, which indicates excellent contact between the heatsink, thermal pad and GPU die.
[image: GPU temperature readings]

People worry about temperatures, but as long as you're below the maximum temperature it will not affect the life span of the GPU. Voltage will kill your GPU faster than temperature will.
 