10GB VRAM enough for the 3080? Discuss..

I might have to go digging for more posts it seems; there were a lot more posts than that on the Fury X. One I remember well, "it's HBM, which is much faster than GDDR so you don't need as much VRAM", was stated a few times ;) Bet you Matt has gone back to edit those posts by now though :D :cry: ;)
 
Was it LtMatt by any chance? As I recall he also couldn't see the difference RT made in Dying Light 2, when the difference was day and night. Makes one wonder, that being the case, why he goes on so much about textures :p :cry: :p
Honestly can't remember, it was months ago now.
 
The whole point was to run maximum settings at 4K (as the card had the grunt to do it) to see the frame rate collapse to single digits, as pointed out via PcGamesHardware and Computerbase. If you run FSR you may not see the FPS collapse, because the game is then rendering at 1440p rather than 4K. I would have thought this would be obvious, but apparently it's not. Those tech results were deemed to be false, so we asked actual owners to use the same settings (4K etc.) and test. We then found out the tech results were true, and the people claiming otherwise since the launch of the game were wrong. This is the part that stings, I guess. Also, no one has ever said you cannot enable FSR, unless you are trying to test the exact scenario reported by tech outlets and now owners, which is what this thread has largely centred on.

None of the current gen cards are capable of running every game maxed out at 4K, so why fret about using FSR/DLSS when it's available?

So far we have one game that causes the 3080's frames to collapse when certain conditions are met, yet RDNA 2 cards' frames collapse in the majority of RT games, many of which don't even have the option of FSR to make them playable and instead require disabling the feature altogether.

Next gen will bring many more RT games than games that require more than 10GB; the 3080 likely won't be hitting 60fps at 4K max settings in the majority of next-gen titles, so it will need settings dropping anyway, or FSR/DLSS, which in turn reduces VRAM usage. This is why older cards rarely struggle with VRAM before they run out of raster grunt: newer, more demanding games require lower settings to stay above 60fps.
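
To put a rough, back-of-envelope number on that last point (the bytes-per-pixel figure below is just an assumption for illustration, and textures are unaffected, so the saving only applies to the render-target part of the budget): the heavy per-pixel render targets get allocated at the internal resolution, not the output resolution.

# Rough sketch (assumed numbers) of render-target memory at native 4K versus
# an FSR/DLSS "Quality" internal resolution (~1440p for a 4K output).
BYTES_PER_PIXEL = 16  # assumed: colour + depth + a couple of G-buffer targets

def render_target_mb(width, height, bpp=BYTES_PER_PIXEL):
    # Approximate render-target footprint in megabytes.
    return width * height * bpp / (1024 ** 2)

native = render_target_mb(3840, 2160)      # native 4K
upscaled = render_target_mb(2560, 1440)    # FSR/DLSS Quality internal res

print(f"Native 4K:   ~{native:.0f} MB of render targets")
print(f"FSR Quality: ~{upscaled:.0f} MB ({100 * upscaled / native:.0f}% of native)")

On those assumptions that's roughly 127MB of targets at native 4K versus ~56MB at the Quality internal resolution, i.e. under half.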
 
None of them picked up on the now-infamous GTX 970 Mordor VRAM issues, same with FF15. It was amusing reading all their excuses when they all failed on Mordor.
What Mordor issues? I played a load of Shadow of Mordor if that is what you are referring to (at 4K) without having problems with VRAM. Batman Arkham Knight on the other hand, that game choked its VRAM pretty bad and I couldn't get around it short of really lowering the settings until I upgraded to the 1080ti. :p
 
What Mordor issues? I played a load of Shadow of Mordor if that is what you are referring to (at 4K) without having problems with VRAM. Batman Arkham Knight on the other hand, that game choked its VRAM pretty bad and I couldn't get around it short of really lowering the settings until I upgraded to the 1080ti. :p
You never heard of it?

On the 970 (I think with certain options set) it would use the much slower part of VRAM. The 970 has two segmented areas of VRAM: one fast 3.5GB segment and a final slow 512MB segment. When Mordor started using that slower segment the frames tanked. The issue went viral, which forced a fix and forced many of the reviewers to explain why they hadn't found the issue.

Bear in mind the drivers got patched to mitigate the issue so it might not happen anymore.
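
A minimal back-of-envelope sketch of why the frames tanked, assuming the commonly quoted approximate figures (~196GB/s for the fast 3.5GB segment on seven memory controllers, ~28GB/s for the last 512MB on one): reading that final half-gig once takes about as long as reading the whole fast segment.

# GTX 970 segmented VRAM sketch; bandwidth figures are approximate and used
# purely for illustration.
FAST_GB, FAST_BW = 3.5, 196.0   # segment size (GB), approx bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def read_time_ms(working_set_gb):
    # Time to read the working set once, assuming it fills the fast segment first.
    fast = min(working_set_gb, FAST_GB)
    slow = min(max(0.0, working_set_gb - FAST_GB), SLOW_GB)
    return (fast / FAST_BW + slow / SLOW_BW) * 1000

for gb in (3.0, 3.5, 3.75, 4.0):
    print(f"{gb:.2f} GB working set -> ~{read_time_ms(gb):.1f} ms per full pass")

On those numbers a working set that creeps from 3.5GB to 4GB roughly doubles the time for a full pass over VRAM, which lines up with the frame-rate cliff people saw.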
 
None of the current gen cards are capable of running every game maxed out at 4K, so why fret about using FSR/DLSS when it's available?

So far we have one game that causes the 3080's frames to collapse when certain conditions are met, yet RDNA 2 cards' frames collapse in the majority of RT games, many of which don't even have the option of FSR to make them playable and instead require disabling the feature altogether.

Next gen will bring many more RT games than games that require more than 10GB; the 3080 likely won't be hitting 60fps at 4K max settings in the majority of next-gen titles, so it will need settings dropping anyway, or FSR/DLSS, which in turn reduces VRAM usage. This is why older cards rarely struggle with VRAM before they run out of raster grunt: newer, more demanding games require lower settings to stay above 60fps.

Don’t try and bring logic into this discussion please.

Look, it’s as simple as this. Can the 3080 run every single game at every single possible setting maxed out without running out of VRAM? The answer is NO. There you go, you’ve lost the argument and that’s all there is to it.

Takes red shades off.

Wow. What just happened. What did I just say?

:cry:
 
I wonder if there is some sort of "pride of ownership" for specific parts of a graphics card that is more intense with some people than just what the fully-assembled graphics card can or can't do.

The most common thing holding back the 3080 is the GPU itself.
 
I wonder how many posters in this thread:
Have a 3080 10GB?
Have had issues with VRAM bloat/overflow that adversely affects FPS without massive amounts of 4K hi-res textures/mods being installed?
Is FPS the only important metric?

Low VRAM won't typically affect FPS directly; it shows up more as stutters and dynamic-quality textures dropping to a lower level of detail.

For reference, yes, I own a 3080 10GB FE model.

Yes to your last question. The high-res textures in the FF7 remake are still not that good quality; they're definitely not 4K textures. The game itself actually has a budget nowhere near 10GB for textures, but hits the issue due to putting system RAM stuff in VRAM, the same issue that affected Far Cry 6, which I expect will become more commonplace in ports from current-gen consoles in future.
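
To put a rough number on that (the bandwidth figures below are assumptions, round numbers only): once something no longer fits in VRAM and has to come back over PCIe from system RAM, the fetch is slow enough to eat a whole frame, which is why it shows up as stutter rather than a clean FPS drop.

# Illustrative sketch of VRAM overflow cost; both bandwidth figures are assumed.
VRAM_BW_GBS = 760.0   # assumed: roughly 3080-class GDDR6X bandwidth
PCIE_BW_GBS = 16.0    # assumed: roughly usable PCIe 3.0 x16 throughput

def fetch_ms(size_mb, bandwidth_gbs):
    # Time in milliseconds to pull `size_mb` of resource data at a given bandwidth.
    return size_mb / 1024 / bandwidth_gbs * 1000

spill_mb = 256  # hypothetical batch of textures that no longer fits in VRAM
print(f"From VRAM:         ~{fetch_ms(spill_mb, VRAM_BW_GBS):.2f} ms")
print(f"Over PCIe 3.0 x16: ~{fetch_ms(spill_mb, PCIE_BW_GBS):.1f} ms "
      "(roughly a whole 60fps frame)")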
 
None of the current gen cards are capable of running every game maxed out at 4K, so why fret about using FSR/DLSS when it's available?

So far we have one game that causes the 3080's frames to collapse when certain conditions are met, yet RDNA 2 cards' frames collapse in the majority of RT games, many of which don't even have the option of FSR to make them playable and instead require disabling the feature altogether.

Next gen will bring many more RT games than games that require more than 10GB; the 3080 likely won't be hitting 60fps at 4K max settings in the majority of next-gen titles, so it will need settings dropping anyway, or FSR/DLSS, which in turn reduces VRAM usage. This is why older cards rarely struggle with VRAM before they run out of raster grunt: newer, more demanding games require lower settings to stay above 60fps.

Also, as pointed out by both the gospel sites pcgh and computerbase ;) :p a 6800 XT doesn't even have enough grunt for FC 6 @ 4K without FSR either: averages in the 50s and dips into the 40s (same as what my 3080 gets). Not ideal imo, but YMMV.
 
Is FPS the only important metric?

Low VRAM won't typically affect FPS directly; it shows up more as stutters and dynamic-quality textures dropping to a lower level of detail.

For reference, yes, I own a 3080 10GB FE model.

Yes to your last question. The high-res textures in the FF7 remake are still not that good quality; they're definitely not 4K textures. The game itself actually has a budget nowhere near 10GB for textures, but hits the issue due to putting system RAM stuff in VRAM, the same issue that affected Far Cry 6, which I expect will become more commonplace in ports from current-gen consoles in future.
So that’s poor game design rather than an inherent issue with the card only having 10GB VRAM?
 
So that’s poor game design rather than an inherent issue with the card only having 10GB VRAM?
It's both, really.

If you're relying on only playing games that are optimised properly, I feel you're in cloud cuckoo land; most games I play have optimisation issues of some sort, so hardware should be designed to account for that.

I also think that, moving forward, we're going to see more games use a unified memory approach (using VRAM for more things), as it saves time for developers porting from consoles.

My opinion is that, moving forward, budget cards should have 6-8GB, mid range 12GB, and high end 16GB. I consider xx80 cards high end. I'm also of the view that the grunt on these new GPUs is more than enough, so to keep costs down they could shift some rasterisation chip space over to VRAM. I don't own any games in which my 3080 can't maintain 60fps at 4K rendering, and it manages 1440p at a jogging pace. 4K rendering in the FF7 remake only drops frames when it's doing shader and texture swaps in VRAM. So yes, VRAM is my current bottleneck, sorry if that doesn't fit the narrative.
 
My opinion is that, moving forward, budget cards should have 6-8GB, mid range 12GB, and high end 16GB.

Agreed. Can’t see it playing out any different to be honest.

Not long left now; I will be upgrading to a next-gen card which will have a lot more VRAM anyway :D
 
That's how it comes across. Talking about the 3090, mining, the cost of the GPU and the prices they are selling for now. It just comes across as jealousy from folks that are using a lower tier of GPU.

Whilst I can't speak for every 3090 owner, I'm sure that people that can afford a high end GPU are not too worried about it losing value in the second hand market, especially if they were on the mining craze.


Not entirely representative.

Many waited for the performance results of the 3090/6900XT to be released, then made an informed decision that the 3080, at 85% of the performance of those cards, was the best buy, if you could get one. The scramble for the 3080 started because of that 85% of the performance for half the price. Same for mining.

So nothing to do with people just wanting the best, or disposable income. You are biased, as you like to run benchmarks and chase the highest score, which means you do need the top cards/binned GPUs.

For many, there seemed little point in paying double for the single-digit FPS uplift that a 3090 gave over the 3080 at 4K. The lack of available 3080s and scalped 3080 prices is what pushed those who needed a card to the 3090s/6900XTs, as they couldn't get a 3080, hence the massive waiting lists and queues. People hung on for months to get one at the original prices.

If I had not managed to get a 3080 then I'd have got a 3090. But I managed to get one, like many others, before the 3090s/6900XTs were available to purchase once reviews were out. If the perf uplift had been more, say +50% over the 3080, then I probably would have gone for a 3090. But 15%... nah. Gamers and miners saw which was value for money, and at 4K too. Maybe at 1080p with very high refresh monitors it made sense, to max out 240Hz or push towards 360Hz. At 4K the uplift was single-digit FPS on release, which is measurable, but for £700 extra you wouldn't see it.
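
For what it's worth, a quick sketch of that value-for-money sum, using the approximate UK launch MSRPs and the ~85% figure above (both taken as assumptions for illustration):

# Crude perf-per-pound comparison; prices and the 0.85 scaling are assumed.
cards = {
    "RTX 3080 10GB": {"price_gbp": 649,  "relative_perf": 0.85},
    "RTX 3090":      {"price_gbp": 1399, "relative_perf": 1.00},
}

for name, c in cards.items():
    value = c["relative_perf"] / c["price_gbp"] * 1000
    print(f"{name}: {value:.2f} relative-perf points per £1000")

On those numbers the 3080 delivers getting on for double the performance per pound.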

Benching GPUs is the same as CPUs: you need top chips if you want to be competitive. Benchmarking for everyone else is just to gauge whether your purchase is in the ballpark, and how it stacks up in the silicon lottery. So your reason as a bencher for having the top GPU is rather misleading, as you spend hours benching and enjoy being at the top of some scoreboards.
 