10GB vram enough for the 3080? Discuss..

An argument can indeed be made for anything... whether it actually makes sense or not is another matter.

Sure, but your objection wasn't that my argument didn't make sense, but rather that I focused on the last gen and not the next gen, when I've done both.

Professional reviewers, in the main, do not know how to measure vRAM accurately. I've looked through pretty much every single link posted in this thread relating to reviews of games and their vRAM usage, and it's clear that what is being measured is not vRAM in use but vRAM allocated, which means they don't understand the difference, nor do they know of the tools to measure it accurately. I've reached out via email to, I think, 4 of these professional reviewers about their methodology for measuring vRAM and to give them a heads-up about the new Afterburner beta, but I've heard nothing back from any of them. Much of the community is the same, incidentally; this has been discussed to death now over hundreds of pages and we still see people posting images or statements relating to vRAM allocated as if that somehow matters.
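To make the allocated-vs-used distinction concrete, here is a minimal sketch (not the methodology of anyone in this thread, and the pynvml package is an assumption on my part) that contrasts the card-wide figure GPU-Z-style tools report with per-process figures, using NVIDIA's NVML library through the pynvml Python bindings. Note that even the per-process number below is an allocation count, and on Windows WDDM drivers NVML may not expose it at all, which is presumably why per-process overlays like the new Afterburner beta read the OS's own GPU memory counters instead.

```python
# Minimal sketch, assuming "pip install pynvml" and an NVIDIA card/driver.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Card-wide figure: everything currently held on the card by every process plus
# the driver -- roughly the kind of number GPU-Z-style overlays report, and the
# one that keeps being mistaken for "this game needs X GB".
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Card total       : {mem.total / 2**30:.1f} GiB")
print(f"Card-wide claimed: {mem.used / 2**30:.1f} GiB")

# Per-process figures: what each individual process has claimed for itself.
# NVML may report this as unavailable (None) under Windows WDDM drivers.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    shown = f"{used / 2**30:.1f} GiB" if used else "n/a under WDDM"
    print(f"  pid {proc.pid}: {shown}")

pynvml.nvmlShutdown()
```

Run that while a game is open and the card-wide number will normally sit well above what the game process itself has claimed, because the compositor, browsers and everything else also hold VRAM, and the game will happily cache assets it may never touch in that session.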
 
Is this really 115 pages of people arguing whether vram measurements are usage or allocation?

It seems no one really knows (and would possibly require a crystal ball) but in my experience of computers for over 25 years, it's always best to have more ram than you need. And dropping £649 (lol) on a graphics card with potentially not enough vram doesn't seem like a wise move.
 
Some say that if you go to the darkest corners of the interweb and utter the word "vram", frosty will be there to try to explain the phenomenon of used vRAM vs allocated vRAM.
 
Is this really 115 pages of people arguing whether vram measurements are usage or allocation?

It seems no one really knows (and would possibly require a crystal ball)
Yep, no one knows for sure, no.

but in my experience of computers for over 25 years, it's always best to have more ram than you need. And dropping £649 (lol) on a graphics card with potentially not enough vram doesn't seem like a wise move.

Yeah, and one could equally say it is not wise to spend £200-£300 extra on a 3080 Ti 20GB when you consider that, by the time there is more than a handful of games you will want to play that need more than 10GB, Hopper will be out. That extra money would then pay for more than half the cost of something like a 4070, which will no doubt come with 16GB and offer 3090-or-thereabouts performance.

Not only that, but you will be running out of grunt before vram in many cases.

Basically it is all down to the way you look at it, so it makes me laugh when people make statements like that. Now, if a 3080 20GB came out for £50-£100 extra, then you might have a point.

Just look at Dave, who has been banging on about 10GB not being enough. He loves vRAM so much that he went with a Radeon VII with 16GB of vRAM. Look how that turned out: the card runs out of grunt WAY before it needs that 16GB and is beaten in pretty much every game by a 3070 with 8GB. What's funnier is, after flagging of the 10GB on a 3080, he has sold his 16GB Radeon to upgrade to a 3080 10GB. Says it all. Lol :D
 
This isn't about spending £300 extra for a 20GB (which is now irrelevant after AMD's announcements), it's about Nvidia releasing a card that is not good enough for future proofing. All the while making a paper launch, releasing even more product lines, and to top it off, ******* off 2080ti owners, y'know the people who dropped over a grand on their previous cards, with this paper launch.

Man, 2020 Nvidia suck.
 
Is this really 115 pages of people arguing whether vram measurements are usage or allocation?

Quite a lot of it, yes. There's an awful lot of back and forth, with people posting claims that X game uses 16GB of vRAM, but when you investigate and test those claims it turns out it actually needs something like 6GB.

If it was simply best to have more vRAM than you need and that was the end of the argument, then you'd surely go for a 24GB 3090, because what if 16GB on, say, a 6800 XT is not enough? It's obvious that this argument in isolation is not sufficient, because it could lead to you getting something with way more vRAM than you need. What we need is some kind of barometer that tells us when it's appropriate to stop overestimating vRAM for the just-in-case scenarios, especially because vRAM is expensive and drives up the cost of cards.

Let's just be honest about this: people go by their gut feeling, which is informed partly by past experience and partly by the evidence in front of them of what games need now. Except people's perceptions of what games need right now are severely skewed, and have been for years, because we've failed to address this important distinction.
 
This isn't about spending £300 extra for a 20GB (which is now irrelevant after AMD's announcements),

It is not irrelevant, as many here have reservations about or simply refuse to buy AMD.

it's about Nvidia releasing a card that is not good enough for future proofing. All the while making a paper launch, releasing even more product lines, and to top it off, ******* off 2080ti owners, y'know the people who dropped over a grand on their previous cards, with this paper launch.

Man, 2020 Nvidia suck.
None of that is surprising to me. They have been greedy and not consumer friendly for a very long time as far as I am concerned. They have the tech, the mind share and simply want to bleed us of every penny they can.
 
Yeah, and one could equally say it is not wise to spend £200-£300 extra on a 3080 Ti 20GB

Stop acting like the 3080 Ti will cost an extra £200-£300 because they added an extra 10GB of RAM. With Nvidia's buying power, that extra 10GB probably costs them £75 max.

Just look at Dave, who has been banging on about 10GB not being enough. He loves vRAM so much that he went with a Radeon VII with 16GB of vRAM. Look how that turned out: the card runs out of grunt WAY before it needs that 16GB and is beaten in pretty much every game by a 3070 with 8GB. What's funnier is, after flagging of the 10GB on a 3080, he has sold his 16GB Radeon to upgrade to a 3080 10GB. Says it all. Lol :D

 
Stop acting like the 3080 Ti will cost an extra £200-£300 because they added an extra 10GB of RAM. With Nvidia's buying power, that extra 10GB probably costs them £75 max.

Yeah, and Nvidia will sell you that at cost, will they? :rolleyes:

The only reason Nvidia will come in at a lower price than £849 for a 3080 Ti is because of AMD. But until we see benchmarks, we will not know what's what for sure. At the end of the day, people pay more for Nvidia. They still have the better RT and the mindshare, so even if they are 5-10% behind a 6900 XT in raster performance, they will still likely charge close to a grand for it.

What are you, his mate or something? lol
 
With rumours that AIBs are making virtually no profit on most of their 3080s, I'm not sure what that means for 3080 Ti pricing, but it doesn't sound great.
It will be £728 according to Dave's mate Chuck Chuck :p

They would (or close to cost) if people stopped accepting this **** from Nvidia.

I don't need to be his friend to facepalm at how awful that post was. So many assumptions based on what you wish the right answer to be.
Yeah, whatever mate. Apart from a typo where I meant to say slagging off, it is an accurate description of what has occurred. Yes it is awful, but not my doing :D
 
Let's just be honest about this: people go by their gut feeling, which is informed partly by past experience and partly by the evidence in front of them of what games need now. Except people's perceptions of what games need right now are severely skewed, and have been for years, because we've failed to address this important distinction.
I think there are quite a few who are coming from the 1080 Ti/2080 Ti who don't want to buy a newer product with less vRAM than they currently have (me included). It's hard not to think you are losing out on something, however small that may be in the end.
 
I think there are quite a few who are coming from the 1080 Ti/2080 Ti who don't want to buy a newer product with less vRAM than they currently have (me included). It's hard not to think you are losing out on something, however small that may be in the end.
Yeah. That is true.
 
This isn't about spending £300 extra for a 20GB (which is now irrelevant after AMD's announcements), it's about Nvidia releasing a card that is not good enough for future proofing. All the while making a paper launch, releasing even more product lines, and to top it off, ******* off 2080ti owners, y'know the people who dropped over a grand on their previous cards, with this paper launch.

Man, 2020 Nvidia suck.

The GPU will choke running future games before it hits 10GB.
Would an Nvidia 1080 with 20GB outperform a 3080 with 10GB? NO.
By the time games start to use 10GB+ (say in 2 years' time), your GPU will choke. It doesn't matter if you have 16GB or 24GB, the GPU will choke before that.
Therefore, it's better to give us adequate vRAM at a cheaper price.
Believe me, next year a 4080 with 16GB will outperform a 3090 with 24GB.
 
I think there are quite a few who are coming from the 1080 Ti/2080 Ti who don't want to buy a newer product with less vRAM than they currently have (me included). It's hard not to think you are losing out on something, however small that may be in the end.

I can see why it's upsetting on the face of it. But the truth is that the 11GB on those cards was overkill, and between launch day and today it has not provided the user with any benefit. Most likely the only reason they had that much vRAM in the first place was to get a memory config with high enough bandwidth to feed the GPU and not cause a bottleneck. That's purely about architecture constraints.
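For what it's worth, that memory-config point can be put into rough numbers: each GDDR chip hangs off a 32-bit slice of the memory bus, so the bus width fixes how many channels (and chips) the card has, and the density per channel then fixes the capacity. A back-of-the-envelope sketch, using the published bus widths and treating the rest as plain arithmetic rather than a claim about Nvidia's actual design process:

```python
# Back-of-the-envelope: VRAM capacity = (bus width / 32-bit channels) * GB per channel.
CHANNEL_BITS = 32

def capacity_gb(bus_width_bits: int, gb_per_channel: int) -> int:
    """Number of 32-bit memory channels times the capacity hung off each channel."""
    return (bus_width_bits // CHANNEL_BITS) * gb_per_channel

print(capacity_gb(352, 1))  # 1080 Ti / 2080 Ti: 352-bit bus, 1GB per channel -> 11GB
print(capacity_gb(320, 1))  # 3080:              320-bit bus, 1GB per channel -> 10GB
print(capacity_gb(384, 2))  # 3090:              384-bit bus, 2GB per channel (clamshell) -> 24GB
```

Dropping a 1080 Ti-class card to a round 8GB would most likely have meant a 256-bit bus and a matching cut in bandwidth, which is the constraint being described above.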
 
Came from a 1080 Ti to a 3080, and never once did I think the 10GB was a problem or going to be in the "near" future, but as with anything on the net these days people sure love to ride them a bandwagon, pitchforks ahoy!
 
So, based on TechPowerUp's review of Assassin's Creed Valhalla, not only does it look better than the next-gen consoles on PC, it looks better than Watch Dogs Legion and only uses 6GB of vRAM. LOL!

Give it a few days, he will be back again saying the same things. As I said, it seems like the fella is a few cards short of a full deck :p

So, case in point, I reached out to W1zzard, who did that article on TechPowerUp, and asked how he measured vRAM. He confirmed it was with GPU-Z, which only measures what is allocated and not what is in use. I've gone back to him again to see what he has to say about potentially measuring with Afterburner; if he replies again I'll post an update.
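For anyone on Windows who wants a per-game figure rather than the card-wide total GPU-Z reports, the OS exposes per-process dedicated-VRAM performance counters (the same family Task Manager's GPU columns draw from). The sketch below samples them once with the stock typeperf tool; the counter path and its availability are assumptions about a reasonably recent Windows 10/11 install, and I'm not claiming this is how Afterburner or any reviewer actually does it.

```python
# Hedged sketch: sample Windows' per-process "Dedicated Usage" GPU counters once
# via the built-in typeperf utility and print any process holding > 0.1 GiB of VRAM.
# Counter instances are named like "pid_1234_luid_0x..._phys_0".
import csv
import subprocess

out = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
    capture_output=True, text=True,
).stdout

# typeperf emits CSV rows (they start with a quote) plus some status text to skip.
rows = [next(csv.reader([line])) for line in out.splitlines() if line.startswith('"')]
if len(rows) >= 2:
    header, sample = rows[0], rows[1]                # row 0 = counter paths, row 1 = one sample
    for path, value in zip(header[1:], sample[1:]):  # column 0 is the timestamp
        try:
            gib = float(value) / 2**30
        except ValueError:
            continue
        if gib > 0.1:
            print(f"{gib:5.1f} GiB  {path}")
```

Reading the counter for the game's own process, rather than the whole card, is what keeps background apps and driver reserve from inflating the number.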
 
I feel that Valhalla is not a good case for this argument anyway, because I do not consider it a next-gen title.

It's using the same engine as Origins, if not an older one, is it not?

It has no RT either.

So it's very distinctly a current-gen game.
 