
Is 16GB of VRAM the Standard for High-End Graphics Cards?

It's a good thing there are many people with various cards on the internet too... 20% is a very worthwhile increase when you're talking about the 40-70 fps range...

Also, a good video showing all the 3080 models here:
Look at that whole 2GB extra of VRAM going to town.....

[benchmark screenshots]

;) :cry:

Definitely not a worthwhile/noticeable difference between the 3070 and 3080... :cry:

The difference going from a 3070 to a 3080 is far more worthwhile than going from a 3080 to a 3090 in today's games, especially when you look at the fps ranges: going from something like 40/50 fps (3070 perf) to 60/70+ fps (3080 perf), as opposed to going from say 70 fps (3080 perf) to 85/90 fps (3090 perf). Of course, as newer, more demanding games come out, a 3090 will last longer than a 3080, but by the time you get down to unplayable fps, <60/70 fps (subjective), new, better and cheaper cards will be out....
Cheers!
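
To put rough numbers on that, here's a quick sketch using the illustrative fps figures from the post above (not measured benchmarks):

```python
# Back-of-the-envelope uplift check using the illustrative fps figures
# quoted above (not measured benchmarks).

def uplift(base_fps: float, new_fps: float) -> float:
    """Percentage fps gain going from base_fps to new_fps."""
    return (new_fps / base_fps - 1) * 100

print(f"3070 -> 3080: 45 -> 65 fps = {uplift(45, 65):.0f}% gain")  # ~44%
print(f"3080 -> 3090: 70 -> 87 fps = {uplift(70, 87):.0f}% gain")  # ~24%

# The same relative gain matters more at the bottom of the 40-70 fps range:
print(f"40 fps + 20% = {40 * 1.2:.0f} fps")  # 48: still rough going
print(f"70 fps + 20% = {70 * 1.2:.0f} fps")  # 84: smooth either way
```

The point being that the same percentage buys a lot more perceived smoothness when the starting point is in the 40s than when it's already above 60.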

Looking out for what the NV 40 series delivers. I'm hoping to get a sizable upgrade over my 3070 within or near the same power consumption envelope.

Win-win for all if NV have somewhat normal prices, as the 4000 series looks to be a huge gain over the 3000 series.
 
Probably not too bad for the lower tiers. If AMD continue with their momentum and get Nvidia spooked, the theory is they will push the higher tiers hard to save face. This could marry up with why they have changed the power connectors recently. Excited to hear about the leap in performance coming, but not so excited if it eats more juice than this generation to get there.
 
It's the difference between 100 and 120-125 fps, for example. Almost a generational leap. I'm sure you would notice if we snuck that not-so-noticeable difference out of your pay packet :p

What RTX games are you playing at 100fps?
I know my 3080 could do with a lot more horsepower at 1440p when I turn on RT.

For talking's sake:

The 80 does 60 fps, the 70 does 48 fps; drop down two game settings, maybe 3 at a push, and the 70 gets back up to the 80's ~60 fps.

Or the 80 does 125 fps, the 70 does 100 fps; drop down two game settings, maybe 3 at a push, and the 70 gets back up to the 80's ~125 fps. Yes or no?

At least you haven't spammed the thread with countless maths comparisons that totally miss the point of dropping a few settings to gain fps parity. :p
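
For what it's worth, the relative gap in both of those scenarios is identical; only the absolute fps differs. A quick check with the same illustrative numbers:

```python
# Both scenarios have the 70 trailing the 80 by the same relative margin;
# only the absolute fps differs (figures from the post above).
for fps_70, fps_80 in [(48, 60), (100, 125)]:
    gap = (fps_80 / fps_70 - 1) * 100
    print(f"{fps_70} vs {fps_80} fps: the 70 needs {gap:.0f}% more fps to match")
# Both print 25% -- the settings drops have to claw back the same relative
# amount either way, but 48 vs 60 fps is felt far more than 100 vs 125.
```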

Again you misunderstand. I'm saying AMD made use of more VRAM rather than streaming textures in properly. AMD was aware that Nvidia's top consumer card had 10GB and therefore targeted greater VRAM usage in both Godfall and FC6 in the hope they could win some market share. Do you think it was a simple coincidence that 12GB was required, while at the same time AMD were pushing 12/16GB cards?
No, I don't misunderstand. I've said it countless times: AMD 100% BROKE the 80's VRAM. Why wouldn't they, when Nv can and have broken AMD's RT ability?

I'm saying you have double standards: moaning that AMD won't/can't use more RT because of hardware limitations (as their RT is rotten in comparison), whereas AMD are 'bad' because they can't do RT but can cripple 10GB or less, because they don't have a comparable VRAM limitation in this instance.

People don't see a 12GB Nvidia product; they see a new graphics card at the top of the performance charts with a greater feature set than the other guys. @Johnny Silverhand has his new 12GB Nv product.
What has his purchase got to do with you saying that's Nv's way of making you buy now instead of waiting for Lovelace?
 
The issue I have with these back-of-fag-packet percentages is that they vary from person to person. When the cards launched, the 3090 was "only 10%" better than the 3080. Then, as games got benched or released, the gap could be as large as 30% depending on your resolution. Then the 3080 Ti came to the fold and suddenly that is only 10% better than the 3080, which is impossible, as that used to be the 3090's gap, and the 3090 still beats a 3080 Ti. So again, it is off-the-cuff statistics, supremely generalised, that make it sound like there is nothing in it. The same performance shenanigans can equally be applied to the 6800 XT vs the 3080 when these "only" percentages get used.

When it is within an actual game and represented using fps, for example, that can be the difference between a smooth >60 and the low 50s.
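
To put that in frame-time terms (a rough sketch; the exact figures are illustrative):

```python
# Frame-time view of "a smooth >60 fps vs the low 50s" (illustrative figures).
for fps in (62, 52):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps = {frame_time_ms:.1f} ms per frame")
# 62 fps ~= 16.1 ms, 52 fps ~= 19.2 ms -- about 3 ms extra per frame, and the
# 52 fps case also falls below a 60 Hz refresh, which is where it gets felt.
```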

With the willingness of some to get angry because they can't get a 90, while fighting the corner of the mid-range, stingy-VRAM 70s and 80s, I imagine a few owners in here bought themselves some Cuban heels with the monies saved from getting an FE, to compensate for not getting into the 90 club. :p
 
I know my 3080 could do with a lot more horsepower at 1440p when I turn on RT.

The discussion has always been about pushing the card to see how good it is and whether there are any weaknesses; you seem to have a lot to say, yet fail to pick up on the simplest of points. I can't recall anyone saying the 3080 was a terrible card, in particular at 1440p. This is where DLSS comes in to bail out most cards. You could easily argue that at that resolution the 3070 and the Ti are the target SKUs marketed by Nvidia. So what we have, even by your own admission here, is dGPUs needing a bail-out for fps when you're jacking up the ray tracing settings, from what you term horsepower.

However, there are occasions, ray tracing aside, where high textures are used and horsepower is not the focal point; the lack of VRAM becomes the issue. Whether this is deliberate, as you put it, or simply an oversight from the engineers remains to be seen, but we do have other cards now available with more VRAM which likely do not suffer from this (the 3080 Ti and 3080 12GB; funny how these came to be released if there was no VRAM issue that you and others speak of; maybe just gullible punters then!).

So the statement could also read "I know my 3080 could do with some more VRAM when using these large textures", but that would be too much of an admission, so we have the ever-decreasing circles of this denial.
 
With the willingness of some to get angry because they can't get a 90, while fighting the corner of the mid-range, stingy-VRAM 70s and 80s, I imagine a few owners in here bought themselves some Cuban heels with the monies saved from getting an FE, to compensate for not getting into the 90 club. :p

It seems so, Tommy Boy! However, the green-eyed seem to think it's easier to spit out "compensating for overspending", or words to that effect, whereas you know more than anyone that bagging that specific card has been way more difficult than portrayed by some, especially within the first six months of release, fighting bots, scalpers etc. Then it leads on to the point I made above. People seem to want more VRAM, as per the cohort like willhub et al, for the longevity; however, if you want this as a feature, you're suddenly fair game for the pitchforks and torches: anything above 10GB is unnecessary! :cry: People also sidestep that many a person in the past year has spent a grand on an AIB 3080, which in anyone's eyes is not a good deal.

The case of "it will do" is just a compromise for the status quo. Ask anyone worth their salt in the industry and the figure for a flagship or high-end GPU will be greater than what you got with the original 3080. Luckily for some, the 40 series is nearly here.
 
For talking's sake:

The 80 does 60 fps, the 70 does 48 fps; drop down two game settings, maybe 3 at a push, and the 70 gets back up to the 80's ~60 fps.

Or the 80 does 125 fps, the 70 does 100 fps; drop down two game settings, maybe 3 at a push, and the 70 gets back up to the 80's ~125 fps. Yes or no?

At least you haven't spammed the thread with countless maths comparisons that totally miss the point of dropping a few settings to gain fps parity. :p


No, I don't misunderstand. I've said it countless times: AMD 100% BROKE the 80's VRAM. Why wouldn't they, when Nv can and have broken AMD's RT ability?

That has to be the most flawed argument so far... To all owners of low-end/older GPUs: just reduce settings and you too can have 3080 performance; heck, 3080 owners, just reduce 1-2 settings, if that, to get 3090 performance... :cry: :p :D Also, reducing settings to suit what a card is capable of grunt-wise also reduces the VRAM usage ;)

If anything, that has further proved the point I and others have been making all along: eventually all GPUs are going to have to reduce settings because of a lack of grunt and not being able to hit a playable fps, as we have seen with the 3070 and are now also witnessing with not just the 3080 but also the 3090. In the same way, RDNA 2 GPUs aren't capable of any heavy/complex RT workloads, and thus have to reduce the settings or turn RT off entirely.

But either way:

[screenshot]

Again, for some who might have missed it in all the rambling, a good watch ;)

 
I've always felt like 10GB was too little for a latest-gen high-end card, but now that I have ordered a 3080 I'll be arguing that it's fine just to make me feel better :)
 
It seems so, Tommy Boy! However, the green-eyed seem to think it's easier to spit out "compensating for overspending", or words to that effect, whereas you know more than anyone that bagging that specific card has been way more difficult than portrayed by some, especially within the first six months of release, fighting bots, scalpers etc. Then it leads on to the point I made above. People seem to want more VRAM, as per the cohort like willhub et al, for the longevity; however, if you want this as a feature, you're suddenly fair game for the pitchforks and torches: anything above 10GB is unnecessary! :cry: People also sidestep that many a person in the past year has spent a grand on an AIB 3080, which in anyone's eyes is not a good deal.

The case of "it will do" is just a compromise for the status quo. Ask anyone worth their salt in the industry and the figure for a flagship or high-end GPU will be greater than what you got with the original 3080. Luckily for some, the 40 series is nearly here.

It's not just the AIB 80s; I saw plenty of 70 non-Ti stock in the region of £700-900 flying off the shelves too, then everything else, because folks prefer AIBs over the always-in-stock FEs.

Waiting for the 'yeah, go for it, it's a massive boost, in fact the FE is in stock right now' replies when asked in the next '70 to 80, will I notice an improvement?' thread :cry:
I've always felt like 10GB was too little for a latest-gen high-end card, but now that I have ordered a 3080 I'll be arguing that it's fine just to make me feel better :)
:cry:

Laughing aside, cracking GPU.
 
I've always felt like 10GB was too little for a latest-gen high-end card, but now that I have ordered a 3080 I'll be arguing that it's fine just to make me feel better :)

Nothing wrong with honesty. We all think it; it's only going to affect people that have the card for a while and play on hi-res displays, to be fair (which I have always said; it won't apply to 1440p in the main).
 
It's not just the AIB 80s; I saw plenty of 70 non-Ti stock in the region of £700-900 flying off the shelves too, then everything else, because folks prefer AIBs over the always-in-stock FEs.

:cry::cry::cry:

I know, right! It was the same can-kickers that were calling out the 3090 as a rip-off; then, do you remember, the couple of guys that drummed this beat ended up buying a 3090 anyway?!!! :cry::cry: :cool:

If people were paying £800 for a 3070, and people were paying a grand+ for a 3080, then suddenly the value-for-money argument goes flying out the window.
 
It's not just the AIB 80s; I saw plenty of 70 non-Ti stock in the region of £700-900 flying off the shelves too, then everything else, because folks prefer AIBs over the always-in-stock FEs.

Waiting for the 'yeah, go for it, it's a massive boost, in fact the FE is in stock right now' replies when asked in the next '70 to 80, will I notice an improvement?' thread :cry:

:cry:

Laughing aside, cracking GPU.
Still didn't get it for MSRP, but a 3080 TUF OC and a waterblock to match for £940? I can just about live with that. I'm only playing at 1440p 144Hz.
 
An extra 7% performance for an extra 37% price? :p ;) As said, it's up to individuals whether they think that plus the 2GB extra is worth it. I think next gen 16GB will be standard, but there's *surely* only so far they can keep pushing the VRAM requirements? Game installs are already getting quite ridiculous with the amount of assets. :D
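
As a rough value check, using the 7% / 37% figures quoted above (illustrative, not from any specific benchmark):

```python
# Perf-per-pound check using the 7% performance / 37% price deltas above.
perf_ratio = 1.07   # +7% performance
price_ratio = 1.37  # +37% price

value = perf_ratio / price_ratio
print(f"Performance per pound vs the cheaper card: {value:.2f}x "
      f"({(1 - value) * 100:.0f}% worse value)")
# ~0.78x, i.e. roughly 22% less performance per pound; the extra 2GB of
# VRAM is the only other thing the premium buys.
```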
 
If you read the many, many pages on this, buddy, apparently 10GB is all that's needed. :rolleyes: ;)

Yes, you have a problem with reading comprehension.

The guy said GPU grunt. The core GPU, not memory usage.

Most of you go off MSI Afterburner numbers, which are not accurate unless you use the per-process versions of the memory readers.

As of yet I have never even hit 7GB of VRAM usage in any game at 1440p, maxed or not, but I have dropped below 60 fps.

Heck, Assassin's Creed Unity brings an RTX 3070 back down to Earth with the demanding TXAA option enabled, and that's only at 1440p.
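
For anyone wanting to check the per-process figure rather than the total device allocation, here is a minimal sketch using NVIDIA's NVML Python bindings (pynvml; this assumes an NVIDIA card, and the per-process counter is not exposed on every driver/OS combination):

```python
# Minimal sketch: per-process VRAM usage via NVML (pip install nvidia-ml-py).
# Reads what each running process has actually committed, as opposed to the
# total device allocation that overlay tools often report.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device allocation: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# Per-process usage for graphics workloads (i.e. games):
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:  # None where the driver hides it
        print(f"  pid {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```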
 
As of yet I have never even hit 7GB of VRAM usage in any game at 1440p, maxed or not, but I have dropped below 60 fps.

Yes, reading comprehension. Keep up.

Nothing wrong with honesty. We all think it; it's only going to affect people that have the card for a while and play on hi-res displays, to be fair (which I have always said; it won't apply to 1440p in the main).

Unreal, @Woodsta888, it's like a few posts away! :cry:
 