Is 8GB of VRAM enough for the 3070?

Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
So AMD are pulling smart marketing tricks with their VRAM, but NVIDIA aren't with RTX and DLSS (features used sparingly, and RTX slaughters FPS)?

Well, the extra vRAM offers no benefit but does cost the consumer: the price of those additional memory modules is passed on, making the card more expensive than it otherwise needs to be.

RTX is a trade-off: better visuals at the cost of frame rate, which DLSS is designed to help mitigate. If you don't think the trade-off is worth it then I have no argument with that; extra visuals are always a trade-off with performance, and each of us makes that choice based on our own subjective preferences. I can only say that I personally want to play games like Cyberpunk 2077 in all their ray-traced glory, and I'm happy to mitigate the performance loss with DLSS because the quality of DLSS 2.0 and 2.1 is extremely good.
 
Soldato
Joined
6 Feb 2019
Posts
10,267



The "8GB is enough" crowd did not last very long

Watch Dogs: Legion decimates low-VRAM cards - if you have less than 8.5GB of VRAM and you enable the Ultra HD texture pack, your performance will tank and stutter - oh, and this happens at 1440p LMAO



ouch
 

TNA

TNA

Soldato
Joined
13 Mar 2008
Posts
21,232
Location
London



The "8GB is enough" crowd did not last very long

Watch Dogs: Legion decimates low-VRAM cards - if you have less than 8.5GB of VRAM and you enable the Ultra HD texture pack, your performance will tank and stutter - oh, and this happens at 1440p and 4K



ouch
Indeed. Best pay the extra grand and get 24GB to future-proof :p:D
 
Soldato
Joined
17 Aug 2003
Posts
19,879
Location
Woburn Sand Dunes
God, I hope Legion is a decent game. I wouldn't want to hang an entire argument on it anyway, but I would be especially cautious of doing so with the third in a trilogy of, so far, hideously optimised games. A 3090 not managing to average 60fps at 4K. Ouchy. Loved the first game though, so what, shut up!
 
Last edited:
Soldato
Joined
23 Apr 2010
Posts
10,928
Location
West Sussex
Well, the extra vRAM offers no benefit but does cost the consumer: the price of those additional memory modules is passed on, making the card more expensive than it otherwise needs to be.

If they had put 10GB on there and increased the bandwidth and the price, I am sure no one would have been complaining. At all. This is clearly why the 6800 costs more. However, many would argue that the extra £50 to make it a fully fledged 4K card *and* somewhat next-gen proof would be better than paying out over a monkey for a card that could be crying in the corner in less than a year.


This is why I did not respond to your posts. I had no reason to. It's not about who wants to be right, it's about reading the market. If you seriously, honestly think it will be enough going into the next gen, I can't help you. So yes, a lot of it is guesswork and estimation. However, like I said on the previous page, if you have been in the game for a while, you would be more concerned.

I have been burned on VRAM three times because of people who told me it was more than enough. They were wrong.
 
Soldato
Joined
18 Feb 2015
Posts
5,888
What's happened to hair, 18:45? Do you really expect to run a 3070 at 4K max? I'd have thought the 3070 was for 1440p, given its reduced GPU and bandwidth.
The funny thing is, I'm running it at 1800p on a 480 with a healthy mix of high settings, and even textures on ultra! So why shouldn't the 3070 be able to do 4K? It's close to three times faster! And guess what's making that possible? Yup, you guessed it: me not being dumb enough to listen to the "3GB/6GB is enough" crowd and buy a 1060 over it way back when.

I will repeat: compromising on VRAM is the dumbest thing anyone can do when buying a GPU. It offers so many cheap and substantial visual upgrades, it's insane. You will never get a better ROI on anything else in a GPU than having enough VRAM for however long you plan to keep it (and maybe even longer than that! My 480 is just the back-up card I kept after selling my main card, hoping to snag a 6800 XT/3080, but it's giving some seriously good results, thanks in no small part to the 8GB of VRAM).
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland



The "8GB is enough" crowd did not last very long

Watch Dogs: Legion decimates low-VRAM cards - if you have less than 8.5GB of VRAM and you enable the Ultra HD texture pack, your performance will tank and stutter - oh, and this happens at 1440p LMAO



ouch

Why would you consider ultra on a mid-range card? These are not photo-realistic textures. Do the Pepsi challenge; I don't think people could tell ultra from high without zoomed-in screenshots :D
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
The funny thing is, I'm running it at 1800p on a 480 with a healthy mix of high settings, and even textures on ultra! So why shouldn't the 3070 be able to do 4K? It's close to three times faster! And guess what's making that possible? Yup, you guessed it: me not being dumb enough to listen to the "3GB/6GB is enough" crowd and buy a 1060 over it way back when.

I will repeat: compromising on VRAM is the dumbest thing anyone can do when buying a GPU. It offers so many cheap and substantial visual upgrades, it's insane. You will never get a better ROI on anything else in a GPU than having enough VRAM for however long you plan to keep it (and maybe even longer than that! My 480 is just the back-up card I kept after selling my main card, hoping to snag a 6800 XT/3080, but it's giving some seriously good results, thanks in no small part to the 8GB of VRAM).

When did they unlock the RT on the 480? :p
 
Soldato
Joined
23 Apr 2010
Posts
10,928
Location
West Sussex
Poneros can tell the difference between the ultra and ultra nightmare texture pool settings in Doom Eternal, didn't you know?

Seriously?

Come on now. It is crystal clear what has happened here. Nvidia penned out their next-gen cards totally oblivious to what AMD were doing. They cut costs by going with Samsung, and they cut the cards back to slot them in. However, with or without AMD, those holes would still have been there. Like I have said over and over, what is the point of taking what is basically a 4K card and cutting it back so that it's held back by VRAM? In the real world it is clear they did that on purpose, not wanting to give too much away. They could have charged £550 for this card, which is what, £90 for a little extra VRAM, making it fully capable across a broad range. Instead they want top dollar for a 1440p card, because tbh the 2070S released at less. Maybe not RRP, but I paid £428 for mine about three days after launch (KFA2 2070S dual fan 1-click OC) and it was every bit as capable at 1440p then as it is now. So they made an overpowered 1440p card, which is a shame when it could have been a half-decent 4K card for a little extra cost.

However, they did that deliberately. They know better than any of us what is coming, what is needed and so on. That is why they have marketed it entirely as a 1440p card.

Now, if AMD did not exist, or were absolutely no threat, they would have gotten away with it. However, they have quite clearly been caught with their pants down, rationing out portions of GPU, totally oblivious to the possibility that AMD might have something this round. That has been made completely clear by this utter farce of a launch, the laughable stock levels and their continued attempts to derail AMD by moving launches around. It still does not give the 3070 more VRAM.

ONE DAY after launch we have seen a 50% next-gen title already surpassing the VRAM limit. ONE DAY. Give it a year, or two? I doubt that card will even cut it at 1440p with high settings. Now look, I am not disagreeing with your findings on Doom. I've never said I don't agree. I run it at Nightmare at 1440p and I would be hard pushed to spot the difference. However, today that all changed, because in Watch Dogs it *does* make a huge visual difference. And that huge visual difference is next-gen gaming compared to current gen. So all of your arguments are futile, because you too are simply throwing around guesses, like everyone else, as to what may happen. The difference between the two sides is nothing but caution. Caution and quite possibly experience, if you have ever been burned by VRAM.

I have. I've also watched Nvidia's behaviour for years and years, so I usually have a bloody good idea of what they are up to. You wanted your "cheap card, faster than a 2080 Ti"? Well, you got it. It's cheap for a reason, as per usual.
 
Soldato
Joined
28 Oct 2011
Posts
5,861
Well, the extra vRAM offers no benefit but does cost the consumer: the price of those additional memory modules is passed on, making the card more expensive than it otherwise needs to be.


It already has three benefits: 1. Heavily modded games. 2. A future advantage in next-gen titles. 3. Value for money.

Games that will benefit from having more VRAM are already here and in the pipeline. I'm not sure why you have to keep digging. Just stroke your 3080 order and be happy with your 10GB of VRAM; after all, it is more than enough for the next few years, according to you...

:D
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
If they had put 10GB on there and increased the bandwidth and the price, I am sure no one would have been complaining. At all. This is clearly why the 6800 costs more. However, many would argue that the extra £50 to make it a fully fledged 4K card *and* somewhat next-gen proof would be better than paying out over a monkey for a card that could be crying in the corner in less than a year.

This is why I did not respond to your posts. I had no reason to. It's not about who wants to be right, it's about reading the market. If you seriously, honestly think it will be enough going into the next gen, I can't help you. So yes, a lot of it is guesswork and estimation. However, like I said on the previous page, if you have been in the game for a while, you would be more concerned.

I have been burned on VRAM three times because of people who told me it was more than enough. They were wrong.

Right, I do understand where you're coming from. Our disagreement is that I contend it won't be crying in a corner in less than a year, and that's where we fundamentally differ.

I've given reasons for why I think that, such as looking at games available now which are "future proof" in the sense that they offer features today designed to be "next gen" and give the title longevity. Some examples are Crysis Remastered and FS2020. The point is that if you crank up the settings in these games you start filling more vRAM, but you also put more strain on the GPU, and the GPU is what gives out first. Benchmarks clearly show that vRAM won't exceed 10GB before the GPU is so burdened that the game is fundamentally unplayable.

If I'm wrong with that prediction then of course you'd have a point: the extra vRAM would be helpful at some point in the future, just not right now, and I completely understand that argument. I just disagree on the prediction itself. I understand that some cards can have this problem if the vRAM is not enough for that particular card; I'm not making a blanket statement about all cards and all vRAM configs, I'm saying this specifically about the 8GB and 10GB of the 3070 and 3080.

It already has three benefits, 1. Heavily modded games. 2. Future advantage in next gen titles. 3. Value for money.

Games are already here and in the pipeline that will benefit from having more VRAM. I'm not sure why you have to keep digging? Just stroke your 3080 order and be happy with your 10GB of VRAM, after all it is more than enough for the next few years according to you...

:D

1) Heavily modded games is about the closest I can come to agreement. There is some evidence that modded games can exceed vRAM budgets, but the measurement has been done badly in most instances I've been presented with. All the metrics I've seen so far do not measure vRAM usage, they measure vRAM allocated, and I've explained at length why that is inaccurate and misleading. I have, however, seen someone explicitly stating they used this metric, in a post on the ResetEra forums about memory measurement, and they admitted they were running over 1,000 mods. Modding is a reasonably niche thing to do even in PC gaming, and people who use so many mods as to exceed vRAM budgets are extremely rare. The simple fact is that these people are outliers, and Nvidia aren't going to increase the cost of something like a 3080 by adding 6GB more vRAM to appease the few people running 1,000+ mods.

2) The future advantage is speculative; based on the evidence we have right now, next-gen titles will be GPU-bound before they become vRAM-bound, and I've given evidence as to why I think that.

3) This is just wrong, and based on your conspiracy-like theories about Nvidia holding back on memory while pocketing the cash for themselves. In the real world, if Nvidia added more GDDR6X chips to the 3070 and 3080 it would cost them more to build, and hence they'd have to sell the cards at a higher price. You'd pay more for your video card, for memory that you cannot use.

No one has given an example of a next-gen game that will benefit from having 16GB over 10GB, or if they have, maybe you can point me to a source?
 
Soldato
Joined
6 Feb 2019
Posts
10,267
The main reason some people think VRAM doesn't matter is that most games use low-quality textures. Every now and again a developer decides to give us native 4K texture assets, and that kills low-VRAM cards. Well, next gen is here, so prepare for that to happen a lot more often, because more and more games will ship 4K textures.
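For a sense of scale, here's a rough back-of-the-envelope sketch (my own illustration, not figures from any particular game; it assumes a full mip chain adds about a third on top of the base level, and typical bytes-per-texel for uncompressed RGBA8 vs a block-compressed format like BC7):

```python
def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    """Approximate GPU memory for one texture; a full mip chain
    adds roughly one third on top of the base level."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

MIB = 1024 * 1024

# Uncompressed RGBA8 is 4 bytes per texel; BC7 is 1 byte per texel.
rgba8_4k = texture_bytes(4096, 4096, 4)   # roughly 85 MiB with mips
bc7_4k = texture_bytes(4096, 4096, 1)     # roughly 21 MiB with mips

print(f"4K RGBA8 with mips: {rgba8_4k / MIB:.1f} MiB")
print(f"4K BC7   with mips: {bc7_4k / MIB:.1f} MiB")

# Even compressed, a few hundred unique 4K materials get close to
# an 8 GiB budget before meshes, buffers and render targets are
# even counted.
print(f"300 BC7 4K textures: {300 * bc7_4k / (1024 * MIB):.2f} GiB")
```

The exact numbers in a shipping game depend heavily on compression format, streaming strategy and how many unique materials are resident at once, but the arithmetic shows why an "Ultra HD" pack of native 4K textures can blow past an 8GB card while the same scene at lower texture resolution fits comfortably.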
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK

The "8GB is enough" crowd did not last very long

Watch Dogs: Legion decimates low-VRAM cards - if you have less than 8.5GB of VRAM and you enable the Ultra HD texture pack, your performance will tank and stutter - oh, and this happens at 1440p LMAO

I'll try to contact them and ask how they are gathering vRAM usage. I'd be willing to bet they're measuring allocation (malloc) rather than memory actually in use. I'll post back here if I get a reply. If/when I get a copy of the game I'll also test it myself.
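To illustrate the allocated-versus-used distinction being argued here, a toy sketch (all names and numbers are made up; this is not how any specific overlay, engine or monitoring tool actually reports memory):

```python
class VramPool:
    """Toy model of an engine's VRAM heap. Engines typically reserve a
    large pool up front and keep old assets cached in it; an external
    tool can only see the size of the pool, not how much of it the
    current frame actually touches."""

    def __init__(self, reserve_mib):
        self.allocated_mib = reserve_mib  # what monitoring tools report
        self.resident = {}                # asset -> MiB currently cached
        self.touched = set()              # assets read this frame

    def load(self, asset, size_mib):
        self.resident[asset] = size_mib   # cached, maybe never read again

    def sample(self, asset):
        self.touched.add(asset)           # asset actually used this frame

    def reported_usage(self):
        return self.allocated_mib         # overstates the real need

    def working_set(self):
        return sum(self.resident[a] for a in self.touched)


pool = VramPool(reserve_mib=7000)         # engine grabs ~7 GiB up front
pool.load("ultra_textures", 3000)
pool.load("old_level_cache", 2000)        # stale, evictable on demand
pool.sample("ultra_textures")             # only this is needed right now

print(pool.reported_usage())  # 7000 MiB: looks nearly "maxed out"
print(pool.working_set())     # 3000 MiB: what the frame really needs
```

The point of the toy: a tool that reads the allocation figure will report the card as nearly full even when most of that memory is reusable cache, which is why allocation numbers alone can't settle whether a game genuinely needs more than 8GB.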
 
Soldato
Joined
28 Oct 2011
Posts
5,861
1) Heavily modded games is about the closest I can come to agreement. There is some evidence that modded games can exceed vRAM budgets, but the measurement has been done badly in most instances I've been presented with. All the metrics I've seen so far do not measure vRAM usage, they measure vRAM allocated, and I've explained at length why that is inaccurate and misleading. I have, however, seen someone explicitly stating they used this metric, in a post on the ResetEra forums about memory measurement, and they admitted they were running over 1,000 mods. Modding is a reasonably niche thing to do even in PC gaming, and people who use so many mods as to exceed vRAM budgets are extremely rare. The simple fact is that these people are outliers, and Nvidia aren't going to increase the cost of something like a 3080 by adding 6GB more vRAM to appease the few people running 1,000+ mods.

2) The future advantage is speculative; based on the evidence we have right now, next-gen titles will be GPU-bound before they become vRAM-bound, and I've given evidence as to why I think that.

3) This is just wrong, and based on your conspiracy-like theories about Nvidia holding back on memory while pocketing the cash for themselves. In the real world, if Nvidia added more GDDR6X chips to the 3070 and 3080 it would cost them more to build, and hence they'd have to sell the cards at a higher price. You'd pay more for your video card, for memory that you cannot use.

No one has given an example of a next-gen game that will benefit from having 16GB over 10GB, or if they have, maybe you can point me to a source?


1. Plenty of people have exceeded VRAM with modding, and that tanks performance; that's a fact. No, they've given consumers the same VRAM as people paid $200 for in 2016, knowing that the majority of buyers will be mugs who can't see that it's a con, people like you.

2. It's not speculative; it's already here and on the way.

3. Who else is pocketing the cash?

You can cling to 2016's VRAM all you like, that's up to you. If you're happy with it, why does it irk you so much that others don't see it that way? After all, you're set for years to come, right?

For someone who works in tech, you're completely clueless (or pretend to be) about how one of the largest tech corps operates as a business. NV have cheaped out yet again on VRAM, and cheaped out with the crappy Samsung 8nm node, preferring to rob people with gimped memory in the hope that AMD couldn't compete. It is blatantly obvious that they intended to sell the gimped versions first, holding back the full-fat 70 and 80 cards, knowing that many afflicted with FOMO and buyer's remorse would buy the real card six months later.

But keep on shillin', at least until the future makes you look stupid. It's quite remarkable that you work in tech but see it as stagnant, thinking these joke VRAM allocations are going to be sufficient for years and years to come. Even weirder that you love the fact that NV has robbed you blind on VRAM.
 