Is 8GB of VRAM enough for the 3070?

Soldato
Joined
28 Oct 2011
Posts
8,405
Because more is better. That doesn't mean it will all be utilised just because it is available.

Why buy 32GB of DRAM when 16GB is more than enough for gaming? Because people think more is better. (Unless you are running certain workloads that can utilise all the memory, it is only bragging rights...)

I'm happy to run any test to prove a point with a 6GB card at 4K.
But don't take my word for it; there are plenty of videos on YouTube with side-by-side comparisons of a 3070 running a variety of games at 4K with no stutters, slowdowns etc.

Yep, greed: gimp the initial cards, then sell them with the VRAM they should have had in the first place for more money.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Yeah, I was one of those guys saying that allocation and usage are different. But after seeing the actual results, I now don't think 8GB will cut it for 4K for long. Sure, it will play any game out right now, including MSFS 2020 and Doom Eternal at high texture settings, but next-gen games? Definitely not maxed, but at high/medium settings? Sure.

10GB is also cutting it close, but it looks like it will be enough for two years or so at 4K...
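
Side note for anyone who wants to watch the allocation-versus-usage thing on their own card: a minimal sketch using NVML via the pynvml package (assuming an Nvidia GPU and `pip install pynvml`) polls the device-level VRAM figure once a second. Bear in mind this number is allocation, the same thing most overlays report, so a game sitting near 8GB here may still be actively using less:

```python
# Poll device-level VRAM with NVML (assumes an Nvidia GPU and `pip install pynvml`).
# Note: "used" here is memory *allocated* on the device -- the figure overlays
# show -- which can overstate what a game is actively touching.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 2**30:.2f} GiB allocated "
              f"of {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Run it in a second window while a game is going and you can see whether the figure actually climbs to the card's limit or plateaus well below it.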

I am planning to buy for 1440p/144Hz. What are your thoughts on the 3080 10GB at 1440p/144Hz? Do you think 10GB is enough for 1440p? I have heard people saying it's kind of on the low end going forward at 4K, but I'm kinda curious to see what y'all think about 1440p...

I am going to wait until February 2021. If they release a 3080 Ti with 12GB of GDDR6X (at a competitive price) I might grab that one; it is the latest GPU being leaked right now, along with a 3070 Ti with 10GB of GDDR6X.

A dev cycle for Nvidia or AMD is usually two years. That means that when they sell you a product? they design it to last for two years.

If they didn't? then they know for a fact people would stick to what they had, like many did with Turing. Most just held onto their Pascal cards. It obviously took longer than that with Ampere, but that was only because AMD released some mid-range cards, so Nvidia could shift all of their stock before releasing Ampere.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Right, so why the need for the proposed 16GB and 20GB variants then?

https://www.overclock3d.net/news/gp...d_rx_6800_16gb_gpus_spotted_in_eec_listings/1

Even the entry-level cards have 12GB.

Just remember, GDDR6 costs AMD as much as it costs Nvidia. So why would AMD deliberately put 4GB "too much" on their cheaper cards?

Could it possibly be because they designed and manufactured the new consoles and have a ruddy good idea of how much VRAM a PC will need in the wake of them? or are they just wasting their time and money?

Even their last-gen cards had 8GB, and that was what? Two years ago.
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
https://www.overclock3d.net/news/gp...d_rx_6800_16gb_gpus_spotted_in_eec_listings/1

Even the entry-level cards have 12GB.

Just remember, GDDR6 costs AMD as much as it costs Nvidia. So why would AMD deliberately put 4GB "too much" on their cheaper cards?

Could it possibly be because they designed and manufactured the new consoles and have a ruddy good idea of how much VRAM a PC will need in the wake of them? or are they just wasting their time and money?

Even their last-gen cards had 8GB, and that was what? Two years ago.
More like they put in more VRAM to have something other than price to fight NV with at the same performance level.
If you've got two cards with the same price and performance but one has 4GB more, what will you buy?
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
8GB is not enough.

If Nvidia figured two years ago that the 2080 Ti needed 11GB, it is only logical to think that the 3070 is going to need similar, as it has similar performance.

The big difference between the 2080 Ti and 3070 is that you cannot SLI the latter, which avoids some very heavy memory-usage situations at 2160p.

Another thing to remember is that the 3070 is a cheaper mid-range card, so to be fair Nvidia are entitled to cut a few corners with things like memory size.

The 3070 is not the Ampere flagship or anywhere close to it, but it does offer very good performance/value.
I'm with you on this. As I see it, the 3070 is a 1440p card where 8GB is enough. For 4K, 10GB is enough for two years.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Surely if Nvidia sells a lot of 8GB cards then game developers will make sure their games work on 8GB? Surely developers want sales? Why would they aim for high-end users only?

The first thing game devs do is make their games run on a console. That is always the first port of call and has been for many years. Ever since Crysis.

After that they need to make it run on a PC. Meaning more assets, more code, more textures. More of everything basically. At which point they can decide what to add for the PC, if anything. As soon as they do that requirements go up.

Do you think they use 3070s when coding a game to test it? or do you think they use the best? and that has always been the crux. They will use whatever they can to make their game run. It's all about money, and not spending anything you don't have to. Now there are exceptions to this rule. Rockstar have been really good since GTA4. They delay the game and put in the time and money and get it as good as they can. However, system spec on a PC will always need to be higher than a console because of the variances in hardware.

There will come a cut-off point where getting a game running well on less VRAM or an older GPU will simply be uneconomical, and thus they will just release it anyway.

For the past few years? they had to make games run on the consoles, meaning minimum specs on a PC have been really mild. However, crank that up to a console with 2080-level power, maybe a little more? the minimum specs will increase a lot.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
I'm with you on this. As I see it, the 3070 is a 1440p card where 8GB is enough. For 4K, 10GB is enough for two years.

I had about four posts I was going to reply to with this but yours was the latest :D

What irks me the most about the 3070 is that it is fast. Really fast. For £500? it is a pretty epic bargain. However, it should be able to do 4K. Times have changed and we have moved on. At one point the Fury X was a 4K card. At one point the 980 Ti and 1080 Ti were 4K cards. They didn't cost much more than the 3070, and it is now what? Four, five years later.

It seems the only thing stopping it being a true 4K card is the memory. Note I mean that broadly, i.e. it could well be a combination of not enough capacity and not enough bandwidth. However, that's a shame, because it means at 4K you are better off buying a used 2080 Ti. I would like to have seen a scenario where that wasn't a good idea.

That is why I am most annoyed and disappointed. Everything on the card itself is more than capable of 4K, yet they have derped it by not giving it enough VRAM for now, let alone the future. Had they upped it? then the bandwidth would have gone up also, and it would have lasted you two to three times as long. Of course I don't expect that to come for free, and I would have been happy to see it cost £100 more. However, if it is to be taken at face value (i.e. as a 1440p card) then, like I have said, it's quite expensive. The 2080 Super will do just as well at 1440p for gaming (obviously the FPS will be lower, but the experience would be more than good enough).

As I said though, the most annoying thing is that this would be a perfectly capable 4K card had they not derped it. Which is a shame.
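
On the capacity-plus-bandwidth point: peak GDDR6 bandwidth is just the effective data rate per pin times the bus width, and adding memory chips normally means a wider bus. A quick back-of-envelope in Python (the 3070 and 2080 Ti numbers are the published specs; the 10GB/320-bit variant is purely hypothetical):

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(14, 256))  # RTX 3070: 8GB GDDR6 on a 256-bit bus -> 448.0 GB/s
print(bandwidth_gb_s(14, 352))  # RTX 2080 Ti: 11GB on a 352-bit bus   -> 616.0 GB/s
print(bandwidth_gb_s(14, 320))  # hypothetical 10GB 3070 on 320 bits   -> 560.0 GB/s
```

So more chips on a wider bus would have meant both more capacity and more bandwidth, which is exactly why upping the VRAM would have helped it as a 4K card.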
 
Soldato
Joined
28 Oct 2011
Posts
8,405
https://www.overclock3d.net/news/gp...d_rx_6800_16gb_gpus_spotted_in_eec_listings/1

Even the entry-level cards have 12GB.

Just remember, GDDR6 costs AMD as much as it costs Nvidia. So why would AMD deliberately put 4GB "too much" on their cheaper cards?

Could it possibly be because they designed and manufactured the new consoles and have a ruddy good idea of how much VRAM a PC will need in the wake of them? or are they just wasting their time and money?

Even their last-gen cards had 8GB, and that was what? Two years ago.


If that's the case at 5PM then that's one in the eye for NV, and for those that keep defending them continually cheaping out on VRAM. AMD were giving people 8GB lower-end cards in 2016; the 470s, 480s, 580s and 590s all had 8GB variants while NV was flogging more expensive cards with piffling 3GB and 6GB. Over the last 4-5 years AMD have consistently either matched or bettered the NV equivalent in this area. I think it's probably a bit of both: they're continuing to offer better VFM, and they have a good handle on what will be required going forward.

Still need to see the whole picture though.
 
Associate
Joined
2 Feb 2018
Posts
237
Location
Exeter
I'll leave this here for you to decide.

2060 6GB vs 1070 Ti 8GB at 4K. Watch the VRAM closely on both cards across each game.

Right now, you can game with maxed-out settings with only 6GB of VRAM at 4K. But it is not future-proof. An 8GB card will last a couple of years before we need more.
So buying a 3070 with 8GB is fine if you plan to upgrade in that time frame. A 3080 10GB will be fine to skip a GPU generation and upgrade on the next.
A need and a want are two different things, so don't be fooled by "more is better" unless you can utilise it all in a certain workload...

 
Associate
Joined
17 Jul 2013
Posts
74
The first thing game devs do is make their games run on a console. That is always the first port of call and has been for many years. Ever since Crysis.

After that they need to make it run on a PC. Meaning more assets, more code, more textures. More of everything basically. At which point they can decide what to add for the PC, if anything. As soon as they do that requirements go up.

Do you think they use 3070s when coding a game to test it? or do you think they use the best? and that has always been the crux. They will use whatever they can to make their game run. It's all about money, and not spending anything you don't have to. Now there are exceptions to this rule. Rockstar have been really good since GTA4. They delay the game and put in the time and money and get it as good as they can. However, system spec on a PC will always need to be higher than a console because of the variances in hardware.

There will come a cut-off point where getting a game running well on less VRAM or an older GPU will simply be uneconomical, and thus they will just release it anyway.

For the past few years? they had to make games run on the consoles, meaning minimum specs on a PC have been really mild. However, crank that up to a console with 2080-level power, maybe a little more? the minimum specs will increase a lot.

Can you name any specific examples where a crappy console port was a crappy console port specifically because even the smallest available textures were too big for most cards on the market?

Sure, there comes a point where they say "yeah, we're not going to support less than card x", but usually that's because card x is a good few years and 2-3 generations old, by which point you say 'fair enough', and most people have moved on to something else by then anyway. You just have to look at things like the Steam surveys: the x080/x090 series cards really aren't much of the market in volume terms.
 
Soldato
Joined
28 May 2007
Posts
10,067
If that's the case at 5PM then that's one in the eye for NV, and for those that keep defending them continually cheaping out on VRAM. AMD were giving people 8GB lower-end cards in 2016; the 470s, 480s, 580s and 590s all had 8GB variants while NV was flogging more expensive cards with piffling 3GB and 6GB. Over the last 4-5 years AMD have consistently either matched or bettered the NV equivalent in this area. I think it's probably a bit of both: they're continuing to offer better VFM, and they have a good handle on what will be required going forward.

Still need to see the whole picture though.

Yeah, even before those cards we had the 390 with 8GB at the mid-range. People can argue all they want about whether it's enough to game with, but for the money being paid out it's definitely not enough. For people like me, the GPU power is there, but as someone who usually keeps cards three years and only upgrades when my hardware struggles, I would not see the 3070 as a good investment.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
More like they put in more VRAM to have something other than price to fight NV with at the same performance level.
If you've got two cards with the same price and performance but one has 4GB more, what will you buy?

As this thread has shown, many don't seem to give a toss about how long the card will last, only how it performs now. If they did? they would understand why 8GB is not enough.

AMD have historically been quite generous with VRAM, until they learned a harsh lesson with the Fury X. They have since seen the error of their ways, thank god.

Remember, it's always better to have more than enough of anything. "Just enough" is never going to remain that way.
 
Soldato
Joined
28 Oct 2011
Posts
8,405
Yeah, even before those cards we had the 390 with 8GB at the mid-range. People can argue all they want about whether it's enough to game with, but for the money being paid out it's definitely not enough. For people like me, the GPU power is there, but as someone who usually keeps cards three years and only upgrades when my hardware struggles, I would not see the 3070 as a good investment.

Yep, that's been my point all along, and historically a 70 with 12GB would be the much better card. £500 for 8GB of VRAM does not compute with me.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Can you name any specific examples where a crappy console port was a crappy console port specifically because even the smallest available textures were too big for most cards on the market?

Sure, there comes a point where they say "yeah, we're not going to support less than card x", but usually that's because card x is a good few years and 2-3 generations old, by which point you say 'fair enough', and most people have moved on to something else by then anyway. You just have to look at things like the Steam surveys: the x080/x090 series cards really aren't much of the market in volume terms.

There is no such thing as a console port, for starters. I'm not going to explain how games are made, but if you understood you wouldn't call them that. "Cross-coded slop"? maybe, but they are not ports.

I've seen lots of games come out that can't even be run on the highest-end hardware. GTA4? if you touched the slider you would immediately run out of VRAM, even on quad SLI with two 295s. When questioned about it Rockstar said "higher visual settings are for future systems". Now sure, they learned their lesson. However, that wasn't the only example. Crysis was the same; it was many years before a system that could run it well even existed. Another one? Sleeping Dogs. Again, if you cranked up the settings you would soon find that no system on the market could run that sucker at full settings.

Which, to many, isn't a problem. They will lower settings. However, when you consider that back then two 295s cost you about £1200? it's inexcusable. I.e., that is cross-coded slop.

Optimisation costs time and money. As such, since the consoles went x86 it has been waaaay better than it used to be. If anything, minimum specs have dropped compared to how they used to be, to the point where you can pick up something like an RX 580 and play at 1080p quite happily.

However, there is a reason for that, and it's mostly because the XB1X has a very similar GPU, and devs are coding for that GPU to run games at 4K. When given double that much VRAM and a much faster GPU? what do you think will happen?
 
Associate
Joined
2 Feb 2018
Posts
237
Location
Exeter
Yep, that's been my point all along, and historically a 70 with 12GB would be the much better card. £500 for 8GB of VRAM does not compute with me.

I agree.

Nvidia have clever marketing strategies.
Nvidia set everyone's focus on the prices being amazingly affordable compared to the previous gen, which had a huge price hike, to make Ampere look good value... I'll repeat: £500 is not cheap for a mid-range card, regardless of whether it performs as well as the previous gen's top card, which was overpriced.

Now, if the 2080 Ti had been £700, like the previous top-end 1080 Ti, then the 3070 wouldn't look that appealing at £500, especially when you see that the GTX 1070 was £350 and the RTX 2070 was £550.
The top card, the 3090, is not a Titan; it is a GeForce card using GeForce drivers, and should be branded the 3080 Ti to keep it in line with previous naming. It is more than double what a 1080 Ti was priced at, and £300 more than the 2080 Ti. Yet the performance difference between the 3080 and 3090 is ~10%, not worth the £800 premium they want.
The 3090, or call it the 3080 Ti, should be back in line at £700, and all the other 3xxx cards should follow suit from the 1xxx series pricing if they really want to make Ampere prices reasonable.
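
To put rough numbers on that premium (assuming UK launch prices of about £649 and £1,399, and taking the ~10% figure above at face value):

```python
# Rough price-per-performance comparison. Prices are approximate UK launch
# MSRPs (assumed); the ~10% 3080 -> 3090 gap is the figure quoted above.
cards = {
    "RTX 3080 10GB": {"price_gbp": 649, "rel_perf": 1.00},
    "RTX 3090 24GB": {"price_gbp": 1399, "rel_perf": 1.10},
}

for name, c in cards.items():
    value = c["price_gbp"] / c["rel_perf"]
    print(f"{name}: £{value:.0f} per unit of 3080-level performance")
# -> roughly £649 vs £1,272: the 3090 costs about twice as much
#    per unit of performance on these assumptions.
```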
 
Associate
Joined
17 Jul 2013
Posts
74
There is no such thing as a console port, for starters. I'm not going to explain how games are made, but if you understood you wouldn't call them that. "Cross-coded slop"? maybe, but they are not ports.

I've seen lots of games come out that can't even be run on the highest-end hardware. GTA4? if you touched the slider you would immediately run out of VRAM, even on quad SLI with two 295s. When questioned about it Rockstar said "higher visual settings are for future systems". Now sure, they learned their lesson. However, that wasn't the only example. Crysis was the same; it was many years before a system that could run it well even existed. Another one? Sleeping Dogs. Again, if you cranked up the settings you would soon find that no system on the market could run that sucker at full settings.

Which, to many, isn't a problem. They will lower settings. However, when you consider that back then two 295s cost you about £1200? it's inexcusable. I.e., that is cross-coded slop.

Optimisation costs time and money. As such, since the consoles went x86 it has been waaaay better than it used to be. If anything, minimum specs have dropped compared to how they used to be, to the point where you can pick up something like an RX 580 and play at 1080p quite happily.

However, there is a reason for that, and it's mostly because the XB1X has a very similar GPU, and devs are coding for that GPU to run games at 4K. When given double that much VRAM and a much faster GPU? what do you think will happen?

I agree there are lots of games where the hardware to run them smooth as silk at 100fps on ultra settings doesn't exist at the time of launch (flight-sim titles have been notorious for that), but that's often for reasons other than purely running out of VRAM. A video card that ground to a halt with Crysis wasn't suddenly going to fly if it were somehow possible to simply double the RAM in it.
 
Soldato
Joined
19 May 2012
Posts
3,633
So every console is packing more than 8-10GB of VRAM, every AMD card has more than 8-10GB of VRAM...

But NVIDIA are telling us 8-10GB is enough? OK...
 
Soldato
Joined
19 May 2012
Posts
3,633
Surely if Nvidia sells a lot of 8GB cards then game developers will make sure their games work on 8GB? Surely developers want sales? Why would they aim for high-end users only?


No.

Games have texture options, so 8GB owners will just have to use high or medium textures whilst people with higher-VRAM cards can use the ultra textures. And textures make a MASSIVE difference to visual fidelity IMO.
Most of the games for next gen are boasting photorealistic textures, and they aren't cheap.
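
A back-of-envelope sketch of why they aren't cheap, assuming uncompressed RGBA8 with a full mip chain (real games use block compression, which cuts these figures by roughly 4-8x, but the scaling is the same):

```python
# Rough VRAM cost of one texture: width * height * bytes per texel, with a
# full mip chain adding about a third on top (geometric series of mip levels).
def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    base = width * height * bytes_per_texel
    return (base * 4 / 3) / 2**20  # include mips, convert to MiB

for size in (2048, 4096, 8192):
    print(f"{size}x{size} RGBA8: {texture_mib(size, size):.0f} MiB")
# 2048 -> ~21 MiB, 4096 -> ~85 MiB, 8192 -> ~341 MiB per texture.
# A few hundred unique high-res textures resident at once adds up fast,
# which is why the ultra texture tier is usually the first VRAM casualty.
```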
 