
Is 8GB of VRAM enough for the 3070?

I showed this several pages back: Doom Eternal's real memory usage at 4K Ultra Nightmare is roughly 7GB (for me slightly under), and most of that was wasted texture pool size that wasn't being used for anything useful. It literally just reserves 4.5GB of VRAM up front for texture streaming but provides no benefit from doing so. When set to Low or High, about 1GB is fine.

Flight Simulator, when measured correctly (real usage), tops out at about 9.5GB, but nothing can run it at those settings; even a 3090 chokes on frame rate. To get FS2020 playable you have to drop the settings down, not because of VRAM limits (even the 24GB 3090 chokes) but because the demand on the GPU is so high.
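As a sanity check on those texture-pool figures, here's a rough back-of-envelope sketch of what a fixed streaming pool actually holds. The 1-byte-per-texel (BC7-style) figure and the 4/3 mip-chain overhead are my assumptions, not id Tech internals:

```python
# Back-of-envelope: how much texture data fits in a fixed streaming pool.
# All constants here are illustrative assumptions, not engine internals.

def texture_bytes(width, height, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate VRAM for one texture including its full mip chain.
    bytes_per_texel=1.0 assumes a BC7-style block-compressed format."""
    return width * height * bytes_per_texel * mip_overhead

POOL_GB = 4.5                         # pool reserved up front, per the post
pool_bytes = POOL_GB * 1024**3

one_4k = texture_bytes(4096, 4096)    # ~21.3 MB per 4K texture
print(f"One 4K texture with mips: {one_4k / 1024**2:.1f} MB")
print(f"{POOL_GB} GB pool holds roughly {pool_bytes / one_4k:.0f} such textures")
```

The point being: the pool is a fixed reservation sized for worst-case streaming, so much of it can sit idle, which is why "allocated" VRAM overstates what the game actually needs.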
In the latest Hardware Unboxed review of the 3070, the 3070 lost 20 frames because Ultra Nightmare textures used 9GB of memory at 4K; he had to reduce them to Ultra to bring the usage down to 7GB...

I know there is no difference between Ultra and Ultra Nightmare textures; I'm just saying 8GB is definitely not enough going into the next generation of games at 4K. It will probably give issues in 3 out of 10 games, but that is still quite a lot if those are games you plan on playing at maxed settings.
 
In the latest Hardware Unboxed review of the 3070, the 3070 lost 20 frames because Ultra Nightmare textures used 9GB of memory at 4K; he had to reduce them to Ultra to bring the usage down to 7GB...

You can't tell him, he doesn't care.

He will just bang on and on and on about how it's perfectly fine.

This is exactly what I predicted yesterday. Nvidia wanted to focus on 1440p because they knew it matches a 2080 Ti, which is what they had said in their marketing. But this guy knows more than Nvidia. Guys like that? Don't waste your time on them.
 
In 3 out of 16 reviewed games 8 GB of VRAM is NOT ok for UHD 2160p gaming or higher.

https://www.techpowerup.com/review/?category=PC Port Testing

It won't get better moving forward either. DLSS will in some cases lower the VRAM needed, but that is simply because it is using lower-res textures and sharpening them. Try to remember, a few of the games in the review were using DLSS; they did not paint a picture of what happens when you run the games natively at 4K.

Now if I played every single game out there and smiled along happily, it wouldn't matter so much. However, given I am much more of a "one game" type of person, all three of those would be on my list above all the others. All I play ATM is COD MW and Doom Eternal.

As I said, this is a shame. With 10GB this card would have been an absolute belter.
 
You can't tell him, he doesn't care.

He will just bang on and on and on about how it's perfectly fine.

This is exactly what I predicted yesterday. Nvidia wanted to focus on 1440p because they knew it matches a 2080 Ti, which is what they had said in their marketing. But this guy knows more than Nvidia. Guys like that? Don't waste your time on them.

You must understand what the texture pool size setting is doing before you dismiss what PrincessFrosty is saying.
 
You must understand what the texture pool size setting is doing before you dismiss what PrincessFrosty is saying.

I don't care what it's doing. All I care about is not buying a £500+ GPU for my £1000 PC (rough figure for demonstration purposes) and then having it tank and run games slowly. I don't care about all of the pedantic boring stuff; I just don't want certain games tanking or becoming unplayable.

As such? He or she could have a degree in astrophysics and I really couldn't give a toss. Like I said, if I know I'll have to start cutting settings, I will just buy a console. They are much cheaper.

TBH? I don't even really care about the cut in quality on consoles. I wouldn't; I own three (two XB1X, one PS4 Pro) and I really enjoy using them. However, I did not spend £3000+ on them. Had I done that on a rig and then found myself compromising? Yeah, no thanks.

At least AMD seem to have this covered.
 
You can't tell him, he doesn't care.

He will just bang on and on and on about how it's perfectly fine.

This is exactly what I predicted yesterday. Nvidia wanted to focus on 1440p because they knew it matches a 2080 Ti, which is what they had said in their marketing. But this guy knows more than Nvidia. Guys like that? Don't waste your time on them.

Not everyone wants to play at 4K. Look at the Steam surveys; the market share for 1080p is still huge. Some people want the highest resolution possible with the highest settings. Some want as many frames as possible. Others want that middle ground of 1440p 144Hz as a baseline.
 
You must understand what the texture pool size setting is doing before you dismiss what PrincessFrosty is saying.
He is not interested in understanding.


I don't care what it's doing. All I care about is not buying a £500+ GPU for my £1000 PC (rough figure for demonstration purposes) and then having it tank and run games slowly. I don't care about all of the pedantic boring stuff; I just don't want certain games tanking or becoming unplayable.

As such? He or she could have a degree in astrophysics and I really couldn't give a toss. Like I said, if I know I'll have to start cutting settings, I will just buy a console. They are much cheaper.

TBH? I don't even really care about the cut in quality on consoles. I wouldn't; I own three (two XB1X, one PS4 Pro) and I really enjoy using them. However, I did not spend £3000+ on them. Had I done that on a rig and then found myself compromising? Yeah, no thanks.

At least AMD seem to have this covered.
So then spend £180 more and buy a 3080. It runs every game made up until today without running out of VRAM. All for a third of the price of a 2080 Ti Kingpin to boot! :p:D
 
You know what, it's getting boring now. So much for Doom Eternal strangling an 8GB card:
NVIDIA GeForce RTX 3070 Founders Edition Review - Disruptive Price-Performance | TechPowerUp

Doom Eternal. Ultra Nightmare settings. 4K:

[benchmark chart: Doom Eternal, 4K Ultra Nightmare, RTX 3070 vs RTX 2080 Ti]

Anybody see the problem there? That's right, there isn't one. 1fps between the 11GB 2080 Ti and the 8GB 3070.


I don't care what it's doing.

Then why are you posting at all? Because all you're doing is perpetuating the myth.

When you set a texture pool size to its maximum, you are literally asking the game to reserve more VRAM than it needs, more than your GPU has, and saying **** it. And then you get people making purchasing decisions off the back of that, and people like you dismissing anybody who tries to explain it because you can't be arsed. Well great, stay out and let the rest of us get on with it then.
 
Not everyone wants to play at 4K. Look at the Steam surveys; the market share for 1080p is still huge. Some people want the highest resolution possible with the highest settings. Some want as many frames as possible. Others want that middle ground of 1440p 144Hz as a baseline.

An expensive card is supposed to be a bit future-proof. This is like the 3.5GB GTX 970, but with officially declared specs which the blind sheep will simply accept.
8 GB of VRAM has been mid-range since the RX 480 launched in 2016.
 
8GB has been around for years and years now.

It's blatantly obvious Nvidia were worried about upping it, as then the 3070 would have been the best-value card they'd cracked out in a decade. Instead they have given a questionable amount, and thus know you'll be back for more. Which is fine, so long as people understand that.

And of course it gives the lie to that utter twaddle about the tech being limited which someone keeps banging on about. NV have simply segmented the VRAM on the 70 and 80 cards, selling you gimped cards first, with the real 70 and 80 cards, which you should have got for the gimped price, to follow.

So obvious, yet people choose to defend it...
 
An expensive card is supposed to be a bit future-proof. This is like the 3.5GB GTX 970, but with officially declared specs which the blind sheep will simply accept.
8 GB of VRAM has been mid-range since the RX 480 launched in 2016.

NV have made the gamble that this won't be a huge bottleneck on the card. If a year from now that turns out to be a bad gamble, and AMD give us something future-proof at a really competitive price, then people will make the switch.
 
I've now watched several reviews and the 3070 has some memory issues.

This is easily seen when compared to the 2080 Ti. In several games the 3070 beats the 2080 Ti at 1440p but loses at 4K, and in almost all games the gap between the 3070 and 2080 Ti gets smaller at 4K.

This doesn't necessarily mean the 3070 is running out of VRAM, though. In some cases that might be happening, but I think the more likely culprit is memory bandwidth; the 3070 is well down on bandwidth. This is a topic I brought up in the console discussions a few days ago, when I said next-gen games will be held back by the poor memory bandwidth on the consoles, not the total amount of VRAM.
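To put that bandwidth point in numbers: peak theoretical bandwidth is just bus width times per-pin data rate. A quick sketch using the published specs as I understand them (256-bit/14Gbps for the 3070, 352-bit/14Gbps for the 2080 Ti; treat these as assumptions):

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * per-pin
# data rate in Gbps. Card specs below are assumptions from published figures.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_3070   = bandwidth_gb_s(256, 14)   # 448 GB/s
rtx_2080ti = bandwidth_gb_s(352, 14)   # 616 GB/s
print(f"3070: {rtx_3070:.0f} GB/s, 2080 Ti: {rtx_2080ti:.0f} GB/s")
print(f"3070 bandwidth deficit vs 2080 Ti: {(1 - rtx_3070 / rtx_2080ti):.0%}")
```

A roughly 27% bandwidth gap would bite hardest at 4K, where far more pixels have to be fed each frame, which fits the pattern of the 2080 Ti pulling ahead only at 2160p.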
 
An expensive card is supposed to be a bit future-proof. This is like the 3.5GB GTX 970, but with officially declared specs which the blind sheep will simply accept.
8 GB of VRAM has been mid-range since the RX 480 launched in 2016.

Of course it has; this has been my argument all along, and you can buy a £150 card with 8GB today. Yet people would rather defend NV and have yesterday's RAM for £500 plus.
 
I followed most of the 3000 series thread on here, especially after the announcement/reveal. The consensus was: if you care about 4K, get a 3080; otherwise a 3070 will probably be enough for 1440p gaming. That was well before we saw today's reviews, and nothing has changed in that regard. I don't remember anyone pushing the 3070 as a go-to future-proof 4K card.
 
I've now watched several reviews and the 3070 has some memory issues.

This is easily seen when compared to the 2080 Ti. In several games the 3070 beats the 2080 Ti at 1440p but loses at 4K, and in almost all games the gap between the 3070 and 2080 Ti gets smaller at 4K.

This doesn't necessarily mean the 3070 is running out of VRAM, though. In some cases that might be happening, but I think the more likely culprit is memory bandwidth; the 3070 is well down on bandwidth. This is a topic I brought up in the console discussions a few days ago, when I said next-gen games will be held back by the poor memory bandwidth on the consoles, not the total amount of VRAM.

Yep, it's bandwidth limited all right. The 3070 is definitely positioned as a 1440p card.
 
Of course it has; this has been my argument all along, and you can buy a £150 card with 8GB today. Yet people would rather defend NV and have yesterday's RAM for £500 plus.
You know yourself you cannot compare VRAM alone and ignore performance across the board in actual games. It's a baseless argument unless the £150 card can reach similar FPS across multiple titles.
 
@Grim5 The Xbox Series X has between 336GB/s and 560GB/s of memory bandwidth depending on which memory pool is being used. So for the 10GB of faster RAM, the Xbox Series X actually has more bandwidth than an RTX 3070. The PS5 has 448GB/s with a slower GPU than the Xbox Series X. So I doubt memory bandwidth is a problem for either of them!
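Those console figures fall straight out of the same bus-width arithmetic (a sketch; the bus widths and the 14Gbps GDDR6 data rate are my assumptions from published specs):

```python
# Console memory-pool bandwidths from bus width * per-pin data rate.
# Bus widths and the 14 Gbps rate are assumptions from published specs.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

xsx_fast = bandwidth_gb_s(320, 14)   # Series X 10 GB pool: 560 GB/s
xsx_slow = bandwidth_gb_s(192, 14)   # Series X 6 GB pool: 336 GB/s
ps5      = bandwidth_gb_s(256, 14)   # PS5 unified 16 GB: 448 GB/s
print(f"XSX fast/slow: {xsx_fast:.0f}/{xsx_slow:.0f} GB/s, PS5: {ps5:.0f} GB/s")
```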
 
An expensive card is supposed to be a bit future-proof. This is like the 3.5GB GTX 970, but with officially declared specs which the blind sheep will simply accept.
8 GB of VRAM has been mid-range since the RX 480 launched in 2016.

I had a 970. I was pretty miffed to find out the truth behind it. However, the truth didn't come out until later on. The only blind sheep are the people who bought them after the situation became clear, without reading reviews.

What has the amount of VRAM on Ampere got to do with the 970?
 