
Is 8GB of VRAM enough for the 3070?

I have an RTX 3070 and I am gaming at ultrawide 1440p (3440x1440). It's sad that the card performs at roughly 2080/2080 Ti levels but is being held back by its VRAM, at least on this title. It's depressing to look at. Future proof my ass; I bought it specifically for the Cyberpunk launch, as it was the only RTX card I could find for a not-over-the-top price in my area. Being new to RTX (I upgraded from a GTX 1080), I of course started having weird lag spikes and began to monitor everything. The major lag spikes start happening when the game reaches 7.9GB of VRAM: GPU usage quickly dips, resulting in a stutter, then climbs back up, which I assume is down to the VRAM.
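
(For anyone who wants to log this themselves rather than eyeballing an overlay, here is a minimal sketch using the nvidia-ml-py / pynvml bindings. The 7.8GB threshold is just where I personally start seeing stutter, and having the package installed is an assumption on my part.)

```python
# Minimal VRAM logger using the nvidia-ml-py (pynvml) bindings.
# Assumes the package is installed (e.g. `pip install nvidia-ml-py3`)
# and an NVIDIA driver with NVML support.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        used_gb = mem.used / 1024**3
        print(f"VRAM used: {used_gb:.2f} GB / {mem.total / 1024**3:.2f} GB, "
              f"GPU util: {util.gpu}%")
        if used_gb > 7.8:  # where I start seeing stutter on the 8GB 3070
            print("  -> near the 8 GB ceiling, expect spikes")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```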

So, no, even if Cyberpunk is the new "Crysis" of our time, the 3070 isn't a 1440p card, and I'm surprised it is being marketed as such... It's a 1080p card if you want to run the higher settings; for 1080p the 8GB will be more than enough. Honestly, I am not sure what I expected. I guess I should have heeded all the "doomsayers" when the cards were announced and people were yelling about the VRAM on the 3070/3080. This really is the first time I have been disappointed in Nvidia. It is a major disappointment to have DLSS and RT options available only to have the game tell me **** you because of the VRAM. FYI, if I turn the RT options off altogether, the 8GB is enough.

Whoever is saying "muh silly 8GB elitism" just doesn't know what they're talking about. No card released in 2020 should have to struggle with its own VRAM. VRAM should only become an issue if you force some unreasonable 4K RTX **** on it. I saw the 3070 being marketed as "good for 1440p", which is the only reason I am mad. That, and I shouldn't have been in the "muh silly 8GB elitism" camp myself beforehand.
The game is essentially a hack; the only reason 8GB is probably enough without ray tracing is that there are about five stages of detail for items in the game world, and the engine is constantly swapping between them as you move about.

Walk right up to a black bollard (the kind with the yellow ring), then slowly tap back to walk away, and you will see it transition multiple times until the top eventually becomes flat.

These transitions are extremely noticeable for me, I suspect because of memory bandwidth on my Vega, my system, or my SSD.
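
(To illustrate what I mean by "stages of detail": it is just distance-based LOD swapping. The sketch below is a toy example, not REDengine code; the five levels and the distance thresholds are made up purely for illustration.)

```python
# Toy illustration of distance-based LOD selection - NOT the actual
# Cyberpunk/REDengine code, just the general idea of "stages of detail".
from dataclasses import dataclass

@dataclass
class LODLevel:
    name: str
    max_distance: float  # metres at which this level is still used

# Hypothetical five stages, highest detail first (thresholds invented).
LOD_LEVELS = [
    LODLevel("LOD0_full_mesh",   10.0),
    LODLevel("LOD1_high",        25.0),
    LODLevel("LOD2_medium",      60.0),
    LODLevel("LOD3_low",        150.0),
    LODLevel("LOD4_billboard", 1000.0),
]

def pick_lod(distance_m: float) -> LODLevel:
    """Return the most detailed LOD whose range covers the distance."""
    for level in LOD_LEVELS:
        if distance_m <= level.max_distance:
            return level
    return LOD_LEVELS[-1]

# Walking away from the bollard: the mesh is swapped several times,
# which is the "popping" transition you can see in game.
for d in (2, 15, 40, 100, 300):
    print(d, "m ->", pick_lod(d).name)
```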
 

Well, to be fair, I am pushing the game a LOT, since I am moving fast through the city. I am just stress testing it the same way I did with GTA V when it came out. At first I actually thought my GPU was somehow faulty, as I started noticing random spikes in GPU usage when these stutters happened. I was quite baffled, so I ran some stress tests, and my GPU sat at a solid 98-99% usage the entire time. So I decided to look into the resources more and put the game through some rounds. This is what I noticed.

I am not completely dismissing the 3070. For the money it is a beast, since you get performance similar to a 2080 Ti (albeit a tad lower in some cases). But the difference is that the 2080 Ti has more VRAM. Either way, my resolution isn't helping me here, as I have roughly a third more pixels to cover compared to traditional 1440p (quick pixel maths below). I am just wondering what compromises I should make to lower the VRAM usage. Could anyone running the game at 3440x1440 chime in?
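
(Quick pixel maths behind the "more to cover" point, if anyone wants the actual numbers:)

```python
# Pixel-count comparison for the resolutions discussed in this thread.
resolutions = {
    "1080p":           (1920, 1080),
    "1440p":           (2560, 1440),
    "ultrawide 1440p": (3440, 1440),
    "4K":              (3840, 2160),
}

base = 2560 * 1440
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>16}: {pixels:>9,} px  ({pixels / base:.2f}x of 16:9 1440p)")

# Ultrawide 1440p works out to ~1.34x the pixels of 2560x1440,
# i.e. roughly the "third more to cover" mentioned above.
```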
 
There was an MSI 3070 Trio going for ages on a competitor's site. It was £640 and of course it eventually sold. I refuse to buy an 8GB card now. I want a 6800 XT, or even a 3080 despite the lack of VRAM. Might hold out for next year's 3080 Ti though.
 

I honestly did not expect VRAM to be an issue. I just blindly trusted it being a "new gen" card. For all those years VRAM was genuinely over-provisioned on new cards; most games even have a VRAM indicator in the settings to let you know if you are going overboard. The last time I remember VRAM being an issue was probably around GTA IV, way before the modern Nvidia/AMD architectures, and the game was new while my card was old, so that was normal... This, though... surprising! I got my 3070 for around that price too... ehh...
 
Pretty crazy. Not surprising though; the 2080 Ti seems to tank in that game as well without DLSS.


Sub-40fps average at 1080p with a 2080 Ti. Yeehaaaw. I'm not even sure if that's with the texture pack or not :o

I'm more shocked that a next gen game looks like it could have been released 15 years ago. The graphics in that are proper turd.
 
GTA IV, another badly optimised game like Cyberpunk 2077...

I've been gaming at 4K since 2013, when VRAM really was an issue with 2GB GPUs.
When the 1080 Ti came out, it was the first time I could play games at 4K on a single GPU rather than relying on SLI or CrossFire.

Back then, anything over 24fps was accepted as playable (you know, because films run at 24fps). Consoles mainly ran at 30fps. 60fps was the target for silky-smooth gameplay.
Today people want 144fps. Quite a jump, hey?
Well, at 4K that is just not going to happen in modern titles.
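
To put that jump into frame-time terms, it is just 1000 ms divided by the target fps:

```python
# Frame-time budget for the fps targets mentioned above: 1000 ms / fps.
for fps in (24, 30, 60, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")

# 24 fps -> 41.7 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms.
# Going from "playable" 24/30 fps to 144 fps means the GPU must finish each
# frame in roughly a fifth of the time - at 4K that budget is brutal.
```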

I currently have a 49" 4K monitor and an LG 38GL950G (3840x1600 ultrawide), powered by a 3070.
Pretty much every title I throw at them can run maxed out at 4K with no VRAM issues.
(I was even running a 5600 XT with 6GB for a while and did not run into VRAM limits.)
Cyberpunk at the moment is so unoptimised that even a 3090 struggles to run it. The engine does not have ray tracing baked in (it was not built from the ground up around ray tracing); it is an add-on, as Nvidia is pushing RT hard on us as the future tech, along with DLSS.

For now, just turn down some settings in Cyberpunk if you want a playable experience. I'm sure patches will come out to help optimise it, but on a game as big as this, don't expect a god-like patch to fix everything.

Typically, as a coder, you fix one thing and it breaks something else. Nothing seems to be bug-free these days.
 

True, true, it is not 100% as optimised as it could be. After all, the game is huge and, as someone mentioned, objects are rendered with about five LOD stages at a time. It is a next-gen title, albeit really buggy and glitchy.

Still, the more future-proof card is always going to be the one with the extra VRAM. Sadly I will probably need to sell this card and get the Ti version just to be able to game at 1440p in the near future (well, even today, LOL). We'll see what AMD has in store in terms of RT capabilities. I have supported Nvidia as they have always delivered, but this time I really can't say good things about their decision to skimp on VRAM on this card. If you do not believe me, check the benchmarks for the 2080 Ti against the 3070: you really notice the VRAM difference even though it's not that much (3GB?).
 
Remember Cyberpunk has been in the making for eight years, with most of the budget spent on PR and marketing. Add in multiple delays, and it still launched with a lot of bugs and is very unoptimised.
No god-like patch will fix it...
By the time the next-gen GPUs come out, Cyberpunk will be shelved, as we will be seeing a lot of games whose engines have RT built in from the ground up to make use of the next-gen consoles' hardware, and those will be better optimised, look better and play better.

8GB will be enough VRAM for 4K until the next-gen GPUs launch. (If I were hitting limits, my opinion would change to reflect that, but that is not the case right now.)

Many reviewers run their tests with maxed-out settings, maxed AA and AF, including motion blur (who actually uses motion blur? I don't; turn that off and gain a little GPU time).
My point is, some settings cost more in performance than they give back in eye candy.
Depending on the size of your 4K monitor, some people find they can run 4K with no AA or AF and no visible jaggies, so your mileage will vary with your monitor's PPI and how close you sit to it.
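
If you want to put numbers on the PPI point, it is just the diagonal pixel count divided by the diagonal in inches; using my two screens as examples (marketed sizes, so treat the results as approximate):

```python
# PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

# The two screens mentioned above, using their marketed sizes.
print(f'49" 4K:          {ppi(3840, 2160, 49):.0f} PPI')   # ~90 PPI
print(f'38" LG 38GL950G: {ppi(3840, 1600, 38):.0f} PPI')   # ~109 PPI

# Lower PPI (or sitting closer) makes jaggies easier to see, which is why
# some people can skip AA at 4K and others cannot.
```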

Something else to consider:
As I play around with all the latest tech from both sides, I notice little things, like how TAA behaves on AMD versus Nvidia.
Take SOTR: TAA on an AMD GPU makes the image so blurry that adding a lot of sharpening still doesn't fix it. Running with AA off gave a higher-clarity image on the AMD GPU, but then you had to put up with jaggies, so it's a compromise.
TAA on an Nvidia GPU gives higher clarity than on AMD, and just looks so much better.

"Shadows" is another thing, the difference between High and Ultra in many games is so minimal, that the cost of performance out weighs the tiny extra clarity you get.

"Volumetric"
Usually clouds and/or fog. This is very hard on GPU time. Running Medium in many games can give you a massive performance boost without ruining the experience.

Anyway, I could go on all day about this, but don't get tied up in "I must run it maxed out", as some game engines have settings that can break the engine and ruin the experience.
 

I agree with your points... but I was simply pondering whether 8GB is really "future proof", or even enough, for my particular resolution of 3440x1440. That is not 4K, but ultrawide 1440p. I am not going to be really butthurt about this, as it is what it is, but I may just try to offload the card and get a 3080 or 3070 Ti instead when that becomes possible. I 100% think that 10GB (or 16GB for a 3070 Ti?) is safer for future proofing than 8GB, no?

If we really assume that RT is something that will be built in from the ground up as a "standard" (which I doubt; it will remain an extra "fidelity option" for the next few years, since not everyone has an RT-capable card and AMD is also struggling with it) rather than added as extra functionality (and honestly, isn't it just an added option, like they did with Quake II or Minecraft? RTX in Cyberpunk looks hella immersive and is the best I have yet seen in any title), then sure, I would say that Cyberpunk is definitely not something to follow as an example, as it is just poorly optimised for any kind of card, including the 3090.

Do you mean the Sampler Feedback Streaming feature?


But the reality is that I am simply hitting a VRAM cap on my card, even though I am running what I found to be the most optimal settings for my setup in terms of FPS and visual fidelity/smoothness. I have seen a few of the more critical tech reviewers mention VRAM issues in Cyberpunk at 1440p specifically on the RTX 3070. I understand reviewers leave all the settings on, but I have settled on optimised settings for maximum visuals with the least impact on performance, and this problem still occurs for me. Doesn't that kind of prove my point that this card is held back by VRAM, at least in this particular title?
 
If someone is gaming at 3440x1440 on an RTX 3070, I would love for them to chime in, haha.

Also, if anyone doubts that 8GB is laughable for this card, have a look at the benchmark done here:

 
I'm running 3440x1440 and have not run into VRAM issues yet. Cyberpunk is enough to bring the card to its knees with RTX on, but that's not a VRAM issue.

Are you sure it's not a VRAM issue? Have you monitored the resources and FPS? What settings are you running?

I have my monitoring up and running on a second screen. The game runs like butter on Performance DLSS with RT on (medium) and the Digital Foundry optimised settings for best performance (though with Performance DLSS it's blurry at times). Alternatively, it runs even better with Quality DLSS and RT off: 60-80fps and no annoying blur, thanks to the Quality DLSS setting. That was on the Ultra preset with RT completely off (and I imagine the optimised Digital Foundry settings would push it even higher, probably into the 100fps range at times, thanks to DLSS).
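
(For context on why Performance DLSS looks soft at this resolution, here is the rough internal render resolution per mode, using the commonly quoted DLSS 2.x per-axis scale factors; treat the exact numbers as approximate.)

```python
# Approximate internal render resolution for DLSS 2.x modes at 3440x1440.
# Scale factors are the commonly quoted per-axis values; actual values may differ.
modes = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3440, 1440
for name, scale in modes.items():
    print(f"{name:>17}: ~{round(out_w * scale)}x{round(out_h * scale)} "
          f"-> upscaled to {out_w}x{out_h}")

# Performance mode renders at roughly 1720x720, which is why it can look
# blurry at this resolution compared to Quality (~2293x960).
```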

HOWEVER... with RT on, VRAM usage is constantly hovering at 6-7GB+ in the more intense areas where you roam the city (especially where there is a lot of vertical geometry). A stutter/FPS drop happens if the VRAM tries to climb above 7.8-7.9GB. The FPS drop is just like in that benchmark (at the 3:40 timestamp you can see it dip to almost half the FPS because of the VRAM limit).

I assume that if you play with RT off, the game doesn't draw enough VRAM to go overboard? From my brief testing, VRAM usage seemed to be around 1-2GB lower.

I'm not trying to be pretentious, just curious whether I am the only one noticing this on this particular card, at least in Cyberpunk. Again, this isn't an optimised title, and a GPU should not be judged by one game. But the benchmark video and my own tests simply show scenarios where VRAM limits this card. I am sure the story would be different at 1080p, as there would be more "safe" headroom for the VRAM. If I am flat out wrong, feel free to correct me; I mean, you are also at 3440x1440. The game is annoying at this res anyway, as you have black bars everywhere: menus, inventory, hacking, etc.
 
Back then, anything over 24fps was accepted as playable (you know, because films run at 24fps). Consoles mainly ran at 30fps. 60fps was the target for silky-smooth gameplay.
Today people want 144fps. Quite a jump, hey?

Hah, I do not remember accepting such a thing back then :p I even bought a second GTX 580 so I could improve my frames in BF3. Even further back I was chasing frames; I think you could jump a bit higher at around 125fps(?) in Quake 3, for example. I blame the timedemo function in Quake 1...
 
I am looking at options for replacing the 3070 with a 3080. But the price for the 3070 is around €700 while a 3080 is around €1k. Would you say the €300 difference is worth it, or are both cards just obscenely overpriced? I could sell my 3070 for almost the same price and opt for the 3080 instead, as I see one pop up as available from time to time.
 
Is EU = euros? If so, that is a rip-off. I would say hold off till prices stabilise. You could also take a gamble on the incoming Ti cards.
 
I'd always put a limit of £200 over the price of a reference card, with that being the extreme end, IMHO.

So with the 3080 FE at £649, the ideal AIB for me would be no more than £750; there is the odd one still about going for that, but mostly they are £900+.

This is just my personal opinion based on what I would spend.
 