
Is 8GB of VRAM enough for the 3070?

Can't say I bought a 3070 with any intention of playing at 4K with the latest generation of games... I doubt many others did either. If I wanted 4K performance, I wouldn't be expecting it from a mid-range card.

No one's talking about 4K. DLSS On is by definition below 4K. It's a weird talking point anyway, because I've been dabbling with 4K, or reconstructing to 4K, ever since I got an RX 480 (spent sooo many hours in AC: Origins playing like that, 4K 30, but still), then a Vega 64 and now a 6800. So how come the 3070 is all of a sudden such a weak card comparatively? It has plenty of grunt to do it, especially if we add DLSS, but it's the memory that's really holding it back. That's the sad part, and why in some scenarios the 3060 actually does better. The extra 4 GB makes all the difference. It should never have been an 8 GB card at all, but alas, it was mining season.
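
For reference, the "DLSS On is by definition below 4K" point is just arithmetic: each DLSS mode renders internally at a fraction of the output resolution and upscales from there. A rough sketch in Python, assuming the commonly cited per-axis scale factors (individual games can override them):

[CODE]
# Rough sketch: internal render resolution per DLSS mode at a 4K output.
# Scale factors are the commonly cited per-axis defaults; games can override them.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

target_w, target_h = 3840, 2160  # 4K output
for mode, scale in modes.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"{mode}: renders at {w}x{h}, upscaled to {target_w}x{target_h}")
[/CODE]

So a 4K output with DLSS Performance is a 1920x1080 internal render, which is exactly the "below 4K" point.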

8GB in 2021 was an absolute scam.

Everyone knows it; there were £200 cards four years ago with 8GB of VRAM.

:cry:
Pretty much. So many people are still in denial tho, it's bizarre.

Not to mention AMD's RDNA 2 generally performs even worse than the 3070 with RT turned on...
Who cares about RT if it's unplayable. Now go show results without RT, which is what you'd have to use regardless of AMD/Nvidia. There's no advantage in talking about RT if you can't use it, but guess what - you can definitely use that VRAM (while also consuming less power)! :P
 
No one's talking about 4K. DLSS On is by definition below 4K. [...] It should never have been an 8 GB card at all, but alas, it was mining season.


Works fine for me at 1440p; I never really come close to saturating the VRAM unless I go to 4K (DLSS or not). There are some games where you need DLSS, like Cyberpunk 2077, but a lot of last-generation games play fine at 4K ultra settings without DLSS.
 
As always: a non-owner with anecdotal evidence calling owners in denial.

I don't own DL2 yet, but nobody is as whiny about VRAM as non-owners.

It's very strange.

Gotta project your purchase decision and tell us your VRAM is bettererer!!!!

A 3090 on COD: Warzone consumes 24GB, soo....
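
(Side note: overlays and in-game meters generally report allocated VRAM rather than what the game strictly needs, and games like Warzone will happily allocate most of whatever is available. If you want to watch the number yourself outside the game, here's a minimal sketch, assuming nvidia-smi from the NVIDIA driver is on your PATH:)

[CODE]
# Minimal sketch: print VRAM used/total once a second via nvidia-smi.
# Assumes nvidia-smi (ships with the NVIDIA driver) is on the PATH.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "7523 MiB, 24576 MiB" on a 3090
    time.sleep(1)
[/CODE]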
 
No one's talking about 4K. DLSS On is by definition below 4K. [...]

Pretty much. So many people are still in denial tho, it's bizarre.

Who cares about RT if it's unplayable. [...] guess what - you can definitely use that VRAM (while also consuming less power)! :P

Why did you link a 4K video then? It doesn't matter whether DLSS/FSR is on or not; again, it's a moot point given that no card can achieve beyond 30 fps without FSR/DLSS.... However, with DLSS/FSR on and/or settings "slightly" reduced, one brand of current-gen cards can achieve a locked 60, unlike the other brand's current GPUs....

If we're not referring to 4K all of a sudden now.... from what I have seen, 1440p performance looks good on the 3070 and, again, better than what a 6900 XT can achieve....

[benchmark screenshot: cLZhYR7.png]

And in that case, why did you make this comment? Could we not also say the same for RDNA 2 owners, as technically they are getting a worse experience than 3070 owners in this game if using RT.....

They really did lower-end Ampere buyers dirty with just 8 GB of VRAM. Absolutely brutal performance, <20 fps even WITH DLSS on, especially when the camera turns. Hope people didn't buy these cards hoping they could turn on ray tracing willy-nilly. :(

3070 and above owners can play with RT just fine, depending on what resolution and upscaling option they are using. E.g. in my case with a 3080, everything maxed except fog at medium: at 4K, DLSS Performance is for the most part locked to 60, and when playing at 3440x1440, DLSS Balanced averages around 80 fps; in indoor areas, fps is considerably higher.

Now all of a sudden ray tracing doesn't matter on either card again; why link a video showing ray tracing settings turned on at 4K then?

Also, a 3070 performs about where you would expect it to even without RT:

[benchmark screenshot: CkYi48p.png]
 
As always: a non-owner with anecdotal evidence calling owners in denial. [...] A 3090 on COD: Warzone consumes 24GB, soo....

Indeed, it's rather hilarious :D

Also, it's more a case of "zomg, I'm hitting a VRAM bottleneck, but let's just ignore the fact that I'm using settings and a resolution which said GPU isn't even capable of playing at in the first place, regardless of VRAM" :cry: And a case of "zomg, 1-2 people are having this issue, it "MUST" be this and nothing else....." *even though no one else has reported it and several people are saying the complete opposite of the supposed issue*.....

Honestly, I think some people would be better off with consoles at times :)

[video embed]
Rather interesting too; notice something with that video: look at the guy's GPU power consumption and how it seems to be tied to the performance drops. From a quick Google, I was brought back to his channel and another video of his:
[video embed]
Would be interested to see a new video too, as with DX12 there was some hitching when loading new areas, mostly when entering/leaving base areas:

https://www.dsogaming.com/patches/d...-cache-related-improvements-full-patch-notes/

Additionally, and thanks to the DX12 cache-related improvements, the game should be smoother. This should also reduce the game's stutters. Not only that, but Dying Light 2 no longer requires AVX.
 
Saying that, I don't recall @TNA mentioning any issues at 4K with his 3070, except in a cutscene?
That, and in the second part of the game when the map opens up I would get frame drops in certain areas for short periods of time. But to be fair, I was playing using DLSS Performance mode, which is essentially upscaling from 1080p.
 
No one's talking about 4K. DLSS On is by definition below 4K. [...]

Pretty much. So many people are still in denial tho, it's bizarre.

Who cares about RT if it's unplayable. [...] guess what - you can definitely use that VRAM (while also consuming less power)! :P


I agree RT is no use if it tanks your performance. How anyone can defend 8GB on a 2021 card costing £800 is beyond me; it's objectively a rip-off.
 
[...] But to be fair, I was playing using DLSS Performance mode, which is essentially upscaling from 1080p.

Thanks for the confirmation on that. "Supposedly" every 3070 owner is seeing awful frame latency and fps drops throughout the game, even with DLSS Performance, because of that one video above, even though the very same guy has posted another video showing a "supposed" fix for his issue.... :p

I agree RT is no use if it tanks your performance. How anyone can defend 8GB on a 2021 card costing £800 is beyond me; it's objectively a rip-off.

Yup, I agree RT shouldn't be enabled on RDNA 2 cards given how much RDNA 2 buckles with it turned on; best to leave that to the next-gen-capable cards known as Ampere.

If people choose to pay that for a card when you can buy it for £469, that's their own fault, and tbh anyone who massively overpaid for "any" card, regardless of brand/VRAM or whatever, was silly.... Shame AMD fans had no choice but to overpay in the UK though.
 
Indeed, it's rather hilarious :D [...] Honestly, I think some people would be better off with consoles at times :)

Critical thinking skills are not taught in normal education here in England; either you have that curiosity and a mind that questions everything and looks deeper, or you don't and you come up with clumsy diagnoses like these fellas.

Critical thinkers usually don't do all that well in school, mind you, unless the subject fits their passion and direction in life; school is monotony. Imagine that kid undermining everything a teacher says. It depends on the teacher though; some are aware of the limitations of school education, so there is that.

"It has no value to the status quo, the production society that must be upheld for fear of chaos. It's not a priority; society wants you to be a happy worker, creating and consuming products.
In my school (private, Catholic, all boys, in England a.k.a: A weird ******* school) we had sets for Religious Education. RE is a two year course, the top sets completed the course in one year and studied critical thinking for one year. The aim was to produce more rounded, free thinking graduates.
My teacher told us outrageous lies in the form of longwinded stories all year. We learnt mental models like Ockham's razor, alongside the philosophies of the big names (Aristotle, Plato, Descartes), with a lot of weight on brainwashing catholic philosophies from Thomas Aquinas and other likened minds.
This is the honest truth: only a few select (and precocious) minds in the class realized the teacher's stories were pure lies. The majority of us didn't realize until long after we'd graduated how absolutely ridiculous the stories were.
I still use many mental models on a daily basis. Maybe because I got hoodwinked, I will always value the power of deriving your own information and critiquing all information, even the things I "think" I already "know".
Being a strong critical thinker helps me in my job a lot and helps me to find new comedic angles when socializing. It also makes mainstream News insufferable unless its financial.
NB: its easy but takes practise. All children are strong critical thinkers, full of questions. Parents and schools don't nurture it. There is nothing wrong with being a happy worker... happiness is truly invaluable"


Went on a bit of a tangent there...
 
I think critical thinking skills go out the window as soon as someone has spent significant money on something and also has an emotional attachment to the company that made it.
 
I think critical thinking skills go out the window as soon as someone has spent significant money on something and also has an emotional attachment to the company that made it.
I agree with you, but do you suppose everyone who does this has an emotional connection to a company?

Consider that there are only two GPU manufacturers delivering GPUs of this caliber, and that the situation depends on the person individually.

For gaming it's a hobby, I would assume, or a need in terms of work; it depends on the person's needs.

Emotions running free with zero direction are for kids and disgruntled teens.

We can definitely see a personal-preference effect when, say, a GPU does not deliver stable performance at stock and it is widely known to be a problem with the brand's drivers, versus someone favouring a brand just because they bought it.

The person may switch sides for a stable experience, which is subjectively better but also objectively so.
 
Can't say I bought a 3070 with any intention of playing at 4K with the latest generation of games... I doubt many others did either. If I wanted 4K performance, I wouldn't be expecting it from a mid-range card.

But Nvidia hyped it up as a 2080 Ti killer, and that's a 4K card.

So who's to blame, the buyer or Nvidia?
 
But Nvidia hyped it up as a 2080 Ti killer, and that's a 4K card. So who's to blame, the buyer or Nvidia?

Technically it is/was a 2080 Ti killer.... better ray tracing performance, and it matched it in rasterisation, all for half the price; hence why no one on the MM etc. would offer more than £500 for a 2080 Ti after the 3070 was announced :cry:

If the 3070 isn't a 4K card, then neither is a 2080 Ti in this day and age.
 
Technically it is/was a 2080 Ti killer.... [...] If the 3070 isn't a 4K card, then neither is a 2080 Ti in this day and age.
Turns out GPU grunt is the answer yet again... yet they keep rehashing the argument LOL.
 