Having owned one since launch day, the 3070 is a max-settings-at-1080p kind of card. Go above that and it can (game dependent) start getting pretty ugly frame times. FC6 with max settings, 1080p through to 4K (8m 20s):
Can't say I bought a 3070 with any intention of playing 4K with the latest generation of games... I doubt many others did either. If I wanted 4K performance I wouldn't be expecting it from a mid-range card.
Pretty much. So many people still in denial tho, it's bizarre. 8GB in 2021 was an absolute scam.
Everyone knows it, there were £200 cards from 4 years ago with 8GB of VRAM.
Who cares about RT on if it's unplayable. Now go show results without RT, which is what you'd have to use regardless of AMD/Nvidia. There's no advantage in talking about RT if you can't use it, but guess what - you can definitely use that VRAM (while also consuming less power)!

Not to mention AMD's RDNA 2 generally performs even worse than the 3070 with RT turned on...
No one's talking about 4K. DLSS On is by definition below 4K. It's a weird talking point anyway, because I've been dabbling with 4K or reconstruct to 4K ever since I got an RX 480 (spent sooo many hours in AC: Origins with it playing like that, 4K 30, but still), then Vega 64 and now a 6800. So how come a 3070 is all of a sudden such a weak card comparatively? It has plenty of grunt to do so, especially if we add DLSS, but it's the memory that's really holding it back. That's the sad part, and why in some scenarios the 3060 actually does better. The extra 4 GB makes all the difference. It should not have been an 8 GB card at all, but alas, it was a mining season.
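For anyone unclear on the "DLSS On is by definition below 4K" point: DLSS renders internally at a reduced resolution and reconstructs up to the output resolution. A minimal sketch of the commonly cited per-axis scale factors (assumption: exact ratios can vary by title and DLSS version):

```python
# Rough sketch: per-axis render-scale factors commonly cited for DLSS 2
# quality modes. Exact ratios can vary by game and SDK version.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.5,          # 50% per axis
    "Ultra Performance": 1 / 3,  # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    for mode in DLSS_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at {w}x{h}")
```

Performance mode at a 4K output works out to 1920x1080, which is why people describe it as "essentially upscaling from 1080p".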
They really did lower-end Ampere buyers dirty with just 8 GB of VRAM. Absolutely brutal performance, <20 fps even WITH DLSS on, especially when the camera turns. Hope people didn't buy these cards hoping they could turn on ray tracing willy-nilly.
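On the frame-time point: average fps hides exactly this kind of stutter, which is why 1% lows get quoted alongside averages. A minimal sketch of the arithmetic, using made-up frame-time samples:

```python
# Minimal sketch: average fps vs 1% lows from frame-time samples.
# The sample data below is made up purely for illustration.
frame_times_ms = [16.7] * 95 + [50.0] * 5  # mostly 60 fps, a few 20 fps spikes

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_ms

# "1% low" fps: the fps implied by the slowest 1% of frames.
worst = sorted(frame_times_ms, reverse=True)
one_percent = worst[: max(1, len(worst) // 100)]
low_fps = 1000 / (sum(one_percent) / len(one_percent))

print(f"average: {avg_fps:.0f} fps")  # ~54 fps, looks fine on paper
print(f"1% low:  {low_fps:.0f} fps")  # 20 fps, the stutter you actually feel
```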
As always. Non-owner with anecdotal evidence calling owners in denial.
Don't own DL2 yet. But nobody is as whiny about VRAM as non-owners.
It's very strange.
Gotta project your purchase decision and tell us your VRAM is bettererer!!!!
The 3090 on COD Warzone consumes 24GB, soo....
Additionally, thanks to the DX12 cache-related improvements, the game should be smoother, which should also reduce its stutters. Not only that, but Dying Light 2 no longer requires AVX.
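For anyone wondering whether the old AVX requirement ever applied to their CPU, checking the flag list is enough. A minimal sketch, assuming Linux (it reads /proc/cpuinfo; on Windows you'd need something like the py-cpuinfo package instead):

```python
# Minimal AVX-support check on Linux by reading /proc/cpuinfo.
def cpu_flags() -> set[str]:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                # "flags : fpu vme ... avx avx2 ..." -> set of feature names
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("avx", "avx2"):
    print(f"{feature}: {'yes' if feature in flags else 'no'}")
```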
That, and the second part of the game when the map opens up would get frame drops in certain areas for short periods of time. But to be fair, I was playing it using DLSS Performance mode, which is essentially upscaling from 1080p. Saying that, I don't recall @TNA mentioning any issues at 4K with his 3070, except in a cutscene?
I agree RT is no use if it tanks your performance. How anyone can defend 8GB on a 2021 card costing £800 is beyond me; it's objectively a rip-off.
Indeed, it's rather hilarious.
Also, it's more a case of "zomg, I'm hitting a VRAM bottleneck, but let's just ignore the fact that I'm using settings and a resolution which said GPU isn't even capable of playing at in the first place, regardless of VRAM". And also a case of "zomg, 1-2 people are having this issue, it MUST be this and nothing else"... even though no one else has reported it and several people are saying the complete opposite.
Honestly, I think some people would be better off with consoles at times.
Rather interesting too: notice something with that video. Look at the guy's GPU power consumption and how it seems to be tied to the performance drops. A quick Google brought me back to his channel and another video of his:
Would be interested to see a new video too, as with DX12 there was some hitching when loading new areas, mostly when entering/leaving base areas:
https://www.dsogaming.com/patches/d...-cache-related-improvements-full-patch-notes/
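If anyone wants to check the power-draw-versus-frame-drops pattern from that video on their own card, a minimal sketch that polls nvidia-smi once a second (assumes an NVIDIA GPU with nvidia-smi on the PATH; Ctrl+C to stop):

```python
# Minimal sketch: log GPU power draw, utilisation and VRAM use once per
# second via nvidia-smi, so dips can be lined up against in-game stutter.
import subprocess
import time

QUERY = "power.draw,utilization.gpu,memory.used"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    power_w, util_pct, vram_mib = [v.strip() for v in out.split(",")]
    print(f"{time.strftime('%H:%M:%S')}  {power_w} W  {util_pct}%  {vram_mib} MiB")
    time.sleep(1)
```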
I agree with you, but do you suppose everyone who does this has an emotional connection to a company? I think critical thinking skills go out the window as soon as someone has spent significant money on something and also has an emotional attachment to the company that made it.
You do appear to have an emotional attachment to Nvidia, and certainly not a healthy one.
But Nvidia hyped the 3070 up as a 2080 Ti killer, and the 2080 Ti is a 4K card.
So who's to blame, the buyer or Nvidia?
Turns out GPU grunt is the answer yet again... yet they keep rehashing the argument LOL.

Technically it is/was a 2080 Ti killer: better ray tracing performance, and it matched it in rasterisation, all for half the price. Hence why no one on the MM etc. would offer more than £500 for a 2080 Ti after the 3070 was announced.
If the 3070 isn't a 4K card, then neither is a 2080 Ti in this day and age.
Surely. Has to be all these setups, surely... like lack of system RAM, not enough cores, poor single-thread performance, etc.