RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
The truth is both Nvidia and AMD are getting bored with dGPUs.

Nvidia have reached equilibrium; they can't grow in this space anymore, it's reached a constant.

AMD feel the same way. Yes, they also feel they have reached their equilibrium, they have hit rock bottom; they don't feel it could get any worse, and if it does, it's probably a good thing, because then they can wind it all down.

Nvidia are looking to AI to grow their company; AMD are looking to SoCs and licensing their IP.

Just imagine what it would be like if things were more equal between them. I have been saying for years to a certain sect in this community: be careful what you wish for.
 
I think this makes it hard to defend low-VRAM cards. The 12GB 4070 is of course somewhere in the middle, wedged halfway between 8GB and 16GB, but wow, what a great showcase of the problem this HUB video is.

Notice as well the 6.3GB usage, not that close to the 8GB capacity, showing that reported usage is not a good way at all to decide whether a card is VRAM constrained.

The video itself also has lots of hitching/stalling when assets are being swapped in the game. Classic low-VRAM behaviour.

--

Some other games were tested and he confirmed that, of course, the game handles it automatically by not loading the proper textures, or, where it doesn't, the game crashes. I think this video finally shows the actual problems, and reviewers should update how they assess VRAM when reviewing cards in future. Don't leave benchmarks unattended, for example, as you have no idea what's going on.
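
To give an idea of the sort of check reviewers could automate instead of leaving a benchmark unattended, here is a minimal sketch that flags hitches in a frametime log (the threshold and the sample data are made up for illustration, not taken from the video):

    # Flag frametime spikes ("hitches") in a capture, the kind of stalls you
    # see when a VRAM-starved card is forced to swap assets mid-game.
    # Threshold and sample data are illustrative, not from the HUB video.
    def find_hitches(frametimes_ms, spike_factor=3.0):
        avg = sum(frametimes_ms) / len(frametimes_ms)
        return [(i, ft) for i, ft in enumerate(frametimes_ms) if ft > spike_factor * avg]

    sample = [16.7] * 200 + [95.0, 16.7, 120.0] + [16.7] * 200  # two big stalls
    for frame, ft in find_hitches(sample):
        print(f"frame {frame}: {ft:.1f} ms (hitch)")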



That's just "Game Optimisation"

Look, I've explained this before already...


:D
 
I'm semi-serious: the game used to crash on Nvidia GPUs, so they "optimised" it, and now it doesn't crash.

Steve Walton first noticed this when, after an update, it stopped crashing but now has "optimised" textures on Nvidia GPUs.

Welcome to your future if you have one of these Nvidia or AMD cards that doesn't provide what, if you were honest with yourself, you really should know is a sufficient amount of VRAM for what the card is.

Is this a 4K-capable card? Is it really?
 
I wouldn't call that optimisation as such.

Essentially, Unreal Engine streams in textures of variable quality based on many factors. Usually you get lower quality in the distance to save resources, and the textures improve as you get closer, but if the VRAM isn't there, you're stuck with textures that aren't meant to be viewed at close range.

FF7 Remake on the PS4 became infamous for a door that was stuck in its blurry, low-LOD state when you went up close to it.
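
Very roughly, the streaming logic works something like this (a simplified sketch of the general idea, not actual Unreal Engine code; the mip sizes and distance cut-offs are made up):

    # Simplified idea of distance-based texture streaming under a VRAM budget:
    # request the mip level the distance calls for, but fall back to blurrier
    # mips when the budget is exhausted. Not real UE code, just the concept.
    MIP_SIZES_MB = [64, 16, 4, 1]   # mip 0 (full res) down to mip 3 (blurry)

    def choose_mip(distance_m, budget_mb_left):
        wanted = 0 if distance_m < 5 else 1 if distance_m < 20 else 2 if distance_m < 50 else 3
        for mip in range(wanted, len(MIP_SIZES_MB)):
            if MIP_SIZES_MB[mip] <= budget_mb_left:
                return mip          # with enough VRAM this is the mip you asked for
        return len(MIP_SIZES_MB) - 1

    print(choose_mip(distance_m=2, budget_mb_left=512))  # 0: full-res up close
    print(choose_mip(distance_m=2, budget_mb_left=8))    # 2: blurry even up close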

Framing your video as AMD vs Nvidia perhaps isn't the right way to do it either, as you might be seen as an AMD fanboy; Nvidia GPUs with plenty of VRAM won't have the issue.

Either way, it doesn't really matter what it is. What matters is debunking all these silly claims that because a game runs, and the VRAM is shown as not maxed out, the game isn't constrained by VRAM. With this HUB video, a big techtuber finally shows what's going on.
 
I'm semi-serious: the game used to crash on Nvidia GPUs, so they "optimised" it, and now it doesn't crash.

Steve Walton first noticed this when, after an update, it stopped crashing but now has "optimised" textures on Nvidia GPUs.

Welcome to your future if you have one of these Nvidia or AMD cards that doesn't provide what, if you were honest with yourself, you really should know is a sufficient amount of VRAM for what the card is.

Is this a 4K-capable card? Is it really?
Yeah, I was going to say that I consider it a game-stability feature, as muddy textures are better than a game crashing.

Since I am not prepared to spend 1.5k-plus on a GPU and I don't have much love for RT, I think I might be swapping my 3080 for an AMD GPU; this issue has finally driven me over to the other side.

Losing SGSSAA would hurt though. :(

 
I wouldn't call that optimisation as such.

Essentially, Unreal Engine streams in textures of variable quality based on many factors. Usually you get lower quality in the distance to save resources, and the textures improve as you get closer, but if the VRAM isn't there, you're stuck with textures that aren't meant to be viewed at close range.

FF7 Remake on the PS4 became infamous for a door that was stuck in its blurry, low-LOD state when you went up close to it.

Framing your video as AMD vs Nvidia perhaps isn't the right way to do it either, as you might be seen as an AMD fanboy; Nvidia GPUs with plenty of VRAM won't have the issue.

Either way, it doesn't really matter what it is. What matters is debunking all these silly claims that because a game runs, and the VRAM is shown as not maxed out, the game isn't constrained by VRAM. With this HUB video, a big techtuber finally shows what's going on.

It was done tongue-in-cheek, playing to the brand wars.

I could redo it in a much more informative way if I thought people would actually listen, but I'm not an instructor; I'm pretty bad at that sort of stuff.
 
It was done tongue-in-cheek, playing to the brand wars.

I could redo it in a much more informative way if I thought people would actually listen, but I'm not an instructor; I'm pretty bad at that sort of stuff.
If you did it explaining the effect of VRAM starvation whilst cycling between the texture settings, I wouldn't mind sharing the video around, as it's a good idea. The Hardware Unboxed video is great but suffers from being a long video, and people have short attention spans; yours gets straight to the point.
 
If you did it explaining the effect of VRAM starvation whilst cycling between the texture settings, I wouldn't mind sharing the video around, as it's a good idea. The Hardware Unboxed video is great but suffers from being a long video, and people have short attention spans; yours gets straight to the point.

All right, I'll try to put something together when I have the time... :)
 
Just curious, has anyone purchased one of these RTX 4070s yet? If so, what's the experience like, particularly at 4K? Any maxing out of the graphics memory? Currently I have a 1080 with 8GB, and whilst I can play most of the games in my arsenal fine, I have started to see that 8GB maxed out in one or two games. Admittedly, I'm typically playing games that are 3+ years old!

Raven did :cry:
 
I think this makes it hard to defend low-VRAM cards. The 12GB 4070 is of course somewhere in the middle, wedged halfway between 8GB and 16GB, but wow, what a great showcase of the problem this HUB video is.

Notice as well the 6.3GB usage, not that close to the 8GB capacity, showing that reported usage is not a good way at all to decide whether a card is VRAM constrained.

The video itself also has lots of hitching/stalling when assets are being swapped in the game. Classic low-VRAM behaviour.

--

Some other games were tested and he confirmed that, of course, the game handles it automatically by not loading the proper textures, or, where it doesn't, the game crashes. I think this video finally shows the actual problems, and reviewers should update how they assess VRAM when reviewing cards in future. Don't leave benchmarks unattended, for example, as you have no idea what's going on.

Serious question: have you played the game? Even with the latest patch fixes, stutters, frame drops and inconsistent frametimes are a very common occurrence on a 4090 and a 5.6GHz overclocked 12900K with manually tuned 7200 C32 RAM. My 0.1% lows drop to almost single digits in the village area. So...?
 
Considering Nvidia are in dialogue with major game studios, and they will be testing their products before release, I can only conclude it's a deliberate strategy of early obsolescence for their products. If the A4000 had been the 3070, it would have been a much better card.
 
Considering Nvidia are in dialogue with major game studios, and they will be testing their products before release, I can only conclude it's a deliberate strategy of early obsolescence for their products. If the A4000 had been the 3070, it would have been a much better card.

A 16GB GDDR6 21Gbps kit costs $32 retail in 1,250-piece bulk.

21Gbps is high end, and that's the retail price; Nvidia, buying millions, would probably pay, I don't know... $20? There is no reason why a £300 GPU can't have 16GB, in fact it really should. The 3060 had 12GB because the alternative was 6GB, and that didn't make any meaningful difference to the margins on that GPU.
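
As a rough back-of-envelope using those figures (the ~$20 volume price is my guess, not a confirmed number):

    # Back-of-envelope VRAM cost, using the figures quoted above.
    # The $20 volume price is an assumption, not a confirmed figure.
    retail_kit_price = 32.0   # USD, 16GB GDDR6 kit at 1,250-piece bulk
    kit_capacity_gb = 16
    price_per_gb_retail = retail_kit_price / kit_capacity_gb        # ~$2.00/GB

    assumed_oem_kit_price = 20.0                                    # guess at Nvidia-scale volume
    extra_4gb_cost = 4 * (assumed_oem_kit_price / kit_capacity_gb)  # 12GB -> 16GB

    print(f"Retail: ${price_per_gb_retail:.2f}/GB, "
          f"extra 4GB at assumed volume pricing: ~${extra_4gb_cost:.2f}")

Even at full retail bulk pricing, the memory for the jump from 12GB to 16GB works out to around $8.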

So yes, it's deliberate, just as 12GB on a 4070 Ti is deliberate. I would argue that's already NOT a 4K card, and it most certainly will not be in about two years; it might be as much a 1440p card then as it is a 4K card now.

The starting point now should be 12GB, and there is no reason why these £1,000+ cards can't be 48GB.

And there is a knock-on effect: game developers don't feel they can use textures as complex and high resolution as they might like, so games don't look as good as they could. When you run your game at 4K, is it running 4K textures? I would suggest nine times out of ten it's not, because it can't, so it's not 4K.
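
To put some rough numbers on why texture budgets get squeezed (illustrative only; real games mix compressed formats, and mip chains, render targets and geometry all share the same memory pool):

    # Rough footprint of a single 4096x4096 RGBA texture, with and without
    # block compression, and how many fit in a given VRAM budget.
    # Illustrative only; a full mip chain adds roughly another third.
    def texture_mb(size, bytes_per_texel):
        return size * size * bytes_per_texel / (1024 ** 2)

    uncompressed = texture_mb(4096, 4)   # RGBA8: 64 MB
    bc7 = texture_mb(4096, 1)            # BC7 at 1 byte/texel: 16 MB

    for budget_gb in (8, 12, 16):
        budget_mb = budget_gb * 1024
        print(f"{budget_gb}GB budget: ~{budget_mb / bc7:.0f} BC7 4K textures "
              f"(~{budget_mb / uncompressed:.0f} uncompressed)")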
 
The bottom line for me is image quality. After all, why do we need VRAM if not to push textures so we get better image quality, right?

Well, here are some facts. The 3060 Ti I have in my second PC plays Hogwarts Legacy at 3440x1440 with DLSS Quality and everything on Ultra except textures, which are at High. The framerate was constantly above 60 except in cutscenes, where it dropped to the 55 range. With those settings, and according to Hardware Unboxed's video, where he basically says that DLSS Performance (!!!) looks better than native and better than FSR Quality, there is not a single non-Nvidia card that delivers better image quality than my poor man's 3060 Ti in Hogwarts. You could have four 7900 XTXs in Crossfire overclocked to hell and you wouldn't beat a 3060 Ti in terms of image quality.

So I'd rather have Nvidia deliberately providing us with the best image quality while also deliberately skimping on VRAM than the other way around.
 
The bottom line for me is image quality. After all, why do we need VRAM if not to push textures so we get better image quality, right?

Well, here are some facts. The 3060 Ti I have in my second PC plays Hogwarts Legacy at 3440x1440 with DLSS Quality and everything on Ultra except textures, which are at High. The framerate was constantly above 60 except in cutscenes, where it dropped to the 55 range. With those settings, and according to Hardware Unboxed's video, where he basically says that DLSS Performance (!!!) looks better than native and better than FSR Quality, there is not a single non-Nvidia card that delivers better image quality than my poor man's 3060 Ti in Hogwarts. You could have four 7900 XTXs in Crossfire overclocked to hell and you wouldn't beat a 3060 Ti in terms of image quality.

So I'd rather have Nvidia deliberately providing us with the best image quality while also deliberately skimping on VRAM than the other way around.
Can't take you seriously when you play Hogwarts, TBF :cry:
 
The bottom line for me is image quality. After all, why do we need VRAM if not to push textures so we get better image quality, right?

Well, here are some facts. The 3060 Ti I have in my second PC plays Hogwarts Legacy at 3440x1440 with DLSS Quality and everything on Ultra except textures, which are at High. The framerate was constantly above 60 except in cutscenes, where it dropped to the 55 range. With those settings, and according to Hardware Unboxed's video, where he basically says that DLSS Performance (!!!) looks better than native and better than FSR Quality, there is not a single non-Nvidia card that delivers better image quality than my poor man's 3060 Ti in Hogwarts. You could have four 7900 XTXs in Crossfire overclocked to hell and you wouldn't beat a 3060 Ti in terms of image quality.

So I'd rather have Nvidia deliberately providing us with the best image quality while also deliberately skimping on VRAM than the other way around.

For anyone who understands this at even the most basic level, that paragraph is full of contradictions.

You cannot make a 1K texture look as good as a 2K texture. There are things you can do to make the detail of the lower-resolution texture stand out more, make it look sharper and less muddy, but that detail is still not as complete as in the 2K texture. It's missing.
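
A quick way to see why the detail can't be recovered:

    # Texel counts: the information simply isn't there in the smaller texture.
    for size in (1024, 2048, 4096):
        print(f"{size}x{size}: {size * size:,} texels")
    # 2048x2048 has 4x the samples of 1024x1024; sharpening can't recover
    # the missing 75%, it can only make what's left look crisper.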
 
The bottom line for me is image quality. After all, why do we need VRAM if not to push textures so we get better image quality, right?

Well, here are some facts. The 3060 Ti I have in my second PC plays Hogwarts Legacy at 3440x1440 with DLSS Quality and everything on Ultra except textures, which are at High. The framerate was constantly above 60 except in cutscenes, where it dropped to the 55 range. With those settings, and according to Hardware Unboxed's video, where he basically says that DLSS Performance (!!!) looks better than native and better than FSR Quality, there is not a single non-Nvidia card that delivers better image quality than my poor man's 3060 Ti in Hogwarts. You could have four 7900 XTXs in Crossfire overclocked to hell and you wouldn't beat a 3060 Ti in terms of image quality.

So I'd rather have Nvidia deliberately providing us with the best image quality while also deliberately skimping on VRAM than the other way around.
Question for you: do you avoid games that don't have a DLSS option?

But to go with your statement: if you had that kind of horsepower with 4x GPUs, then I would supersample the resolution, and that would destroy your interpretation of image quality.
 
Question for you: do you avoid games that don't have a DLSS option?
I don't check beforehand which games have it and which don't. I just buy the game and play it, with the exception of Warzone 2 and RDR2, where DLSS looks mediocre to me, so I used native instead; in every other game that has the option I run with DLSS Quality on, even on my 4090.
 
I don't check beforehand which games have it and which don't. I just buy the game and play it, with the exception of Warzone 2 and RDR2, where DLSS looks mediocre to me, so I used native instead; in every other game that has the option I run with DLSS Quality on, even on my 4090.
To me you seem picky, considering how you vouch for the tech. Some members here seem to think Cyberpunk is the only game.
 