
RTX 4070 12GB, is it Worth it?

But to go with your statement, if you had that kind of horsepower with 4x GPUs, then I would supersample the resolution and it would destroy your interpretation of image quality.
You would run out of VRAM. Hogwarts uses 21-22GB on my 4090 at 4K; if you tried 8K you'd probably run out. But anyway, sure, it was an exaggeration, but you get the point.
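For a rough sense of why 8K would blow past even 24GB: 8K has exactly four times the pixels of 4K, and the resolution-dependent part of the budget (G-buffers, render targets, post-processing) scales roughly with pixel count. A back-of-the-envelope sketch in Python; the 21-22GB figure is from the post above, and the split between resolution-dependent and fixed allocations is purely an assumption:

```python
# Back-of-the-envelope VRAM estimate for 4K vs 8K, assuming roughly half of
# the reported 4K usage is resolution-dependent (G-buffers, render targets,
# post FX) and the rest (textures, geometry, etc.) stays fixed.
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320                      # exactly 4x the pixels of 4K

vram_at_4k_gb = 21.5                         # reported Hogwarts usage on a 4090
resolution_dependent_gb = 10.0               # assumed share that scales with pixels
fixed_gb = vram_at_4k_gb - resolution_dependent_gb

scale = pixels_8k / pixels_4k                # 4.0
vram_at_8k_gb = fixed_gb + resolution_dependent_gb * scale
print(f"Estimated 8K usage: {vram_at_8k_gb:.1f} GB")   # ~51.5 GB, far past 24 GB
```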
 
For anyone who understands this at even the most basic level, this paragraph is full of contradictions.

You cannot make a 1K texture look as good as a 2K texture. There are things you can do to make the detail of the lower-resolution texture stand out more, make it look sharper and less muddy, but that detail is still not as complete as the 2K texture's. It's missing.
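To put a number on the missing detail: each doubling of a texture's side length quadruples the number of stored texels, and a sharpening pass only changes how the existing texels are displayed. A quick illustration:

```python
# Texel counts for common texture resolutions: doubling the side length
# quadruples the stored detail. A sharpening pass cannot recreate texels
# that were never stored in the first place.
for side in (1024, 2048, 4096):
    texels = side * side
    print(f"{side}x{side}: {texels / 1e6:.1f} million texels")
# 1024x1024:  1.0 million texels
# 2048x2048:  4.2 million texels
# 4096x4096: 16.8 million texels
```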
There is no contradiction. Simple example: try 720p with FSR Ultra Performance on top. Do you think the game will look better than 4K native just because textures are on Ultra instead of High? No, of course not. So the question is whether DLSS Quality can improve image quality by more than it loses by dropping textures to High instead of Ultra, and at least for Hogwarts that's a resounding yes. That doesn't mean it applies to every game, but at least in this one, 8GB Nvidia cards are top dogs in terms of image quality.
 
A 16GB GDDR6 21Gbps kit costs $32 retail in 1,250-piece bulk.

21Gbps is high end, and that's the retail price; Nvidia, buying millions, would probably pay, I don't know... $20? There is no reason a £300 GPU can't have 16GB; in fact it really should. The 3060 had 12GB because the alternative was 6GB, and that didn't make any meaningful difference to the margins on that GPU.
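For what it's worth, the per-card arithmetic on those figures is tiny either way. A rough sketch (the $32 retail and $20 volume numbers are the ones quoted above; the 8-chip layout is the usual 8x2GB kit configuration):

```python
# Rough per-card memory cost using the figures quoted above.
# A 16GB GDDR6 kit is normally 8 chips of 2GB (16Gbit) each.
kit_price_retail = 32.0    # $ per 16GB kit, 1,250-piece bulk (quoted above)
kit_price_volume = 20.0    # assumed price at GPU-vendor volumes (the guess above)
chips_per_kit = 8
gb_per_kit = 16

for label, price in (("retail", kit_price_retail), ("volume", kit_price_volume)):
    print(f"{label}: ${price / chips_per_kit:.2f} per 2GB chip, "
          f"${price / gb_per_kit:.2f} per GB")
# Adding 8GB to an 8GB card is therefore on the order of $10-16 of memory.
```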

Exactly!
 
There is no contradiction. Simple example: try 720p with FSR Ultra Performance on top. Do you think the game will look better than 4K native just because textures are on Ultra instead of High? No, of course not. So the question is whether DLSS Quality can improve image quality by more than it loses by dropping textures to High instead of Ultra, and at least for Hogwarts that's a resounding yes. That doesn't mean it applies to every game, but at least in this one, 8GB Nvidia cards are top dogs in terms of image quality.

You do realise that you can lower the render resolution without lowering the texture resolution? Apply a sharpening filter on top of that and it can appear to look better.

These two images are the same in-game setting at the same resolution; one has a third-party in-game post-processing filter passed over it (a rough sketch of that kind of filter follows the images).

[Image: 5wrDFot.jpg]
[Image: NqI3x07.jpg]
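For anyone curious what that kind of filter actually does: it is essentially an unsharp mask run over the finished frame, boosting local edge contrast without adding any texture detail. A minimal sketch using Pillow; the filenames and parameters are placeholders, not the exact third-party filter used above:

```python
# Minimal sketch of the kind of post-process sharpening discussed above:
# an unsharp mask boosts contrast around edges but adds no new detail.
# Requires Pillow (pip install pillow); the filenames are placeholders.
from PIL import Image, ImageFilter

frame = Image.open("screenshot.png")
sharpened = frame.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharpened.save("screenshot_sharpened.png")
```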
 
The bottom line for me is image quality. After all, why do we need VRAM if not to push textures so we get better image quality, right?

Well, here are some facts. The 3060 Ti I have in my 2nd PC plays Hogwarts at 3440x1440 with DLSS Quality and everything on Ultra except textures, which are on High. The framerate was constantly above 60 except in cutscenes, where it dropped to the 55 range. With those settings, and according to Hardware Unboxed's video where he basically says that DLSS Performance (!!!) looks better than native and better than FSR Quality, there is not a single non-Nvidia card that delivers better image quality than my poor man's 3060 Ti in Hogwarts. You could have 4x 7900 XTX in Crossfire overclocked to hell and you wouldn't be able to beat a 3060 Ti in terms of image quality.

So I'd rather have Nvidia deliberately providing us with the best image quality while also deliberately skimping on VRAM than the other way around.
DLSS (as well as upping the native rendering resolution) won't improve low-quality textures.
 
WUT? That's pretty trivial for AI to solve. Have you seen AI reconstructions of partially destroyed images?

AI can only make an educated guess. When the AI fills in the missing detail based on a calculated guess, it will come up with a more complete image, but one that is ultimately fake.

A bit like sending a blurry fax of a Mini to China; what you end up with is this:

[Image: 5SY82W9.jpg]

A fake Mini.
 
Because you are mentioning image quality.

Higher texture resolution allows more detail on the textures, which plays a big part in the quality of the presentation.
Nobody said textures don't play a big part in the quality of the presentation. What I'm saying is that DLSS Quality + High textures looks better than native + Ultra textures, and much better than FSR, in Hogwarts.
 
AI can only make an educated guess. When the AI fills in the missing detail based on a calculated guess, it will come up with a more complete image, but one that is ultimately fake.
What do you mean, "ultimately fake"? Isn't the game itself ultimately fake? I think what you mean is that the resulting image is not a 1:1 representation of what the creators made. Sure, but as long as the actual result looks better, it doesn't matter.
 
Even image reconstruction benefits from higher-quality inputs. There is a reason it works better at higher output resolutions: it relies on higher-quality inputs. Anyone who understood how machine learning works would realise this. It is not magic.
Nobody said otherwise. But the question is, since DLSS Performance (which is basically 1080p) looks equal to 4K native (according to Hardware Unboxed), isn't it safe to assume that DLSS Quality with textures on High would look better than native with textures on Ultra, let alone FSR?

Or in other words, would you rather play at 1080p with Ultra textures or 4K with High textures? I think it's a pretty straightforward answer.
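The render-resolution maths behind the "basically 1080p" claim, using the commonly cited DLSS 2 scale factors (games can override these, so treat them as approximate):

```python
# Internal render resolutions for the commonly cited DLSS 2 scale factors
# (Quality ~66.7%, Balanced ~58%, Performance 50% per axis).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
outputs = {"4K (3840x2160)": (3840, 2160), "UW 1440p (3440x1440)": (3440, 1440)}

for out_name, (w, h) in outputs.items():
    for mode, scale in modes.items():
        print(f"{out_name} {mode}: {round(w * scale)}x{round(h * scale)}")
# 4K Performance lands at 1920x1080, which is why it is "basically 1080p";
# 3440x1440 Quality renders internally at roughly 2293x960.
```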
 
Nobody said otherwise. But the question is, since DLSS Performance (which is basically 1080p) looks equal to 4K native (according to Hardware Unboxed), isn't it safe to assume that DLSS Quality with textures on High would look better than native with textures on Ultra, let alone FSR?

Or in other words, would you rather play at 1080p with Ultra textures or 4K with High textures? I think it's a pretty straightforward answer.
1080p with 4K textures would look better than 4K with 512x512 textures, absolutely no question. Even more so with decent AA.

I can give you some examples. Lightning Returns can be played at 4K rendering on PC, but it still has the original PS3-quality textures, and when those textures are close to the screen, like grass etc., they stand out a lot, looking really bad alongside the high rendering resolution.

I also mod a game where we have someone contributing AI-upscaled textures, and also some people making brand-new 4096x4096 textures with proper detail, and the difference is extreme.
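To give a feel for what those texture sizes cost in VRAM, here is a rough sketch; it assumes uncompressed RGBA8 (4 bytes/texel), BC7 block compression (1 byte/texel), and a full mip chain adding about a third on top, which are typical ballpark figures rather than exact numbers for any particular game:

```python
# Approximate VRAM cost of a single texture at different resolutions,
# assuming 4 bytes/texel uncompressed (RGBA8), 1 byte/texel block-compressed
# (BC7), and a full mip chain adding roughly one third on top.
MIP_OVERHEAD = 4 / 3

for side in (512, 1024, 2048, 4096):
    texels = side * side
    raw_mb = texels * 4 * MIP_OVERHEAD / 2**20
    bc7_mb = texels * 1 * MIP_OVERHEAD / 2**20
    print(f"{side}x{side}: ~{raw_mb:.1f} MB raw, ~{bc7_mb:.1f} MB BC7")
# 512x512:   ~1.3 MB raw,  ~0.3 MB BC7
# 4096x4096: ~85.3 MB raw, ~21.3 MB BC7
```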
 
1080p with 4K textures would look better than 4K with 512x512 textures, absolutely no question. Even more so with decent AA.

I can give you some examples. Lightning Returns can be played at 4K rendering on PC, but it still has the original PS3-quality textures, and when those textures are close to the screen, like grass etc., they stand out a lot, looking really bad alongside the high rendering resolution.

I also mod a game where we have someone contributing AI-upscaled textures, and also some people making brand-new 4096x4096 textures with proper detail, and the difference is extreme.

I mod games too and agree entirely. How un-PCMR is it for people not to be able to mod games just because we have some weird 8GB limitation on expensive dGPUs to pad company profits? 12GB is utterly stingy considering the kits apparently only cost a few dollars. Nvidia, and AMD to a lesser degree, need to be called out on this. It seems more like Apple cutting back on RAM so it can push people towards upgrading quicker!

Edit!!

Junk like the RTX 4060 Ti/RTX 4060/RX 7600 XT being only 8GB cards, and probably all £300 and above, should not be acceptable. It's bad enough that the tiers are being pushed up too.

8GB cards should be under £300.

12GB should be the minimum level, maybe even 16GB, over £300.

We have had 8GB VRAM cards for well under £300 since the days of the R9 390 and RX 480 in 2015/2016. The RTX 4060 Ti/RTX 4060/RX 7600 XT also use el-cheapo 128-bit memory buses, so they need even fewer VRAM chips than older-generation dGPUs (see the quick bus-width maths below).

It's a pure cash grab, exploiting people who don't know anything about where games are going.
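The bus-width point is easy to see with a bit of arithmetic, assuming the standard 32-bit-per-chip GDDR6 interface and the common 2GB (16Gbit) chip capacity:

```python
# Why bus width caps the chip count: each GDDR6 package has a 32-bit
# interface, and the common capacity today is 2GB (16Gbit) per chip.
CHIP_BUS_BITS = 32
CHIP_CAPACITY_GB = 2

for card_bus_bits in (128, 192, 256):
    chips = card_bus_bits // CHIP_BUS_BITS
    normal = chips * CHIP_CAPACITY_GB
    print(f"{card_bus_bits}-bit bus: {chips} chips = {normal} GB "
          f"(or {normal * 2} GB in clamshell mode)")
# 128-bit -> 4 chips = 8 GB; 192-bit -> 6 chips = 12 GB; 256-bit -> 8 chips = 16 GB
```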
 