RTX 4070 12GB, is it Worth it?

Founders Edition 4070s are still on sale; normally an FE for a new launch sells out in a day, and even a few months after release restocks are gone after 2-3 days.

The 4070 has not been received well - big understatement - and mainly due to price. We can argue about the hardware, but if the 4070 had been £500 that would not have been such a big problem.

Nvidia and AMD have to show value for your money in today's market: want £600 for a mid-tier GPU? Then give it 16GB of VRAM and have it outdo a 3080 Ti/6950 XT.
 
And there is a knock-on effect of this: game developers don't feel they can use textures as complex and high-resolution as they might like, so games don't look as good as they could. When you run your game at 4K, is it running 4K textures? I would suggest nine times out of ten it's not, because it can't, so it's not really 4K.

Game devs haven't liked doing more than the minimum required for a long time now - or are not allowed to by higher-ups. There were so many options during the last generation to build immersive, interactive, beautiful-looking game worlds, and yet that didn't happen. Most of the time, more VRAM or more horsepower would just mean less optimisation going on.

For anyone who understands this at even the most basic level, this paragraph is full of contradictions.

You cannot make a 1K texture look as good as a 2K texture. There are things you can do to make the detail of the lower-resolution texture stand out more, make it look sharper and less muddy, but that detail is still not as complete as in the 2K texture. It's missing.

There are a couple of things worth mentioning:

1) the game is not at its best in its native presentation and, overall, DLSS could help solve some issues
2) ultra anything, not just textures, doesn't really add much, so lowering one setting doesn't affect the image too much, while (as per point 1) something like DLSS could improve on things.

Of course, I don't agree that having some insanely powerful graphics card, so you could render at a much higher resolution internally and then downscale, wouldn't be better than DLSS - but that isn't always possible with new games (if at all with lesser GPUs).
 
Founders Edition 4070s are still on sale; normally an FE for a new launch sells out in a day, and even a few months after release restocks are gone after 2-3 days.

The 4070 has not been received well - big understatement - and mainly due to price. We can argue about the hardware, but if the 4070 had been £500 that would not have been such a big problem.

Nvidia and AMD have to show value for your money in today's market: want £600 for a mid-tier GPU? Then give it 16GB of VRAM and have it outdo a 3080 Ti/6950 XT.
Funny thing is how little it would take for people to find the product more desirable in terms of price (since it isn't really worth even £500, being more of an up-sold 4060). £600 vs. £500 is basically 20% (if you measure it against the £500). The price difference between the cheapest 4070 and the most expensive is about the same...
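To spell that arithmetic out, a trivial sketch using the two price points above (both figures are this thread's hypotheticals, not official pricing):

```python
# Relative price gap, measured against the lower figure.
fair_price = 500    # the hypothetical "would have been fine" price, in GBP
launch_price = 600  # the actual asking price, in GBP

premium = (launch_price - fair_price) / fair_price
print(f"{premium:.0%}")  # 20%
```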
 
Yeah, because VRAM isn't modular, upgrading means replacing the entire GPU.
They could do it, just like Creative used to let you upgrade the RAM on the Sound Blaster AWE32 sound cards back in the day. NVIDIA/AMD won't do this, of course, as letting the user upgrade by simply adding RAM isn't as profitable; they'd be missing out on a ton of upgrade sales. NVIDIA instead plays Apple's game. It works... and we just put up with it...
 
And there is a knock-on effect of this: game developers don't feel they can use textures as complex and high-resolution as they might like, so games don't look as good as they could. When you run your game at 4K, is it running 4K textures? I would suggest nine times out of ten it's not, because it can't, so it's not really 4K.
4K textures in modern games would probably be unplayable on current cards, especially with the direction they are going, with more and more unique textures. If TLOU were running 4K textures, the game would probably need more than 40GB of VRAM and insane amounts of bandwidth. The majority of the textures in TLOU are 512x512, with some exceptions up to 2K.
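Rough napkin math backs this up. A sketch assuming uncompressed RGBA8 (4 bytes per pixel); real games use block compression and mipmaps, which change the absolute numbers but not the ratios:

```python
# VRAM cost of a single uncompressed RGBA8 texture at various resolutions.
# Block compression (e.g. BC7, ~4x smaller) and mipmaps (~33% extra) shift
# the absolute sizes, but the 64x gap between 512x512 and 4K remains.
def texture_mib(side_px: int, bytes_per_pixel: int = 4) -> float:
    return side_px * side_px * bytes_per_pixel / 2**20

for side in (512, 2048, 4096):
    print(f"{side}x{side}: {texture_mib(side):6.1f} MiB")
# 512x512:    1.0 MiB
# 2048x2048: 16.0 MiB
# 4096x4096: 64.0 MiB
```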
 
Game devs haven't liked doing more than the minimum required for a long time now - or are not allowed to by higher-ups. There were so many options during the last generation to build immersive, interactive, beautiful-looking game worlds, and yet that didn't happen. Most of the time, more VRAM or more horsepower would just mean less optimisation going on.



There are a couple of things worth mentioning:

1) the game is not at its best in its native presentation and, overall, DLSS could help solve some issues
2) ultra anything, not just textures, doesn't really add much, so lowering one setting doesn't affect the image too much, while (as per point 1) something like DLSS could improve on things.

Of course, I don't agree that having some insanely powerful graphics card, so you could render at a much higher resolution internally and then downscale, wouldn't be better than DLSS - but that isn't always possible with new games (if at all with lesser GPUs).
Supersampling is way better, as it handles the entire scene, every given second.

All games can be supersampled by adding a custom resolution in the driver.

I swear the days of DLSS have dumbed down the PCMR.
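For what it's worth, this is all supersampling is under the hood: render more pixels than the display has, then filter them down. A minimal sketch with a plain box filter (drivers typically use smarter filters, e.g. DSR's adjustable Gaussian):

```python
import numpy as np

def box_downsample_2x(frame: np.ndarray) -> np.ndarray:
    """2x2 SSAA: average each 2x2 block of the high-res render
    into a single output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

render_4k = np.random.rand(2160, 3840, 3)    # stand-in for a 4K render
display_1080 = box_downsample_2x(render_4k)  # each pixel averages 4 samples
print(display_1080.shape)                    # (1080, 1920, 3)
```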
 
Funny thing is how little it would take for people to find the product more desirable in terms of price (since it isn't really worth even £500, being more of an up-sold 4060). £600 vs. £500 is basically 20% (if you measure it against the £500). The price difference between the cheapest 4070 and the most expensive is about the same...
Look at all the 6800 XTs and 6950 XTs that have been selling.

People waited for the 4070 and were just disappointed again, so they looked for the best-value card they could get, and ironically for Nvidia that was AMD, which drove their sales up. More ironically for Nvidia and AMD, sales did not go up for the latest and greatest GPUs but for the previous generation.
 
Supersampling is way better, as it handles the entire scene, every given second.

All games can be supersampled by adding a custom resolution in the driver.

I swear the days of DLSS have dumbed down the PCMR.
That is as false as it gets.

The goal of DLSS isn't to improve image quality, it's to improve your framerate. The fact that in some cases it manages to do both is the impressive part. Supersampling, on the other hand, lowers your framerate dramatically. If you want to do a proper image-quality comparison, what you should do is compare native 1080p with 4K DLSS Performance (which renders at 1080p). DLSS Performance would do so much better it would look like magic, but of course it wouldn't give you any fps gain compared to native 1080p.

Supersampling is much worse than DLDSR, which is basically using DLSS to downscale.

Of course, because I like being fair, contrary to what people believe: FSR is also BETTER than native. And what I mean by that is that FSR Quality at 4K IS better than native 1440p.
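For context on the resolutions being compared there, a quick reference for DLSS's internal render resolutions (these per-axis scale factors are the commonly documented defaults; individual games can override them):

```python
# Internal render resolution for each DLSS mode at a given output
# resolution, using the commonly documented per-axis scale factors.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): native 1080p
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440): native 1440p
```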
 
That is as false as it gets.

The goal of DLSS isn't to improve image quality, it's to improve your framerate. The fact that in some cases it manages to do both is the impressive part. Supersampling, on the other hand, lowers your framerate dramatically. If you want to do a proper image-quality comparison, what you should do is compare native 1080p with 4K DLSS Performance (which renders at 1080p). DLSS Performance would do so much better it would look like magic, but of course it wouldn't give you any fps gain compared to native 1080p.

Supersampling is much worse than DLDSR, which is basically using DLSS to downscale.

Of course, because I like being fair, contrary to what people believe: FSR is also BETTER than native. And what I mean by that is that FSR Quality at 4K IS better than native 1440p.
But surely that's the point, right? Your framerate is boosted fairly considerably, allowing you to run at a higher resolution, higher texture levels, anti-aliasing etc. than you would otherwise be able to enable without the frame rate making it painful to play. So in effect the enhancement is a by-product of the ability to run at the higher frame rate afforded by DLSS?
 
That is as false as it gets.

The goal of DLSS isn't to improve image quality, it's to improve your framerate. The fact that in some cases it manages to do both is the impressive part. Supersampling, on the other hand, lowers your framerate dramatically. If you want to do a proper image-quality comparison, what you should do is compare native 1080p with 4K DLSS Performance (which renders at 1080p). DLSS Performance would do so much better it would look like magic, but of course it wouldn't give you any fps gain compared to native 1080p.

Supersampling is much worse than DLDSR, which is basically using DLSS to downscale.

Of course, because I like being fair, contrary to what people believe: FSR is also BETTER than native. And what I mean by that is that FSR Quality at 4K IS better than native 1440p.
Guess we can agree we just handle things differently, but to say that more pixels, in every scene and every detail, is not the superior method is silly.

It's plain numbers: the pixels have to go somewhere, and they do.

The point of high-end upgrades is to achieve both within the confines of your target frame rate; if you can achieve 60fps by injecting more pixels because you upgraded, then that's the point.

DLSS and the like are a sad crutch for frames at the cost of image quality for the once-great PCMR.
 
DLSS and the like are a sad crutch for frames at the cost of image quality for the once-great PCMR.
But that is fundamentally wrong. Firstly because, even when used as an upscaler, DLSS improves image quality (going by Hardware Unboxed's video), and secondly because you don't need to use it as an upscaler.
but to say that more pixels, in every scene and every detail, is not the superior method is silly.
But native does not offer you more pixels. Native 1080p offers the exact same internal render resolution as 4K DLSS Performance, and the latter is painfully better in image quality. And so is FSR, btw. Native is dead, either because people use DLSS/FSR as upscalers to gain fps (while sometimes improving image quality) or because people with spare GPU horsepower use DLDSR (which is basically a reverse DLSS) to downscale from higher resolutions. With DLDSR and DLSS existing, there is no reason to ever play at native.

Or to put it better: let's say you have a 1440p monitor. Your option is either to play at native 1440p, or to render at 4K and then use DLSS Quality. The latter WILL look better while offering the same performance.
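The pixel math behind that 1440p example, as a sketch (assuming DLDSR's 2.25x factor, i.e. 1.5x per axis, and DLSS Quality's standard 2/3 per axis):

```python
# On a 1440p panel: DLDSR 2.25x presents a 4K target (1.5x per axis),
# and DLSS Quality fills that target from a render at 2/3 per axis.
native = (2560, 1440)
dldsr_target = (int(native[0] * 1.5), int(native[1] * 1.5))
dlss_q_render = (dldsr_target[0] * 2 // 3, dldsr_target[1] * 2 // 3)

print(dldsr_target)             # (3840, 2160)
print(dlss_q_render)            # (2560, 1440)
print(dlss_q_render == native)  # True: same render cost as native 1440p
```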
 
Supersampling is way better, as it handles the entire scene, every given second.

All games can be supersampled by adding a custom resolution in the driver.

I swear the days of DLSS have dumbed down the PCMR.
Of course supersampling is better, but I can't supersample when my native framerate is under 60fps and I want 60fps. That's the job of DLSS.

Personally I use DLSS because it offers a better image than reducing details, resolution or both. Being a purist and sticking to native would mean having a worse experience, which is... insane.
 
Nobody said otherwise. But the question is: since DLSS Performance (which is basically 1080p) looks equal to 4K native (according to Hardware Unboxed), isn't it safe to assume that DLSS Quality with high textures would look better than native with ultra textures, let alone FSR?

Or in other words, would you rather play at 1080p with ultra textures or at 4K with high textures? I think it's a pretty straightforward answer.

This is an albedo or colour map texture; I've downscaled it from 4K to 1K.

Go here and download Topaz Gigapixel AI (the trial version is free, with a watermark) and upscale it to 4K, so in the settings that's 4x.

Upload your result here. I have the original; let's see how they compare :)

[attached image: PbEb8cL.jpg]
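For anyone who wants to run a version of this experiment without Topaz, a rough stand-in using Pillow (Lanczos resampling instead of an AI upscaler; the filename is a placeholder for the texture above). The point is measuring what the round trip loses:

```python
# Downscale a 4K texture to 1K, upscale back, and measure what was lost.
# Pillow's Lanczos stands in for the AI upscaler in the test above.
from PIL import Image
import numpy as np

orig = Image.open("texture_4k.png").convert("RGB")  # hypothetical 4K original
small = orig.resize((1024, 1024), Image.LANCZOS)    # the "1K" version
restored = small.resize(orig.size, Image.LANCZOS)   # back up to 4K

a = np.asarray(orig, dtype=np.float64)
b = np.asarray(restored, dtype=np.float64)
mse = ((a - b) ** 2).mean()
psnr = 10 * np.log10(255**2 / mse) if mse else float("inf")
print(f"PSNR vs original: {psnr:.1f} dB")  # finite => detail was lost
```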

 
This is an albedo or colour map texture; I've downscaled it from 4K to 1K.

Go here and download Topaz Gigapixel AI (the trial version is free, with a watermark) and upscale it to 4K, so in the settings that's 4x.

Upload your result here. I have the original; let's see how they compare :)

[attached image: PbEb8cL.jpg]

But in Hogwarts Legacy the ultra textures are not four times the resolution of the high textures, are they? I mean, you already know this, don't you? I don't understand why you refuse to accept that the 3060 Ti beats every non-Nvidia card in image quality, but okay, you do you, man.
 
But in Hogwarts Legacy the ultra textures are not four times the resolution of the high textures, are they? I mean, you already know this, don't you? I don't understand why you refuse to accept that the 3060 Ti beats every non-Nvidia card in image quality, but okay, you do you, man.

I don't know what the resolution differences are, any more than you do. This has nothing to do with that.

I'm trying to show you how AI cannot replace missing detail, because in binary you cannot get 1 from 0: no data, no output. "Augment and enhance" is science fiction, and it always will be for this reason. People like Tim from HUB declaring "this is better than native" is ######. He says that because he thinks science fiction is now fact, and because he's on screen saying it with such confidence and authority, people believe it, but those people are as clueless as he is. I'm not casting any aspersions here, aside from my frustration with the quality of this journalism by clueless people who think they know everything or trust implicitly in anything Nvidia tells them.

If it looks better than native, then the texture is the same resolution with post-processing filters passed over it; it is not augmented and enhanced, and if it's the same resolution then it carries the same weight in your VRAM.
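The "no data, no output" point can be made concrete: many different high-resolution images collapse to the same low-resolution one, so no upscaler can know which original produced it. A minimal sketch:

```python
# Two different 2x2 blocks that average to the identical 1x1 pixel:
# once downscaled, no algorithm can tell which original it came from,
# so any "restored" detail is a guess, not recovered data.
import numpy as np

block_a = np.array([[0.0, 255.0], [255.0, 0.0]])      # checkerboard
block_b = np.array([[127.5, 127.5], [127.5, 127.5]])  # flat grey

print(block_a.mean(), block_b.mean())  # 127.5 127.5 -> same downscaled pixel
```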
 
I don't know what the resolution differences are, any more than you do. This has nothing to do with that.

I'm trying to show you how AI cannot replace missing detail, because in binary you cannot get 1 from 0: no data, no output. "Augment and enhance" is science fiction, and it always will be for this reason. People like Tim from HUB declaring "this is better than native" is ######. He says that because he thinks science fiction is now fact, and because he's on screen saying it with such confidence and authority, people believe it, but those people are as clueless as he is. I'm not casting any aspersions here, aside from my frustration with the quality of this journalism by clueless people who think they know everything or trust implicitly in anything Nvidia tells them.

If it looks better than native, then the texture is the same resolution with post-processing filters passed over it; it is not augmented and enhanced, and if it's the same resolution then it carries the same weight in your VRAM.


:cry:
 
I don't know what the resolution differences are, any more than you do. This has nothing to do with that.

I'm trying to show you how AI cannot replace missing detail, because in binary you cannot get 1 from 0: no data, no output. "Augment and enhance" is science fiction, and it always will be for this reason. People like Tim from HUB declaring "this is better than native" is ######. He says that because he thinks science fiction is now fact, and because he's on screen saying it with such confidence and authority, people believe it, but those people are as clueless as he is. I'm not casting any aspersions here, aside from my frustration with the quality of this journalism by clueless people who think they know everything or trust implicitly in anything Nvidia tells them.

If it looks better than native, then the texture is the same resolution with post-processing filters passed over it; it is not augmented and enhanced, and if it's the same resolution then it carries the same weight in your VRAM.
Dude, one picture is worth a thousand words, I guess. You can't seriously tell me it's just Nvidia marketing that makes me think the AI image on the right is higher quality than the native image on the left.

[attached image: DLSS-power.png]
 