
RTX 4070 12GB, is it worth it?

Status
Not open for further replies.
If I bought a very high end card (say £1,000+) I'd like to be able to play with minimum framerates of 60 or above at 4K. That's pretty much all I would expect.

It's way worse than that, mate :)

No one looking at this thinks RT is worth worrying about, which is why no one does: you need a £2000 GPU for 40 FPS, lol. What a load of old ____

 
It's nearly £400 (45%) more expensive for a 15% difference in performance, so yes, of course I would....
I'm not in violent disagreement with that statement, but the difference isn't just the performance.
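As a quick sanity check of that arithmetic (a sketch in Python, using only the 45% price gap and 15% performance gap quoted above as inputs):

```python
# Rough performance-per-pound comparison using the figures quoted above:
# the dearer card costs 45% more and is 15% faster.
def perf_per_pound_ratio(price_premium: float, perf_gain: float) -> float:
    """Relative performance-per-pound of the dearer card vs the cheaper one."""
    return (1.0 + perf_gain) / (1.0 + price_premium)

ratio = perf_per_pound_ratio(price_premium=0.45, perf_gain=0.15)
print(f"{ratio:.2f}")  # ~0.79: roughly 21% less performance per pound
```

In other words, on these numbers alone the dearer card delivers about a fifth less performance per pound, which is the point being argued over here.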

That's what I was saying all along about the 4070 Ti as well, since it had the better raster performance per dollar for the last four months or so, but nobody seemed to agree back then. Now that AMD has the better performance per dollar, it's back on the table as an argument. Okay, I guess.
 
I think that frame generation would have to deliver 2x the performance of native, when RT is enabled, in order for ray tracing to be viable.

It has to be good enough to negate the performance hit of enabling RT entirely.

Even then, lots of people seem resistant to the idea of generating frames (even though it only applies to half of the frames).
 
Oh right, yeah, you need to downscale it to 720p to get playable performance out of it on a £2000 GPU :cry:

No one thinks it's worth a damn, and that's why....
4K DLSS Quality runs at an internal resolution of 1440p and looks way, way better than native 1440p. I would argue that in lots of cases it looks better than native 4K. It's stuff like that which makes me think you are too far gone into your Nvidia hating to look at things objectively. DLSS is probably the best thing that has happened in the GPU space, ever.
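For reference, the internal render resolution follows a fixed per-axis scale factor for each DLSS mode. A sketch (the factors are the commonly published ones; individual games can override them, so treat this as an approximation, not Nvidia's actual code):

```python
# Approximate per-axis render scale for each DLSS 2 mode (assumption:
# commonly published values; games may tweak them).
DLSS_SCALE = {
    "quality": 2 / 3,            # 4K output -> 1440p internal
    "balanced": 0.58,
    "performance": 1 / 2,        # 4K output -> 1080p internal
    "ultra_performance": 1 / 3,  # 4K output -> 720p internal
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "quality"))  # (2560, 1440)
```

On these factors, the "720p" figure thrown around in this thread corresponds to Ultra Performance mode at 4K, not Quality, which renders at 1440p internally.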
 
I think that frame generation would have to deliver 2x the performance of native, when RT is enabled, in order for ray tracing to be viable.

Even then, lots of people seem resistant to the idea of generating frames (even though it only applies to half of the frames).
FG is useless on its own. You need 60-70+ FPS before activating it, or the latency impact is very noticeable. But DLSS Quality already runs Cyberpunk at 70-80+ FPS; it runs great and it's jaw-dropping.
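The latency point can be made concrete with rough frametime maths (a sketch; real input latency also depends on the game, the render queue, and Reflex, none of which are modelled here):

```python
# Why a 60-70+ FPS base matters before enabling frame generation:
# FG roughly doubles the *displayed* frame rate, but input is still
# sampled at the base rate, so felt latency tracks the base frametime.
def frametime_ms(fps: float) -> float:
    """Time per rendered frame in milliseconds."""
    return 1000.0 / fps

for base_fps in (30, 70):
    shown = base_fps * 2  # one generated frame per rendered frame
    print(f"base {base_fps} fps -> shown {shown} fps, "
          f"input still sampled every ~{frametime_ms(base_fps):.1f} ms")
```

At a 30 FPS base the game still responds on a ~33 ms cadence even though 60 FPS is displayed, which is why the latency is noticeable; from a 70 FPS base it is ~14 ms and far harder to feel.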
 
It looks better than native 4K.
In my experience, 4K resolution with good temporal AA provides the best possible image quality. You can always add a sharpening filter on top, tweaked to the amount the user prefers.
 
Assuming I won a competition and could pick between the two, I would take an RTX 4080 over a 7900 XT without batting an eyelid.

Same.

- DLSS is still superior in uptake and quality
- frame generation is useful whether people like it or not; even HUB are starting to like it now and think it is a worthwhile feature
- RT performance

RT in Cyberpunk is just bad on any GPU; it's the game that makes people think "RT is nowhere near mainstream yet, so why should I care?"

Oh right, yeah, you need to downscale it to 720p to get playable performance out of it on a £2000 GPU :cry:

No one thinks it's worth a damn, and that's why....

Another thing you're wrong about: even your dear HUB agreed DLSS and frame generation are great in Cyberpunk. I guess they aren't reliable now though?
 
In my experience, 4K with good temporal AA provides the best possible image quality.
Yes, but not all games have a good temporal AA solution. In lots of games I activate DLSS not because I need the FPS but because it just looks better than native. CoD Cold War was a prime example; The Last of Us is another. But even against a good AA solution, DLSS Quality isn't far behind. It definitely gets 90%+ of the way there.

Whoever claims anything like humbug did about 720p either isn't honest or hasn't tried it.
 
FG is useless on its own. You need 60-70+ FPS before activating it, or the latency impact is very noticeable. But DLSS Quality already runs Cyberpunk at 70-80+ FPS; it runs great and it's jaw-dropping.
Well, that is going to be the only way to consistently manage 60 FPS at 4K with RT enabled on today's hardware.

Presumably it's what they will (eventually) attempt on consoles also.
 
4K DLSS Quality runs at an internal resolution of 1440p and looks way, way better than native 1440p. I would argue that in lots of cases it looks better than native 4K. It's stuff like that which makes me think you are too far gone into your Nvidia hating to look at things objectively. DLSS is probably the best thing that has happened in the GPU space, ever.
Let me tell you something.

Sometimes it's good to listen to your normie mates who don't give a rat's arse about the technicalities of it. I have a lot of those, and they all think it's a joke that you have to press a button that runs your game at 720p or it all becomes unplayable on a £2000 GPU.

Half the time that puts them off Nvidia, because all they know is that it's Nvidia doing all of this. They not only think it's funny, they think it's too complicated. They're just looking for a GPU, and because AMD don't have RTX and DLSS it's just a GPU, so "I'll buy that, thanks"....

Nvidia.... they're just pushing it too far...
 
Let me tell you something.

Sometimes it's good to listen to your normie mates who don't give a rat's arse about the technicalities of it. I have a lot of those, and they all think it's a joke that you have to press a button that runs your game at 720p or it all becomes unplayable on a £2000 GPU.

Half the time that puts them off Nvidia, because all they know is that it's Nvidia doing all of this. They not only think it's funny, they think it's too complicated. They're just looking for a GPU, and because AMD don't have RTX and DLSS it's just a GPU, so "I'll buy that, thanks"....

Nvidia.... they're just pushing it too far...
Again, it's not 720p. That's just disingenuous. DLSS Quality looks as good as native in most situations.

Also, AMD has both RT and FSR, so it's not "just a GPU" either, I guess.

The video below was taken from my PC; it's super obvious that DLSS > native. Look at all the shimmering in native. "720p", sure man, whatever.

 
It's true that you pay a big premium for the RT and DLSS features on recent Nvidia GPUs.

They are ultimately just optional, far from essential.
 
Well, that is going to be the only way to consistently manage 60 FPS at 4K with RT enabled on today's hardware.

Presumably it's what they will (eventually) attempt on consoles also.
The 4090 managed it without FG, at least in current games.
 
Yes, but not all games have a good temporal AA solution. In lots of games I activate DLSS not because I need the FPS but because it just looks better than native. CoD Cold War was a prime example; The Last of Us is another. But even against a good AA solution, DLSS Quality isn't far behind. It definitely gets 90%+ of the way there.

It doesn't matter what you say or post, you'll still get the upscaling/DLSS haters. Literally every reviewer possible has now pointed out how DLSS (and sometimes, although very rarely, even FSR 2+) is better than native res "overall": HUB, DF, TPU, ComputerBase, PCGamesHardware, OC3D, Gamers Nexus (apparently some of them are the most reputable, and only reputable, sites when it comes to VRAM issues though, funny eh....), all backed up with evidence. Meanwhile, you still get the couple of posters on here who are adamant it's worse, yet they don't post anything to back it up, and if they do, it's one cherry-picked scene :cry:
 
Again, it's not 720p. That's just disingenuous. DLSS Quality looks as good as native in most situations.

Also, AMD has both RT and FSR, so it's not "just a GPU" either, I guess.

The video below was taken from my PC; it's super obvious that DLSS > native. Look at all the shimmering in native. "720p", sure man, whatever.


You're either not reading what I said or you just don't get it. No one cares.....
 
The 4090 managed it without FG, at least in current games.
The 4090 is quite good to be fair, especially in traditional graphics performance (not gonna comment on the price and power supply requirements :cry: ).

But what that shows you is that it's far beyond the reach of about 99% of customers. I'd still argue the RT performance isn't good enough and hasn't been improved enough compared to the last generation (not counting any other bells and whistles).

And the 4090 Ti is still to come, no doubt.
 
You're either not reading what I said or you just don't get it. No one cares.....

Exactly. No one cares when you don't post anything to back up rubbish such as this:

Let me tell you something.

Sometimes it's good to listen to your normie mates who don't give a rat's arse about the technicalities of it. I have a lot of those, and they all think it's a joke that you have to press a button that runs your game at 720p or it all becomes unplayable on a £2000 GPU.

Half the time that puts them off Nvidia, because all they know is that it's Nvidia doing all of this. They not only think it's funny, they think it's too complicated. They're just looking for a GPU, and because AMD don't have RTX and DLSS it's just a GPU, so "I'll buy that, thanks"....

Nvidia.... they're just pushing it too far...

Oh right, yeah, you need to downscale it to 720p to get playable performance out of it on a £2000 GPU :cry:

No one thinks it's worth a damn, and that's why....

Honestly, got to be trolling at this stage now surely....
 