
RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
That's just it. RT is the future, but it was unfortunately brought to market too soon, on hardware that can't handle it properly.

But they are now doing path tracing... at 16 FPS native on a 4090... but of course if you turn on DLSS 3's fake frames (frame generation was meant to double your frame rate), magic happens: it goes to 127 FPS... :rolleyes:


Anyways, I'm bored of this FG topic now, as it's just unreal to me that performance is now measured with fake inserted frames. If a GPU can't do 30-60 fps at native resolution with a new graphics-enhancing feature enabled, then it needs better hardware before that feature is used. Fake frames and upscaling from lower resolutions are not how we measured GPU performance historically. Apples to apples, not apples to coconuts.
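To make the argument above concrete, here's a rough back-of-the-envelope sketch of why interpolated frames inflate the fps counter without speeding up the real render loop. All numbers are illustrative, not measured DLSS 3 figures:

```python
# Sketch: displayed fps vs. real render rate when generated frames are
# inserted. Illustrative maths only, not official DLSS behaviour.

def effective_fps(native_fps: float, inserted_per_real: int = 1) -> float:
    """Displayed fps when N generated frames are inserted per real frame."""
    return native_fps * (1 + inserted_per_real)

def render_latency_ms(native_fps: float) -> float:
    """Input-to-photon latency is still bounded by the real render rate
    (interpolation can even add a queued frame on top of this)."""
    return 1000.0 / native_fps

native = 16.0  # path tracing at native res, as quoted above
print(effective_fps(native))      # 32.0 displayed fps with 1:1 interpolation
print(render_latency_ms(native))  # 62.5 ms per real frame either way
```

Note that 16 fps doubling to only 32 also shows the quoted 127 fps can't come from frame generation alone; most of it is the upscaler rendering far fewer pixels first.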
 
yes but Nvidia claims the shadows will be "physically correct" so it must be worth it :rolleyes:
 
They'll probably come out with DLSS 4 in another two years, which will just repeat each frame to double your fps while also adding a fake frame in between.
 
Keep hearing the 3090 is only 10% faster than a 3080, yet the 3090's lows are higher than the 3080 Ti's average fps?

Only 10% remember how many said that.

big-brain-point-finger-at-forehead.gif
 
I really hate to be right, having seen this coming all the way back when Nvidia first introduced DLSS...

...same with Nvidia's introduction of the first Titan; I saw it coming that it was their first step toward lifting the pricing of cards at all tiers below. It took them 10 years, but they have finally done it with the 4000 series.
 
A few of us warned people about the Titan, and how Nvidia shifted tiers back then. We also pointed out how mainstream dGPU performance gains were starting to stagnate from 2015/2016. Then we warned about what DLSS would eventually be used for. Some of us also tried to tell people not to pay over the odds for dGPUs during the pandemic. But of course, people just carried on as normal, and now we have what PCMR wanted.
 
Prices have increased by a tier while the cards' physical specs have dropped by a tier.

The 4080 is really a 4070 Ti but priced like a 4080 Ti.

The 4070 Ti is really a 4060 Ti but priced above an 80-class card.

The 4070 will be priced like a 4070 Ti while being specced around a 4060.
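One way to sanity-check the tier-shift claim is to look at how much of each generation's biggest die an 80/70-class card gets. A small sketch using approximate public CUDA core counts (figures from memory; treat as ballpark, not a spec sheet):

```python
# Approximate shader counts for each generation's full flagship die.
FULL_DIE = {"Pascal": 3840, "Ampere": 10752, "Ada": 18432}  # GP102/GA102/AD102

# Approximate CUDA core counts for a few cards (ballpark public figures).
cards = {
    ("GTX 1080", "Pascal"): 2560,
    ("RTX 3080", "Ampere"): 8704,
    ("RTX 4080", "Ada"): 9728,
    ("RTX 4070 Ti", "Ada"): 7680,
    ("RTX 4070", "Ada"): 5888,
}

for (name, gen), cores in cards.items():
    pct = 100 * cores / FULL_DIE[gen]
    print(f"{name}: {pct:.0f}% of the generation's biggest die")
```

On these rough numbers the 3080 sat around 80% of the big die, while the 4080 sits near half and the 4070 around a third, which is roughly the "down a tier" pattern described above.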
 
I am well aware of this.

Nvidia and their smoke-and-mirrors distractions, playing the naming game.

I am not blind enough to miss how, with the 1080 Ti and 980 Ti, the 80 Ti was the top card, and now they just call the top card an xx90 to further elevate the pricing.
 
You could argue the RTX 4060 Ti is more like an RTX 3060 replacement, so it should be named an RTX 4060. The RTX 4060 Ti should use a cut-down AD104 (the RTX 4070 Ti's die).
But with Kepler, they essentially rebranded the GTX 560 Ti replacement as a GTX 680, and so made the Titan the new, more expensive high end. People defended this move back then, which was similar in some ways to what Nvidia is doing now.

Only because AMD's Hawaii existed were they forced to turn the Titan into the GTX 780/GTX 780 Ti with the rejigged GTX 700 series. The moment AMD couldn't compete at the high end after that, they yet again made the 80 series a class based on what the 60/60 Ti used to be, and made the 80 Ti range permanent.

This is when the decline, and the stagnation, of mainstream dGPUs started. The GTX 460/GTX 560 Ti were based on the second-tier dGPU. The RTX 4060 will be based on a fourth-tier dGPU.

Turing was the second "re-alignment", and only because the RX 6000 series was decent were they forced to make Ampere better. Now they're trying a third "re-alignment". Foolish PCMR never learns.
 
Regarding a 'minimum playable framerate', I aim for either 60 or 70 fps. Above 60 I can't tell much difference, and the max refresh of my monitor is 70 Hz anyway.

Anything over 59 fps feels very smooth to me; the most important thing is performance consistency.
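The consistency point is easy to illustrate: two frame-time traces can average the same fps while feeling completely different. A small sketch with made-up frame times:

```python
# Two traces with the same average frame rate but very different smoothness.
from statistics import mean, pstdev

steady = [16.7] * 10              # ~60 fps, even pacing
spiky  = [10.0] * 9 + [77.0]      # same total time, one big stutter

for name, frame_times_ms in (("steady", steady), ("spiky", spiky)):
    avg_fps = 1000.0 / mean(frame_times_ms)
    print(f"{name}: ~{avg_fps:.0f} fps average, "
          f"jitter {pstdev(frame_times_ms):.1f} ms")
```

Both traces report roughly 60 fps average, but the second one hides a 77 ms hitch, which is why frame-time variance (or 1% lows) says more about smoothness than the average does.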
 
The RTX 3090 is not really relevant to new buyers these days. Wasn't it priced over £1,000 at launch?

Edit - The FE was priced at £1,399, so it's in the >£1,000 silly-money category.

In comparison, the 3080 Ti seemed somewhat impressive when it came out, considering the similar spec.

tbf, the 3080 Ti FE was priced just over £1,000, so also a silly price.

The 3090 was priced that way because Nvidia knows there are always people who will pay for that amount of VRAM, especially if it launches first.
 
They should sell some cards without RT to get the price down.
That would be nice, wouldn't it?

Throw in some cheaper GDDR6 too.

It probably won't happen, because it's not worth maintaining different chip designs, which is a shame considering that many games don't have RT, or only implement it partially.
 
Clearly gamers can't rely on a graphics card having lots of VRAM by itself, because otherwise GPUs like the RX 6800 would be able to keep up in demanding titles at 1440p and above (although it is still a bit ahead of the RTX 3080 10GB in games like The Last of Us).
 
The 3080 Ti was never good value, especially as it came some time after the 70/80/90 wave. The focus was on the "% faster" posts, as that's a figure often rushed out in arguments, when in reality you could see there was a real difference between the cards, especially at 4K or with RT involved. It even shows now at 1440p, which is why the weaker Ada cards now launching are not being well received.
 
Also, all this talk of "back then" makes me want to fire up Quake, Heretic, Duke Nukem 3D and Redneck Rampage when I get home :D

I find a lot of games from back then are better left in nostalgia land. The one that I keep going back to every few years is Deus Ex, though. That never gets old for me. I really hope they release a path-traced version of it like Portal.
 
Get a better monitor. I said the same thing at 60 fps: to anyone on a higher refresh rate it was "can't tell, don't need it", blah blah. Then you see the smoother animations and you realise how wrong you were.
 