RTX 4070 12GB, is it Worth it?

The averages don't lie: the difference is 2% in raster (there is nothing further to argue about there).
[Chart: relative performance at 3840x2160]


And in RT the 4080 is 20-45% faster in games that make heavy use of RT.
Plus:
less power consumption and less heat output
DLSS, which is superior to FSR
and Frame Generation (FG)

The only advantages of the 7900XTX are:
2% faster in raster
better control panel
less driver overhead in DX12 but it is the opposite in DX10/11 titles

At the end of the day both are bad value, but the 7900XTX is even worse value: for only 10% more money the 4080 offers so much more than the 7900XTX.
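As a rough sketch of that 10% figure in perf-per-pound terms (the prices and raster scores below are purely hypothetical illustrations of the "~10% more money" and "~2% faster" claims, not real listings or benchmarks):

```python
# Hypothetical prices and raster scores, chosen only to mirror the
# "~2% faster raster, ~10% more money" framing above.
cards = {
    "7900 XTX": {"price_gbp": 1000, "raster_score": 102},
    "RTX 4080": {"price_gbp": 1100, "raster_score": 100},
}

for name, c in cards.items():
    per_100 = c["raster_score"] / c["price_gbp"] * 100
    print(f"{name}: {per_100:.1f} raster points per £100")
# ~10.2 vs ~9.1: on raw raster per pound the XTX is ahead, so the ~10%
# premium is effectively the price of the RT lead, DLSS and FG.
```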
I have said since the very beginning that people (if they are willing) are essentially paying the extra for the 4080 for Nvidia features (DLSS, better RT performance and so on), while the 7900XTX is generally faster than it in 4K rasterisation.

As mentioned, the so-called 2% on the relative performance chart is only good as a quick reference, not an accurate representation of the real world, because of the way it adds everything up and then divides it into an average (and some of the games in the test where the 4080 is faster are older titles known to favour Nvidia heavily, e.g. Civ VI, The Witcher 3, Borderlands 3, Metro Exodus). Most people are more interested in how much faster a card might be in the games they specifically play, and whether it is likely to be faster in upcoming games in general.
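As a minimal sketch of that averaging point, with entirely made-up per-game FPS numbers (none of these are TPU's results, and a simple arithmetic mean of per-game ratios is assumed):

```python
# Made-up FPS for two hypothetical cards, A and B, showing how a single
# suite-wide average can hide large per-game swings.
fps = {
    #  game                  card_a  card_b
    "RT-heavy title":          (60,   75),
    "Older Nvidia-leaning":    (140,  147),
    "Newer raster title 1":    (120,  102),
    "Newer raster title 2":    (100,   88),
}

ratios = [b / a for a, b in fps.values()]     # per-game card-B-vs-card-A ratio
suite_average = sum(ratios) / len(ratios)     # simple arithmetic mean

for (game, _), r in zip(fps.items(), ratios):
    print(f"{game:24s} {r:5.2f}x")
print(f"{'Suite average':24s} {suite_average:5.2f}x")
# The average lands at ~1.01x even though the individual games swing from
# -15% to +25%, which is the objection above about the headline 2% figure.
```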

Simply put, anyone who cares enough about things like RT or DLSS and doesn't mind paying the extra should go for the 4080; those who only care about raw performance, aren't bothered about RT or upscaling, and aren't interested in spending £100-£200 extra could go for the 7900XTX. That said, as already mentioned, those who don't mind spending the extra for the 4080 would probably go all the way and stretch to the 4090.

With that said, the 7900XT 20GB is looking more and more like a tempting proposition with its price dipping toward £750, as it now has better price/perf than the 7900XTX, 4070Ti and 4080.
 
So I said the same thing and he showed me a screengrab from his emails from when he bought it - direct from Nvidia. Did they change the price at all?


No, they did bring out a 2070 Super and bumped the price for that to $499. I remember paying £480 for this one...

 

If you're going to put up the slide for raster and make claims about RT, then also put up the slide for RT: it's 16%, and a chunk of that is down to Cyberpunk, which has unplayable frame rates even on the 4090. Cyberpunk is just an RTX demo...


[Chart: relative RT performance]
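To put the outlier point in concrete terms, here is a toy calculation with invented per-game RT leads (not TPU's data) showing how much one heavy title can contribute to the headline average:

```python
# Hypothetical 4080-vs-7900XTX RT leads (percent) for a small suite; the
# numbers are invented purely to show how one outlier skews the mean.
rt_lead_percent = {
    "Cyberpunk 2077": 60,   # hypothetical heavy outlier
    "RT game B": 12,
    "RT game C": 8,
    "RT game D": 5,
    "RT game E": 2,
}

all_games = list(rt_lead_percent.values())
rest = [v for g, v in rt_lead_percent.items() if g != "Cyberpunk 2077"]

print(f"Average lead, all games:         {sum(all_games) / len(all_games):.1f}%")
print(f"Average lead, without Cyberpunk: {sum(rest) / len(rest):.1f}%")
# 17.4% vs 6.8% in this toy example: the headline RT figure leans heavily
# on one title, which is the point being made about the 16%.
```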
 
New games tend to have some form of RT, which obviously favours Nvidia; for all the old and non-RT stuff both have practically identical performance. We cannot ignore RT this generation anymore, because GPUs have become powerful enough to allow good performance with the help of FG+DLSS. Anyone who is going to get an expensive high-end GPU also cares about RT.

Basically in the real world the 4080 delivers much better performance than the 7900XTX.
 
Control and Cyberpunk are pre-RDNA and were never made with that GPU architecture in mind. They are RTX demos.

Later RT games tend to do much better on AMD GPUs; for example, Resident Evil Village is the newest game TPU tested, and the difference between the 4080 and the 7900XTX is 2%.

[Chart: Resident Evil Village benchmark, 4080 vs 7900XTX]
 
Anyone who is going to get an expensive high-end GPU also cares about RT.

Incorrect. I do not give a rat's backside about RT. I also do not care about frame generation or upscaling of any type.

RT is just going to be used to drive sales of new cards by making the old cards just that little bit too slow when a new game comes out. We have seen this countless times over the past 20 years with shadows, pixel shaders and tessellation. A games company will take Nvidia's cash to build the game so it will only really run on the latest card, and everyone will be so tempted to buy the new shiny. With most of even the most demanding games you can either buy a 4090 or turn the RT down a notch and not notice a difference, unless you like pretty puddles or want to pause and pixel peep.

No matter how much Nvidia tries to brainwash you into thinking it, RT is not the only feature that actually matters.
 

You're right of course: my 2070 Super is still a perfectly good 1440p GPU unless I use RT, so why should I care about RT at all? I'm not going to buy the latest GPU every year, so it's just not relevant to me.
 
True. It is just one feature that matters. Especially if you like your graphics :)

Most of the gamers I know who have bought RTX 3000 cards (let alone the AMD ones) bought them for the rasterised performance. But I know nobody who has anything more than an RTX 3080 10GB in the real world, and that was bought as part of a system to run MS Flight Simulator!

The only games I knew anyone to bother trying RT on were Cyberpunk 2077 and SOTTR, and at least from my own experience and talking to mates, we literally had to turn down settings anyway with the former. It's something nice to have, I don't deny it, but most of the games out there still need decent rasterised processing power! Even Atomic Heart, which I thought would be an RT showcase, especially with all the mentions from Nvidia, shipped with no RT. Yet it still looks really nice.

However, we seriously need something like at least a doubling of current performance on the Nvidia side (under £500), and even more on the AMD side, for RT to really be something that can be used liberally. So for me the RTX 5000/RX 8000 series is where things will hopefully get interesting, especially if the consoles get refreshes with more powerful GPUs.
 

I have had it in more games, but it can be hit and miss. In some titles it makes little difference. But when you use it together with DLSS, you get back most of the performance you lost to RT anyway.
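A back-of-the-envelope illustration of that trade-off, with purely hypothetical numbers rather than figures from any particular game:

```python
# Hypothetical frame rates: RT costs ~40% here and DLSS Quality gives ~60%
# back, figures chosen only to illustrate the "DLSS recovers the RT cost" idea.
native_fps  = 90
rt_cost     = 0.40     # fraction of performance lost when RT is enabled
dlss_uplift = 1.60     # multiplier from switching to DLSS Quality

rt_only_fps = native_fps * (1 - rt_cost)
rt_dlss_fps = rt_only_fps * dlss_uplift

print(f"Native raster:  {native_fps:.0f} fps")
print(f"RT enabled:     {rt_only_fps:.0f} fps")
print(f"RT + DLSS:      {rt_dlss_fps:.0f} fps")
# 90 -> 54 -> ~86 fps: most (but not quite all) of the RT cost comes back,
# which lines up with the hit-and-miss experience described above.
```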
 

In Cyberpunk 2077 there are parts where, even with DLSS, performance can get dicey, so it meant turning down a whole lot of effects. In games which run better, the effects are less pronounced. It's why Nvidia is trying to copy TVs and add inserted frames, because I have the feeling that dGPUs will be lagging behind the software for the immediate future, especially in the sub-£500 area. But what is quite hilarious is that Nvidia's overpricing means an RTX 3050 is fighting an RX 6600/RX 6600 XT. Not even RT can save the RTX 3050!
 
The PS6 will launch in 2027 at the earliest.

Tech moves fast though. Asus are about to launch a Steam Deck competitor; he's not allowed to say what hardware is in it, but it was later revealed to have Zen 4 CPU cores, so it's probably a Phoenix APU (it has a suspiciously AMD-looking control OSD panel). At the same 15 watts it's more than 50% faster than the APU in the Steam Deck, which can run Spider-Man at 720p 60Hz.
The Asus unit can go to 35 watts, and at that it's 2x as fast. At 35 watts it's probably as fast as the APU in the Xbox Series S, which is 85 watts.
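Taking the post's own figures at face value (they are claims about an unreleased device, not measurements), the rough perf-per-watt picture would look like this:

```python
# Relative performance and power draw as quoted in the post; the Steam Deck
# APU at 15 W is the 1.0x baseline. These are the post's claims, not test data.
devices = {
    "Steam Deck APU @ 15 W":    (1.0, 15),
    "Asus handheld APU @ 15 W": (1.5, 15),   # "more than 50% faster"
    "Asus handheld APU @ 35 W": (2.0, 35),   # "2x as fast"
    "Xbox Series S APU @ 85 W": (2.0, 85),   # "probably as fast" as the above
}

for name, (perf, watts) in devices.items():
    print(f"{name:26s} {perf / watts:.3f} relative perf per watt")
# If the claims hold, the newer mobile APU does at 35 W roughly what the
# Series S needs ~85 W for, which is the efficiency point being made.
```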

These looooong console cycles are going to be the death of them, given that everyone and their cat is now jumping on the handheld gaming PC bandwagon.

It's a really nice bit of kit.

 
Anyone remember Dragon's Lair?
Lots of pre-determined footage and the player is mostly a spectator, surely the perfect showcase for Fake Frames, sorry Frame Generation!

Maybe it is time for LaserDisc games to make a return :D

Mind you, isn't watching youtubers playing games the modern equivalent of spectating at Dragon's Lair in an arcade?
 
PS6 in 2027, is that long enough for Intel ARC™ Celestial to be in the running? Rumours are Battlemage 2024, Celestial 2026. Both TSMC, so don't expect the saviour of mainstream PC gaming to come from Intel then...

The hardware for a Steam Deck is surely fairly easy. Getting tons of games certified is way harder. Plus, is the Asus one going to run Windows so MS want their cut, or are they going to use SteamOS?
 

Also means that a PS5 PRO should be happening too.
 

As discussed more than a year ago then, when we had a handful of capable RT games (that won't be playable on the mass-market tiers) and not enough games out to expose the stingy VRAM models. Gotcha! :cry:
 
PS6 in 2027, is that long enough for Intel ARC™ Celestial to be in the running? Rumours are Battlemage 2024, Celestial 2026. Both TSMC, so don't expect the saviour of mainstream PC gaming to come from Intel then...

The hardware for a Steam Deck is surely fairly easy. Getting tons of games certified is way harder. Plus, is the Asus one going to run Windows so MS want their cut, or are they going to use SteamOS?

Leaks suggest it's AMD again.

Also means that a PS5 PRO should be happening too.

Late 2024.
 
As discussed more than a year ago then, when we had a handful of capable RT games (that won't be playable on the mass-market tiers) and not enough games out to expose the stingy VRAM models. Gotcha! :cry:
It was the same with Rollo, etc., going on about tessellation during the Fermi era. The issue is that the mainstream and entry-level Nvidia dGPUs were not great with it. A lot of people moaned that ATI introducing the ability to manually adjust it was "cheating". But I remember it meant that, in a number of cases, the ATI cards could tweak it and run at better FPS than the lower-end and mainstream Nvidia cards, which were forced to run it at full brunt but lacked the processing power. The same happened with Kepler and The Witcher 3 (I think the GTX 780 series was OK though, and it was the lower-end models which had the issues).

Leaks suggest it's AMD again.



Late 2024.
Rumours say this year.
 
I just posted a video in The Last of Us showing that DLSS Quality is better than native...

And DLSS/FSR with a 100% render scale input will be better than DLSS Quality, meaning a native-resolution image pushed through the DLSS/FSR temporal AA can often do a better job than a game's native TAA solution. DLAA and the FSR equivalent need to be in every game IMO.
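As a small sketch of what a 100% render scale input means in practice; the per-axis scale factors below are the commonly cited DLSS presets and should be treated as approximate:

```python
# Internal render resolution at a 4K output for a few temporal-upscaler
# presets. DLAA is simply the same reconstruction pass fed a 100% scale input.
output_w, output_h = 3840, 2160
presets = {
    "DLAA (100% scale)":  1.00,
    "DLSS Quality":       0.667,
    "DLSS Balanced":      0.58,
    "DLSS Performance":   0.50,
}

for name, scale in presets.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name:20s} renders internally at {w}x{h}")
# At 100% scale there is no upscaling left to do, so the temporal pass is
# competing purely with the game's own TAA, which is the comparison above.
```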
 