RTX 4070 12GB, is it Worth it?

Well, that's my point: if DLSS lowered image quality like you are suggesting, then native 1440p should look better than 4K DLSS Quality, since they both render at 1440p. But it doesn't. Native 1440p actually looks much worse. Which is the point.
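For reference, the internal render resolutions behind that comparison fall out of DLSS's commonly cited per-axis scale factors (about 0.67 for Quality, 0.5 for Performance). A quick Python sketch, treating those factors as approximate since they can vary per title:

```python
# Rough sketch: approximate internal render resolution for common DLSS modes.
# Scale factors are the commonly cited per-axis values and may vary per title.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    # 4K output with DLSS Quality renders internally at ~2560x1440,
    # i.e. the same pixel count as native 1440p, hence the comparison above.
    print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
    print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```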

Well yeah, that's pretty obvious, as lowering the resolution on your monitor is going to look bad. In real-world use people don't do that. If I have a 4K monitor I'm not going to run 1080p or 1440p because it will look bad; I'm going to choose either 4K native resolution or 4K with DLSS if the game supports it.
4K native on a 4K monitor looks better than 4K with DLSS, especially in motion, though in some games DLSS does look near indistinguishable; but there are always scenes where you can see artifacts if you go out of your way to look for them.

Anyway this thread is getting a bit silly now. Maybe drop the whole upscaling discussion.
 
Well, moving the thread back on topic from the discussion around pixels: I bought an FE model for science! (I also own two Arc A7xx GPUs purely for science... :cry: - although given current pricing and performance uplifts they are actually not a bad choice at all).

Mainly I want to test out DLSS 3 and see what can be achieved by messing around with power limits (something I enjoy doing on my laptops, given the fixed max TGP enforced there). The 4070 is not great value vs previous generations, but it's the cheapest card that meets my testing requirements (short of waiting for the VRAM-gimped 4060Ti)*. I will likely pick up a 7900XTX/4080Ti (the latter if/when launched) when prices come down, but for now this will do for my purposes. For context, my TUF 7900XT was returned when it started to howl like a banshee in Destiny 2; the actual GPU itself was a brute for its performance.

*Plus I really like the cooler design, so having a mini-me that I can put on my shelf alongside the other GPUs I have collected over the years is nice.

My little R5 7600 should be enough CPU for it at 3440x1440, and it will fit in the ITX chassis I have on hand if I decide to go back that way in the future.

So, TL;DR: bought an FE model to test. I don't recommend it at current pricing as a purely gaming card, but I am interested in seeing what it can do for myself.
 
Actually, if your card has the horsepower for 4K native in a specific game, it's better to use DLDSR. Say you are getting 80 fps at 4K and you want better image quality: you use DLDSR to render at 5461x2880 and then use DLSS on top of that. Performance should be a little bit lower than 4K native, but the image quality will be oh mama.
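To put rough pixel counts on that DLDSR + DLSS chain, here's a small sketch that takes the 5461x2880 target from the post above and an assumed ~2/3 per-axis DLSS Quality factor. The numbers are illustrative only, and upscaler/downsampler overhead isn't modelled:

```python
# Sketch of the DLDSR + DLSS pipeline described above, using the poster's
# 5461x2880 DLDSR target and an assumed ~2/3 per-axis DLSS Quality factor.
NATIVE = (3840, 2160)        # 4K monitor output
DLDSR_TARGET = (5461, 2880)  # DLDSR render target from the post
DLSS_QUALITY = 2 / 3         # approximate per-axis scale for DLSS Quality

def pixels(res):
    return res[0] * res[1]

internal = (round(DLDSR_TARGET[0] * DLSS_QUALITY),
            round(DLDSR_TARGET[1] * DLSS_QUALITY))

print(f"GPU renders at       {internal}")       # ~(3641, 1920)
print(f"DLSS upscales to     {DLDSR_TARGET}")
print(f"DLDSR downsamples to {NATIVE}")
print(f"Internal pixels vs native 4K: {pixels(internal) / pixels(NATIVE):.2f}x")
# Note: raw pixel count alone doesn't capture the cost of the DLSS and
# DLDSR passes themselves, which come on top.
```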
You mean this?



There is no difference in frame rate between running normal DSR and DLDSR, although the latter is supposedly 2x faster (or whatever that "2x more efficient" claim means).
 
Which titles? :p

Well the latest Star Wars game, even The Last of Us.

And all the new titles we didn't know about until recently like Immortals of Aveum.

I don't think it's a coincidence that AMD put a healthy amount of VRAM on their GPUs.

That to me is an indicator.

---

Primarily my comments are made in light of the price of the 4080: the cheapest one is £1100, and how long is 16GB really going to last?

I think some titles, and more in the near future, will use all of that up at 1440p.

 
Well the latest Star Wars game, even The Last of Us.

And all the new titles we didn't know about until recently like Immortals of Aveum.

I don't think it's a coincidence that AMD put a healthy amount of VRAM on their GPUs.

That to me is an indicator.

---

Primarily my comments are made in light of the price of the 4080: the cheapest one is £1100, and how long is 16GB really going to last?

I think some titles, and more in the near future, will use all of that up at 1440p.


Star Wars will be fixed, just like The Last of Us was. You should ignore what people say at launch. Was it not HUB who made a big deal about The Last of Us to get clicks? Now they are left looking somewhat silly. Not that they care; they got their clicks :cry:

Oh, and £1100 or whatever the 4080 costs is way too much for that card. Things will be better next gen, imo. How much better, who knows.
 
Something like this?

Close, but not quite. This chart doesn't show how much more power modern cards are pulling. It shows their efficiency, which isn't quite the same thing?
 
Star Wars will be fixed, just like The Last of Us was. You should ignore what people say at launch. Was it not HUB who made a big deal about The Last of Us to get clicks? Now they are left looking somewhat silly. Not that they care; they got their clicks :cry:

Oh, and £1100 or whatever the 4080 costs is way too much for that card. Things will be better next gen, imo. How much better, who knows.

The only thing is, TLOU is emblematic of what can be expected; plus it's not a unique event, just the latest to hit the headlines in a big way because it ran into a common hardware limitation.
If that is what happens with a game from 2013, originally designed for the console hardware of the time, what's going to happen with games designed for the current generation, or the next, when they arrive on PC?
Not to mention that games being released in a beta state, followed by rushed patches, has also been pretty standard for a while.
 
So, a different question: has anyone seen a generational analysis based on power usage? The price of cards has varied hugely in the last few years because of mining, scalping, etc. But the power used to push the pixels should be a fairly independent measure.

I remember being surprised at the power level of the 3000 series, and the 4090 is nuts. But how this changes over generations would be interesting, if anyone has seen anything like that.
The 4000 series is about 50% faster at the same power as the 3000 series, while the 3000 series was about 20% faster than the 2000 series at the same power.

The issue is that for that 50% performance gain you are paying at least 50% more on every SKU except the 4090.
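Taking the ballpark generation-on-generation figures above at face value (they are rough claims, not measured data), the compounded perf-per-watt gain works out like this:

```python
# Compound the rough generation-on-generation perf-per-watt figures from the
# post above (2000 -> 3000: ~+20%, 3000 -> 4000: ~+50%). Ballpark only.
gen_gain = {
    "2000 -> 3000": 1.20,
    "3000 -> 4000": 1.50,
}

relative = 1.0  # performance per watt, normalised to the 2000 series
print("2000 series: 1.00x")
for step, gain in gen_gain.items():
    relative *= gain
    print(f"{step}: {relative:.2f}x vs 2000 series")
# -> 3000 series ~1.20x, 4000 series ~1.80x the 2000 series at the same power
```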
 
Well the latest Star Wars game, even The Last of Us.

And all the new titles we didn't know about until recently like Immortals of Aveum.

I don't think it's a coincidence that AMD put a healthy amount of VRAM on their GPUs.

That to me is an indicator.

---

Primarily my comments are made in light of the price of the 4080: the cheapest one is £1100, and how long is 16GB really going to last?

I think some titles, and more in the near future, will use all of that up at 1440p.

FWIW, patches didn't decrease VRAM usage in TLOU; it's a bit crap having to reduce texture settings to fit in 10GB, but it is what it is.

No chance I'll be paying ~£1200 for 16GB; I'll either keep what I have or go 7900XT.
 
The only thing is, TLOU is emblematic of what can be expected; plus it's not a unique event, just the latest to hit the headlines in a big way because it ran into a common hardware limitation.
If that is what happens with a game from 2013, originally designed for the console hardware of the time, what's going to happen with games designed for the current generation, or the next, when they arrive on PC?
Not to mention that games being released in a beta state, followed by rushed patches, has also been pretty standard for a while.

We will soon see, I guess. I personally think 12GB will be fine until next gen is out, just like I thought 10GB was fine last gen until this gen was out.

Next gen, 16GB will be the minimum needed, however. My opinion anyway :)
 
FWIW, patches didn't decrease VRAM usage in TLOU; it's a bit crap having to reduce texture settings to fit in 10GB, but it is what it is.

No chance I'll be paying ~£1200 for 16GB; I'll either keep what I have or go 7900XT.

10GB? The 4070 has 12GB last I checked, and @mrk said it worked fine with 12GB after the patch. I don't think he reduced settings?
 
Close, but not quite. This chart doesn't show how much more power modern cards are pulling. It shows their efficiency, which isn't quite the same thing?

You can see power consumption in every self-respecting review(er).

E.g.:

[power-gaming.png – gaming power consumption chart]


We will soon see, I guess. I personally think 12GB will be fine until next gen is out, just like I thought 10GB was fine last gen until this gen was out.

Next gen, 16GB will be the minimum needed, however. My opinion anyway :)

If they push heavily towards DirectStorage and that kind of thing, apparently you'll need about 32GB of VRAM (coming directly from Nvidia). The video was posted by @humbug, I think, a while back, and after finally seeing it... I don't know. Based on what is happening now and the attitude of the guy speaking with MLID, there could be little incentive to optimise games in the future when you can brute-force it through the consoles' storage hardware and then expect PCs to brute-force it twice over through even more powerful hardware.

From a quick read about the Xbox and PS, the Xbox's storage would require about 4 Zen 2 cores and the PS's about 9 Zen 2 cores to decompress the data while playing, so big CPU requirements could also be here to stay until DirectStorage comes to PC (at least for those on Win 11), and then you'll get a double-digit regression in GPU performance as the card will be busy with that. Maybe DX13 would require the cards to have their own dedicated hardware for that.
Considering the price they ask for these cards, it's a total mess.
:p


What I don't understand now is the purpose of (what seem to be) the over-engineered storage systems for these two consoles. What do they do that couldn't have been done with the already existing streaming/decompression algorithms used now, together with a fast NVMe drive?
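As a rough illustration of where the Zen 2 core counts mentioned above come from: if you assume a compressed stream rate and a per-core decompression rate, the estimate is just a division. Both throughput numbers in the sketch below are assumptions picked to land near the quoted figure, not official console or DirectStorage specs:

```python
# Illustrative only: estimate CPU cores needed to keep up with a compressed
# asset stream. The throughput numbers below are assumptions for the sketch,
# not official console or DirectStorage figures.
def cores_needed(stream_gbps: float, per_core_gbps: float) -> float:
    """Cores required to decompress `stream_gbps` of data in real time."""
    return stream_gbps / per_core_gbps

# Assumed values: a ~5.5 GB/s compressed stream and ~0.6 GB/s decompressed
# per Zen 2 core land in the same ballpark as the "~9 cores" claim above.
print(f"{cores_needed(5.5, 0.6):.1f} cores")  # ~9.2
```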
 