RTX 4070 12GB, is it Worth it?

I can forgo a few frames in raster; for the most part, I doubt I'll notice the difference in the games I play without an FPS counter on screen. I'm also running the 3090 with a decreased power target, so it may be about level with a 4070 running at full power.

It all depends on the price of the FE really. Hopefully it's closer to £500 than £600.

Just undervolt and downclock the RTX3090. I set a custom voltage/frequency curve with my RTX3060TI and saved around 30-40W IIRC.
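As a rough illustration of why an undervolt saves that much: dynamic power scales roughly with V^2 x f, so dropping the core voltage about 10% at similar clocks cuts dynamic power by nearly 20%. A back-of-envelope sketch, where the voltages and the 200W gaming draw are illustrative assumptions rather than measured values:

```python
# Back-of-envelope: dynamic power scales roughly as P ~ C * V^2 * f.
# All figures below are illustrative assumptions, not measurements.
stock_v = 1.05       # volts: assumed stock peak voltage
undervolt_v = 0.95   # volts: assumed undervolted peak at similar clocks
board_power = 200.0  # watts: assumed gaming draw of a 3060 Ti-class card

scale = (undervolt_v / stock_v) ** 2         # ~0.82 at the same clock
savings = board_power * (1 - scale)
print(f"Estimated saving: ~{savings:.0f}W")  # ~36W, in line with the 30-40W reported
```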

The RRP is $599, so it would be around £580 after VAT.
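For reference, the arithmetic behind that figure - a minimal sketch assuming an exchange rate of about $1.24 to the pound (the rate itself is an assumption) and 20% UK VAT:

```python
# Convert a US MSRP (quoted without sales tax) to a UK retail price (VAT included).
usd_msrp = 599
usd_per_gbp = 1.24  # assumed USD/GBP exchange rate
vat = 1.20          # UK VAT at 20%

gbp_price = usd_msrp / usd_per_gbp * vat
print(f"~£{gbp_price:.0f}")  # ~£580
```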
 
Is this a joke?

The average DDR5 memory has about 80GB/s of bandwidth and 50ns of latency.
Average GDDR6 has 600GB/s of bandwidth and 10ns of latency.

You're effectively turning your RTX 3070 into a fat iGPU.

There is a reason we don't put DDR on dGPUs anymore...
What's average here? Isn't anyone using DDR5 running at least 6000MT/s?
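To put numbers on "average": even a fast dual-channel DDR5 kit is still several times slower than GDDR6. A rough peak-bandwidth comparison, with a dual-channel DDR5-6000 desktop and a 3070-style 256-bit GDDR6 card at 14Gbps assumed as the examples:

```python
# Peak theoretical bandwidth = transfer rate * bus width in bytes.
# Both configurations are assumed examples, not measurements.

# Dual-channel DDR5-6000: 6000 MT/s across 2 x 64-bit channels.
ddr5_gbs = 6000e6 * (2 * 64 // 8) / 1e9
print(f"DDR5-6000 dual channel: ~{ddr5_gbs:.0f} GB/s")      # ~96 GB/s

# RTX 3070-class GDDR6: 14 Gbps per pin on a 256-bit bus.
gddr6_gbs = 14e9 * (256 // 8) / 1e9
print(f"GDDR6 @ 14Gbps on 256-bit: ~{gddr6_gbs:.0f} GB/s")  # ~448 GB/s
```

So even at 6000MT/s, system RAM has well under a quarter of the bandwidth, before latency even enters into it.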
 
Yeah, he also said "we don't have time for optimization, so games won't run on 8GB", which makes sense considering the best-looking game this gen plays great at 4K ultra with 5GB of VRAM, while the atrociously bad-looking Forspoken and Godfall need two to three times as much. At this point I feel like you are supposed to pay for more VRAM not to have better visuals - you'll actually have worse - but to save devs time on optimizing.

That is not what he said though, is it? He said the amount of work and time it takes to redo all the assets to fit into an 8GB VRAM buffer is not worth the cost.

This was always going to happen though. Consoles with 16GB of RAM were always going to push GPUs beyond 8GB of VRAM to match their texture settings, just like the PS4 pushed PCs beyond 4GB of VRAM to match its texture settings. The same will happen in about 7 years when the PS6 is released and we have gone through the cross-gen phase of that generation. Then you will need more VRAM again to match PS6 texture quality, and on and on it goes.

When the 1080 came out 3 years into the console gen with 8GB of VRAM, you could see the 4GB-and-under cards starting to struggle with the latest games. This is the same thing.

In fact, you could liken Ampere to the 700 series: both launched around the same time as the consoles and were fine until the consoles dropped out of the cross-gen phase. The 780Ti with 3GB of VRAM was just as badly provisioned then as a 3080 with 10GB of VRAM was at its launch. The 900 series was better, just like the 4000 series is, but it was still a bit short outside of the 980Ti and Titan (the 4080/4090 of their day). Then with the 1000 series, which launched 3 years into the PS4 generation, they actually released x70 and x80 cards with enough VRAM to last until the next console generation came out of its cross-gen phase, which is now.

Pretty sure NV will course-correct with the 5000 series like they did with the 1000 series, and they will be very strong parts which will last until the PS6 generation comes out of its cross-gen phase and VRAM usage goes up again.
 
I've already done that; it'll still draw a fair bit more power and output more heat than a 4070 with an undervolt at stock clocks.

The 8nm process that Ampere was made on was pushed past its sweet spot.

With the RTX4090, when that same channel tried to undervolt it, there was some weirdness as the card itself would override the settings. I suspect the RTX4070 will gain less from undervolting because it's basically an RTX3060 replacement, so it will be closer to the ideal voltage/frequency curve. It is already quite cut down from the RTX4070TI and running at lower peak clockspeeds.

Personally, it is up to you, but even as a person who wants good performance/watt (SFF system user), I wouldn't take a downgrade in performance. The issue is that even competitive shooters are getting more graphically demanding (VRAM included), and with the way dGPU sales are collapsing (Nvidia has over $5 billion in unsold inventory), I would be holding out for a Super refresh. Also, coming into summer, I suspect most of us will be gaming less anyway.

Also, as a side note, just make sure your motherboard is not overvolting things too much, and that your PSU is actually OK at lower loads. The 80+ certification tends not to cover loads under 20% for many of the tiers. I find you can get useful power savings just by examining that too. The same goes for monitors - something that reviewers don't appear to look at either.
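On the PSU point, the effect is easy to estimate: wall draw is DC load divided by efficiency, and below 20% load efficiency is uncertified and often drops sharply. A sketch where the 60W load and the 70% low-load efficiency are assumed illustrative figures (only the 87% value is an actual 80+ Gold certification point, at 20% load on 115V):

```python
# Wall draw = DC load / efficiency.
dc_load = 60.0      # watts: assumed light desktop load
eff_gold_20 = 0.87  # 80+ Gold certified efficiency at 20% load (115V)
eff_low = 0.70      # assumed efficiency well below 20% load (not certified)

print(f"At 87% efficiency: {dc_load / eff_gold_20:.0f}W at the wall")  # ~69W
print(f"At 70% efficiency: {dc_load / eff_low:.0f}W at the wall")      # ~86W
```

On a system that idles all day, that sort of gap adds up.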

That's not relevant to what I quoted, as he mentioned DDR5 latency.

Yes, but even if you had a lower-latency, decent-speed DDR5 kit, it's still going to do better in the circumstance he talks about than DDR4. The issue is that most people affected by VRAM paging into system RAM are entry-level/mainstream dGPU users with 8GB of VRAM and older/cheaper systems which use DDR4.
 

Seems to be a long-winded way of saying the 4070 will actually be much more efficient at around the same level of performance, considering how much I'd have to downclock and undervolt the 3090 to get anywhere close. The 4070 may not undervolt, but I can still reduce the power target to make it even more efficient in games that will still run at 240fps.

I've not made this out to be some kind of genius move; it's a sidestep, and will only be done if I can do so without losing money. As already said, it's a stopgap until I can buy a 5080 or 5090.

No competitive shooter over the next year or so is going to require more than 12GB of VRAM at 1080p; that would lock out most of the playerbase.
 

I think you are overthinking how much you might save, especially as we are coming into summer, and getting a dose of upgrade-itis, selling an upgrade to yourself. I could technically sell this RTX3060TI now and not lose much over what I paid for it, and get an RTX4060 when it is released, which is probably as fast and would save a few watts. I tend to play older games ATM, which barely tax the dGPU. But even as a person who has a 12.7 litre SFF case, I wouldn't bother, because it's a pointless upgrade.

If dGPU sales keep cratering like they are now and a refresh comes along for similar or less money at the end of the year, you might end up regretting it. But it's your money in the end, so do what you feel is right.
 

You will lose money in the long term.

In 24 months, when the 5000 series releases, a 24GB 3090 is going to have better second-hand value than a 12GB 4070, even if you can straight-swap now.
 

In two years' time I couldn't care less what the cards are selling for second-hand; there won't be much in it, and you've also got to factor remaining warranty into that.

I'd never have considered this at the price mentioned in the thread title, but if a 4070 can be had for around £550, it's going straight in my basket.
 

£550? No chance; the FE will be £590 minimum.
 

You are probably correct, but at the time of the 3080/3070 release the exchange rate wasn't much different, and that was £469 for a $499 card. Likewise, the $699 3080 was £649.

So, technically, it should be £559-£569 for the 4070 Founders...

It likely won't be, and the UK will get fleeced again, as they just seem to be doing 1:1 exchange rates now (that habit started when the exchange rate was in the toilet, but it is close to the level it was at for the 3000 series release now).
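As a sanity check, here is the same exchange-rate-plus-VAT conversion applied to the 3000-series launch prices; the ~$1.29/£ rate for autumn 2020 is an assumed approximation:

```python
# US MSRP (no sales tax) -> UK price (20% VAT included).
VAT = 1.20

def uk_price(usd_msrp: float, usd_per_gbp: float) -> float:
    return usd_msrp / usd_per_gbp * VAT

print(f"3070: ~£{uk_price(499, 1.29):.0f} (actual FE: £469)")  # ~£464
print(f"3080: ~£{uk_price(699, 1.29):.0f} (actual FE: £649)")  # ~£650
print(f"4070 at that rate: ~£{uk_price(599, 1.29):.0f}")       # ~£557
```

Which is roughly where the £559-£569 estimate comes from; at the current ~$1.24 rate you land nearer £580.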
 
I'm sure I read on here ages ago that you should match your graphics card's VRAM to the current console generation's total RAM if you're interested in the multiplatform games developed for that generation. No citation, but glad to see it isn't just me who thinks this way.
 
I calculated with 20% VAT; it will be £580 at the current exchange rates.

So 2.5 years on, and a 3080 has had a £70 price cut.
Basically, yes. But it's worse when you realise the RTX4070 is really an RTX3060-class dGPU. So you have a salvaged 295mm² AD104 replacing a salvaged 628mm² GA102. More importantly, the RTX3080 had a 320-bit memory bus at a time when GDDR6X was expensive, and it needed 10 VRAM chips on the PCB. The RTX4070 has a 192-bit memory bus at a time when GDDR6/GDDR6X costs less, plus it needs a much cheaper PCB and cooler setup. It only needs 6 VRAM chips on the PCB.
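The chip counts follow directly from the bus width, since each GDDR6/GDDR6X package has a 32-bit interface; a quick sketch using the published per-pin speeds (19Gbps GDDR6X on the 3080, 21Gbps on the 4070):

```python
# Each GDDR6/GDDR6X package exposes a 32-bit interface, so the bus width
# fixes the chip count on the PCB and, with the per-pin speed, peak bandwidth.
def memory_config(bus_bits: int, gbps_per_pin: float) -> tuple[int, float]:
    chips = bus_bits // 32
    bandwidth_gbs = gbps_per_pin * bus_bits / 8
    return chips, bandwidth_gbs

for name, bus, speed in [("RTX 3080", 320, 19.0), ("RTX 4070", 192, 21.0)]:
    chips, bw = memory_config(bus, speed)
    print(f"{name}: {chips} chips, ~{bw:.0f} GB/s")
# RTX 3080: 10 chips, ~760 GB/s
# RTX 4070: 6 chips, ~504 GB/s
```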
 