RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
Each card has a minimum power draw, so you can only adjust power downwards by so much. For example, a 6500 XT sits at 2 watts at idle versus 28 watts for a 6800, and during video playback the 6500 XT uses 12 watts versus 48 watts for the 6800.

This is why I have two AMD cards. I use the 6500 XT most of the year as it can handle the older games that I mostly play. Then in the winter I install the 6800 and play whatever new AAA games I fancy alongside the old games I play year round. The 6800 also acts as extra heating in the winter; I never had to put the heating on in the room it was in. The 6500 XT is paying for itself.
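To put those idle and playback numbers in context, here is a rough sketch of what the gap adds up to over a year. The daily hours and the 30p/kWh unit price are assumptions for illustration, not figures from the thread.

```python
# Rough yearly desktop/playback cost for each card, using the idle and
# video playback wattages quoted above. Hours per day and the unit
# price are assumptions, not figures from this thread.

PRICE_PER_KWH = 0.30            # assumed UK unit price, GBP per kWh
IDLE_HOURS_PER_DAY = 6          # assumed light desktop use
PLAYBACK_HOURS_PER_DAY = 2      # assumed video playback

# (idle watts, video playback watts) as quoted for each card
cards = {"RX 6500 XT": (2, 12), "RX 6800": (28, 48)}

for name, (idle_w, playback_w) in cards.items():
    wh_per_day = idle_w * IDLE_HOURS_PER_DAY + playback_w * PLAYBACK_HOURS_PER_DAY
    kwh_per_year = wh_per_day * 365 / 1000
    print(f"{name}: {kwh_per_year:.0f} kWh/year, about £{kwh_per_year * PRICE_PER_KWH:.0f}/year")
```

On those assumptions the idle/playback gap works out to roughly £25 a year, which is the kind of number to weigh against the cost of keeping a second card around.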


I have had SFF systems for over 15 years, and the bigger of my two current systems has a 12-litre case. There are people literally cramming RTX 4090 and RTX 3090 dGPUs into this case, so as much as power consumption is a consideration for SFF systems, it's quite clear many don't seem to care that the RTX 3090 drew 150W more than an RX 6800. It's why I find all these power arguments now in favour of Nvidia weird, when it didn't seem to be a concern even for SFF PC users, where you would think it would matter. But even I didn't get an RX 6600/RX 6600 XT over an RTX 3060 Ti, because they cost almost the same and were slower.

If you start with a faster card you can underclock, undervolt, etc. and push the card into lower power profiles. FPS capping in games also helps.

This is why I learnt to do proper custom curves, FPS capping, etc., and I have done it for most dGPUs I have owned, whether AMD or Nvidia. I literally saved 40W doing that with my current card, and with FPS capping in older games I can drop 100W, maybe more, compared to something like Cyberpunk 2077. I did the same in my other PC: it has a Core i5 10400, which technically is a bit less efficient than a Ryzen 5 3600, but I wasn't going to pay £80 more for a Ryzen 5 3600 and even more for the equivalent AMD mini-ITX motherboard (the Intel CPU and motherboard together cost as much as a Ryzen 5 3600 alone at the time). Did some tweaks and it seems to work fine with an Alpenföhn Black Ridge.

The thing is, I am not saying power consumption isn't a consideration, but it shouldn't be the main consideration over value. In the case of the so-called fake RTX 4070, 44% more performance than an RTX 3060 Ti for £220 more is ridiculous. The RTX 3060 Ti was 40% faster than the RTX 2060 Super for the same price, and so was the RTX 2060 Super over the GTX 1070.

The GTX 1070 was 61% faster than the GTX 970 at QHD (1440p), had twice the VRAM and drew less power. The price went up from $330 to between $379 and $449, but since it beat the $500+ GTX 980 Ti (unless that was overclocked) and had more VRAM, people didn't mind so much.
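Those uplift figures can be dropped into a quick perf-per-pound sketch. The £369 RTX 3060 Ti launch price is an assumption used so that "£220 more" puts the 4070 at £589, so treat the exact ratios as illustrative only.

```python
# Performance-per-pound using the uplift figures quoted above.
# Relative performance is normalised to the RTX 3060 Ti = 100.
# The £369 3060 Ti price is an assumption; "£220 more" then puts the
# RTX 4070 at £589.

cards = [
    # (name, relative performance, assumed price in GBP)
    ("RTX 2060 Super", 100 / 1.40, 369),  # 3060 Ti was ~40% faster at the same price
    ("RTX 3060 Ti",    100.0,      369),
    ("RTX 4070",       144.0,      589),  # "44% more performance ... for £220 more"
]

prev = None
for name, perf, price in cards:
    perf_per_pound = perf / price
    change = "" if prev is None else f" ({(perf_per_pound / prev - 1) * 100:+.0f}% vs previous)"
    print(f"{name}: {perf_per_pound:.3f} perf/£{change}")
    prev = perf_per_pound
```

By this rough measure the 4070 actually regresses on performance per pound versus its predecessor, whereas the previous two generations improved it by roughly the full performance uplift because the price stayed flat.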

Nvidia potentially had something similar with Ada Lovelace, but it shows you how much they have changed as a company in the last 7 years.
 
Last edited:
But I don't want to play AAA games in the summer. I prefer to spend time outside with the other half; she loves being in the pool.

What about at night? You get days when it rains, so what do you do then? We're lucky if we get a few weeks of sun.

Also, you're paying for the 6500 XT, so how much are you actually saving? Can you not just undervolt the 6800 for the less demanding games?

Within reason, I'd rather have performance first.
 
Last edited:
The problem with the whole RTX 4000 series pricing stack is that even significant price reductions still don't make it OK in value.

For example, even if the RTX 4070 12GB were reduced to £470, after 2.5 years it is barely 25% faster than an RTX 3070, with 4GB more VRAM.

If Nvidia were jebaiting AMD and the RTX 4060 Ti was 12GB, the leaks indicate RTX 3070 Ti level performance. The only issue is that would make it merely 20% more performance than an RTX 3060 Ti, with 4GB more VRAM.

Edit!!

The sad reality is people just buy these things without bothering to check, unlike in the past. So all we are getting is performance stagnation at current pricing. People trying to spin that an RTX 4070 or RX 7900 XT selling well is a good thing don't seem to appreciate that a lot of the market isn't moving forward in performance.

As a result, people are resorting to blaming consoles and devs instead of the obvious culprits when the PC doesn't seem to move forward as much.

Remember, two years after the $400 Xbox 360 (with 512MB of RAM) we got the 8800 GT 512MB for $250, just in time for the launch of Crysis, with nearly the performance of the 8800 GTX. Nearly three years after the £400 PS5, where is the RTX 4060 for less than £400 with almost the performance of an RTX 3090 and 16GB of VRAM? If people want RT and path tracing to go forward, we need massive jumps in performance at all tiers.

Too many people are patting themselves on the back in defence of poor hardware having "decent" sales. It seems ignorance is the new badge of honour! :(
 
Last edited:
If people want RT and path tracing to go forward, we need massive jumps in performance at all tiers.

Too many people are patting themselves on the back in defence of poor hardware having "decent" sales. It seems ignorance is the new badge of honour! :(

This is the point to impress upon. Well put. I see countless posts from folk fapping over the hardware once they buy it, yet the increasing admission of having to use upscaling just to play some games has nearly won the day.

Now, bad ports and excuses aside, it's just like I pointed out during the Turing hyperbole around the weaker cards such as the 2060 at the time: it's not really worth the checkbox it comes with, as ray tracing on such a weak unit is rather wasteful. As your breakdown above explains time and time again, 20% isn't that great more than two years on, particularly when you're now being asked to pay the price the model above was.
 
Within reasonable ranges, power consumption is a non-issue, or should be, for gamers.

Performance first, power second.

It isn't a non-issue for everyone: https://www.pcworld.com/article/394956/why-california-isnt-banning-gaming-pcs-yet.html

Even the memory bandwidth of the GPU is taken into account when deciding which computers can be sold in California because of regulations on energy usage, so why wouldn't Nvidia cut down memory bandwidth, introduce more cache and reduce energy use if it meant it was easier to sell their products into various US states?

The problem with the whole RTX 4000 series pricing stack is that even significant price reductions still don't make it OK in value.

For example, even if the RTX 4070 12GB were reduced to £470, after 2.5 years it is barely 25% faster than an RTX 3070, with 4GB more VRAM.

There is no need to buy at launch; wait and see if they go down in price, e.g. at EOL. Remember JayzTwoCents getting excited about price reductions on GPUs in the Nvidia shop, lol: https://youtu.be/p79H_XOwpZo?t=61
The extra 4GB isn't free and neither is the node shrink, so of course the 4070 costs more than the 3070, even before inflation.
As I have said previously, when going to a newer node Nvidia has to decide how much of the benefit goes on increased performance and how much on reduced power consumption. Many people on here complain that they wanted more performance rather than power savings; however, that ignores the growing pressure on Nvidia to reduce power usage, e.g. as above: https://www.pcworld.com/article/394956/why-california-isnt-banning-gaming-pcs-yet.html
The sad reality is people just buy these things without bothering to check, unlike in the past. So all we are getting is performance stagnation at current pricing. People trying to spin that an RTX 4070 or RX 7900 XT selling well is a good thing don't seem to appreciate that a lot of the market isn't moving forward in performance.
Is the 4070 not selling well because people have checked out reviews of its performance and decided that it isn't enough of an upgrade over the Nvidia 3000/AMD 6000 series cards that they bought?
I agree with you that the 4070 isn't a good performance increase over the 3000/6000 series for the price; however, the energy reduction compared to the 3080 is significant and lowers the cost to own over its life. "The more you play, the more you save" should be the new Nvidia phrase. If AMD gets FSR3 working on my 6800, then the 4070 could be seen as a downgrade because of the reduced VRAM.
What about at night? You get days when it rains, so what do you do then? We're lucky if we get a few weeks of sun.

Also, you're paying for the 6500 XT, so how much are you actually saving? Can you not just undervolt the 6800 for the less demanding games?

Within reason, I'd rather have performance first.
I use Radeon Chill to limit power usage on the 6800, plus the power slider. There is a limit to how far you can get the card down though; e.g. just compare the VRAM, as 16GB on the 6800 takes more power than 4GB on the 6500 XT.
Taken from TechPowerUp's review of the 2060 12GB: "Compared to the RTX 2060, non-gaming power consumption is increased because the extra memory draws additional power." https://www.techpowerup.com/review/nvidia-geforce-rtx-2060-12-gb/35.html

Even the 6500 XT I frame-cap in less demanding games. I get that many, perhaps most, people wish the 4070 had used the same amount of power as the 3080 and significantly improved performance; however, as I have shown above, there is growing pressure on Nvidia and other computer hardware companies to limit the power usage of computers, e.g. by the state of California.
Not only does a 300 watt GPU cost electricity to run, but it also costs money for the air conditioning to remove that heat in some parts of the world, such as my home in Missouri, USA.
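As a rough illustration of that point, the extra air-conditioning load scales with the GPU's heat output divided by the unit's coefficient of performance. The COP of 3 and the daily hours below are assumptions; the unit rate is the 11.71 c/kWh figure quoted further down the thread.

```python
# Rough extra cost of a 300 W GPU in summer: the electricity it draws
# plus the extra the air conditioner uses to pump that heat back out.
# The COP of 3 (typical for residential AC) and the hours are
# assumptions; the rate is the 11.71 c/kWh figure quoted below.

GPU_WATTS = 300
HOURS_PER_DAY = 4            # assumed gaming hours
AC_COP = 3.0                 # kWh of heat removed per kWh the AC consumes
PRICE_PER_KWH = 0.1171       # USD per kWh

gpu_kwh = GPU_WATTS / 1000 * HOURS_PER_DAY
ac_kwh = gpu_kwh / AC_COP    # extra energy to remove the dumped heat
print(f"GPU: {gpu_kwh:.2f} kWh/day, AC overhead: {ac_kwh:.2f} kWh/day, "
      f"about ${(gpu_kwh + ac_kwh) * PRICE_PER_KWH:.2f}/day extra")
```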



If people don't think that the 4070 is acceptable over the 3070, I can't wait to read what they think about the 4060 Ti/4060.
 
Last edited:
Well, I live in London, England, and the extra heat is almost always welcome, and on the limited number of days it isn't, I'm out enjoying the sun :)

An extra 100W is not a huge deal for me personally. That's around 2p extra per hour for me. Say I managed to game 1000 hours over 2 years before upgrading; that is £20. That is almost worth it just for the extra heat alone :D
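That back-of-the-envelope figure checks out at the roughly 20p/kWh rate it implies (which is what gets queried a few posts down):

```python
# Sanity check of the numbers above: an extra 100 W at ~20p/kWh is
# about 2p per hour, or £20 over 1000 hours of gaming.

EXTRA_WATTS = 100
PRICE_PER_KWH = 0.20     # GBP per kWh, implied by "around 2p extra per hour"
HOURS = 1000             # gaming hours over two years, per the post

cost_per_hour = EXTRA_WATTS / 1000 * PRICE_PER_KWH
print(f"{cost_per_hour * 100:.0f}p per hour, £{cost_per_hour * HOURS:.0f} over {HOURS} hours")
```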

Not defending it by the way; I prefer more efficient GPUs myself and 200W is sweet. But it only goes so far when you take the data above into account.
 
The extra 4GB isn't free and neither is the node shrink, so of course the 4070 costs more than the 3070, even before inflation.
I think the 4070 is considerably cheaper to make than a 3070.

GDDR6X is cheaper than standard GDDR6 since only Nvidia use it, the AD104 die is 25% smaller, and the 4070 cuts that down by roughly another 25% by only using 5888 CUDA cores out of the possible 7680, so yields would be excellent.

Then there is the empty PCB; a lower TDP and narrower bus require fewer components, so that's another cost saving.

If you want to calculate the cost of this GPU then base it off a $330 3060 12GB, as the spec and production cost are almost identical bar the die, which costs about $30-40 more at TSMC.
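As a quick sketch of how cut down that configuration is, using only the shader counts quoted above (the "25% smaller" die figure is taken at face value from the post):

```python
# How much of the full AD104 die the 4070 actually enables, using the
# shader counts quoted above.

ENABLED_CUDA = 5888
FULL_CUDA = 7680

enabled_fraction = ENABLED_CUDA / FULL_CUDA
print(f"Enabled shaders: {enabled_fraction:.0%} of the full AD104 "
      f"(a {1 - enabled_fraction:.0%} cut, which is why partially defective dies can still be sold)")
```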

Is the 4070 not selling well because people have checked out reviews of its performance and decided that it isn't enough of an upgrade over the Nvidia 3000/AMD 6000 series cards that they bought?
I agree with you that the 4070 isn't a good performance increase over the 3000/6000 series for the price; however, the energy reduction compared to the 3080 is significant and lowers the cost to own over its life. "The more you play, the more you save" should be the new Nvidia phrase. If AMD gets FSR3 working on my 6800, then the 4070 could be seen as a downgrade because of the reduced VRAM.
It's not selling well because it offers 60-class performance for £600. If people bought based on energy consumption, the lower-end cards would cost more than the higher-end cards.

A 3060 Ti beat a 2080 by over 10% in performance, used less energy and was £350 cheaper, so for a 4070 to match or lose to a 3080, use less power and only be £50 cheaper is nowhere near enough.
 
Last edited:
perhaps most, people wish the 4070 had used the same amount of power as the 3080 and significantly improved performance; however, as I have shown above, there is growing pressure on Nvidia and other computer hardware companies to limit the power usage of computers, e.g. by the state of California.
Not only does a 300 watt GPU cost electricity to run, but it also costs money for the air conditioning to remove that heat in some parts of the world, such as my home in Missouri, USA.

It's like buying a car that is really good on fuel but hasn't got the torque to climb the first hill it gets to.

I get your points about power use but it's a secondary factor, after ensuring the performance is there.

PC gamers want to push performance as far as possible; otherwise you'd just buy a console.
 
Some IT companies are selling these cards for <£500, but they aren't available to the public at that price.

This probably gives an indication of what they're worth, but value is relative and depends on what people are willing to pay.
 
Last edited:
Some IT companies are selling these cards for <£500, but they aren't available to the public at that price.

This probably gives an indication of what they're worth, but value is relative and depends on what people are willing to pay.

Are you sure you're not just looking at prices without VAT?
 
Each card has a minimum power draw, so you can only adjust power downwards by so much. For example, a 6500 XT sits at 2 watts at idle versus 28 watts for a 6800, and during video playback the 6500 XT uses 12 watts versus 48 watts for the 6800.

This is why I have two AMD cards. I use the 6500 XT most of the year as it can handle the older games that I mostly play. Then in the winter I install the 6800 and play whatever new AAA games I fancy alongside the old games I play year round. The 6800 also acts as extra heating in the winter; I never had to put the heating on in the room it was in. The 6500 XT is paying for itself.


How much are you paying per kWh of electricity?
 
Well, I live in London, England, and the extra heat is almost always welcome, and on the limited number of days it isn't, I'm out enjoying the sun :)

An extra 100W is not a huge deal for me personally. That's around 2p extra per hour for me. Say I managed to game 1000 hours over 2 years before upgrading; that is £20. That is almost worth it just for the extra heat alone :D

Not defending it by the way; I prefer more efficient GPUs myself and 200W is sweet. But it only goes so far when you take the data above into account.
You're paying 20p per kWh of electricity in London?
 
So the 4060Ti will be using the 3060Ti board?

Are consumers going to see the saving passed on? (Not that I'm after an 8GB GPU, mind.)

Edit: sorry, forgot to add the /sarcasm if it's unclear at all :D
 
Last edited:
How much are you paying per kWh of electricity?
A capped rate for my home in the UK. For the USA, since 1st April: "The new rate for residential accounts will be 11.71 cents per kilowatt-hour."

I think the 4070 is considerably cheaper to make than a 3070.

GDDR6X is cheaper than standard GDDR6 since only Nvidia use it, the AD104 die is 25% smaller, and the 4070 cuts that down by roughly another 25% by only using 5888 CUDA cores out of the possible 7680, so yields would be excellent.

Then there is the empty PCB; a lower TDP and narrower bus require fewer components, so that's another cost saving.

If you want to calculate the cost of this GPU then base it off a $330 3060 12GB, as the spec and production cost are almost identical bar the die, which costs about $30-40 more at TSMC.

Don't forget, since the MSRP of the 3070 was announced, not only is the smaller node more expensive, but TSMC also announced a 10 to 20% price increase. Substrate prices went up, other component prices went up, and even the price of the cardboard box went up. When the selling price goes up, the VAT goes up too, so if the selling price goes up by £50 you have to add an extra £10 for VAT.
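To make the VAT arithmetic explicit (assuming the UK's standard 20% rate):

```python
# The VAT point above: a £50 rise in the pre-VAT price adds another
# £10 of VAT at the UK's 20% standard rate, so £60 more at the till.

VAT_RATE = 0.20
pre_vat_rise = 50

extra_vat = pre_vat_rise * VAT_RATE
print(f"£{pre_vat_rise} pre-VAT rise -> £{extra_vat:.0f} extra VAT, "
      f"£{pre_vat_rise + extra_vat:.0f} more at retail")
```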
We heard that companies such as Nvidia were booking long-term supply contracts to ensure supply of components to make GPUs, so are they locked into higher prices? It has also been claimed that they booked more capacity at TSMC than they likely need, so is the extra cost of any unused capacity figured into the price? While gamers shouldn't have to pay extra for GPUs because Nvidia signed expensive contracts that it no longer needs, for components that are now readily available, is this factored into the BOM cost for the 4000 series? My experience in business is that it would be, if it has occurred.
 
Last edited:
A capped rate for my home in the UK. For the USA, since 1st April: "The new rate for residential accounts will be 11.71 cents per kilowatt-hour."



Don't forget, since the MSRP of the 3070 was announced, not only is the smaller node more expensive, but TSMC also announced a 10 to 20% price increase. Substrate prices went up, other component prices went up, and even the price of the cardboard box went up. When the selling price goes up, the VAT goes up too, so if the selling price goes up by £50 you have to add an extra £10 for VAT.
We heard that companies such as Nvidia were booking long-term supply contracts to ensure supply of components to make GPUs, so are they locked into higher prices? It has also been claimed that they booked more capacity at TSMC than they likely need, so is the extra cost of any unused capacity figured into the price? While gamers shouldn't have to pay extra for GPUs because Nvidia signed expensive contracts that it no longer needs, for components that are now readily available, is this factored into the BOM cost for the 4000 series? My experience in business is that it would be, if it has occurred.
Yet AMD managed to either cut or keep the same prices for its CPUs built on 5nm.
 