• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

RTX 4070 12GB, is it worth it?

Perhaps because they overcharged on MSRP last gen: https://youtu.be/VFHOZN5AV6E?t=1377
It's pretty nuts watching him rant about that card at the time, $380. Oof....

It's £210 now. A really good card for that.

RTX 3060, £300... still oof.

 
12GB though :p
 
I mean, sure, it has that going for it, but it's not a £300 card, and it's a shame Nvidia didn't do that for GPUs higher up with more grunt.

No, I agree, I was just having a laugh.

It should be a sub-£250 card by now.

However, I think the 12GB variant is likely to remain fairly sought after, considering how it has been shown to beat the 3070 in some cases due to the VRAM.

It might weirdly be a better choice over the 4060 as well, if that does indeed only have 8GB of VRAM, and hold its value a bit longer.

I mean, it is pretty hilarious that an £800 latest-gen card has the same VRAM as a £300 past-gen one, lol.
 
I'm on the capped rate for my home in the UK. For the USA, since 1st April: "The new rate for residential accounts will be 11.71 cents per kilowatt-hour."



Don't forget that since the MSRP of the 3070 was announced, not only is the smaller node more expensive, but TSMC announced a 10 to 20% price increase. Substrate prices went up, other component prices went up, even the price of the cardboard box went up. When the selling price goes up, the extra VAT goes up too, so if the selling price goes up by £50 you have to add an extra £10 for VAT.
We heard that companies such as Nvidia were booking long-term supply contracts to ensure supply of components to make GPUs, so are they locked into higher prices? Also, it has been claimed that they booked more capacity at TSMC than they likely need, so is the extra cost of any unused capacity figured into the price? While gamers shouldn't have to pay extra for GPUs because Nvidia signed expensive contracts that they no longer need, for components that are now readily available, is this factored into the BOM cost for the 4000 series? My experience in business is that it would be, if it has occurred.
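A quick back-of-envelope sketch of that VAT pass-through, assuming the standard 20% UK rate; the pre-tax prices here are purely illustrative:

```python
# How a pre-tax cost increase compounds through UK VAT at the till.
# Assumes the standard 20% UK VAT rate; the prices are purely illustrative.
VAT_RATE = 0.20

def shelf_price(pre_tax: float) -> float:
    """Price the customer pays once VAT is added."""
    return pre_tax * (1 + VAT_RATE)

old, new = 400.0, 450.0  # hypothetical pre-VAT prices in GBP (+£50 increase)
rise = shelf_price(new) - shelf_price(old)
print(f"A 50 GBP cost increase becomes {rise:.0f} GBP on the shelf")  # 60 GBP
```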
Sorry, how much are you paid? I see you make more excuses for Nvidia than Jensen has leather jackets!! LOL

But seriously, this company is raking in 60% margins; their actions are for greed only. They are not satisfied with "profit", they need mega growth to prop up their share price, bloated by COVID and crypto mining, and now they expect the general public, the typical low-spec PC gamer, to make sure they don't lose investors.....

We do not care about Nvidia profits. Make products that delight your customers with solid performance & value for money, or GTFO and just stop making GPUs, because you're just wasting my time. Yes, I am talking to you, Nvidia. And AMD, stop watching what big daddy does before you make a move; either make a solid product with solid value for money or you can do one too! It's an utter joke that your 7800 XT might be slower than a 4070 Ti; it just goes to show what the 7900 XT really is and should have been. Urine takers.... both of them.
 
There is no need to buy at launch; wait and see if they go down in price, e.g. at EOL. Remember JayzTwoCents getting excited about price reductions on GPUs in the Nvidia shop, lol: https://youtu.be/p79H_XOwpZo?t=61
The extra 4GB isn't free, and neither is the node shrink, so of course the 4070 costs more than the 3070, even before inflation.
As I have said previously, when going to a newer node Nvidia has to decide how much of the benefit goes on increased performance and how much on reduced power consumption. Many people on here complain that they wanted more performance than power savings; however, that ignores growing pressure on Nvidia to reduce power usage, e.g. as above: https://www.pcworld.com/article/394956/why-california-isnt-banning-gaming-pcs-yet.html

Is the fact that the 4070 isn't selling well because people have checked out reviews of its performance and decided that it doesn't give enough of an upgrade over the Nvidia 3000/AMD 6000 series cards that they bought?
I agree with you that the 4070 isn't a good performance increase over the 3000/6000 series for the price; however, the energy reduction compared to the 3080 is significant and lowers the cost to own over its life. "The more you play, the more you save" should be the new Nvidia phrase. If AMD get FSR3 working on my 6800, then the 4070 could be seen as a downgrade because of its reduced VRAM.
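A rough sketch of that cost-to-own argument. The ~120 W gap between a 3080 and a 4070 and the daily hours are assumptions; the US tariff is the 11.71 cents/kWh quoted earlier, and the UK figure is a capped-rate ballpark:

```python
# Back-of-envelope running-cost gap between a ~320 W 3080 and a ~200 W 4070.
# Board powers, daily hours and the UK tariff are illustrative assumptions;
# the US tariff is the 11.71 cents/kWh residential rate quoted above.
def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of a part drawing `watts` while gaming."""
    return watts * hours_per_day * 365 / 1000 * price_per_kwh

WATT_GAP = 320 - 200  # assumed board-power difference in watts

for label, rate in [("USD", 0.1171), ("GBP (assumed)", 0.30)]:
    saving = annual_cost(WATT_GAP, 3, rate)
    print(f"~{saving:.0f} {label} per year saved at 3 h/day")
```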

VRAM prices have crashed. Components have gone down in price since the pandemic - companies in other areas have even said so. The PCB is much simpler and so is the cooler. Stop reading Nvidia marketing. When Kepler came out a decade ago, there was a convenient leak of an Nvidia slide saying how much TSMC 28nm cost to use. After this, people were pointing out the GK104 was another tiny die being pushed up in pricing. The Nvidia uberfans spun that this was why Nvidia doubled the price of the large-die product, i.e. the £450 GTX 580 (GF110), into the £900+ GeForce Titan (GK110), and doubled the price of the medium-die GTX 560 Ti replacement into the £400+ GTX 680.

In the end it was a load of nonsense, because once AMD had a competitor to the GK110 in the wings, Nvidia pre-empted it with the GK110-based GTX 780 at the GTX 680 price. Once the R9 290X launched, suddenly we had a GTX 780 Ti. So all this stuff about Nvidia "costs increasing" justifying pricing an RTX 3060 replacement at £800 (RTX 4070 Ti) is a load of nonsense on the part of Nvidia.

PCMR needs to stop making excuses for them and acting in denial that a company is trying to rip off "loyal gamers". I don't give two hoots about Nvidia's costs. This is almost coming across as Stockholm Syndrome.

If people care so much about cost, then maybe people shouldn't ever moan about energy prices or drug prices at all. Big Oil and Big Pharma have costs too.


I use Radeon Chill to limit power usage on the 6800, plus the power slider. There is a limit to how far you can get the card down, though; e.g. just compare the VRAM: 16GB on the 6800 takes more power than just 4GB on the 6500 XT.
Taken from TechPowerUp's review of the 2060 12GB: "Compared to the RTX 2060, non-gaming power consumption is increased because the extra memory draws additional power." https://www.techpowerup.com/review/nvidia-geforce-rtx-2060-12-gb/35.html

Even the 6500 XT I frame-cap in less demanding games. I get that many, perhaps most, people wish the 4070 had used the same amount of power as the 3080 and significantly improved performance; however, as I have shown above, there is growing pressure on Nvidia and other computer hardware companies to limit the power usage of computers, e.g. by the state of California.
Not only does a 300-watt GPU cost electricity to run, but it costs money for the air conditioning to remove that heat in some parts of the world, such as my home in Missouri, USA.
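A minimal sketch of that air-conditioning point: every watt a GPU dumps into the room also has to be pumped back outside, costing roughly an extra 1/COP on top. The ~3 COP is an assumed typical home-AC figure; the tariff is the residential rate quoted earlier in the thread:

```python
# The air-conditioning overhead: heat a GPU dumps indoors must be pumped back
# out. A COP of ~3 (typical home AC) is an assumption; the rate is the
# 11.71 cents/kWh residential figure quoted earlier in the thread.
COP = 3.0      # joules of heat moved per joule of electricity used by the AC
RATE = 0.1171  # USD per kWh

def summer_cost(gpu_watts: float, hours: float) -> float:
    """Electricity cost of running the GPU plus removing its heat via AC."""
    gpu_kwh = gpu_watts * hours / 1000
    ac_kwh = gpu_kwh / COP  # extra electricity needed to pump the heat out
    return (gpu_kwh + ac_kwh) * RATE

print(f"300 W for 100 h: ${summer_cost(300, 100):.2f} including AC overhead")
```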



If people don't think that the 4070 is acceptable over the 3070, I can't wait to read what they think about the 4060 Ti/4060.

Which again is OK for people who have no clue. PC gamers who have a clue can drop their power use a huge amount. I had SFF PCs for over 15 years, so no, I don't agree that selling lower-end trash for high prices makes sense.

The RTX 4070 is just over 40% faster than an RTX 3060 Ti for 60% more money. The RTX 3060 Ti was 40% faster than an RTX 2060 Super for similar money.

This marketing nonsense about saving money with power existed years ago. People were spending £200 on GTX 960 cards because the R9 290, which destroyed it in performance, would cost more in power, and these people had full-sized tower cases, not even an SFF PC. Yet those same people just had to upgrade quicker, so how is this saving money?

So all this spin about saving money is secondary to performance. If people care so much about power, an Xbox Series S sips power - DF measured 84 W at the wall under full load.

The Xbox Series X consumes between 150 W and 170 W. Laptops destroy most desktops on power efficiency too. The best-binned consumer parts go to laptops.

Owning a desktop and large PC monitors is not an indication of really wanting to save power.
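To put the value-progression claim above into numbers, here is a rough sketch; the performance indices and prices are assumptions derived from the percentages quoted:

```python
# Value-progression check using the uplift/price claims above: the 3060 Ti was
# ~40% faster than a 2060 Super for similar money, and the 4070 is ~40% faster
# again for ~60% more money. Prices and indices are assumptions, not data.
cards = [
    ("RTX 2060 Super", 1.00, 380),         # (name, relative perf, assumed GBP)
    ("RTX 3060 Ti",    1.40, 370),
    ("RTX 4070",       1.40 * 1.40, 590),  # ~60% more money than the 3060 Ti
]

for name, perf, price in cards:
    print(f"{name}: value index {100 * perf / price:.2f}")  # higher is better
# Value improves from the 2060 Super to the 3060 Ti, then regresses at the 4070.
```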

Sorry, how much are you paid? I see you make more excuses for Nvidia than Jensen has leather jackets!! LOL

But seriously, this company is raking in 60% margins; their actions are for greed only. They are not satisfied with "profit", they need mega growth to prop up their share price, bloated by COVID and crypto mining, and now they expect the general public, the typical low-spec PC gamer, to make sure they don't lose investors.....

We do not care about Nvidia profits. Make products that delight your customers with solid performance & value for money, or GTFO and just stop making GPUs, because you're just wasting my time. Yes, I am talking to you, Nvidia. And AMD, stop watching what big daddy does before you make a move; either make a solid product with solid value for money or you can do one too! It's an utter joke that your 7800 XT might be slower than a 4070 Ti; it just goes to show what the 7900 XT really is and should have been. Urine takers.... both of them.

An example of anti-consumerists. PCMR and gamers in general are some of the weakest-willed, most easily manipulated consumers in the world. I thought Apple uberfans were bad, but Apple hardware sales have crashed, showing they have had enough.

Apple has much lower GAAP gross margins than Nvidia. Soon people will be defending Big Pharma and Big Oil profits on here. It is Big Oil who are doing the same as Nvidia, jacking up prices because they can and taking advantage of the war.

Yet PCMR, which complains about energy prices, now makes excuses for tech companies who are doing the same. You can't even make this cognitive dissonance up.

This is unlike decades ago, when PCMR was proper PCMR. It increasingly seems to be filled with people celebrating being ripped off and living in ignorance. No wonder microtransactions and all that rubbish work so well with gamers. FOMO is so ingrained they can't even see when companies are mocking them.

Nvidia is mocking PCMR as a bunch of cretins, and increasingly so is AMD.
 
This is half the issue with fake environmentalism being indoctrinated into people. Slower products need upgrading quicker, so more energy needs to be expended to make them, i.e. more pollution. Plus, if you have to upgrade quicker and spend more money, you are not saving money.

It's better for the environment for companies like Nvidia to sell an RTX 4070 Ti for RTX 3070 money, because the people buying them will have to upgrade less often.

It is better for people's finances too, as you can always dial down power if required, and a faster product will last longer.

Selling low-end products for higher prices only helps tech companies:
1.) They get out of date quicker, so the companies get more repeat sales.
2.) The products cost less to make, so they make more money.
3.) It pushes people towards even higher-priced products.

But for the consumer and the environment:
1.) The consumer gets less longevity, so has to buy more often, and it costs the end user more.
2.) It is terrible for the environment, as more products have to be made.

Apple has lower GAAP gross margins than Nvidia, so Nvidia can "afford" to suck up the costs. You would think that after Turing and even Kepler, PCMR would learn, but they have the memory of sieves. Nvidia, AMD and Intel think PCMR, with its weak-willed FOMO nature, celebrating being ripped off, will prop them up like charities. Charities which get billions of USD in US taxpayer funds:

The US taxpayer can prop their own companies up.

Sadly this will work. It's why Intel and even AMD are trying their best too, and PCMR deserves the market it gets.

I thought Nvidia etc. were trying to emulate Apple, but it seems PCMR has gone past peak Apple. Apple needs to learn some new tricks from PCMR.
 
Sorry, how much are you paid? I see you make more excuses for Nvidia than Jensen has leather jackets!! LOL

But seriously, this company is raking in 60% margins; their actions are for greed only. They are not satisfied with "profit", they need mega growth to prop up their share price, bloated by COVID and crypto mining, and now they expect the general public, the typical low-spec PC gamer, to make sure they don't lose investors.....

We do not care about Nvidia profits. Make products that delight your customers with solid performance & value for money, or GTFO and just stop making GPUs, because you're just wasting my time. Yes, I am talking to you, Nvidia. And AMD, stop watching what big daddy does before you make a move; either make a solid product with solid value for money or you can do one too! It's an utter joke that your 7800 XT might be slower than a 4070 Ti; it just goes to show what the 7900 XT really is and should have been. Urine takers.... both of them.
Ever heard of "don't attack the person, challenge the action"? The 4070 offers around 3080 performance, indeed better than it when using DLSS 3 or where the 3080 is limited by its 10GB of VRAM, while using much less power.
We are not carbon neutral in the UK, so any reduction in energy usage is kinder to the environment, kinder on budget gamers' wallets, and reduces the need for the UK to import energy. These designs often end up forming the basis of cards for other work, such as professional use and mining, so hopefully those will use less power as well.

It's a fact that the 3080-class GPU used much more power than the 80-class Fermi card (the GTX 480), which was described as a power hog. Apparently, when I don't agree that the 3080 was a good card, because it used too much power, and I prefer the power usage of the 4070, I am making excuses for Jensen and being paid by him; not sure how that works?
VRAM prices have crashed. Components have gone down in price since the pandemic - companies in other areas have even said so. The PCB is much simpler and so is the cooler. Stop reading Nvidia marketing.

If people care so much about cost, then maybe people shouldn't ever moan about energy prices or drug prices at all. Big Oil and Big Pharma have costs too.

Some prices might have gone down since the highs of COVID and the chip shortage, but how many have gone down to below the levels of when the 3070 MSRP was set? E.g. has the TSMC price for their smaller node crashed to below the cost of the Samsung node? Last I heard, TSMC had actually increased prices since the pandemic and are going to increase them further. So since the beginning of 2022, TSMC might have increased the price of an already more expensive node for the 4070 by 29%: https://www.tomshardware.com/news/tsmc-warns-clients-of-up-to-9-price-hike-in-2023
Commodity prices, such as copper for heat pipes and PCBs, are still not back to pre-COVID levels.

Not sure about your second line there; you seem to be the one moaning about the price/performance of the 4070. I prefer the price/performance of the 4070 to the 3080, as it's cheaper, has more VRAM, has DLSS 3 and has a lower cost of ownership because of reduced energy costs. So anyone on a 2000-series card who skipped the 3080 and buys a 4070 is getting, IMO, a better product at a cheaper price. Some people might have wanted more fps from the 4070 over the 3080, which DLSS 3 will give them in games like Cyberpunk, and others might welcome the lower power usage.

Which again is OK for people who have no clue. PC gamers who have a clue can drop their power use a huge amount. I had SFF PCs for over 15 years, so no, I don't agree that selling lower-end trash for high prices makes sense.

The RTX 4070 is just over 40% faster than an RTX 3060 Ti for 60% more money. The RTX 3060 Ti was 40% faster than an RTX 2060 Super for similar money.

This marketing nonsense about saving money with power existed years ago. People were spending £200 on GTX 960 cards because the R9 290, which destroyed it in performance, would cost more in power, and these people had full-sized tower cases, not even an SFF PC. Yet those same people just had to upgrade quicker, so how is this saving money?

So all this spin about saving money is secondary to performance. If people care so much about power, an Xbox Series S sips power - DF measured 84 W at the wall under full load.

The Xbox Series X consumes between 150 W and 170 W. Laptops destroy most desktops on power efficiency too. The best-binned consumer parts go to laptops.

Owning a desktop and large PC monitors is not an indication of really wanting to save power.
I thought my 6500 XT is basically a laptop GPU; it uses 2 watts at idle!
If nobody really cares about power, why do reviewers include it? E.g. https://youtu.be/DNX6fSeYYT8?t=1106
Has the focus on GPUs shifted inappropriately to fps per dollar, at the expense of the environmental impact of gaming? Are many gamers being selfish when they recognise that a 200+ watt GPU is a power hog but want 300+ watt GPUs?

An example of anti-consumerists. PCMR and gamers in general are some of the weakest-willed, most easily manipulated consumers in the world. I thought Apple uberfans were bad, but Apple hardware sales have crashed, showing they have had enough.

Apple has much lower GAAP gross margins than Nvidia. Soon people will be defending Big Pharma and Big Oil profits on here. It is Big Oil who are doing the same as Nvidia, jacking up prices because they can and taking advantage of the war.

Yet PCMR, which complains about energy prices, now makes excuses for tech companies who are doing the same. You can't even make this cognitive dissonance up.
Why is it making excuses for Nvidia when US states like California try to reduce the power usage of computer parts like GPUs? Why is that not a good thing? https://www.makeuseof.com/why-the-california-ban-on-power-hogging-pcs-is-a-good-thing/

States like California want GPUs to be more energy efficient, and the 4070 is, compared to the 3080.
 
RTX 3070 vs RTX 4070 vs RTX 3080 coolers

[images: RTX 3070 Founders Edition cooler]

[images: RTX 4070 Founders Edition cooler]

[images: RTX 3080 Founders Edition cooler]

The RTX 3080 used a vapour chamber and has a massive heatsink with two fans. The RTX 3070 also has a bigger heatsink with two fans. The RTX 4070 cooler looks very cheap, even cheaper than the cooler used in the RTX 2060 FE:

[images: RTX 2060 Founders Edition cooler]

RTX 3070 vs RTX 4070 vs RTX 3080 PCBs

The PCB of the RTX 4070 looks much cheaper and simpler than that of an RTX 3070, RTX 3080 or even an RTX 2060!

[images: RTX 3070 PCB]

[images: RTX 4070 PCB]

[images: RTX 3080 PCB]

[images: RTX 2060 PCB]

It looks very cheap to make, and probably is much cheaper than an RTX 3070 or even an RTX 2060 to make.
 
If you wanted to spend that little bit more on the 4070.


Why would people spend 4070 Ti money on a water-cooled 4070?

I, erm... what? That makes no sense. You might pay a lot more for a water-cooled 4090, because you're an enthusiast with money to burn and you can't go any higher than a 4090, but a water-cooled 4070? Who is this for?
 
Magic 8 Ball says "don't count on it". :D

The 4080 used AD103, which was also used in the 4090 Laptop with 16GB.
The 4070 Ti/4070 used AD104, which was also used in the 4080 Laptop with 12GB.

So I imagine the 4060 will use AD106, which is the 4070 Laptop part that currently has 8GB on a 128-bit bus.



Edit:
The 6750 XT is the ~£400 12GB option now.

Bumping for posterity, now that news articles are cropping up about this :D

The 4060 Ti also doesn't even look likely to get the full quota of AD106's shaders.

 
So anyone on a 2000-series card who skipped the 3080 and buys a 4070 is getting, IMO, a better product at a cheaper price. Some people might have wanted more fps from the 4070 over the 3080, which DLSS 3 will give them in games like Cyberpunk, and others might welcome the lower power usage.

It's only a slightly better product, is the point. It's not great progress from gen to gen. You are right that it's a better product than the 3080, for slightly less money, but that is performance that could be had over two years ago.

People want progress at all levels in the hierarchy, not the holding back of performance in order to push people into the more expensive models.
 
Bumping for posterity, now that news articles are cropping up about this :D

The 4060 Ti also doesn't even look likely to get the full quota of AD106's shaders.


Going to be hilarious when the 3060 is actually faster than these in some games.
 
It's only a slightly better product, is the point. It's not great progress from gen to gen. You are right that it's a better product than the 3080, for slightly less money, but that is performance that could be had over two years ago.

People want progress at all levels in the hierarchy, not the holding back of performance in order to push people into the more expensive models.
It is good progress, just more on wattage than fps. Many people might not like that much of the progress comes by way of power consumption rather than fps, but in terms of frames per watt it has made good performance progress, which is the direction states like California are pushing computer hardware companies, as they are banning PCs that use too much energy according to certain criteria.

Your idea of progress (fps) differs from mine (fps per watt). There will be people who agree with your view and people who welcome the progress on performance per watt rather than fps. This review recognises the progress on power consumption: https://youtu.be/DNX6fSeYYT8?t=1106

Progress on watts per frame is still progress.
 

?

Yes, but normally that goes hand in hand with progress in price/performance as well, which is almost completely lacking this gen.

No one can be truly that enthusiastic that it uses 100 W less for the same performance at almost the same price. Big whoop.

Also, power has to naturally come down anyway, otherwise we'd have GPUs sucking 2000 W by now (i.e. power usage improvement is a given, really).
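To make the two notions of progress in this exchange concrete, here is a rough sketch; the fps index and board-power figures are assumptions, taking the 4070 at roughly 3080 performance:

```python
# Frames per watt for the two cards at assumed equal performance. The fps
# index and board-power figures are rough assumptions, not measured numbers.
cards = {"RTX 3080": (100, 320), "RTX 4070": (100, 200)}  # (fps index, watts)

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} fps per watt")
# Same fps at 200 W instead of 320 W is a 1.6x efficiency gain (320 / 200),
# even though fps per pound barely moves this generation.
```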
 
The 4070 offers around 3080 performance, indeed better than it when using DLSS 3 or where the 3080 is limited by its 10GB of VRAM, while using much less power.
We are not carbon neutral in the UK, so any reduction in energy usage is kinder to the environment, kinder on budget gamers' wallets, and reduces the need for the UK to import energy.
This is generally what happens when a new gen comes out, but instead of a £600 70-class card offering previous-gen 80-class performance, it's usually the £300-400 60-class card which matches or beats the previous-gen 80 while using a lot less power.
 
It is good progress, just more on wattage than fps. Many people might not like that much of the progress comes by way of power consumption rather than fps, but in terms of frames per watt it has made good performance progress, which is the direction states like California are pushing computer hardware companies, as they are banning PCs that use too much energy according to certain criteria.

Your idea of progress (fps) differs from mine (fps per watt). There will be people who agree with your view and people who welcome the progress on performance per watt rather than fps. This review recognises the progress on power consumption: https://youtu.be/DNX6fSeYYT8?t=1106

Nonetheless, the vast majority of the customer base for GPUs buys on progression of performance, whether they directly know that or not. In most generation-on-generation changes there is a decrease in power use at the same performance level, and that can be exaggerated if you rename parts to different tiers from what they really are, albeit Ada is a fairly decent increase in power efficiency at the same performance level even without those shenanigans.
 
Personally I think Nvidia is only hurting themselves. Though there is likely no shortage of average consumers who are buying, there will be plenty of people like myself who'd normally run out and buy something, who don't have specific budget constraints, but who don't just buy something of poor value because we can afford it.
 