RTX 4070 12GB, is it Worth it?

Because increasingly people seem aware that Nvidia is pushing tiers up. This generation is probably worse than Turing in many ways, because it's on a new node too.

And fewer and fewer components on the PCB... Go look at the 3080 FE PCB, which is what this card really competes with/matches, and that was sold for £650 with a lot more components and a larger cooler. Nvidia has shrunk the chips and the components on the boards, making a huge saving that way, so these cards must have huge margins compared to even the 3080.

Also, the funny thing is there may even be a future 8GB version, as there are 8 spaces for memory chips, or even a 16GB SUPER DUPER version.
 

And fewer and fewer components on the PCB... Go look at the 3080 FE PCB, which is what this card really competes with/matches, and that was sold for £650 with a lot more components and a larger cooler. Nvidia has shrunk the chips and the components on the boards, making a huge saving that way, so these cards must have huge margins compared to even the 3080.

Plus the VRAM chips and components are much cheaper now than two years ago, apparently. Then when you consider the GA102 was over 600mm², the AD104 is under 300mm² and the cooler is cheaper too, Nvidia will be making huge margins on these.
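A rough way to see the margin point: a smaller die means far more candidate chips per wafer. Here's a minimal back-of-the-envelope sketch in Python, assuming a 300mm wafer and roughly the die areas quoted above; yield, packaging and the fact the two chips are on different nodes (with different wafer prices) are all ignored.

```python
import math

# Gross dies-per-wafer estimate using the classic square-die approximation.
# Die areas are rough figures for GA102 (~628 mm²) and AD104 (~295 mm²) -
# treat them as assumptions, not official numbers.
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float) -> int:
    d = WAFER_DIAMETER_MM
    side = math.sqrt(die_area_mm2)
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / (math.sqrt(2) * side))

ga102 = dies_per_wafer(628)
ad104 = dies_per_wafer(295)
print(f"GA102-sized dies per wafer: ~{ga102}")
print(f"AD104-sized dies per wafer: ~{ad104}")
print(f"That's roughly {ad104 / ga102:.1f}x as many candidate dies per wafer.")
```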
 
To be fair to Nvidia, though, I expect they are having a bit of an interesting time in terms of finances. Their quarterly results were not great. They're massively debt-driven (to the tune of like £12bn - what do they think they are, a nation state?) and obviously interest rates are going up around the planet, which will make holding that debt and taking out new debt far more expensive. So this possibly explains their aggressive cost cutting.
 
Nvidia pushed up tiers this generation on desktop:
1.) AD102 = GA102 (similar die size) and a 384-bit memory controller. Memory in 12GB/24GB tranches.
2.) AD103 = GA104 (similar die size, and the GA103 never made it to desktop) and a 256-bit memory controller. Memory in 8GB/16GB tranches.
3.) AD104 = GA106 (similar die size) and a 192-bit memory controller. Memory in 6GB/12GB tranches.
4.) AD106 = GA107 and a 128-bit memory controller. Memory in 4GB/8GB tranches.

So basically if we ignored price:
1.) RTX3090/RTX3090TI = RTX4090 (AD102)
2.) RTX3080/RTX3080TI = no equivalent
3.) RTX3070/RTX3070TI = RTX4080 (AD103)
4.) RTX3060TI/RTX3060 = RTX4070TI/RTX4070 (AD104)
5.) RTX3050 = RTX4060TI/RTX4060 (AD106)

The RTX4060/RTX4060TI only have 128-bit memory buses, so can only do 4GB/8GB/16GB unless Nvidia try memory segmentation (like the GTX660/GTX970). But it's most likely they wanted to upsell people to the "RTX4070" 12GB if you want more VRAM. This is the same tactic Apple uses if you want more RAM and SSD storage. Suddenly the spec gets more expensive.
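For reference, those VRAM "tranches" follow directly from the bus width: each GDDR6/GDDR6X chip hangs off a 32-bit slice of the bus, common densities are 1GB or 2GB per chip, and clamshell mode doubles the chip count. A minimal sketch of that arithmetic (assumed densities, segmented setups like the GTX660/GTX970 ignored):

```python
# Possible VRAM capacities for a given memory bus width, assuming 32 bits per
# GDDR6/GDDR6X chip, 1GB or 2GB chip densities, and optional clamshell mode
# (two chips per 32-bit channel). Segmented configs are ignored.
def vram_options(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // 32
    capacities = set()
    for density_gb in (1, 2):
        capacities.add(chips * density_gb)      # normal layout
        capacities.add(chips * density_gb * 2)  # clamshell layout
    return sorted(capacities)

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB options")
```

So a 128-bit card can only sensibly be 4GB, 8GB or 16GB, and a 192-bit card 6GB, 12GB or 24GB, which matches the tranches listed above.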

I'd even argue the 4090 is really 3080 Ti class and not comparable to the 3090, as that had only 2% fewer CUDA cores than a 3090 Ti. The 4090 has a lot more CUDA cores unused, and a 4090 Ti, if one appears, would be a bigger leap than 3090 to 3090 Ti.
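Putting rough numbers on that: per the public spec sheets (quoted from memory here, so treat the exact counts as assumptions), the 3090 was within a couple of percent of a full GA102, while the 4090 leaves a much bigger slice of AD102 disabled.

```python
# Enabled CUDA cores vs the full die - approximate public spec-sheet figures,
# quoted from memory, so treat them as assumptions rather than verified data.
cards = {
    "RTX 3090":    (10496, 10752),  # enabled cores, full GA102
    "RTX 3090 Ti": (10752, 10752),  # effectively the full die
    "RTX 4090":    (16384, 18432),  # full AD102 believed to be 18432 cores
}

for name, (enabled, full_die) in cards.items():
    print(f"{name}: {enabled}/{full_die} cores "
          f"({enabled / full_die:.1%} of the full die enabled)")
```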
 
To be fair to Nvidia, though, I expect they are having a bit of an interesting time in terms of finances. Their quarterly results were not great. They're massively debt-driven (to the tune of like £12bn - what do they think they are, a nation state?) and obviously interest rates are going up around the planet, which will make holding that debt and taking out new debt far more expensive. So this possibly explains their aggressive cost cutting.

They might want to think about dropping the prices of their GPUs then to make some sales/money.
 
The irony is that last-gen AMD cards, ignored in the beginning in favour of the shiny Nvidia cards and their RT performance, are looking to be better value in the long term, with better RT performance today (e.g. 6800XT vs 3070), and equalling or bettering the 4070.
Exactly, that's why I was telling people to grab the 6950XT when it came down to £800, and the 6900XT was even less. Both ended up going back up in price again because they were a no-brainer.
 
If you cared about electricity you wouldn't build a rig, poor excuse.
I'm sorry, but I think you have not read what I wrote properly.

Slightly higher power consumption with a slightly higher electricity cost, versus running out of VRAM and plaguing RT games with unplayable issues (stuttering and texture errors): the two situations are not even remotely comparable in significance.
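To put a hedged figure on the "slightly higher electricity cost" side of that comparison, here's a minimal sketch; the power delta, gaming hours and unit price below are illustrative assumptions, not numbers from this thread.

```python
# Rough annual running-cost difference from a GPU that draws more power.
# All inputs are illustrative assumptions - adjust for your own usage/tariff.
extra_watts = 70        # assumed extra draw under load vs the other card
hours_per_day = 3       # assumed gaming hours per day
price_per_kwh = 0.34    # assumed tariff, GBP per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly £{extra_cost:.0f} a year")
```

On those assumptions it works out to roughly £25-30 a year, which is small next to having to replace a card early because it ran out of VRAM.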
 
I'd even argue the 4090 is really 3080 Ti class and not comparable to the 3090, as that had only 2% fewer CUDA cores than a 3090 Ti. The 4090 has a lot more CUDA cores unused, and a 4090 Ti, if one appears, would be a bigger leap than 3090 to 3090 Ti.

True, but in terms of improvements it's the best and the rest are crap.
To be fair to Nvidia, though, I expect they are having a bit of an interesting time in terms of finances. Their quarterly results were not great. They're massively debt-driven (to the tune of like £12bn - what do they think they are, a nation state?) and obviously interest rates are going up around the planet, which will make holding that debt and taking out new debt far more expensive. So this possibly explains their aggressive cost cutting.
But then they need to be shifting their older generation stock, yet instead inventory quantities are building up massively. That includes completed cards and materials. They seem to be hoping to shift all their old stock at pandemic pricing.

I'm sorry, but I think you have not read what I wrote properly.

Slightly higher power consumption with a slightly higher electricity cost, versus running out of VRAM and plaguing RT games with unplayable issues (stuttering and texture errors): the two situations are not even remotely comparable in significance.

Don't worry, they won't factor in the additional cost of having to upgrade more frequently, whilst many overclock/overvolt their CPUs and use PSUs which are not in their peak efficiency zone. But ironically, when AMD was ahead in power efficiency, suddenly nobody cared!

RDNA2 cards were generally more efficient than the Ampere equivalents and high energy costs have been with us since late 2021.
 
But then they need to be shifting their older generation stock, yet instead inventory quantities are building up massively. That includes completed cards and materials. They seem to be hoping to shift all their old stock at pandemic pricing.
Maybe having a lot of inventory of overpriced products is part of the aim. "Well we don't have a lot of money, but we do have a lot of graphics cards which we reasonably value at hundreds of dollars more than their ordinary price. Please let us count this on our balance sheet." :cry:
 
Maybe having a lot of inventory of overpriced products is part of the aim. "Well we don't have a lot of money, but we do have a lot of graphics cards which we reasonably value at hundreds of dollars more than their ordinary price. Please let us count this on our balance sheet." :cry:

I think they are palming them off to larger system builders IMHO. As for PCMR enthusiasts, Nvidia has cottoned on that enough of them will buy at stupid pricing.
 
They might want to think about dropping the prices of their GPUs then to make some sales/money.

Would grab a 4080 FE if it dropped to £749 this year. But not interested if it is in 2024 as that will be too close to 5000 series release for me.

Not like I am in a rush as I have a 3080 Ti which is already better than the 4070.

Hopefully Nvidia learn a lesson from all this and we go back to 3000 series style pricing.

Give me a 5080 FE for £799 and charge whatever you want for the 5080 Ti, 5090 and 5090 Ti. And when I say 5080 I mean a 5080, not a 5060 with a 5080 badge on it!
 
I'd even argue the 4090 is really 3080 Ti class and not comparable to the 3090, as that had only 2% fewer CUDA cores than a 3090 Ti. The 4090 has a lot more CUDA cores unused, and a 4090 Ti, if one appears, would be a bigger leap than 3090 to 3090 Ti.
At least the 4090 used the same tier of die, which is why it performs well, but everything else so far has been on a die a tier down, which is why the performance gains are so unimpressive. Then you factor in prices also going up a tier, and you're just getting two tiers lower performance for the price.
 
Would grab a 4080 FE if it dropped to £749 this year. But not interested if it is in 2024 as that will be too close to 5000 series release for me.

Not like I am in a rush as I have a 3080 Ti which is already better than the 4070.

Hopefully Nvidia learn a lesson from all this and we go back to 3000 series style pricing.

Give me a 5080 FE for £799 and charge whatever you want for the 5080 Ti, 5090 and 5090 Ti. And when I say 5080 I mean a 5080, not a 5060 with a 5080 badge on it!

Yeh. I am hoping this 4070 launch makes them realise a few things...

Still loads in stock at RRP everywhere...
 
Don't worry, they won't factor in the additional cost of having to upgrade more frequently, whilst many overclock/overvolt their CPUs and use PSUs which are not in their peak efficiency zone. But ironically, when AMD was ahead in power efficiency, suddenly nobody cared!
Reminds me of how when AMD have more VRAM than Nvidia it is "overkill", and when Nvidia has more VRAM it is "OMGZZZZ AMD don't have enough vram!!"; and then when Nvidia and AMD both have the same moderate amount of VRAM on their cards, it is somehow "more than enough" on the Nvidia card but "not enough" on the AMD card :cry::cry::cry:
 
Reminds me of how when AMD have more VRAM than Nvidia it is "overkill", and when Nvidia has more VRAM it is "OMGZZZZ AMD don't have enough vram!!"; and then when Nvidia and AMD both have the same moderate amount of VRAM on their cards, it is somehow "more than enough" on the Nvidia card but "not enough" on the AMD card :cry::cry::cry:

I suppose a game running out of VRAM would mean power consumption will drop... like a brick! :cry:
 
then when Nvidia and AMD both have the same moderate amount of VRAM on their cards, it is somehow "more than enough" on the Nvidia card but "not enough" on the AMD card :cry::cry::cry:

Because Nvidia has better memory compression!! :cry:;)

Always some excuse to excuse them... anyways... this is what happens when fanboy mentality rules... They all end up paying the bill in the future and losing... No consumer should be defending a company that is clearly overcharging them for their goods... but they do... strange world we live in really.
 
Because Nvidia has better memory compression!! :cry:;)

Always some excuse to excuse them... anyways... this is what happens when fanboy mentality rules... They all end up paying the bill in the future and losing... No consumer should be defending a company that is clearly overcharging them for their goods... but they do... strange world we live in really.
Nvidia is the Apple of PCMR?! :cry:
 