This. As we touched on @CAT-THE-FIFTH, suddenly the power efficiency card is a major feature; a complete joke of a card to play when the package doesn't look so attractive, I would say!
At £499, like the 3070, it's a different story.
I'm looking at the overall package, and just being more efficient doesn't cut it. So I should overpay Nvidia because they are doing it to save the planet?
The performance for the cost is crap: not only should this at best be a 4060 Ti, but they also added £100 over the 3070's price. Double shafted.
Hmm, I don't remember defending AMD? They are also at it. That's because the 4070 is the lowest-priced 40 series card; it gives some people an excuse.
I bought a 3080 at MSRP on release, looking for it to tide me over till next gen so I can see what's on offer. I've always stayed around the £600-£700 mark.
I haven't said that they couldn't have reduced the cost, increased performance or otherwise made the purchase more appealing, e.g. a better games bundle on launch. I wonder how much Nvidia has made things worse (if at all) by locking themselves into expensive contracts for more wafers, substrate and other physical items than they need for this generation, and so are trying to cover some of that cost by not giving GPU buyers such a good performance increase.
However, if like me your PC is on for multiple hours a day, the power efficiency improvements of the 4070 over the 3080 are very welcome; for someone who games 4 or 5 hours a week and whose PC isn't on as much, the power efficiency increase is nowhere near as important. The difference of over 100 watts per hour of use adds up to hundreds of pounds over the years for me; enough for a new GPU, in fact: https://youtu.be/EYb5PhPSIpE?t=563
(The figures in that example are based on a difference of only 45 watts while gaming; the difference between the 4070 and 3080 is over double that.)
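If you want to sanity-check that claim with your own numbers, here's a quick Python sketch; the wattage, hours and unit price below are illustrative assumptions, not anyone's measured figures:

# Rough running-cost difference between two GPUs under gaming load.
# All inputs are example assumptions - plug in your own figures.
watts_saved = 110        # extra draw of a 3080 over a 4070 while gaming (approx.)
hours_per_day = 5        # gaming/load hours per day
price_per_kwh = 0.34     # UK unit price in GBP (varies by tariff)

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, about £{cost_per_year:.0f}/year")
# -> 201 kWh/year, about £68/year, i.e. a few hundred pounds over the card's lifetime

At lighter usage (say 5 hours a week) the same sum comes out at under £10 a year, which is exactly the heavy-user vs light-user point above.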
It's the power consumption arguments which are apparently more important than value arguments.
Is it that bad to wait 2 generations? The 3080 was 2.5 years ago. I remember because I bought a 3080 at MSRP on release. I have no upgrade path this generation unless I massively overpay, which I won't be doing; I'll wait for the next generation and see what's on offer.
I can see people with 10/20 series cards buying the 4070 because it's the cheapest 40 series card, probably less educated about how much they're being shafted.
The 4070 is only slightly cheaper than the 3080 was, and it only matches its performance after 2.5 years. Even purely on horsepower I don't think the 4070 is going to age well looking 1-2 years ahead; it's a product you are well overpaying for.
For some reason the power efficiency is supposed to be worth it on what should be the 4060 Ti, while paying £100 more than what the 3070 cost.
The chip actually uses the same power as its predecessor; it's just that its predecessor wasn't a 3070 but rather a 3060. So Nvidia seems to have gone much more for the same performance at lower energy usage (plus a slightly lower price): https://youtu.be/uZHDq-LEGzw?t=1242
The 4070 has roughly the same performance as the 3080 but uses much less power.
This is supported by the 4000 series design being nearer to the AMD 6000 series than the 3000 series was; the 6000 series used a smaller bus and Infinity Cache to help lower power usage: https://www.pcgamer.com/amd-infinity-cache-rx-6800-xt-rx-6800-rdna-2/
Actually, isn't power consumption part of the value argument? This lady shows that the Intel Arc isn't good value because of its power usage, and that someone could pay for a new GPU simply by having a GPU which uses less power than the Arc: https://youtu.be/EYb5PhPSIpE?t=563
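As a back-of-the-envelope check on the "pays for itself" idea, here's a sketch of the payback time; the premium, wattage and tariff are made-up illustrative numbers, not figures from that video:

# How long a power-efficiency saving takes to cover a price premium.
# Illustrative assumptions only - substitute real prices and wattages.
price_premium = 100      # extra cost of the more efficient card, GBP
watts_saved = 100        # difference in draw under load
hours_per_week = 30      # heavy-use PC
price_per_kwh = 0.34     # GBP per kWh

saving_per_year = watts_saved / 1000 * hours_per_week * 52 * price_per_kwh
print(f"£{saving_per_year:.0f}/year saved, payback in "
      f"{price_premium / saving_per_year:.1f} years")
# -> £53/year saved, payback in 1.9 years

Whether that payback period beats the card's useful life is the heavy-user vs light-user split again.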
Is it that bad to wait 2 generations?
On my Nvidia PC I bought a 680 for around £235, skipped the 700 series, bought a 970 for £180, skipped the 1000 series and bought a 2060 12GB for £270. Notice how every time I purchased it was under MSRP and performed well; e.g. the 2060 12GB performs better than the 1080, has 50% more VRAM, DLSS and full DirectX 12 support. So by skipping buying a new gen at launch I got a better GPU for £270 vs £630. https://hexus.net/tech/reviews/graphics/93686-msi-geforce-gtx-1080-gaming-x/?page=13
The 4070 uses a slightly larger chip than the 3060, and the cost of the better TSMC node for the 4070 vs the Samsung node for the 3060 also makes the 4070 more expensive to make, so to say that the 4070 = a 3060 isn't strictly accurate, as the cost to make a 4070 over a 3060 is certainly higher. I remember some people saying they were concerned about the increase in cost going from the Samsung 3000 series to the TSMC 4000 series just because of the extra cost of the improved node.
The fact that they got 3080 performance out of a 4060-class card is impressive, but calling it a 4070 and asking £600 for it, less so.
Power consumption matters when two products have the same performance. Comparing e.g. a 6600 XT to a 4090 is pointless.
It's flip-flop. When the HD4870/HD4890 matched the GTX260/GTX260 (216) for less money, people were saying the GTX consumed less power and had CUDA. Then when Fermi came out, it was all about tessellation and overclocking (even though many ATI cards also overclocked well). Nobody cared about power consumption.
It was the same for the people who purchased a GTX750TI over an HD7850, despite it being slower and the HD7850 being quite efficient.
Also when RDNA2 had a power consumption advantage and VRAM advantage over the RTX3080, it wasn't important:
60W more for an RTX3080 10GB at average gaming load over an RX6800XT 16GB, and it was all about RT. Nobody cared about power consumption.
The RX6800 was also the most efficient gaming card of the last generation.
Suddenly, when the RX7900XTX 24GB was slightly faster than a more expensive RTX4080 16GB, it was all about power consumption and RT:
50W over an RTX4080 16GB was suddenly a big deal.
The same goes for the spin over the RTX4070TI:
The RX7900XT was 10% faster, had nearly 70% more VRAM and consumed 40W~50W more. Suddenly it was all about the power consumption and RT.
Edit!!
If people cared about the planet they wouldn't touch an Intel CPU, would have got an RX6600 8GB, and would play games at 1080p. The RX6600 can be made to use under 100W.
I might even understand SFF system fans doing this, but most people on here are not using SFF systems.
It's all spin to justify Nvidia jacking up pricing. So to save power you spend 60% more than an RTX3060TI 8GB for just over 40% more performance.
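Putting that as a simple value ratio (using the rough 60%/40% figures above, not exact street prices):

# Price vs performance: paying 60% more for ~40% more performance.
# The 1.6x / 1.4x multipliers are the rough figures quoted above.
price_ratio = 1.60       # RTX4070 vs RTX3060TI price
perf_ratio = 1.40        # relative performance

print(f"Performance per pound: {perf_ratio / price_ratio:.2f}x the 3060TI")
# -> 0.88x, i.e. roughly 12% worse value per pound despite the efficiency gain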
But there were people on here who got rid of perfectly fine AIB R9 290 4GB cards to get a GTX970 3.5GB because of "power consumption". The same with people ignoring the R9 390 in favour of the GTX970, but then pairing them with massively overclocked CPUs too!
I could even understand it if they had an SFF system, but nope, it was full-sized systems. That worked out well long term, didn't it, with the way the R9 290/390 performance has improved.
I showed the numbers for RDNA2 vs Ampere. We have had high energy prices since 2021. If power consumption really mattered that much, nobody should have bought an Ampere card.
As someone who bought a 970 for £180 new, I find the increased cost of gaming on a 70-class card beyond what I want to pay, and as always I will wait for the 4000 or 5000 series to go EOL before I upgrade; no one is forced to buy at these prices. If you don't like the price, don't buy. There are plenty of people who will pay £589 or more for a GPU, and that encourages Nvidia to price around this level. The most expensive Nvidia card I have ever bought is a 3060 Ti FE; plenty of people have spent more than the £380 I paid to buy Nvidia cards, encouraging them to produce more expensive ones.
Each card has a minimum power draw, so you can only adjust power downwards to a certain point; e.g. a 6500 XT idles at 2 watts vs 28 watts for a 6800, and during video playback the 6500 XT uses 12 watts vs 48 watts for the 6800.
Also, I would rather have a much faster dGPU for the same price, even if it consumes more power. The RTX4070TI should have been the RTX4070 at £470:
That would have been a proper 50% improvement in performance over the RTX3070. The RTX3070TI was a cash grab. Even if it meant 45W more power than the RTX3070, it would be worth it:
You can always adjust power downwards if required using MSI Afterburner or locking FPS.
got to play stardew in the summer months
What if you want to play AAA games during summer? So you only play AAA in winter? Seems a strange arrangement.
The 4070 chip is more cut down than a 3060, so it actually takes up less area, but even with TSMC costs of $16k per wafer for 5nm you're only talking about $80 per chip. So even if the 3060's Samsung node were free, which it wasn't (more like $8k per wafer), you're only going from $330 to $410 while the rest of the parts are pretty much identical.
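Those per-chip numbers are easy to sanity-check with the standard dies-per-wafer approximation; the die size and wafer price below are public ballpark figures, not confirmed costs:

import math

# Rough dies-per-wafer and cost-per-die estimate for a 300mm wafer.
# Die size and wafer price are ballpark figures, not confirmed costs.
wafer_diameter = 300     # mm
die_area = 295           # mm^2, roughly AD104 (the 4070's chip)
wafer_price = 16_000     # USD, rough TSMC 5nm-class figure

# Usable dies = wafer area / die area, minus an edge-loss term
# proportional to the wafer circumference.
d = wafer_diameter
dies = math.pi * (d / 2) ** 2 / die_area - math.pi * d / math.sqrt(2 * die_area)
print(f"~{dies:.0f} dies per wafer, ~${wafer_price / dies:.0f} per die before yield")
# -> ~201 dies, ~$80 per die, in line with the figure quoted above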
But I don't want to play AAA games in the summer. I prefer to spend time outside with the other half; she loves being in the pool.