Within reasonable ranges, power consumption is, or should be, a non-issue for gamers.
Performance first, power second.
It isn't a non-issue for everyone, though:
https://www.pcworld.com/article/394956/why-california-isnt-banning-gaming-pcs-yet.html
Even the memory bandwidth of the GPU is taken into account when deciding which computers can be sold in California, because of regulations on energy usage. So why wouldn't Nvidia cut memory bandwidth, add more cache, and reduce energy use if it made their products easier to sell in various US states?
The problem with the whole RTX 4000 series pricing stack is that even significant price reductions don't make it good value.
For example, if the RTX 4070 12GB were reduced to £470, after 2.5 years it would be barely 25% faster than an RTX 3070, with 4GB more VRAM.
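To put that value argument in numbers, here's a back-of-the-envelope sketch. The £470 prices and the ~25% uplift are the figures assumed above, not measured data:

```python
# Quick perf-per-pound comparison (illustrative figures from the discussion,
# not benchmarks: both cards assumed at a £470 street price).
old_price, old_perf = 470, 1.00   # RTX 3070 as the baseline
new_price, new_perf = 470, 1.25   # RTX 4070 cut to £470, ~25% faster

old_value = old_perf / old_price  # performance per pound, 3070
new_value = new_perf / new_price  # performance per pound, 4070
uplift = new_value / old_value - 1

# Only a 25% improvement in performance per pound after 2.5 years.
print(f"Perf-per-pound improvement: {uplift:.0%}")
```

Even after a hypothetical price cut, the generational gain in performance per pound matches the raw performance gain, which is the point being made: the value simply hasn't moved much in 2.5 years.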
There is no need to buy at launch; wait and see if they go down in price, e.g. at EOL. Remember JayzTwoCents getting excited about price reductions on GPUs in the Nvidia shop, lol:
https://youtu.be/p79H_XOwpZo?t=61
The extra 4GB isn't free, and neither is the node shrink, so of course the 4070 costs more than the 3070, even before inflation.
As I have said previously, when going to a newer node Nvidia has to decide how much of the benefit goes to a performance increase and how much to reduced power consumption. Many people on here complain that they wanted performance more than power savings; however, that ignores the growing pressure on Nvidia to reduce power usage, e.g. as above:
https://www.pcworld.com/article/394956/why-california-isnt-banning-gaming-pcs-yet.html
The sad reality is that people just buy these things without bothering to check, unlike in the past. So all we are getting is performance stagnation at current pricing. People trying to spin the RTX 4070 or RX 7900 XT selling well don't seem to appreciate that a lot of the market isn't moving forward in performance.
Is the 4070 not selling well because people have checked the reviews of its performance and decided it doesn't offer enough of an upgrade over the Nvidia 3000/AMD 6000 series cards they already bought?
I agree with you that the 4070 isn't a good performance increase over the 3000/6000 series for the price; however, the energy reduction compared to the 3080 is significant and lowers the cost of ownership over its life. "The more you play, the more you save" should be the new Nvidia phrase. If AMD get FSR3 working on my 6800, then the 4070 could even be seen as a downgrade because of the reduced VRAM.
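The "the more you play, the more you save" claim is easy to sanity-check with rough numbers. A minimal sketch, assuming roughly 200 W gaming draw for a 4070 and 320 W for a 3080 (typical figures, not measurements), 2 hours of gaming a day, and an illustrative UK-style tariff of £0.30/kWh:

```python
# Back-of-the-envelope yearly running-cost comparison.
# All figures are assumptions for illustration, not measured values.
GPU_WATTS = {"RTX 4070": 200, "RTX 3080": 320}  # assumed typical gaming draw
PRICE_PER_KWH = 0.30                             # illustrative GBP per kWh
HOURS_PER_YEAR = 2 * 365                         # 2 hours of gaming per day

def yearly_cost(watts: float) -> float:
    """Electricity cost per year for a given sustained draw."""
    kwh = watts / 1000 * HOURS_PER_YEAR
    return kwh * PRICE_PER_KWH

saving = yearly_cost(GPU_WATTS["RTX 3080"]) - yearly_cost(GPU_WATTS["RTX 4070"])
print(f"Yearly saving: £{saving:.2f}")  # ~£26 per year under these assumptions
```

Under those assumptions the saving is real but modest (tens of pounds a year), so it only becomes a big factor for heavy players or expensive electricity.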
What about at night? You get days when it rains; what do you do then? We're lucky to get a few weeks of sun.
Also, you're paying for the 6500 XT, so how much are you actually saving? Can you not just undervolt the 6800 for the less demanding games?
Within reason, I'd rather have performance first.
I use Radeon Chill to limit power usage on the 6800, plus the power slider. There is a limit to how far you can get the card down, though; just compare the VRAM, as 16GB on the 6800 takes more power than just 4GB on the 6500 XT.
Taken from the review of the 2060 12GB by Techpowerup, "Compared to the RTX 2060, non-gaming power consumption is increased because the extra memory draws additional power."
https://www.techpowerup.com/review/nvidia-geforce-rtx-2060-12-gb/35.html
Even on the 6500 XT I frame cap in less demanding games. I get that many, perhaps most, people wish the 4070 had used the same amount of power as the 3080 and significantly improved performance; however, as I have shown above, there is growing pressure on Nvidia and other computer hardware companies to limit the power usage of computers, e.g. by the state of California.
Not only does a 300 watt GPU cost electricity to run, it also costs money for the air conditioning to remove that heat in some parts of the world, such as my home in Missouri, USA.
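The air-conditioning penalty can be sketched roughly too: every watt the GPU draws ends up as heat indoors, and the A/C needs extra electricity to pump it back out, scaled by its coefficient of performance (COP). The COP, hours, and tariff below are illustrative assumptions:

```python
# Rough total-electricity sketch for a 300 W GPU when A/C must remove the
# waste heat. All figures are illustrative assumptions, not measurements.
GPU_WATTS = 300
AC_COP = 3.0           # assumed A/C coefficient of performance
HOURS = 4 * 30         # e.g. 4 hours/day over a hot month
PRICE_PER_KWH = 0.15   # illustrative US tariff, USD per kWh

gpu_kwh = GPU_WATTS / 1000 * HOURS   # energy drawn, all dumped as indoor heat
ac_kwh = gpu_kwh / AC_COP            # extra electricity the A/C uses to remove it
total_cost = (gpu_kwh + ac_kwh) * PRICE_PER_KWH

print(f"GPU: {gpu_kwh:.0f} kWh, A/C: {ac_kwh:.0f} kWh, total: ${total_cost:.2f}")
```

With a COP of 3, the effective electricity bill for the GPU is about a third higher than the GPU's own draw during A/C season, which is why the heat matters in places like Missouri.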
If people don't think the 4070 is acceptable over the 3070, I can't wait to read what they think about the 4060 Ti/4060.