Soldato
Within reasonable ranges, power consumption is a non-issue, or should be, for gamers.
Performance first, power second.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Each card has a minimum power draw, so you can only adjust power downwards by so much. For example, a 6500 XT draws 2 watts at idle and a 6800 draws 28 watts; during video playback the 6500 XT uses 12 watts vs the 6800's 48 watts.
This is why I have 2 AMD cards. I use the 6500 XT most of the year, as it can handle the older games that I mostly play. Then in the winter I install the 6800 and play the new AAA games that I fancy on it, alongside the old games I play year round. The 6800 also acts as extra heating in the winter; I never had to put the heating on in the room it was in. The 6500 XT is paying for itself.
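The "paying for itself" claim can be sanity-checked with a quick calculation. A rough sketch, using the video-playback wattages quoted above; the 20p/kWh rate (mentioned later in the thread) and the 8 hours/day usage are assumptions:

```python
# Rough yearly running-cost comparison for the video-playback figures above.
# Assumed: 20p/kWh electricity and 8 hours of use per day, every day.
RATE_GBP_PER_KWH = 0.20
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365

def yearly_cost(watts: float) -> float:
    """Cost in GBP of drawing `watts` for HOURS_PER_DAY hours every day for a year."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * RATE_GBP_PER_KWH

# Video playback: 12 W (6500 XT) vs 48 W (6800)
saving = yearly_cost(48) - yearly_cost(12)
print(f"£{saving:.2f} saved per year")  # 36 W x 8 h x 365 d = 105.12 kWh -> £21.02
```

At those assumptions the gap is around £21 a year for video playback alone, so a cheap second card does slowly claw its price back.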
Sapphire Radeon RX 6500 XT Pulse Review
The Sapphire Radeon RX 6500 XT Pulse comes with super impressive noise levels. Even when fully loaded does it run whisper-quiet in an already quiet room. If you put it into a case, it'll be inaudible. Unfortunately, the card is held back by its small VRAM size of 4 GB and the narrow PCIe x4... (www.techpowerup.com)
But I don't want to play AAA games in the summer. I prefer to spend time outside with the other half; she loves being in the pool.
If people want RT and path tracing to go forward, we need massive jumps in performance at all tiers.
Too many people are patting themselves on the back in defence of poor hardware having "decent" sales. It seems ignorance is the new badge of honour!
The problem with the whole RTX 4000 series pricing stack is that even significant price reductions still don't make it good value.
For example, even if the RTX 4070 12GB were reduced to £470, after 2.5 years it would be barely 25% faster than an RTX 3070, with 4GB more VRAM.
Is the fact that the 4070 isn't selling well because people have checked out reviews of its performance and decided that it doesn't give enough of an upgrade over the Nvidia 3000/AMD 6000 series that they bought? The sad reality is people just buy these things without bothering to check, unlike in the past. So all we are getting is performance stagnation at current pricing. People trying to spin an RTX 4070 or RX 7900 XT selling well don't seem to appreciate that a lot of the market isn't moving forward in performance.
I use Radeon Chill to limit power usage on the 6800, plus the power slider. There is a limit to how much you can get the card down to, though; just compare the VRAM: 16GB on the 6800 takes more power than just 4GB on the 6500 XT. What about at night? And what do you do on the days when it rains? We're lucky to get a few weeks of sun.
Also, given you paid for the 6500 XT, how much are you actually saving? Can you not just undervolt the 6800 for the less demanding games?
Within reason, I'd rather have performance first.
I think the 4070 is considerably cheaper to make than a 3070.
The extra 4GB isn't free, and neither is the node shrink, so of course the 4070 costs more than the 3070, even before inflation.
It's not selling well because it offers 60-class performance for £600. If people bought based on energy consumption, the lower-end cards would cost more than the higher-end cards.
Is the fact that the 4070 isn't selling well because people have checked out reviews of its performance and decided that it doesn't give enough of an upgrade over the Nvidia 3000/AMD 6000 series that they bought?
I agree with you that the 4070 isn't a good performance increase over the 3000/6000 series for the price. However, the energy reduction compared to the 3080 is significant and lowers the cost to own over its life; "the more you play, the more you save" should be the new Nvidia phrase. If AMD gets FSR3 working on my 6800, then the 4070 could be seen as a downgrade because of the reduced VRAM.
Many, perhaps most, people wish that the 4070 had used the same amount of power as the 3080 and significantly improved performance. However, as I have shown above, there is growing pressure on Nvidia and other computer hardware companies to limit the power usage of computers, e.g. by the state of California.
Not only does a 300-watt GPU cost electricity to run, but it also costs money for the air conditioning to remove that heat in some parts of the world, such as my home in Missouri, USA.
Some IT companies are selling these cards for <£500, but they aren't available to the public at that price.
This probably gives an indicator of what they're worth - but value is relative and depends on what people are willing to pay.
You sure you are not just looking at prices without VAT?
You are paying 20p per kWh of electricity in London?
Well, I live in London, England, and the extra heat is almost always welcome, and on the limited number of days it isn't, I am out enjoying the sun.
An extra 100W is not a huge deal for me personally. That's around 2p extra per hour for me. Say I managed to game 1000 hours over 2 years before upgrading; that is £20. That is almost worth it just for the extra heat alone.
Not defending it, by the way; I prefer more efficient GPUs myself, and 200W is sweet. But it only goes so far when you take the data above into account.
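The arithmetic in that post works out. A minimal sketch, using the 20p/kWh rate and the 1000-hour figure from the post:

```python
# Extra running cost of a GPU drawing 100 W more, at 20p/kWh.
extra_watts = 100
rate_per_kwh = 0.20  # £ per kWh

cost_per_hour = extra_watts / 1000 * rate_per_kwh  # 0.1 kWh x £0.20 = £0.02 (2p/hour)
total = cost_per_hour * 1000                       # 1000 hours of gaming -> £20

print(f"{cost_per_hour * 100:.0f}p per hour, £{total:.2f} over 1000 hours")
```

So at UK capped rates, a 100W efficiency gap costs roughly £20 over two years of fairly heavy gaming.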
You are paying 20p per kWh of electricity in London?
Wow, that is very low these days.
Yes, 19.88p today to be exact.
Capped rate for my home in the UK. For the USA, since 1st April, "The new rate for residential accounts will be 11.71 cents per kilowatt-hour." How much are you paying per kWh of electricity?
I think the 4070 is considerably cheaper to make than a 3070.
GDDR6X is cheaper than standard GDDR6 since only Nvidia uses it; the AD104 die is 25% smaller, and the 4070 cuts that down by another 25%, using only 5888 CUDA cores out of the possible 7680, so yields would be excellent.
Then there is the emptier PCB: a lower TDP and a narrower bus require fewer components, so that is another cost saving.
If you want to estimate the cost of this GPU, then base it off a $330 3060 12GB, as the spec and production cost are almost identical bar the die, which costs about $30-40 more at TSMC.
Yet AMD managed to either cut or keep the same prices for its CPUs built on 5nm.
Don't forget that since the MSRP of the 3070 was announced, not only is the smaller node more expensive, but TSMC announced a 10 to 20% price increase. Substrate prices went up, other component prices went up, even the price of the cardboard box went up. And when the selling price goes up, the VAT goes up too: if the selling price goes up by £50, you have to add an extra £10 for VAT.
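The VAT point is just the UK standard 20% rate applied to the increase; a one-liner confirms the £50 to £10 figure:

```python
# UK VAT at the standard 20% rate: a £50 rise in the pre-VAT selling price
# adds £10 of VAT, so the shelf price actually rises by £60 in total.
VAT_RATE = 0.20
price_increase = 50.0

extra_vat = price_increase * VAT_RATE          # £10.00
shelf_price_increase = price_increase + extra_vat  # £60.00

print(f"£{extra_vat:.2f} extra VAT, £{shelf_price_increase:.2f} total increase")
```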
We heard that companies such as Nvidia were booking long-term supply contracts to ensure supply of components to make GPUs, so are they locked into higher prices? It has also been claimed that they booked more capacity at TSMC than they likely need, so is the cost of any unused capacity figured into the price? While gamers shouldn't have to pay extra for GPUs because Nvidia signed expensive contracts it no longer needs, for components that are now readily available, is this factored into the BOM cost for the 4000 series? My experience in business is that it would be, if it has occurred.