
Gaming Cost and Electricity prices going up in the UK

Power is not energy used; it is a rate of consumption. In my example I use 0.3 kWh of electricity per day to game. If I mined in that same period it would be at least 3.6 kWh per day. That's a lot more energy consumed.

I've not included the TV/monitor and audio system as I'm comparing apples to apples. It's likely TV/Netflix would be watched too for a portion of the time I was mining, so it's difficult to truly separate.

In your case then, gaming costs you 0.48 kWh per hour whereas mining for the day is 8.4 kWh. Quite the difference.

You say you're comparing apples to apples and then proceed to compare power usage per hour vs per day. :confused:

I'm with @gpuerrilla on this - comparing apples to apples (i.e. over the same time period) - gaming uses far more power than mining does, and it makes perfect sense to include your monitor, speakers etc. since they wouldn't (shouldn't) be in use while mining, but certainly would be during gaming.
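To put the same-time-window point into numbers, here's a quick sketch. The wattages are illustrative assumptions chosen to match the figures quoted above (roughly 300 W while gaming for an hour a day versus ~150 W for a tuned miner running around the clock), not measurements:

```python
# Apples-to-apples comparison over the same 24 h window.
GAMING_DRAW_W = 300   # PC under gaming load (assumed)
MINING_DRAW_W = 150   # undervolted single-GPU miner, 24/7 (assumed)

def daily_kwh(watts: float, hours: float) -> float:
    """Energy = power x time; convert watt-hours to kWh."""
    return watts * hours / 1000

gaming = daily_kwh(GAMING_DRAW_W, hours=1)    # one session per day
mining = daily_kwh(MINING_DRAW_W, hours=24)   # runs all day

print(f"Gaming: {gaming:.1f} kWh/day, mining: {mining:.1f} kWh/day")
# Gaming: 0.3 kWh/day, mining: 3.6 kWh/day
```

Instantaneous power favours the gaming rig, but energy over the day favours the always-on miner - which is exactly why mixing per-hour and per-day figures muddles the argument.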
 

Quite the difference! :D

It was intended to be about my use case, but I'd argue it's quite typical actually.


Totally understand your POV. However, like I said, the sweeping statements people use are where the accuracy is lost. This is why we get mouth-foamers blaming groups without digging in: the label mentality. With a real-world comparison I showed that gaming, for me, used more power than mining. I was also being generous with the figures; if I used stock settings and whacked everything up on a demanding title like CP'77 it would be more like 500 W at the wall.

There are also folk who game for far more hours per session without any regard for what that costs. At least, given the current economic state, it will remind people what gaming actually costs in energy terms.
 
Oh right, you only mine for an hour a day? Sorry. :rolleyes:

Spotted the edit. Yeah, I didn't intend it as a sweeping statement, but my example does go a long way to showing why it still holds quite true. It's the time aspect that really makes mining consumption add up versus gaming. You'd need to be gaming 8-10 hours a day, accounting for all the ancillaries too, to match the consumption of a single-GPU mining system; that was my point really.
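Rough numbers behind that 8-10 hours claim: assuming ~350 W continuous for a single-GPU miner (which gives the 8.4 kWh/day quoted earlier in the thread) and ~900 W for a gaming rig plus TV and audio - both wattages are assumptions for illustration:

```python
# How many hours of gaming (PC + ancillaries) match one day of mining?
MINING_DRAW_W = 350     # continuous single-GPU miner (assumed)
GAMING_TOTAL_W = 900    # rig plus TV/monitor and audio (assumed)

mining_kwh_per_day = MINING_DRAW_W * 24 / 1000          # 8.4 kWh
break_even_hours = mining_kwh_per_day / (GAMING_TOTAL_W / 1000)

print(f"{break_even_hours:.1f} h of gaming = one day of mining")
# 9.3 h of gaming = one day of mining
```

With those assumed figures the break-even point lands inside the 8-10 hour window; a lighter gaming load or a hungrier miner pushes it higher.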
 
I'm changing some of my WfH practices. I used to have my main PC turned on (displaying via the TV and sound output via a big amplifier and wall speakers) alongside my work laptop so I could download games and listen to music and use it as a Second Screen kinda thing but now I just use the Spotify app on my TV and have the PC and amp turned off. I'll download games when I'm actually using my PC.

But I'm not really changing my gaming habits. I already used frame limiters to help with noise so I'll keep doing that.
 
For the first time ever that I can remember, people are now factoring in power usage for the CPU, and now the GPU (thanks, Nvidia), when buying a bit of hardware. With leccy costs rising, power efficiency is now a real thing to look into. AMD looking efficient...
 

I agree. It used to be something the Nvidia tribe in particular would point out to AMD users, as there was a period when AMD cards used a lot more energy to try to compete. Since RDNA, though, we've had very competitive power efficiency, with some of the AMD cards slightly turning the tables. As some have stated, the tone tended to be that nobody cared, as it's gaming performance that matters!

With electricity costs really soaring, and maybe worse to come after October, it has become more relevant to factor the numbers into purchasing decisions. If the Ada series is more of a guzzler, we'll have to see how the 7000-series cards compare.
 
Just in case you forget, check the thread title. You were the one who mentioned mining; nobody else did. Instead of twisting it to suit your narrative, think about the facts.

I used mining to elaborate on my example, for my use case. It allowed me to point out that in my use, gaming isn't significantly altering my electricity costs. You correctly identified that my example doesn't apply to everyone so I replied to your observations. I considered your example and found that the results are largely the same. I have no narrative to suit, only some calculations to support my response to the question in the thread title.

I think you should reconsider who is doing the twisting before editing your post again.
 
My PC is 60 W idle and 250 W gaming, so 4 hours of gaming = 28p + VAT. The standing charge is 41p/day, so you get whacked even if you turn the leccy off. I'm glad I don't have a smart meter...
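For anyone checking the sums, here's that post's arithmetic spelled out. The 28p/kWh unit rate is inferred from the figures quoted (it isn't stated directly), and 5% is the standard UK VAT rate on domestic energy:

```python
# Cost of one 4-hour gaming session plus the daily standing charge.
GAMING_W = 250        # gaming draw from the post
UNIT_RATE_P = 28      # pence per kWh, ex VAT (inferred, assumed)
STANDING_P = 41       # standing charge, pence per day
VAT = 1.05            # 5% VAT on domestic energy

session_kwh = GAMING_W * 4 / 1000             # 1.0 kWh
usage_cost_p = session_kwh * UNIT_RATE_P * VAT
day_total_p = usage_cost_p + STANDING_P * VAT

print(f"4 h gaming: {usage_cost_p:.1f}p; day total inc. standing charge: {day_total_p:.1f}p")
```

The striking part is that the standing charge dwarfs the actual gaming usage, which is the poster's point about paying even with everything switched off.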
 
Say you're a gamer who spends 15 hours a day gaming, 365 days a year: a 3090 Ti costs around £600 annually on a mid tariff, more depending on many factors, and the price keeps going up and up.
Just think: a 3090 Ti with a 12900KS, then factor in all the other hardware, lol
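A ballpark for that scenario, assuming ~450 W for the 3090 Ti alone and a ~24p/kWh mid tariff (both figures are assumptions; whole-system draw with the 12900KS would be noticeably higher):

```python
# Annual card-only energy cost for a 15 h/day, 365-day gamer.
CARD_W = 450          # 3090 Ti under load (assumed)
HOURS_PER_DAY = 15
RATE_P_PER_KWH = 24   # mid tariff, pence per kWh (assumed)

annual_kwh = CARD_W * HOURS_PER_DAY * 365 / 1000
annual_cost_gbp = annual_kwh * RATE_P_PER_KWH / 100

print(f"{annual_kwh:.0f} kWh/year = about £{annual_cost_gbp:.0f}")
# 2464 kWh/year = about £591
```

That lands near the £600 figure quoted; nudging the tariff or hours either way moves it well past it.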
 