Gaming costs and electricity prices going up in the UK

I have a power meter.

In Fire Strike, my 5800X + RTX 3070 with 32GB (4 sticks) of 3733MHz memory, 3 SSDs, an HDD and 5 high-performance 120mm fans pulls 510 watts max from the wall; accounting for PSU efficiency, that's more like 440 watts at the components.
Good luck to anyone with a 12900K and a card faster than the RTX 3070, as your numbers will be much higher.
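If anyone wants to sanity-check that wall-to-component conversion, here's a minimal sketch. The ~86% efficiency figure is my assumption (roughly 80 Plus Gold territory at this sort of load), not from the post, so substitute your own PSU's curve:

```python
# Estimate DC power delivered to the components from the wall reading.
# The 0.86 efficiency is an assumed figure (~80 Plus Gold at ~50% load);
# check your own PSU's efficiency curve for a better number.
def component_power(wall_watts: float, psu_efficiency: float = 0.86) -> float:
    return wall_watts * psu_efficiency

print(f"{component_power(510):.0f} W")  # ~439 W, matching the ~440 W above
```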

[Chart: multi-threaded power draw]

[Chart: single-threaded efficiency]


Full rig below.

[Image: full rig]
 
I've capped all my games to 60fps, saving loads on electricity: 100W on my graphics card alone, and more again from the CPU, RAM etc. I really don't notice any difference playing at 60fps. I'll probably start an argument here, but in my view the need for any more is just a myth to sell graphics cards and monitors.
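To put a rough number on that saving, here's a quick back-of-envelope sketch. The tariff and daily hours are my assumptions, not figures from the post, so plug in your own:

```python
# Rough annual saving from shaving 100 W off the system while gaming.
watts_saved = 100        # from the post: GPU saving alone
hours_per_day = 3        # assumed play time
price_per_kwh = 0.28     # GBP per kWh, assumed UK tariff

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
print(f"~£{kwh_per_year * price_per_kwh:.2f} per year")  # ~£30.66
```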
 
I've capped all my games to 60fps, saving loads on electricity: 100W on my graphics card alone, and more again from the CPU, RAM etc. I really don't notice any difference playing at 60fps. I'll probably start an argument here, but in my view the need for any more is just a myth to sell graphics cards and monitors.
It's only a myth if you aren't interested in competitive play or in gaining every advantage you can in certain games (speedrunners, for example).

Not just that, but there's less strain on the eyes during motion on higher-refresh-rate screens, and it's far easier to track targets with your mouse. If competition means nothing to you and your eyes are fine with blurrier movement and trailing smeared images, given the way LCDs present motion, then sure, I agree with you.

I only half agree, personally. I stopped playing anything competitively; I no longer have it in my system... win or lose it's a zero-sum game, and that took away the passion to even try.
But I still love my motion clarity. In some games I cap the framerate to 80-100fps, which keeps my 3070 between 140-200 watts when maxing visual settings at 1440p.

The only real fix is going back to CRTs, but then you take on a power hike and screen flicker as well; you can't truly win.
 
I play at 4K on OLED so I never see any ghosting or blurring. It depends what you call "competitive gaming". I'm not really a Fortnite player; I occasionally play CoD, but mostly Battlefield and RPGs, and I don't care whether I win or lose, I play for fun.
 
I've capped all my games to 60fps, saving loads on electricity: 100W on my graphics card alone, and more again from the CPU, RAM etc. I really don't notice any difference playing at 60fps. I'll probably start an argument here, but in my view the need for any more is just a myth to sell graphics cards and monitors.
60fps is generally fine, but it's not a myth at all, and you don't even need to be a competitive player to notice the difference going from a 60Hz screen to a 120Hz TV or a 144Hz monitor. It has nothing to do with ghosting or blurring either: 144Hz is just far smoother, especially in faster-paced games.
 
I play at 4K on OLED so I never see any ghosting or blurring. It depends what you call "competitive gaming". I'm not really a Fortnite player; I occasionally play CoD, but mostly Battlefield and RPGs, and I don't care whether I win or lose, I play for fun.

Quick question: how many watts does one of these OLED tellies use while playing games? We get decent efficiency from monitors, as they tend not to be massive, but would anyone using a 48" TV care to share what it's pulling at 120Hz?
 
Quick question: how many watts does one of these OLED tellies use while playing games? We get decent efficiency from monitors, as they tend not to be massive, but would anyone using a 48" TV care to share what it's pulling at 120Hz?
My CX 48" OLED draws around 60 watts, which climbs to around 80 watts with HDR enabled. That's read from my watt meter, so it doesn't account for the TV's internal PSU efficiency.

I have my CX OLED settings as per this guide.
 
That's not as bad as I thought it could be, then. I was concerned the larger displays were 100W+, and with higher refresh rates and modules like G-Sync there's another device to think about.
I think it may have drawn more power initially, until I adjusted the settings as per that guide. I'm sure that with HDR on it initially peaked at around 100W.
 
While 60fps is good enough, I genuinely see a difference playing at 120fps on an OLED screen.

Changing from 120 to 60 feels jarring.

Obviously, the longer you play, the more you get used to it. I was able to get used to 30fps on console, especially in smooth games like Spider-Man with its good use of motion blur.

Where I can't see a difference is from about 80fps up to 120. If I use RTSS to cap a game at 90fps and then at 120fps, both look equally smooth when panning.
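For anyone wondering what a frame cap actually does: conceptually, the limiter just pads every frame out to a fixed frame-time budget, and the padding is idle (low-power) time. A toy sketch of the idea below; RTSS itself hooks the game's present call and waits far more precisely, so this is only the concept, not its implementation:

```python
import time

TARGET_FPS = 90
FRAME_TIME = 1.0 / TARGET_FPS  # ~11.1 ms budget per frame

def render_frame():
    """Placeholder for the game's real per-frame work."""
    time.sleep(0.004)  # pretend rendering takes 4 ms

for _ in range(TARGET_FPS * 5):  # run the toy loop for ~5 seconds
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_TIME:
        # The GPU/CPU sit idle here instead of rendering extra
        # frames, which is where the power saving comes from.
        time.sleep(FRAME_TIME - elapsed)
```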
 
We had this same conversation before, in another thread, so I’m surprised we have to do it all over again only to come to the same outcome.

If you look back and read my posts, I already acknowledged that it's more efficient in lightly threaded workloads. It's less efficient in all-core workloads, though, agreed?

Whilst I play other games, there's only one game I dedicate any serious time to, and that game benefits most overall from a 5950X.

I'll play devil's advocate for you and pretend it's slightly faster in a heavy all-core workload like FM, but it's still going to draw more power, most likely quite a bit more, with every core pegged at 100% utilisation. Which comes back to you advising me to get it for power-efficiency reasons: it's not going to be more power efficient in this workload. Hopefully that's sunk in now.

Just to add: I don't crave the benchmark, it's just a workload similar to how I have my game set up. That's my use case, like it or not.

Probably better we agree to disagree.

Everything should change in a few months, with the 24-core 13900K launching and of course Zen 4! It'll be interesting to see how many cores the flagship desktop AM5 Zen 4 CPU gets in its first generation; not sure if we've had that leak yet?
 
Probably better we agree to disagree.

Everything should change in a few months, with the 24-core 13900K launching and of course Zen 4! It'll be interesting to see how many cores the flagship desktop AM5 Zen 4 CPU gets in its first generation; not sure if we've had that leak yet?

Zen 4 vs Intel's 13th-gen Core will be very interesting.
 
Quick question: how many watts does one of these OLED tellies use while playing games? We get decent efficiency from monitors, as they tend not to be massive, but would anyone using a 48" TV care to share what it's pulling at 120Hz?

My 55-inch OLED, when gaming at 120Hz with HDR, VRR etc. on, uses 100-110 watts. I tried power-saving mode but it's too dark to enjoy.
 
After sticking with my 1080 Ti playing at 4K, I bit the bullet when the 3090 FE came out. It had been a while, and with COVID hitting I thought it would be ideal for some amazing gaming times.

I didn't even think about energy costs, but that's all been sorted now, as a very unusual reason is saving me a fortune when it comes to gaming running costs.

AAA game devs releasing utter, utter garbage... yep, saving me a fortune, thanks guys. What a bloody waste of an upgrade cycle :cry:
 
I've capped all my games to 60fps, saving loads on electricity: 100W on my graphics card alone, and more again from the CPU, RAM etc. I really don't notice any difference playing at 60fps. I'll probably start an argument here, but in my view the need for any more is just a myth to sell graphics cards and monitors.
Higher than 60fps being nothing but a myth to sell GPUs and monitors is provably false. I'd suggest watching this: https://www.youtube.com/watch?v=OX31kZbAXsA

When I upgraded my monitor from 60Hz to 144Hz, the smoothness and responsiveness were immediately noticeable, both in games and in general desktop use. Once I was used to it, I tried 60fps again and it was jarring; it felt the way switching from a 60fps game to a 30fps game on a PS4 used to feel.
 
The thing to remember about gaming at 60fps is that if your GPU downclocks too much, it may not be a smooth experience. I felt it on both my 580 and my 3060 Ti. The power savings are great when it does downclock, though, as the voltage drops too. Black Mesa on my 580 was the worst example I had of it, with the GPU running at ~700MHz IIRC. I ended up finding ClockBlocker to force full clocks, which made the game run smoothly even though the fps remained unchanged.
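If you want to watch for that downclocking on your own card, here's a quick sketch that polls clocks and power draw once a second; it assumes an NVIDIA GPU with nvidia-smi available on the PATH:

```python
import subprocess
import time

# Poll GPU core clock and power draw while a capped game runs,
# to see whether the card has dropped to a low power state.
QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.sm,power.draw",
    "--format=csv,noheader",
]

for _ in range(30):  # sample once a second for ~30 seconds
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(1)
```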
 