
AMD really need to fix this.

It's not just graphics cards though, it's most components sucking huge amounts of power, and it seems to be increasing. Just look at the power draw of Intel's unlocked CPUs. Why are these ridiculous power draws needed, though? Many of us have taken to undervolting our GPUs and CPUs with no apparent downsides. When I got my RTX 3070 I was horrified at the power draw compared to the GTX 1070 it replaced. I heard about undervolting on here and had a go at it with the 3070. Not only did I knock just over 100W off its power draw, but it's cooler, quieter and boosts higher than it did before, so why don't they do this from the factory? I know all silicon is not equal, but they seem to be overcompensating with power draw across the ranges. We need to be using less power, not more and more with each generation.
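For anyone who wants to measure the saving rather than eyeball it, here's a rough Python sketch that just logs the board power the driver reports. It assumes an Nvidia card and the nvidia-ml-py (pynvml) package, so treat it as a sketch rather than gospel; run it before and after applying your undervolt and compare.

```python
# Rough sketch: log the GPU board power the driver reports, so you can
# compare readings before and after an undervolt.
# Assumes an Nvidia card and the nvidia-ml-py package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # sample once a second for ~30 seconds
        milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
        print(f"Board power: {milliwatts / 1000:.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```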
 

Because there is competition.

Remember when you bought something and could add 30-100% performance on top? Remember when Overclocking voided your warranty?

Competition ramped up and now they are all overclocked and overvolted out of the box.

Same reason the 3080 needed a new vBIOS after the launch benchmarks: because it was clocking balls.

All of the efficiency gains of the shrinks are being thrown out for raw performance and benchmarks.

A few YouTubers have put these parts back where they should be on clocks and volts, and they are much better, albeit slower.

It won’t go away.
 
It's power supplies too

One of EVGA's PSUs has to be given double the amps you'd otherwise expect because its power factor is only 0.5, so the current it pulls is roughly double what the real power alone would suggest.
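To put rough numbers on that (the 450W load and 230V mains below are purely illustrative, not the spec of any particular EVGA unit, and efficiency losses are ignored):

```python
# Back-of-the-envelope: how power factor changes the current pulled from the wall.
# Illustrative numbers only; efficiency losses are ignored.

def input_current_amps(real_power_w: float, mains_v: float, power_factor: float) -> float:
    """I = P / (V * PF): a lower power factor means more amps for the same real power."""
    return real_power_w / (mains_v * power_factor)

load_w = 450.0   # assumed real power drawn by the PC
mains_v = 230.0  # assumed UK/EU mains voltage

print(f"PF 1.0: {input_current_amps(load_w, mains_v, 1.0):.2f} A")
print(f"PF 0.5: {input_current_amps(load_w, mains_v, 0.5):.2f} A")  # roughly double the amps
```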

 
Captain Obvious here, but you do need more power to make something faster, even with a decent architecture/manufacturing process.
 
That’s not true is it?

Remember, it’s us you’re talking to dude.

Burning connectors, 3080s that were boosting beyond their means, and gawd knows what else disagree with you. Oh, and 3.5GB-gate.

Oddly, of all the GPUs you mention, yes, a lot of them were turds. But to say Nvidia is a better bet is nuts.

The 7000 series is a new thing. As such, yeah, there are creases that need to be ironed out. Kinda like the creases with Ampere and, oh, the 2080 Ti space invader mode.

I’m pretty sure I’ve made my point and don’t need to continue.
I'm pretty sure that the average consumer has made their choice too, with Nvidia having >85% market share of dedicated GPUs.

[image: chart of dedicated GPU market share]


This would not happen unless Nvidia released the best all-round dGPU, and they've done so for generations. Meanwhile, AMD continue to fail at basic things like idle and multi-monitor power consumption, and driver stability.
 
Because it's apparently not fair to compare between companies if one is using a company-specific tech, like AMD Chill in this thread... and by extension DLSS in every other thread. We should just use native for comparison.
AMD Chill has to be set up by the player. DLSS works in game as a setting. Just activate FSR where applicable.
 

Nvidia just talk the right kinda poo.

And people fall for it.
 
Multi-monitor isn't a basic thing. It has been an issue even on Nvidia cards until very recently. I remember reading countless threads about how GPUs would not use their 0dB fan mode when using multiple monitors, and it was all down to the GPU not downclocking enough because it had a second monitor attached.
 
More seriously, I'm just sick of the DLSS debate, and this looks like it'll go exactly the same way. Feature Vs defaults.
You brought DLSS into a subject where DLSS, I assume, wasn't involved. All you've done is enable a debate about something irrelevant that you're sick of :confused:
 
You can just toggle it on via a hotkey while in game, Radeon Chill I mean. The default values are 75-144 FPS depending on movement detected. You can customise the FPS values if needed.
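For the curious, here's a rough conceptual Python sketch of how a Chill-style dynamic cap might behave - this is not AMD's actual implementation, and the thresholds are made up; only the 75-144 FPS defaults come from the post above.

```python
# Conceptual sketch of a Chill-style dynamic frame cap (not AMD's actual code).
# Idea: cap towards the minimum FPS when the player is idle, and ramp back up
# to the maximum while there's recent mouse/keyboard input.

FPS_MIN = 75    # default lower bound mentioned above
FPS_MAX = 144   # default upper bound mentioned above

def target_fps(seconds_since_last_input: float) -> int:
    """Pick a frame cap based on how recently the player gave input."""
    if seconds_since_last_input < 0.25:
        return FPS_MAX      # actively moving: full frame rate
    if seconds_since_last_input > 2.0:
        return FPS_MIN      # idle: save power
    # in between: blend linearly from max down to min
    t = (seconds_since_last_input - 0.25) / (2.0 - 0.25)
    return round(FPS_MAX - t * (FPS_MAX - FPS_MIN))

for idle in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"{idle:.1f}s since input -> cap at {target_fps(idle)} FPS")
```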
 
In non-graphics-intensive games, use Radeon Chill - has the reviewer not heard of this? It's been around since 2016.

Even with Nvidia cards, they boost too high in older games too.

On my RTX 3060 Ti, I reduce the power target, use a custom curve and use a framerate cap to cut power consumption by a decent amount.

Seems to be a slow news day in the tech press ATM. Weird that this concern with power consumption never existed when the RX 6000 series had better performance/watt than the RTX 3000 series.

My 4080 downclocks all the way to under 500MHz at times in Skyrim by itself, with 3 displays in total connected. Inside dungeons, power consumption (as shown in HWiNFO) was under 50W; outside, it was around 50-60W most of the time. Older series may have been problematic, but not anymore.

My god, expecting a player to select an option. The humanity!

More seriously, I'm just sick of the DLSS debate, and this looks like it'll go exactly the same way. Feature Vs defaults.
You should select whatever you want and play as you want. If DLSS/FSR offers about the same quality at lower power, why not? But you do you, and others will do whatever they wanna do.

Multi-monitor is fine here. I played a few hours of Skyrim with 3 monitors active and the fans were off the whole time, at around 26°C in the room... True, they're normal 60Hz 1080p screens; I don't know how it does at higher resolutions or higher refresh rates.


If Chill doesn't introduce lag/stutter, that's fine. Curious why it isn't on by default somehow, or do you need to test it for stability first at certain levels? I don't remember anymore.
 
There is no way that running a game half the time at 70fps doesn't increase latency compared to running at 140fps. You may as well just cap it at 70fps if it's a single-player game, and man up and play at 140fps if it's a competitive shooter. I mean, you don't want to get shot in the face in Call of Duty just so you can save a cent on power.
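Just on the frame-time side of it (this ignores input sampling and display latency, so it's a lower bound on the difference), the arithmetic looks like this:

```python
# Frame-time arithmetic only: the render-interval part of latency at each cap.
# Ignores input sampling, driver queueing and display latency.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (70, 140):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# Extra per-frame delay a drop from 140 fps to 70 fps adds:
print(f"Delta: {frame_time_ms(70) - frame_time_ms(140):.1f} ms")
```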
 
High refresh rates used to be an issue, but not so much anymore: 2x 4K screens at 120Hz sits around 700MHz, and 2x 240Hz monitors averages around 500MHz.
 
I don't know how it feels at that FPS as I'm not interested in high Hz gaming, but for lower fps, it does feel like a significant improvement, at times strangely close to native.

A big factor is the overall latency of the game at a certain resolution and details. So it could work better in case A vs case B.

Personally, I definitely want it as an option to have, but it's understandable that it may not work for everyone (all the time).

That's why, I guess, AMD hasn't launched FSR 3 yet. Probably they haven't cracked it yet, from both an image quality and a latency perspective.
 