> Oh good, can we stop talking about DLSS then?

To my knowledge DLSS was never used in this video? Unless I missed it. If it was, that's ******* stupid.
It's not just graphics cards though, it's most components sucking huge amounts of power, and it seems to be increasing. Just look at the power draw of Intel's unlocked CPUs. Why are these ridiculous power draws needed though? Many of us have taken to undervolting our GPUs and CPUs with no apparent downsides.

When I got my RTX 3070 I was horrified at the power draw compared to the GTX 1070 that it replaced. I heard about undervolting on here and had a go at it with the 3070. Not only did I knock just over 100W off its power draw, but it's cooler, quieter and boosts higher than it did before, so why don't they do this from the factory? I know all silicon is not equal, but they seem to be overcompensating with power draw across the ranges. We need to be using less power, not more and more with each generation.
> It's not just graphics cards though, it's most components sucking huge amounts of power and it seems to be increasing. [...]

Captain obvious here, but you do need power to have something faster even with a decent architecture/manufacturing process.
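If you want to put actual numbers on an undervolt rather than eyeballing an overlay, a quick snapshot of board power, temperature and core clock is enough to compare before and after. A minimal sketch, assuming the nvidia-ml-py (pynvml) bindings and an Nvidia card; adjust for your own setup:

```python
# Rough sketch: snapshot power/temp/clock via NVML (pip install nvidia-ml-py).
# Run it under the same load before and after applying an undervolt and compare.
import pynvml

def gpu_snapshot(index: int = 0) -> dict:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(index)
        name = pynvml.nvmlDeviceGetName(handle)
        return {
            "name": name if isinstance(name, str) else name.decode(),
            "power_w": pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0,  # NVML reports milliwatts
            "temp_c": pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU),
            "core_mhz": pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS),
        }
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    print(gpu_snapshot())
```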
> Why?

Because it's apparently not fair to compare between companies if one is using a company-specific tech, like AMD Chill in this thread... and by extension DLSS in every other thread. We should just use native for comparison.
> That's not true is it?

I'm pretty sure that the average consumer has made their choice too, with Nvidia having >85% market share of dedicated GPUs.
Remember, it’s us you’re talking to dude.
Burning connectors, 3080s that were boosting beyond their means and gawd knows what else disagree with you. Oh, and 3.5GB gate.
Oddly, of all the GPUs you mention, yes, a lot of them were turds. But to say Nvidia is a better bet is nuts.
The 7000 series are a new thing. As such, yeah, creases that need to be ironed out. Kinda like the creases with Ampere and, oh, 2080 Ti space invader mode.
I’m pretty sure I’ve made my point and don’t need to continue.
> Because it's apparently not fair to compare between companies if one is using a company-specific tech, like AMD Chill in this thread... and by extension DLSS in every other thread. We should just use native for comparison.

AMD Chill has to be set up by the player. DLSS works in-game as a setting. Just activate FSR where applicable.
> AMD Chill has to be set up by the player. DLSS works in-game as a setting. Just activate FSR where applicable.

My god, expecting a player to select an option. The humanity!
I'm pretty sure that the average consumer has made their choice too, with Nvidia having >85% market share of dedicated GPUs.
This would not happen unless Nvidia released the best all-round dGPU. They've done so for generations. Meanwhile, AMD continue to fail at basic things like idle and multi-monitor power consumption and stable drivers.
> More seriously, I'm just sick of the DLSS debate, and this looks like it'll go exactly the same way. Feature vs defaults.

You brought DLSS into a subject where DLSS, I assume, wasn't involved. All you've done is enable a debate about something irrelevant that you're sick of.
> AMD Chill has to be set up by the player. DLSS works in-game as a setting. Just activate FSR where applicable.

You can just toggle it on via a hotkey while in game, Radeon Chill I mean. The default values are 75-144 FPS depending on movement detected. You can customise the FPS values if needed.
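This isn't AMD's implementation, just a toy sketch of the idea behind Chill: hold a low frame cap while nothing is happening and jump to the high cap when input is detected. `poll_input()` is a hypothetical stand-in for real input detection, and the 75/144 values are simply Chill's defaults mentioned above:

```python
# Toy illustration of a Chill-style dynamic frame cap: low FPS when idle,
# high FPS when recent input/movement is detected. Purely conceptual.
import time
import random

FPS_MIN, FPS_MAX = 75, 144   # Chill's default range, per the post above
IDLE_GRACE_S = 0.5           # how long after the last input we stay at the high cap

def poll_input() -> bool:
    # Hypothetical stand-in for real mouse/keyboard/controller polling.
    return random.random() < 0.1

def run(frames: int = 600) -> None:
    last_input = 0.0
    for _ in range(frames):
        frame_start = time.monotonic()
        if poll_input():
            last_input = frame_start
        # render_frame() would go here
        target_fps = FPS_MAX if frame_start - last_input < IDLE_GRACE_S else FPS_MIN
        frame_budget = 1.0 / target_fps
        elapsed = time.monotonic() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # capping here is what saves the power

if __name__ == "__main__":
    run()
```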
In non-graphics-intensive games use Radeon Chill - has the reviewer not heard of this? It's been around since 2016.
Even with Nvidia cards, they boost too high in older games too.
On my RTX 3060 Ti, I reduce the power target, use a custom curve and use a framerate cap to reduce power consumption a decent amount (see the sketch below for the power-target part).
Seems to be a slow news day in the tech press ATM. Weird that this concern with power consumption never existed when the RX 6000 series had better performance per watt than the RTX 3000 series.
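The power-target part of that can be scripted on Nvidia cards; the curve and the frame cap still need Afterburner/RTSS or similar. A rough sketch assuming pynvml, admin rights, and a 90% target picked purely as an example, not a recommendation:

```python
# Sketch: drop the GPU power limit ("power target") via NVML.
# Needs admin/root; the undervolt curve and frame cap still live in other tools.
import pynvml

def set_power_target(fraction: float = 0.90, index: int = 0) -> None:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(index)
        default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        # Clamp the requested target to what the board actually allows.
        target_mw = int(max(min_mw, min(max_mw, default_mw * fraction)))
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"Power limit set to {target_mw / 1000:.0f} W "
              f"(default {default_mw / 1000:.0f} W)")
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    set_power_target(0.90)   # e.g. 90% of the default limit
```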
My god, expecting a player to select an option. The humanity!
> More seriously, I'm just sick of the DLSS debate, and this looks like it'll go exactly the same way. Feature vs defaults.

You should select whatever you want and play as you want. If DLSS/FSR offers about the same quality at lower power, why not? But you do you and others will do whatever they wanna do.
Multi-monitor isn't a basic thing. It has been an issue even with Nvidia cards until very recently. I remember reading countless threads about how GPUs would not use their 0dB fan mode when using multiple monitors. And it was all down to the GPU not downclocking enough because it had a second monitor attached.
You can just toggle it on via a hotkey while in game, Radeon Chill I mean. The default values are 75-144 FPS depending on movement detected. You can customise the FPS values if needed.
> My 4080 downclocks all the way to under 500MHz at times in Skyrim by itself, with 3 displays in total connected. Inside the dungeons, power consumption (as shown with HWiNFO) was under 50W, outside around 50-60W+ most of the time. Older series may have been problematic, but not anymore.

High refresh rate used to be an issue, but not much anymore: 2x 4K screens at 120Hz is around 700MHz, and 2x 240Hz monitors average around 500MHz.
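For checking idle clocks and power on your own multi-monitor setup without watching HWiNFO, a simple sampling loop does the job. Again a sketch assuming pynvml and an Nvidia card:

```python
# Sketch: sample idle clocks and board power for a while, e.g. to check whether
# a multi-monitor setup is keeping the card from downclocking. HWiNFO/GPU-Z show
# the same counters; this just logs an average.
import time
import pynvml

def sample_idle(seconds: int = 60, interval: float = 1.0, index: int = 0) -> None:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(index)
        core, mem, power = [], [], []
        for _ in range(int(seconds / interval)):
            core.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS))
            mem.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM))
            power.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
            time.sleep(interval)
        print(f"avg core {sum(core)/len(core):.0f} MHz, "
              f"avg mem {sum(mem)/len(mem):.0f} MHz, "
              f"avg power {sum(power)/len(power):.1f} W")
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    sample_idle(60)
```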
You should select whatever you want and play as you want. If DLSS/FSR offers about the same quality at lower power, why not? But you do you and others will do whatever they wanna do.
Is fine. Played a few hours of Skyrim, 3 monitors active, fans were off all the time at around 26°C in the room... True, they're 60Hz normal 1080p screens, don't know how it does for higher res, higher refresh rate.
> There is no way that running a game half the time at 70fps doesn't increase latency compared to running at 140fps. You may as well just cap it at 70fps if it's a single-player game, and man up and play at 140fps if it's a competitive shooter. I mean, you don't want to get shot in the face in Call of Duty just so you can save a cent on power.

I don't know how it feels at that FPS as I'm not interested in high-Hz gaming, but for lower fps it does feel like a significant improvement, at times strangely close to native.
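For what the 70 vs 140 fps argument means in raw frame-time terms (total input lag involves more than frame time, so this is just the floor):

```python
# Frame-time arithmetic for the 70 vs 140 fps point above. Render time is only
# part of total input lag, but the frame-time gap is the unavoidable floor.
for fps in (70, 140):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 70 fps  -> 14.3 ms per frame
# 140 fps ->  7.1 ms per frame, i.e. roughly 7 ms less per frame at the higher cap
```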