To be fair, Vega was easily undervolted to be fairly efficient - I had mine for around 3 years - though outside of undervolting its efficiency was woeful.
In non-graphics-intensive games, use Radeon Chill - has the reviewer not heard of this? It's been around since 2016.
Even with Nvidia cards, they boost too high in older games too.
On my RTX3060TI, I reduce the power target, use a custom curve and use a framerate cap to reduce power consumption a decent amount (rough sketch of the power-target part below).
Seems to be a slow news day in the tech press ATM. Weird how this concern with power consumption never existed when the RX6000 series had better performance/watt than the RTX3000 series.
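For anyone who wants to script the power-target part of that: a minimal sketch, assuming the nvidia-ml-py package (import name pynvml) and an Nvidia card. The 150W figure is just an example value, setting the limit needs admin/root, and the custom V/F curve and framerate cap aren't exposed through NVML - those still have to be set in Afterburner / the Nvidia control panel.

```python
# Minimal sketch: lower the GPU power limit via NVML (nvidia-ml-py / pynvml).
# 150 W is only an example value; it gets clamped to the card's allowed range.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML reports power limits in milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Allowed power limit range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

target_mw = min(max(150_000, min_mw), max_mw)   # example 150 W, clamped
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
print(f"Power limit now {pynvml.nvmlDeviceGetPowerManagementLimit(gpu) / 1000:.0f} W")

pynvml.nvmlShutdown()
```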
Radeon Chill isn't the solution to this problem. If the 7000 series is boosting more than it needs to, or drawing more power than it needs to, then the problem is with the 7000 series - the video notes the 6950 and the 3090 were pretty much matched in the same situations. I suspect it's a consequence of moving to chiplets and some teething issues associated with it tbh.
You can only turn on Nvidia features to make their cards better; you are not allowed to do that with AMD...
Yep.
What's Radeon Chill, and if it's so great, why hasn't AMD implemented it in its driver already so it just happens automatically?
Edit: oh, it's a dynamic frame rate limiter. Why would someone use it in an esports game though? No matter how small it is, it must add extra input lag, because you can't instantly transition from, say, 30fps to save power up to 120fps because you flicked 180 degrees to scope an enemy - some of your frames were generated at the lower fps, so it potentially takes longer to spot the enemy and react than if you were rendering at 120fps the whole time.
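For what it's worth, here's my rough understanding of the idea as a toy sketch - this is not AMD's actual algorithm and all the numbers are made up. The frame cap sits at a low value while there's no input and ramps towards a high value when you're actively playing; the ramp is exactly where the latency worry above comes from, since the first frames after a flick are still paced at the low cap.

```python
import time

FPS_IDLE, FPS_ACTIVE = 60, 144     # stand-ins for Chill's min/max sliders (made up)
RAMP_PER_SECOND = 200.0            # how fast the cap climbs after input (made up)

def next_cap(cap: float, had_input: bool, dt: float) -> float:
    """Move the fps cap towards the active target on input, otherwise towards idle."""
    target = FPS_ACTIVE if had_input else FPS_IDLE
    step = RAMP_PER_SECOND * dt
    return min(cap + step, target) if cap < target else max(cap - step, target)

cap = float(FPS_IDLE)
last = time.perf_counter()
for had_input in [False, False, True, True, True, False]:   # pretend input samples
    now = time.perf_counter()
    cap = next_cap(cap, had_input, now - last)
    last = now
    time.sleep(1.0 / cap)          # stand-in for rendering one frame at the current cap
    print(f"input={had_input}  cap={cap:5.1f} fps")
```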
I wonder if they tested with FreeSync enabled. I remember my 580 being clock-happy when FreeSync was on, to the extent that in older games I'd see it just max out its clock speed, and even in video playback I'd see the clocks go over 1GHz, whereas with FreeSync off it would downclock to 500MHz. Made the difference between it having to turn its fans on and off or just never turning them on.
Radeon Chill, IMHO, is a nice feature, but I feel there should be an advanced setting to set a curve of fps ranges rather than just a min/max.
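Something like this is what I'd imagine that advanced setting could look like - purely hypothetical, just mapping an activity level to an fps cap through a piecewise-linear curve instead of a single min/max pair. The curve points are made up.

```python
# Hypothetical "Chill curve": activity level (0 = idle, 1 = flat-out mouse/keyboard
# input) mapped to an fps cap via piecewise-linear interpolation.
CURVE = [(0.0, 40.0), (0.3, 60.0), (0.7, 100.0), (1.0, 144.0)]

def fps_for_activity(activity: float) -> float:
    activity = max(0.0, min(1.0, activity))
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if activity <= x1:
            t = (activity - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return CURVE[-1][1]

print([round(fps_for_activity(a)) for a in (0.0, 0.2, 0.5, 0.85, 1.0)])
# -> [40, 53, 80, 122, 144]
```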
There are no technologies limiting the fps to the same as the AMD card in that video in order to limit the power usage. It's an apples-to-apples comparison; throwing in Radeon Chill is not an apples-to-apples comparison.
Those posters obviously didn't watch the video and should be dismissed. It's an interesting issue IMHO, as it's probably not something that's tested much - AFAIK most reviews just run a demanding game and look at power usage, or run a benchmark. This scenario would be missed by that testing, and it's something that really shouldn't be a thing TBH.

According to some of the comments here, they seem to want to imply it's all AMD cards forever, not just the RX7900 series with its chiplet design. The video needs a clearer title really.
Although I am not surprised after seeing the issues with Zen1.
However, I find my RTX3060TI boosts too high in less intensive games. I literally power limited the card by 50% in Fallout 4 and it makes no difference to framerates, but the fan is much quieter. Otherwise it was going up and down all the time. Since it is in an NCase M1 right next to me on my desk, that was definitely noticeable.
My current playthrough is modded, so is definitely more of a load than normal. It does downclock, but it's quite clear I can drop clockspeeds even more with tweaking and it has no effect on framerates. I use a 12 litre NCase M1 case, which is sat on the desk next to me, so it is obvious when fans do ramp up!
I mentioned my 580, and as I said, it seemed to be clock-happy with FreeSync. I did notice a few posts on Reddit at the time mentioning the same thing, but it wasn't as bad as the example in the video - partly because I don't think its power usage was as aggressive, and partly because it had a smaller power budget, so even if it was more power hungry than its counterpart, it'd never be 200W different. Though as I said, the video mentions the 6950 behaving similarly to the 3090, so I'd assume it's a 7000 series issue, just like the more common high idle power draw, which they seem to be addressing, albeit at a glacial pace.
I'm fairly sure my 3060ti underclocks in FO4. It's been a while since I've played it though, and from memory I didn't add graphical mods. And it's already underclocked to 1860MHz with a 170W max power target, while locked to 60fps because I don't trust the game not to break. But I don't think it was getting to 1860 when I played. It downclocks in Anno 1800, but then I limit that to 48fps, so it should downclock. Though if I'm using borderless fullscreen I don't think it does, or maybe not as much, whereas exclusive fullscreen does. Not sure I've noticed anything elsewhere, but then I've not really been looking. "Does it run well?" and "is it noisy?" are the two things I generally pick up on, maybe also whether it's pumping out a lot of heat.
My 580 would downclock to ~1100/700MHz when playing Black Mesa, except doing so caused stuttering. It was the only game I had to use ClockBlocker on to force it back up to full clocks and solve the stuttering, annoyingly boosting my power usage. If the 700MHz had been smooth it would have been absolutely fine, because the fps was fine.
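If anyone wants to actually check whether their card downclocks in a given game rather than going by fan noise, something like this works for the Nvidia cards mentioned here - a simple polling loop, assuming the nvidia-ml-py (pynvml) package; AMD users would need the Radeon overlay or similar instead.

```python
# Minimal sketch: log the graphics clock and board power once a second while the
# game runs; stop with Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
        print(f"{time.strftime('%H:%M:%S')}  {clock_mhz:4d} MHz  {power_w:6.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```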
Sadly another reason why I'd not consider another AMD GPU. I bought their flagships for years, more recently the 390X, Fury, Vega64 and Radeon 7. With all of them I had high hopes of driver improvements that would eliminate the high power draw and improve performance. That sadly never fully materialised.
The much safer option is to buy Nvidia. You get 100% of the performance on launch day, stability, and much lower idle/multi-monitor/media playback power consumption. Oh, and the ability to turn on RT in games with playable FPS at 4K.
Oh good, can we stop talking about DLSS then?
If someone suggested locking the nvidia card to 60fps while running the amd card at full throttle people would say it's an unfair comparison - because it would be.
Why?