RTX 3080 lower-quality capacitor issue

I just meant that all the boards that changed got changed to the more expensive caps, so the build cost of every board will increase. Multiply that over a million boards and it becomes a lot.

It does seem strange that almost every manufacturer increased the number of the "better" caps from their pre-production boards, though. I mean, they aren't going to change them in response to YouTubers' videos if there really wasn't an issue.

Cost will be a factor, but not for the reason you appear to think. These capacitors are really not expensive: they come on reels of thousands, and whether you pay for one bigger one or ten small ones might only be pennies in difference (someone with more intimate knowledge of SMD components can probably tell us what these cost in bulk). Neither type is better quality as such, but they have different frequency responses, and that may make one or the other more appropriate for the task at hand.
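To illustrate what "different frequency responses" means in practice, here's a minimal sketch that models each capacitor as a series R-L-C. The ESR and ESL figures are ballpark assumptions for a large polymer cap versus small MLCCs, not datasheet values:

```python
import math

def cap_impedance(c_farads, esr_ohms, esl_henries, freq_hz):
    """Impedance magnitude of a capacitor modelled as a series R-L-C."""
    w = 2 * math.pi * freq_hz
    reactance = w * esl_henries - 1 / (w * c_farads)
    return math.sqrt(esr_ohms ** 2 + reactance ** 2)

# Assumed ballpark values: one 470uF polymer cap vs ten 47uF MLCCs in parallel.
for f in (1e5, 1e6, 1e7, 1e8):
    polymer = cap_impedance(470e-6, 0.005, 3e-9, f)
    mlcc_bank = cap_impedance(47e-6, 0.002, 0.5e-9, f) / 10  # 10 identical caps in parallel
    print(f"{f:10.0e} Hz   polymer: {polymer:.5f} ohm   10x MLCC: {mlcc_bank:.5f} ohm")
```

Under these assumed numbers the two are similar at low frequency, but above a few MHz the lower effective ESL of the paralleled MLCCs dominates, which is the usual argument for MLCCs close to the GPU.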

The biggest cost factor is probably that it takes a pick-and-place machine a lot more time to put down (and verify the location of) ten components rather than just one.
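As a purely illustrative back-of-the-envelope, with a throughput and line cost made up for the sake of the example:

```python
# All figures here are assumptions for illustration, not sourced numbers.
extra_placements_per_board = 10 - 1   # ten small caps instead of one large one
boards = 1_000_000
placements_per_hour = 40_000          # assumed pick-and-place throughput
line_cost_per_hour = 50.0             # assumed all-in machine/line cost, USD

extra_hours = extra_placements_per_board * boards / placements_per_hour
print(f"{extra_hours:.0f} extra machine-hours, ~${extra_hours * line_cost_per_hour:,.0f}")
# -> 225 extra machine-hours, ~$11,250
```

That works out to roughly a cent per board under these assumptions, so machine time, like component cost, is real but small at this scale.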
 
Asus ROG just posted this on Facebook a minute ago:

All retail ROG Strix and TUF Gaming GeForce RTX 3080 and 3090 graphics cards use only MLCC capacitors for decoupling close to the GPU.

During development, we discovered the improvement this makes to RTX 3090 and 3080 overclocking headroom, so we made specification changes before we started shipping cards to reviewers and customers.

Please note that some of the product images used on etail sites and our product pages were from early development samples, so are not final.

All images will be updated soon.

Please bear with us!
 
Well, after further experimentation, I do get a little lift in boost clocks in some games. I also think it is slightly undervolting compared to the release driver.
I can also apply a small overclock that I couldn't before, and was running Warzone quite happily at 2040 with the OC. TBH it's not a lot more: it was running 2010 stock on these drivers, around 1995 on the previous ones.
So far they seem to be a keeper.
 

So we have a simple driver update that not only solves the issue but also makes the cards even faster.

That'll annoy the pitchfork crowd.
 
I'm pretty sure Asus and EVGA were using 330uF and 220uF caps respectively, so they also had far less total capacitance on the PDC, which probably made the effects more obvious.
6x 330uF caps is about the same total capacitance as the Gigabyte card with two of its 470uF caps completely removed, and 6x 220uF is even less than that. Then we see a further small improvement going to MLCC, so it could have been quite a noticeable difference with their cards.
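Spelling out that arithmetic, assuming six capacitor positions per card as these designs use:

```python
# Total bulk capacitance behind the GPU, in uF, using the values quoted above.
gigabyte      = 6 * 470   # 2820
gigabyte_4cap = 4 * 470   # 1880, i.e. two of the six caps removed
asus          = 6 * 330   # 1980, roughly the same as the 4-cap Gigabyte figure
evga          = 6 * 220   # 1320, lower still
print(gigabyte, gigabyte_4cap, asus, evga)
```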

If companies are making further improvements for marketing reasons and using that to sell cards, then that's great in my eyes, even if the improvement is small. They're not going to improve their cards for any reason other than making more money, or losing less of it.
 
Cards are only guaranteed to run at their specified boost clocks of around 1.71-1.75GHz anyway, and they will definitely be stable there. So even if it is the caps causing the crash to desktop when boosting past 2GHz, the drivers can simply cap the boost below 2GHz, and there's no problem, beyond reviews no longer matching real-world performance in some cases.

I imagine the drivers simply nerf the clocks slightly?

Personally I don't find this line of cards appealing anyway, because the power consumption is far too high and I don't use a 4K screen. Each to their own, I guess.
 
As I understand it, the drivers are upping the power the cards can take, pushing the point where the capacitors struggle beyond the other limits.
That's exactly what the guy said in a video posted on the other thread. He fixed it by locking the voltage to 1V, so the GPU didn't need more power than it could be given. Now the power limit has been upped, no voltage locking is needed; the power will be ample to drive the card at its max boost.
 

Thanks for the explanation, that makes things clearer.
 