From a technical standpoint, the limit caps the amount of power (watts, i.e. volts times amps, equivalent to joules per second) the card can draw, but the drawback is that some parts of a scene demand more power than others. On a stock Vega 56, a management algorithm tries to hold everything to the same 165 W limit.
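The relationship is plain electrical power. A minimal sketch, using hypothetical voltage and current figures for illustration rather than measured Vega 56 telemetry:

```python
# Electrical power: P (watts) = V (volts) * I (amps).
# Figures below are hypothetical, not measured Vega 56 telemetry.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

# A core at 1.00 V drawing 150 A dissipates 150 W; undervolting to
# 0.95 V at the same current drops that to 142.5 W.
stock = power_watts(1.00, 150.0)
undervolted = power_watts(0.95, 150.0)
print(stock, undervolted)
```

This is also why undervolting lowers heat output without necessarily costing performance: the same current at a lower voltage simply dissipates fewer watts.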
At stock settings the card can spike quite high in power draw, so the algorithm serves a purpose. Once undervolted, the card can only peak so much, but the relative swings can be much larger against the lower average power draw. This creates high variance in frame times as the algorithm tries to balance it all out; at that point, the drawbacks outweigh any potential benefit, or even the necessity.
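The frame-time effect can be illustrated with a toy model. This is not AMD's actual algorithm, just a sketch of the general idea: when a frame's power demand exceeds the cap, the limiter lowers the clock, which stretches that frame and produces the uneven pacing described above.

```python
# Toy sketch, NOT AMD's actual power-management algorithm: a limiter that
# scales the clock down whenever a frame's power demand exceeds the cap,
# lengthening that frame by the same factor (a deliberate simplification).
POWER_CAP = 165.0  # watts, the stock Vega 56 limit mentioned above

def limited_frame_time(demand_watts: float, base_frame_ms: float) -> float:
    if demand_watts <= POWER_CAP:
        return base_frame_ms          # under the cap: frame runs at full clock
    scale = demand_watts / POWER_CAP  # how far over the cap this frame is
    return base_frame_ms * scale      # lower clock -> proportionally longer frame

# A light scene sails through; a heavy spike gets stretched,
# which is the frame-time variance described above.
print(limited_frame_time(120.0, 16.7))  # under cap: unchanged
print(limited_frame_time(200.0, 16.7))  # over cap: stretched frame
```

Under this simplification, a frame demanding 200 W against a 165 W cap runs roughly 21% longer than its neighbours, even though average power stays within the limit.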
For testing your undervolt, something like the free Final Fantasy XV benchmark is excellent for determining stability and peak power draw. It uses a form of virtual resolution that lets you test complex scenes at various resolutions on any screen. I recommend running each of the three default benchmark presets at 1440p and 4K in a single sitting to determine stability, while the 1440p Standard test is great for establishing sustained power and heat during a normal gaming session. To determine the maximum power the card can pull while undervolted, loop the 4K Standard test a couple of times.
Something like the MSI Afterburner overlay or the driver's in-game overlay should provide all the required information on-screen; if you have multiple monitors, an open GPU-Z sensors window on a second display works just as well.