Depends on what you're rendering in Resolve...
Depending on what you're doing to your footage, rendering may not stress your GPU much at all, even when set to GPU encode. The hardware encoder that actually does the encoding (NVENC, on NVIDIA cards) is only a tiny portion of the GPU's overall die area and compute ability, so even if you're hammering that encoder block at 100%, overall GPU usage will only show that tiny portion in use. Simple colour grading uses very little GPU. Also, the encoder is the same across the product family, so a lower card will still have the same hardware encoder as the 2080 Ti...
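You can actually see that split for yourself. A hedged sketch, assuming an NVIDIA card (nvidia-smi ships with the driver): `dmon -s u` prints per-second utilisation, where `sm` is the general shader/compute array and `enc` is the dedicated NVENC block, reported separately.

```shell
# Hedged sketch: sample GPU utilisation while Resolve is rendering.
# 'enc' is the hardware encoder block, 'sm' the shader/compute array;
# a render can peg enc at 100% while sm barely moves.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi dmon -s u -c 5   # five one-second samples
  status="sampled"
else
  status="no-gpu-here"        # run this on the render machine instead
fi
echo "$status"
```

Watch the `enc` column during an export: if it's maxed while `sm` sits low, the encoder block is your bottleneck and a bigger card of the same generation won't speed it up.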
Filters and effects that are GPU-accelerated (sharpening, for one) will increase the usage, but unless it's something crazy complicated, you won't hit 100%. This is the area where you'll see the 2080 Ti's benefits compared to lower cards.
It's a misconception that everything is better on the GPU, and that's the case in a lot of image-related software. Photoshop, for one, will GPU-accelerate where it suits, but not all the time. Sometimes it takes longer to send the data to the GPU and wait for the result than it would for the CPU to just do the work itself... so there's no point in GPU acceleration there, as it's a net performance loss.
As mentioned above, have you tried a CPU encode to compare times?
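One quick way to run that comparison outside Resolve is to push the same clip through ffmpeg's software and hardware encoders and time both. A hedged sketch: `clip.mov` and the output names are placeholders, and `h264_nvenc` assumes an NVIDIA card with an ffmpeg build that includes NVENC support.

```shell
# Placeholder input; swap in a real clip exported from your timeline.
IN=clip.mov

# CPU (software) encode with x264:
CPU_CMD="ffmpeg -y -i $IN -c:v libx264 -preset medium cpu_test.mp4"
# GPU (hardware) encode with NVENC:
GPU_CMD="ffmpeg -y -i $IN -c:v h264_nvenc gpu_test.mp4"

echo "$CPU_CMD"
echo "$GPU_CMD"

# Run each under 'time' and compare the wall-clock figures:
#   time $CPU_CMD
#   time $GPU_CMD
```

If the NVENC run isn't dramatically faster, the encode itself was never the bottleneck, and the money is better spent on whatever stage is.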