After a lot of testing, I'd strongly recommend sticking to 144Hz for the best balance between motion quality and image quality. I used a combination of testufo.com and an FPS (CoD Black Ops) with settings adjusted to keep the FPS close to native refresh.
If we look at testufo at 144Hz, you can just about make out the individual white windows on the UFO at 960 lines. Once you bump it up to 175Hz, the white window separation is more obvious at 960 lines. However, you're not able to resolve them at 1200 lines. If I had to guess, a setting in between (~1100 lines) would be ideal.
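As a rough sanity check on why the windows blur together: on a sample-and-hold panel the per-frame smear is roughly speed divided by refresh rate. The sketch below assumes the testufo speed values map to pixels per second of horizontal motion (my assumption, not something testufo states in the test I used), and it roughly lines up with what I saw: 1200 at 175Hz smears about as much as 960 at 144Hz.

```python
# Rough sample-and-hold blur estimate: a moving object smears by roughly
# (speed / refresh rate) pixels per displayed frame on a non-strobed panel.
# Assumes the testufo speed setting corresponds to pixels per second.

def smear_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate per-frame smear width in pixels."""
    return speed_px_per_s / refresh_hz

for speed in (960, 1200):
    for hz in (144, 175):
        print(f"{speed} px/s @ {hz}Hz -> ~{smear_px(speed, hz):.1f} px of smear")
# 960 @ 144Hz  -> ~6.7 px    960 @ 175Hz  -> ~5.5 px
# 1200 @ 144Hz -> ~8.3 px    1200 @ 175Hz -> ~6.9 px
```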
In gameplay, I could not detect any subjective reduction in blur (the in-game motion blur setting is disabled) or any perceived reduction in input lag by using 175Hz.
Staying at 144Hz will also let you bump the image quality settings by a notable degree without relying on tools such as DLSS, which I personally don't like due to the artefacts I can see with it. Even on a 3090, 175Hz is hard to hold at 3440x1440 in newer games.
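To put numbers on why 175Hz is hard: the GPU's frame budget is 1000/Hz milliseconds, so the jump from 144Hz to 175Hz means roughly 22% more frames per second out of the same card. A quick sketch:

```python
# Frame-time budget at each refresh rate: the GPU has to finish a frame in
# 1000/Hz milliseconds to keep FPS pinned at native refresh.
for hz in (144, 175):
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")
# 144Hz -> 6.94 ms per frame
# 175Hz -> 5.71 ms per frame

# Extra throughput needed to go from 144 FPS to 175 FPS.
print(f"Extra throughput: {175 / 144 - 1:.0%}")
```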
Beyond gaming, looking at many hours of HDR test footage that I know quite well, there is *no* difference between 10-bit and 8-bit+FRC. If anyone claims otherwise, ask them to provide hard proof. This should not be a consideration at all.
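For anyone unsure what FRC actually does: the panel approximates an in-between 10-bit level by alternating the two nearest 8-bit codes over successive frames, so the time-averaged output lands on the 10-bit target. The toy sketch below is just the concept, not any particular panel's dithering algorithm.

```python
# Toy illustration of FRC (temporal dithering): an 8-bit panel approximates a
# 10-bit level by alternating the two nearest 8-bit codes so the average over
# a few frames hits the target. Conceptual only, not a real panel's algorithm.

def frc_frames(level_10bit: int, n_frames: int = 4) -> list[int]:
    """Return n_frames of 8-bit codes whose average approximates level_10bit / 4."""
    target = level_10bit / 4          # ideal 8-bit value (may be fractional)
    base = int(target)                # lower neighbouring 8-bit code
    frac = target - base              # fraction of frames needing the upper code
    high_count = round(frac * n_frames)
    return [base + 1] * high_count + [base] * (n_frames - high_count)

frames = frc_frames(514)              # 10-bit 514 -> ideal 8-bit value 128.5
print(frames, "average =", sum(frames) / len(frames))
# [129, 129, 128, 128] average = 128.5
```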
If you're playing less demanding games, moving to 175Hz is naturally a no-brainer, but most people tend to play more modern games. Chances are you bought a QD-OLED for the image quality bump in games, so finding the balance between panel speed and image quality becomes a tradeoff.