easiest - obviously, you aren't doing anything bar plugging it in and turning it on
best? - I don't think using considerably more power for a marginal bump in benchmark scores is best. I can eke out a bit more performance, but as I've already stated, it's a 3% gain for 30% more power. That isn't a good ratio and the law of diminishing returns has well and truly kicked in. I don't even need to set a custom fan curve; using 80W less power means it runs as cool as ice. The RAM is the only issue, and there's nothing you can do to fix that bar modding the card.
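To put rough numbers on why that trade is so bad, here's a quick back-of-the-envelope sketch. Only the 3% and 30% figures come from my testing; the ~270W stock draw and 97fps baseline are just illustrative assumptions (270W loosely inferred from the 80W / 30% difference):

```python
# Rough perf-per-watt comparison: ~3% more fps for ~30% more power.
stock_fps, stock_watts = 97, 270       # illustrative/assumed baseline numbers
oc_fps = stock_fps * 1.03              # ~3% gain from pushing the card harder
oc_watts = stock_watts * 1.30          # ~30% more power draw

print(f"stock: {stock_fps / stock_watts:.3f} fps/W")
print(f"oc:    {oc_fps / oc_watts:.3f} fps/W")  # roughly 21% worse efficiency
```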
I don't play benchmarks or even benchmark any more. The card's primary use is Warzone / Modern Warfare.
My goal was a solid 97fps at all times (just under the 100Hz refresh of my G-Sync monitor), with no crashes, slowdowns or anything else.
3440 x 1440 res, all settings maxed, with the DoF rubbish all turned off.
When I played with undervolting and overclocking, benchmarks and some games were fine, but in Warzone, for some reason, the fps would sometimes drop mid-action and climb back up, or the game would crash completely, ruining matches. One time, running a slight undervolt, we got to the final circle in quads and a vehicle came in; I stuck it with a thermite and we all opened fire.
There must not have been enough watts to power the GPU at that exact moment, and my computer flat-out powered off. I thought it was a power cut at first, but it wasn't. It cost us the game and we came 2nd; I almost threw my keyboard out the window.
I immediately benched everything and all seemed fine, only for it to happen again a few days later. I went back to stock settings and it's been fine since. Nvidia don't have 'stock' settings for no reason; they're the most stable. Sure, overclocking/undervolting is fun, but when you play team games you want stability rather than worrying about a slight bit of extra heat or power usage.
Also to add: Quake II RTX was a great tool for benchmarking when I was benching a year or so ago. It really hammered the GPU.