I would say there's no point in getting a 4K TV without proper HDR, which means a 10-bit panel, tbh.
So yes, stick with 1080p if you can't afford a 10-bit panel. The jump to 4K alone isn't worth it unless you want access to the higher-quality streams on Netflix/Prime.
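For what it's worth, the 8-bit vs 10-bit difference is just arithmetic: 256 vs 1024 brightness levels per colour channel. A rough Python sketch (illustrative only, nothing like a TV's actual processing pipeline) that counts the distinct steps a smooth ramp ends up with at each bit depth:

```python
# Illustrative sketch: why bit depth matters for smooth HDR gradients.
# An 8-bit panel has 2**8 = 256 levels per channel; a 10-bit panel has 2**10 = 1024.

def quantize(value, bits):
    """Snap a 0.0-1.0 brightness value to the nearest level at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# Quantise a fine brightness ramp at both depths and count the distinct output steps.
ramp = [i / 100000 for i in range(100001)]
levels_8 = len({quantize(v, 8) for v in ramp})
levels_10 = len({quantize(v, 10) for v in ramp})

print(levels_8)   # 256 distinct steps on an 8-bit panel
print(levels_10)  # 1024 distinct steps on a 10-bit panel
```

Four times as many steps is why 10-bit panels show visibly less banding on slow sky/shadow gradients, which is where HDR content tends to expose an 8-bit panel.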
It's all down to the processing. 10-bit and 8-bit TVs look absolutely the same until one of them gets good HDR processing. My 8-bit Sony 55XD8599 looks amazing in HDR. I returned a 10-bit Samsung KS9000 for it because the Samsung was too bright for my eyes and text wasn't as natural or sharp for desktop use. If you only use it for Sky and UHD Blu-ray, go for the blindingly bright Samsung, but if you have a PC hooked up to the big screen, don't go so brash with the backlight; the quantum dot lighting blurs text on a 4K desktop.
Even with sharpening disabled, that probably won't remove the impact of the quantum dot blurring (it's one quantum dot per 4 pixels, no?), and quantum dot probably doesn't play well with the anti-aliasing on text fonts.
EDIT - misunderstood, I was confusing RGBW displays (false 4K) with quantum dot film displays.
I am intrigued what distance you'd sit at to use a 55" as a desktop though. Using Word/Excel must be fun, as must coordinating the mouse for copy/paste etc.
The HDR screen has a wider colour gamut though, so like similar wide-gamut computer monitors it should be good for photo editing. However, most (all?) of the 10-bit LED panels are not IPS, so you'd have to stay on-axis.