There are two professional monitors commonly used for mastering HDR content: the Sony BVM-X300, a 30" OLED (full RGB, not RGBW), 4K, 1000 nits, and the Dolby Pulsar, a 42" LCD, 1080p, 4000 nits (it's liquid cooled, by the way, because the backlight runs so hot). That's why, at the moment, some films are mastered at 1000 nits and some at 4000 nits.
By HDR certification you probably mean the UHD Premium branding from the UHD Alliance, which lets you put a UHD Premium badge on your device (along with paying them a fee!). It specifies two tiers: one aimed at LCD (0.05 nits black, 1000 nits peak) and one aimed at OLED (0.0005 nits black, 540 nits peak), along with 4K resolution and at least 90% of the DCI-P3 colour gamut. Some TVs meet the spec but the manufacturer just doesn't apply for the certification (Sony's high end sets, for example).
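If it helps, the two tiers boil down to a pretty simple check. Here's a minimal sketch (the `qualifies_uhd_premium` function and its parameters are just made up for illustration; the thresholds are the ones from the spec above, and I'm ignoring the other requirements like bit depth):

```python
def qualifies_uhd_premium(black_nits: float, peak_nits: float,
                          horizontal_res: int, p3_coverage: float) -> bool:
    """Rough check against the two UHD Premium luminance tiers.
    Tier 1 (typical LCD):  >= 1000 nits peak, <= 0.05 nits black.
    Tier 2 (typical OLED): >= 540 nits peak,  <= 0.0005 nits black.
    Both also need 4K resolution and >= 90% DCI-P3 coverage.
    """
    if horizontal_res < 3840 or p3_coverage < 0.90:
        return False
    lcd_tier = peak_nits >= 1000 and black_nits <= 0.05
    oled_tier = peak_nits >= 540 and black_nits <= 0.0005
    return lcd_tier or oled_tier

# e.g. a 600-nit OLED with a 0.0005-nit black level and 99% P3 passes
# the OLED tier even though it's nowhere near 1000 nits:
print(qualifies_uhd_premium(0.0005, 600, 3840, 0.99))  # True
```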
So what happens when the content is mastered at a higher max brightness than the display can handle? Tone mapping. The display can adjust the brightness levels knowing that the brightest it can get is, say, 600 nits, and scale everything a bit darker so that you don't lose detail in the brights (LG's OLEDs do this). Or it can just clip everything over 600 nits so that you don't lose overall brightness, but then you lose detail in the highlights (Sony's new A1 OLED does this). Unfortunately there's no standard, so it's up to the manufacturers. This is one reason Dolby Vision's HDR is quite appealing: Dolby maps it all to the display, so in theory each display shows the content the best it can. There's some great content out there about this; I recommend HDTVTest's YouTube video on tone mapping.
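To make the two approaches concrete, here's a minimal sketch (purely illustrative, not what either manufacturer actually ships; the roll-off curve is just a simple knee-plus-compression, and the 600-nit peak and 400-nit knee are made-up numbers):

```python
def clip_tonemap(nits: float, display_peak: float = 600.0) -> float:
    """Track the master 1:1, hard-clip above the panel's peak.
    Keeps overall brightness but crushes detail above the peak."""
    return min(nits, display_peak)

def rolloff_tonemap(nits: float, display_peak: float = 600.0,
                    knee: float = 400.0) -> float:
    """Linear up to a knee, then compress everything above it into the
    remaining headroom, so a 4000-nit master still fits under the panel's
    peak. Trades a slightly darker image for preserved highlight detail."""
    if nits <= knee:
        return nits
    headroom = display_peak - knee
    excess = nits - knee
    return knee + headroom * excess / (excess + headroom)

for n in (300, 600, 1000, 4000):
    print(n, clip_tonemap(n), round(rolloff_tonemap(n), 1))
# 1000 and 4000 nits both collapse to 600 with the hard clip, while the
# roll-off keeps them distinct (550.0 vs 589.5), just a bit darker.
```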
In terms of ULMB, in theory yes, the extra brightness should help a lot. On the TVs that support it you can still get around 100+ nits when doing black frame insertion/backlight scanning at 120 Hz on the OLEDs, and they can't get to 1000 nits anyway, so yeah. You won't get HDR games working with ULMB on, though.
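The brightness hit from BFI is basically just duty cycle: if the panel is dark for half of each refresh, you get roughly half the light. A quick back-of-the-envelope (the duty cycles and nit figures here are assumptions for illustration; real scanning patterns vary):

```python
def bfi_effective_nits(peak_nits: float, duty_cycle: float) -> float:
    """Rough effective brightness under black frame insertion.
    duty_cycle = fraction of each refresh the panel actually emits light."""
    return peak_nits * duty_cycle

# A ~700-nit panel drops to ~350 nits with naive 50% BFI, and a panel
# sustaining ~250 nits full-screen lands near the ~100 nits mentioned
# above at a 40% duty cycle:
print(bfi_effective_nits(700, 0.5))  # 350.0
print(bfi_effective_nits(250, 0.4))  # 100.0
```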