I didn’t really understand the benefit of HDR until I got a monitor that actually supports it.
And I don’t mean one that can simply process 10-bit color values, I mean one with a peak brightness of at least 1000 nits.
That’s how they trick you: they make cheap monitors that can process an HDR signal, so they have an “HDR” mode and your computer will happily output an HDR signal, but at best it looks no different from the non-HDR mode because the panel can’t physically produce a high dynamic range image.
If you actually want to see an HDR difference, you need to get something like a 1000-nit OLED monitor (note that “LED” often just refers to an LCD monitor with an LED backlight). Something like one of these: https://www.displayninja.com/best-oled-monitor/
These aren’t cheap. I don’t think I’ve seen one for less than maybe $700. That’s how much it costs unfortunately. I wouldn’t trust a monitor that claims to be HDR for $300.
When you display an HDR signal on a non-HDR display, there are basically two ways to go about it: either you scale the whole brightness range down so the HDR peak fits within the display’s capabilities (resulting in a dark image like in OP’s example), or you let anything above the screen’s maximum clip at that maximum (kinda “more correct”, but bright parts of the image can end up looking “washed out”).
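Here’s a toy sketch of the difference, just to make the two options concrete. The 1000-nit content peak and 300-nit display peak are made-up numbers, and real tone mapping uses smoother roll-off curves rather than a straight scale or hard clip, so treat this as an illustration only:

```python
# Toy illustration of the two ways to show HDR content on an SDR display.
# HDR_PEAK and SDR_PEAK are assumed values, not from any particular standard.

HDR_PEAK = 1000.0   # assumed peak luminance of the HDR content, in nits
SDR_PEAK = 300.0    # assumed peak luminance of the SDR display, in nits

def scale_to_display(luminance_nits: float) -> float:
    """Option 1: scale everything so the content's peak fits the display.
    Relative contrast is preserved, but the whole image gets darker."""
    return luminance_nits * (SDR_PEAK / HDR_PEAK)

def clip_to_display(luminance_nits: float) -> float:
    """Option 2: clip anything brighter than the display can show.
    Midtones keep their intended brightness, but highlights flatten out
    (the "washed out" look)."""
    return min(luminance_nits, SDR_PEAK)

if __name__ == "__main__":
    for nits in (50, 200, 500, 1000):
        print(f"{nits:4} nits -> scaled: {scale_to_display(nits):6.1f}  "
              f"clipped: {clip_to_display(nits):6.1f}")
```

Notice how scaling turns a 200-nit midtone into 60 nits (everything goes dark), while clipping leaves it alone but maps 500 and 1000 nits to the same 300 (highlights lose all detail).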
Yeesh, sounds like your monitor’s color output is badly calibrated :/. Fixing that requires an OS-level calibration tool. I’ve only ever done this on macOS so I’m not sure where it is on Windows or Linux.
Also, in general I wouldn’t use the non-HDR to HDR conversion features; most of them aren’t very good. And a lot of Linux distros don’t have HDR support (at least the one I’m using doesn’t).