[00:04] <daftykins> because of HDR mode
[00:04] <daftykins> 4K is utter rubbish tbh in new TVs, can't see the difference over 1080p easily... but HDR is a huuuuge change
[00:05] <penguin42> 4k on my new monitor is nice, but so are the extra 4 inches
[00:05] <daftykins> yeah that's comp usage, i'm really talking about video content here as per a TV
[00:05] <penguin42> daftykins: I don't see why HDR is giving that change - I'd expected more subtle gradations, not such a big overall difference
[00:11]  * penguin42 thinks he needs some experimental drivers to drive his at HDR
[00:49] <ali1234> HDR essentially separates the colour of an object from how bright it is
[00:49] <ali1234> without HDR the only way to make something "lighter" is to make it more white, hence the washed out colours in the first image
[00:51] <penguin42> hmm, ok, maybe I'm missing something about what HDR is - I just thought it was a higher bit depth - is it something else?
[00:51] <ali1234> it is
[00:52] <ali1234> just a higher bit depth
[00:52] <penguin42> so why does that change how you make it more white?
[00:52] <ali1234> because you can literally have values greater than "255, 255, 255"
[00:53] <penguin42> but aren't your bits getting added at the bottom, not the top?  So that your normal full range is 1023,1023,1023 and your HDR is giving you the subtlety for the range?
[00:53] <ali1234> no, that's the point :)
[00:54] <penguin42> oh...
[00:54] <ali1234> ultimately the actual "white" level is determined after you render everything
[00:55] <ali1234> ie what value gets mapped to 255, 255, 255 on your monitor
[00:55] <ali1234> and what value is 0,0,0
[00:55] <ali1234> the thing is, you can choose them after rendering
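[Editor's note: the point ali1234 is making — that with HDR the black and white points are chosen *after* rendering, rather than baked into an 8-bit 0–255 range — can be sketched roughly like this. The function name and values are made up for illustration; this is not any real renderer's code.]

```python
# Minimal sketch: map linear HDR pixel values to 8-bit output,
# choosing the black and white points only after the scene is rendered.
# (Hypothetical illustration; names and values are invented.)

def tone_map(hdr_pixels, black_point, white_point):
    """Map linear HDR values to 0..255 using a range picked post-render."""
    out = []
    for value in hdr_pixels:
        # Normalise against the chosen range, then clamp to [0, 1].
        t = (value - black_point) / (white_point - black_point)
        t = max(0.0, min(1.0, t))
        out.append(round(t * 255))
    return out

# The renderer produced values well above the old "255" ceiling:
scene = [0.02, 0.5, 1.0, 4.0, 12.0]
# Pick the mapping afterwards -- e.g. expose for the 4.0 highlight:
print(tone_map(scene, black_point=0.0, white_point=4.0))
# -> [1, 32, 64, 255, 255]   (12.0 clips; everything else keeps its detail)
```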
[00:56] <penguin42> oh, hmm that's not how I'd imagined it
[00:59] <ali1234> you can do this with 8 bit per channel but then you lose colour depth in the process
[01:00] <penguin42> yes, I guess that is easier to take existing code; I'd assumed I'd be seeing 10bpp pixmaps
[01:00] <ali1234> well you would be, but your monitor probably can't display them :)
[01:00] <penguin42> yes it can
[01:00] <penguin42> well, it apparently can; the advert says it can do 2^30 colours
[01:01] <penguin42> but perhaps that's just via this HDR mech
[01:01] <ali1234> well some monitors can
[01:02] <ali1234> but even so, white is still white
[01:02] <ali1234> because monitors are backlit
[01:02] <penguin42> (nod, having turned my brightness down to 30%)
[01:03] <ali1234> i believe HDR uses floats anyway :)
[01:04] <penguin42> https://www.youtube.com/watch?v=QkwmSzPdVnY
[01:04] <penguin42> ali1234: Down the monitor?
[01:04] <ali1234> no, in the internal render buffers
[01:05] <ali1234> the final step is to map that into the monitor colour space somehow - and maybe add a bloom effect on things that are too bright for the monitor to directly display
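[Editor's note: a rough sketch of the final step ali1234 describes — compressing a float render buffer into the monitor's range, with a crude bloom so over-bright pixels spill light into their neighbours. The tone curve here is a Reinhard-style compression chosen for the example; it is not a specific engine's implementation.]

```python
# Sketch of a display-mapping pass over one row of a linear float buffer:
# crude bloom (excess energy above 1.0 leaks to adjacent pixels), then a
# Reinhard-style curve v/(1+v) to compress any float into [0, 1).
# (Hypothetical illustration, not any particular renderer's code.)

def display_map(buf):
    bloomed = buf[:]
    for i, v in enumerate(buf):
        excess = max(0.0, v - 1.0)       # light the display can't show
        if excess > 0.0:
            for j in (i - 1, i + 1):     # leak a fraction to neighbours
                if 0 <= j < len(buf):
                    bloomed[j] += excess * 0.25
    # Reinhard tone mapping, quantised to 8 bits for the monitor.
    return [round((v / (1.0 + v)) * 255) for v in bloomed]

row = [0.1, 0.8, 5.0, 0.8, 0.1]   # one row, with a very bright highlight
print(display_map(row))           # highlight stays brightest, glow around it
```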
[01:05] <penguin42> yeh ok
[01:10] <penguin42> I need to fiddle with cabling to get the display onto the DP rather than the HDMI anyway I think before we can think about that, but I've not quite figured out where it will show up if the monitor can do it
[01:10] <ali1234> probably nowhere - i don't think xorg supports more than 32 bpp
[01:11] <ali1234> no idea about wayland
[01:11] <penguin42> ali1234: Well it's not even in the edid data that I can see
[01:11] <penguin42> ali1234: Although this is plugged in via the HDMI at the moment so maybe that will change
[01:13] <penguin42> (The monitor has one uDP, one full DP, and 2 HDMI, so juggling them is interesting)
[01:27] <daftykin1> problem with the basic mode in use in my pics is HDR10 is defined once at the beginning of setting the mode, whereas Dolby Vision is a constantly running variant
[01:27] <daftykin1> one of the kodi fork devs was working on splitting the metadata in Dolby Vision over the last few days, seemed interesting
[16:32] <penguin42> ali1234: edid-decode on this monitor plugged into DP shows '10 bits per primary color channel'

[16:32] <ali1234> cool
[16:33] <ali1234> but can you get Xorg to output that?
[16:34] <penguin42> ali1234: Not that I know of, I'm told there are some Mesa patches that might do it - but I don't understand what happens given that I have a 2nd monitor connected that is only 8bpp - I'm assuming it'll go to lowest depth
[19:22] <ali1234> I would expect the video card to drop the extra bits for the other monitors - if the driver supports it at all that is
[19:35] <penguin42> I'd expect it to only offer a depth supported by all devices
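[Editor's note: the "drop the extra bits" behaviour ali1234 guesses at is, at its simplest, a right shift of each 10-bit channel down to 8 bits on scanout. A tiny sketch of that truncation, purely illustrative:]

```python
# Sketch: truncating a 10-bit-per-channel pixel (0..1023 per channel)
# to 8 bits (0..255) by discarding the two low-order bits, as a driver
# might when scanning out to an 8bpc monitor. (Illustration only.)

def ten_to_eight(r10, g10, b10):
    """Truncate 10-bit channel values to 8-bit by right-shifting 2 bits."""
    return (r10 >> 2, g10 >> 2, b10 >> 2)

print(ten_to_eight(1023, 512, 3))   # -> (255, 128, 0)
```

Real drivers may dither rather than plainly truncate, but the bit budget is the same either way.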
[19:38] <diddledan> a monitor can advertise 10bit support all it likes. that doesn't mean the panel is any more capable in displaying those extra bits
[19:39] <penguin42> diddledan: The advertised spec claims it can
[19:44] <zmoylan-pi> ...advertised...
[19:46] <penguin42> well, it seems to meet the other advertised features
[20:05] <brobostigon> evening boys and girls.
[20:06] <penguin42> hey brobostigon
[20:08] <zmoylan-pi> you're late :-P
[20:12] <Maefs> hi
[20:13] <Maefs> how are you?
[20:14] <diddledan> 12 hours late
[20:18] <brobostigon> hey penguin42
[20:18]  * brobostigon has spent most of the day with family,
[20:19] <zmoylan-pi> pffffft, what have they ever done for you? :-P
[20:19] <brobostigon> well, quite a few things.
[20:20] <zmoylan-pi> the aquaduct? :-D
[20:20] <brobostigon> sounds more like the romans, :). lolz.