Dolphin, the GameCube and Wii emulator - Forums

Full Version: Feature Request: HDR support!
Pages: 1 2 3 4 5
There are two main advantages of HDR, not just the one that May keeps pointing out. There's a wider range of colour, so things can be brighter and darker, but within a given range of colour, there are also more distinct levels. The subset of an HDR gamut which corresponds to SRGB's gamut has more colours, so there's maybe some potential to add partial support in the same way as forcing 24-bit colour on lower-bit-depth games. This isn't likely to make the game look suddenly amazingly better, but might reduce banding slightly.
AnyOldName3 Wrote:The subset of an HDR gamut which corresponds to SRGB's gamut has more colours

Do you have a source for that? Because from my understanding, that's not how it works... HDR doesn't have more detail because its samples are packed tighter; it has more detail because it has more samples in all directions.

What the GameCube does is very unusual, since the 18-bit and 24-bit modes have the same color space despite the difference in bits. The console is "compressing" (it's really tonemapping, but whatever) the GameCube's native 24-bit RGB output down to 18-bit, so each "step" (sample) of color is larger than it should be when the panel converts it back to 24-bit. This, of course, results in banding. It's a super weird and pretty hacky way to work around the hardware limitations. But thanks to how it works, forcing a game to render in 24-bit mode is pretty easy, removing the banding without expanding beyond the color space. Unless a game optimized its textures specifically for 18-bit, in which case forcing 24-bit can actually make things worse, but you know!
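To picture that 18-bit trick, here's a minimal Python sketch (a hypothetical helper, not Dolphin code) of what dropping an 8-bit channel to 6 bits and expanding it back, roughly the way a panel would, does to a smooth gradient:

```python
def quantize_channel(value_8bit: int, bits: int = 6) -> int:
    """Truncate an 8-bit channel value to `bits` bits, then expand it
    back to 8 bits by bit replication (a common panel expansion)."""
    shift = 8 - bits
    q = value_8bit >> shift                      # 256 levels -> 64 levels
    return (q << shift) | (q >> (bits - shift))  # re-expand to 0..255

# A smooth 8-bit gradient collapses into 4-value-wide plateaus at 6 bits:
gradient = list(range(16))
banded = [quantize_channel(v) for v in gradient]
# values 0..3 all land on one output level, 4..7 on the next -> banding
```

Forcing the 24-bit path simply skips the `>> shift` truncation, which is why it removes the banding without touching the color space.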

On the other hand, Rec 709 (regular HD) and Rec 2020/2100 (modern HDR) have "steps" of the same size; it's just that HDR can go further in either direction. If you tried to take advantage of the additional steps that HDR has over SDR, you'd have to go outside of SDR's color space, the saturation and brightness would be all wrong, and it would break in ways unique to each game.

EDIT: Also, I forgot to mention specifically that the GameCube supports 24-bit output and lots of games use it by default. Some games output in 18-bit due to hardware limitations, depending on what the game specifically wants to do. The fact that 24-bit mode is built into the GameCube is a large part of why it's so simple to force all games to use it.
(05-22-2018, 06:50 AM)MayImilae Wrote: [ -> ]GameCube's native 24-bit RGB output

Ah, that was the piece of information I was looking for. It makes more sense to me now why we can easily force 24-bit colors but not 30-bit.
Quote:Do you have a source for that? Because from my understanding, that's not how it works... HDR doesn't have more detail because it's samples are tighter, it has more detail because it has more samples in all directions

Rec 2020 and Rec 2100 don't mandate a single bit depth - both can be implemented with 10 bits per sample or 12 bits per sample, be completely compliant, and cover the same colour space. That only changes the step width. Both can also be implemented with different transfer functions, which means they can have different step widths in different regions of the colour space: an HLG Rec 2100 system would focus a lot of the bit depth on the SDR region (in an attempt to maintain compatibility with SDR displays), whereas a PQ Rec 2100 system would spread the bit depth so it appeared even across the whole gamut to a human observer.
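The "bit depth only changes the step width" point can be shown with two lines of arithmetic - the gamut is fixed by the primaries, and the bit depth just decides how finely the same normalised 0..1 signal range gets sliced:

```python
# Rec 2020 allows either bit depth over the same normalised signal range:
step_10bit = 1.0 / (2 ** 10 - 1)   # 1023 steps across the range
step_12bit = 1.0 / (2 ** 12 - 1)   # 4095 steps across the SAME range, ~4x finer
```

Same colour space either way; the 12-bit variant just packs roughly four times as many levels between any two colours.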
It's quite confusing: these standards are recommendations, not enforced, and they're mostly meant for broadcast TV, which means things may not be the same on computers and other devices.


They did thankfully get rid of interlaced video, so if the standard is followed, only progressive scan is allowed. But according to Wikipedia, odd framerates are still permitted, so oh well.

It's interesting how everyone talks about HDR - that's because the term is getting the biggest media/marketing buzz - but I'm not sure whether they actually include the WCG stuff with it. Why did the media coin the separate WCG term if WCG is just a component of HDR? And then everyone calls the new tech by the wrong term - isn't HDR only supposed to be about exposure, brightness, etc.? Now that I've looked further into this, there is debate over whether HDR actually requires WCG (a wider color gamut). WCG was included in Rec. 2020, so Rec. 2100 doesn't change it; it only adds the PQ and HLG specifications for HDR. But what people really mean by HDR these days, who the heck knows.

There is debate over HDR being abused as one of those marketing terms, with a lot of differences/variations underneath. That discussion is more TV- and hardware-oriented: one thing is the software, but the other is whether the hardware can actually reproduce and accurately show what the software wants it to. It's all about panel technologies then, and with the SDR conversion stuff it's a big, big topic. It gets easier when you focus on modern stuff only; users shouldn't need to bother understanding SDR compatibility - that's just for the TV broadcasters. http://www.avsforum.com/forum/465-high-dynamic-range-hdr-wide-color-gamut-wcg/2973578-hdr-falsely-marketted-thing.html

For Dolphin, if something like this were attempted, one game should be chosen first - something like Metroid Prime or Starfox Assault - and it wouldn't be practical to follow the ITU standards; what would come out of it could be something unique under the hood while achieving similar or somewhat close results.
This attempt would help discover which bits and pieces would be needed, which of them are doable, and which aren't. It may of course not be successful in the end, but it would at least give a good idea of how far someone could get.
There could be tricks discovered that let Dolphin manipulate the assets/shaders or the game's memory without having to modify the ISO or game source code.

EDIT: This Arris Modern Video Quality Targets document from 2016 talks about "HDR WCG" together in the same sentence, but also explains what they think HDR and WCG should mean separately.
https://www.arris.com/globalassets/resources/technical-white-papers/setting-video-quality-and-performance-targets-for-hdr-and-wcg-video-services.pdf

It focuses on actual practical use and what users report as looking better or worse.

Quote:Spatial Detail
HDR is all about preserving spatial detail. It is not about brighter pictures, or at least
it should not be. The wider luminance range encoded by HDR enables crisp spatial detail
in dark regions and bright highlights to play a role in storytelling that is not possible
otherwise. Similarly, WCG is all about enabling colorfulness of spatial details.
What is “spatial detail?” We know it when we see it; but if we can’t measure it
quantitatively we can’t manage it systematically.


Some of this seems to corroborate what I was thinking: technically they can be treated separately, but it's probably not practical - you want content/software made for a new monitor/TV with both, not with one or the other.

So when asking if something supports HDR... it should really be "HDR WCG", not just "HDR", because your eyes don't separate them either - it's not like one eye sees HDR and the other sees WCG. Dolphin and similar projects are an exception; there it's up to what's possible: could we get some HDR and some WCG, only HDR but no WCG, or vice versa? That's all up in the air - most likely nothing, because it would involve source code changes or hacking the binaries in the ISO - but I'm still enthusiastic about somebody attempting something like this. If ISO hacking/modding is involved, it most definitely won't ship with Dolphin, so it won't be simple to set up.

EDIT2: Wow.
At the bottom of that PDF, it mentioned something they talk about "more in other publications". I did a search and found something that made me do a pretty big reverse-facepalm; I should have found it a long time ago.

The previous PDF is more specific/technical, aimed at broadcasters/content providers, but this one is something all users should read in full to get a good understanding of what they're really buying/running/developing, and whether it really makes sense to buy.
https://www.arris.com/globalassets/resources/white-papers/arris_hdrwcghfrvisualperception_whitepaper_final.pdf

Quote:ABSTRACT

UHD, HDR, WCG, HFR are bound to be powerful creative tools with which to engage the
viewer. Such acronyms (and would-be logos) could also prove to be influential
marketing aids. But how justified would standards bodies, content creators, and
distributors be in thinking of each feature as independent?

This paper will provide principles of applied vision science to quantify the extent of
interdependence of luminance, field–of-view, color perception, and temporal sensitivity.
This paper will also identify situations in which luminance, color, and frame rate should
perhaps be considered in concert rather than as independent creative dials.

So what's my point? That we should all be in sync - users and developers alike - on what true HDR WCG even is before going into the possibility of having any of it in something like Dolphin. Of course, maybe some of you already knew this; this is just for those who didn't.


EDIT3: So the basic list is:
  • Pick a game (upd: or two) to use for initial research and testing
  • Force Dolphin to render/output at higher than 8 bits per channel (most likely hackety hacky hack)
  • Create brand new custom textures at higher than RGB8
  • Write brand new custom shaders (implement custom shader support for users, like custom textures, and maybe even move them into a new CustomAssetsWidget in the graphics options)
  • Placeholder1
  • Placeholder2
  • etc..

So if it doesn't work, start digging into whether some existing part of Dolphin or the game is merely blocking it, and the HDR experiment would work otherwise. Yeah, it's blind speculation, but that's what a treasure hunt without a map is like - there's no way other than to check everything. Maybe some games use hardware features that would interfere, and only some games use those. It would be quite unfortunate to pick the one game out of only a few that uses some feature which doesn't play nice with this HDR WCG experiment hack, making it fail and leaving everyone thinking it would never be possible... so maybe two games should be picked and researched simultaneously, from two different developers, with two different engines.

Ehm, it's a lot of work. But for the experiment, probably don't need all shaders, and only a few textures, like a 100-300 maybe, for a scene, character, and GUI.
(05-22-2018, 10:11 PM)Renazor Wrote: [ -> ]Ehm, it's a lot of work. But for the experiment, probably don't need all shaders, and only a few textures, like a 100-300 maybe, for a scene, character, and GUI.

I eagerly await your first findings. Big Grin
AnyOldName3 Wrote:Rec 2020 and Rec 2100 don't specify a specific bit depth

Um, I'm still pretty confident you are incorrect here. The Rec 2020 and Rec 2100 HDR standards are specifically "wide gamut" standards, and they are very, VERY specific on bit depth. Could you give me a citation on this, please? Because I'm just not seeing what you're seeing here!
Literally the Wikipedia page for Rec 2020
Um, apparently we're reading it differently because I've read that several times and I don't see that? Can you find a source that specifically says that more bits can be applied within the same color space within the HDR spec? I've been trying to find something that specifies it clearly and struggling myself!

EDIT: I finally asked someone knowledgeable (the usual Big Grin) for details. And we were kind of both right in a way?

So, if one tried to simply use the additional bits for more data within the SDR color space, it wouldn't work. The spec, and thus displays, are designed so that each "step" of each color in SDR and HDR are the same size, and HDR just goes further (wide gamut). So all the additional data in HDR is put toward colors beyond SDR, and SDR itself is unchanged and fits inside of the HDR spec. This makes perfect sense from a backwards compatibility standpoint, as well as a display manufacturer standpoint. It's so easy to play SDR content on HDR systems because the HDR display can just show the limited gamut as is on its screen, and the rest of the wide gamut is just unused. Simple! But this means that reducing banding by the simple method you described doesn't really work, since you can't just give more bits to the same color range and have it work on a display.

Hoooweeeever, one could use tonemapping (using the full range of brightness and other options available, not just color) to convert the SDR colors into the HDR color space while preserving its original colors. And then you could use interpolation to smooth between the steps to reduce banding. This would be an additional in-between stage for rendering the image, requiring processing power and adding some latency, but it would essentially get what you want: more data within the same colorspace. However, there are problems with doing this, as the interpolation can fail and banding can actually look worse. He explained why it fails, but it was a big long explanation and I've since forgotten the details, so I won't recite them here. Anyway, basic interpolation probably wouldn't work; it would need a more complex solution, hopefully one that doesn't impact the look of the title beyond fixing banding. So it's possible, just really difficult!
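One standard anti-banding trick in this family (not necessarily the exact interpolation described above) is to add sub-step noise before quantising - dithering. A minimal sketch, with a hypothetical 64-level quantiser standing in for the display pipeline:

```python
import random

def quantize(v: float, levels: int = 64) -> int:
    """Quantise v in [0, 1] to one of `levels` discrete steps."""
    return min(levels - 1, int(v * levels))

def quantize_dithered(v: float, levels: int = 64) -> int:
    """Add noise smaller than one step before quantising. Any single pixel
    may land one step off, but averaged over an area the original value is
    preserved, so hard bands dissolve into fine grain."""
    noise = (random.random() - 0.5) / levels
    return quantize(min(1.0, max(0.0, v + noise)), levels)
```

This also illustrates how it can go wrong: push the noise amplitude up, or correlate it across pixels, and the grain becomes more visible than the banding it replaced.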
Quote:finally asked someone knowledgeable
I'm going to go with them being wrong, too. It can happen. Regarding tone mapping, all of that does apply, and you're right about it potentially making banding worse. It's only really good when you've got a wide bit depth (for example, you've just rendered a thing to a floating-point render target, so you have 32 bits of data per channel, which is what I mentioned several posts ago), so it won't make Dolphin look better. The steps are closer together in most HDR standards, though, and while some might exist that directly include the 2^24 SRGB colours as a contiguous region, it isn't a requirement.

A big wall of text will follow and some of it may not be especially well-written, so in case you want to avoid having to read it, I'm going to include a link to an article that also agrees with me but probably makes more sense: https://www.tomshardware.co.uk/what-is-10-bit-color,news-58382.html

The parts of the Wikipedia page I was referring to include, but are not limited to:
Quote:Rec. 2020 defines a bit depth of either 10-bits per sample or 12-bits per sample.
i.e. it doesn't define a specific bit depth that you have to use, instead letting you choose between two options. The paragraphs immediately following this don't say that 12-bit covers a wider colour space, so I read it to mean that they both cover the same colour space.

Quote:In coverage of the CIE 1931 color space the Rec. 2020 color space covers 75.8%, [...] and the Rec. 709 color space covers 35.9%.[6]
Two things here:
1. We don't see different CIE 1931 coverage values for 10-bit and 12-bit representations, but we would if 12-bit had a wider range.
2. Less importantly, and technically this doesn't prove my point, but it does suggest it: CIE colour spaces tend to aim to be perceptually uniform (and then turn out not to be), and SRGB also aims to be perceptually uniform (but isn't), and it would be very silly for Rec 2020 implementations not to be roughly perceptually uniform. All the different attempts at perceptual uniformity turn this into very approximate napkin maths. Anyway: if Rec 2020 covers 75.8% of the volume* of CIE 1931 and SRGB/Rec 709 covers 35.9%, then assuming everything was perceptually uniform, the doubling of the volume covered would require just one extra bit per sample to keep the same step width. 12-bit Rec 2020 instead adds four bits, multiplying the potential volume coverage by 16x. Either Rec 2020 implementations are putting all those new colours in the new regions - allowing tiny steps between colours that burn your eyes out or that you can't see in a well-lit room, but traditional chunky steps between normal colours - or the step size between Rec 709 colours has shrunk.

* It's volume, even though the diagrams you usually see are 2D pictures of an area because usually brightness is left out.

Quote:The NHK measured contrast sensitivity for the Rec. 2020 color space [...] 11-bits per sample for the Rec. 2020 color space is below the visual modulation threshold [...] for the entire luminance range.
This bit doesn't even make sense unless you interpret it as the NHK trying to work out how many bits they need to make the step size so small you can't see it; your interpretation has the colour space size and the bit depth being dependent on each other.

Quote:Transfer characteristics [...] [MATHS]
The maths for the transfer function is the same for 10-bit and 12-bit (although they allow different minimum precisions for the constants) and the values are normalised relative to the camera-input light intensity, so range from 0 to 1 regardless of whether there are 1024 values in the range or 4096. This is literally the maths that defines the step size and distribution and it doesn't have any special handling for ensuring that the SRGB colours remain contiguous. If Rec 2020 had two different colour spaces explicitly declared for 10-bit and 12-bit representations, this could be made to work as you describe, but then this would be obvious from the rest of the page.
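To make that concrete, here's a sketch of the Rec 2020 OETF in Python (constants from the spec's maths; broadcast narrow-range quantisation, e.g. codes 64..940 at 10-bit, is omitted for brevity). The normalised curve is identical at both bit depths - quantising at 12 bits just slices the same 0..1 output more finely:

```python
def rec2020_oetf(e: float, twelve_bit: bool = False) -> float:
    """Rec. 2020 opto-electronic transfer function, normalised 0..1.
    Same curve for 10- and 12-bit; the spec only specifies the constants
    to higher precision for 12-bit systems."""
    alpha, beta = (1.0993, 0.0181) if twelve_bit else (1.099, 0.018)
    return 4.5 * e if e < beta else alpha * e ** 0.45 - (alpha - 1)

def code_value(e: float, bits: int = 10) -> int:
    """Quantise the normalised signal to a full-range integer code."""
    return round(rec2020_oetf(e, bits == 12) * (2 ** bits - 1))
```

Note there's no branch anywhere that reserves a contiguous block of codes for the SRGB colours - which is the point being made above.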



One final non-Wikipedia point. The Adobe RGB colour space is by many definitions an HDR standard because it offers a wider range than Rec 709. However, it still only uses 8 bits per sample, so it has wider steps than SRGB, which uses the same number of steps to cover a smaller range. That right there is an example of an HDR standard, actually used for things, that has a different step width in the SRGB range than SRGB does.
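A rough step-width comparison for that Adobe RGB example, using commonly quoted CIE 1931 coverage figures (~52% for Adobe RGB, 35.9% for SRGB/Rec 709) as a crude stand-in for per-channel range - purely illustrative napkin maths, not spec arithmetic:

```python
# Commonly quoted CIE 1931 coverage percentages (approximate):
adobe_coverage, srgb_coverage = 52.0, 35.9

srgb_step  = srgb_coverage  / 2 ** 8   # 256 codes over the smaller gamut
adobe_step = adobe_coverage / 2 ** 8   # the SAME 256 codes over a wider one
# adobe_step > srgb_step: coarser steps even inside the SRGB region
```

Same code count, wider range, wider steps - the opposite of "more bits in the same colour space".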