"good" HDR requires HDR assets in the game, and a full HDR pipeline through the renderer for best results - or at least a custom tonemapping step based on the desired art style of each game, or even each scene. Trying to do a general case HDR step in the emulator will always be of extremely limited use - and at best just give similar results to playing with the "color"/"gamma"/"contrast" settings on your monitor - as that's pretty much the same thing.
Feature Request: HDR support!
05-12-2018, 08:24 AM
(05-08-2018, 07:03 PM)MayImilae Wrote: I'm pretty sure AnyOldName was quite wrong with that. If a game renders in HDR, like a 360 or PS3 or newer game, you could in theory hack through to access the already existing HDR lighting data. It wouldn't be that easy: it's tuned to display in SDR, so just about everything would break, and fixing that is basically impossible without the game's source code - but with an extreme stretch, it's possible. What does force 24-bit color do? I thought Dolphin was downscaling color to work with 8-bit SDR displays.
Yea that is a little confusing. I'll explain!
So 8-bit color is built around the idea that the combined RGB result has to fit within 8 bits. Everyone was on board with this definition, and it was super simple. It was a little weird, since red and green get 3 bits each while blue only gets 2, but that works with the human eye and 8 bits is really tidy, so eh. But when it came time to enlarge the color space, some (Windows) decided to continue that additive counting with the bigger numbers. So 24-bit color has 8 bits for each channel (red, green, and blue) and adds them all up to make 24 bits. That's OK, I guess? But then they added alpha to the equation, for desktop compositing and blending and things, and called it "32-bit color". 32-bit color is 8 bits per channel too, so it has the same number of colors as 24-bit color - just, you know, a bigger number. This is terrible for displays, since alpha is irrelevant for displays (you can't see through them), so they could never ship a "32-bit" display. By this naming scheme displays are only 24-bit, despite having the exact same number of possible colors as 32-bit color. (The naming scheme is kind of falling apart here!)

Display standards groups noticed the problems of the additive style and went a different route: they focused on the bits per channel, which determine the *actual* colors you can see. Monitors are made to follow these display standards, so they went with those definitions. An 8-bit RGB panel has more colors than a 6-bit RGB panel, and you can know that for sure. Due to this difference, to get a monitor that Windows calls 32-bit, you would be buying an 8-bit panel. This wasn't really an issue, though, since newer display standards like Rec. 709 (HDTV) didn't expand the color depth. Now that's changed. HDR is a new display standard, and the first (that's been widely adopted, at least) in a long time to expand the color space, so the definition difference is very apparent. A 10-bit panel (HDR10) is 10 bits per channel RGBW (the new channel being white point, for better brightness control). If it followed the old additive style, it would be "40-bit color", even though only the 10 bits per channel of RGB actually affect the color range of the standard. Of course, by the old naming scheme Windows would add alpha to that and call it 50-bit color, even though the colors are exactly the same. What a mess!

Sooo, now that that is all established: the GC and Wii output their final image at 18-bit, with 6 bits per channel (there is no alpha in the output because it's irrelevant at that stage). That can result in some banding, so Dolphin by default forces output at 24-bit color (8 bits per channel) for less banding. But this can actually create more banding in cases where the game's textures are optimized for 18-bit color, hence the toggle. None of what Dolphin is doing has anything to do with HDR.

AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 23H1 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12
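To make the per-channel versus added-up counting above concrete, here's a quick bit of arithmetic (purely illustrative; each count is just 2^bits per channel multiplied together - the alpha bits in "32-bit color" add nothing to these numbers):

Code:
#include <cstdio>

int main()
{
    // "8-bit color": 3 bits red, 3 bits green, 2 bits blue -> 8 bits total.
    unsigned long long rgb332 = (1ull << 3) * (1ull << 3) * (1ull << 2);   // 256 colors

    // "24-bit"/"32-bit color": 8 bits per channel; the alpha bits add no colors.
    unsigned long long rgb888 = (1ull << 8) * (1ull << 8) * (1ull << 8);   // 16,777,216 colors

    // GC/Wii 18-bit output: 6 bits per channel.
    unsigned long long rgb666 = (1ull << 6) * (1ull << 6) * (1ull << 6);   // 262,144 colors

    // 10 bits per channel, as in an HDR10 signal.
    unsigned long long rgb10  = (1ull << 10) * (1ull << 10) * (1ull << 10); // 1,073,741,824 colors

    std::printf("%llu %llu %llu %llu\n", rgb332, rgb888, rgb666, rgb10);
    return 0;
}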
05-21-2018, 03:50 AM
(05-12-2018, 09:17 AM)MayImilae Wrote: Yea that is a little confusing. I'll explain! Wow, I feel like I got a little sub-article with this post! Thank you!
Specs:
Win 10 x64 LTSB (Look it up!); ASUS ROG X370 Mobo; AMD Ryzen 2600X w/PBO; 2x8GB GSkill FlareX DDR4 3200Mhz; ASUS ROG RX580 8GB; Samsung 960 EVO nVME
05-21-2018, 08:55 AM
High-end UHD HDR TVs usually have modes that enable fake HDR on non-HDR inputs. From what I've heard, it works quite well and the result is stunning. It might add input lag, I don't know how it works.
HDR is basically 10 bits per color channel, right? I see there's an option in Dolphin that enables 24-bit color (so, 8 bits per color channel). Is it easier to implement because the GameCube actually supports 24-bit color but not every game uses it? Edit: I missed the explanation on page 2, thanks! Still, it's not clear why Dolphin is able to force 8 bits per color channel, but not 10 bits per channel.
From France with love.
Laptop ROG : W10 / Ryzen 7 4800HS @2.9 GHz (4.2 GHz Turbo disabled unless necessary for better thermals) / 16 GB DDR4 / RTX 2060 MaxQ (6 GB GDDR6)
05-21-2018, 07:28 PM
(05-21-2018, 06:33 PM)DrHouse64 Wrote: High-end UHD HDR TVs usually have modes that enable fake HDR on non-HDR inputs. From what I've heard, it works quite well and the result is stunning. It might add input lag, I don't know how it works. Yeah, it's probably similar to what the newer fake-HDR camera apps on phones do. The input lag would come from each input frame being rendered two or more times at different exposure levels and then combined to create the fake HDR - that's quite a bit of work, and TVs usually aren't optimized the way a serious gaming-conscious programmer would optimize, so it probably adds a lot of lag.
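Here's a conjectural sketch of that "combine several exposures" idea - just a guess at how such a TV-side fake-HDR pass might work (all names are hypothetical; real TV processing is proprietary):

Code:
#include <algorithm>
#include <cmath>

// Conjecture only: blend the same SDR value at a few exposure levels, weighting
// each sample by how close it sits to mid-grey (a crude exposure-fusion weight).
// All names are hypothetical; real TV processing is proprietary.
float FakeHdrPixel(float sdr)  // sdr in [0, 1]
{
    const float exposures[] = { 0.5f, 1.0f, 2.0f };
    float sum = 0.0f, weight_sum = 0.0f;
    for (float e : exposures)
    {
        float sample = std::clamp(sdr * e, 0.0f, 1.0f);
        float weight = std::max(1.0f - std::fabs(sample - 0.5f) * 2.0f, 0.01f);  // favour mid-tones
        sum += sample * weight;
        weight_sum += weight;
    }
    return sum / weight_sum;  // doing this per pixel, per frame, is where the lag could come from
}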
OK, this is pushing beyond my knowledge, so this is a little conjecture... But I'm pretty sure it's because Dolphin isn't expanding the color space, it's just providing "higher resolution" within the same color space. This allows banding to be fixed without altering the look of the game. HDR, however, is all about expanding the color space, so any HDR implementation is going to considerably change the original visuals of the game, and different games will handle it in different ways, have game-specific issues, and so on.
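As a rough illustration (conceptual only, not necessarily how Dolphin actually implements "Force 24-bit color"), "more precision within the same color space" can be as simple as widening each 6-bit channel value to 8 bits by bit replication:

Code:
#include <cstdint>

// Conceptual only - not necessarily how Dolphin implements "Force 24-bit color".
// Widening a 6-bit channel to 8 bits by bit replication keeps black as black and
// full brightness as full brightness, so the look of the game doesn't change.
uint8_t Expand6To8(uint8_t v6)  // v6 in [0, 63]
{
    return static_cast<uint8_t>((v6 << 2) | (v6 >> 4));  // 0 -> 0, 63 -> 255
}

An HDR step, by contrast, would have to decide which of those values should map beyond the old maximum brightness, and that's exactly the per-game information that's missing.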
Anyway, someone who knows how Dolphin renders will be able to answer this better! Also, I do not think that an add-on filter would ever work very well. The point of HDR is to have more detail through brighter brights, darker darks, and more colorful colors. Taking SDR content and trying to make it HDR through filters is akin to trying to upscale a 3D game from SD to HD resolution exclusively through filters. You caaan do it, but it will look weird, blotchy, and wrong, because the source material is just missing the additional information that is the whole point of HD. The same is true for HDR: SDR just lacks the information that is the point of HDR, so any SDR-to-HDR filter is going to look wrong, not to mention massively change the look of the game.

AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 23H1 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12
Well, I thought it was kinda obvious that all the custom textures would have to be recreated, and then custom shaders; what's left would be some hacks to force higher-bit output/rendering.
Well, let's not mix up HDR and WCG - maybe HDR would be harder, but what about WCG? They aren't the same thing. Now I'm thinking maybe HDR wouldn't need recreated textures, if there's no color gamut change. That's the best-case-scenario theory; unless someone tries it, we can't be sure exactly how hard it would be. Harder than Ubershaders?

EDIT: Sorry for the low-effort post, I was rushing three different things at once since I was busy with something else and didn't want to go on a big brainstorm; I might do that properly later. HDR and WCG are big marketing terms (with lots of standards/implementations/iterations/generations underneath), and using them for specific technical meanings is asking for confusion. Let's just say some form of improvement could be doable, but probably not comparable to the "HDR" and/or "WCG" of modern times. Then again, custom textures are a form of modding the game officially supported in Dolphin out of the box - just through memory, not the ISO. When talking about improving color range, contrast, and brightness, you need to talk in nits and bits, not in these consumer abbreviations. I do put a few days into digging and updating myself on these new technologies occasionally, but I forget and get confused just as easily.
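For what "talking in nits and bits" looks like in practice, here's a sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 uses to map a 10-bit code value to an absolute luminance in nits - the constants come from the published standard, but the function names and the wrapper are just mine for illustration:

Code:
#include <algorithm>
#include <cmath>
#include <cstdio>

// PQ (SMPTE ST 2084) EOTF: map a 10-bit code value to luminance in nits.
// Constants are from the standard; the wrapper is only for illustration.
double PqCodeToNits(int code10)  // code10 in [0, 1023]
{
    const double m1 = 2610.0 / 16384.0;         // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0;  // 78.84375
    const double c1 = 3424.0 / 4096.0;          // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;   // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;   // 18.6875

    double e = code10 / 1023.0;                 // normalised non-linear signal
    double p = std::pow(e, 1.0 / m2);
    double y = std::pow(std::max(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
    return 10000.0 * y;                         // the PQ curve peaks at 10,000 nits
}

int main()
{
    // Code 512 lands roughly around SDR-white brightness; 1023 is the 10,000-nit peak.
    std::printf("%f %f\n", PqCodeToNits(512), PqCodeToNits(1023));
    return 0;
}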
05-21-2018, 09:34 PM
(05-21-2018, 09:31 PM)Renazor Wrote: Well, I thought it was kinda obvious that all the custom textures would have to be recreated, and then custom shaders; what's left would be some hacks to force higher-bit output/rendering. Where do you get your information from?
Check my profile for up to date specs.