Yeah, that is a little confusing. I'll explain!
So 8-bit color was built around the idea that the combined RGB result had to fit within 8 bits. Everyone was on board with this definition, and it was super simple. It was a little weird that red and green each get 3 bits while blue only gets 2, but the human eye is least sensitive to blue, and 8 bits is really tidy, so eh.
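If it helps, here's a tiny Python sketch of that idea (the function name is just mine, for illustration), packing one whole pixel into a single byte:

```python
# One whole pixel packed into a single byte: 3 bits red, 3 bits green, 2 bits blue.
def pack_rgb332(r3, g3, b2):
    """r3 and g3 are 0-7, b2 is 0-3; the result fits in one byte."""
    return (r3 << 5) | (g3 << 2) | b2

pixel = pack_rgb332(7, 7, 3)   # brightest possible white
print(f"{pixel:08b}")          # -> 11111111, exactly 8 bits
print(2 ** 8)                  # -> 256 total colors
```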
But when it came time to enlarge the color space, some (Windows) decided to continue that additive concept with the bigger numbers. So 24-bit color has 8 bits for each channel (red, green, and blue) and adds them all up to make 24 bits. That's okay, I guess? But then they added alpha to this equation, for desktop compositing and blending and things, and called it "32-bit color". 32-bit color is 8 bits for each channel too, so it has the same number of colors as 24-bit color, but, you know, bigger numbers I guess? This is terrible for displays, since alpha is irrelevant to a display (you can't see through it), so they could never ship a "32-bit" display. By this naming scheme displays are only 24-bit, despite having the exact same number of possible colors as 32-bit color. (This naming scheme is kind of falling apart here!)
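Here's the same kind of sketch for the additive naming (again, the helper names are just mine), showing why alpha doesn't buy you any extra colors:

```python
# "24-bit" packs R, G, B at 8 bits each; "32-bit" packs the same three
# channels plus an 8-bit alpha on top.
def pack_rgb888(r, g, b):
    return (r << 16) | (g << 8) | b              # uses 24 bits

def pack_rgba8888(r, g, b, a):
    return (a << 24) | (r << 16) | (g << 8) | b  # uses 32 bits

# Alpha is transparency, not a new color, so both have the same palette:
print(2 ** (8 * 3))  # 16,777,216 colors either way
```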
Display standards groups noticed the problems with the additive style and went a different route, focusing on the bits per channel, which is, you know, what determines the *actual* colors you can see! Monitors are made to follow these display standards, so they went with that definition. So an 8-bit RGB panel has more colors than a 6-bit RGB panel, and you can know that for sure. But because of this difference, to get a monitor showing what Windows calls 32-bit color, you'd be buying an 8-bit panel.
This wasn't really an issue for a long time, though, since newer display standards like Rec. 709 (HDTV) didn't expand the color range. Now that's changed. HDR is a new display standard, and the first widely adopted one in ages to expand the color space, so that definition difference is suddenly very apparent. A 10-bit panel (HDR10) has 10 bits per channel of RGB. If it were following the old additive style, it would be called "30-bit color", and of course Windows would pile alpha on top of that and call it 40-bit color, even though the colors are exactly the same, and just, what a mess!
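If you want to see the per-channel math, here's a rough Python sketch of how the color counts stack up for 6-, 8-, and 10-bit panels:

```python
# Counting colors the way the display standards do: per channel.
# An n-bit panel has 2**n levels per channel, and the R*G*B combinations total.
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit panel: {levels} levels/channel, {levels ** 3:,} colors")

# 6-bit panel:  64 levels/channel,        262,144 colors
# 8-bit panel:  256 levels/channel,    16,777,216 colors
# 10-bit panel: 1024 levels/channel, 1,073,741,824 colors
```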
-
Sooo, now that all that is established: the GC and Wii output their final image at 18-bit color, with 6 bits per channel (there's no alpha in the output, because it's irrelevant at that stage). But that can cause banding, so Dolphin by default forces output to 24-bit color (8 bits per channel) for less banding. This can actually create *more* banding in cases where a game's textures are optimized for 18-bit color, hence the toggle.
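To show what that precision difference looks like, here's a rough illustration using a common bit-replication trick (this is just my sketch, not Dolphin's actual code):

```python
# A 6-bit channel value only has 64 levels, so mapping it onto the
# 8-bit 0..255 range spreads those levels out with gaps between them.
def expand_6_to_8(v6):
    """Common bit-replication trick: 0..63 -> 0..255."""
    return (v6 << 2) | (v6 >> 4)

for v6 in range(5):
    print(v6, "->", expand_6_to_8(v6))
# 0 -> 0, 1 -> 4, 2 -> 8, 3 -> 12, 4 -> 16
# Adjacent 6-bit levels land about 4 apart in 8-bit space, which is why a
# gradient that was quantized to 6 bits can show visible steps ("banding").
```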
None of what Dolphin is doing has anything to do with HDR.