ExtremeDude2 Wrote:I don't think monitors even support that
Some do, most don't. Right now it's mostly limited to reference/calibrated monitors aimed at professionals and enthusiasts. And some of the "10 bit" monitors only achieve it through FRC (temporal dithering between adjacent 8 bit levels) while others are true 10 bit panels.
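If anyone's curious what FRC actually does, here's the idea in a few lines (a simplified sketch; real panels use fancier spatial/temporal patterns):
Code:
# Simplified sketch of FRC (temporal dithering).  The panel fakes a
# 10 bit level by flickering between two adjacent 8 bit levels, so the
# time-averaged brightness lands in between.

def frc_frames(level_10bit):
    base = level_10bit >> 2            # nearest-below 8 bit level
    frac = level_10bit & 0b11          # leftover 2 bits (0..3)
    # show base+1 on 'frac' out of every 4 frames, base on the rest
    return [min(base + 1, 255) if i < frac else base for i in range(4)]

frames = frc_frames(513)
print(frames)            # [129, 128, 128, 128]
print(sum(frames) / 4)   # 128.25 -> the average recovers 10 bit level 513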
I really doubt there would be any visible difference between 24 bit and 30 bit color depths using color interpolation on a GBC/GBA.
uh, exactly why would anybody need more than 24 bits for display? High bit depths are useful, I think, for heavy dynamic-range-manipulation image editing. But at standard contrast ratios I don't think you need 30 bits. Assuming it's all integer data, the transfer curve would have to be much darker than normal to spend more of the bits on the dark shades that normal RGB can't capture, and normal RGB would have to be converted through that curve. Otherwise most of the curve is wasted on light colors which are already distinguishable.
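Here's roughly what I mean, as a quick sketch (using a plain 2.2 power law as a stand-in, not the exact sRGB formula):
Code:
# With a gamma-style transfer curve, the integer codes get spent
# disproportionately on dark shades, where our eyes are most sensitive.
# A plain linear encoding would waste almost all codes on light shades.

GAMMA = 2.2

def to_linear(code, bits=8):
    return (code / (2**bits - 1)) ** GAMMA

dark_gamma  = sum(1 for v in range(256) if to_linear(v) < 0.01)
dark_linear = sum(1 for v in range(256) if v / 255 < 0.01)

print(dark_gamma)   # 32 of 256 gamma codes land below 1% luminance
print(dark_linear)  # only 3 of 256 linear codes do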
And exactly why the fsck would anyone need 30 bits for GBA games?
Same reason why people listen to 24 bit audio: placebo + because it makes you feel superior to the 16 bit audio plebs. Hipsters, basically.
With the way some retrofags carry on ("I want my non-interlaced scanlines and barrel distortion!") I would have thought demands for 15/16-bit rendering would be more likely.
delroth Wrote:Same reason why people listen to 24 bit audio: placebo + because it makes you feel superior to the 16 bit audio plebs. Hipsters, basically.
The main purpose of 24 bit audio is headroom: it keeps quantization noise from building up when editing/rendering. However, just because someone likes to have the highest quality rip of their audio possible doesn't mean they do it to "feel superior" or because they're a "hipster". I don't give a damn about music, but I would never make that assumption about someone. There is simply no disadvantage in having a 24 bit rip when possible.
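To put some numbers on the editing argument, here's a contrived little sketch (a made-up chain of gain edits, just to show the error accumulating):
Code:
# Toy illustration of the editing headroom argument: every processing
# step that rounds back to integer samples adds a bit of quantization
# error, and at 24 bit that error is ~256x smaller than at 16 bit.

import math

def edit_roundtrip(signal, bits, steps=20):
    full = 2 ** (bits - 1) - 1
    s = [round(x * full) for x in signal]     # quantize the input
    for _ in range(steps):                    # 20 pointless gain edits
        s = [round(v * 0.9) for v in s]
        s = [round(v / 0.9) for v in s]
    return [v / full for v in s]

sig = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(1000)]
for bits in (16, 24):
    err = max(abs(a - b) for a, b in zip(sig, edit_roundtrip(sig, bits)))
    print(bits, err)    # the 24 bit error is orders of magnitude lower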
tueidj Wrote:With the way some retrofags carry on ("I want my non-interlaced scanlines and barrel distortion!")
I don't personally care for scanlines or barrel distortion but I would never call someone a "retrofag" just because they like that. That's just rude.
jimbo1qaz Wrote:uh, exactly why would anybody need more than 24 bits for display? High bit depths are useful, I think, for heavy dynamic-range-manipulation image editing. But at standard contrast ratios I don't think you need 30 bits. Assuming it's all integer data, the transfer curve would have to be much darker than normal to spend more of the bits on the dark shades that normal RGB can't capture, and normal RGB would have to be converted through that curve. Otherwise most of the curve is wasted on light colors which are already distinguishable.
They are integer data. As for "the curve" I have no idea what you're talking about.
Assuming the gamut stays the same, a higher bitdepth usually just gives you more precise red/green/blue shades, and therefore more precise colors, in monitors. It's usually used in combination with wider gamuts, since widening the gamut reduces color precision unless the bitdepth is also raised. Higher bitdepths can also be used for storing and processing high dynamic range images, but if I'm not mistaken that has nothing to do with the signal that leaves your computer; it's just for a program's internal image processing.
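As a back-of-envelope sketch of that tradeoff (the 1.8x factor below is a made-up illustration number, not a measured sRGB-vs-wide-gamut ratio):
Code:
# Spreading the same number of codes over a wider gamut makes each
# step between adjacent shades coarser; adding bits shrinks it again.

def step_size(bits, gamut_width):
    return gamut_width / (2 ** bits - 1)   # color range covered per code step

narrow = 1.0     # arbitrary units of "color range"
wide   = 1.8     # hypothetical wider gamut

print(step_size(8, narrow))   # baseline: 8 bit, normal gamut
print(step_size(8, wide))     # 8 bit + wide gamut: 1.8x coarser steps
print(step_size(10, wide))    # 10 bit + wide gamut: finer than baseline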
The difference on any half decent panel is very noticeable, especially when you're using a really wide gamut, assuming you have 30 bit content (which admittedly is rare these days). It's very noticeable even to me, and I have deuteranopia (partial red/green color blindness). Modern LCD displays still have really shitty color reproduction that covers only a tiny portion of the visible spectrum, and with limited precision. This is an area where a lot of improvement is both possible and very beneficial to the consumer in the near future. I can't wait until 30/36 bpp and the rec.2020 gamut are the norm. Being able to reproduce a wide range of the colors visible to humans, including very intense hues that currently aren't possible without shitty dithering artifacts, should be something we can all agree is a good thing.
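Re: banding, here's where it comes from in a toy sketch (arbitrary ramp endpoints, just to count the steps):
Code:
# A subtle dark gradient only crosses a handful of 8 bit levels, so you
# see discrete bands instead of a smooth ramp; 10 bit gives 4x as many
# levels over the same range.

def bands(bits, lo=0.10, hi=0.15, width=1920):
    full = 2 ** bits - 1
    ramp = [round((lo + (hi - lo) * x / (width - 1)) * full) for x in range(width)]
    return len(set(ramp))

print(bands(8))    # 13 distinct levels across a 1920px ramp -> visible bands
print(bands(10))   # 52 distinct levels -> much smoother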
jimbo1qaz Wrote:And exactly why the fsck would anyone need 30 bits for GBA games?
They don't "need it" and it would likely not be very beneficial. The guy was just asking out of curiosity.
NaturalViolence Wrote:delroth Wrote:Same reason why people listen to 24 bit audio: placebo + because it makes you feel superior to the 16 bit audio plebs. Hipsters, basically.
The main purpose of 24 bit audio is headroom: it keeps quantization noise from building up when editing/rendering. However, just because someone likes to have the highest quality rip of their audio possible doesn't mean they do it to "feel superior" or because they're a "hipster". I don't give a damn about music, but I would never make that assumption about someone. There is simply no disadvantage in having a 24 bit rip when possible.
I explicitly said "listen to" for a reason.
I know you did. Please reread everything after "however".
Delroth, you caught a hipster. I think that means you're now responsible for either rehabilitating him or putting him out of his misery.
In yet another moment of masochism, I wrote an almost-functional bitmap writer, although it has to use side effects or it can't actually write to disk, so some purists would probably tell me off. I've also got to the point where my antialiasing would be done if it actually compiled, and the only reason it won't compile is that the compiler won't tell me why it won't compile.
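For the morbidly curious, the bones of an uncompressed 24 bit BMP fit in a few lines; here's a plain imperative Python sketch (obviously not my functional version):
Code:
import struct

# Minimal uncompressed 24 bit BMP writer (BITMAPINFOHEADER).
# 'pixels' is rows of (r, g, b) tuples, top to bottom.
def write_bmp(path, pixels):
    h, w = len(pixels), len(pixels[0])
    pad = (4 - (w * 3) % 4) % 4                 # rows pad to 4-byte multiples
    body = b"".join(
        b"".join(bytes((b_, g, r)) for r, g, b_ in row) + b"\x00" * pad
        for row in reversed(pixels)             # BMP stores rows bottom-up
    )
    header = struct.pack("<2sIHHI", b"BM", 54 + len(body), 0, 0, 54)
    info = struct.pack("<IiiHHIIiiII", 40, w, h, 1, 24, 0, len(body), 2835, 2835, 0, 0)
    with open(path, "wb") as f:                 # the unavoidable side effect
        f.write(header + info + body)

# 64x64 red-to-blue test gradient
write_bmp("test.bmp", [[(255 - 4 * x, 0, 4 * x) for x in range(64)] for _ in range(64)])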
Peasant. I wrote a Mandelbrot program for the TI-84. It takes less than a minute per 96x64 picture; Axe is far faster than BASIC. (I remember writing a BASIC one last year, which took over an hour to render a single picture.)
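For anyone who wants to play along at home, the whole escape-time loop is tiny; here it is in Python at the TI-84's 96x64 resolution (ASCII output, and none of the fixed-point tricks Axe would use for speed):
Code:
# Bare-bones escape-time Mandelbrot at the TI-84's 96x64 resolution,
# printed as ASCII.  Same algorithm the calculator versions run.
W, H, MAX_IT = 96, 64, 32

for row in range(H):
    line = ""
    for col in range(W):
        c = complex(-2.5 + 3.5 * col / W, -1.25 + 2.5 * row / H)
        z, it = 0j, 0
        while abs(z) <= 2 and it < MAX_IT:
            z = z * z + c              # the entire inner loop
            it += 1
        line += "#" if it == MAX_IT else " "
    print(line)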