Jesus Christ, again?
If you've already decided that all of that is correct, then why did you ask me if it was correct? Do you have any idea how infuriating that is?
shoober420 Wrote:I told you before that I'm buying a VGA cable for the Wii. It shouldn't be too hard to figure out what I'm planning to do with it (even though I told you I was hooking it up to my computer CRT). So I told you exactly what I'm planning to do with it.
shoober420 Wrote:Yes they do. I gave detailed reasons as to why this black screen would occur. Two of them specifically. The threads and videos I posted prove this to be true.
There are three different types of cables you could use for this. So no, you didn't. You were far too vague. You also didn't list any of your sources in your original post.
Like AON3 said, now that we have the context, it makes sense.
shoober420 Wrote:You're wrong. The Wii can indeed output in VGA with no converter box needed. Look at these cables below. There is no converter box, and I highly doubt there is a converter within the cable.
If it's a PAL console, maybe. But the NTSC consoles don't support RGB output at all:
http://gamesx.com/wiki/doku.php?id=av:wi..._av_pinout
And without opening them up, you don't know whether they have a converter inside. At that price and size, they could. And if they work with both NTSC and PAL consoles, they have to.
shoober420 Wrote:I'll list some threads and videos of this problem below. This issue affects all VGA cables for the Wii. It doesn't matter which VGA cable you use; they all have the same issue.
Give me a minute to read through these and I'll get back to you when I have time.
shoober420 Wrote:CRT's are superior to LCD in every way.
This is objectively false. There are plenty of genuine disadvantages that CRT displays have regardless of which you think is better.
shoober420 Wrote:Don't turn this thread into a CRT vs LCD debate.
We won't, but we're also not going to let you post things about CRT and LCD technology that are blatantly wrong.
shoober420 Wrote:The reason there are fewer jaggy edges when using an LCD is because the image is being scaled, so it loses picture quality and covers up the aliasing artifacts much more. An LCD can only display its native resolution without the need to scale, which is why playing older systems on LCDs looks like poop. They are being scaled and blurred (which is why you don't see jaggies as much). A CRT can display any resolution without the need to scale it, so it retains its detail and sharpness. Hence why you can see more jaggies on polygons, because CRTs don't scale the image and blur it.
Completely wrong. Upscaling by nature makes aliasing larger and therefore more visible, not less. Even if you don't follow the logic behind it, you can easily test it yourself and verify it.
The lack of upscaling is not why CRT displays have less visible aliasing.
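If you actually want to test it, here's a minimal sketch in Python using Pillow. The file name is just a placeholder for any low-resolution game screenshot:

Code:
# Minimal sketch, assuming Pillow is installed; "screenshot.png" is a
# placeholder for any low-resolution game capture.
from PIL import Image

img = Image.open("screenshot.png")

# Nearest-neighbour 3x upscale: every stair-step on a polygon edge
# becomes three times as large, so the aliasing is MORE visible.
nearest = img.resize((img.width * 3, img.height * 3), Image.NEAREST)
nearest.save("nearest_3x.png")

# Bilinear 3x upscale: interpolation blurs the edges. This is the blur
# that hides jaggies on an LCD's scaler, at the cost of sharpness.
bilinear = img.resize((img.width * 3, img.height * 3), Image.BILINEAR)
bilinear.save("bilinear_3x.png")

Open the two outputs side by side and the point makes itself.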
shoober420 Wrote:CRT's do NOT color bleed in any way. Only LCD's color bleed from scaling.
I don't know where you got this from, but it's completely false. All phosphor displays exhibit some degree of color bleed; LCDs by nature do not. This is why CRT shaders always have a Gaussian blur or some other type of blur built in to mimic the effect: the color of each pixel bleeds into neighboring pixels and causes a blur effect. This also has the side effect of reducing visible aliasing.
You can reduce the effect of this by reducing the voltage of the cathode ray emitter.
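That blur pass is trivial to sketch in software, which is the whole idea behind it in CRT shaders. A minimal illustration (Pillow again; the 0.8 px radius is an arbitrary value I picked, not anything a real CRT shader uses):

Code:
# Minimal sketch of the blur-pass idea in CRT shaders, assuming
# Pillow; the radius is an arbitrary illustrative value.
from PIL import Image, ImageFilter

img = Image.open("screenshot.png")  # placeholder file name

# A small Gaussian blur bleeds each pixel's color into its neighbors,
# mimicking phosphor bleed on a CRT. Softening hard stair-step edges
# is the side effect that reduces visible aliasing.
bled = img.filter(ImageFilter.GaussianBlur(radius=0.8))
bled.save("crt_bleed_mimic.png")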
AnyOldName3 Wrote:It has been decided many times by people with really good eyes that LCD technology is now better than the pinnacle of CRT technology
No it hasn't. Which people are you referring to?
@Scootaloo
Neither.
Edit:
shoober420 Wrote:LCD can only display, at most, 24-bit color, and that is only with the very best IPS displays.
30-bit panels have been widely available for ages. And 24-bit color has nothing to do with IPS, TN, or VA panel technology.
shoober420 Wrote:CRT can display true 32-bit color and HIGHER.
This is kind of true, but CRTs don't technically display digital signals at all, so you can't compare them that way. Their color reproduction is limited by a number of other factors that have nothing to do with bit depth. I would also point out that bit depth deals with color precision rather than gamut, and that pretty much all content out there is 24-bit or lower anyway.
And 32-bit isn't a standard RGB bit depth; you won't find any graphics cards that support it. We have 30-bit RGB and we have 32-bit RGBA (24-bit RGB + 8-bit alpha).
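To put numbers on it, here's a quick sketch of the arithmetic (plain Python, nothing here is specific to any particular GPU):

Code:
# Quick arithmetic on the bit depths discussed above.
colors_24bit = 2 ** 24   # 16,777,216 colors (8 bits per channel)
colors_30bit = 2 ** 30   # 1,073,741,824 colors (10 bits per channel)
print(colors_24bit, colors_30bit)

# "32-bit" in graphics APIs is RGBA8888: 24 bits of color plus 8 bits
# of alpha. Packing one fully opaque white pixel as an example:
r, g, b, a = 255, 255, 255, 255
pixel = (r << 24) | (g << 16) | (b << 8) | a
print(f"{pixel:#010x}")  # 0xffffffff; the alpha bits add no extra colors

The alpha channel is transparency, not color, which is why 32-bit RGBA still only gives you 24 bits of actual color information.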