Dolphin, the GameCube and Wii emulator - Forums

Nvidia Researching SRAA (Subpixel Reconstruction AntiAliasing)
(01-30-2011, 08:17 AM)NaturalViolence Wrote: When we invent monitors that can display shapes without the need to convert them into pixel-based images first, that's when aliasing will go away. But that will never happen.

That's what I meant by interpretation of a coded image: we can't really reproduce what is generated by the computer because, as you said, the number of vectors is infinite. Too bad antialiasing is (and always will be) so GPU intensive. But I guess you're right, we can't expect better screen technology for the next 50 years :/
Antialiasing isn't so much GPU intensive as it is framebuffer intensive.
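To put rough numbers on that (the buffer format and resolution below are assumptions for illustration, not from any particular GPU):

Code:
// Back-of-the-envelope framebuffer cost of ordered-grid SSAA at 1920x1200.
// Assumes 4 bytes of color (RGBA8) + 4 bytes of depth/stencil per sample.
#include <cstdio>
#include <initializer_list>

int main() {
    const long long w = 1920, h = 1200;
    const long long bytes_per_sample = 4 + 4;
    for (int n : {1, 2, 3, 4}) {  // 1x1 (no AA), 2x2 (4x), 3x3 (9x), 4x4 (16x)
        long long bytes = w * n * h * n * bytes_per_sample;
        printf("%dx%d grid (%2dx SSAA): %7.1f MiB of framebuffer\n",
               n, n, n * n, bytes / (1024.0 * 1024.0));
    }
    return 0;
}

At 16x that's over a quarter of a gigabyte of framebuffer before you've stored a single texture, which is why sample counts hit a wall long before shading does.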
IMO, at 1200p the aliasing is really only noticeable if you look closely... and it's only a problem when you actually notice it. So when even higher resolutions become common, people will probably just stop caring about AA...
(01-30-2011, 08:26 PM)KHRZ Wrote: IMO, at 1200p the aliasing is really only noticeable if you look closely... and it's only a problem when you actually notice it. So when even higher resolutions become common, people will probably just stop caring about AA...

Not the case.
Pixel pitch matters as well; the pixel pitch on my display isn't that great, so I see aliasing fairly easily at 1920x1200. And notably, some people are just sensitive to aliasing.
Quote:IMO, at 1200p the aliasing is really only noticeable if you look closely... and it's only a problem when you actually notice it. So when even higher resolutions become common, people will probably just stop caring about AA...

This made me lol. This is totally untrue. Unless you're sitting 6-10 feet away from your 24" monitor, or you're used to having aliasing, YOU WILL NOTICE IT. Aliasing is like frame tearing in that if you get used to it you don't notice it after a while, but if you're used to not having it, then when you do have it, it drives you nuts. I've reached the point where anything short of 4xSSAA or 16xS HSAA is insufficient for me on my 1920x1200 monitor.
Pure 8xMSAA is more than enough, as long as the game engine supports it properly, especially where TrMSAA works.
Well, to make my comment more pleasant, I can turn it around: if you have great antialiasing, then your resolution stops mattering that much, as even lower ones look fine to the eye. There?
I disagree from personal experience. For most games (see the resolve sketch after this list for where SSAA's texture blur comes from):

MSAA:
Pros:
-Low performance hit
-Most polygon aliasing is gone with enough samples

Cons:
-Some objects still have aliasing or are unaffected
-Shader aliasing not affected
-Texture shimmering not affected
-TRMSAA usually either doesn't work or does a poor job at eliminating aliasing on transparent textures
-Textures look merely "normal" (no extra sharpness, unlike HSAA)

SSAA:
Pros:
-No polygon aliasing with enough samples
-No texture shimmering with enough samples
-No shader aliasing with enough samples
-Combined with TRMSAA or TRSSAA transparent texture aliasing is completely gone

Cons:
-On the objects MSAA does affect, generally less effective per sample at removing polygon aliasing than MSAA
-Huge performance hit
-Blurry textures

HSAA:

Pros:
-No polygon aliasing with enough samples
-No texture shimmering with enough samples
-No shader aliasing with enough samples
-Combined with TRMSAA or TRSSAA transparent texture aliasing is completely gone
-Sharp textures

Cons:
-High performance hit, higher than MSAA but lower than SSAA
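To make the SSAA tradeoff concrete, here's a minimal sketch of an ordered-grid SSAA resolve. This is just the idea, not how any particular driver implements it; real drivers use sparse/rotated sample grids and gamma-correct blending:

Code:
// Ordered-grid SSAA resolve: the scene is rendered at n*n times the target
// resolution, then each n x n block of subsamples is box-filtered down to
// one output pixel. Because every subsample ran the full pixel shader,
// this averages away shader and texture aliasing too -- which is also why
// textures come out softer than with MSAA, where the shader runs only
// once per pixel.
#include <cstdint>
#include <vector>

std::vector<uint8_t> ResolveSSAA(const std::vector<uint8_t>& src_rgba,
                                 int out_w, int out_h, int n) {
    const int src_w = out_w * n;                  // supersampled width
    std::vector<uint8_t> dst(out_w * out_h * 4);  // RGBA8 output
    for (int y = 0; y < out_h; ++y)
        for (int x = 0; x < out_w; ++x)
            for (int c = 0; c < 4; ++c) {         // average each channel
                int sum = 0;
                for (int sy = 0; sy < n; ++sy)
                    for (int sx = 0; sx < n; ++sx)
                        sum += src_rgba[((y * n + sy) * src_w + x * n + sx) * 4 + c];
                dst[(y * out_w + x) * 4 + c] =
                    static_cast<uint8_t>(sum / (n * n));
            }
    return dst;
}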

This is typically true for graphics engines that use HDR (nearly all of them these days).

You have not lived until you start playing your games with 16xS HSAA, proper v-sync, and 16xAF. You will never be able to go back to MSAA again; you will cry in pain at the crappy textures, texture shimmering, and abundant aliasing.

Quote:Well, to make my comment more pleasant, I can turn it around: if you have great antialiasing, then your resolution stops mattering that much, as even lower ones look fine to the eye. There?

Sorry to say this, but I disagree with that as well, lol. Higher resolution > better AA. A low internal resolution will always look like crap no matter what you do.
All we need is 150 ppi monitors as standard, tessellation and very-high-resolution textures as mandatory features in all games, and Windows 8 with UI/text/accessibility features optimized for high-DPI displays.

Has anyone gamed on a high-DPI (140+ dpi) beyond-FullHD display?

Razor-sharp graphics, no jaggies, no shimmering, ultra-detailed textures, you can see a lot of units/stuff/terrain/action at once, there's no need for AA (all it does is blur/alter the image), high levels of AF are no longer needed (it makes everything look unnatural), etc.

Anti-Aliasing is just a temporary workaround (hack) :)
Quote:Razor-sharp graphics, no jaggies, no shimmering, ultra-detailed textures, you can see a lot of units/stuff/terrain/action at once, there's no need for AA (all it does is blur/alter the image), high levels of AF are no longer needed (it makes everything look unnatural), etc.

Anti-Aliasing is just a temporary workaround (hack) :)

*Begins laughing*

I've seen my friend's 2560x1440 27" monitor. Aliasing is still extremely prevalent in all games, only slightly less than on a 24" 1920x1200. 2xMSAA on a 120 dpi display will yield far less visible aliasing than no AA on a 150 dpi display. Shimmering and crappy textures are also just as prevalent at high resolutions.

You'd have to be on crack to think AA or AF is a "hack" that makes things look "less accurate". It overcomes a fundamental limitation of 3D polygon rasterization. All it does is make the resulting image more accurate by giving you a better approximation of each pixel's actual color through taking more samples. And without AF you will have crappy textures at oblique camera angles regardless of resolution; that is just a fact.
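If you doubt the "better approximation" part, here's a toy demo. The edge and the sample pattern are made-up assumptions (real hardware uses smarter patterns), but the convergence is the point:

Code:
// Estimate how much of a pixel a polygon edge covers by testing an n x n
// grid of sample points, and compare against the exact area. The edge is
// everything under the line y = 0.35x + 0.2 in the unit-square pixel.
#include <cstdio>
#include <initializer_list>

bool Covered(double x, double y) { return y < 0.35 * x + 0.2; }

double SampledCoverage(int n) {
    int hits = 0;
    for (int j = 0; j < n; ++j)
        for (int i = 0; i < n; ++i)
            if (Covered((i + 0.5) / n, (j + 0.5) / n)) ++hits;
    return static_cast<double>(hits) / (n * n);
}

int main() {
    // Exact coverage = area under the edge = 0.2 + 0.35 / 2 = 0.375.
    for (int n : {1, 2, 4, 8})
        printf("%2d samples/pixel: estimated coverage %.3f (exact 0.375)\n",
               n * n, SampledCoverage(n));
    return 0;
}

One centered sample gets the pixel's color flat-out wrong (coverage 0); by 16 samples the estimate already matches the exact area. That's all AA is doing.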