Dolphin, the GameCube and Wii emulator - Forums

Full Version: Nvidia Researching SRAA (Subpixel Reconstruction AntiAliasing)
Quote:It seems that Nvidia is researching a new antialiasing technology to compete with AMD's MLAA (Morphological Anti-Aliasing). Nvidia has published the following abstract from its research: "Subpixel Reconstruction Antialiasing (SRAA) combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost. SRAA targets deferred-shading renderers, which cannot use multisample antialiasing. SRAA operates as a post-process on a rendered image with superresolution depth and normal buffers, so it can be incorporated into an existing renderer without modifying the shaders.

In this way SRAA resembles Morphological Antialiasing (MLAA), but the new algorithm can better respect geometric boundaries and has fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality."
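To make the abstract a bit more concrete, here is a very rough sketch of the general idea as I read it: shade once per pixel, then use the higher-resolution depth/normal samples to decide which shaded neighbour each subpixel should borrow its colour from, and average those choices. This is not Nvidia's actual filter; the function, the neighbourhood, and the falloff constants are all made up for illustration.

Code:
import numpy as np

def sraa_like(color_1x, depth_hi, normal_hi, scale=2):
    # color_1x:  (H, W, 3)              image shaded once per pixel
    # depth_hi:  (H*scale, W*scale)     superresolution depth samples
    # normal_hi: (H*scale, W*scale, 3)  superresolution unit normals
    h, w, _ = color_1x.shape
    out = np.zeros_like(color_1x)
    for y in range(h):
        for x in range(w):
            accum = np.zeros(3)
            # resolve each subpixel separately, then average them
            for sy in range(scale):
                for sx in range(scale):
                    hy, hx = y * scale + sy, x * scale + sx
                    best_c, best_w = color_1x[y, x], -1.0
                    # pick the shaded neighbour whose geometry matches this subpixel best
                    for ny in range(max(y - 1, 0), min(y + 2, h)):
                        for nx in range(max(x - 1, 0), min(x + 2, w)):
                            cy, cx = ny * scale + scale // 2, nx * scale + scale // 2
                            dz = abs(depth_hi[hy, hx] - depth_hi[cy, cx])
                            dn = 1.0 - float(np.dot(normal_hi[hy, hx], normal_hi[cy, cx]))
                            wgt = np.exp(-30.0 * dz - 10.0 * dn)   # made-up falloffs
                            if wgt > best_w:
                                best_w, best_c = wgt, color_1x[ny, nx]
                    accum += best_c
            out[y, x] = accum / (scale * scale)
    return out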

Big Grin The benefits of MLAA without the blurring drawbacks.

The source of most graphical issues with MLAA is its lack of subpixel reads, which results in warped pixels. (If anyone remembers, it's also why the PSX had warping pixels, although the missing Z-buffer also plays a part in the PSX's case.)
Somehow, deep down, I always knew Nvidia was going to do something like this. I really want both them and ATI (with MLAA) to implement it so that, where possible, the AA is applied BEFORE HUD elements are drawn to the framebuffer. That way it can antialias the scene without blurring the HUD. Game developers would most likely need to support that in-game for it to work, but they could, as long as the drivers expose an AA mode that the application can request.
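Roughly what I mean, as a toy sketch (none of these function names are real APIs, and the "AA" here is just a cheap blur standing in for MLAA/SRAA; the only point is the pass ordering):

Code:
import numpy as np

def blur_aa(img):
    # stand-in for an MLAA/SRAA-style post-process (just a cheap horizontal blur)
    return (np.roll(img, 1, axis=1) + img + np.roll(img, -1, axis=1)) / 3.0

def render_frame(scene_rt, hud_rt, hud_mask):
    aa_scene = blur_aa(scene_rt)                             # AA touches the scene only
    return np.where(hud_mask[..., None], hud_rt, aa_scene)   # HUD composited afterwards

scene = np.random.rand(720, 1280, 3).astype(np.float32)   # pretend 3D scene
hud   = np.ones_like(scene)                                # pretend HUD layer
mask  = np.zeros((720, 1280), dtype=bool)
mask[20:60, 20:300] = True                                 # where the HUD has pixels
frame = render_frame(scene, hud, mask)                     # HUD text stays sharp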

Also, are they going to use a larger-than-normal texture buffer size (number of texture samples) as well? Is that what they mean by "normal buffer" (a term I have never heard before)? They would need to do that for a proper implementation; it's their only hope of getting rid of the blurry-texture problem associated with post-processing AA.
A normal buffer is where the per-pixel surface normals are stored during rendering.

http://http.developer.nvidia.com/GPUGems..._ch22.html
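In a deferred renderer it is just another render target alongside depth and colour. A minimal sketch of the buffers the abstract describes, with made-up names and a 2x factor chosen only for illustration:

Code:
import numpy as np

W, H, SS = 1280, 720, 2   # output resolution and the extra geometric sampling factor

color_1x  = np.zeros((H, W, 3), dtype=np.float32)            # shaded once per pixel
depth_hi  = np.zeros((H * SS, W * SS), dtype=np.float32)     # superresolution depth buffer
normal_hi = np.zeros((H * SS, W * SS, 3), dtype=np.float32)  # superresolution normal buffer
# Textures are still sampled only during the single 1x shading pass; only the
# depth and normals get extra samples.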
Quote:For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading

Damn, those are pretty impressive results. For comparison, someone getting 60 FPS (not vsync-capped, just 60 FPS) [frame time rounded down to 16.6 ms] would still get about 54.3 FPS, and someone at 100 FPS would still get 84.7 FPS. That's not a terribly high amount of overhead, and it would only get better on higher-end video cards.
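Quick sanity check of those numbers, assuming the 1.8 ms simply adds on top of the existing frame time:

Code:
SRAA_COST_MS = 1.8   # fixed post-process cost from the abstract, at 1280x720

def fps_with_sraa(base_fps):
    frame_ms = 1000.0 / base_fps          # e.g. 60 FPS -> 16.67 ms per frame
    return 1000.0 / (frame_ms + SRAA_COST_MS)

print(fps_with_sraa(60))    # ~54.1 FPS (54.3 if you round the base frame time to 16.6 ms)
print(fps_with_sraa(100))   # ~84.7 FPS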

I'm currently on 10.10e, by the way, the oddball driver that supports MLAA even on 5000-series cards. MLAA really is kind of ugly, mainly due to that blurring effect. It's lightweight, but useful in a very small portion of games. It will be quite interesting to see where this research goes. With resolutions just going up and up, standard anti-aliasing techniques remain incredibly resource-intensive.
Personally, I always thought antialiasing was a bit of a just-for-now fix for one of the biggest rendering imperfections of all time. I mean, I know it's physically impossible to render an image perfectly, since it's an interpretation of the coded image, but I expected something to come along soon that would make antialiasing unnecessary, like resolutions getting so high in the future that our eyes couldn't notice the imperfections. Wait, that sounds too much like a just-for-now fix Sad...
@Runo

As long as the input is vector data (2D or 3D shapes/objects with point coordinates) and the output is a raster (an image made of pixels), you will always have aliasing. Period. Aliasing comes from taking something with an infinite level of detail (vectors) and using it to render a raster, which has a finite number of pixels and therefore a finite level of detail. Even ray tracing produces aliasing.

When we invent monitors that can display shapes without needing to convert them into pixel-based images first, that's when aliasing will go away. But that will never happen.
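If anyone wants to see why, here is a toy example: estimating how much of each pixel a triangle edge covers using one sample per pixel versus sixteen. The edge equation and the numbers are arbitrary; the point is that a single sample can only answer "in" or "out", which is exactly the stair-stepping you see on screen.

Code:
def coverage(px, samples_per_axis):
    # fraction of pixel column px (row 0) that lies under the edge y < 0.37 * x
    n, inside = samples_per_axis, 0
    for i in range(n):
        for j in range(n):
            x = px + (i + 0.5) / n
            y = (j + 0.5) / n
            if y < 0.37 * x:
                inside += 1
    return inside / (n * n)

for px in range(6):
    print(px, coverage(px, 1), coverage(px, 4))
    # 1 sample:   jumps straight from 0.0 to 1.0, a visible stair-step
    # 16 samples: fractional coverage, so the edge fades in smoothly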
Enough with this faux AA. I don't spend more on a GPU than a console to get console AA. Developers need to start using DX10 (why is Gears the only UE3 game that has it?) so we can have deferred rendering+MSAA.
Or is Nvidia tired of doing driver hacks to enable AA in DX9 deferred rendering games?
In stupid terms:

Aliasing is a side effect of using triangles to render images; there will always be flat edges and sharp angles that can only be smoothed out by approximating along the edge.
(01-30-2011, 10:56 AM)lamedude Wrote: Enough with this faux AA. I don't spend more on a GPU than a console to get console AA. Developers need to start using DX10 (why is Gears the only UE3 game that has it?) so we can have deferred rendering+MSAA.
Or is Nvidia tired of doing driver hacks to enable AA in DX9 deferred rendering games?

Not all games can be hacked around.

And even if you force all games to DX10.x, only GT200 and higher will be able to antialias a deferred image.
(01-30-2011, 08:17 AM)NaturalViolence Wrote: @Runo

As long as the input is a vector (2D or 3D shapes/objects with point coordinates) and the output is a raster (image with pixels) you will always have aliasing. Period.

Go read up about the REYES algorithm. It will blow your mind Smile Too bad it's not really practical to run in realtime.
Quote:Too bad it's not really practical to run in realtime.

^And that's precisely why I don't care about it.