
Nvidia Researching SRAA (Subpixel Reconstruction AntiAliasing)
01-29-2011, 02:23 PM (This post was last modified: 01-29-2011, 02:26 PM by Squall Leonhart.)
#1
Squall Leonhart Offline
Friend of local jackass
Posts: 1,177
Threads: 27
Joined: Apr 2009
Quote:It seems that Nvidia is researching a new antialiasing technology to compete with AMD's MLAA (Morphological Anti-Aliasing). Nvidia has published the following abstract from its research: "Subpixel Reconstruction Antialiasing (SRAA) combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost. SRAA targets deferred-shading renderers, which cannot use multisample antialiasing. SRAA operates as a post-process on a rendered image with superresolution depth and normal buffers, so it can be incorporated into an existing renderer without modifying the shaders.

In this way SRAA resembles Morphological Antialiasing (MLAA), but the new algorithm can better respect geometric boundaries and has fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality."

Big Grin The benefits of MLAA without the blurring deficits.

The source of most graphical issues with MLAA is the lack of subpixel reads, which results in warped pixels. (If anyone remembers, that's also why the PSX has warping pixels, although the missing Z-buffer also plays a part in the PSX's case.)
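The subpixel-reads point can be sketched with a toy example (pure Python, not NVIDIA's actual algorithm; the names and the 2x2 sample layout are purely illustrative). A post-process that only sees the single shaded color per pixel cannot tell how much of an edge pixel the foreground really covers, while one that also sees subpixel geometry (here, object IDs standing in for the superresolution depth/normal samples) can weight the blend by the true coverage:

```python
# Toy sketch of why subpixel geometric samples matter for post-process AA.
# Illustrative only -- this is not the published SRAA algorithm.

FG, BG = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)   # foreground/background colors

def blend(a, b, w):
    """Linear blend: w parts of a, (1 - w) parts of b, per channel."""
    return tuple(w * x + (1 - w) * y for x, y in zip(a, b))

# 2x2 subpixel object IDs for one edge pixel: 3 of 4 samples hit the
# foreground triangle, so the true coverage of this pixel is 0.75.
subpixel_ids = [1, 1, 1, 0]
coverage = sum(1 for s in subpixel_ids if s == 1) / len(subpixel_ids)

# SRAA-style idea: reuse the single (1x) shaded color, but weight the
# blend with the background by the geometric subpixel coverage.
sraa_pixel = blend(FG, BG, coverage)

# Color-only post-process: it sees just the 1x shaded color (fully FG
# here, because the pixel center hit the triangle) and has no way to
# recover the 0.75 coverage from that one sample.
color_only_pixel = FG

print(coverage)      # 0.75
print(sraa_pixel)    # (0.75, 0.0, 0.25)
```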
01-29-2011, 02:30 PM (This post was last modified: 01-29-2011, 02:31 PM by NaturalViolence.)
#2
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
Somehow deep down I always knew Nvidia was going to do something like this. I really want both them and ATI (with MLAA) to implement this in such a way that, if possible, it is applied BEFORE HUD elements are drawn to the framebuffer. That way it can apply AA to the scene without blurring HUD elements. However, the game developers would most likely need to implement that in-game for it to work. But they could do that, as long as they add a driver AA mode that can be exposed by the application.

Also, are they going to use a larger-than-normal texture buffer size (number of texture samples) as well? Is that what they mean by "normal buffer" (a term I have never heard before)? They would need to do that for a proper implementation; it's their only hope of getting rid of the blurry-texture problem associated with post-processing AA.
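The draw-order idea above can be sketched like this (a toy illustration, not a real driver or engine API; buffers are simplified to 1D lists of grayscale values, and `toy_aa` is just a box blur standing in for MLAA/SRAA). Running the post-process before the HUD is composited keeps UI pixels sharp:

```python
# Toy sketch of compositing order: AA the 3D scene first, then draw
# the HUD on top, so UI elements are never blurred. Illustrative only.

def toy_aa(buf):
    """Stand-in for a post-process AA filter: a 3-tap box blur."""
    out = []
    for i in range(len(buf)):
        taps = buf[max(0, i - 1):i + 2]
        out.append(sum(taps) / len(taps))
    return out

def draw_hud(buf, hud):
    """Composite opaque HUD pixels (None = transparent) over buf."""
    return [h if h is not None else p for p, h in zip(buf, hud)]

scene = [0.0, 0.0, 0.0, 0.0]          # the rendered 3D scene
hud   = [None, 1.0, None, None]       # one opaque white HUD pixel

# Wrong order: HUD drawn first, then AA blurs it along with the scene.
blurred_hud = toy_aa(draw_hud(scene, hud))

# Desired order: AA the scene, then composite the HUD untouched.
crisp_hud = draw_hud(toy_aa(scene), hud)

print(round(blurred_hud[1], 2))   # 0.33 -- the HUD pixel got smeared
print(crisp_hud[1])               # 1.0  -- the HUD pixel stays sharp
```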
01-29-2011, 03:18 PM (This post was last modified: 01-29-2011, 03:19 PM by Squall Leonhart.)
#3
Squall Leonhart Offline
Friend of local jackass
Posts: 1,177
Threads: 27
Joined: Apr 2009
A normal buffer is where the per-pixel surface normals are stored during processing.

http://http.developer.nvidia.com/GPUGems3/gpugems3_ch22.html
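For illustration (the function names here are hypothetical, but the [-1, 1] to [0, 1] packing is the common convention), this is roughly what writing and reading a normal buffer looks like in a deferred pipeline:

```python
# Minimal sketch of what a "normal buffer" holds: per-pixel surface
# normals written by the geometry pass of a deferred renderer. Unit
# normals with components in [-1, 1] are commonly remapped to [0, 1]
# so they fit into an ordinary color texture. Names are illustrative.

def encode_normal(n):
    """Map a unit normal from [-1, 1] to [0, 1] per channel for storage."""
    return tuple(0.5 * c + 0.5 for c in n)

def decode_normal(rgb):
    """Inverse mapping, as a lighting pass does when reading the buffer."""
    return tuple(2.0 * c - 1.0 for c in rgb)

up = (0.0, 0.0, 1.0)              # a surface facing the camera
stored = encode_normal(up)        # what lands in the normal buffer
print(stored)                     # (0.5, 0.5, 1.0)
print(decode_normal(stored))      # (0.0, 0.0, 1.0)
```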
01-29-2011, 04:52 PM
#4
Kodiack Offline
Member
Posts: 139
Threads: 5
Joined: Jan 2011
Quote:For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading

Damn, those are pretty impressive results. For comparison, someone getting 60 FPS (not vsync-capped, just 60 FPS) [a frame time of about 16.6 ms, rounded down] would still get about 54.3 FPS. Someone at 100 FPS would still get 84.7 FPS. That's not a terribly high amount of overhead, and it would only get better on higher-end video cards.

I'm currently on 10.10e, by the way, the oddball driver that supports MLAA even on 5000-series cards. MLAA really is kind of ugly, mainly due to that blurring effect. It's lightweight, but useful in a very small portion of games. It will be quite interesting to see where this research goes. With resolutions just going up and up, standard anti-aliasing techniques remain incredibly resource-intensive.
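The frame-rate arithmetic above can be checked directly (a small illustrative calculation, assuming the 1.8 ms figure from the abstract is a fixed per-frame cost at 1280x720):

```python
# Adding a fixed post-process cost to a frame and converting back to FPS.

SRAA_COST_MS = 1.8   # cost quoted in the abstract, at 1280x720

def fps_with_aa(base_fps, aa_ms=SRAA_COST_MS):
    """FPS after adding a fixed per-frame AA cost in milliseconds."""
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + aa_ms)

print(round(fps_with_aa(60), 1))    # 54.2
print(round(fps_with_aa(100), 1))   # 84.7
```

The post's 54.3 figure comes from rounding the 60 FPS frame down to 16.6 ms first; using the exact 16.67 ms frame time gives about 54.2 FPS, which is the same conclusion.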
01-30-2011, 02:39 AM
#5
Runo Offline
Greeny
Posts: 1,194
Threads: 43
Joined: Mar 2009
Personally, I always thought antialiasing was a bit of a just-for-now fix for one of the biggest rendering imperfections of all time. I mean, I know that it's physically impossible to render an image perfectly, since it's an interpretation of the coded scene, but I expected something to come along soon that would make antialiasing unnecessary, like resolutions getting so high in the future that our eyes couldn't notice the imperfections. Wait, that sounds too much like a just-for-now fix Sad...
01-30-2011, 08:17 AM (This post was last modified: 01-30-2011, 08:18 AM by NaturalViolence.)
#6
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
@Runo

As long as the input is a vector (2D or 3D shapes/objects with point coordinates) and the output is a raster (image with pixels) you will always have aliasing. Period. Aliasing comes from the process of taking something with an infinite level of detail (vectors) and using it to render a raster, which has a finite number of pixels/level of detail. Even ray-tracing produces aliasing.

When we invent monitors that can display shapes without the need to convert them into pixel based images first, that's when aliasing will go away. But that will never happen.
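The vector-to-raster argument can be made concrete with a toy 1D example (illustrative only): an ideal edge at x = 0.6 covers exactly 60% of a pixel, but any finite number of point samples can only approximate that coverage, and the residual error is aliasing:

```python
# Point-sampling a continuous (vector) edge at a finite resolution.
# The edge sits at x = 0.6 inside a unit pixel, so true coverage is 0.6.

def coverage(n, edge=0.6):
    """Fraction of n evenly spaced sample points left of the edge."""
    hits = sum(1 for i in range(n) if (i + 0.5) / n < edge)
    return hits / n

print(coverage(1))     # 1.0 -- one center sample: all-or-nothing, aliased
print(coverage(4))     # 0.5 -- 4x sampling: closer, still wrong
print(coverage(100))   # 0.6 -- converges toward the true coverage,
                       #        but any finite grid misses some edges
```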
01-30-2011, 10:56 AM
#7
lamedude Offline
Senior Member
Posts: 360
Threads: 7
Joined: Jan 2011
Enough with this faux AA. I don't spend more on a GPU than a console to get console AA. Developers need to start using DX10 (why is Gears the only UE3 game that has it?) so we can have deferred rendering+MSAA.
Or is Nvidia tired of doing driver hacks to enable AA in DX9 deferred rendering games?
01-30-2011, 11:01 AM (This post was last modified: 01-30-2011, 11:05 AM by Squall Leonhart.)
#8
Squall Leonhart Offline
Friend of local jackass
Posts: 1,177
Threads: 27
Joined: Apr 2009
In simple terms:

Aliasing is a side effect of using triangles to render images; there will always be flat edges and sharp angles that can only be smoothed out by approximating along the edge.
(01-30-2011, 10:56 AM)lamedude Wrote: Enough with this faux AA. I don't spend more on a GPU than a console to get console AA. Developers need to start using DX10 (why is Gears the only UE3 game that has it?) so we can have deferred rendering+MSAA.
Or is Nvidia tired of doing driver hacks to enable AA in DX9 deferred rendering games?

Not all games can be hacked around.

And even if you force all games to DX10.x, only the GT200 and higher will be able to antialias a deferred image.
01-30-2011, 11:17 AM
#9
ector Offline
PPSSPP author, Dolphin co-founder
Project Owner  Developers (Administrators)
Posts: 189
Threads: 2
Joined: Mar 2009
(01-30-2011, 08:17 AM)NaturalViolence Wrote: @Runo

As long as the input is a vector (2D or 3D shapes/objects with point coordinates) and the output is a raster (image with pixels) you will always have aliasing. Period.

Go read up on the REYES algorithm. It will blow your mind Smile. Too bad it's not really practical to run in realtime.
01-30-2011, 12:14 PM
#10
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
Quote:Too bad it's not really practical to run in realtime.

^And that's precisely why I don't care about it.