Dolphin, the GameCube and Wii emulator - Forums

Full Version: Global Illumination
Pages: 1 2 3

BiggestFZeroXFan

I couldn't find anything about adding global illumination to Dolphin.
Is anyone else interested in the addition of global illumination?
Examples of global illumination include:
1. Ambient Occlusion
2. Ray Tracing
3. Photon Mapping

I understand that current hardware can't ray-trace or photon-map Dolphin's output in real time (Moore's Law tells me maybe 15 years from now), but it could be used while a game is paused to get an idea of what it'll look like in the near future.
(I think some NVidia cards can force ambient occlusion in games that don't support it natively, via a driver tweak, but mine doesn't.)
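For what it's worth, the "maybe 15 years" guess can be sanity-checked with some back-of-the-envelope Moore's Law arithmetic. All the numbers below are illustrative assumptions (the per-frame render time and the doubling period are made up for the sake of the example), not measurements:

```python
import math

# Illustrative assumptions, not measurements:
# suppose a fully ray-traced frame of a GameCube-era scene takes ~60 s
# on today's hardware, and real time means 1/60 s per frame.
seconds_per_frame_today = 60.0
seconds_per_frame_target = 1.0 / 60.0

# Moore's Law, loosely: throughput doubles roughly every 2 years.
years_per_doubling = 2.0

speedup_needed = seconds_per_frame_today / seconds_per_frame_target
doublings = math.log2(speedup_needed)
years = doublings * years_per_doubling

print(f"speedup needed: {speedup_needed:.0f}x")   # 3600x
print(f"doublings:      {doublings:.1f}")          # ~11.8
print(f"years (approx): {years:.0f}")              # ~24
```

Change the starting per-frame time and you land anywhere from 15 to 30 years, which is exactly why these estimates are so hand-wavy.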
Have you done any sort of shader programming of this nature before? I could be wrong, but somehow I get the feeling the answer is no. How do you plan to implement ambient occlusion in Dolphin (the only practical option out of those listed above)?
Uhmm, I can't even tell if he's asking FOR global illumination, or if he's volunteering to do it, NV.

Ray tracing and photon mapping would be out of the question though, right? I don't know too much about photon mapping, but ray tracing is used in professional rendering applications, isn't it, and is too intensive to pull off EFFECTIVELY in real time?
Ray tracing has been right around the corner for about 5 years now. Larrabee was supposed to bring us ray tracing; then when that got canned it was Fermi; then when that didn't happen they moved on to voxels, hyped by Euclideon's unlimited detail engine.
^This. Big time.

ThorhiantheUltimate Wrote:Uhmm, I can't even tell if he's asking FOR global illumination, or if he's volunteering to do it, NV.

Neither can I. I think I may have jumped to an early conclusion. I'm leaning more towards "asking" now rather than "telling".

ThorhiantheUltimate Wrote:Ray tracing and photon mapping would be out of the question though, right? I don't know too much about photon mapping, but ray tracing is used in professional rendering applications, isn't it, and is too intensive to pull off EFFECTIVELY in real time?

Yes. Like he said they're both way too slow. Not to mention pointless and stupid for an application like this.

As far as I know, ambient occlusion post-processing shaders need access to the depth buffer, which Dolphin doesn't expose to them. I could be wrong, but I've never seen it done without it. And even if it could be made to work, while performance might be acceptable, it would likely take a pretty hefty hit.
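To illustrate why the depth buffer is the sticking point, here's a toy sketch of the core screen-space ambient occlusion idea: for each pixel, sample nearby depths and darken the pixel in proportion to how many neighbours sit closer to the camera. This is plain Python over a 2D list, not an actual Dolphin shader; every name and number in it is made up for illustration, and a real implementation would be a GPU shader sampling in view space. The point is simply that the whole technique operates on depth values, so without depth buffer access there's nothing for it to read.

```python
# Toy screen-space ambient occlusion sketch (illustrative only).
def ssao_factor(depth, x, y, radius=1, bias=0.05):
    """Return an occlusion factor in [0, 1] for pixel (x, y).

    depth is a 2D list of per-pixel depths (smaller = closer to camera).
    Neighbours sufficiently closer than the centre count as occluders;
    1.0 means fully lit, lower values mean more occluded (darker).
    """
    h, w = len(depth), len(depth[0])
    centre = depth[y][x]
    occluded = samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            nx, ny = x + dx, y + dy
            if (dx, dy) == (0, 0) or not (0 <= nx < w and 0 <= ny < h):
                continue
            samples += 1
            if depth[ny][nx] < centre - bias:   # neighbour occludes us
                occluded += 1
    return 1.0 - occluded / samples if samples else 1.0

# A tiny 3x3 depth buffer: the centre pixel sits in a "crease",
# surrounded mostly by closer geometry, so it gets darkened.
depth = [[0.2, 0.2, 0.2],
         [0.2, 0.9, 0.2],
         [0.2, 0.9, 0.9]]
print(ssao_factor(depth, 1, 1))  # 6 of 8 neighbours occlude -> 0.25
```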

The nvidia implementation he's referring to is a driver-level feature that works with certain graphics engines and has to be configured separately by the driver developers for each engine it supports. As a result it generally only supports the more popular engines, and even then not always particularly well. I highly doubt it will ever be usable with Dolphin due to all the weird stuff Dolphin's graphics backends do. The GPU listed in his profile specs does support this feature, so I don't know why he says it doesn't. In fact, much older cards support it too.

Ambient occlusion is often either not considered a form of global illumination or is at best considered a very crude form of it.

As far as ray tracing goes, I don't see that becoming common anytime soon (including in 10 years). Maybe I'll end up eating these words in 10 years; we'll just have to wait and see. It's pretty common to see stories on blogs and tech news sites about how real-time ray tracing in video games is just around the corner, but I've been seeing them ever since I first got internet access a decade ago. And despite the fact that it's been possible for quite some time, we haven't seen any AAA games make use of it.

Why? Because the framerates and level of detail achievable with ray tracing tend to be MUCH lower than what is achievable with rasterization. Rasterization is just a much more efficient way of rendering graphics. Think of it as a set of optimizations that save a massive amount of computational resources for a minimal drop in maximum potential shading accuracy. Ray tracing can in theory allow for greater shading accuracy, but so many trade-offs have to be made to get rendering times down to acceptable levels for real-time use that it actually ends up being less accurate in most cases.

Sure, as GPUs get faster we won't need to make as many of these trade-offs and ray tracing quality will improve, but so will rasterization. Which means that no matter how fast GPUs get, standard rasterization will stay ahead and allow for much greater detail within a limited rendering timeframe. Ray tracing, on the other hand, will always be the method of choice for render farms, where we have no problem waiting minutes or hours to render each frame.

A bit ranty. I should really stop writing and just go to bed. But I'm in way too deep at this point.

Game devs and graphics programmers are divided over whether ray tracing will become a popular option for video games in the near future. But most of the ones I've been following (which, granted, tend to be the more outspoken ones who bother to maintain blogs and such) lean towards no, for the reasons stated above.
Two things:

1) The kind of rendering you've been calling rasterisation is called scanline rendering. As you should know (and probably do - you mentioned wanting to go to bed, so I'll just assume you were sleepy) rasterisation is converting a vector representation of a scene/image/whatever into a bitmap representation. As raytracing converts a vector representation of a scene/image/whatever into a bitmap representation, it, too, is a form of rasterisation.

2) You seem to be forgetting that Avatar was rendered at a much, much higher resolution than the 1080p used by most people for gaming, so that it wouldn't have noticeable aliasing even on a massive cinema screen. It also had to be rendered 3 times: once for the left eye and once for the right eye for the 3D viewings, and then a third time for the 2D viewings, as just reusing one of the two 3D angles would make everything look like it was filmed from an awkward angle. Taking this into account, we can probably shave a good few years off the 52-year estimate you gave.

I'm just nitpicking, though. You're still correct about it being decades away.
AnyOldName3 Wrote:1) The kind of rendering you've been calling rasterisation is called scanline rendering. As you should know (and probably do - you mentioned wanting to go to bed, so I'll just assume you were sleepy) rasterisation is converting a vector representation of a scene/image/whatever into a bitmap representation. As raytracing converts a vector representation of a scene/image/whatever into a bitmap representation, it, too, is a form of rasterisation.

You are technically correct. The best kind of correct.

AnyOldName3 Wrote:2) You seem to be forgetting that Avatar was rendered at a much, much higher resolution than the 1080p used by most people for gaming due to the fact that it couldn't have noticeable aliasing even on a massive cinema screen.

That doesn't really affect the argument or the math, since, like you said, lowering the resolution would impact image quality negatively. The article said Avatar graphics, so I'm using Avatar graphics.

AnyOldName3 Wrote:It also had to be rendered 3 times - once for the left eye and once for the right eye for 3D viewings, and then a third time for the 2D viewings, as just using one of the two angles from the 3D viewing would make everything look like it was filmed from an awkward angle. Taking this into account, we can probably shave a good few years off the 52 year estimate you gave.

Nope. The number of times they rendered it does not affect render time per frame at all. Think about it.
I still don't get what the hell this has to do with Dolphin at all. Since all Wii/GC games render with polygons, we have to do it that way as well. Fun fact: a game could also implement some kind of raytracer; the native GPU is at least flexible enough to do so.
We don't emulate a 3D engine, we emulate a GPU, so we can only execute the original software, which just uses polygons.
Well to be fair I did sort of bring it up:
NaturalViolence Wrote:How do you plan to implement ambient occlusion in dolphin (the only practical option out of those listed above)?

Pretty much my way of saying "ambient occlusion is the only one of these options that could actually be done". I didn't elaborate in case I might get the details wrong. Thanks for elaborating.

Ambient occlusion could in theory be added as a post-processing shader if a few changes were made to the way Dolphin uses them, so that option is technically possible. I think asmodean even discussed it in his fx shader suite for Dolphin thread.
As for ambient occlusion, Nvidia actually recently switched their built-in SSAO method to HBAO+, which is much nicer and more accurate (though still an approximation), among other improvements.


If you find the right flag though, many games actually will work with HBAO+.