Dolphin, the GameCube and Wii emulator - Forums

Full Version: Removing shader generation stutter using modern APIs
So there is no way to create a new API for PC that bypasses DirectX and OGL to avoid creating shaders and executes directly on the hardware? Is there actually something physically preventing it on modern GPUs? If there is, then I guess it's a lost cause, but it would be worth collaborating with the Gallium3D guys to see what's possible before giving up, since they are the ones who know the most about the inner workings of the GPU.

Edit: I posted a comment on Axel Davy's blog to see if he has any interest in the project ( http://axellinuxnews.blogspot.com/2015/07/a-new-si-scheduler.html?showComment=1444574787892#c2599837708489526683 ) as he could be an amazing programmer for the project.
Nope! Completely not possible. Even if you had 100% lowest-level access to modern PC GPUs and tried to run GameCube or Wii GPU code on them, it wouldn't work! When a game is coded for a specific GPU, it excludes every GPU it wasn't coded for. That's the problem with low-level access, and it's why APIs exist: before APIs like DirectX, if you wanted to make a PC game, you had to write code for every single GPU you wanted to support! With dozens of GPUs out at any given time, game devs just coded for the popular ones of the moment and left the rest out of luck. APIs freed PC games from coding for specific hardware, letting devs just make the game.

The GC and Wii run directly on the metal because they are consoles - every single console is exactly the same! There was no reason for APIs at the time (APIs are used by modern consoles to make things easier, but console game devs almost always bypass them), and the consoles gave games 100% control over the hardware so APIs weren't really even possible on them.


Anyway, there is a possible solution for the shader compilation stuttering problem - the "ubershader". Basically, the idea is a single shader that is always running, with commands streamed to it. Essentially it moves the translation phase off of the CPU and onto the GPU. The question in that scenario, of course, is how it will affect performance. We'll see. Big Grin

OK thanks that is a very good answer. I don't know much about this type of stuff, so I didn't understand the reasoning. I do hope the ubershaders solve the problem.

So I guess modern APIs may not solve anything that can't be done already on the current APIs (unless draw calls are a problem)
The number of draw calls is close to what you'd expect from a native game that looks like a Wii game - there are the same number of things to draw whether or not they're being emulated - so on modern hardware their cost is negligible compared to the cost of the rest of the emulator.
Vulkan and DirectX 12 won't help at all for shader stuttering issues.
Shader stuttering happens because all of a sudden in the middle of a rendering frame we encounter a new GPU state and must create and compile a new shader to represent it, a process which can often take several hundred milliseconds.

Neither API allows you to compile shaders any faster than OpenGL 4.4 or DirectX 11; in fact, they use the exact same shaders and the same shader compilers. If we were to theoretically create a Vulkan or DirectX 12 backend, the shaders are about the only thing we wouldn't need to change.

Ubershaders are the way to go.
(10-19-2015, 10:46 AM)phire Wrote: [ -> ]Vulkan and DirectX 12 won't help at all for shader stuttering issues.
Shader stuttering happens because all of a sudden in the middle of a rendering frame we encounter a new GPU state and must create and compile a new shader to represent it, a process which can often take several hundred milliseconds.

Neither API allows you to compile shaders any faster than OpenGL 4.4 or DirectX 11; in fact, they use the exact same shaders and the same shader compilers. If we were to theoretically create a Vulkan or DirectX 12 backend, the shaders are about the only thing we wouldn't need to change.

Ubershaders are the way to go.

Actually, Vulkan can accept shader languages other than GLSL - it consumes SPIR-V bytecode, which can be compiled from several source languages.
It would still have to compile a shader, just in a different language...
(10-27-2015, 03:43 AM)Jhonn Wrote: [ -> ]It would still have to compile a shader, just in a different language...
It can use pre-compiled and cached shaders.
(10-27-2015, 04:18 AM)Ramoth Wrote: [ -> ]It can use pre-compiled and cached shaders.

So what? DX11/OGL can already do this; in fact, Dolphin uses a shader cache with the current DX11/OGL backends. And because of how the GC/Wii GPU works, you can't just pre-compile the shaders - you only know what a shader will look like when the game actually uses that graphical effect. Then it's compiled in real time and emulation "pauses" until the shader finishes compiling. That's the current cause of most of Dolphin's stuttering, and AFAIK creating a DX12/Vulkan backend won't offer any improvement there.
I have an offtopic question (but it's on topic in general), about WiiU emulation.
Cemu uses openGL 3.3 while Decaf uses DX12.
The Decaf devs claim that DX12 and other low-level APIs such as Vulkan are better:

https://github.com/decaf-emu/decaf-emu/wiki
Quote:The developers are using DX12 (maybe Vulkan and other low level APIs later on) in order to directly translate the Wii U GPU calls to DX12 calls. This helps reducing the extra load on the CPU caused by translating Wii U calls to multiple API calls.
Will a low-level API actually help in this situation?