Dolphin, the GameCube and Wii emulator - Forums › Dolphin Emulator Discussion and Support › Development Discussion
Removing shader generation stutter using modern APIs
10-12-2015, 12:37 AM (This post was last modified: 10-12-2015, 04:57 AM by phly95.)
#11
phly95 Offline
Member
***
Posts: 72
Threads: 18
Joined: Apr 2013
So there is no way to create a new API for PC that bypasses DirectX and OpenGL, avoids creating shaders, and executes directly on the hardware? Is there actually something physically preventing it on modern GPUs? If there is, then I guess it's a lost cause, but it would be worth collaborating with the Gallium3D guys to see what's possible before giving up, since they are the ones who know the most about the inner workings of the GPU.

Edit: I posted a comment on Axel Davy's blog to see if he has any interest in the project ( http://axellinuxnews.blogspot.com/2015/07/a-new-si-scheduler.html?showComment=1444574787892#c2599837708489526683 ) as he could be an amazing programmer for the project.
MOBO/case: Dell XPS Desktop
OS: Windows 10 Pro, 64-Bit
CPU: Intel i7-920 Bloomfield (1st Generation)
GPU: AMD HD 7850
RAM: 6 GB DDR3
10-12-2015, 01:26 AM (This post was last modified: 10-12-2015, 02:39 PM by MayImilae.)
#12
MayImilae Offline
Chronically Distracted
**********
Administrators
Posts: 4,616
Threads: 120
Joined: Mar 2011
Nope! Completely not possible. Even if you had 100% lowest-level access to modern PC GPUs and tried to run GameCube or Wii GPU code on them, it wouldn't work! When a game is coded for a specific GPU, it excludes every GPU it wasn't coded for. That's the problem with low-level access, and it's why APIs exist: before APIs like DirectX, if you wanted to make a PC game, you had to write code for every single GPU you wanted to support! With dozens of GPUs out at any given time, game devs just coded for the popular ones of the moment and left the rest out of luck. APIs let PC games stop targeting specific hardware, freeing devs up to just make the game.

The GC and Wii run directly on the metal because they are consoles - every single console is exactly the same! There was no reason for APIs at the time (APIs are used by modern consoles to make things easier, but console game devs almost always bypass them), and the consoles gave games 100% control over the hardware so APIs weren't really even possible on them.


Anyway, there is a possible solution for the shader compilation stuttering problem - the "ubershader". Basically, the idea is a single shader that is always running, with commands streamed to it. Essentially, it moves the translation phase off of the CPU and onto the GPU. The question in that scenario, of course, is how it will affect performance. We'll see. :D
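The ubershader idea above can be sketched as a toy model. Everything below is illustrative and hypothetical, not Dolphin code: `SpecializedBackend` stalls on each new GPU state it sees, while `UbershaderBackend` compiles a single shader once and treats state as per-draw data.

```cpp
// Illustrative model only (not Dolphin code): contrasting per-state
// specialized shaders with a single always-running "ubershader".
#include <cstdint>
#include <string>
#include <unordered_map>

// Stand-in for the GC/Wii pipeline state that determines a shader.
using PipelineState = uint32_t;

// Specialized path: the first draw with a new state triggers a compile.
struct SpecializedBackend {
    std::unordered_map<PipelineState, std::string> cache;
    int compiles = 0;

    void Draw(PipelineState s) {
        if (cache.find(s) == cache.end()) {
            ++compiles;  // blocking compile: this is the stutter
            cache[s] = "shader_" + std::to_string(s);
        }
        // ... issue the draw with cache[s] ...
    }
};

// Ubershader path: one shader compiled at startup; state is just data
// uploaded per draw (e.g. as uniforms), so no draw ever compiles.
struct UbershaderBackend {
    int compiles = 1;  // the single ubershader, built up front

    void Draw(PipelineState /*s*/) {
        // ... upload s as a uniform and draw; no compilation ...
    }
};
```

The trade-off raised above is visible in the model: the ubershader never stalls, but its one shader must handle every possible state at draw time, which is exactly where the performance question comes from.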

Disclaimer: I'm very much not a coder!
AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 22H2 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12
10-12-2015, 05:02 AM
#13
phly95 Offline
Member
***
Posts: 72
Threads: 18
Joined: Apr 2013
OK, thanks, that's a very good answer. I don't know much about this kind of stuff, so I didn't understand the reasoning. I do hope the ubershaders solve the problem.

So I guess modern APIs may not solve anything that can't already be done on the current APIs (unless draw calls are a problem).
10-12-2015, 05:14 AM
#14
AnyOldName3 Offline
First Random post over 9000
*******
Posts: 3,533
Threads: 1
Joined: Feb 2012
The number of draw calls is about what you'd expect from a game that looks like a Wii game (there are the same number of things to draw whether or not they're being emulated), so on modern hardware their impact is negligible compared to the cost of the rest of the emulator.
OS: Windows 10 64 bit Professional
CPU: AMD Ryzen 5900X
RAM: 16GB
GPU: Radeon Vega 56
10-19-2015, 10:46 AM
#15
phire Offline
Developer
**********
Developers (Some Administrators and Super Moderators)
Posts: 31
Threads: 0
Joined: Jan 2014
Vulkan and DirectX 12 won't help at all for shader stuttering issues.
Shader stuttering happens because, all of a sudden, in the middle of rendering a frame, we encounter a new GPU state and must create and compile a new shader to represent it, a process which can often take several hundred milliseconds.

Neither API lets you compile shaders any faster than OpenGL 4.4 or DirectX 11; in fact, they use the exact same shaders and the same shader compilers. If we were to create a Vulkan or DirectX 12 backend, the shaders are about the only thing we wouldn't need to change.

Ubershaders are the way to go.
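To put "several hundred milliseconds" in perspective, here is a back-of-envelope sketch of how many 60 FPS frames a blocking compile swallows. The 300 ms figure below is an assumption taken from the range given above, not a measurement:

```cpp
// Assumed numbers, not measurements: whole frames lost while a blocking
// shader compile stalls a render thread targeting 60 FPS (~16.7 ms/frame).
constexpr int kFps = 60;

// Whole frames missed during a compile that blocks for `compileMs`.
constexpr int FramesDropped(int compileMs) {
    return compileMs * kFps / 1000;
}
// A 300 ms compile swallows 18 consecutive frames: a very visible hitch,
// no matter which API submitted the draw calls.
```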
10-27-2015, 01:17 AM (This post was last modified: 10-27-2015, 01:20 AM by Ramoth.)
#16
Ramoth Offline
Member
***
Posts: 141
Threads: 0
Joined: Jul 2014
(10-19-2015, 10:46 AM)phire Wrote: Vulkan and DirectX 12 won't help at all for shader stuttering issues.
Shader stuttering happens because, all of a sudden, in the middle of rendering a frame, we encounter a new GPU state and must create and compile a new shader to represent it, a process which can often take several hundred milliseconds.

Neither API lets you compile shaders any faster than OpenGL 4.4 or DirectX 11; in fact, they use the exact same shaders and the same shader compilers. If we were to create a Vulkan or DirectX 12 backend, the shaders are about the only thing we wouldn't need to change.

Ubershaders are the way to go.

Actually, Vulkan can use shading languages other than GLSL (it consumes SPIR-V bytecode, which can be compiled from several source languages).
10-27-2015, 03:43 AM
#17
mbc07 Offline
Wiki Caretaker
*******
Content Creators (Moderators)
Posts: 3,577
Threads: 47
Joined: Dec 2010
It would still have to compile a shader, just in a different language...
Avell A70 MOB: Core i7-11800H, GeForce RTX 3060, 16 GB DDR4-3200, Windows 11 (Insider Preview)
ASRock Z97M OC Formula: Pentium G3258, GeForce GT 440, 16 GB DDR3-1600, Windows 10 (22H2)
10-27-2015, 04:18 AM
#18
Ramoth Offline
Member
***
Posts: 141
Threads: 0
Joined: Jul 2014
(10-27-2015, 03:43 AM)Jhonn Wrote: It would still have to compile a shader, just in a different language...
It can use pre-compiled and cached shaders.
10-27-2015, 04:31 AM (This post was last modified: 10-27-2015, 04:33 AM by mbc07.)
#19
mbc07 Offline
Wiki Caretaker
*******
Content Creators (Moderators)
Posts: 3,577
Threads: 47
Joined: Dec 2010
(10-27-2015, 04:18 AM)Ramoth Wrote: It can use pre-compiled and cached shaders.

So what? DX11/OGL can already do this; in fact, Dolphin uses a shader cache with the current DX11/OGL backends. And because of how the GC/Wii GPU works, you can't just pre-compile the shaders: you only know what a shader needs to be when the game actually uses that graphical effect. At that point it's compiled in real time, and emulation "pauses" until the shader finishes compiling. That's the current cause of most of Dolphin's stuttering, and AFAIK a DX12/Vulkan backend won't offer any improvement there.
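The cache behavior described above can be sketched as a toy model (hypothetical code, not Dolphin's actual implementation). The key point: a cache, persisted or not, only helps for states that have already been seen; the first encounter with any state always stalls.

```cpp
// Toy model of a persistent shader cache keyed by pipeline state
// (hypothetical, not Dolphin code). The blocking compile happens only
// on the first sight of a state; later draws (or later runs, if the
// cache is saved to disk) hit the cache.
#include <cstdint>
#include <map>
#include <string>

using StateKey = uint64_t;
using Binary = std::string;  // stand-in for a compiled shader blob

struct ShaderCache {
    std::map<StateKey, Binary> entries;  // persisted to disk in practice

    // Returns true if a blocking compile was needed (the stutter).
    bool GetOrCompile(StateKey key) {
        if (entries.count(key)) return false;  // cache hit: no stall
        entries[key] = "blob";                 // slow real-time compile here
        return true;
    }
};
```

Persisting `entries` across runs removes stutter for states the game already produced, but a state it hasn't produced yet can't be in the cache, which is why pre-compilation alone can't fix the problem on any API.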
11-02-2015, 02:57 AM (This post was last modified: 11-02-2015, 02:59 AM by Shished.)
#20
Shished Offline
Member
***
Posts: 88
Threads: 11
Joined: Dec 2012
I have an off-topic question (though it's on topic in general) about Wii U emulation.
Cemu uses OpenGL 3.3 while Decaf uses DX12.
The Decaf devs claim that DX12 and other low-level APIs such as Vulkan are better:

https://github.com/decaf-emu/decaf-emu/wiki
Quote:The developers are using DX12 (maybe Vulkan and other low level APIs later on) in order to directly translate the Wii U GPU calls to DX12 calls. This helps reducing the extra load on the CPU caused by translating Wii U calls to multiple API calls.
Will a low-level API actually help in this situation?
CPU: Intel Core i5-3570k 
GPU: Nvidia GeForce GTX660 (proprietary nvidia driver v.358.16)
RAM: 2 x 4G DDR3 1333 MHz
OS: openSUSE Tumbleweed x86_64 (linux 4.3)
I'm using latest version of Dolphin-emu from git master branch.