Dolphin, the GameCube and Wii emulator - Forums

Full Version: [Linux, OpenGL 4] How are you devs setting/using OpenGL?

Katastic Voyage

I've downloaded a branch and I've been wanting to experiment with the Dolphin rendering pipeline. However, I need access to OpenGL 4.0 functions.

1) I'm not entirely sure what's going on in the /GLExtensions/ directory. I tried following what was done for, say, the GL 3.2 header, but I still get undefined-reference linker errors.

2) How/where in CMake are you linking OpenGL? It seems to be linking against Mesa even though I need it to link against the proprietary NVIDIA drivers to get OpenGL 4.x; Mesa's OpenGL 4.x support isn't sufficient at this time.

3) I read in the patch notes that you removed GLEW. Why is that? How is your current system better? Was it solely for Android support? What was the rationale? I couldn't find many details in the git notes, unless I missed them or don't know where to look.

4) Perhaps this is a silly question, but is OpenGL ES used solely for the mobile ports, or are you somehow using it for the desktop ports as well, instead of plain OpenGL? I noticed ES versions in the shaders.

Thank you for your time.

[edit] I've made progress. The functions seem to be working. However, I'm having trouble figuring out how the shaders and vertices are set up.

The main function that does the drawing is VertexManager::Draw, correct? And each time it's called, it expects to draw a strip of triangles. Is this function called per object? Per frame? Is it entirely dependent on the game code?

How are the shaders set up? It seems like you've got ones for texture conversion, rendering, and more. What are the ones that are primarily active during a game?

Additionally, could you give me some general info about what's going on in ProgramShaderCache::SetShader?
I'll let HdkR reply to these questions in more detail (he is the author of GLExtensions), but from my understanding:

1) GLExtensions is a way to do the GL function lookup at runtime instead of at compile time. The main point is to be able to generate a single binary that works on both GL and GL ES. This means we do not link to libGL at compile time; instead, we look up the symbols from libGL at runtime (see the sketch after this list).

2) As explained, no GL linking at compile time.

3) If I remember correctly, GLEW has no support for ES, and even less support for ES and normal GL in the same binary.

4) I think we also use ES for desktop versions running on ARM (Chromebooks, for example), but that's a very marginal scenario. On desktop we almost exclusively use core GL, but we want to support ES as well.
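
To make 1) concrete, here is a minimal sketch of what runtime lookup looks like, assuming a GLX build; the pointer type and function names are illustrative, not the actual GLExtensions code:

#include <GL/glx.h>

// One function-pointer type per GL entry point replaces the usual
// compile-time prototypes from libGL.
typedef void (*PFN_glPatchParameteri)(GLenum pname, GLint value);
static PFN_glPatchParameteri s_glPatchParameteri = nullptr;

static bool LoadGLFunctions()
{
    // An EGL build would call eglGetProcAddress here instead.
    s_glPatchParameteri = reinterpret_cast<PFN_glPatchParameteri>(
        glXGetProcAddress(reinterpret_cast<const GLubyte*>("glPatchParameteri")));

    // A null pointer means the driver doesn't expose this function
    // (e.g. GL < 4.0), so the backend can fall back gracefully.
    return s_glPatchParameteri != nullptr;
}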

Now, for the more general questions:

5) We accumulate vertices/primitives for as long as we can. Then, when a change to the emulated GPU pipeline requires a change to the host GPU pipeline (GL settings, shader, etc.), we flush all the vertices/primitives we have accumulated to the GPU with the current pipeline state, and then perform the pipeline state change (see the first sketch after this list). So it is not predictable how many vertices/triangles will be sent per batch.

6) The most important shaders are the ones generated by VertexShaderGen.cpp and PixelShaderGen.cpp. These files contain functions that generate shaders based on the current emulated GPU pipeline state. The shaders are generated at runtime and switched when the pipeline state changes (second sketch below).
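
To make 5) concrete, here is a hedged sketch of the accumulate-then-flush pattern; the class and method names are made up, and Dolphin's actual VertexManager differs:

#include <cstddef>
#include <vector>

struct PipelineState { /* blend mode, shaders, bound textures, ... */ };

class VertexBatcher
{
public:
    void AddVertices(const float* data, std::size_t count)
    {
        m_pending.insert(m_pending.end(), data, data + count);
    }

    // Called when the emulated GPU state changes in a way the host
    // pipeline must mirror: draw everything queued under the old state,
    // then apply the new one. Batch sizes are therefore game-dependent.
    void OnStateChange(const PipelineState& new_state)
    {
        Flush();
        m_state = new_state;
    }

    void Flush()
    {
        if (m_pending.empty())
            return;
        // ...upload m_pending and issue the draw call here...
        m_pending.clear();
    }

private:
    std::vector<float> m_pending;  // vertex data accumulated so far
    PipelineState m_state{};
};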
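
And for 6), a toy illustration of state-keyed shader generation; the struct fields are invented, and the real generators cover far more state (TEV stages, lighting, fog, etc.):

#include <sstream>
#include <string>

struct EmuPixelState
{
    bool alpha_test;
    int num_textures;
};

std::string GeneratePixelShader(const EmuPixelState& s)
{
    std::ostringstream out;
    out << "#version 130\n"
        << "out vec4 ocol0;\n";
    for (int i = 0; i < s.num_textures; ++i)
        out << "uniform sampler2D samp" << i << ";\n";
    out << "void main() {\n"
        << "  vec4 c = vec4(1.0);\n";
    // Each enabled feature changes the emitted source, so every distinct
    // pipeline state yields (and caches) a distinct program.
    if (s.alpha_test)
        out << "  if (c.a < 0.5) discard;\n";
    out << "  ocol0 = c;\n"
        << "}\n";
    return out.str();
}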

If you have further questions, you can also join us on #dolphin-dev @ freenode, it might make things quicker.
That's right, I dropped GLEW in favour of our own solution due to the limitations that GLEW has.
We don't link against libGL at all, we link against GLX/EGL/etc and then pull in every single OpenGL function manually via the provided GLX/EGL/etc methods.

OpenGL ES can be used with both Nvidia and Intel GPUs in Linux if built with EGL.
Also notable: GLEW doesn't support GL Core profiles, which we use on OS X.

Katastic Voyage

Are shaders being constructed outside of CompileShader() somehow? I modified it to add a tessellation stage, but I've been scratching my head trying to figure out why the shaders compile and link fine yet affect nothing.

That is, until I output the values every time SHADER::Bind was called and found that 200-250 shader programs are created before CompileShader is even touched, and the only one used in Mario Kart: Double Dash (my test game) is the font rasterizer.

glUseProgram(glprogid);

is being called with glprogid values 1 through 201 before the first CompileShader is ever hit.

What's going on there? I tried to find another place producing shaders but I can't find anything calling glAttachShader or glLinkProgram.

I need to figure out where the main drawing programs are set up so I can latch onto them and enable GL_PATCHES instead of GL_TRIANGLES/GL_TRIANGLE_STRIP. So far, running with GL_PATCHES draws nothing but GL_POINTS, which produces an interesting effect...

http://i.imgur.com/tnhx6CN.png
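
For what it's worth, GL_PATCHES only produces real geometry once a patch size is set and a program with tessellation stages is bound. A sketch of the extra calls, assuming a GL loader has already resolved the 4.0 entry points; names here are illustrative, not Dolphin's:

#include <GL/gl.h>

// 'program' must contain TCS/TES stages for GL_PATCHES to draw anything.
void DrawPatches(GLuint program, GLint first, GLsizei vertex_count)
{
    glPatchParameteri(GL_PATCH_VERTICES, 3);        // 3 control points per patch
    glUseProgram(program);
    glDrawArrays(GL_PATCHES, first, vertex_count);  // in place of GL_TRIANGLES
}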
ProgramShaderCache::CompileShader() is the only place where the OpenGL backend compiles its shaders.
You might want to check that you aren't hitting shader cache issues.
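
If it is the cache, here's a hedged sketch of how a uid-keyed program cache backed by a binary disk cache can bind programs without ever compiling; this is illustrative, not the exact ProgramShaderCache code:

#include <cstddef>
#include <map>
#include <vector>
#include <GL/gl.h>  // assumes GL 4.1 / ARB_get_program_binary is loaded

struct ShaderUid
{
    unsigned hash;
    bool operator<(const ShaderUid& o) const { return hash < o.hash; }
};

static std::map<ShaderUid, GLuint> s_programs;

GLuint GetProgram(const ShaderUid& uid, GLenum binary_format,
                  const std::vector<char>& disk_binary)
{
    auto it = s_programs.find(uid);
    if (it != s_programs.end())
        return it->second;  // already live; nothing is compiled

    // A program restored with glProgramBinary never passes through
    // glCompileShader/glAttachShader/glLinkProgram, which is why a
    // modified CompileShader() can be bypassed entirely.
    GLuint prog = glCreateProgram();
    glProgramBinary(prog, binary_format, disk_binary.data(),
                    static_cast<GLsizei>(disk_binary.size()));
    s_programs[uid] = prog;
    return prog;
}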

Katastic Voyage

Okay, so I've got a tessellation stage working in Dolphin. Textures, normals, everything displays fine. But I have to apply a projection to implement the PN-triangles algorithm, to give my new vertices "depth" from their original mesh.

Problem is, Dolphin seems to be doing something strange with the w component of the vertices/normals. If I chop it off and put it back as an identity value, I lose essentially all drawing that's not a 2-D overlay.

And if I keep those vec4s, my projection function:

vec4 ProjectToPlane(vec4 Point, vec4 PlanePoint, vec4 PlaneNormal)
{
    // Vector from a known point on the plane to the point being projected.
    vec4 v = Point - PlanePoint;
    // Signed distance along the plane normal. Note: with vec4s, the dot
    // product also multiplies the w components, so this differs from the
    // vec3 original whenever w != 0.
    float Len = dot(v, PlaneNormal);
    // Remove the component along the normal to land on the plane.
    vec4 d = Len * PlaneNormal;
    return Point - d;
}

Seems to do jack all, even though it's supposed to work fine: it's straight from an online example and lines up exactly with the algorithm. (Except that I changed the vec3s to vec4s.)

Also, what would be the easiest way to get access to the current shader program id from within frame.cpp? I'd like to add a key to change the current tessellation level in real time, so I tried to model it on Free Look, but I couldn't figure out how to get the namespaces/variable names to compile.
I think the 4th component of the normal should be stripped completely, as it's redundant with the clippos.z value:
https://github.com/dolphin-emu/dolphin/b...n.cpp#L390

I don't think it's worth changing the tessellation shader within a frame, so it's enough to look up such values in the shader uids. For OGL, the easiest way is to add the tessellation config to the program uid, so it will be inlined in our ProgramShaderCache (see the sketch below).
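
A hypothetical sketch of what that could look like; the field names are invented:

// Folding a tessellation setting into the program uid means the cache
// key changes whenever the level changes, so a hotkey that bumps the
// level selects a different cached program at the next pipeline change.
struct ProgramUid
{
    unsigned vertex_uid;
    unsigned pixel_uid;
    int tess_level;  // new field

    bool operator<(const ProgramUid& o) const
    {
        if (vertex_uid != o.vertex_uid) return vertex_uid < o.vertex_uid;
        if (pixel_uid != o.pixel_uid) return pixel_uid < o.pixel_uid;
        return tess_level < o.tess_level;
    }
};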