Failed to compile pixel shader (linux - works fine in windows) [SOLVED]

02-12-2013, 10:02 AM (This post was last modified: 02-14-2013, 11:10 AM by raidzero.)
#1
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
Hello, I just built Dolphin 3.5-397 from Git, since my package manager does not have Dolphin in it. I installed GCC 4.6.2 alongside my standard 4.5.3 in order to build it.

This is on Gentoo Linux x64 kernel version 3.5.4.

core i7 930 @ 4ghz
24gb ddr3-1600 g-skill ram
radeon hd 5850 1gb (which runs dolphin fine in windows)

When I try to launch any game (I tried Tetris Worlds and Metroid Prime on GameCube, and Metroid Prime Trilogy on Wii), I see a "failed to compile pixel shader" error.

Apparently using code tags smashes newlines, so I used pastebin to paste my error message and file contents: http://pastebin.com/UXNS0WKs

EDIT: added console output: http://pastebin.com/HSFVeQ6e

Any ideas? THANKS
02-12-2013, 01:34 PM (This post was last modified: 02-12-2013, 01:34 PM by Shonumi.)
#2
Shonumi Offline
Linux User/Tester
Administrators
Posts: 6,502
Threads: 55
Joined: Dec 2011
Some games don't like having "Per-Pixel Lighting" enabled in the latest revisions. Check to make sure it's turned off for the games you mentioned.

If you still get issues, try to provide your settings for both Linux and Windows (as screenshots preferably).
02-13-2013, 03:30 AM
#3
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
Shonumi, I am at work today so I cannot get screenshots, but I did SSH to my system and checked the ini files for both Windows and Linux. I found this parameter:

EnablePixelLighting

Linux:
gfx_opengl.ini: EnablePixelLighting = False

Windows:
gfx_dx11.ini: EnablePixelLighting = False
gfx_dx9.ini: EnablePixelLighting = False

If this is not the parameter corresponding to "Per-Pixel Lighting", sorry. I will post screenshots when I can.

By the way, I probably should have included this in the OP; I am running fglrx 12.6.
02-13-2013, 06:09 AM
#4
Shonumi Offline
Linux User/Tester
Administrators
Posts: 6,502
Threads: 55
Joined: Dec 2011
Do the latest versions of fglrx fare any better? What about open-source Radeon drivers? Most of the time, your drivers should not be the issue, but we should at least eliminate them as possible problems. Though you're about the third Gentoo user I've seen running into graphical troubles with an AMD GPU.

Has Dolphin ever worked for you before? If so, what was the last revision you tried?
02-13-2013, 06:26 AM (This post was last modified: 02-13-2013, 06:28 AM by neobrain.)
#5
neobrain Offline
"Wow, I made my code 1000x faster! That means I can make it 2048x slower now!"
Developers (Some Administrators and Super Moderators)
Posts: 3,208
Threads: 50
Joined: Jun 2009
Something is clearly going wrong here on our side. Someone should probably check where the instruction limit is being set to zero, because we actually force it to 4096 when the driver reports it as zero.

@ OP: In Plugins/Plugin_VideoOGL/Src/PixelShaderCache.cpp, replace the line at http://code.google.com/p/dolphin-emu/source/browse/Source/Plugins/Plugin_VideoOGL/Src/PixelShaderCache.cpp#255 with sprintf(stropt, "MaxLocalParams=224");
My blog
Me on Twitter
My wishlist on Amazon.de
02-13-2013, 06:55 AM
#6
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
Shonumi, yes, Dolphin 3.0 (not sure what revision) worked on my system, though it ran really, really badly. I'll update fglrx if it comes down to it.

neobrain, thanks for the stropt suggestion; I applied it and rebuilt. I'll have to see if it works when I get home, since I can't run 3D-accelerated things over SSH X forwarding ;)
02-13-2013, 09:25 AM
#7
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
I tried neobrain's suggestion, but it didn't make a difference. Error message and file contents: http://pastebin.com/MLgQ9qUd
02-14-2013, 03:21 AM
#8
neobrain Offline
"Wow, I made my code 1000x faster! That means I can make it 2048x slower now!"
Developers (Some Administrators and Super Moderators)
Posts: 3,208
Threads: 50
Joined: Jun 2009
Try replacing with sprintf(stropt, "MaxLocalParams=224,NumInstructionSlots=4096"); then?
02-14-2013, 03:57 AM (This post was last modified: 02-14-2013, 04:18 AM by raidzero.)
#9
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
(02-14-2013, 03:21 AM)neobrain Wrote: Try replacing with sprintf(stropt, "MaxLocalParams=224,NumInstructionSlots=4096"); then?
:( No luck. I also get a "failed to compile vertex shader" message that I have been forgetting to mention. Here are both error messages and both dump txt files:
http://pastebin.com/xqQ9iC32

EDIT: I adjusted the stropt sprintf line in VertexShaderCache.cpp accordingly but it made no difference:

Source/Plugins/Plugin_VideoOGL/Src/VertexShaderCache.cpp: sprintf(stropt, "MaxLocalParams=256,NumInstructionSlots=4096");
Source/Plugins/Plugin_VideoOGL/Src/PixelShaderCache.cpp: sprintf(stropt, "MaxLocalParams=224,NumInstructionSlots=4096");

EDIT 2: I had some doubt about whether the dolphin-emu executable was actually being replaced with the modified version, so I made a minor change (4096 to 4095 in the pixel shader stropt line), rebuilt, and confirmed that the checksum of dolphin-emu had indeed changed.

I found this thread:
http://forums.dolphin-emu.org/archive/index.php?thread-8606.html

and it solved my issue; games now load :)

The workaround is simply to not call cgGLSetOptimalOptions() in Render.cpp on ATI hardware.
02-14-2013, 05:02 AM
#10
raidzero Offline
Junior Member
Posts: 9
Threads: 1
Joined: Feb 2013
I'm not a big fan of just commenting stuff out like that; it worked in my case, but it could break other setups... I have Cg version 2.1 on my system, and in Render.cpp there is an #if that checks whether CG_VERSION_NUM is 2100 (which it should be, for me), but CG_VERSION_NUM is #defined to be 3000 in cg.h. Shouldn't CMake have caught that I have an insufficient version of the NVIDIA Cg toolkit if it's just expected to be 3.0? I don't know whether CMake checks things like a typical configure script or just blindly creates makefiles.