Good news!
Initial code review brought up a major issue, which has since been fixed. Indeed, the reported glitches in the NBA games and in One Piece are gone now.
Would be great if anyone who reported issues could double-check whether their issue is fixed using the updated build in the first post. No 32-bit build is planned for now, but if you need one for testing I'll get you one.
EDIT: Added a 32-bit build.
Lol we should figure out why AMD cards take much less of a hit than NVIDIA cards.
Nvidia cards are actually just fine in D3D. It's just Nvidia OpenGL that hates integers so much.
*pokes neobrain to explain it*
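For context, "integers" here means integer math inside the generated shaders. A rough, made-up sketch of the kind of GLSL involved (GameCube-style 8-bit fixed-point color math; the names and constants are illustrative only, not Dolphin's actual generated code):

Code:
// Hypothetical sketch (not Dolphin's actual generated code): the kind of
// integer-heavy GLSL an emulator emits for GameCube-style 8-bit color math.
const char* frag_src = R"glsl(
    #version 330 core
    uniform isampler2D samp0;    // integer texture, fetched as ivec4
    out vec4 ocol0;
    void main()
    {
        ivec4 a = texelFetch(samp0, ivec2(gl_FragCoord.xy), 0);
        ivec4 b = ivec4(128, 64, 32, 255);
        // multiply, shift back down, clamp to 8 bits: all integer ALU work
        ivec4 c = clamp((a * b) >> 7, ivec4(0), ivec4(255));
        ocol0 = vec4(c) / 255.0;  // convert to float only at the very end
    }
)glsl";

The same kind of source compiles on AMD's GL driver too; the observation in this thread is that Nvidia's GL shader compiler copes with these integer ops far worse than its own D3D counterpart does.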
(03-01-2014, 05:42 PM)MaJoR Wrote: Nvidia cards are actually just fine in D3D. It's just Nvidia OpenGL that hates integers so much.
*pokes neobrain to explain it*
I just spent three hours trying to figure this out. Turns out NVIDIA provides a large number of powerful tools, all of which suck even at the simplest tasks, like showing shader compiler output.
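For what it's worth, you don't need any vendor tooling just to see the compiler output; plain GL calls already expose it. A minimal sketch (assumes a live GL context and a loader for the GL 2.0+ entry points; CompileWithLog is a made-up helper name, not anything from Dolphin):

Code:
// Minimal sketch: dump the driver's shader compiler output with plain GL
// calls. Assumes a current GL context; CompileWithLog is a made-up name.
#include <GL/glew.h>  // or glad, etc., for the GL 2.0+ entry points
#include <cstdio>
#include <vector>

GLuint CompileWithLog(GLenum type, const char* src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);

    GLint status = GL_FALSE;
    GLint log_len = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &log_len);
    if (log_len > 1)  // drivers often emit warnings here even on success
    {
        std::vector<char> log(log_len);
        glGetShaderInfoLog(shader, log_len, nullptr, log.data());
        std::fprintf(stderr, "shader compiler output:\n%s\n", log.data());
    }
    if (status != GL_TRUE)
    {
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}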
MaJoR Wrote: Nvidia cards are actually just fine in D3D. It's just Nvidia OpenGL that hates integers so much.
Are you sure about this? It was my understanding that Nvidia cards have poor integer performance in both, since their ALUs are optimized to devote most of the die space to FP math.
D3D is handling it far, far better than OpenGL. I can't explain why, but that's exactly what's been found. AMD/ATi cards don't seem to care about this merge whatsoever.
If what NV said is true then shouldn't Maxwell fix that or am I remembering something else it fixed?
DatKid20 Wrote: If what NV said is true then shouldn't Maxwell fix that or am I remembering something else it fixed?
It's not. The issue with OpenGL and integers is apparently at the driver level, and that would likely carry over regardless of the architecture. Again, I'll point to neobrain to explain.
*pokes neobrain*