Got a reply from my NVIDIA contact about the issue that OpenGL seems to be hit harder than D3D performance-wise:
Quote: Our GPUs are notoriously "not great" at integer math. [...] It's also possible that the D3D driver (which uses much more aggressive settings in the compiler and continues to optimize in the background when GPU limited) figured out that you are doing math that could be mostly done in float and converted back just in time.
