(03-01-2014, 05:42 PM)MaJoR Wrote: Nvidia cards are actually just fine in D3D. It's just Nvidia OpenGL that hates integers so much.

I just spent three hours trying to figure this out. Turns out NVIDIA provides a large number of powerful tools, all of which suck even at the simplest tasks, like showing shader compiler output.
*pokes neobrain to explain it*
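For anyone else who just wants the compiler output: you can pull it straight from the GL API, no vendor tooling needed. A minimal sketch (assumes a current GL context and a function loader like glad or GLEW already initialized; the function name is just for illustration):

[code]
// Minimal sketch: dump a shader's compile log straight from the GL API.
// Assumes a current GL context and a loader (glad/GLEW) already set up.
#include <cstdio>
#include <vector>

void PrintShaderLog(GLuint shader)
{
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);

    GLint length = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &length);
    if (length > 1)  // nonempty log; NVIDIA emits warnings even on success
    {
        std::vector<GLchar> log(length);
        glGetShaderInfoLog(shader, length, nullptr, log.data());
        std::fprintf(stderr, "Shader %u %s:\n%s\n", shader,
                     status == GL_TRUE ? "compiled with warnings"
                                       : "failed to compile",
                     log.data());
    }
}
[/code]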