Forums » Bugs
OpenGL4 Dynamic Lights and Shadows not working
After the 1.8.328 update on Mac OS X, the entire dynamic lights and shadows system is no longer working. iMac Late 2013, NVIDIA GeForce GT 750M / Intel Iris Pro, OS X 10.10.1.
Same for a 2009 MBP on OS X 10.10.2, on both the integrated NVIDIA GeForce 9400M and the discrete NVIDIA GeForce 9600M GT.
Same on Linux. http://www.vendetta-online.com/x/msgboard/1/29844#357326
Radeon HD6870 w/Mesa 10.6-devel Kernel 3.19
Thanks. I found the problem with the Mac version but I don't see any rendering problems on Linux.
Hm, I wonder if it has something to do with you using an ATI and me using nvidia for Linux.
I'll refer you to a fun quote.
<classibot> Fuck ATI.
Maybe it's drivers, maybe hardware differences, but whatever it is didn't happen until this patch. Did you check on momerath42's Intel?
I didn't change kernels or Mesa this time, just updated the game and all shaders stopped working. If you need any extra debug info, let me know.
opengl4info.log
I can confirm that everything is fine with the Nvidia under Linux also.
The shaders are now using the dFdx and dFdy functions, and the NVIDIA drivers expose them, but the ATI drivers don't because they strictly enforce the GLSL #version limitations. NVIDIA is more lax about the restriction.
abortretryfail, do you know if your video card supports GLSL 1.5?
It's supposed to. Here's the output of glxinfo.
Thanks. Ug. It's only supported in the 3.3 core profile though.
Vendetta is not using the 3.3 core profile, it's using the 3.0 profile which says it only supports GLSL 1.3.
Blah, I hate opengl sometimes.
lol, it seems 1.3 is the latest version of GLSL that existed when GL 3.0 became a thing.
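For anyone following along, the GL-to-GLSL pairing looks roughly like this. (A quick sketch from the published GL specs; the table and helper function are just for illustration, not anything from the game's code.)

```python
# Sketch only: which GLSL #version each desktop GL version introduced.
# Before GL 3.3, GLSL had its own numbering; from 3.3 on they march in step.
GL_TO_GLSL = {
    (2, 0): "1.10",
    (2, 1): "1.20",
    (3, 0): "1.30",  # the profile Vendetta requests -> no GLSL 1.50
    (3, 1): "1.40",
    (3, 2): "1.50",
    (3, 3): "3.30",  # version numbers synchronized from here on
}

def max_glsl_for(gl_version):
    """Return the newest GLSL #version a strict driver accepts
    for a context of the given GL version."""
    return GL_TO_GLSL[gl_version]
```

So a strict driver handed a GL 3.0 context is entitled to reject anything newer than `#version 130`, which is exactly the trap here.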
I thought you guys were really just using GLES 2.0 (via GL_ARB_ES2_compatibility), which doesn't have dFdx/dFdy anyway.
We were, but this latest version uses the GL_OES_standard_derivatives extension for the slope-based depth biasing, and that extension doesn't exist on PC platforms. The functions themselves exist just fine in GLSL 1.3 and above, though.
The NVIDIA GL driver happily let the shader use those functions, but the ATI and OS X drivers said no, because those functions don't exist in base #version 100.
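One way to think about the fix: the shader preamble has to match the API you're compiling for. The sketch below (hypothetical helper, not the game's actual code) shows the two legal ways to get dFdx/dFdy past a strict compiler.

```python
def shader_preamble(is_gles, needs_derivatives):
    """Build a GLSL preamble (illustration only, not the game's real code).
    On GLES 2.0 (#version 100), dFdx/dFdy only exist behind the
    GL_OES_standard_derivatives extension; on desktop GL, declaring
    #version 130 makes them part of the core language."""
    if is_gles:
        lines = ["#version 100"]
        if needs_derivatives:
            # Without this line, strict drivers (ATI, OS X) reject
            # dFdx/dFdy; NVIDIA's driver lets it slide regardless.
            lines.append("#extension GL_OES_standard_derivatives : enable")
    else:
        # GLSL 1.30 (paired with GL 3.0) includes the derivative
        # functions in core, so no extension directive is needed.
        lines = ["#version 130"]
    return "\n".join(lines) + "\n"
```

The NVIDIA-only symptom falls out of this: omitting both the version bump and the extension directive happens to work on the lax driver and fails everywhere else.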
Heh, it sounds like you're going to love the low-level (in)compatibility of vendor drivers that were all supposedly released according to the same predefined standards.
Why not ask for the 3.3 core profile?