I’m not a shader programmer, but if this can be cleaned up to work properly with Panda, that will be very nice, considering there is currently nothing like a free GLSL editor. That has always seemed weird to me, as I think artists should be able to create the shading of their own artwork.
It looks like the shader defines a lot of unnecessary functions, though. For instance, math_cosine just calls the built-in function cos(), and math_subtract just returns the difference of its parameters. This is not a big deal, though, as the GLSL compiler will surely inline and optimise these out.
In any case, have you tried using this shader in Panda? Looking at it, it should work as-is; though it does define three uniforms, unf9, unf23 and unf37, which you need to pass using setShaderInput.
Judging by the shader, it seems that unf37 is the specular colour, unf23 is the light colour, and unf9 is the lamp vector.
This is a good point, and we’ve been discussing potential ways to improve this. Perhaps a future Panda release will feature a new shader system that does empower artists to do this.
The design hasn’t been worked out yet, but it’ll probably involve connecting nodes together by their inputs and outputs, similar to Blender’s node-based shader editor. This would enable non-programmers to create Panda shaders, since no programming knowledge would be required to design one.
In the long term, we may work towards supporting FX shaders and allow people to use tools like NVIDIA FX Composer 2, but that’s a different story.
Anyway, if Blender allows exporting this node tree, then perhaps we could also extend the design to allow people to import the Blender material nodes into Panda.
Regardless, your old shaders will of course continue to work.
BTW, I tried to use the shader above and got some strange behaviour. The shader only works some of the time.
Sometimes I get a fully red or yellow teapot with no lighting. Every time, I get the message “Vertex shader was successfully compiled to run on hardware.”
Panda 1.9.0 cvs20121130, Linux Mint 14 (cinnamon), ATI Radeon HD4850
I ran the code with the shaders in your example.zip file (which are, for the record, not the same as the ones you linked in the first post), and after configuring things more or less as they were set in the blend file:
Well, when I run the shaders from example.zip, I get the same result as rdb (if I set the variables in the shader file). But if I try to pass the variables via setShaderInput, I get the same strange behaviour as in the first case.
Maybe it’s an ATI “feature”? I’ve had problems before with Cg on ATI when passing too many variables into a shader, although this doesn’t happen on GeForce.