GLSL 120, 130, 150...400?

I’m trying to teach myself some GLSL, but I really don’t know which version I should learn. From looking at some source code I can tell that 120 looks nothing like 400, and from googling a bit I know that some cards and/or systems have problems supporting even version 130.

My target hardware is years rather than months old, and I’m not aiming at very complex effects (instancing and geometry shaders would be nice, but I can live without them). What should I choose?

Is sticking to the 120 version a bad idea?
Is Panda any help here on later versions, providing inputs that were previously provided by OpenGL (gl_MatrixWhatEver)?

Panda’s GLSL support is still experimental, but it should have some support for the more recent versions by providing its own inputs (such as p3d_MatrixWhatEver), though it’s not complete.
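To illustrate, a minimal vertex shader using Panda’s own inputs might look something like the sketch below. The p3d_ names follow Panda’s convention for mirroring the old gl_* built-ins; which inputs are actually available depends on your Panda version, so check the manual:

```glsl
#version 130

// Supplied by Panda3D in place of the removed gl_* built-ins
uniform mat4 p3d_ModelViewProjectionMatrix;
in vec4 p3d_Vertex;

void main() {
    gl_Position = p3d_ModelViewProjectionMatrix * p3d_Vertex;
}
```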

If you want to be sure that it’ll work, feel free to rely on the older versions; you can still use geometry shaders and instancing if you wish by enabling the appropriate extensions in your GLSL shader. Some drivers may even accept newer shader versions on older cards.

If you want broad support, 120 is a good choice. For instancing you will need to move up to 140. Even then a lot of AMD cards seem to struggle with displaying instancing properly. We spent a lot of time implementing it and had to ship our game with the setting disabled by default.

Hmm? GLSL 1.20 corresponds to OpenGL 2.1, which is what the ARB_draw_instanced specification was written against. Shouldn’t it work anyway as long as you have an appropriate “#extension” line?

It was promoted to core in 1.40 (GL 3.1), perhaps, but that doesn’t mean it’s not available in earlier versions as an extension. Correct me if I’m wrong.
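For what it’s worth, a #version 120 shader can opt into the extension like this sketch (note that the ARB extension exposes the instance index as gl_InstanceIDARB; the plain gl_InstanceID name only became a core built-in in 1.40 — the offsets array here is just a hypothetical example of per-instance data):

```glsl
#version 120
#extension GL_ARB_draw_instanced : require

// Hypothetical per-instance offsets, set from the application
uniform vec3 offsets[10];

void main() {
    // gl_InstanceIDARB is provided by the extension
    vec4 pos = gl_Vertex + vec4(offsets[gl_InstanceIDARB], 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}
```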

I suppose that is very possible! I always worried that some GPUs might not have support for certain extensions.

Instancing works with version 120 with ARB_draw_instanced (tested today, rendering a few million blades of grass). Most cards made after 2008 (Radeon 2xxx / GeForce 8xxx) should support this (mobile GPUs don’t do so well, and there are some ATI/AMD models from 2010 that support DX 10.1 but OpenGL only in version 2.0). It’s hard to find out which card supports which extension if you don’t have that card…

Anyway… are array inputs for GLSL supported now? I only found some old posts about this.

teedee was generous enough to provide a patch a while ago to add support for array inputs under GLSL. Looking at the logs, it even made it into 1.8.0.

Basically in your game code you could do something like:

from panda3d.core import PTA_LVecBase4f, VBase4

shader_data = PTA_LVecBase4f()
for i in xrange(10):
    shader_data.pushBack(VBase4(0, 0, 0, 0))  # fill with your per-instance data
node.setShaderInput('shader_data', shader_data)

In the shader, access it with:

uniform vec4 shader_data[10];
void main() {
    vec4 my_data = shader_data[gl_InstanceID];
}