Flattening faster (and instanced animated grass)

Not sure that it would work, but try using the following version directive:
#version 140 compatibility

Looks like the compatibility profile is for 150+

How about this?

//GLSL
#version 150 compatibility

uniform float time;
uniform vec4 anim_data[50];

varying vec4 diffuse, ambient;
varying vec3 normal, halfVector;

void main() {
    float animation = (sin(time + float(gl_InstanceID))
                       * sin(3.0 * time + float(gl_InstanceID))
                       * anim_data[gl_InstanceID].z) * (gl_Color.r - 0.5);
    vec4 v = vec4(gl_Vertex);
    v.x += anim_data[gl_InstanceID].x + animation;
    v.y += anim_data[gl_InstanceID].y + animation;

    gl_Position = gl_ModelViewProjectionMatrix * v;
    gl_TexCoord[0] = gl_MultiTexCoord0;

    normal = normalize(gl_NormalMatrix * gl_Normal);
    halfVector = gl_LightSource[0].halfVector.xyz;
    diffuse = gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse;
    ambient = gl_FrontMaterial.ambient * gl_LightSource[0].ambient;
    ambient += gl_LightModel.ambient * gl_FrontMaterial.ambient;
}
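For reference, the animation term in that vertex shader is just simple math; here's the same formula as plain Python (a sketch only — `time`, `instance_id`, `amp` and `color_r` stand in for the shader's `time`, `gl_InstanceID`, `anim_data[i].z` and `gl_Color.r`):

```python
import math

def wind_offset(time, instance_id, amp, color_r):
    """Mirror of the vertex shader's animation term:
    sin(t + i) * sin(3t + i) * amp, scaled by (color.r - 0.5),
    so vertices painted with r = 0.5 never move at all."""
    return (math.sin(time + instance_id)
            * math.sin(3.0 * time + instance_id)
            * amp) * (color_r - 0.5)
```

Since the term scales with `(gl_Color.r - 0.5)`, vertices painted with r = 0.5 (the blade roots) stay pinned, and the sway amplitude for r = 1.0 tops out at half of `anim_data[i].z`.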

The fragment shader (grass_f.glsl) should stay as it is (there’s just texture + directional lighting there).

If your card doesn’t have GL_EXT_draw_instanced but it does have GL_ARB_draw_instanced, why don’t you try “#extension GL_ARB_draw_instanced : enable”? Note the GL_ prefix, which is required for it to work.

Since the shader uses gl_ prefixed uniforms, you have to use the old version of GLSL with extensions, not the more recent versions, which remove most of them. gl_LightSource, gl_Normal, etc. do not exist in version 150, so that shader is not guaranteed to compile. (It could very well be that ATI’s GLSL compiler is more lenient than NVIDIA’s.)

I must have the weirdest graphics card in the world. Whatever extension or version I put in there, it still runs fine, just pumping warnings at me in the worst case!

That’s good - because it works, and that’s bad - because I don’t know if it will work elsewhere :confused:

Still no luck with any of those suggestions; your new shader still produces:

:display:gsg:glgsg(error): An error occurred while compiling shader!
0(32) : error C7532: global function texture requires "#version 130" or later

I think the problem is that the Sapphire on my old system is just too old to run it and the GTX on my new system is too new to run it. I think you’re using language features that have been deprecated for a while, which it just won’t accept; I’m going to look into this a bit and see if I can update the shader to a version my card will take.

My own Cg shaders work correctly with the GTX, but I had to set “basic-shaders-only” to true before they did anything but produce garble.

I’ve looked into this and I’m fairly certain it fails to compile because gl_LightSource, gl_FrontMaterial, etc. are no longer supported in 150 and up. If you could write a shader that takes your light source values and such as uniform inputs from the python script then I think it would work with newer cards. You also have to rename the “gl_” prefixes on your variables to “in_” or “ex_” as required. I’m just not familiar enough with GLSL, nor your script, to make the attempt, but I’ll keep testing anything you come up with.
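A sketch of what that CPU-side precomputation could look like (names and signature hypothetical — each argument is an (r, g, b, a) tuple, and the results would then be passed with setShaderInput as plain uniforms):

```python
def premultiply_lighting(mat_diffuse, mat_ambient,
                         light_diffuse, light_ambient, model_ambient):
    """Fold the material and light colors together on the CPU,
    mirroring what the shader does with the fixed-function state:
      diffuse = gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse
      ambient = gl_FrontMaterial.ambient * (gl_LightSource[0].ambient
                                            + gl_LightModel.ambient)
    The products are componentwise, as in GLSL vec4 multiplication."""
    diffuse = tuple(m * l for m, l in zip(mat_diffuse, light_diffuse))
    ambient = tuple(m * (la + ga) for m, la, ga
                    in zip(mat_ambient, light_ambient, model_ambient))
    return diffuse, ambient
```

The shader would then declare `uniform vec4 diffuse_in, ambient_in;` instead of touching gl_FrontMaterial or gl_LightSource at all.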

Panda3D provides replacements for some of the gl_ variables. I posted the ones I know of in the shader input section of the wiki.
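For what it's worth, a throwaway porting helper built from the replacements I remember from that wiki page (the exact p3d_ names are an assumption here — verify them against the wiki before relying on this):

```python
import re

# Deprecated gl_ builtins -> Panda3D-provided shader inputs.
# Names assumed from the wiki's shader-input page; double-check them.
GL_TO_P3D = {
    "gl_Vertex": "p3d_Vertex",
    "gl_Normal": "p3d_Normal",
    "gl_Color": "p3d_Color",
    "gl_MultiTexCoord0": "p3d_MultiTexCoord0",
    "gl_ModelViewProjectionMatrix": "p3d_ModelViewProjectionMatrix",
    "gl_NormalMatrix": "p3d_NormalMatrix",
}

def port_identifiers(src):
    """Rewrite the deprecated builtins this shader uses to their p3d_
    equivalents.  gl_LightSource / gl_FrontMaterial have no entry in
    the table and must become ordinary uniforms fed from Python."""
    for old, new in GL_TO_P3D.items():
        # \b keeps gl_Normal from matching inside gl_NormalMatrix
        src = re.sub(r'\b%s\b' % old, new, src)
    return src
```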

No, it has nothing to do with the gl_ prefix variables, which are available and valid in GLSL 120.

The error means exactly what it says. You reference a texture() function in the fragment shader which does not exist in GLSL 120.
It should instead be:

    gl_FragColor = color * texture2D(p3d_Texture0, vec2(gl_TexCoord[0]));

The problem with the shader is that you’re trying to use features from different GLSL versions at the same time. You have to choose either an old GLSL version, sticking with the gl_ inputs, or a new GLSL version, ditching all gl_ uniforms and replacing them with the appropriate shader inputs.

@rdb: I was trying to update his shader to version 150; that’s why I had to change all the gl_ prefixes. I had it compiling to the point where references to gl_LightSource, etc. in the vertex shader were what was causing problems; the texture error was from just running his original shader. But it turns out updating it isn’t necessary, since I’ve verified that my card supports back to 120.

Anyways, making the change to texture2D produces a new set of errors, though I think these are something on Wezu’s end:

:display:gsg:glgsg(error): Active attribute gl_NormalMatrix is bound to location -1
:display:gsg:glgsg(error): Active attribute gl_NormalMatrix is bound to location -1
:display:gsg:glgsg(error): Active attribute gl_NormalMatrix is bound to location -1
:display:gsg:glgsg(error): Active attribute gl_NormalMatrix is bound to location -1
:display:gsg:glgsg(error): Active attribute gl_NormalMatrix is bound to location -1
Assertion failed: Shader input anim_data[0] is not present.
 at line 523 of c:\buildslave\release_sdk_win32\build\panda3d\panda\src\pgraph\shaderAttrib.cxx
Traceback (most recent call last):
  File "C:\Panda3D-1.8.1\direct\showbase\ShowBase.py", line 1846, in __igLoop
    self.graphicsEngine.renderFrame()
AssertionError: Shader input anim_data[0] is not present.
 at line 523 of c:\buildslave\release_sdk_win32\build\panda3d\panda\src\pgraph\shaderAttrib.cxx
:task(error): Exception occurred in PythonTask igLoop
Traceback (most recent call last):
  File "gl_grass.py", line 80, in <module>
    run()
  File "C:\Panda3D-1.8.1\direct\showbase\ShowBase.py", line 2921, in run
    self.taskMgr.run()
  File "C:\Panda3D-1.8.1\direct\task\Task.py", line 502, in run
    self.step()
  File "C:\Panda3D-1.8.1\direct\task\Task.py", line 460, in step
    self.mgr.poll()
  File "C:\Panda3D-1.8.1\direct\showbase\ShowBase.py", line 1846, in __igLoop
    self.graphicsEngine.renderFrame()
AssertionError: Shader input anim_data[0] is not present.
 at line 523 of c:\buildslave\release_sdk_win32\build\panda3d\panda\src\pgraph\shaderAttrib.cxx

The gl_NormalMatrix error can be safely ignored. (It’s supposed to be a debug message, but by accident it was classified as an error message.)

The error about anim_data missing is just an issue in the Python code.

In the python script it looks like this:

anim_data = PTA_LVecBase4f()
for i in xrange(50):
    anim_data.pushBack(UnalignedLVecBase4f(uniform(-1.0, 1.0), uniform(-1.0, 1.0), uniform(-1.0, 1.0), 1.0))
render.setShaderInput('anim_data', anim_data)

Can’t say what’s wrong with that.
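One way to rule out the data itself: generate the same 50 vectors with plain tuples and sanity-check them before they ever touch PTA_LVecBase4f (a pure-Python stand-in; `uniform` here is `random.uniform`, and `range` replaces the script's Python-2 `xrange`):

```python
from random import uniform

def make_anim_data(count=50):
    """Same values the script pushes into PTA_LVecBase4f:
    x/y position offsets and a z sway amplitude in [-1, 1],
    with w fixed at 1.0."""
    return [(uniform(-1.0, 1.0), uniform(-1.0, 1.0),
             uniform(-1.0, 1.0), 1.0) for _ in range(count)]
```

If a list like this checks out, the remaining suspects are the uniform declaration in the shader (it must really be `uniform vec4 anim_data[50];` and actually be used, or the compiler optimizes it away) and which node the input is set on.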

Maybe I’ll just rewrite the shaders and skip the lighting; then we’ll know whether my instancing/animation code is to blame or the other, unimportant bits.