[SOLVED] 2 issues with 1.9 (Bullet / Shader Generator)

Dear all -

First, I wanted to say ‘thank you’ to all the 1.9 contributors, and especially to rdb! I assume it was a huge amount of work to reach this milestone, so thank you so much guys (and please keep going)!

I migrated this week from 1.8.1 to 1.9 (official Windows x64 build, under Win7 x64) and I am facing 2 strange issues:

  1. Issue #1: Bullet
    My code worked perfectly in 1.8.1, but it doesn’t anymore. In 1.8.1, I generated 250 BulletRigidBodyNodes in one column, did all the necessary setup, and the models gently fell down to the ground.
    Now, in 1.9, I can still see my nodes in a column (and the debug node as well), but they are not falling down anymore and just stay static.
    I checked the Bullet samples to see if they were working: they are, except for the ‘Ball in Maze’ sample, where the ball is not moving anymore…

So, does anyone have an idea?

  2. Issue #2: Shader Generator
    In 1.8.1, when using the shader generator (render.setShaderAuto()), I had already noticed some lag (when adding a light, for example), but in 1.9 this lag has grown considerably. As a consequence, at some points (for example, when a light is removed), the screen freezes for 2-3 seconds.

Again, any idea? Is there any change in 1.9 that I should be aware of (config, models, …)?

Many thanks!

After further investigation, it seems I have found a solution for issue #1 (Bullet):

Instead of updating the Bullet world this way (as I did in 1.8.1):
self.world.doPhysics(dt)

In 1.9.0, I now need to update the Bullet world by specifying the doPhysics() sub-steps explicitly:
self.world.doPhysics(dt, 5, 1.0/180.0)

My code is working again, and so is the “Ball in Maze” sample after modifying it the same way.
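For completeness, here is roughly what the update looks like now in a stripped-down form (a minimal sketch, not my actual code; the class and names are placeholders):

from direct.showbase.ShowBase import ShowBase
from panda3d.bullet import BulletWorld, BulletRigidBodyNode, BulletBoxShape
from panda3d.core import Vec3

class PhysicsDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        self.world = BulletWorld()
        self.world.setGravity(Vec3(0, 0, -9.81))

        # One dynamic box, just so doPhysics() has something to move.
        body = BulletRigidBodyNode("box")
        body.setMass(1.0)
        body.addShape(BulletBoxShape(Vec3(0.5, 0.5, 0.5)))
        self.render.attachNewNode(body).setPos(0, 0, 10)
        self.world.attachRigidBody(body)

        self.taskMgr.add(self.update, "physics")

    def update(self, task):
        dt = globalClock.getDt()
        # Passing the sub-step count and fixed time-step explicitly
        # restores the 1.8.1 behaviour for me:
        self.world.doPhysics(dt, 5, 1.0 / 180.0)
        return task.cont

PhysicsDemo().run()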

Is there any reason for this change in 1.9.0?

Ah, thanks for figuring that out! I had trouble tracking the issue down, but this explains it all. There were changes to the way the binding generator handled default arguments, and, well, long story short, there’s this line:

float param3 = (1 / 60);

It should read 1.0f/60.0f; otherwise, the division results in the integer 0.

I’ll look into checking in some fixes.

You are welcome, rdb, and many thanks for everything!

As for issue #2, I still see a huge lag when the auto-shader is activated (at the render level) and a light is added/removed (I said a freeze of 2-3 seconds in my initial post, but in fact it is more than 5 seconds): this is definitely a regression compared to 1.8.1 (or maybe I am missing something in 1.9.0…).

Do you have any clue as to where I should investigate?

Issue #3 (I can create another thread for this, in addition to my initial post, if you want):

Regarding the auto-shader, I have identified another issue (again, it could be my mistake):
In 1.8.1, when applying the auto-shader at the render level and applying a custom shader on a particular NodePath, if I wrote the following in the VS:

gl_FrontColor = gl_Color;

Then nothing was shown. So I worked around this by not using that instruction in the VS; I just used the following, and it worked fine:

  gl_Position = p3d_ModelViewProjectionMatrix * p3d_Vertex;
  gl_TexCoord[0] = p3d_MultiTexCoord0;

In 1.9.0, however, it seems I can’t omit gl_Color (or its equivalent p3d_Color), because doing so generates errors with this message:

:display:gsg:glgsg(error): at 4760 of c:\buildslave\sdk-windows-amd64\build\panda\src\glstuff\glGraphicsStateGuardian_src.cxx : invalid operation
:display:gsg:glgsg(error): at 1213 of c:\buildslave\sdk-windows-amd64\build\panda\src\glstuff\glShaderContext_src.cxx : invalid operation
(the two lines above are repeated many times)
:display(error): Deactivating wglGraphicsStateGuardian.

But when using gl_Color or p3d_Color (e.g. gl_FrontColor = p3d_Color), I still have the same issue: nothing shows on the screen.
When the auto-shader is not activated (using the non-shader P3D lighting system), the NodePath shows on the screen and the custom shader is correctly applied.

Any idea?

Thanks!

First of all, the lag at shader generation is new to me, and is not supposed to happen. I’ll have to investigate.
The only clue I have right now is that shader generation was changed to be done in the cull thread (by the StandardMunger) rather than in the draw thread just before rendering. This was not intended to impact shader generation time.

If you could send me a small example Python file that demonstrates the issue clearly, though, that would help.

As for the other issue, I’m not really sure I follow you. I don’t know what gl_FrontColor is supposed to do; from what I understand, you’re supposed to pass the color to the fragment shader and assign it to gl_FragColor there.

There are now also two meanings for p3d_Color; formerly, it was only used as an attribute. In 1.9.0, it can be used as a per-vertex attribute (in which case it contains the vertex color) or as a uniform (in which case it contains the flat color assigned with setColor). Which one are you using? It might help if you sent me the shader with some example code.

If your card supports it, I highly recommend setting “gl-debug” to true in Config.prc in order to get more detailed error reporting.
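That is, either add the line to Config.prc or set it before the window is opened, e.g.:

from panda3d.core import loadPrcFileData

# Same effect as putting "gl-debug true" in Config.prc; it has to run
# before the window is opened to take effect.
loadPrcFileData("", "gl-debug true")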

Many thanks rdb.

The code is a little bit complicated, so I’ll try to keep it simple. If needed, I’ll send you the whole code.

The aim of the code is to build a ‘demo maker editor’, that is, a tool for creating non-interactive but highly configurable animations using (and hopefully demonstrating…) all the P3D capabilities (shading, physics, music, …) - see en.wikipedia.org/wiki/Demomaker for very good examples. I am doing it as an exercise and, of course, it will never be as good as those and is far from finished!

To provide a little bit of background, the main code architecture is as follows (I hope it will be clear enough):

  • The Main class (DMKDemo.py) inherits from ShowBase and opens a wxPython window for the editor part.
    The initialization is done in the InitializeDemo() method, which initializes Physics, Lights, Camera, Music and a set of Dobjs (short for Demo Objects), each of which can be a Model, Actor, Quad3D, … depending on what the user wants.

  • Each Dobj is a separate class, which contains a NodePath and a set of parameters, most of them encapsulated in several managers (physics, instance, … managers). Each Dobj is ‘timed’, meaning it is initialized at a given time and ends at another given time (managed through a doMethodLater task).

  • The lighting system basically uses the forward rendering capabilities of Panda for the moment (deferred rendering will be a next step). Lighting is currently managed with a class (DemoLightManager.py) at the demo level, which contains a set of different Lights (DMKLight.py), which are also ‘timed’.

  • The custom shading is managed at the Dobj level, with a class named “DMKDobjShaderManager”.

Here is an excerpt of the code:

  1. Issue #3. In the Issue #3.zip file, I have put:
    a. the Dobj class, where InitializeShaderManager is defined (Dobj.py)
    b. the DMKQuad3D class, to which I apply the custom shader (DMKQuad3D.py)
    c. the ShaderManager class used to manage the custom shader (DMKDobjShaderManager.py)
    d. the VS/FS shaders (a simple alpha-map shader) (AlphaMap_VS.glsl and AlphaMap_FS.glsl)
    e. a copy of the debug output after setting “gl-debug” to true (not a lot of info, probably due to my card - an ATI Mobility Radeon HD 5870)

To answer your question regarding p3d_Color: I don’t need it, because my FS fetches the color directly from 2 textures (see the AlphaMap_FS.glsl file). I just see that the VS crashes when there is no mention of gl_Color or p3d_Color (in 1.8.1 the VS was not crashing) - see the AlphaMap_VS.glsl file.

  2. Issue #2. In the Issue #2.zip file, I have put:
    a. the main class (DMKDemo.py)
    b. the DemoLightManager (DMKDemoLightManager.py)
    c. the Light class (DMKLight.py)

I hope this code will be enough; if not, let me know. Thanks again for your help!
Issue #3.zip (4.66 KB)
Issue #2.zip (8.36 KB)

What I was really asking for was for you to create a simple Python sample that I can run to reproduce the issue in isolation. Excerpts from your application, especially out of context, aren’t very helpful to me.

I do notice the issues seem related to vertex arrays. Do you get different results depending on whether you set basic-shaders-only to true or false? (This only takes effect for Cg shaders and does not affect GLSL shaders.)
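For instance, you can toggle it like this before the window is created (or put the equivalent line in Config.prc):

from panda3d.core import loadPrcFileData

# Equivalent to "basic-shaders-only true" in Config.prc; run this before
# ShowBase/the window is created so the GSG picks it up.
loadPrcFileData("", "basic-shaders-only true")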

OK, sorry for the misunderstanding. I’ll prepare a simple sample.

Following your reply, I played a bit with basic-shaders-only, and what I can see is:

  • It definitely helps for issue #2 when basic-shaders-only is set to True. The lag is drastically reduced when a light is removed during execution, even if it is still noticeable (it looks more or less like what I got in 1.8.1).
  • It doesn’t have any impact on issue #3. gl_Color still seems to be mandatory in the VS (otherwise the VS crashes), but nothing appears. The only thing that works is to deactivate the auto-shader and keep a “gl_FrontColor = gl_Color;” instruction in the VS (the VS crashes without that instruction even when the auto-shader is not activated).

Again, many thanks for your help.

A few thoughts: if you don’t use gl_Color or p3d_Color in the vertex shader it gets compiled out, so that may be what makes the difference (regardless of what you assign it to - I don’t think gl_FrontColor has anything to do with this, or that you should even be using it).

Does your model have vertex colours at all? If so, do you know whether they are in packed-Direct3D format or in OpenGL format? (Is it loaded from an .egg or dynamically generated geometry?)

Yes, I agree, per the testing I did before. It should only be a question of compilation, since those variables are not used in the GLSL code.

I did some further testing after your reply on issue #3, and I found that whether the custom shader works with the auto-shader (whatever the basic-shaders-only flag is) depends on the model.

The custom shader is not working for:

  • A quad generated with:
    cm = CardMaker("plane")
    cm.setFrame(-1, 1, -1, 1)
    cm.setHasNormals(True)
    self.np = render.attachNewNode(cm.generate())
  • Nick-Dragon.egg (the one from the cartoon shader sample)
  • A simple quad generated through Blender / YABEE with vertex colors (I assume so, since there are per-vertex fields)

The custom shader is working for a model with no vertex colors (I assume so, since there is no such field in the .egg file), even if the auto-shader is activated. I still need to have “gl_FrontColor = gl_Color;” in the VS, but at least I can see the model and the custom shader working. The model was exported from Blender with YABEE. I can send it to you if needed.

Thanks again

As for issue #3, here is a small sample that shows strange results with 2 separate models:

  • One working with the custom shader (the ball from the ‘Ball in Maze’ sample) in every case
  • One working with the custom shader as long as there is no Quad3D displayed (generated through CardMaker), but not working when the Quad3D is displayed

You can play with the sample (toggling instructions are within the sample) to see the effect of the Quad3D on the shader.

This issue only appears when the auto-shader is activated: when it is not, everything is correct.

(in every case the VS contains the ‘gl_FrontColor = gl_Color;’ line)

Hope that helps.

Thanks again!
Issue #3.zip (1 MB)

I’m sort of losing track of all the issues here. In the future it might be a good idea to create a thread per issue.

I get this (not unexpected) error when I press t once (“CS applied to both Models”):

ERROR: created-shader:13: error(#429) "gl_FrontColor" is removed in GLSL 1.40 core. Try to use #extension GL_ARB_compatibility.
ERROR: created-shader:13: error(#429) "gl_Color" is removed in GLSL 1.40 core. Try to use #extension GL_ARB_compatibility.

So I replaced gl_FrontColor with a varying I made myself (out vec4 something;), assigned p3d_Color to it, and that makes the shaders show up fine, as far as I can tell.
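Roughly like this, off the top of my head (untested, and with my own placeholder names rather than your actual alpha-map shader):

from panda3d.core import Shader

vert_src = """
#version 140
uniform mat4 p3d_ModelViewProjectionMatrix;
in vec4 p3d_Vertex;
in vec4 p3d_Color;          // per-vertex colour attribute
in vec2 p3d_MultiTexCoord0;
out vec4 vtx_color;         // replaces gl_FrontColor
out vec2 texcoord;
void main() {
    gl_Position = p3d_ModelViewProjectionMatrix * p3d_Vertex;
    texcoord = p3d_MultiTexCoord0;
    vtx_color = p3d_Color;  // instead of gl_FrontColor = gl_Color;
}
"""

frag_src = """
#version 140
uniform sampler2D p3d_Texture0;
in vec4 vtx_color;
in vec2 texcoord;
out vec4 frag_color;
void main() {
    frag_color = texture(p3d_Texture0, texcoord) * vtx_color;
}
"""

shader = Shader.make(Shader.SL_GLSL, vert_src, frag_src)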

Just FYI, it’s better to use Shader.load instead of Shader.make if you’re reading from a file since that enables shader caching and also preserves filenames in error messages.
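For example, something like:

from panda3d.core import Shader

# Loading from disk enables shader caching and keeps the file names in
# error messages (the paths here are just the ones you mentioned earlier):
shader = Shader.load(Shader.SL_GLSL,
                     vertex="AlphaMap_VS.glsl",
                     fragment="AlphaMap_FS.glsl")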

Thanks for your reply.

OK, you are right. I’ll do it this way next time.

At this stage, to sum up:

  • Issue #2: related to a significant lag in 1.9.0 (much higher than in 1.8.1) when using the auto-shader and removing a light “on the fly” at the render level. Based on your advice, I set basic-shaders-only to True and the lag has been considerably reduced. I can’t be 100% sure that I now have the same situation as in 1.8.1 (I didn’t measure the lag in 1.8.1), but it looks OK.
    So I’ll send you a sample about that specific point later and will open a new thread, since issue #3 is more important and, as you pointed out, that will be better for the follow-up.

So let’s focus on issue #3.

  • Issue #3: related to the combined use of a custom shader and the auto-shader. Several questions:
    a. Why does the VS need a line involving gl_Color, whereas (as far as I know, and I’m definitely not an OpenGL expert…) it was not required at all before?
    b. Why does the shader work without any issue when the auto-shader is off, under any conditions?
    c. Why (according to the sample I made) does the shader work perfectly with the auto-shader on with 2 models (ball and floor), but when a CardMaker card is added to the scene, the ball model keeps working perfectly while the floor model does not (it is not shown at all)?

OK, if I understood your suggestion correctly:
→ In the VS, I:

  1. Removed the “gl_FrontColor = gl_Color;” line in main()
  2. Added an “out vec4 mycolor;” in the declaration section and a “mycolor = p3d_Color;” in main()

→ In the FS, I added an “in vec4 mycolor;” line in the declaration section.

Results: compilation seems OK, but execution still shows:

:display:gsg:glgsg(error): at 1213 of c:\buildslave\sdk-windows-amd64\build\panda\src\glstuff\glShaderContext_src.cxx : invalid operation

So it seems the shader still requires gl_Color… but my understanding of your suggestion may be wrong. Could it be a driver bug?

I performed additional tests (I left the “gl_FrontColor = gl_Color;” line in the VS - in order to have the VS working at least - and the auto-shader on):

  1. Based on the output you provided in your last post, I changed the GLSL version (from 140 to 120) in the VS/FS to see if it had any impact, but unfortunately it doesn’t, i.e. I still see what I described in issue #3 c. above (the impact of the CardMaker card on the shader).
  2. I simplified the FS by using only one texture (BTW, these models only have one texture stage…) to see if it changes anything, but there is no change in the results.

At this stage, unless I misunderstood your workaround, I don’t see why this is happening.

Many thanks for the information. I kept ‘make’ because I was using it in another shader where I needed to modify the shader source before creating it. But in this case, it makes no sense.

Many thanks again rdb.

Well, well, well… since this issue seemed to have no rational explanation, I decided to approach it the other way (I really do apologize for not having checked this before…): I upgraded my ATI drivers and, voilà, issue #3 was (magically) solved: no need for gl_Color, and the shader is working perfectly. That solves issue #3: thanks, rdb, for your patience!

Now, it “seems” that this driver upgrade has some performance impact, and it doesn’t explain why I still need to set ‘basic-shaders-only true’ to reduce the shader-generation lag… I’ll look into it more deeply and create a new thread for this.

I’m closing this one as ‘solved’.

Thanks

On non-NVIDIA cards, Panda 1.8 limited you to the arbvp1/arbfp1 profiles (as if basic-shaders-only were always on). In 1.9, this has been fixed by allowing use of the glslv/glslf profiles, which are more powerful. Your driver may be slower at compiling shaders with those profiles.

Arguably we should still prefer arbvp1/arbfp1 if the shader supports it, or we should be enabling basic-shaders-only by default. Hmm.

My (personal) perspective would be not to limit 1.9 because of driver-specific issues (at least mine; I don’t know if other people have the same problem), and so to leave basic-shaders-only as False by default.