15 FPS loss when playing Actor animations? (resolved)

I cannot remember exactly when this started occurring, possibly when I began using simplepbr. However, consistently across my Actor models, I now get a ~15 fps drop when playing even the simplest animations. I can verify that it is indeed the animation play cycle that causes the fps drop. Sometimes the animation causes a “phantom lag” for a moment afterwards and then the fps is back up to 60.

I have not tested this with low-polygon models yet. I may be willing to drop my polygon counts if this turns out to be a vertex-processing limit of some kind.

Firstly, note that it is not useful to measure a performance loss in FPS, since FPS is a non-linear scale. It is more useful to express the difference in milliseconds per frame, which you can calculate as 1000 / fps, or display directly by setting frame-rate-meter-milliseconds true in Config.prc. For example, dropping from 60 to 45 fps (16.7 ms to 22.2 ms) is a much smaller difference than dropping from 30 to 15 fps (33.3 ms to 66.7 ms).
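For reference, a minimal sketch of enabling the on-screen meter in milliseconds with those variables (set before ShowBase is created):

    from panda3d.core import loadPrcFileData

    # Show the frame meter and make it report milliseconds per frame.
    loadPrcFileData("", "show-frame-rate-meter true")
    loadPrcFileData("", "frame-rate-meter-milliseconds true")

    # Manual conversion is 1000 / fps:
    # 60 fps -> 16.7 ms, 45 fps -> 22.2 ms, 30 fps -> 33.3 ms, 15 fps -> 66.7 ms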

CPU animation can be slow for heavy models. Please verify with PStats that this is indeed CPU animation time that is causing the slowdown. If so, you can simplify the model geometry, or, you can enable hardware skinning, which is easy to enable and vastly more efficient in the majority of cases.
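If it helps, one way to hook a running game up to PStats from code looks roughly like this (assuming the pstats server tool is already running locally):

    from panda3d.core import PStatClient

    # Connect to a locally running PStats server so the per-frame timing
    # breakdown (including animation/skinning time) can be inspected.
    PStatClient.connect()  # defaults to localhost; start the `pstats` tool first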


I have hardware-animated-vertices true and basic-shaders-only false set in the config. I have not run pstats yet.

Using the millisecond counter, I have 16.7 ms when walking (no actor animation) and 25 ms when walking with an animation playing.

I have tested a low-polygon model now, and the counter stays at 16.7 ms throughout the animation. I’m willing to attribute this to polygon counts, but I would like to understand why my CPU cores all remain under 70% utilization while the more complex model is lagging.

Hard to say without looking at the PStats charts. However, note that unless you enable the multi-threaded render pipeline, Panda will only use one CPU core.
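For completeness, opting in to the threaded pipeline is a Config.prc setting; a hedged sketch (set before ShowBase starts):

    from panda3d.core import loadPrcFileData

    # Split the cull and draw stages onto their own threads.
    loadPrcFileData("", "threading-model Cull/Draw")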

Note that for HW animation to work, you need to enable setShaderAuto, or apply a custom shader with a special flag set on its ShaderAttrib. I suspect that it has not actually been enabled.

Ah, I bet that’s it. The setShaderAuto requirement explains why the change occurred when I started using simplepbr.

Edit: This is the solution. All animations now playing at a crispy 16.7 ms.

As a heads up, simplepbr does not yet support hardware skinning, and using the auto-shader will disable simplepbr.

How hard would it be to get simplepbr to support hardware-animated-vertices? I am possibly willing to spend some time on a solution.

It turns out that just calling character_body.setShaderAuto(), where character_body is the NodePath of my Actor, seems to turn on hardware skinning, as long as the config variables are in place.

This is an excellent discovery, as I was already knee-deep in writing a custom GLSL shader (which does not seem to be required now). If there are any fatal flaws with this approach, please let me know. The simplepbr rendering seems to be unaffected visually.
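A minimal self-contained sketch of that approach (the model and animation paths are placeholders, and char_body stands for the Actor, as in the posts above; note the caveat about simplepbr in the next reply):

    from panda3d.core import loadPrcFileData
    loadPrcFileData("", "hardware-animated-vertices true")
    loadPrcFileData("", "basic-shaders-only false")

    from direct.showbase.ShowBase import ShowBase
    from direct.actor.Actor import Actor

    base = ShowBase()
    char_body = Actor("models/character", {"walk": "models/character-walk"})  # placeholder paths
    char_body.reparentTo(base.render)
    char_body.setShaderAuto()  # the shader generator performs the skinning on the GPU
    char_body.loop("walk")
    base.run()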

simplepbr will be disabled on your model if you enable the shader generator.

It would not be difficult at all; it would just require a few extra lines in the shader and an extra line to set the F_hardware_skinning flag on the ShaderAttrib.

This is the issue tracking support:

I am attempting to get this working with my own files.

So far I get the error AssertionError: false at line 2096 of panda/src/display/graphicsStateGuardian.cxx when I try to apply the simplepbr shaders directly with the ShaderAttrib flag. I can get the program to run, albeit with GL errors, using a modified version of the code from that Hardware Skinning post you made.

    from panda3d.core import Shader, ShaderAttrib

    actor_shader = Shader.load(Shader.SL_GLSL, "shaders/simplepbr.vert", "shaders/simplepbr.frag")
    actor_shader = ShaderAttrib.make(actor_shader)
    actor_shader = actor_shader.setFlag(ShaderAttrib.F_hardware_skinning, True)
    # char_body.setShaderAuto()
    char_body.setShader(actor_shader)

It should be setAttrib, not setShader, since you have created a ShaderAttrib there.
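In other words, roughly (reusing the shader paths from the snippet above; char_body again stands for the Actor’s NodePath):

    from panda3d.core import Shader, ShaderAttrib

    shader = Shader.load(Shader.SL_GLSL, "shaders/simplepbr.vert", "shaders/simplepbr.frag")
    attrib = ShaderAttrib.make(shader)
    attrib = attrib.setFlag(ShaderAttrib.F_hardware_skinning, True)
    # setAttrib, not setShader, because attrib is a ShaderAttrib rather than a Shader.
    char_body.setAttrib(attrib)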

Are you using the latest version of Panda3D?

setAttrib gives the same error (I checked that first). I am using 1.10.8.

This might indicate a bug in Panda3D. Can you send me a self-contained zip file that I can run that demonstrates the error?

I’ll give it a shot. Will reply back here when it’s ready.

Here you go; it throws the graphics state guardian error. I have included the .blend file with the armature action.

(file removed)

Part of the problem is the lack of a clear error message on Panda’s end: it doesn’t recognise some of the shader inputs, particularly p3d_TextureBaseColor and p3d_TextureMetallicRoughness.

simplepbr’s shader loader has a special routine that inserts #define statements redefining p3d_TextureBaseColor to p3d_Texture0, and so on. Because you’re loading the shader outside of simplepbr, those defines are no longer applied, so you end up referencing invalid texture inputs.

You’ll just need to make these substitutions:
https://github.com/Moguri/panda3d-simplepbr/blob/1a6ff562ca21ce0a9ec004ae1f82c05d6cf2078d/simplepbr/init.py#L51-L54
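A hedged sketch of making those substitutions when loading the shaders yourself, by injecting #define lines after the #version directive (the exact name-to-slot mapping is in the linked source; the two defines below are illustrative):

    from panda3d.core import Shader

    DEFINES = (
        "#define p3d_TextureBaseColor p3d_Texture0\n"
        "#define p3d_TextureMetallicRoughness p3d_Texture1\n"
    )

    def load_patched(path):
        with open(path) as f:
            lines = f.readlines()
        # Assumes the first line of the file is the #version directive.
        return lines[0] + DEFINES + "".join(lines[1:])

    vert_src = load_patched("shaders/simplepbr.vert")
    frag_src = load_patched("shaders/simplepbr.frag")
    actor_shader = Shader.make(Shader.SL_GLSL, vert_src, frag_src)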


Thanks. With a few minor adjustments I think I have managed to get the basic simplepbr .vert and .frag shaders working with your Hardware Skinning example.

One weird issue that remains is that the animation stops playing if I explicitly set the F_hardware_skinning flag on the shader. But as far as I can tell, the animation is being composed on the GPU; it runs butter-smooth with a 5,000-polygon model flipping around.

I’m sure there are issues with this preliminary hack, but this is as far as I got without any errors being thrown in the terminal. As noted, though, the animation stops playing when it is specifically flagged for hardware skinning in the main function; I suspect something is not being set properly. ; )

You’re assigning the composed skinning matrix to a new mat4 model_matrix, but then not doing anything with it.

What you want to do is incorporate it into the vertex position calculation, like so:

    vec4 vert_pos4 = p3d_ModelViewMatrix * skin_matrix * p3d_Vertex;