hardware animated vertices and GL_ARB_vertex_blend

I was looking into attempting some kind of GPU skinning, so I thought I would take a look at the Panda solution.

It appears that the munger will not perform hardware skinning because

get_gsg()->get_max_vertex_transforms()

returns 0. Looking deeper, this is because _supports_vertex_blend in the GSG is false. It is set with

_supports_vertex_blend = has_extension("GL_ARB_vertex_blend");

This extension seems to date back a long way (the OpenGL 1.x era), and since I have a semi-recent GPU, I would expect even an older card to support it.
So why does my card seem not to have this extension (or why is Panda reporting that it doesn't)? I have a 9800GTX with up-to-date drivers.
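
For what it's worth, this is how I am querying the limit from Python (a quick sketch that assumes the default ShowBase window; getMaxVertexTransforms looks to be the published getter for this, if I am reading the API reference right):

from direct.showbase.ShowBase import ShowBase

base = ShowBase()

# The GSG for the main window; a result of 0 means the munger will
# never attempt hardware skinning on this card/driver combination.
gsg = base.win.getGsg()
print(gsg.getMaxVertexTransforms())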

This is an outdated extension. It was intended to implement GPU skinning on fixed-function cards. It was never widely implemented, and it was not particularly useful even when graphics drivers did implement it. With modern graphics drivers, the preferred approach is to write a custom shader to handle the GPU skinning instead.

Unfortunately, Panda doesn't provide any hooks into such a custom shader, and it would not be able to load your standard animation files into one. You would have to write all of the required infrastructure yourself.
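
To give you an idea of the scope, the per-vertex math itself is just linear blend skinning. Here is a plain-Python reference of what the shader would have to compute (all names here are illustrative, not part of any Panda API):

def skin_vertex(rest_pos, joint_ids, weights, joint_mats):
    """CPU reference for linear blend skinning: the skinned position is
    v' = sum_i w_i * (M_i * v), with the weights summing to 1.
    rest_pos: homogeneous rest-pose position (x, y, z, 1).
    joint_mats: list of 4x4 row-major joint matrices (nested lists)."""
    skinned = [0.0, 0.0, 0.0, 0.0]
    for jid, w in zip(joint_ids, weights):
        m = joint_mats[jid]
        for row in range(4):
            # Matrix-vector product for this joint, scaled by its weight.
            skinned[row] += w * sum(m[row][c] * rest_pos[c] for c in range(4))
    return skinned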

One day we hope to provide these kinds of hooks, when some industrious volunteer adds the needed code to Panda. Until then, you will have to stick with CPU skinning.

David

Actually, that doesn't sound too difficult; I might give it a try.
I am imagining that the vertex format would have some extra columns for joint IDs and weight percentages, and that an array of matrices for the joints would be passed to the shader somewhere each frame.
Does that sound about right?
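
Something along these lines, perhaps (the column names, component counts, and the joint_mats input here are just my own guesses for illustration, not anything Panda prescribes):

from panda3d.core import (Geom, GeomVertexArrayFormat, GeomVertexFormat,
                          InternalName, PTA_LMatrix4)

# Extra per-vertex columns: up to four joint indices plus four weights.
array = GeomVertexArrayFormat()
array.addColumn(InternalName.make("vertex"), 3, Geom.NT_float32, Geom.C_point)
array.addColumn(InternalName.make("normal"), 3, Geom.NT_float32, Geom.C_normal)
array.addColumn(InternalName.make("joint_ids"), 4, Geom.NT_uint8, Geom.C_index)
array.addColumn(InternalName.make("joint_weights"), 4, Geom.NT_float32, Geom.C_other)

fmt = GeomVertexFormat()
fmt.addArray(array)
fmt = GeomVertexFormat.registerFormat(fmt)

# One matrix per joint, refilled each frame and handed to the shader;
# "model" stands in for whatever NodePath carries the skinned geometry.
joint_mats = PTA_LMatrix4.emptyArray(64)
# model.setShaderInput("joint_mats", joint_mats)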

Sounds like a good start.

For anyone ending up in this thread via Google, Panda now provides hooks for people to easily implement skinning in a shader, and also automates it in the shader generator:
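
A minimal sketch of the automated route (assuming Panda3D 1.10 or later; hardware-animated-vertices is the config variable that asks Panda to leave the skinning to the GPU, and the model paths are the sample assets shipped with the SDK):

from panda3d.core import loadPrcFileData

# Must be set before the window opens so the munger prepares the
# vertex data for GPU skinning instead of animating it on the CPU.
loadPrcFileData("", "hardware-animated-vertices true")

from direct.showbase.ShowBase import ShowBase
from direct.actor.Actor import Actor

base = ShowBase()

actor = Actor("models/panda-model", {"walk": "models/panda-walk4"})
actor.setScale(0.005)
actor.reparentTo(base.render)
actor.loop("walk")

# The shader generator emits the skinning code automatically; a
# hand-written GLSL shader can do the same via the p3d_TransformTable
# input.
actor.setShaderAuto()

base.run()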