Generic conversion: model -> Bullet body

Yabee is probably the right solution, but to answer your transform questions, just use nodepath.getTransform(render) to get the net composed transform in one call.
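
For example (a quick sketch, the node names are just placeholders):

    parent = render.attachNewNode('parent')
    parent.setPos(0, 10, 0)
    child = parent.attachNewNode('child')
    child.setHpr(90, 0, 0)
    # net transform composed from render down to child, in one call
    ts = child.getTransform(render)
    print(ts.getPos(), ts.getHpr(), ts.getScale())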

David

I’m having trouble getting yabee to run. Anyway, if I want a workaround, I have one with flattenLight(). I suppose getTransform(render) would work too.
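
For the record, the workaround I mean is just something like this (the path and scale are only examples; as far as I understand it, flattenLight() bakes the node’s transform into the vertices below it):

    model = loader.loadModel('path')
    model.setScale(1, 1, -1)
    model.flattenLight()  # applies the scale to the vertex data, leaving the node transform clean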

But at this point I’m trying to get accustomed to the engine, so I need to understand what’s wrong with my latest code, if anyone would care to have a look.

Cheers.

Your problem is that the TransformState still has the scale ((1,1,-1) or so). addShape takes a transform as an argument, but it evaluates only the transform’s translation (pos) and orientation (h/p/r). Scale and shear are not supported by Bullet local shape transforms. Can’t change this - it’s Bullet.

So again: don’t use scale and shear with collision shapes, or you will have to do lots of work.

Of course you could try asking the TransformState if it has a scale, and then apply this scale to the visible geom BEFORE creating a BulletConvexHullShape from it. Or simply use the lower-level API of BulletConvexHullShape: here you can apply whatever transform you want to each point before adding it to the convex hull. You just need to iterate over the vertices within your geom. The tutorial has some sections on how to process visible geometry:

    shape = BulletConvexHullShape()
    shape.addPoint(Point3(...))
    shape.addPoint(Point3(...))
    shape.addPoint(Point3(...))
    ...
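
For example, a rough sketch of that idea (the helper name is made up): read every vertex with a GeomVertexReader, apply the full composed transform - including the scale - yourself, and feed the transformed points to the hull:

    from panda3d.core import GeomVertexReader, Point3
    from panda3d.bullet import BulletConvexHullShape

    def hull_from_geomnode(gnp, reference_np):
        # Build a convex hull from all vertices of a GeomNode NodePath,
        # applying the composed transform (pos, hpr AND scale) manually.
        shape = BulletConvexHullShape()
        mat = gnp.getTransform(reference_np).getMat()
        gnode = gnp.node()
        for i in range(gnode.getNumGeoms()):
            reader = GeomVertexReader(gnode.getGeom(i).getVertexData(), 'vertex')
            while not reader.isAtEnd():
                shape.addPoint(mat.xformPoint(Point3(reader.getData3())))
        return shape

Since the points are already transformed, you would then call body.addShape(shape) without a local transform.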

A warning: using convex meshes for arbitrary visible geometry is not recommended! Convex meshes should stay small (a few dozen vertices, at most 200 or 300), and the convex meshes should be reused for multiple shapes. A common use case is the body of a car, where you have multiple cars of the same type in your game. For more complex dynamic objects you should approximate the shape by using multiple primitives (box, sphere, capsule, …).

By the way, this is a shorter version of your basic processing algorithm:

    model = loader.loadModel('path')
    for gnp in model.findAllMatches('**/+GeomNode'):
        gnode = gnp.node()
        ts = gnp.getTransform(model)  # transform of this GeomNode relative to the model root
        shape = ...                   # build a shape from gnode here
        body.addShape(shape, ts)
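
And, filled in with one convex hull per GeomNode, a sketch (I believe BulletConvexHullShape.addGeom transforms the points by the TransformState you pass, so handing ts to addGeom instead of addShape should also side-step the scale limitation - please verify for your case):

    from panda3d.bullet import BulletConvexHullShape, BulletRigidBodyNode

    body = BulletRigidBodyNode('body')
    model = loader.loadModel('path')
    for gnp in model.findAllMatches('**/+GeomNode'):
        ts = gnp.getTransform(model)
        gnode = gnp.node()
        shape = BulletConvexHullShape()
        for i in range(gnode.getNumGeoms()):
            shape.addGeom(gnode.getGeom(i), ts)  # points get transformed while being added
        body.addShape(shape)                     # the shape is already in body space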

Can you make the method throw if we call it with a scale? It would reduce the surprise factor a bit (in case anyone else gets the idea of doing what I did).
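
In the meantime, a crude guard like this would have caught my mistake (just a sketch):

    from panda3d.core import Vec3

    def add_shape_checked(body, shape, ts):
        # Refuse transforms that carry a scale, since addShape silently drops it.
        if not ts.getScale().almostEqual(Vec3(1, 1, 1)):
            raise ValueError('addShape ignores scale - flatten it into the geometry first')
        body.addShape(shape, ts)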

Now I’m done with experimenting. I got something that reproduces the node structure from the original model. If anyone cares to read and comment on correctness (apart from the fact that I’m creating several bodies), it’s here.

I think I will use flattenLight() instead of this, for real code.

I have one more question: I’d like a workflow where, in the editor, I can flag geometry objects as being “visual” or “physical” (in order to have collision models that are simpler than the visual ones, for example bounding boxes). Is there a standard for this? I say “standard” because I need something that can be written in Blender first, and can then survive exporting to .x and conversion to .egg.

I’d say the “official” way to do this is to write a tag along the lines of bulletcollision { parameters } in the egg file. But I don’t know how you’d make the .x and the x->egg converters aware of any custom properties you set in Blender. I’m not even sure yabee has a way to write tags like this.

The quickest way is probably to make an empty in Blender, parent all of your collision geometry to it (select the geometry, select the empty last, Ctrl+P), make sure the empty is called “collision”, then call model.find("**/collision") in Panda. This node’s children will be your collision geoms.
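
Something along these lines (a rough sketch, reusing the loop from above):

    collision_root = model.find('**/collision')
    collision_root.hide()  # the collision geometry should not be rendered
    for gnp in collision_root.findAllMatches('**/+GeomNode'):
        shape = ...        # build a Bullet shape from gnp here
        body.addShape(shape, gnp.getTransform(model))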

I went for a solution based on node names first. My problem now is that I am getting funky physics results: either some objects rebound strangely, or some objects don’t collide, depending on the type of shapes I use. Also, when objects go to sleep, there is a visible gap where they should be in contact.

If someone has time to spend reproducing, the code can be found here and the egg file there.

The reason for the funky physics is that your geometry is not suited for generating collision shapes from it. The “ball” from which you create a convex hull shape has 480 vertices! I had a look at your egg file, picked the last vertex, and filtered for “nearby” vertices (Manhattan distance < 0.00001) within the text editor - I found 12 matches!

Let me try some ascii art:

       |
       |
-------+-----
       |
       |

This is what a suitable mesh for physics should look like. A single vertex, part of multiple faces.

     | |
     | |
-----+ +-----

-----+ +-----
     | |
     | |

This is what your mesh looks like. A bunch of vertices, all at (more or less) the same location. Each vertex is part of only one face.

Bullet doesn’t care when creating a convex hull. It will create internal planes for every vertex, and the normal of each plane will depend on the nearby vertices. Given the tiny gaps between your “nearby” vertices, the result will be random normals, and thus funky physics. Poor performance is another disadvantage.

You could either fix this within your modelling application (in Blender it is called “welding”, if I remember right), or add a preprocessing filter where you loop over all vertices of a model and collect only unique vertices (with a small epsilon, of course), then pass these vertices to the convex hull shape.
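
A minimal sketch of such a filter (the epsilon, the helper name and the node path are just examples):

    from panda3d.core import GeomVertexReader, Point3
    from panda3d.bullet import BulletConvexHullShape

    def add_unique_points(shape, geom, epsilon=0.0001):
        # Snap each vertex to an epsilon-sized grid and add it only once per cell.
        seen = set()
        reader = GeomVertexReader(geom.getVertexData(), 'vertex')
        while not reader.isAtEnd():
            p = Point3(reader.getData3())
            key = (round(p.x / epsilon), round(p.y / epsilon), round(p.z / epsilon))
            if key not in seen:
                seen.add(key)
                shape.addPoint(p)

    shape = BulletConvexHullShape()
    gnode = model.find('**/ball').node()  # hypothetical path to the "ball" GeomNode
    for i in range(gnode.getNumGeoms()):
        add_unique_points(shape, gnode.getGeom(i))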