Joint control and transforms

Hi all,

So I’ve gotten my FABRIK solver code working as far as I can see, but I’ve run into a problem that I think is down to each joint transform being relative to the parent (have I got that right?). The solver I implemented from pseudo-code in a paper expects all the joint positions to be in the same space. I realise I can fix all this using NodePath.getRelativePoint, but before I go down that road, I thought I’d ask if anyone knew a better/cleaner way of doing things. I figure it’s better to ask now since I expect I’ll be facing the same problem when I come to add joint constraints to the code later on. Thanks.

Hmm… One idea that comes to mind is to store for each joint its world-space base transformation; I think that the appropriate local transformation for a controlled joint could then be derived by applying the inverse of that transformation to whatever global transformation you want to apply to it.

I think that you should be able to get those global transformations by first exposing each joint and reading the transformation of the exposed NodePath.

By which mechanism are you accessing the joint information? In Actor’s exposeJoint, there is a flag to indicate that you would like to access absolute transforms, or in the lower-level CharacterJoint there is get_net_transform et al.

Interesting idea there Thaumaturge, thanks for that.

rdb, I do have both an exposeJoint NodePath (currently used for bone visualization) and a controlJoint NodePath for each joint. I saw that flag in the docs, but figured it wouldn’t be too much help to me when it came to writing the changes back to the controlJoint. So if I understand you, with get_net_transform (or the transform from the exposeJoint NodePath with that flag set), I could get all the joint positions with respect to a common origin, apply the calculated translation, and then overwrite the current controlJoint NodePath’s matrix with that one. Would the controlJoint then be using the root joint as origin, or still the joint’s parent? Sorry if I seem dense here, but vectors/matrices etc. have a tendency to turn my mind to mush.

In the end, controlJoint still takes relative transformations. So you do need to convert your absolute transformations to relative space, by multiplying each transformation you want to set on a joint by the inverse of the parent joint's matrix.
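The conversion rdb describes is a single matrix inversion; a minimal sketch with NumPy, using translation-only transforms to keep it readable (NumPy uses column-vector convention, so note that Panda3D's Mat4 uses row vectors and the multiplication order is reversed there):

```python
import numpy as np

def make_translation(x, y, z):
    """4x4 homogeneous translation matrix (column-vector convention)."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Suppose the solver gives us the desired *net* (absolute) transform of a
# joint, and we know the net transform of its parent joint.
parent_net = make_translation(1.0, 0.0, 0.0)
desired_net = make_translation(1.0, 2.0, 0.0)

# net = parent_net @ local  =>  local = inv(parent_net) @ net
local = np.linalg.inv(parent_net) @ desired_net

print(local[:3, 3])  # -> [0. 2. 0.]: the position relative to the parent
```

In Panda3D's row-vector convention the same relationship reads net = local * parent_net, so the local matrix comes out as net * invert(parent_net).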

Ok thanks, that makes sense, but I’m having a little trouble implementing that absolute -> relative transform. Sorry to be a pain, but could I trouble you for a quick code snippet? The solver is currently spitting out an absolute getPos() from an exposeJoint with a translation applied.

Can your solver alter the position of a child joint and a parent joint simultaneously? If so, you need to apply them to the parents first, otherwise your relative transforms will be computed incorrectly. You might need to walk the joint tree and apply the transformation on the parent first, and store the transform as you go along.
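A translation-only sketch of that parent-first pass (the joint names and the dict-based tree are hypothetical; the point is that each child's new local offset is taken against the parent's *updated* absolute position):

```python
# Hypothetical joint tree: name -> (parent, local_offset), listed
# parents-before-children so a plain top-down walk is parent-first.
tree = {
    "hips":  (None,    (0.0, 0.0, 1.0)),
    "thigh": ("hips",  (0.0, 0.0, -0.4)),
    "shin":  ("thigh", (0.0, 0.0, -0.4)),
}

# Absolute target positions from the solver for some of the joints.
targets = {"thigh": (0.2, 0.0, 0.6), "shin": (0.3, 0.0, 0.2)}

def apply_targets(tree, targets):
    """Walk parent-first, tracking each joint's new absolute position so
    children are made relative to the updated parent, not the old one."""
    absolute = {}
    for name, (parent, local) in tree.items():
        parent_abs = absolute.get(parent, (0.0, 0.0, 0.0))
        if name in targets:
            new_abs = targets[name]
            # Store the new local offset relative to the updated parent.
            local = tuple(n - p for n, p in zip(new_abs, parent_abs))
            tree[name] = (parent, local)
        else:
            new_abs = tuple(p + l for p, l in zip(parent_abs, local))
        absolute[name] = new_abs
    return absolute

positions = apply_targets(tree, targets)
print(positions["shin"])  # -> (0.3, 0.0, 0.2): matches the absolute target
```

If the children were processed before their parents, the "shin" offset would be computed against the thigh's stale position and the chain would come apart.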

Also, you mention that you have the positions of your joints from the solver. That confuses me. Normally, you'd only rotate the individual joints, since applying a translation to a joint will cause the character to become disfigured (i.e. the bones will separate at the joints). However, it is still possible to calculate the proper rotation for a joint given its position relative to the parent, but keep in mind that if this position changes the distance to the parent, the bone will have to be scaled to take that into account.
So, what kind of position data are you getting, and what type of model are you animating? And are you getting position data for all the joints or just a select few?

I’m willing to create a bit of example code, but you’d have to give me a bit more information about what you’re doing and the type of the data you get from the solver. It’d help even more if you e-mailed me the rigged model you were using.

Yeah, it has access to the positions of all joints in the chain. The solver I'm implementing is detailed in a paper here. From the sounds of it, it works a little differently to other solvers.

I believe the bone lengths are always preserved, so no distortion should occur (I think). Joint rotations are handled later on in the paper, but I haven't done anything with that yet. I'm getting an LPoint3f from getPos() of a NodePath attached to each joint, and passing that data to the solver. It's a human model from the MakeHuman application, with quite a complex rig (I wanted to see how well FABRIK handled it). The solver works using a joint chain from a root/sub-base (branching joint) to an end effector. The position data I pass to the solver is a list of LPoint3f's for each joint in that chain.

I’ve attached the test code I’m using along with the model. As far as the solver goes, the pseudo-code in the paper should make more sense than my currently uncommented/obscure code. The iterate() function is basically a straight copy of what’s in the paper.
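For reference, the forward/backward passes that iterate() copies from the paper can be sketched on bare points like this (plain Python, no Panda3D; bone lengths are measured once and re-imposed on every pass, which is why no stretching occurs):

```python
import math

def fabrik(points, target, tolerance=1e-4, max_iters=20):
    """Single-chain FABRIK: points is a list of joint positions, root first.
    The root stays planted; bone lengths are preserved by every pass."""
    def point_at_length(frm, to, length):
        # The point `length` away from `frm` along the segment frm->to.
        t = length / math.dist(frm, to)
        return tuple(f + (c - f) * t for f, c in zip(frm, to))

    lengths = [math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1)]
    root = points[0]
    for _ in range(max_iters):
        if math.dist(points[-1], target) < tolerance:
            break
        # Forward reaching: snap the end effector to the target, work back.
        points[-1] = target
        for i in range(len(points) - 2, -1, -1):
            points[i] = point_at_length(points[i + 1], points[i], lengths[i])
        # Backward reaching: re-plant the root, work forward again.
        points[0] = root
        for i in range(len(points) - 1):
            points[i + 1] = point_at_length(points[i], points[i + 1], lengths[i])
    return points

chain = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
solved = fabrik(chain, target=(1.0, 1.0, 0.0))
# solved[-1] is now at (1.0, 1.0, 0.0), with both bones still length 1
```

This only handles a single unconstrained chain with a reachable target; sub-bases, constraints, and the unreachable-target case are covered separately in the paper.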

Thanks again for your time. My hope is if I can get this working and sufficiently polished/general, to post it in the snippets section, so hopefully it’ll be useful to others.

Edit: Apparently I can’t get the stuff to attach. It’s a 22MB zip. Where would you like it emailed?

My e-mail address is available through the forum software (there should be an “email” button next to this post and on my profile).

Odd. All I can see is options for PM and WWW.

This small sample program might help. It sets up a tree of nodes that is bound to the original joint tree, upon which you can simply call NodePath.setPos() specifying the root node as the “other” argument, which means that you’ll effectively be setting absolute positions.

Keep in mind that when setting the new positions, you should set them in a top-down fashion (using either a breadth-first or depth-first walk function) otherwise the new absolute transforms won’t be computed properly.

Try pressing “a” which will set one of the bones to an absolute position.

However, I’m not sure that you indeed want to set the positions, since that won’t cause the geometry to be rotated in the direction that the bone is going. You’ll probably want to instead use lookAt (press “s”), but you have to find a way to maintain the roll rotation of the bones, since using lookAt to rotate a bone towards a point might cause it to twist.

from panda3d.core import *
from direct.actor.Actor import Actor
from direct.directbase import DirectStart

def buildJointTree(joint, parentNP):
    for child in joint.getChildren():
        if isinstance(child, CharacterJoint):
            # Create a new node to represent the joint.
            jointNP = NodePath(child.getName())
            if parentNP is not None:
                jointNP.reparentTo(parentNP)

            # Copy the rest transform
            jointNP.setTransform(child.getTransformState())

            # Joint should henceforth be controlled by the node.
            child.applyControl(jointNP.node())

            # Recurse.
            buildJointTree(child, jointNP)
        else:
            # Skip the non-joint, but still process its children.
            buildJointTree(child, parentNP)

actor = Actor("models/artist.egg")
actor.reparentTo(base.render)

character = actor.getPartBundle("modelRoot")

# Build a joint tree using PandaNodes
jointTree = NodePath(character.getName())
buildJointTree(character, jointTree)
jointTree.ls()

def a():
    # Change the absolute position.
    thighL = jointTree.find("**/thigh.L")
    thighL.setPos(jointTree, (0.5, -0.1, 0))

def s():
    target = Point3(-0.5, 0, 0)

    # Turn towards the given position.
    thighR = jointTree.find("**/thigh.R")
    thighR.lookAt(jointTree, target)

base.accept('a', a)
base.accept('s', s)
base.run()

Thanks a lot rdb, that’s really helpful. I get what you mean about the geometry not going with the movements, but a quick skim through the rotational/orientational constraints stuff in the paper looks to address this. I’m still not sure whether it’ll look natural, but I guess the only way to know for sure is to try it and find out. I’m going to read the paper again in its entirety to make sure my mental model of it is right, as it was a while back that I came across it originally. The picture in my head is that it works more like a marionette puppet than a robot arm. Hopefully I’m not just falling prey to inexperience here. Thanks again.

So I’ve finally got a working prototype of the solver, albeit only modifying positions, without rotating the joint to compensate. I’ve an idea of how to do this compensation, but I could use some verification that I’m on the right track.

As I understand things, I’m moving a joint around, but its rotation stays the same as it was in the bind pose. So if I were to:

  • Take two vectors from the joint origin to the old and new positions.
  • Find a quaternion to represent the rotation between these two vectors.
  • Multiply(?) this quaternion with a quaternion representing the current joint rotation.
  • Call setQuat() on the joint with the resulting quaternion.

…that should rotate the joint appropriately. Does that sound about right?
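For what it's worth, the "rotation between two vectors" in step two is the shortest-arc quaternion, and quaternions are combined by multiplication rather than addition. A plain-Python sketch of the two-vector case (no Panda3D; LQuaternionf offers the equivalent operations, and the 180-degree opposite-vectors case is deliberately ignored here):

```python
import math

def cross3(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def quat_between(a, b):
    """Shortest-arc quaternion (w, x, y, z) rotating direction a onto b.
    Assumes a and b are not exactly opposite (that case needs special care)."""
    a, b = normalize(a), normalize(b)
    dot = sum(x * y for x, y in zip(a, b))
    return normalize((1.0 + dot,) + cross3(a, b))

def rotate(q, v):
    """Rotate vector v by unit quaternion q (expansion of q * v * q^-1)."""
    w = q[0]
    qv = q[1:]
    uv = cross3(qv, v)
    uuv = cross3(qv, uv)
    return tuple(v[i] + 2.0 * (w * uv[i] + uuv[i]) for i in range(3))

old_dir = (1.0, 0.0, 0.0)
new_dir = (0.0, 1.0, 0.0)
q = quat_between(old_dir, new_dir)
rotated = rotate(q, old_dir)  # lands on new_dir, up to rounding
```

Combining this with the joint's current rotation (step three) would then be a quaternion product, q_delta * q_current, not a sum.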

Edit: Scratch that. I just tried it and it didn’t work as I was expecting. Back to the drawing board. Any ideas would be welcome.

Edit 2: Just figured out what I think I should be doing. Find the quaternion between the old and new positions as before, and rotate the old position by that quaternion. This gets the same new position but also rotates the joint accordingly. Seems to look right as far as the joints go, but also at the moment seems to be swinging the joints around a different axis to the one I intended. Almost there.