We have a children’s interactive, part of a larger project, that is meant to mirror a person’s motion on a variety of models. Major caveat: this was originally coded by a subcontractor, but they couldn’t get the bone rotations worked out in time, and although I am a programmer by trade, I’m a database guy, and this is all a bit beyond my ken. I’ve stripped out hundreds of lines of code, rewritten much of the rest, and gotten it to a somewhat workable state, but it feels like someone with more experience could probably overcome the remaining issue with ease.
The basic pipeline starts with x, y, z coords coming in from MediaPipe. They are just screen coords relative to the mocap image, and they are translated into Panda coords using rdb’s amazingly handy camera extrude trick. So far, so good. We’re only trying to move 2 arm bones and 2 leg bones per side.
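For reference, here’s the normalization I’m doing before the extrude step (a minimal sketch; it assumes MediaPipe’s normalized [0, 1] landmark coordinates with y pointing down, and Panda’s 2-D film coordinates in [-1, 1] with y up — `mp_to_film` is just my name for it):

```python
def mp_to_film(x, y):
    """Map a MediaPipe-normalized landmark (x, y in [0, 1], y down)
    to Panda3D film coordinates (both axes in [-1, 1], y up),
    suitable for passing to a lens's extrude()."""
    return 2.0 * x - 1.0, 1.0 - 2.0 * y

# e.g. the image centre lands at the centre of the film plane:
# mp_to_film(0.5, 0.5) -> (0.0, 0.0)
```

The resulting 2-D point goes into `base.camLens.extrude()` to get near/far points in camera space, which is where the Panda-space keypoints come from.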
Then, each frame, I create a dummy node for each of the 2 keypoints that should track the head and tail of a given bone. For example, left_forearm is calculated from the relative offsets between left_elbow and left_wrist by having the left_elbow dummy look at the left_wrist dummy (this helps avoid issues around body position relative to the mocap camera, etc.). I can then apply that rotation to the controlled joint, and it will happily track just fine, UNTIL it hits a perpendicular, at which point the lookAt rotation jumps from, say, (h=90, p=89) to (h=-90, p=89) rather than (h=90, p=91). As an example, if a user holds their arm in a classic bicep flex, then as the angle between their forearm and upper arm closes from >90 to <90 degrees, the whole forearm contorts under the upper arm.
# Look up the controlled joint and the pair of dummy nodes for this bone.
joint = self.skeleton_joints.get(bone_name)
dummy = self.dummy_nodes[bone_name]

# Place the dummies at the keypoints for the bone's head and tail.
dummy_base = dummy['base']
dummy_base.setPos(raw_keypoints[raw_pair['base']])
dummy_target = dummy['target']
dummy_target.setPos(raw_keypoints[raw_pair['target']])

# Aim the base dummy at the target, then copy that rotation to the joint.
dummy_base.lookAt(dummy_target)
joint.setQuat(dummy_base.getQuat())
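One thing I’ve been experimenting with is building the quaternion directly from the head→tail direction, bypassing lookAt’s Euler decomposition entirely. A minimal sketch of the shortest-arc construction I tried (pure Python, no Panda calls; the names and the choice of rest direction are my own assumptions, not the original project’s code):

```python
import math

def normalize(v):
    """Normalize a 3- or 4-component vector (works on quats too)."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def shortest_arc_quat(v_from, v_to):
    """Quaternion (w, x, y, z) rotating unit vector v_from onto v_to
    along the shortest arc -- no Euler angles, so no hemisphere flip."""
    f, t = normalize(v_from), normalize(v_to)
    dot = sum(a * b for a, b in zip(f, t))
    if dot < -0.999999:
        # 180-degree case: pick any axis perpendicular to f
        axis = (-f[1], f[0], 0.0) if abs(f[2]) < 0.9 else (0.0, -f[2], f[1])
        x, y, z = normalize(axis)
        return (0.0, x, y, z)
    cross = (f[1] * t[2] - f[2] * t[1],
             f[2] * t[0] - f[0] * t[2],
             f[0] * t[1] - f[1] * t[0])
    # (1 + dot, cross) is an unnormalized half-angle quaternion
    return normalize((1.0 + dot,) + cross)

def rotate(q, v):
    """Apply quaternion q = (w, x, y, z) to vector v."""
    w, x, y, z = q
    u = (x, y, z)
    c1 = (u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0])
    t = tuple(2.0 * c for c in c1)
    c2 = (u[1] * t[2] - u[2] * t[1],
          u[2] * t[0] - u[0] * t[2],
          u[0] * t[1] - u[1] * t[0])
    return tuple(v[i] + w * t[i] + c2[i] for i in range(3))
```

In this scheme, v_from would be the bone’s rest-pose direction (e.g. Panda’s forward +Y) and v_to the measured elbow→wrist direction; the resulting quaternion could then go straight into joint.setQuat(). I haven’t fully wired this in yet, so treat it as a sketch rather than a fix.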
I understand that in some sense those are equivalent rotations, but the resulting motion of the actual bone is not correct. If I hard-code a rotation of (h=90, p=91), it looks great. I’m currently trying to force this by sniffing the resulting rotation, but it’s a little janky, and not always easy to parameterize post hoc for each bone.
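To convince myself the two readings really are the same orientation, I checked the Euler aliasing identity (h, p, r) ≡ (h+180, 180−p, r+180) numerically for an intrinsic heading→pitch→roll rotation (Rz·Rx·Ry — I believe this matches Panda’s HPR order, but treat that as my assumption):

```python
import math

def rx(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def hpr_matrix(h, p, r):
    # intrinsic heading (about z), then pitch (about x), then roll (about y)
    return matmul(matmul(rz(h), rx(p)), ry(r))

# The flipped lookAt reading...
flipped = hpr_matrix(-90.0, 89.0, 0.0)
# ...produces the same matrix as the desired (90, 91) pose plus a 180-degree roll.
aliased = hpr_matrix(90.0, 91.0, 180.0)
assert max(abs(flipped[i][j] - aliased[i][j])
           for i in range(3) for j in range(3)) < 1e-9
```

So under this convention, (h=-90, p=89, r=0) is (h=90, p=91, r=180): the pose I actually want plus a half-turn of roll, which would explain the forearm twisting under the upper arm rather than bending past vertical.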
I’ve spent a ton of time crawling this incredibly civilized and useful forum, and I’ve tried everything I could find to correct this issue; there was a ton of useful info on dealing with absolute vs. relative rotations for parented joints, etc. I get similar results whether applying the rotation via HPR or quats, and no amount of late-night monkeying around in Panda or Blender has resolved it. At this point, we’re a week late on delivering this particular interactive, and other projects are piling up while I’m still fighting a losing battle with this bone rotation.
If anyone more experienced can offer some guidance, or a silver bullet that is obvious to them but not to me, I would be hugely appreciative. This is also paid client work, and I would be much happier to pay a bounty for a solution than to keep banging my head on this unfamiliar wall. Any code, model files, or screenshots are totally available if it helps. Thank you!