[SOLVED] Kinect motion capture -> Blender -> Panda?


blendernation.com/2011/01/07 … ox-kinect/

Sounds like it can be used as a pretty cheap motion capture device. Has anyone had any luck using it with 3ds Max/Maya/Blender, etc.?

Looks like people have found other uses for it too: allkinect.com/2010/11/23/fantast … t-is-cool/

Well, here’s what’s next on my list, I think:
It’s a Kinect/PrimeSensor skeleton toolkit for VRPN, so it should work with Panda.

I’m not really interested in getting it to work with Panda3D directly, but with 3D modellers like Blender (and then getting the animations into Panda as egg files); that’s why I posted in the Pipeline section.
So if you have any info about that, please post it.
Here’s an example: blendernation.com/2011/01/07 … ox-kinect/

You could make some very interesting and innovative games with it if it worked with Panda, but I’m not interested in that for my current project.
Good luck

And this is the answer: forum.unity3d.com/threads/73319- … a-for-free!

You can get one of these for $150. I know that’s not exactly cheap, but it’s a pretty good offer for what it does, I guess. There’s also the chance that you already have one because you own an Xbox, and if so, you can still use it for its primary purpose too. Another thing that excites me is that there’s no hacking required: the Kinect connects straight to a PC USB port and you just need to install some drivers; no modification to the actual Kinect device.
Looks like a similar device is being developed for the PC: gizmodo.com/#!5723924

Since BVH can be imported into Blender (and Max and Maya), it can then be converted to a Panda3D format, so I guess that answers it.
Give it some time and I’m sure the open-source community will make more solid applications for using the Kinect as a mocap device.
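For anyone curious what the importer actually reads: BVH is a plain-text format with a HIERARCHY section (ROOT/JOINT names and offsets) followed by a MOTION section of per-frame channel values. A minimal sketch, assuming a made-up sample file, that pulls out the joint names so you can check them against your character’s bones before importing into Blender:

```python
# Minimal sketch: extract joint names from a BVH file's HIERARCHY
# section. SAMPLE_BVH below is a hypothetical, cut-down example.
SAMPLE_BVH = """HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Chest
    {
        OFFSET 0.0 5.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        JOINT Neck
        {
            OFFSET 0.0 5.0 0.0
            CHANNELS 3 Zrotation Xrotation Yrotation
            End Site
            {
                OFFSET 0.0 2.0 0.0
            }
        }
    }
}
MOTION
Frames: 0
Frame Time: 0.0333333
"""

def bvh_joint_names(text):
    """Return the ROOT/JOINT names in the order they appear."""
    names = []
    for line in text.splitlines():
        parts = line.split()
        # ROOT/JOINT lines look like "JOINT Chest"; End Site, OFFSET
        # and CHANNELS lines don't match this shape.
        if len(parts) == 2 and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names

print(bvh_joint_names(SAMPLE_BVH))  # ['Hips', 'Chest', 'Neck']
```

Blender’s built-in BVH importer does the full job, of course; this is just to show how little magic is in the file itself.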

I’ve got the Kinect set up and have played around with it. I even imported some BVH files into Blender. From what I’ve experienced so far, it’s not a very powerful motion capture tool, but it can work if you’re willing to put in the effort.

There’s an issue whenever you let your limbs cross over each other. I noticed the points on my body, shown on the motion detection display, turned red when I crossed points/limbs behind each other.

This will cause the animation to mess up when importing it into Blender. I would suggest trying to move your body without crossing your limbs, which means you will have to create more subtle animations. I haven’t given it a real hard workout yet because I had to remodel my main character and I’m just now rebuilding its armatures.

Turns out Blender 2.49 has an issue when you duplicate and then mirror objects for merging: mirrored, merged objects do not accept light correctly when exported directly from Blender. I solved this by exporting my new model as a 3ds Max file, loading it into another 3D program, mirroring the object there, and re-exporting it as .3Ds. From there I just loaded it back into Blender and merged the two halves. Wish I had known about the Blender mirror error earlier; it would have saved me a lot of time.

Anyway, once I’m ready to redo my main character’s animations, I plan on using Kinect motion capture for simple animations, like the character’s non-battle and battle idle poses. I will attempt running animations as well. Combat animations with the Kinect will be kind of tough because of the limb-crossing issue. The best GUI program for the Kinect so far is the Brekel application. The NiUserTracker application sucks and messes up the whole animation.

The easiest way that I can see to transfer the motion from the imported BVH to my main character is to simply copy the data from the BVH bones to my character’s bones; of course, that would be for every frame.
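The per-frame copy boils down to simple bookkeeping. Here’s a hedged sketch using plain dicts in place of Blender’s pose bones (in Blender itself you’d do this with bpy and keyframe each pose bone); the frame data and bone names are made up for illustration:

```python
# Sketch of "copy the BVH bone data to the character, every frame".
# bvh_anim maps frame -> {bone_name: transform}; char_bones is the
# set of bone names your character's armature actually has.
def copy_bvh_to_character(bvh_anim, char_bones):
    """Return the character's animation: for every frame, copy each
    BVH bone's transform, but only for bones both skeletons share."""
    char_anim = {}
    for frame, bones in bvh_anim.items():
        char_anim[frame] = {
            name: transform
            for name, transform in bones.items()
            if name in char_bones  # skip BVH bones the character lacks
        }
    return char_anim

# Hypothetical two-frame animation with (x, y, z) location tuples.
bvh = {1: {"Hips": (0, 0, 0), "Chest": (0, 5, 0)},
       2: {"Hips": (0, 1, 0), "Chest": (0, 6, 0)}}
result = copy_bvh_to_character(bvh, {"Hips", "Chest", "Head"})
print(result[2]["Hips"])  # (0, 1, 0)
```

The real work in Blender is per-channel (location/rotation per pose bone, keyframed each frame), but the matching-by-name structure is the same.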

PS: so glad I remade my main character, because the new model makes the first one look like garbage! But then again, art is my God-given talent, so it’s no surprise I was able to outdo my first model.

Wish me luck!

Alright… after checking out Kinect motion capture further, I have drawn this conclusion:

Keep on keyframing!

The only real way to make the Kinect useful with Blender is to build, for your character, the exact same skeleton that’s created in the BVH file. The bones must have the same names and everything.
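A quick sanity check for that “same names and everything” requirement: compare the BVH skeleton’s bone names against your character’s and report anything that won’t line up. The bone lists below are hypothetical examples:

```python
# Report name mismatches between the BVH skeleton and the character's
# armature, since any bone missing on either side won't get animated.
def compare_skeletons(bvh_bones, char_bones):
    bvh, char = set(bvh_bones), set(char_bones)
    return {
        "missing_on_character": sorted(bvh - char),  # BVH data with no target
        "extra_on_character": sorted(char - bvh),    # bones that stay static
    }

report = compare_skeletons(
    ["Hips", "Chest", "Neck", "Head"],          # hypothetical BVH joints
    ["Hips", "Chest", "Neck", "Head", "Jaw"])   # hypothetical character bones
print(report)
# {'missing_on_character': [], 'extra_on_character': ['Jaw']}
```

Extra character bones (facial bones, weapon bones, etc.) are harmless; they just won’t receive any captured motion.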

Once you’ve finished an animation and created a BVH file, you can import it into Blender and simply copy the data from the BVH skeleton to your character’s skeleton, bone by bone.

I still feel the results can be unpredictable because of the angle your motion was captured at by the Kinect; things can get pretty ugly when applying the data to your character’s bones.

I would find it much easier and more accurate to just keyframe all my animations over again. Besides, my character’s skeleton is much more complete than the one from the BVH file.

I guess I’ll buy an Xbox 360 to go with that Kinect, otherwise it’ll just collect dust. That hurts because I’m a PS3 fan.

If you really want that BVH animation, then you have to set your character’s pose to match the rest position of the BVH armature. You can see the BVH file’s rest position by entering edit mode. Use a modifier to get your character into the correct pose, then apply the modifier to keep it there. Now enter edit mode on the BVH armature and scale it down or up as needed; just match the bones to your character.

Now make sure your vertex groups match the bone names of the BVH armature. Once all that is done, set the BVH skeleton as an Armature modifier on your character. You should now see your character perfectly matched up with the BVH animation.
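The vertex-group renaming can be done by hand, but with many groups it’s easier to write the mapping down once. A small sketch, with entirely made-up name pairs (in Blender you’d apply the result to each entry in `obj.vertex_groups`):

```python
# Rename the character's vertex groups to the BVH bone names so the
# Armature modifier can find them. RENAMES maps your group names to
# the BVH names; both sides here are hypothetical.
RENAMES = {
    "UpperArm.L": "LeftShoulder",
    "UpperArm.R": "RightShoulder",
    "Spine":      "Chest",
}

def rename_groups(group_names, renames):
    """Swap any mapped group name for its BVH bone name;
    unmapped names pass through unchanged."""
    return [renames.get(name, name) for name in group_names]

print(rename_groups(["Spine", "UpperArm.L", "Hips"], RENAMES))
# ['Chest', 'LeftShoulder', 'Hips']
```

Any group left unrenamed simply won’t be driven by the BVH armature, which is fine for bones the capture never had.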

It takes some work… and I only recommend it for the “die hard” developer who is making a game that can really benefit from real human motion. Good keyframing can look just as good. …Um, err… not quite as good.