Can I export Note Track info into egg animations?

I suspect I already know the answer to this, and that it’s “no”, but wanted to ask anyway.
Is there a way to export Note track information into an egg animation using the current 3ds max exporter?
(Which I’ve seen expressed elsewhere as “adding an event channel to the Actor Animation”.)

I’m aware that I can call a function on a given frame in an animation using code like this:

    part1 = self.model.actorInterval('melee', startFrame=1, endFrame=5)
    part2 = Func(self.chkMelee, target)
    part3 = self.model.actorInterval('melee', startFrame=6, endFrame=23)

But I will try to explain why I’d prefer to be able to have a function called from a note track-like entry inside the animations themselves:
The game I am working on will have many, many animations. These will be “melees” (i.e. punches or kicks), and will be played at random. They will be different lengths, contain different numbers of frames, and have the point of contact at a different frame from animation to animation.
If I could add a “contact” note track note into each animation at the appropriate frame, the game could check for that, and do what it needs to do at that point in the animation.
I’m hoping to avoid having to write separate code for each and every melee, or ranged animation.
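Even without note-track export, one way to avoid per-animation code is to keep the contact frames as data rather than logic. A minimal pure-Python sketch (the animation names, frame numbers, and function names here are hypothetical, not anything Panda3D provides):

```python
# Sketch: a hand-maintained table of contact frames, keyed by animation
# name.  All names and frame numbers below are made up for illustration.
CONTACT_FRAMES = {
    "melee_punch_a": 7,
    "melee_kick_b": 12,
    "melee_headbutt": 5,
}

def is_contact_frame(anim_name, current_frame):
    """True when the named animation has reached its contact frame."""
    return CONTACT_FRAMES.get(anim_name) == current_frame
```

One generic check against this table then serves all fifty melees; adding a new animation means adding one dictionary entry, not new code.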

If it’s simply not possible with the .egg exporter, how about collada? or .x? Do either of them allow note track info to be exported in an animation?

Or would it be possible to manually add the information within the .egg file in some way if it cannot be exported within the anim?

Based on the last post in this thread, it doesn’t sound like Panda supports this - unless David got to this item on his list since then.

panda3d.org/forums/viewtopi … 82aaa86f1b

Assuming it’s not possible in Panda, any ideas from anyone on implementing in Python somehow?

bump.

I assume Panda does not support this currently?

Is it on the “list”? (as mentioned in the linked thread).

Is there a way to do it with code? (though I fear this would be quite a piece of code)

The linked thread only references throwing an event at the end of an animation. That feature is not yet present. But as also described in the thread, it’s not a required feature, since you can achieve the same thing with a do-later or with a sequence.

Still, none of this comes close to automatic loading of such a thing from Max. You’ll have to do all that yourself. I have confidence that you can do it. :slight_smile:

David

Appreciate the vote of confidence :slight_smile:

(not sure where to start, but I’ll get to it eventually :slight_smile: )

I had an idea of making some dummy bones or objects in the models, not attached to anything.
At the right frame in an animation, these would be rotated 180 degrees, then rotated back in the next frame.
A function in the game checks the rotation of the bone in the model if that type of animation is playing, and if and when the bone rotates, plays the sound, or whatever the desired result is.
That way, if I have 50 different “melee” animations and want the game to play a sound when each “contacts”, with the contact occurring at different frames and times in each, I can set up the bone to rotate at the right time in each animation in Max, export the anims, and the game will automatically play the sound at the correct frame of each one.
Seems simple enough, but of course I haven’t gotten my badly executed version of the code to work yet.

I can’t think of any reason why it shouldn’t work once someone who actually knows what they are doing writes the function, though.
Can anyone think of any way this is “wrong”? In my ignorance am I overlooking a good coding reason not to do this that way?

You may not observe every frame of the animation play. If the simulation frame rate is lower than the animation frame rate, some of your frames will be skipped, and one of the skipped frames might be your key 180-degree animation.

It would be better to know ahead of time at which frame number (or at which point in time) these key moments occur, and set a timer to expire in synchronicity with that moment. That way you wouldn’t have to rely on observing a particular frame datum when it happens; if you skipped over that frame, the timer would just expire the next frame instead.
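The frame-to-time conversion behind that timer suggestion is plain arithmetic. A minimal sketch (the `contact_delay` helper is hypothetical; the `doMethodLater` call in the comment is Panda3D’s standard deferred-task API, shown untested here):

```python
def contact_delay(contact_frame, frames_per_second):
    """Seconds from the start of the animation until the contact
    frame would play, assuming playback at the given frame rate."""
    return contact_frame / float(frames_per_second)

# In Panda3D you would schedule the callback when the animation
# starts, e.g. (untested sketch with assumed names):
#   delay = contact_delay(12, 24)
#   taskMgr.doMethodLater(delay, self.onContact, 'contact-timer')
```

Because the timer fires on elapsed time rather than on observing a particular frame, it still works when the renderer skips frames.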

You can still determine which frame number(s) are your key frames by pre-examining the data for the key values, if you like that approach. Just don’t rely on examining it at runtime.

David

Thanks, David.
That makes sense.

Well, I’m afraid I’m a bit stumped here.
It’s probably very simple, and coding really isn’t my thing, but I cannot seem to read the rotation value of the bone within the model.
I can get a value for the “r” out of hpr either by using:

    self.charContact = self.model.exposeJoint(None, "modelRoot", "contact")
    c = self.charContact.getR()

or:

    self.contact = self.model.find("**/contact")
    f = self.contact.getR()

(I used egg-optchar to “label” the “contact” bone first) and I’ve extended the number of frames the bone is rotated for in the animation to 15, to be sure they aren’t being skipped.
But during the animation I get only one value for “r” and it’s -180.

Now, some other animations have the whole model rotate over, and in those, the value of “r” changes to 90, even though the bone itself isn’t rotated in those anims - the whole model is.
So it appears I’m reading a value for the entire model, not one bone within it.
Can anyone take pity on a none-too-bright and somewhat lazy neophyte coder and explain to me what I’m doing wrong?

EDIT:
I think, looking at the code which calls the function again, that I’m only sampling once per animation, when it should be every frame, or at least every few frames…

Right, you will need to run through the entire animation and get the value for each frame. One way to do this is to run through it in a loop:
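Something like the following sketch, assuming an exposed joint named "contact" and Panda3D’s `Actor.pose()`/`update()`/`getNumFrames()` calls; the flip-detection helper is plain Python so its logic can be checked on its own:

```python
def find_flip_frames(r_values, threshold=90.0):
    """Return the frame numbers where the joint's roll jumps by more
    than `threshold` degrees between consecutive frames (i.e. the
    180-degree 'marker' rotation and its rotation back).  The first
    returned frame is the contact frame."""
    flips = []
    for frame in range(1, len(r_values)):
        if abs(r_values[frame] - r_values[frame - 1]) > threshold:
            flips.append(frame)
    return flips

# Filling r_values in Panda3D (untested sketch; run once at load time,
# not during gameplay):
#   joint = actor.exposeJoint(None, 'modelRoot', 'contact')
#   r_values = []
#   for frame in range(actor.getNumFrames('melee')):
#       actor.pose('melee', frame)
#       actor.update()   # force the pose into the joints now
#       r_values.append(joint.getR())
```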

The call to update() is necessary to force the animation into the joints without waiting for the frame to be rendered.

Note that the “r” rotation in the source animation is not necessarily “r” in the Actor, due to a possible coordinate-system transformation. It might be “p” or “h” instead. That’s why I suggest looking at all three components of getHpr() to understand where it’s changing.
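To see which component the coordinate-system conversion moved the rotation into, you can record full `getHpr()` samples per frame during the pre-scan and check which channels actually vary. A small pure-Python helper (hypothetical name, samples shown as plain tuples):

```python
def changing_components(hpr_samples, eps=1e-3):
    """Given a list of (h, p, r) tuples sampled per frame, return the
    names of the components that actually change across the samples."""
    names = ('h', 'p', 'r')
    varying = []
    for i, name in enumerate(names):
        values = [sample[i] for sample in hpr_samples]
        if max(values) - min(values) > eps:
            varying.append(name)
    return varying
```

Running this over the pre-scan data tells you which single channel to watch for the 180-degree flip in the exported animation.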

David