Animation Influence on fps

Do you mean it complained when you ran egg-optchar, or when you loaded the resulting egg files? Egg-optchar is more stringent about matching bone names than the Actor loader is, so it might produce more error messages. But it won’t lie to you: if it complains about mismatched bone names, then there are, indeed, mismatched bone names.

One of the other things that egg-optchar does is rename mismatched bone names as needed so that the hierarchy exactly matches between model and all animation files. So if you’re getting this error message when loading the files that you ran through egg-optchar, something impossible has happened, and that makes me think you might be loading the wrong files.


Like I said, I overwrote the original files with the egg-optchar ones. I use the same skeleton type for all my characters, so the bone names are always the same (only the skeleton name changes from Actor to Actor). Whether it's the model file or a file containing an animation, the same skeleton is there with the same bone names.

I was getting error messages stating that Hand.L or ULeg.R did not exist, things like that, which is crazy because the left hand bone wasn't missing from any file I passed to egg-optchar, nor was any other bone.

If I remember correctly, the errors occurred after running the app with the new files, not during the egg-optchar process. It's like the Actor's skeleton gets destroyed and broken down to only a few working bones. That's why the Actor is in its original modeled pose, but with some of its vertices being animated, stretching away from the Actor's geometry (which looks crazy).

I'm assuming the only way to get a dozen Actors on screen and maintain a perfect 60 fps is to create a super cheap environment around them: either using a pre-rendered backdrop, or creating smaller, cheaply modeled closed environments, like a square room or something, where the walls are all flat panels with bump mapping done on the graphics card.

This however limits one’s creativity.

Just in case you're wondering, when I ran egg-optchar, I listed the Actor's model file first and then followed it with all the animation files. I figured that's the way it was supposed to be done.
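For what it's worth, that's roughly the invocation I used (the file names here are made up; egg-optchar needs either -inplace or -d to know where to write its output):

```shell
# Process the model and all of its animations together, in place,
# so egg-optchar can keep the bone hierarchy consistent across files.
egg-optchar -inplace character-model.egg walk.egg run.egg attack.egg
```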

If you can provide an example of a series of egg files that work correctly before egg-optchar is applied to them, but fail to work after egg-optchar is applied, I’d love to see it.


Mark, if you could upload your problematic actor somewhere, I would love to take a look. If animation is really a problem (I haven't really tested animations myself), it would motivate me to develop a shader to do this on the gpu :slight_smile:

With that said, assuming you have a modern CPU running at ~2 GHz, and each vertex takes about 20 math ops to transform, the conceivable limit for 60 Hz animation of actors is something like

2 GHz × 0.05 efficiency ÷ 20 ops ÷ 60 Hz ≈ 80k vertices

The 5% efficiency factor is to take into account that unless Panda has really optimized this stage, there are going to be a lot of cache stalls.
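As a sanity check on that arithmetic (the 5% efficiency figure is just the assumption stated above, not a measured number):

```python
# Back-of-envelope budget for CPU-skinned vertices at 60 Hz.
clock_hz = 2e9        # ~2 GHz CPU
efficiency = 0.05     # assumed ~5% of peak, to account for cache stalls
ops_per_vertex = 20   # rough math-op count to transform one vertex
frame_rate = 60       # target frames per second

vertex_budget = clock_hz * efficiency / ops_per_vertex / frame_rate
print(f"{vertex_budget:,.0f} vertices per frame")  # roughly 83,000
```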

In the end, animation transformations should really be moved to the GPU, which runs ~100 times faster than the CPU for this kind of work.

I believe there’s a config setting for this.

hardware-animated-vertices #t
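In a Config.prc file that would look like the fragment below (it can also be set at runtime with loadPrcFileData before the window is opened); whether it actually changes anything depends on your hardware and drivers:

```
# Config.prc -- request hardware (GPU) vertex animation where supported
hardware-animated-vertices #t
```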

Makes no difference on my machine (fps wise).

As far as I know the hardware animation setting was created long ago for now-outdated hardware/drivers.
I was considering taking a stab at GPU skinning myself, but adding LODs for the actors brought the workload down so much that it wasn’t really worth the effort anymore. Further, if you take advantage of the new 3-thread rendering pipeline it is probably faster to keep skinning on the CPU.
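In case it's useful, the threaded rendering pipeline is also driven by a prc setting; this is a sketch assuming a 1.8-era Panda3D build compiled with true threading support:

```
# Config.prc -- run the cull and draw stages in their own threads
# (requires a Panda3D build compiled with threading support)
threading-model Cull/Draw
```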

You're talking about the 1.8 version that's supposed to be released. Is that version available?

I was watching one of the P3D class videos where that stuff was talked about. The teacher in the video said fps only increased by about 1x, instead of 2x or 3x.

I'm guessing it's a very hard process to complete: getting P3D to run like other commercial engines using threading with APP, CULL and DRAW.

That’s why I don’t get into the engine construction stuff. Too much for me. :smiley:

I guess my practice game design hasn't been too bad. I was watching some videos about average fps for consoles, and it turns out PS3 and Xbox 360 games mostly run around 30 fps, with some of them playing closer to 20.

It was no surprise when I saw it in those videos; I always figured console games were running around 30 fps or lower. God of War 3 (PS3) was bouncing from the 30s to the 40s, and I saw some shooter game bouncing around 28 fps.

The thing is, consoles are made for playing a game. PCs have to handle numerous processes and can't devote 100% of their processing time to a game.

So all in all, I guess I'm holding up well with my fps after all. The cool thing about my PC is that it will automatically use an entire processor for a game; I don't have to set this behavior up myself.

Console games do look smoother at lower fps though, I’ll give them that much.

My enemy spawn will be a lot like the enemy spawn I saw in the God of War 1 videos, where there are anywhere from two to four enemies on screen at a time on average, with a new one spawning as another dies. I'll go with more if I have a really super low poly enemy, like skeletons or something.

Depending on my environment, I should stay around 50 fps. I might drop to the high 40s in larger outdoor environments.