Performance with high polygon and vertex counts

Hi, a couple of quick questions:

QUESTION #1:

If I were to use a character like Masha at 3d02-ae.com, which has a
polygon count of 22,649 and a vertex count of 12,252, could I expect reasonable performance in Panda, or should a character like that be left for film and Xbox projects? I thought I'd ask rather than buying those kinds of models and trying them out.

QUESTION #2:

What would you anticipate is the highest polygon count for a character that I could use with Panda and still get great results, for an exploration-type game that does not require immediate response?

QUESTION #3:

Quick question about terrain. Let's say I needed a specific terrain with valleys, houses, and mountains along the way. Roughly how large a terrain could I reasonably import and still get good performance?
(I noticed that the sample program imports environment.egg which is a pretty small terrain.)

QUESTION #4:
I'm guessing new terrain could be downloaded in the background, so that by the time the first terrain is crossed and a new terrain is entered, the new one could be presented. Is that right? Or does all gameplay have to cease while a new terrain is loaded?

Thanks in advance

Wow, lotta questions. :slight_smile:

Q1. The new version of Panda3D is pretty efficient with high-poly characters. The fastest any engine could theoretically go would be to put the character into a single big vertex array and send it to the video card in a single drawing command; performance would then be limited only by the capabilities of the video card. Panda comes fairly close to that ideal. The big exception is that Panda's shader system and Panda's hardware animation system haven't been integrated with each other yet. By "hardware animation system" I mean that Panda can let the video card do the animation, which is currently the fastest way to do it. However, Panda can't do that and use shaders at the same time. In that one case, Panda has to calculate the vertex positions itself, and then send them to the video card. How big a limitation is that? Frankly, I haven't tested it.
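As an aside, whether Panda attempts hardware animation is controlled through Config.prc. A sketch, assuming the variable name is unchanged in your version of Panda (check the manual for your release):

```
# Config.prc: ask Panda to animate vertices on the video card when possible.
# Panda falls back to CPU animation when shaders are in use, as described above.
hardware-animated-vertices #t
```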

Q2. Don’t know. If I were you, I’d just do a simple test. Download a high-polygon model (like, say, “eve”) from our website. Run it in “pview” (the model viewer), and turn on the frame-rate display. Then, repeat with two copies of eve, then three, and so forth. Get a few data points and extrapolate.
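If you want to turn those data points into an estimate, it helps to work in frame *time* rather than FPS, since per-character cost tends to add up roughly linearly there. A small sketch, not Panda-specific; the sample numbers are made up, and you'd substitute your own pview readings:

```python
# Hypothetical measurements: (number of characters on screen, observed FPS).
samples = [(1, 240.0), (2, 130.0), (4, 70.0)]

# Convert to milliseconds per frame, where cost is roughly linear in count.
points = [(n, 1000.0 / fps) for n, fps in samples]

# Least-squares fit of: ms_per_frame = base + cost_per_character * n.
count = len(points)
mean_n = sum(n for n, _ in points) / count
mean_t = sum(t for _, t in points) / count
cost = sum((n - mean_n) * (t - mean_t) for n, t in points) / \
       sum((n - mean_n) ** 2 for n, _ in points)
base = mean_t - cost * mean_n

def predicted_fps(n):
    """Estimated frame rate with n characters on screen."""
    return 1000.0 / (base + cost * n)

print(round(predicted_fps(8), 1))  # extrapolated estimate for eight characters
```

The fit is crude, but a couple of real measurements are enough to tell you roughly where your polygon budget runs out.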

Q3. As far as panda is concerned, terrain is just another 3D model. Again, I’m afraid I don’t know a concrete number. However, I do know that the “Airblade” game contains a very large static model, with literally hundreds of thousands of polygons (the name of the model is “SuperCity”). You might want to try that in the model viewer for a benchmark.

Q4. You can certainly download and load new 3D models (including terrain) on the fly. The simplest way is to download a BAM file (or, if you don't care about speed, an EGG file) and then load it up. But if you want to get fancy, you can download the data into RAM and load it directly from there without any disk access. Also, I've been meaning to write a subroutine that converts an elevation array into a 3D model. It wouldn't be hard to do; I just never get around to it. If somebody were to write it, you could download the elevations into one array and the texture data into another, then convert it all to a model and display it.
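To make the "elevation array to 3D model" idea concrete, here is the grid-to-mesh part sketched in plain Python: build one vertex per grid point and two triangles per grid cell. The function name and grid spacing are illustrative, not part of any Panda3D API:

```python
# Sketch: turn a grid of heights into vertex positions and triangle indices.
def heightfield_to_mesh(heights, spacing=1.0):
    """heights is a list of rows of z-values; returns (vertices, triangles)."""
    rows = len(heights)
    cols = len(heights[0])
    # One (x, y, z) vertex per grid point.
    vertices = [(x * spacing, y * spacing, heights[y][x])
                for y in range(rows) for x in range(cols)]
    # Two triangles per grid cell, as index triples into `vertices`.
    triangles = []
    for y in range(rows - 1):
        for x in range(cols - 1):
            i = y * cols + x
            triangles.append((i, i + 1, i + cols))
            triangles.append((i + 1, i + cols + 1, i + cols))
    return vertices, triangles

# A tiny 3x3 heightfield: 9 vertices, 4 cells, 8 triangles.
verts, tris = heightfield_to_mesh([[0, 0, 0], [0, 1, 0], [0, 0, 0]])
print(len(verts), len(tris))  # 9 8
```

In Panda itself, the next step would be writing those vertices into a GeomVertexData and the triangles into a GeomTriangles primitive (the manual covers this under procedural model generation), then parenting the resulting GeomNode into the scene graph.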

Greetings to ALL!!!

2mrclean
“The big exception is that panda’s shader system and panda’s hardware animation system haven’t been integrated with each other yet. So when I say “hardware animation system,” that means that panda can let the video card do the animation (that’s the fastest way to do animation, currently). However, panda can’t do that and use shaders at the same time.”

Do you plan to add shader support (bump mapping, for example) for animated models? And if so, when?

Sorry to resurrect an ancient thread, but this was among the more provocative results when I searched for benchmark/performance information. Is the limitation he described still an issue for Panda? That is, are hardware animation and shaders still incompatible?

Yes, though it’s not so much a limitation as you might think. With modern standards for animation continually creeping up, few people attempt to do hardware animation these days. Turns out the CPU is really good at this sort of thing, and the GPU isn’t particularly. Usually best to save the GPU cycles for what it really is good at.

That being said, you can always write your own shader to perform animation on the GPU, and not use Panda’s entire animation subsystem.

David