Improving the time taken for conversion to bam

Are there any factors that might significantly slow down Panda’s conversion from egg to bam?

To explain, I have a moderately-large level. While export to egg takes long enough, it’s not too bad. (Somewhere between twenty minutes and an hour, I think–I’m not sure, as I usually let it run while I’m elsewhere.) However, when I load it into my game and it’s (presumably) converted to bam, the process seems to take somewhere between two and three hours–again, I’m not sure.

This only occurs when the level is first loaded; subsequent loads are quick, presumably because they use the cached bam file.

It might help for me to use “egg2bam” directly and have the level just load the bam file, thus removing the overhead of my IDE and the game itself during conversion, but I’m wondering whether some feature of my level might be part of the problem.
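For clarity, the precompile step that I have in mind would be something like this rough sketch (with “level.egg” standing in for my actual file; running “egg2bam” on the command line should amount to much the same thing):

```python
# Offline precompile: load the .egg once and write out a .bam,
# so that the game itself only ever loads the .bam.
from direct.showbase.ShowBase import ShowBase

base = ShowBase(windowType='none')          # headless; no window needed for this

model = base.loader.loadModel("level.egg")  # the slow step, done once, offline
model.writeBamFile("level.bam")             # the game then loads this file instead
```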

I would assume that it’s the process of loading the .egg file that takes long, not the process of writing the result out to .bam (since .bam is a fairly straightforward memory dump).

Perhaps you could share with us some details about how your .egg file is structured? If using Blender, it may help to use the DupliGroup functionality with an externally linked .blend file in combination with the “File” game property in order to split up your scene into multiple .egg files. If we have the original .egg file, we could also profile the .egg loader to see if there are any obvious bottlenecks to optimize.

Divide the level into multiple smaller parts?
You can make the egg file smaller by:

  • using short names
    If you have a million verts in a node called some_long_named_node, then you probably have a million or more entries like <VertexRef> { 1 2 3 <Ref> { some_long_named_node.verts } } or something similar. Naming the node n1 instead saves you roughly a million times 19 characters in the file. The same goes for textures: naming them <Texture> T1, T2 and so on can save a lot of space.

  • get rid of unused entries
    Do you use vertex colors? If not, get rid of all the <RGBA> { 1 1 1 1 } entries; they may appear in <Polygon> or <Vertex> entries. Do you use normal maps? If not, get rid of the <Tangent> and <Binormal> entries.

  • round to zero
    Depending on the exporter, you can find numbers like -1.50996e-007; replace all of these with 0, and also replace all 0.0 with 0. (A simple text-processing pass, like the sketch after this list, can handle this and the unused entries above.)
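If you don’t want to do this by hand, a rough text-processing pass in Python can handle the last two points in one go. This is only a sketch: the file names are stand-ins, the patterns are deliberately simple, and you should keep a backup and re-check the result (with pview or egg2bam, for example) before trusting it.

```python
import re

# Rough clean-up pass over a (plain-text) .egg file:
# rounds tiny values to 0 and drops all-white vertex colours.
with open("level.egg", "r") as f:
    text = f.read()

# Replace tiny scientific-notation values such as -1.50996e-007 with 0.
text = re.sub(r"-?\d+(?:\.\d+)?[eE]-00\d+", "0", text)

# Replace plain 0.0 / 0.000000 with 0.
text = re.sub(r"\b0\.0+\b", "0", text)

# Remove all-white vertex colours, e.g. <RGBA> { 1 1 1 1 }.
text = re.sub(r"<RGBA>\s*\{\s*1\s+1\s+1\s+1\s*\}", "", text)

with open("level_smaller.egg", "w") as f:
    f.write(text)
```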

Regarding rounding the data and removing unnecessary fields: I did this in my exporter, Hatcher. However, it is not yet complete.

Thank you all for your responses! :slight_smile:

Regarding splitting the level into multiple models, I’ve been thinking about that. However, I hesitate because to some degree that means moving downtime from my end to the player’s end–and even though the additional loading-time on the player’s end would likely be minimal, across many players it would add up. I’d rather that it be my time that’s taken up than that I ask it of my players.

In what way? Do you mean the structure of the level in Blender? If you mean the file itself, I’m just exporting through YABEE.

One thing that I will say is that I do have a lot of nodes: it’s a fairly detailed scene. I can perhaps reduce the node-count a bit by merging the stories of the level’s buildings–but I’m not sure of how much further I can take it, given that different objects may use different materials.

I’m tempted–but the egg file in question is 2.4GB in size. ^^; (There’s a smaller one at about 1.4GB, I think.)

Wow, I didn’t know that node-names were repeated so often! 0_0

Hmm… I might experiment with this, and see what effect it has to replace some of the default “Plane”, “Cube”, and “Cylinder” names with names composed of a single character and a numeric suffix…

Hmm… I make fairly extensive use of vertex-colours and especially normal-maps.

As a result of the former, I likely have numerous objects that have materials with the “Vertex Color Paint” box checked (in Blender, that is), but which themselves don’t have vertex colours.

That said, does YABEE export those fields where they’re not used?

That might be worth looking into, too. Would it be better to do this in Blender, or elsewhere, do you think?

Theoretically you could merge the .bam files and ship the merged .bam files in the final version.

It sounds like this ought to be a feature of YABEE, or egg-trans, or both.

Interesting! How would I go about doing that?

It does seem that it might be useful.

You can just load each bam file using loadModel and parent its children to the same node, and then write that node out again using writeBamFile.
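Roughly like this, as a sketch (the file names here are just placeholders):

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import NodePath

base = ShowBase(windowType='none')   # headless; this is an offline tool

merged = NodePath("level")           # common root for all the pieces

# Placeholder file names; substitute the .bam files produced for each part.
for path in ("part1.bam", "part2.bam", "part3.bam"):
    part = base.loader.loadModel(path)
    part.getChildren().reparentTo(merged)   # move each part's children under the root

merged.writeBamFile("level_merged.bam")     # ship this single file instead
```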

I’m not suggesting this as something you should definitely do; just making you aware that this is a possibility.

It seems like a promising approach, however. Are there any caveats that I should be aware of?

Not that I can think of, other than the fact that it makes your pipeline more complicated.

That’s not a huge concern, I think, so excellent! I may well do that, then. Thank you! :slight_smile:

I did have another thought between my last post and this:

I ran some experiments with renamed nodes, and while that didn’t produce huge differences, I was struck by something: I was exporting perhaps a third of my level. A few Blender-layers were excluded–but nothing terribly complicated, I think.

And yet importation took very little time at all–a matter of minutes.

So, the thought occurred to me: could it be that I’m hitting a memory wall during the conversion to bam, perhaps leaving the computer swapping things into and out of virtual memory? I do have a fair bit of memory in this computer: 8GB in total, I believe–but I daresay that not all of that is available at a given point. But perhaps somewhere in the process the whole structure is becoming a bit too big…?
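To test that hypothesis, I suppose that I could log the process’s memory before and after the load, something along these lines (a rough sketch; it assumes the third-party psutil module, and “level.egg” is a stand-in for my actual file):

```python
import time
import psutil  # third-party: pip install psutil
from direct.showbase.ShowBase import ShowBase

base = ShowBase(windowType='none')
proc = psutil.Process()

before = proc.memory_info().rss
start = time.time()
model = base.loader.loadModel("level.egg")  # the slow first load (egg-to-bam conversion)
elapsed = time.time() - start
after = proc.memory_info().rss

print("Load took %.1f s; resident memory grew by %.1f MB"
      % (elapsed, (after - before) / 1e6))
```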

I think you could create a generator for large EGG files for community tests, to check the speed of conversion to BAM.
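For example, a minimal sketch of such a generator using Panda’s own egg library might look like this (the triangle count and file name are arbitrary, and it produces only bare geometry, with no textures or materials):

```python
import random
from panda3d.core import Filename, Point3D
from panda3d.egg import EggData, EggGroup, EggPolygon, EggVertex, EggVertexPool

data = EggData()
vpool = EggVertexPool("verts")
data.addChild(vpool)
group = EggGroup("geometry")
data.addChild(group)

# Emit a pile of random triangles; raise the count to stress the loader further.
for i in range(100000):
    poly = EggPolygon()
    for _ in range(3):
        v = EggVertex()
        v.setPos(Point3D(random.uniform(-100, 100),
                         random.uniform(-100, 100),
                         random.uniform(-100, 100)))
        poly.addVertex(vpool.addVertex(v))   # pool owns the vertex; polygon references it
    group.addChild(poly)

data.writeEgg(Filename("stress_test.egg"))   # then time egg2bam or loadModel on this
```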

Hmm… I suppose that it would just generate a bunch of arbitrary geometry, and then write it out as an egg file? That might work–but I worry that it might miss some feature of my level setup that may contribute to the problem. :/

And what if the problem is the size itself, and not connected with the structure of your file at all? Sooner or later, everything comes up against a ceiling.

@Thaumaturge, is the blend file significantly smaller (Blender also has an option to compress blend files if you’re not already using it)? If so, is it something you can share (at least privately)? I would be interested in seeing if/how BlenderPanda chokes on it.

That’s a fair point, I think. I’ll give it some thought, then. Thank you! :slight_smile:

It’s about 288MB, although I presume that the textures aren’t included in that.

As to sharing it… Hmm. I’m not entirely comfortable with the idea, I’ll confess, especially as it presumably involves either also sending my various textures, or replacing them with stand-ins. It may be silly, but I get a little paranoid–especially as I’ve spent a lot of time and effort on this. I’ll think about it, however.

I didn’t know about the option to compress blend files! I don’t know whether it’s active in my case, but if not, I don’t think that it’s called for quite yet. Still, I’m glad to know of it–thank you. :slight_smile: