egg (dae2egg) optimisation?

Hi.

I've recently tried to import some (very) big meshes with Collada.

They were about 22 MB (200k triangles, 4 500 lines).

After converting to egg with the Collada importer (dae2egg.exe, version 1.6.2), the resulting egg file is 220 MB (6 763 367 lines).

Both are supposed to be text formats. Why such a difference?

My whole pipeline is:

object created procedurally in Panda3D -> exported to dae with my own method -> converted back to egg by dae2egg.exe

(I need the dae file for Maya, and that works fine.)
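That last step can also be scripted, so the conversion runs unattended; a rough, untested sketch (the filenames here are just placeholders, not my real files):

import subprocess

def dae_to_egg(dae_path, egg_path):
    # dae2egg ships with Panda3D; -o names the output egg file
    subprocess.call(["dae2egg", dae_path, "-o", egg_path])

dae_to_egg("procedural_mesh.dae", "procedural_mesh.egg")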

Egg is very verbose. That's why Panda3D can load egg.pz files straight off the hard drive. I'm pretty sure your egg.pz file would be small, because egg compresses well.

Collada packs all the vertex data into a single array element:

<float_array> .. bunch of vertex data .. </float_array>

Egg repeats a tag for every single vertex:

<Vertex> 1 { 1 2 3 }
<Vertex> 2 { 1 2 3 }
.. and so on for the whole bunch of vertex data ..
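A quick way to see how much of that 220 MB is just repeated text is to zlib-compress the egg and compare sizes (as far as I know, pzip's .pz format is zlib-based, so the ratio should be comparable; the filename below is a placeholder):

import zlib

raw = open("procedural_mesh.egg", "rb").read()
packed = zlib.compress(raw, 9)
print("raw: %.1f MB  compressed: %.1f MB"
      % (len(raw) / 1e6, len(packed) / 1e6))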

Yes, if I pzip the egg it shrinks to 10 MB.

But the problem is still there: it still takes a lot of memory.

If I look at the memory used:

python.exe

1 956 k

from direct.directbase import DirectStart

131 324 k (normal result)

bob = loader.loadModel('procedural_mesh_128516.egg.pz')

1 016 000 k
That result includes loading about 50 textures, but it's still a little too much, I think.
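For reference, the same numbers can also be taken from inside the script instead of the task manager; a rough sketch, assuming the third-party psutil package is installed:

import os
import psutil

proc = psutil.Process(os.getpid())

def report(label):
    # memory_info().rss is the resident set size of this python.exe process
    print("%-20s %d k" % (label, proc.memory_info().rss // 1024))

report("bare python")

from direct.directbase import DirectStart
report("after DirectStart")

bob = loader.loadModel('procedural_mesh_128516.egg.pz')
report("after loadModel")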

Now, the first time I load the file it does take that 800 MB of memory. But I suppose it creates a bam in the cache, because if I restart Python and load the same file again, this time it only takes about 20 MB.

And if I egg2bam the file, it's a 20 MB file, so that should be it. I will create the bam first.
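A sketch of doing that conversion up front, assuming the uncompressed .egg is still on disk (egg2bam ships with Panda3D and -o names the output file):

import subprocess
from direct.directbase import DirectStart

subprocess.call(["egg2bam", "procedural_mesh_128516.egg",
                 "-o", "procedural_mesh_128516.bam"])

bob = loader.loadModel("procedural_mesh_128516.bam")

Panda's own model cache (the model-cache-dir setting in Config.prc) does essentially the same thing automatically, which would explain why the second load only cost about 20 MB.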

But one question: why isn't that extra 800 MB freed somewhere?

When memory is “freed”, it doesn’t necessarily go back to the operating system. It remains allocated to the process, even though it’s technically free memory. But it’s still available for the process to re-use, so that if you loaded another large egg file, it would just re-use that same memory, without allocating more.

David
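To illustrate David's last point, a compact sketch (again assuming psutil is installed; 'another_big_mesh.egg.pz' is a made-up second large file, not one from this thread):

import os
import psutil
from direct.directbase import DirectStart

proc = psutil.Process(os.getpid())

def rss_mb():
    return proc.memory_info().rss // (1024 * 1024)

first = loader.loadModel('procedural_mesh_128516.egg.pz')
print("after first load:  %d MB" % rss_mb())

first.removeNode()  # the parsing memory is now "free", but it stays
                    # allocated to the python.exe process

second = loader.loadModel('another_big_mesh.egg.pz')
print("after second load: %d MB" % rss_mb())  # expected: not much higher,
                                              # since the freed memory is re-used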

Thanks, no more questions! :stuck_out_tongue:

I always wonder why people ask for free RAM. Free memory is unused memory; better to let something use it than nothing. I'd only start caring once it's full of data that is actually needed and in use.

Because if you want a lightweight application, you want to track memory leaks, or even just big data that could be compressed somehow.

When I see the memory allocated by a process increase, I always wonder whether it can be optimized.