I have a problem with flattenStrong().
At the moment I’m loading a model:
pandaActor = loader.loadModel(…)
After removing the node, I have a memory leak. If I don’t use flattenStrong(), I don’t have a memory leak. Any guesses?
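Roughly what we’re doing, as a sketch (the model path and class name here are placeholders, not our actual code):

```python
from direct.showbase.ShowBase import ShowBase

class Game(ShowBase):
    def load_level(self):
        # Load the level geometry (path is a placeholder)
        self.level = self.loader.loadModel("models/level")
        self.level.reparentTo(self.render)
        # Collapse the node hierarchy into as few Geoms as possible
        self.level.flattenStrong()

    def unload_level(self):
        # Remove the node; with the flattenStrong() call above,
        # memory usage keeps growing after each load/unload cycle
        self.level.removeNode()
        self.level = None
```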
Are you sure it is a memory leak, and not simply a cache filling up? Panda employs several caches internally, and some objects may persist temporarily in a cache even after they have been removed. But they will eventually age out of the cache and be deleted.
How are you determining that there is a leak?
Hi, thanks for your reply!
I’m working on this project together with whiteant, and from what we can see in the PStats tool, we’re pretty sure that what we have bumped into is a memory leak.
We first noticed it when our game crashed after playing a few levels, though not before telling us that it had run out of memory. What we then saw in the memory graph of PStats was quite convincing: every time we started a level, memory usage went up, just as expected. But when the level geometry should have been unloaded, memory usage did not decrease.
We tried to locate the cause of this issue and found that memory behavior was back to normal when we removed the call to flattenStrong().
Your explanation regarding caches (which of course we can see growing too when profiling our project) seems reasonable - is there any way to free any memory allocated by these caches?
Yes, it is possible to empty any of Panda’s caches, but you first need to know which cache is filling up (if it is indeed a cache). But still, usually there’s not a need to empty a cache.
Just because the memory doesn’t get returned to the system when you remove the models doesn’t mean it’s a leak either, or even a cache filling up: many of Panda’s memory allocation schemes are designed to allocate memory once, but never free it. Instead, it gets recycled for new objects as they are created. This is a performance optimization. When this happens, you can remove a model, and then load a new model, without increasing the memory beyond where it already was the first time.
You can tell you have a genuine leak when you can load and remove models repeatedly, and the memory usage continues to grow without bounds.
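To illustrate that test in isolation (this is not Panda-specific; `load_model` and `remove_model` below are hypothetical stand-ins for whatever your engine does):

```python
import tracemalloc

# Hypothetical stand-ins for the engine's load/remove calls.
_live = []

def load_model():
    _live.append(bytearray(1_000_000))  # pretend this is model data

def remove_model():
    _live.pop()  # a correct remove releases the data

tracemalloc.start()
load_model(); remove_model()           # warm up caches/allocators once
baseline, _ = tracemalloc.get_traced_memory()

for _ in range(100):                   # repeat the load/remove cycle
    load_model()
    remove_model()

current, _ = tracemalloc.get_traced_memory()
# A genuine leak shows up as growth proportional to the number of
# iterations; a recycling allocator plateaus near the baseline instead.
print(current - baseline < 1_000_000)
```

Here the remove is correct, so memory stays near the baseline; with a real leak, the difference would grow by roughly one model’s worth of data per iteration.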
For the record, though, off the top of my head, you can set these Config.prc variables to disable many of Panda’s internal caches:
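For instance, variables along these lines (check them against the Config.prc documentation for your Panda version, since the exact names and defaults can vary):

```
# Config.prc — shrink or disable internal caches
geom-cache-size 0
transform-cache false
state-cache false
```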
Note that setting these might (or might not) impact your performance.
There are also caches in the TexturePool and ModelPool; you can call TexturePool.garbageCollect() and ModelPool.garbageCollect() from time to time to force unused models and textures out of the cache. Overusing these calls can actually result in increased memory usage if you defeat the ability of the pools to share common objects.
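A sketch of how that might look in practice (call it at a natural boundary such as a level change, not every frame, for the sharing reason above):

```python
from panda3d.core import ModelPool, TexturePool

def purge_pools():
    # Drop models and textures that are cached but no longer
    # referenced anywhere in the scene graph.
    ModelPool.garbageCollect()
    TexturePool.garbageCollect()
```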
Finally, you can call base.win.getGsg().releaseAll() from time to time to release all of the graphics objects allocated in the current context. This is a drastic call that you really shouldn’t be making unless you are actively researching memory leaks.
Emptying the caches didn’t have any effect.
As the cache should kick in when the same data is loaded multiple times, I decided to profile the game again and observed the following effects:
When flattenStrong() is called after loading, the GeomCache, System Memory and Vertex Data graphs grow by the same amount after each reload.
Without flattening the model data, everything behaves as expected - memory usage grows when loading the model for the first time and then stays the same after each reload. In addition, general memory usage after loading the level geometry is lower than in the case with flattenStrong.
Very strange… Is there any possibility that we’re using flattenStrong() in the wrong way? To me it seems as if it’s either not using the geometry cache properly, or the original vertex data is kept in memory even though it isn’t needed anymore.
This is what’s happening. When you call flattenStrong(), it makes a duplicate copy of the vertices (that’s what flatten does). This copy is then recorded in the cache the first time the flattened model gets rendered. When you subsequently remove the model, the geom cache and vertex data cache have no way of knowing the vertex data isn’t needed any more, and they retain the data for a little while.
But these caches are finite, and limited by the config settings above. Eventually, the caches will fill up and then unused vertex data will be aged out. If you feel they are too large, you can set them smaller. If you set the caches to zero, they will not retain data beyond its time onscreen.