While experimenting a bit with a certain feature for my current project, I suddenly found myself faced with an odd problem: the program would crash with a segmentation fault (specifically, with the following error: Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)).
On investigating, it looks like the crash stems from a single, very large object that I created for the purposes of the current experiment: even a very simple program crashes when loading that object alone.
Both describe a segmentation-fault crash that occurs when loading a model while running under Linux. (And I am, indeed, running under Ubuntu Linux.)
My guess, then, is that the object is simply too big–or, more specifically, has too much geometry.
To be specific, Blender reports that it has:
~60 000 vertices
~100 000 edges
~60 000 faces / ~100 000 tris
Does that seem like so much that it would likely crash Panda…?
[edit]
I’ll note that chopping the object up into nine separate models allows the resulting set to be loaded successfully, indeed suggesting that my problem is sheer amount of geometry within a single model/node.
I suppose you'd need to create another model for the test. It's probably just a problem with duplicate data–for example, a lot of points sharing the same coordinates, and so on.
It confuses me that the number of vertices and faces is the same.
Possibly. As I said in my edit above, I did run the test of dividing the model into a number of smaller models–changing nothing else, literally just separating it into nine chunks. And when I did so, the resulting smaller models loaded without trouble.
I’d have to do the maths to check, but I think that this is just a result of the model being nothing more than a flat, square plane, divided many times into smaller squares. This means that–in most cases–each vertex is shared by four faces, and each face has four vertices.
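To sketch the maths (illustrative only–the actual model's subdivision count isn't known), here are the element counts for an n × n grid of quads; n = 244 is chosen simply because it lands the vertex count near 60 000:

```python
# Mesh-element counts for an n x n grid of quads
# (a flat, square plane subdivided into smaller squares, as described above).

def grid_counts(n):
    """Vertex/edge/face/triangle counts for an n x n quad grid."""
    vertices = (n + 1) ** 2          # (n+1) rows of (n+1) vertices
    edges = 2 * n * (n + 1)          # n+1 rows and n+1 columns of n edges each
    faces = n * n                    # quads
    tris = 2 * faces                 # each quad splits into two triangles
    return vertices, edges, faces, tris

v, e, f, t = grid_counts(244)        # illustrative n, chosen to land near 60 000 vertices
print(v, e, f, t)                    # 60025 119560 59536 119072
```

For large n, (n + 1)² ≈ n², so the vertex and face counts converge–consistent with the near-equal numbers reported above–while the edge and triangle counts both sit near twice that figure.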
This is an unreliable way to identify the problem. By your logic, it would turn out that Panda3D has a limit of 60,000 vertices for geometry. However, that's not very many…
I’d argue that it’s good enough to show more-or-less where the problem likely lies.
To be clear:
In the first case, there is a single, large object–i.e. a single Geom–within a model-file. Loading this model-file crashes Panda.
In the second case, the same object has been split into multiple objects–i.e. multiple Geoms–without removing vertices, all within a single model-file. Loading this model-file works as expected.
Now, while not a definitive test, I think that it’s enough to point suspicion at some sort of issue with loading Geoms with a great many vertices.
Especially when paired with the latter of the two threads to which I linked above.
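As a sanity check, one could report the vertex count of every Geom in a loaded model via Panda3D's scene-graph API. This is just a sketch of my own, not something from the threads above–the function is duck-typed over the standard NodePath/GeomNode calls:

```python
# Sketch: walk a loaded model and collect per-Geom vertex counts,
# to see how the vertices are distributed across Geoms.

def per_geom_vertex_counts(model):
    """Return a list of (node name, geom index, vertex rows) for every Geom under `model`."""
    counts = []
    for node_path in model.find_all_matches("**/+GeomNode"):
        geom_node = node_path.node()
        for i in range(geom_node.get_num_geoms()):
            rows = geom_node.get_geom(i).get_vertex_data().get_num_rows()
            counts.append((node_path.get_name(), i, rows))
    return counts

# Usage (inside a running Panda3D app; the model path is hypothetical):
#   model = loader.load_model("my_model.egg")
#   for name, i, rows in per_geom_vertex_counts(model):
#       print(name, i, rows)
```

Run against the split version of the model, this should show nine Geoms each well under the suspected threshold; the unsplit version–if it loaded at all–would show a single Geom carrying all ~60 000 vertices.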
Not exactly–after all, the game-level for which this object was originally made has more vertices overall.
However, those vertices aren’t within a single object, even if they’re stored within a single model-file.
No, instead I infer that Panda has–at least under Linux, as one of the two threads above reported a difference under Windows–some sort of upper bound on data within an individual Geom.
It’s not clear whether this is a hard limit, or related to available system RAM, or what, however.
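For what it's worth, a back-of-the-envelope calculation (my own illustrative assumptions–an assumed layout of position, normal, and UV as 32-bit floats, not the model's actual vertex format) puts the raw vertex data in the low megabytes:

```python
# Rough raw size of 60 000 vertices under an assumed layout of
# position (3 floats) + normal (3 floats) + texcoord (2 floats).
FLOAT_BYTES = 4
floats_per_vertex = 3 + 3 + 2            # assumed vertex layout
vertex_bytes = floats_per_vertex * FLOAT_BYTES
total_bytes = 60_000 * vertex_bytes
print(total_bytes, round(total_bytes / 2**20, 2))  # 1920000 bytes, ~1.83 MiB
```

That's tiny relative to typical system RAM, though of course the loader's working memory during conversion could be much larger than the final vertex data.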
My position isn't built from scratch: I have loaded models with more vertices, but in .bam format. In your case, the egg module is involved–perhaps there is some kind of conversion error.
But of course, we don’t know how our systems compare–perhaps I have less (free) RAM than you do, for example.
And it’s very possible that it’s an egg-file issue–either an error, or again, perhaps a resource or resource-allocation issue in the conversion process.
(One of the threads to which I linked above connected the issue that they were having to a pointer or heap problem.)