Sparse mesh persistence

What would be the best mechanism for storing expandable virtual geometry where there may be many gigabytes of geometry in total, but locally (i.e. in an active runtime while rendering) only smaller portions are loaded as needed (when coming into view, or close to coming into view)?
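To illustrate the kind of streaming I mean, here is a rough sketch in Panda3D, assuming the geometry were pre-split into per-cell .bam files (the chunk naming scheme and the constants are made up):

```python
# A minimal sketch of the streaming pattern I have in mind: the world is
# pre-split into per-cell .bam files (chunk_<x>_<y>.bam naming and the
# CHUNK_SIZE / LOAD_RADIUS values are hypothetical).
from direct.showbase.ShowBase import ShowBase
from direct.task import Task

CHUNK_SIZE = 256.0   # world units per chunk
LOAD_RADIUS = 2      # load chunks within this many cells of the camera

class ChunkStreamer(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        self.loaded = {}  # (cx, cy) -> NodePath
        self.taskMgr.add(self.update_chunks, "update-chunks")

    def update_chunks(self, task):
        cam = self.camera.getPos(self.render)
        ccx = int(cam.x // CHUNK_SIZE)
        ccy = int(cam.y // CHUNK_SIZE)
        wanted = {(ccx + dx, ccy + dy)
                  for dx in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
                  for dy in range(-LOAD_RADIUS, LOAD_RADIUS + 1)}
        # Load chunks that just came into range.
        for cell in wanted - self.loaded.keys():
            np = self.loader.loadModel("chunks/chunk_%d_%d.bam" % cell)
            np.setPos(cell[0] * CHUNK_SIZE, cell[1] * CHUNK_SIZE, 0)
            np.reparentTo(self.render)
            self.loaded[cell] = np
        # Unload chunks that fell out of range.
        for cell in list(self.loaded.keys() - wanted):
            self.loaded.pop(cell).removeNode()
        return Task.cont
```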

Are there any open-source (GNU GPL, BSD, or otherwise) implementations of this currently available that would work with Python+Panda3D?

The use case is a space game/RPG hybrid where you can travel fleets in space amongst planets (or orbit them), but where it is also possible to land on, explore, and adventure on individual planets.

One very ugly way to do this would be text processing capable of modifying a single large EGG file in place.
I've previously had issues with not being able to reset textures and their related parameters when updating dynamically created meshes in Panda at runtime.

An unrelated topic is whether the gravity in the physics system can be set to point at a location (the center of mass of a planet). From everything I've seen, physics engines usually expect the world to be constructed on a flat grid, so gravity always ends up being a straight-down vector. If I go with 2D-grid-based terrain, has any brave soul come up with some sort of transformation to wrap the terrain onto a sphere so it appears as a planet (or to treat the 2D space as a torus by taking X modulo W and Y modulo H, then somehow map the torus onto a sphere)?
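To make the modulus idea concrete, this is the kind of wrapping I mean (W, H, and the function name are arbitrary here):

```python
# A quick sketch of the modulus wrapping: positions on a W x H 2D grid
# wrap around both axes, so the terrain behaves like a torus.
W, H = 1024.0, 1024.0

def wrap_to_torus(x, y):
    """Wrap 2D coordinates so walking off one edge re-enters the opposite one."""
    return x % W, y % H

# Walking past the right edge comes back in on the left:
print(wrap_to_torus(1030.0, -5.0))  # -> (6.0, 1019.0)
```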

I think the best approach would be to not store any data on the HDD at all, but to generate it procedurally as needed. There's plenty of material on that topic (quad trees are useful here). If you don't want it to throw up a different planet every time you re-open the game, just store the seeds used to generate them.
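For example, a stored per-planet seed is enough to regenerate the same terrain every time (a minimal sketch; the height function here is just a stand-in for a real noise/fractal generator):

```python
# Minimal sketch: derive every planet's terrain from a stored seed so the
# same planet comes back identical on every run. The height function is a
# trivial stand-in for a real noise/fractal generator.
import random

def planet_heightmap(seed, size=64):
    rng = random.Random(seed)  # deterministic per-planet generator
    return [[rng.uniform(0.0, 1.0) for _ in range(size)]
            for _ in range(size)]

# Only the seed needs to be saved; the geometry is regenerated on demand.
earthlike = planet_heightmap(seed=42)
assert planet_heightmap(seed=42) == earthlike  # reproducible
```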

The main problem I see with doing it this way is that the view of the planet from orbit would not match the landscape when landed on. A planet with the landscape of Earth when landed on may well look like Mars from orbit.

I'd like the planet's view to resemble its landscape, however. As my original post indicates, each planet has terrain that may be explored.

If I were to bake the terrain onto a (square) texture, is there a way to map this texture onto a normalized (bounded to [-1, 1] on all axes) icosphere in Blender, then display planets by swapping in the appropriate texture image (using each planet's baked landscape texture as appropriate)? I think the texture animation (descending elevator) example might contain what I need as far as swapping in the appropriate texture: I'd load a standard planet icosphere model, scale it to the planet's size, then change the texture's source image to the appropriate baked planet landscape image.

I fail to see the problem here… When moving the camera further from the planet, the surface detail will eventually fade out and the planet will become nothing more than a textured sphere (which also uses fewer and fewer polygons the further away you are). This quality transition should be pretty much seamless. Here is an example of that: youtube.com/watch?v=7s4G1J9Hiwk
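In Panda3D, that distance-based transition can be set up with a LODNode; here is a minimal sketch, assuming three pre-made sphere models of decreasing detail (the file names and distances are made up):

```python
# Minimal sketch of a distance-based LOD switch in Panda3D, assuming three
# pre-made sphere models of decreasing polygon count (file names made up).
from direct.showbase.ShowBase import ShowBase
from panda3d.core import LODNode, NodePath

base = ShowBase()
lod = LODNode("planet-lod")
lod_np = NodePath(lod)
lod_np.reparentTo(base.render)

# addSwitch(far, near): each child (in the order added) is visible while
# the camera distance lies between those two bounds.
for model, far, near in [("planet_high.bam", 100, 0),
                         ("planet_med.bam", 1000, 100),
                         ("planet_low.bam", 100000, 1000)]:
    child = base.loader.loadModel(model)
    child.reparentTo(lod_np)
    lod.addSwitch(far, near)

base.run()
```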

You might want to read this:
gamasutra.com/view/feature/3 … verse_.php
gamasutra.com/view/feature/2 … verse_.php
gamasutra.com/view/feature/1 … verse_.php

I guess Blender handles the UV mapping, so swapping the textures afterwards shouldn't be anything harder than a simple .setTexture() call.
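Something like this, assuming the icosphere is exported with its UV layout intact (the model and texture paths are made up):

```python
# Minimal sketch of swapping a baked planet texture onto a shared icosphere
# model in Panda3D; the paths are hypothetical, and the 1 in setTexture()
# is an override priority so the baked map replaces the exported texture.
from direct.showbase.ShowBase import ShowBase

base = ShowBase()
planet = base.loader.loadModel("models/icosphere.egg")
planet.setScale(6371.0)  # planet radius in your world units
tex = base.loader.loadTexture("textures/earthlike_baked.png")
planet.setTexture(tex, 1)  # override so the baked map wins
planet.reparentTo(base.render)
base.run()
```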

As for the gravity: there is no built-in gravity in the physics system. All you have is a LinearVectorForce… if you set it to (0, 0, -1), it will always pull downwards, but if you compute the difference between an object's position and the center of the planet and use that as the force vector (after scaling it appropriately, of course), the object will be pulled toward the center of the planet. Easy.
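A rough sketch of that per-frame update with the built-in physics system (the node setup is the standard ActorNode boilerplate; the names and the gravity constant are arbitrary):

```python
# Minimal sketch of point gravity with Panda3D's built-in physics: every
# frame, re-aim a LinearVectorForce from the object at the planet's center.
from direct.showbase.ShowBase import ShowBase
from direct.task import Task
from panda3d.physics import ActorNode, ForceNode, LinearVectorForce

GRAVITY = 9.8  # arbitrary strength

base = ShowBase()
base.enableParticles()  # starts the physics manager

# A physics-enabled object somewhere above the planet.
actor = ActorNode("ship-physics")
ship = base.render.attachNewNode(actor)
ship.setPos(0, 0, 100)
base.physicsMgr.attachPhysicalNode(actor)

# The force we will re-aim every frame.
force_node = ForceNode("point-gravity")
base.render.attachNewNode(force_node)
gravity = LinearVectorForce(0, 0, 0)
force_node.addForce(gravity)
actor.getPhysical(0).addLinearForce(gravity)

planet_center = base.render.getPos()  # assume the planet sits at the origin

def aim_gravity(task):
    direction = planet_center - ship.getPos(base.render)
    direction.normalize()
    gravity.setVector(direction * GRAVITY)  # pull toward the center
    return Task.cont

base.taskMgr.add(aim_gravity, "aim-gravity")
base.run()
```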