hm… I don't think there is a convenient built-in way to do that. It might be possible to use the MultitexReducer, but I don't know whether it can flatten your 3D texture into a 2D one. If not, you might need to write a small render-to-texture function for it.
As for your second question: I think only Perlin noise is built into Panda. However, there should be many libraries on the internet providing 3D noise functions, or open-source applications containing them (such as Blender, which has both texture baking and 3D noises of all sorts).
I know e.g. Maya has a wizard for baking 3D textures to a file that could be loaded by Panda, which I would imagine is what astelix is getting at with the outside-tool question. Panda isn't per se designed as a modeling/texturing tool, so to my knowledge there's no straightforward way to bake a 3D texture from within Panda. I'd agree with pro-rsoft that your best option would likely be a shader that colors a model based on a fragment's position in space, effectively applying the 3D texture directly rather than baking it in. Failing that, I suppose you could create your own texture buffer and fill it by somehow looping over your geometry and doing lookups into your 3D texture, but that will get crunchy in quite a few ways.
Then again, re-reading your first post, let me just clarify…
You have a 3D (i.e. volumetric, i.e. (u,v,w) lookup) texture that you want applied to something, in this case a sphere.
Your sphere has a (u,v) mapping to look up a 2D texture.
You want to convert those 2D (u,v)s to a 3D texture lookup… somehow.
Is this correct?
If so, I’m not sure there is a direct path to do what you want. You’re going to need the 3D surface position for every (u,v) on your model to do a proper 3D texture lookup / baking operation. You could probably get that sort of thing in a shader to apply the texture directly, but failing that, you’ll need to loop over all your model vertices and do some interpolations on your own.
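To make the "do the interpolations on your own" idea concrete, here's a minimal sketch in plain Python (no Panda3D): for a vertex whose (x,y,z) you've normalized into 0-1 texture space, sample the 3D texture with trilinear interpolation. The function name and the nested-list storage for the volume are my own assumptions for illustration.

```python
# Hypothetical sketch: trilinear lookup into a 3-D texture stored as
# volume[zi][yi][xi], with x, y, z given in the 0-1 range.

def trilinear_lookup(volume, size, x, y, z):
    n = size - 1
    fx, fy, fz = x * n, y * n, z * n
    x0, y0, z0 = int(fx), int(fy), int(fz)
    x1, y1, z1 = min(x0 + 1, n), min(y0 + 1, n), min(z0 + 1, n)
    tx, ty, tz = fx - x0, fy - y0, fz - z0

    def lerp(a, b, t):
        return a + (b - a) * t

    # Interpolate along x, then y, then z.
    c00 = lerp(volume[z0][y0][x0], volume[z0][y0][x1], tx)
    c10 = lerp(volume[z0][y1][x0], volume[z0][y1][x1], tx)
    c01 = lerp(volume[z1][y0][x0], volume[z1][y0][x1], tx)
    c11 = lerp(volume[z1][y1][x0], volume[z1][y1][x1], tx)
    c0 = lerp(c00, c10, ty)
    c1 = lerp(c01, c11, ty)
    return lerp(c0, c1, tz)

# Tiny 2x2x2 volume whose value equals the x index, so a lookup
# halfway along x should give 0.5.
vol = [[[x for x in range(2)] for _ in range(2)] for _ in range(2)]
print(trilinear_lookup(vol, 2, 0.5, 0.0, 0.0))  # 0.5
```

You'd run this once per vertex (or per texel of the target 2D texture, after interpolating positions across each triangle) and write the results into your new texture.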
A clever alternate way to use the 3D texture, however, might be to use an external program to bake a texture that maps a surface point’s (x,y,z) on your model to the (r,g,b) texture channels. Apply that texture to your model in Panda, then run a shader on the model, passing in whatever arbitrary 3D texture you want: the shader grabs a point’s (x,y,z) with one standard lookup, and the proper color with another… but at that point you’re probably better off using more traditional shader techniques to begin with.
No, I meant this:
In the vertex shader, you’d usually write the vertex’s screen position to the “POSITION” output. In this case, however, you would output the desired 2D UV coordinates as the screen position, and as the color, you would sample the 3D texture with your 3D coordinate set.
This shader is applied to a model (with the camera zoomed in such that every pixel in the render belongs to the model) and rendered to a texture. If you copy that texture to RAM, you have a 2D representation of that 3D texture in your UV coordinate space.
This might only work if the UV coordinates are in the 0-1 range.
(This might not work at all - it was just a crazy idea that popped into my mind.)
You could do the same trick without actually employing a shader. You just need a flat mesh whose vertices lie in the (0, 1) range, but whose UVW texture coordinates are the 3-D coordinates of the original model, corresponding to the same unwrapping of the model.
That is, assume you have a 3-D mesh with a 3-D texture applied to it, and you want to convert this 3-D texture to the equivalent 2-D texture.
To do this, you will create a new mesh with the same number of vertices as the original mesh (or just temporarily modify the vertices in your original mesh). For each vertex, set the XY position to the UV coordinate you want that vertex to have in the new 2-D mapping. Set its UVW texture coordinate to the original XYZ position of the model (assuming you are projecting the 3-D texture on using generated texture coordinates; if you already have baked-on UVW coordinates, just use those).
Now you’ve generated a perfectly flat, square mesh, because its XY coordinates range from 0-1 in both directions, just as your UV coordinates will. In fact, the mesh exactly corresponds to the placement of the 3-D texture in 2-D space. So, apply your 3-D texture to the mesh, and because of the UVW coordinates you’ve applied, it will apply in exactly the right way.
Render this mesh in front of the camera, and take a framebuffer capture of the result. Voila, your new 2-D texture.
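The vertex remapping described above is just a swap of roles, which a short plain-Python sketch makes clear (in Panda3D you'd do the equivalent with the vertex-data readers/writers; the sample vertex data below is hypothetical):

```python
# Build the flat mesh: for each vertex, the new XY position is the old
# UV, and the new UVW texture coordinate is the old XYZ position.

original = [
    # (position (x, y, z), uv (u, v))
    ((2.0, 0.0, 1.0), (0.0, 0.0)),
    ((0.0, 3.0, 1.0), (1.0, 0.0)),
    ((2.0, 3.0, 0.0), (0.5, 1.0)),
]

flat_mesh = []
for (x, y, z), (u, v) in original:
    new_position = (u, v, 0.0)   # vertex lies flat in the 0-1 square
    new_texcoord = (x, y, z)     # 3-D texture coordinate = old position
    flat_mesh.append((new_position, new_texcoord))

print(flat_mesh[2])  # ((0.5, 1.0, 0.0), (2.0, 3.0, 0.0))
```

Rendering this flat mesh with the 3-D texture applied, then grabbing the framebuffer, gives the baked 2-D texture.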