Texture swapping strategy

Greetings all!

We have a project that involves several character models (created in Maya) that use the same geometry and animations but multiple textures (i.e. same character wearing different clothes). We would ideally like to handle these in the fashion of a ‘palette swap’: We’d load the same .egg file, but specify that the egg should use alternate textures instead of the default ones.

Normally, we’d use egg-optchar to flag the sections of the model with unique textures and then do a tree-traversal texture swap, but we’ve run into a pipeline issue: since we’re using egg-palettize, our texture source file names will change after they run through the palettization. So ideally, we’d like to kill two birds with one stone by somehow encoding all the textures that the model could be using (i.e. all of its palette swaps) into its egg file in a way that the show code could easily select between them. I’m afraid I don’t know of a method to do this encoding using maya2egg, however.

Does anyone have any suggestions as to how to accomplish this palette-swapping goal? I feel like we could do something clever involving multitexturing, where each texture we’d use is in a stage and we then only display one stage at a time. But that sounds like it’s not the right way to do this, since my understanding of texture stages is that they’re meant to reflect the texture-stage features of the graphics card itself. Is there a better option?

Thank you for your time!

Take care,

I’m assuming that your design involves several complete sets of clothing, each collected on its own palette image. For instance, you expect to have an image of “outfit A”, which includes pants A, shirt A, and hat A. You also have an image of “outfit B”, which includes pants B, shirt B, and hat B, and similarly for all of your other outfits.

In order for you to swap outfit B for outfit A by swapping textures, it will be necessary that pants B lies exactly in the same place within the outfit B texture that pants A lies within the outfit A texture. Unfortunately, there’s no way to guarantee this in egg-palettize; it’s just not designed for that sort of thing. (You can do some tricks with group assignment to make it pretty likely that the outfits will match up together. But it’s tough to guarantee.)

One option is not to use egg-palettize, or equivalently, to use it but to “omit” all of the textures you will be swapping out. Then you can reference the textures with their original, individual names. To swap in outfit B, you will individually swap in pants B, shirt B, and hat B.
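For instance, a sketch of that individual swap, assuming a hypothetical naming scheme like pants_A.png / pants_B.png (the names, the .png extension, and NodePath.replaceTexture, which exists in recent Panda3D versions, are all assumptions, not from this thread):

```python
def outfitFilename(baseName, outfit):
    # Map e.g. 'pants_A' -> 'pants_B' for outfit 'B' (hypothetical naming).
    stem, _, _oldOutfit = baseName.rpartition('_')
    return '%s_%s' % (stem, outfit)

def swapOutfit(model, outfit):
    # Replace every texture on the model with its counterpart from the
    # given outfit.  Requires a running ShowBase ('loader' is the global
    # Panda3D loader); replaceTexture is available in recent Panda3D.
    for tex in model.findAllTextures():
        name = tex.getFilename().getBasenameWoExtension()
        newTex = loader.loadTexture(outfitFilename(name, outfit) + '.png')
        model.replaceTexture(tex, newTex)
```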

Of course, then you won’t benefit from the optimization of combining multiple different textures onto a single texture image. If that’s an important benefit to you, you should consider palettizing your textures by hand, for instance by painting them that way before you apply them in Maya.

On the other hand, that assumes that you will always swap out the complete set of clothing at once. If your character might wear shirt A, pants B, and hat C, then you might as well keep the textures as independent files anyway, since there won’t be much performance advantage to combining them in this case.


Using multitexturing is the best way, as far as I can tell. Moreover, you can use multiple UVs too.
Try –[THIS]–
It uses 8 textures with 8 different UV sets, each one planar-projected from a different angle. Runs great on my junkware, which is only capable of 4 stages.
Press right or left arrow to set the next/prev texture.


This is exactly the effect we were looking for. Thank you very much!

Does anyone know what this technique will do on cards that support no texture stages? I doubt we’ll run into many, but in the event that we do I’m hoping the findAllTextures step at least will continue to work. Will Panda load the model completely and be unable to display it properly, or will it drop the other state information on the floor if there aren’t texture stages to support it?

I’m sorry for the oddly-phrased question, but texture stages are very new to me and I’m having difficulty wrapping my brain around what they are and how they work.

Thank you for your help!

Quick update to the above question:

We tested the code on a computer that (according to Dell) doesn’t support texture stages, and it worked as we’d hoped. Excellent!


Don’t ask Dell. Instead, you should check the tech docs of your graphics card. Usually you can find the info listed as “supports (2/4/8/16/whatever) texture stages/layers”. Today’s “average” cards support 8 layers; some older ones like the GeForce4 series can only handle 4, and some very old ones only 2. More modern ones can work with 16, and some fancy DX10 cards are said to handle 128 layers in a single pass.

So 4 or 8 should be OK for most hardware.

Note that ynjh_jo’s code employs the brilliant strategy of relying on multitexture in the original Maya model to associate each palette with its corresponding UV’s. This is brilliant because it means that each image comes along with its own set of UV’s, so egg-palettize won’t scatter things out of correspondence.

Keep in mind that once the model has been converted and loaded, and all of its TextureStages extracted, there is no more need to have multitexture on the model. The multitexture was a modeling convenience, but is not needed for rendering, in this case. That is to say, you could remove all of the TextureStages but the one you actually wanted to apply, and it would then work on any card regardless of its multitexture capabilities.

In fact, the sample code as it stands does almost this: it sets the TextureStage priority to 1 for the texture you want to apply, which ensures that it will be rendered even if the card cannot render any of the other texture stages. The only funny thing is that it doesn’t remove the other texture stages, so cards that are capable of rendering them will, in fact, render them, but you won’t see any of the others because they’re hidden by the top layer. So it looks as if only one texture is rendered. Not a real big deal, though there might be a minor issue of render performance.

Bottom line: this code will run on any graphics hardware, regardless of its texturing capabilities. But it needlessly consumes extra texture stages on graphics cards that have them.
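A minimal sketch of that cleanup, assuming the stages and their matching textures have already been extracted into parallel lists (all of the names here are mine, not from the demo):

```python
def wrapIndex(i, n):
    # Cycle through n stages, wrapping in both directions (next/prev keys).
    return i % n

def applySingleStage(np, stages, textures, index):
    # Clear every multitexture stage except the chosen one, then apply the
    # chosen texture at priority 1 so it wins on any hardware.  (Sketch;
    # 'stages' and 'textures' are parallel lists extracted from the model.)
    keep = stages[wrapIndex(index, len(stages))]
    for stage in stages:
        if stage != keep:
            np.clearTexture(stage)
    np.setTexture(keep, textures[wrapIndex(index, len(textures))], 1)
```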

Note that a TextureStage is just a slot to hold a texture and its associated parameters. You can use one TextureStage without using multitexturing. In fact, you do this now whether you know it or not: even if you do not ever specify a TextureStage, Panda understands you to mean the default TextureStage.

TextureStages are necessary to use multitexturing, but using TextureStages is not the same thing as using multitexturing.


Oh, yeah. You can query:

  base.win.getGsg().getMaxTextureStages()

To find out how many simultaneous TextureStages your card supports. Note that the answer is sometimes different between OpenGL and DirectX9, depending on your drivers.


One more question on this issue (last one, I’m sure) :wink:

I talked to one of my artists about the demo ynjh_jo cooked up, and he was a little surprised. He made me a screenshot to clarify his feelings on the issue, which you can retrieve at the URL below:


On the left is the shader configuration in the flat-plane demo, and on the right is the way he expected multi-layered shaders to work. I’m afraid I don’t know how to advise him on this issue, since the inner workings of maya2egg (and most of the workings of Maya) are a mystery to me. Essentially, the issue is that the method on the left allows for a unique set of UVs per texture, but also requires him to wire up all the UVs; we’re using the same UVs for all of our texture swaps, so he’d like to approach the problem using the method on the right.

Does anyone know why the method on the right doesn’t work? When we tried it, only the “5” texture was output to the egg; the rest seemed to be ignored.

Thank you for all the help!

Ah, nuts, I knew there’d be more.

One last, last question in addition to the above: my artist is asking how ynjh_jo crafted his source model. Do you have a short tutorial on how you laid out the shader that way? He can’t figure out how to generate the uvChooser and myShapePlane connections. How do you create a node to represent the geometry?

(This is more of a “How do you use maya” question, so if you can direct me to a tutorial, that would be great. We’re very, very new to multitexturing over here!) :wink:


If you’re using egg-palettize, you might not actually be using the same UV’s for all textures. Egg-palettize works by changing the UV’s to match where the texture is placed on a palette. If it assembles your textures to different places on a palette, they will end up with different UV’s, which means you will need to start them all out with their own copies of the UV’s.

I’ll admit I don’t quite understand which fundamental problem you’re trying to solve here. There seem to be several different solutions to several slightly different problems.

Your first post referred to the renaming of the textures by egg-palettize, but surely there’s more than that, since egg-palettize does a lot more than simply rename textures. And if it were just the renaming, that’s not an issue, because you can always extract the actual textures directly from loaded models, regardless of their filename.

(For instance, you could use egg-texture-cards to create an egg file that had one instance of each texture, and then run that file through egg-palettize; then you could load up that model and use model.find('**/origTexName').findTexture('*') to extract each individual texture, regardless of its filename. This is the preferred way to load textures anyway, since it preserves settings like mipmap settings.)
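A sketch of that extraction step, assuming the texture-card model has already been loaded and you know the original texture names (the helper name is mine):

```python
def extractTextures(cardModel, origNames):
    # Pull each original texture back out of a palettized texture-card
    # model, keyed by its original name, regardless of what egg-palettize
    # renamed the underlying image files to.
    textures = {}
    for name in origNames:
        card = cardModel.find('**/' + name)
        if not card.isEmpty():
            textures[name] = card.findTexture('*')
    return textures
```

The returned dictionary can then serve as the lookup table for swaps at runtime, with the palettized filenames never appearing in show code.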

But you’d still need to deal with the UV adjustment that egg-palettize performs. ynjh’s approach is a clever way to solve this, by having a different set of UV’s for each texture. There are other possible solutions also, but none quite as elegant.


Sorry, I missed that. That curiosity-cure code was cooked up in only a few minutes that morning before I left my PC :laughing: . So, thanks for pointing it out. However, there is only 1 stage on the model in the first place (using self.myPlane.findAllTextureStages()). Yes, it starts piling up upon texture switch, so this is the correct one:

  def throwLastTex(self):

  def setNewTex(self):
      print self.myPlane.findAllTextureStages()

Yes, indeed. I tried to use 1 UV, and only 1 layer was exported.

The manual says it all. Search for UV linking. You can open the Relationship Editor to link UVs to textures. Right click on the model, select UV sets > UV linking.
All I did was just texture mapping + creating new UV sets, and UV-texture linking.
You don’t have to build the exact graph yourself to achieve the same result. The key is to use an individual UV set for each texture; you should copy the default UV into several new ones if you want uniform UVs.

I used individual UVs in the first place to keep the flexibility of UV layout for each texture, which allows a different layout/repetition/offset for each … say cloth/fabric.

Well, you have the manual and the MEL reference. I’d suggest learning more about MEL to grab total control over Maya. Make sure to melt with all weapons you have.
I guess you haven’t seen –[THIS]– yet. It’s the most useful MEL script I’ve created so far.

Excellent; thank you for all the help both on the Maya front and the Panda front.

David, thank you especially for pointing out the UV remapping issue; I hadn’t thought of that at all. We currently don’t have palettize set to build palettes; we’re using it primarily to drop our art assets from the original-source resolution (1024x1024) to a more graphics-card-friendly 256x256. We will start building palettes soon, however; I’ll keep that issue in mind when we reach that step.

The particular goal we are trying to accomplish is leveraging the same geometry to create multiple different objects in the scene by changing the texture. The reason I mentioned filenames is that we had previously solved this problem by changing suffixes on texture names: we kept all the textures in one directory, and when a model was loaded we searched it for textured nodes and changed the model’s appearance by reading the texture name, appending a new suffix (like “_redteam” or “_blueteam”), loading a texture with the new name, and replacing the texture. But previously, we hadn’t been using palettize; the palettization process of course renders this solution non-viable because the textures are moved and their names are changed (this is less of a problem when palettize isn’t being used to build palettes, but it still changes all my file types to .rgb and relocates them to another folder). The egg-texture-cards solution would definitely overcome that problem.
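For reference, that suffix-swapping step can be sketched as a pure name-rewriting helper (the suffix list and the function name here are illustrative, not our actual code):

```python
# Example team suffixes, as described above (hypothetical list).
TEAM_SUFFIXES = ('_redteam', '_blueteam')

def reteam(texName, newSuffix):
    # Strip any known team suffix from the texture name, then append the
    # new one; e.g. 'soldier_redteam' -> 'soldier_blueteam'.
    for s in TEAM_SUFFIXES:
        if texName.endswith(s):
            texName = texName[:-len(s)]
    return texName + newSuffix
```

The resulting name would then be passed to loader.loadTexture() and applied with setTexture() on the textured node, as described above.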

At the end of the day, the fundamental issue is that our artists are comfortable with Maya and not much else; they are supremely uncomfortable with the command-line tools since other game engines they are familiar with incorporate some flavor of GUI-based compositing tools (either in-engine or via Maya plugins) to do all the “last-mile” engine import steps. We are therefore in the process of developing various MEL scripts and other components to give the artists full control over the import cycle, freeing up the development team to add features and new engine behaviors. This is, to my understanding, “the way it is done;” it’s just somewhat tricky since I have no experience with Maya (making MEL-script writing a very slow crawl).

Sounds like time to bust out the books and educate myself. :slight_smile: If I come up with anything useful, I’ll let you know!


After playing around with it further, I found that color scale doesn’t work on the plane. It works in my other scene with 2 UVs & 2 stages (lightmapped).
This is my new code:

import direct.directbase.DirectStart
from direct.showbase.DirectObject import DirectObject
from direct.task import Task
import sys, random

class World(DirectObject):
  def __init__(self):



      for i in range(allTS.getNumTextureStages()):
      for i in range(allTex.getNumTextures()):




#      print self.myPlane.findAllTextureStages()

#       print self.stages
#       print self.textures

      for x in range(-10,11):
          for y in range(-10,11):

      taskMgr.add(self.gLoop, 'gLoop')

  def gLoop(self,task):
      for p in self.planes:

  def prevTS(self):

  def nextTS(self):

  def setNewTex(self):
      print self.myPlane.findAllTextureStages()


Any clue?

Are you still using the “decal” mode to apply the texture to your plane? That specifically hides any vertex color, including any result of a color scale.


So that’s the problem. In Maya, all layers use the “over” blend mode, which comes through as “decal” in Panda. Setting it to “none” comes through as the default “modulate”, which fixes it.
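For anyone who would rather fix this on the Panda side after loading instead of in Maya, a sketch (the helper name is made up; panda3d.core is the modern module path, while older code imported from pandac.PandaModules):

```python
def forceModulate(np):
    # Switch any decal-mode stages (from Maya's "over" blend) back to
    # modulate, so vertex color and color scale show through again.
    from panda3d.core import TextureStage
    for stage in np.findAllTextureStages():
        if stage.getMode() == TextureStage.MDecal:
            stage.setMode(TextureStage.MModulate)
```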

Hey all!

Quick heads-up: We tried the algorithm and it works great. Thanks to everyone for their help!

I wanted to double-check one aspect of the algorithm we’ve devised here. We’re trying to be as efficient with texture memory as we can; I’d like to avoid retaining textures in the graphics card memory if they aren’t being rendered, and we know these are mutually exclusive (so the best case should be that only one of them is retained in graphics memory at a time). So if we load the model, clearTexture() on the stages we aren’t displaying, and setTexture on the one stage we use, are the other textures still retained in graphics memory or will Panda automatically purge them? If they aren’t auto-purged, how can I force them to be evicted?


If you are using DirectX9, Panda will generally try to keep textures in texture memory until you need that memory for something else, at which point it will evict the oldest texture to make room for it.

If you are using OpenGL, the OpenGL interface will do this sort of least-recently-used algorithm automatically, and we don’t have any control over it.

But usually that’s good enough. It’s rare that you would need to early-evict a texture; why evict a texture from texture memory until you actually need the space for something else?

If you really do want to force-evict a texture, you can do texture.release(base.win.getGsg()). That will force Panda to release all graphics handles to the texture, and (probably) evict it from texture memory (though in the OpenGL case, nothing is guaranteed).
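To round this out, a minimal sketch of evicting everything but the active texture, following the release() call described above (the function and parameter names are mine, not from the thread):

```python
def releaseUnused(model, keepTex, gsg):
    # Force-release graphics handles for every texture on the model except
    # the one currently in use.  Under OpenGL actual eviction from texture
    # memory isn't guaranteed, as noted above.
    for tex in model.findAllTextures():
        if tex != keepTex:
            tex.release(gsg)
```

Typically this would be called with base.win.getGsg() as the gsg argument, right after clearing the unused stages.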