In short, if a model has no texture applied, is the texture-matrix supplied to shaders that expect it?
I have a shader that takes two textures as inputs, but not via the built-in texture-system (e.g. as a result of calls to "setTexture" or inclusion in a model-file); they're applied simply as shader-inputs.
I now want to be able to use the texture-matrix with this shader, but for some reason it doesn't seem to be working, whereas it does seem to work in other shaders. A quick look at those other shaders, however, indicates that at least the few that I examined do in fact take at least one texture in via the built-in texture-system.
So I’m wondering whether perhaps the texture-matrix is simply not provided, or is forced to the identity matrix, in the case in which no texture is applied to the model in question.
Is this so? Or might I be doing something else incorrectly…?
Note, by the way, that the change that I’m looking for from the texture-matrix right now is a scaling-factor, which I’m attempting to apply with a call along the following lines:
model.setTexScale(TextureStage.getDefault(), uVal, vVal)
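For reference, the effect that I'm expecting from that call is just a diagonal scale applied to the UVs by the texture-matrix. Here is a minimal pure-Python sketch of that math (this only emulates what the matrix would do to a UV pair; in an actual shader the matrix would arrive as a uniform, e.g. Panda3D's `p3d_TextureMatrix` in GLSL):

```python
# Sketch: what a texture-matrix scale does to a (u, v) pair.
# This is plain Python emulating the shader-side math, not Panda3D API.

def tex_scale_matrix(u_scale, v_scale):
    """A 3x3 homogeneous 2D matrix that scales UV coordinates."""
    return [
        [u_scale, 0.0, 0.0],
        [0.0, v_scale, 0.0],
        [0.0, 0.0, 1.0],
    ]

def transform_uv(mat, uv):
    """Apply a 3x3 matrix to a (u, v) pair in homogeneous form."""
    u, v = uv
    vec = (u, v, 1.0)
    out = [sum(m * c for m, c in zip(row, vec)) for row in mat]
    return (out[0] / out[2], out[1] / out[2])

mat = tex_scale_matrix(2.0, 0.5)
print(transform_uv(mat, (0.5, 0.5)))  # (1.0, 0.25)
```
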
Now, of course I could rework the shader in question to use the built-in texture-system, but this shader is already doing a lot of work elsewhere, and I'm hesitant to change too much.
(My other option, of course, is to apply the effect of the texture-matrix some other way, using a new shader-input. But that likewise might incur changes to other things that use the shader.)
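If I do end up going the shader-input route, I believe it should be equivalent for a pure scale: passing the scale pair in directly (in Panda3D, something along the lines of `model.setShaderInput("uvScale", (uVal, vVal))`, where "uvScale" is just a hypothetical input name) and multiplying component-wise in the shader should match what the diagonal texture-matrix would have produced. A quick pure-Python sanity check of that equivalence:

```python
# Sketch: a component-wise multiply by a custom "uvScale" shader-input
# should match a diagonal texture-matrix for a pure scale.
# Plain Python emulation of the shader-side math, not Panda3D API.

def scale_via_matrix(uv, u_scale, v_scale):
    """UV transformed by a diagonal texture-matrix [[u, 0], [0, v]]."""
    return (uv[0] * u_scale, uv[1] * v_scale)

def scale_via_input(uv, uv_scale):
    """UV multiplied component-wise by a custom uvScale shader-input."""
    return (uv[0] * uv_scale[0], uv[1] * uv_scale[1])

print(scale_via_matrix((0.25, 0.75), 2.0, 0.5))  # (0.5, 0.375)
print(scale_via_input((0.25, 0.75), (2.0, 0.5)))  # (0.5, 0.375)
```
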