shader and setTexScale


I have created a (for now) simple shader that just shows a texture. The texture is applied with node.setTexture, by the way.
It works, everything is fine.

except when I use setTexScale:

node.setTexScale(ts, size[0]/self.tex_size[0], size[1]/self.tex_size[1])

How can I get this scale in the shader?
I'd prefer not to use setShaderInput.
Is there a way to get the very information Panda itself uses/creates? I don't see anything like that in the list of possible shader inputs: … der_Inputs
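For concreteness, here is the kind of scale that call produces. The sizes below are made-up example numbers, not values from my actual code:

```python
# Hypothetical numbers: a 200x100 region of a 256x128 texture
size = (200, 100)
tex_size = (256, 128)

# the per-axis scale passed to setTexScale
scale_u = size[0] / tex_size[0]
scale_v = size[1] / tex_size[1]
print(scale_u, scale_v)  # 0.78125 0.78125
```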

I believe using a shader input is the only way to feed the texture scale to the shader.
If you're too lazy (who isn't?) to call setShaderInput() after every setTexScale(), just insert the setShaderInput call inside setTexScale itself.

Like this :

from panda3d.core import NodePath, TextureStage

# save the original setTexScale function, so it can be called later
origSetTexScale = NodePath.setTexScale

def setTexScale(self, *args):
    # the texture stage isn't always the 1st argument, it may be a NodePath
    ts = [k for k in args if isinstance(k, TextureStage)][0]
    # the scale components follow the texture stage in the argument list
    scale = args[args.index(ts) + 1:]
    numComps = len(scale)
    # pad the scale with zeros to a 4-component vector for the shader
    v4 = [scale[i] for i in range(numComps)] + [0] * (4 - numComps)
    self.setShaderInput('texscale', *v4)
    print(v4)
    print('shader input set')
    # apply the real scale through the saved original function
    origSetTexScale(self, *args)

# push the new setTexScale above to the NodePath class
NodePath.setTexScale = setTexScale
del setTexScale

# test
ts = TextureStage('new stage')
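The zero-padding step in the snippet above can be pulled out as a plain helper. The name pad_to_vec4 is mine, for illustration only, not Panda3D API:

```python
def pad_to_vec4(scale):
    """Pad an N-component texture scale with zeros to 4 components."""
    comps = list(scale)
    return comps + [0.0] * (4 - len(comps))

print(pad_to_vec4([0.5, 0.25]))  # [0.5, 0.25, 0.0, 0.0]
```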

but I think I will find another way to do what I want.
setTexScale was a useful solution since it is built into Panda3D, but I won't start hacking into it just for a small graphical detail.
The main problem would be rewriting all the shaders for this new input, and sending it even when I don't use a scale, …
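To illustrate what that rewrite means per shader, here is a minimal Cg sketch, assuming the scale was registered with setShaderInput('texscale', ...) so it arrives as k_texscale under Panda's Cg input naming:

```cg
//Cg
void vshader(float4 vtx_position : POSITION,
             float2 vtx_texcoord0 : TEXCOORD0,
             uniform float4x4 mat_modelproj,
             uniform float4 k_texscale,  // fed by setShaderInput('texscale', ...)
             out float4 l_position : POSITION,
             out float2 l_texcoord0 : TEXCOORD0)
{
    l_position = mul(mat_modelproj, vtx_position);
    // apply the scale that setTexScale would normally apply
    l_texcoord0 = vtx_texcoord0 * k_texscale.xy;
}

void fshader(float2 l_texcoord0 : TEXCOORD0,
             uniform sampler2D tex_0,
             out float4 o_color : COLOR)
{
    o_color = tex2D(tex_0, l_texcoord0);
}
```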

If you don't want to rewrite the shader to scale the UVs, you just need to bake the scaled UVs into the vertices by calling flattenLight().