Shaders: Rendering to a texture used by the shader?

For an effect that I’m working on, I want to try a self-referential shader: one that, applied to a piece of geometry, renders not only to the usual output but also to a separate texture that is itself an input to the shader.

(Think of a motion-blur shader, blending a semi-transparent version of its previous frame with its current frame in order to produce a faded trail.)

There is one catch: The geometry to which the shader is to be applied isn’t a full-screen quad, but an arbitrary piece of in-game geometry. I specifically do not want this effect to affect the entire screen.

Is this at all feasible? And if so, how might it be done?

You can use a stencil buffer to create a mask.

So… you’re saying that I would make the shader as a full-screen shader, but limit its effect? That’s possible, I suppose, but I’d prefer that it were a normal single-node shader, if feasible…

I assume that you need a GeomNode anyway; you can then apply a shader to it.

As an example, attaching a texture to a Geom looks like this:

from panda3d.core import Geom, GeomNode, RenderState, TextureAttrib

# Assumes `vdata` (a GeomVertexData) and `tex` (a Texture) already exist.
geom = Geom(vdata)
# Create a new render state that applies the texture.
state = RenderState.make(TextureAttrib.make(tex))
# Create the geom node and attach the geom with that state.
geom_node = GeomNode('plane')
geom_node.add_geom(geom, state)

I’m not clear on how that helps; I think that perhaps either you or I misunderstand.

To clarify, I have a 3D model already, placed within the world. I’m applying a shader to it, as with many other models.

The problem is that I want the specific shader that’s being applied to this model to additionally render to a texture, and to have that texture be one of the shader’s inputs, and, well, I don’t know how to do that. Or whether it’s feasible with a model other than a full-screen quad, for that matter.

It turns out that I didn’t understand you.


For complicated reasons relating to how GPUs are designed, you can’t read from the same buffer you are rendering to in the same pass. However, accessing the previous frame is fair game. You would have to “ping-pong” between two textures: one frame you are rendering to texture 1 and reading from texture 2, and the next frame you are rendering to texture 2 and reading from texture 1.
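The bookkeeping for that scheme is simple enough to sketch in plain Python, independent of any engine (the names here are purely illustrative):

```python
class PingPong:
    """Tracks which of two textures is the render target and which is
    readable (holding the previous frame's result) on any given frame."""

    def __init__(self):
        self.frame = 0

    def targets(self):
        write = self.frame % 2   # texture being rendered into this frame
        read = 1 - write         # texture holding the previous frame
        return write, read

    def advance(self):
        self.frame += 1


pp = PingPong()
print(pp.targets())  # frame 0: write into texture 0, read texture 1
pp.advance()
print(pp.targets())  # frame 1: the roles swap
```

Each frame you would render into `targets()[0]`, sample `targets()[1]` in the shader, and call `advance()` at the end of the frame.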


That’s fair.

Okay, I see. Hum… But how does one do that? Does one reset the shader-inputs each frame such that the textures are swapped? (I suppose that one could use an if-statement, but I gather that those are to be avoided.)

And how does one connect the target texture to a shader output…?

Connecting a target texture to a shader output can be done with GraphicsOutput.addRenderTexture().

I don’t know what the most efficient way to set this up is; I think that calling addRenderTexture every frame might be more expensive than using two buffers with the same render-to-texture set-up but different textures, and switching setActive between them every frame.
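An untested sketch of that two-buffer approach, written as a task function; `model`, the buffer list, and the texture list are placeholders for objects you would have created already (e.g. via makeTextureBuffer), and the input name "prev_frame" is just an example:

```python
def make_swap_task(model, buffers, textures):
    """Return a task that, each frame, activates one of two buffers and
    feeds the other buffer's texture (last frame's result) to the shader."""
    state = {"frame": 0}

    def swap(task):
        write = state["frame"] % 2
        read = 1 - write
        buffers[write].setActive(True)    # render into this buffer now
        buffers[read].setActive(False)    # keep last frame's result intact
        model.setShaderInput("prev_frame", textures[read])
        state["frame"] += 1
        return task.cont

    return swap
```

You would add the returned function to the task manager (e.g. `taskMgr.add(make_swap_task(...), "swap-buffers")`) so that it runs once per frame.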

Although, come to think of it, maybe using RTM_copy_texture (as opposed to RTM_bind_or_copy) would also work fine: with this mode, Panda already keeps separate storage for the buffer and the texture, and copies between them at the end of each frame. In theory this should allow you to bind the texture as a shader input, since it isn’t being rendered into directly. I think you might have to experiment to see what works better.
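For the RTM_copy_texture variant, the set-up might look something like the following untested sketch. `base` and `model` are assumed handles (a running ShowBase and the NodePath of your model), the sizes and names are arbitrary, and the import is deferred into the function so the sketch reads the same with or without Panda3D installed:

```python
def setup_feedback_buffer(base, model, size=512):
    """Create an offscreen buffer whose texture is copied (not bound) at
    end of frame, so it can safely double as a shader input."""
    from panda3d.core import Texture, GraphicsOutput

    tex = Texture("feedback")
    buf = base.win.makeTextureBuffer("feedback-buf", size, size, tex)

    # makeTextureBuffer sets up RTM_bind_or_copy by default; replace that
    # with an explicit end-of-frame copy, so the shader samples a texture
    # that is distinct from the buffer being rendered into.
    buf.clearRenderTextures()
    buf.addRenderTexture(tex, GraphicsOutput.RTM_copy_texture,
                         GraphicsOutput.RTP_color)

    base.makeCamera(buf)
    model.setShaderInput("prev_frame", tex)
    return buf, tex
```

Again, I haven’t tested this; treat it as a starting point for experimentation rather than a known-good recipe.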


Hmm, thank you.

I think that for now I’m going to look into other ways to achieve an effect that I’m happy with for the feature toward which this is being done, but if I come back to this approach, then the above should be rather useful, I daresay! 🙂