I have a shader applied to the geometry in my scene that outputs to both COLOR0 and COLOR1, but I don’t seem to be getting the output assigned to COLOR1. I have checked the value being assigned to COLOR1 by instead assigning it to COLOR0; this produced the expected output in the main colour buffer.
What I think may be thwarting my attempts is that I’m not rendering to an offscreen buffer created with custom frame-buffer properties: I’m applying the shader in question to geometry being rendered by Panda’s default window and camera.
I have successfully used COLOR1/AuxRgba before, but in that case the shader in question was applied to an offscreen root-NodePath to which my geometry was parented, and which was rendered by an offscreen camera into a separately-created buffer which had been set up with “AuxRgba” set to “True”.
My best guess at the moment is that the standard window isn’t set up to hold an auxiliary colour buffer. Am I correct in thinking that this is the problem? If so, is there a way to have it be constructed with one, and if that isn’t feasible, am I stuck with rendering to an offscreen buffer?
What would you like the main window to do with the auxrgba value? One pixel can only have one color in the end, so you need a shader that takes color0 and color1 inputs and writes one value out as the final stage.
I want to use the texture thus produced as input to another shader, with the final goal of outputting to a card laid over the screen.
(The primary colour output isn’t required by the shader.)
I do have the thought in mind that I could render to an offscreen buffer, then apply two textures to the final card in two TextureStages: the standard colour output currently being sent to the screen, and the output of the shader that takes in the auxRgba output. This just seemed perhaps a little more straightforward, although I’ll confess that in practice it doesn’t seem to be, thus far!
You have to render the scene into a buffer to get auxiliary bitplanes, and bind a texture to the auxiliary bitplane using buffer.addRenderTexture. This sounds inconvenient, but really, buffers are much more flexible. If you’re doing this kind of thing, chances are you’ll have to render your main scene into a buffer for one reason or another anyway. It’s more efficient, too, since render-to-texture on the main window is implemented as an inefficient copy operation.
FilterManager does all this for you. You just have to call renderSceneInto and pass a texture object as the auxtex parameter.
Aah, fair enough, and thank you for the explanation and guidance; indeed, as I said in my other thread, I think that I should look into the FilterManager class!