Buffers: Passing To Window Via Shader

I have 2 cameras, each rendering into an FBO (no main active camera).
Camera 1's buffer has RTPColor, Depth and AUX0-2 textures.
Camera 2's buffer has RTPColor and Depth.

I'm using the RGBA channels in each aux texture very specifically, hence I'm not using the default Panda rendering.

Question:
How do I utilize these buffers directly using a shader (my filters) and show the result in the original window?

FilterManager seems to re-render to the buffers, unless I'm missing something.

I'd like to just have a couple of Cg shaders I can run in order, with the final result used as the window contents, using my original buffers for speed.

You need to render the scene into a buffer instead of the main window, and show the result on an onscreen quad with the postprocessing shader applied. This is what FilterManager does.

Sorry for not being clearer…let me try with some code.

I'm using this to make my relevant camera buffers:

    def make_FBO(self, name, auxrgba):
        # Request auxrgba auxiliary RGBA attachments in addition to
        # color/depth, then make an offscreen buffer that shares the
        # window's GSG.
        self.props.setAuxRgba(auxrgba)
        return base.graphicsEngine.makeOutput(
            base.pipe, name, 0,
            self.props, self.winprops,
            GraphicsPipe.BFRefuseWindow,
            base.win.getGsg(), base.win)

Then attach them to cameras:

        self.first_buffer = self.make_FBO("first buffer",3)
        self.second_buffer = self.make_FBO("second buffer",0)
        
        self.first_cam = base.makeCamera(self.first_buffer)
        self.second_cam = base.makeCamera(self.second_buffer)

I generate my own textures and add them to the buffers above:

        self.first_buffer_tex0 = Texture()
        self.first_buffer_tex1 = Texture()
        self.first_buffer_tex2 = Texture()  
        self.first_buffer_tex3 = Texture()
        self.first_buffer_tex4 = Texture()
        self.second_buffer_tex0 = Texture()
        self.second_buffer_tex1 = Texture()

        self.first_buffer.addRenderTexture(self.first_buffer_tex0, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPColor)
        self.first_buffer.addRenderTexture(self.first_buffer_tex1, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPDepth)
        self.first_buffer.addRenderTexture(self.first_buffer_tex2, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPAuxRgba0)      
        self.first_buffer.addRenderTexture(self.first_buffer_tex3, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPAuxRgba1)
        self.first_buffer.addRenderTexture(self.first_buffer_tex4, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPAuxRgba2)
        self.second_buffer.addRenderTexture(self.second_buffer_tex0, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPColor)
        self.second_buffer.addRenderTexture(self.second_buffer_tex1, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPDepth)

Now I have no default active camera at the moment, and the scene renders to the two camera buffers.

How do I now send these textures to a post-process shader, with the result displayed in the default window and then passed to render2d?

Using FilterManager seems to require dumping the same images again into additional textures… I do not wish to add that step to the process.
Ideally, I'd like to pass my existing textures to the FilterManager, if that's how it needs to be done.

Does that help?
Sorry about the wall of code.

Managed to sort it.

I simply rendered an empty scene with FilterManager and then passed my other buffers' textures to it as shader inputs.

Thanks for the help.