Combine Deferred Rendering & Forward Rendering

Hello, I am implementing my own deferred rendering.
I'd like to use deferred shading to render opaque objects, then use forward rendering to render transparent objects.
In the forward rendering stage, I want the forward rendering buffer to have exactly the same depth as the G-Buffer before forward rendering begins, so that transparent objects won't cover the opaque objects.
There are many stages between the G-Buffer and the forward rendering buffer that change the depth, so simply disabling depth write doesn't seem to be the solution.

In OpenGL, I can use glBlitFramebuffer to copy the depth buffer from the G-Buffer to the forward rendering buffer, like this:

glBindFramebuffer(GL_READ_FRAMEBUFFER, gBuffer);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, forwardRenderringBuffer);
glBlitFramebuffer(0, 0, Win_Size_X, Win_Size_Y, 0, 0, Win_Size_X, Win_Size_Y, GL_DEPTH_BUFFER_BIT, GL_NEAREST);

I'd like to know: is there a similar approach in Panda? :unamused:

Assuming you render your deferred pass before your forward pass, you could simply do this by writing the depth buffer values to gl_FragDepth in your deferred shader, since you’re (presumably) sampling the scene depth buffer anyway.
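To illustrate the gl_FragDepth route, the fragment shader of the deferred lighting pass might end with something like this. This is only a sketch: the sampler name depthTex and the variable names are placeholders, not from the post.

```python
# A sketch of the deferred lighting fragment shader as a GLSL string;
# the sampler name "depthTex" is a placeholder, not from the post.
frag = """
#version 130
uniform sampler2D depthTex;  // the G-Buffer depth texture
in vec2 texcoord;
out vec4 fragColor;

void main() {
    // ... the usual deferred lighting work producing fragColor ...
    fragColor = vec4(1.0);  // placeholder for the lit result
    // Re-emit the original scene depth so the later forward pass
    // depth-tests correctly against the deferred geometry.
    gl_FragDepth = texture(depthTex, texcoord).r;
}
"""
# Load it in Panda with e.g.:
#   shader = Shader.make(Shader.SL_GLSL, vertex=vertSource, fragment=frag)
```

This works because the lighting pass writes to every pixel of the full-screen quad, so every pixel of the forward buffer's depth attachment ends up holding the scene depth.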

Alternatively (and this will only work if your deferredPost/forward pass is also written to an offscreen buffer rather than to the window), you could use the shareDepthBuffer function on the buffers to make them share the same depth buffer.
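A minimal sketch of the shareDepthBuffer approach, assuming both passes render into offscreen GraphicsBuffer objects created earlier (the variable names are placeholders):

```python
# forwardBuffer and gBuffer are assumed to be offscreen GraphicsBuffer
# objects created earlier (e.g. via GraphicsEngine.makeOutput).
# After this call, the forward pass depth-tests against the depth values
# written by the deferred geometry pass.
forwardBuffer.shareDepthBuffer(gBuffer)

# If you later need to detach them again:
# forwardBuffer.unshareDepthBuffer()
```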

Finally, to do a copy operation (as you suggested in your post), you could bind a depth texture to your forward pass and pass it to addRenderTexture of your deferredScene buffer with the RTMCopyTexture flag. This will cause Panda to call glCopyTexImage2D after rendering the scene pass to copy the depth buffer to the texture you use in your second pass. I doubt this will be more efficient than the other approaches, though.
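As a rough sketch of that copy setup, with both buffer names being assumptions rather than names from the post:

```python
from panda3d.core import Texture, GraphicsOutput

# Both buffer names below are assumptions, not from the post.
depthTex = Texture("scene-depth")

# Use depthTex as the forward buffer's depth attachment...
forwardBuffer.addRenderTexture(depthTex,
    GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPDepth)

# ...and have Panda copy the deferred scene pass's depth into it after
# that pass renders (this is the glCopyTexImage2D copy described above):
deferredSceneBuffer.addRenderTexture(depthTex,
    GraphicsOutput.RTMCopyTexture, GraphicsOutput.RTPDepth)
```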

Thanks for the reply rdb! But I've run into a new problem.

I tried shareDepthBuffer like this


to make my forward rendering buffer, self.buffers['skybox'], share the same depth buffer as the G-Buffer.

I render my deferred light pass result into a Texture, self.bufferTex['Light']:

    GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPColor)

and I am trying to render a skybox on it.

I made a cube with reversed faces as my skybox, reparented it, and loaded a cubemap as the sky texture to sample. In the shader, I set the cube's depth to 1.0 and set the depth test mode to RenderAttrib.MLessEqual. It works well in my small test program.
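For reference, the depth-test setup described above can be sketched in Panda3D like this. The node name is a placeholder, and the setDepthWrite(False) line is a common extra precaution, not something stated in the post:

```python
from panda3d.core import DepthTestAttrib, RenderAttrib

# 'skybox' is a placeholder NodePath for the inverted cube.
# With the shader writing depth = 1.0, MLessEqual lets the sky pass the
# depth test exactly where no scene geometry has been drawn.
skybox.setAttrib(DepthTestAttrib.make(RenderAttrib.MLessEqual))
# Commonly you also disable depth write so the sky never occludes
# anything drawn afterwards (an assumption, not from the post):
skybox.setDepthWrite(False)
```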

However, the whole scene disappears when I try to add this to my deferred rendering pipeline; only the skybox remains.

I'd like to know: given a buffer texture A and its depth, how can I render the sky into a new buffer texture B on top of A? A contains the whole scene with lights, but no sky. :unamused: