I’m not sure whether this is a bug or intended behaviour. It was unexpected to me, so I’m mentioning it here, both in case it is a bug and to inform others who might bump into it.
It seems that, when setting up an off-screen buffer, the number of depth-bits specified in the frame-buffer properties has little or no effect unless the buffer is also made to render depth to a texture.
For example, the code below shows no apparent depth-fighting, I believe, despite specifying only sixteen bits of depth:
frameProperties = FrameBufferProperties()
frameProperties.setMultisamples(multisamples)
frameProperties.setRgbaBits(8, 8, 8, 0)
frameProperties.setDepthBits(16)
windowProperties = base.win.getProperties()
self.mainBuffer = base.graphicsEngine.makeOutput(base.pipe, "Main Buffer", -1,
                                                 frameProperties, windowProperties,
                                                 GraphicsPipe.BFRefuseWindow,
                                                 base.win.getGsg(), base.win)
# (Further initialisation omitted here)
self.sceneTexture = Texture()
self.sceneTexture.setWrapU(Texture.WM_clamp)
self.sceneTexture.setWrapV(Texture.WM_clamp)
self.mainBuffer.addRenderTexture(self.sceneTexture,
                                 GraphicsOutput.RTMBindOrCopy)
self.card.setTexture(self.sceneTexture)
While the following, I believe, does produce visible depth-fighting:
frameProperties = FrameBufferProperties()
frameProperties.setMultisamples(multisamples)
frameProperties.setRgbaBits(8, 8, 8, 0)
frameProperties.setDepthBits(16)
windowProperties = base.win.getProperties()
self.mainBuffer = base.graphicsEngine.makeOutput(base.pipe, "Main Buffer", -1,
                                                 frameProperties, windowProperties,
                                                 GraphicsPipe.BFRefuseWindow,
                                                 base.win.getGsg(), base.win)
# (Further initialisation omitted here)
self.sceneTexture = Texture()
self.sceneTexture.setWrapU(Texture.WM_clamp)
self.sceneTexture.setWrapV(Texture.WM_clamp)
# NB!
self.depthTexture = Texture()
self.depthTexture.setWrapU(Texture.WM_clamp)
self.depthTexture.setWrapV(Texture.WM_clamp)
self.mainBuffer.addRenderTexture(self.sceneTexture,
                                 GraphicsOutput.RTMBindOrCopy)
# NB!
self.mainBuffer.addRenderTexture(self.depthTexture,
                                 GraphicsOutput.RTMBindOrCopy,
                                 GraphicsOutput.RTP_depth)
self.card.setTexture(self.sceneTexture)
Note that the only difference is that the second excerpt adds a second texture and has “self.mainBuffer” render depth to it.
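Incidentally, one way to investigate might be to query the properties of the buffer as actually created, rather than as requested (assuming that I’m right in thinking that “getFbProperties” reports what was really allocated); a small sketch:

```python
def reportDepthBits(fbProps):
    # Given a FrameBufferProperties object, return a note describing
    # the number of depth-bits that it reports.
    return "Depth bits reported: %d" % fbProps.getDepthBits()

# Usage, after the buffer has been created:
#     print(reportDepthBits(self.mainBuffer.getFbProperties()))
```

If the first excerpt reports more than the sixteen bits requested, that would seem to support the idea that the setting is being ignored (or upgraded) when no depth-texture is attached.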
This was observed in Panda3D 1.10 (a home-built version, specifically), running under Ubuntu 16.04 (“Xenial”), I believe.