r/w access to depth buffer

Hi everyone :slight_smile:
I wanted to know how to:

  1. set the distance of the near and far clipping planes on a camera’s frustum
  2. set the range of the depth values (e.g. from 0.0f to 1.0f) in the depth buffer
  3. have direct read/write access to the depth buffer

I’m reading the docs but I haven’t found a direct answer to my question…

Thx :slight_smile:

EDIT: for the near/far planes, I think I found it :slight_smile: For the depth buffer, the only hint is to use the Render-To-Texture low-level API, but the only references are samples written in Python. This is only partially helpful…

The range of values in the depth buffer is always 0.0 to 1.0 (after the transform from the frustum, which includes the near/far planes). You can’t change it without writing a custom shader; this behavior is built into your graphics driver.
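To illustrate that fixed 0.0–1.0 range, here is a small sketch (plain Python, not Panda3D code) of the standard OpenGL perspective mapping from an eye-space distance to window-space depth, with the default depth range:

```python
def window_depth(eye_dist, near, far):
    """Map an eye-space distance (between near and far, both > 0) to the
    non-linear 0.0..1.0 window-space depth stored in the depth buffer
    (standard OpenGL perspective projection, default depth range)."""
    z_ndc = (far + near) / (far - near) - (2.0 * far * near) / ((far - near) * eye_dist)
    return 0.5 * z_ndc + 0.5

# The mapping is non-linear: most of the precision sits near the near plane.
print(window_depth(1.0, 1.0, 100.0))    # ~0.0 at the near plane
print(window_depth(100.0, 1.0, 100.0))  # ~1.0 at the far plane
print(window_depth(2.0, 1.0, 100.0))    # already past 0.5 at distance 2
```

Whatever you put in your frustum, the values that land in the buffer always end up squeezed into that 0.0–1.0 interval.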

If you want to extract the depth buffer, you do indeed need to use the render-to-texture effect. This is not quite the same thing as direct read/write access to the depth buffer, because you are making a copy of the depth buffer on the CPU (and writes aren’t propagated back). True direct read/write access is not actually possible without a shader, because the depth buffer is stored in graphics memory which is not directly accessible to the CPU.
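For reference, a minimal Python sketch of that extraction (untested here, and it assumes the usual ShowBase globals, i.e. `base.win` is your main window; the equivalent C++ call is `GraphicsOutput::add_render_texture`):

```python
from panda3d.core import Texture, GraphicsOutput

# Ask the window to copy its depth plane into a texture each frame,
# and transfer that copy down to system RAM so the CPU can read it.
depth_tex = Texture("depth")
base.win.addRenderTexture(depth_tex,
                          GraphicsOutput.RTMCopyRam,  # copy into CPU-readable RAM
                          GraphicsOutput.RTPDepth)    # the depth plane, not color

# After a frame has rendered, the raw pixel data can be inspected with
# depth_tex.getRamImage() -- but note this is a copy; writing to it does
# not change the real depth buffer on the card.
```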

Render-to-texture is fairly straightforward, even in C++. There might be some C++ examples in the forum as well. Do you have a specific question about the process?


Thank you David :slight_smile:

Sorry, I used the wrong words: what I meant by “direct access” was access to the depth buffer through some sort of pointer, not through some “readDepthXY” kind of function. So I can read the buffer using render-to-texture, but is there a way to propagate the writes back to the real buffer?

I’ll explain my needs in a better way: I’m working on an engine for adventure games that uses pre-rendered backgrounds together with real 3D characters, with correct occlusion. This means that in each location I load the background as an image, but to keep the occlusion correct against the real 3D characters, I need a way to fill the entire depth buffer from previously stored data, and to disable depth write for my characters. So obviously I need some sort of read/write (mostly write) access to the depth buffer.
Something like A Vampyre Story, or Syberia, or early Squaresoft games (Parasite Eve 2 or FF8, for example).
I don’t know if I’ve figured out the right technique for this, but it seems reasonable to me.


Ah, you want to save and restore the depth buffer.

The way this has traditionally been done in the fixed-function pipeline is via glDrawPixels() with GL_DEPTH_COMPONENT. Unfortunately, the current Panda3D API doesn’t provide a way to call this function. You could do it with an OpenGL callback, I suppose, but that would limit you to OpenGL.

Or, if you’re willing to write a simple shader, you can do it quite easily with your own pixel shader. Just apply your depth texture to a fullscreen quad, and render the quad with a shader that copies the texture values to the depth buffer, ignoring the color buffer.
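For example, a minimal Cg shader along those lines (a sketch, not tested code: it assumes the saved depth texture is applied to the fullscreen card as `tex_0`, and uses Panda3D’s Cg input/output naming conventions):

```
void vshader(float4 vtx_position : POSITION,
             float2 vtx_texcoord0 : TEXCOORD0,
             uniform float4x4 mat_modelproj,
             out float4 l_position : POSITION,
             out float2 l_texcoord0 : TEXCOORD0)
{
    l_position = mul(mat_modelproj, vtx_position);
    l_texcoord0 = vtx_texcoord0;
}

void fshader(float2 l_texcoord0 : TEXCOORD0,
             uniform sampler2D tex_0 : TEXUNIT0,
             out float4 o_color : COLOR,
             out float o_depth : DEPTH)
{
    // Copy the stored depth value straight into the depth buffer...
    o_depth = tex2D(tex_0, l_texcoord0).r;
    // ...and leave the color buffer alone (e.g. render this quad with
    // color write masked off, or let the background card provide color).
    o_color = float4(0.0, 0.0, 0.0, 0.0);
}
```

Render this quad first each frame, then draw your characters with normal depth testing so they are occluded by the pre-rendered scenery.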


Interesting… Writing a custom pixel shader, I could probably store the depth value in the alpha channel of the texture and use it for the depth buffer… Time to improve my shader-coding skills :stuck_out_tongue:

Note that you can also (as far as I know) write from a shader directly to the depth buffer by declaring an o_depth output bound to the DEPTH semantic.

An additional note, for anyone interested in this kind of game-programming technique:
after some research on A Vampyre Story, I found that the game DOES NOT use the depth technique discussed in this thread, but rather a mix of 3D objects and/or characters with layered textured cards used to create depth in the hand-drawn stages. In theory this is not a perfect occlusion system, but with careful use of navmeshes/waypoints it is possible to achieve a good-looking effect.