CommonFilters prevents drawing an OnscreenImage?

I am setting up a blur filter like so:

from direct.filter.CommonFilters import CommonFilters

filters3D = CommonFilters(base.win, base.cam)
filters3D.setBlurSharpen(0.1)

and then later setting a background image by calling:

from direct.gui.OnscreenImage import OnscreenImage
background = OnscreenImage(parent=render2d, image=imagepath)

But the background doesn’t show up at all; the window is just grey. I tried adding a second filter to apply to the background:

filters2D = CommonFilters(base.win, base.cam2d)
filters2D.setBlurSharpen(1.0)

But that created the opposite problem: with both filters running, only the background renders, not the models themselves. Any advice on how this might be resolved?

Also, a follow-up question: the blur effect renders just fine in the live window, but I have a task that takes a screenshot of each frame, and the blur does not appear in the captured images. Any idea why?

Your “background” image is parented to render2d, which is layered on top of the 3-D display region, so it will be rendered over everything else.

You can change the sort order of the render2d display region, if you want, which will make it render underneath the 3-D scene.
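For example (a minimal sketch, assuming the default setup in which base.cam2d’s first display region is the one that renders render2d; the sort value just needs to be lower than the 3-D region’s):

base.cam2d.node().getDisplayRegion(0).setSort(-20)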

How exactly are you capturing the screenshots? Could you show the code?

Sure. I’m already using

base.cam2d.node().getDisplayRegion(0).setSort(-20)

right after I create the OnscreenImage, in an attempt to make it render behind everything else. But the problem of it not appearing at all unless I also create a filter for it seems like it shouldn’t have much to do with sorting, does it?

My screenshot code is this:

from panda3d.core import PNMImage, Filename

# Grab the current frame from the main camera's display region and write it out.
image = PNMImage()
base.camNode.getDisplayRegion(0).getScreenshot(image)
imageFile = "/file/path/here/image_{}.jpg".format(count)
image.write(Filename(imageFile))

This snippet sits at the bottom of a task that constantly updates the scene, so it just takes a screenshot of every frame and writes them out in sequential order. If there is a better way to take and save frames, I am certainly open to hearing it. Thank you for your help!

CommonFilters changes the given display region so that it instead contains a camera that renders a fullscreen quad, textured with the result of a render-to-texture pass of the scene into an offscreen buffer. At that point, base.camNode is no longer associated with the display region on the window, but with the display region on the offscreen buffer.

If you want to take a screenshot of the filtered result, you need to take a screenshot of the DisplayRegion that’s on the window, not the one on the offscreen buffer. You can get at it by storing a reference to the DisplayRegion before setting up CommonFilters, by accessing it via base.win.getDisplayRegion, or via filters3D.manager.region.
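For example, something along these lines (a minimal sketch based on your snippet, assuming the filters3D and count variables from above):

from panda3d.core import PNMImage, Filename

# The region on the window that shows the filtered fullscreen quad.
windowRegion = filters3D.manager.region

image = PNMImage()
windowRegion.getScreenshot(image)
image.write(Filename("/file/path/here/image_{}.jpg".format(count)))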

Okay, great. That solved the issue of the blurred effect not appearing in the images. Any thoughts on the issue of the background? I think I’ve provided all relevant code at this point, but I’m happy to show you whatever else you want to see.

Do you want your background to be included in the filter? If so, you may need to create a new DisplayRegion on the FilterManager’s scene buffer (I think this would be filters3D.manager.buffers[0]) underneath the standard one, and set up a 2D camera to render your image there.
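Very roughly, something like this (an untested sketch; the buffer index, sort value, and lens settings are assumptions that mimic the default render2d setup):

from panda3d.core import NodePath, Camera, OrthographicLens
from direct.gui.OnscreenImage import OnscreenImage

sceneBuffer = filters3D.manager.buffers[0]

# A separate 2-D scene graph just for the background.
bgRender2d = NodePath("bgRender2d")
bgRender2d.setDepthTest(False)
bgRender2d.setDepthWrite(False)

# A 2-D camera with an orthographic lens like render2d's default.
lens = OrthographicLens()
lens.setFilmSize(2, 2)
lens.setNearFar(-1000, 1000)
bgCamera = bgRender2d.attachNewNode(Camera("bgCam2d"))
bgCamera.node().setLens(lens)

# A new display region on the scene buffer that draws before the 3-D region.
bgRegion = sceneBuffer.makeDisplayRegion()
bgRegion.setSort(-20)
bgRegion.setCamera(bgCamera)

background = OnscreenImage(parent=bgRender2d, image=imagepath)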

Alternatively, you probably want to make sure that the scene buffer is cleared to a fully transparent color of (0, 0, 0, 0), so that you can use premultiplied alpha blending to have the filtered result blend on top of the background when it is blitted onto the main window.
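Roughly like this (an untested sketch; filters3D.finalQuad is my guess at the fullscreen card that CommonFilters renders its result onto, and the buffer index is an assumption):

from panda3d.core import ColorBlendAttrib

sceneBuffer = filters3D.manager.buffers[0]
sceneBuffer.setClearColorActive(True)
sceneBuffer.setClearColor((0, 0, 0, 0))

# Blend the filter card over whatever is already on the window,
# treating its colour as premultiplied by alpha.
filters3D.finalQuad.setAttrib(ColorBlendAttrib.make(
    ColorBlendAttrib.MAdd,
    ColorBlendAttrib.OOne,
    ColorBlendAttrib.OOneMinusIncomingAlpha))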

I do not want the background image to be included. Ideally, it would render normally, just behind the models in the scene (which are included in the filter). What function should I call to set the transparency of the scene background?