Offscreen screenshot with background image

I want to be able to render a scene on top of an image.

When working in onscreen mode, I’m using this line to load the image as the background:
OnscreenImage(parent=self.scene.render2d, image=camera_frame['file_name'])

But when working in offscreen mode I’m using the CardMaker object.
I also want to be able to save the rendered scene to a local file, so I wrote the code below. It does the job, but there are two problems.

The first is that I have to run renderFrame() twice in order for the background image to appear in the saved image.

The second is that in offscreen mode, using base.camNode.getDisplayRegion(0) to get the display region produces an 800x600 image, while the real image is much bigger than that.

from panda3d.core import Texture, PNMImage, Filename, CardMaker, TextureStage

# Load the background image into a texture
bg_tex = Texture()
bg = PNMImage()
bg.read(Filename(frame["file_name"]))
bg_tex.load(bg)

# Put the texture on a card in the scene
cm = CardMaker('')
card = base.render.attachNewNode(cm.generate())
card.setTexture(bg_tex, 1)
card.setTexScale(TextureStage.getDefault(), bg_tex.getTexScale())

# Rendering once is not enough: the background only shows up after two frames
base.graphicsEngine.renderFrame()
base.graphicsEngine.renderFrame()
dr = base.camNode.getDisplayRegion(0)

base.screenshot(source=dr)
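
To see where the 800x600 comes from, the sizes can be queried directly (a quick diagnostic, using the same dr as above):

# Pixel size covered by the display region
print(dr.getPixelWidth(), dr.getPixelHeight())
# Size of the underlying window/buffer
print(base.win.getXSize(), base.win.getYSize())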

I tried using something like

window = base.graphicsEngine.makeOutput(
    base.pipe, '', 0, buffer_props, win_props,
    GraphicsPipe.BFRefuseWindow
)

But I couldn’t get it to work in the end.
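
For reference, the buffer_props and win_props in that attempt would be something along these lines (a rough sketch; image_width and image_height are placeholders for the background image’s dimensions):

from panda3d.core import FrameBufferProperties, WindowProperties, GraphicsPipe

buffer_props = FrameBufferProperties()
buffer_props.setRgbColor(True)
buffer_props.setDepthBits(24)

# Placeholders: the background image's dimensions
win_props = WindowProperties.size(image_width, image_height)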

Is your window (or display region) perhaps of size 800 x 600? I would imagine that the “screenshot” method would produce an image matching the region being screenshotted, regardless of the size of any textures within that region, which might thus account for the size of screenshot that you’re getting.

If, however, your window (or display region) is of size larger than 800 x 600, then perhaps it’s an issue with the “screenshot” method provided by ShowBase, or the use thereof for this purpose. In that case, perhaps instead try the “getScreenshot” method provided by the “DisplayRegion” class, like so:

dr = base.camNode.getDisplayRegion(0)
screenshot = dr.getScreenshot()
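
If you want the result written straight to disk, I believe DisplayRegion also offers a saveScreenshot method, which might look something like this:

from panda3d.core import Filename

dr = base.camNode.getDisplayRegion(0)
dr.saveScreenshot(Filename("screenshot.png"))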

Thank you for your reply!

dr.getScreenshot() also results in 800x600 resolution.

I ended up taking a different approach because the CardMaker issue was causing a lot of trouble.
Instead, I render the image on a plane model, set it to face the camera at the correct distance and proportions, and use the window as the screenshot source.
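
Roughly, the background plane is set up like this (a simplified sketch; distance, width_at_distance and height_at_distance are placeholders that I compute from the camera lens and the image proportions):

from panda3d.core import CardMaker

cm = CardMaker('background')
cm.setFrame(-1, 1, -1, 1)
bg_card = base.cam.attachNewNode(cm.generate())
bg_card.setTexture(bg_tex, 1)

# Place the card in front of the camera and scale it to fill the view;
# the exact values depend on the lens FOV and the image's aspect ratio
bg_card.setPos(0, distance, 0)
bg_card.setScale(width_at_distance / 2.0, 1, height_at_distance / 2.0)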

The new code (see below) produces an image with the proper dimensions, but it stretches the image vertically a little. When I compare the onscreen render with the screenshot, the screenshot is slightly stretched.


The black outline is the final image’s dimensions; the red outline is the rendered scene’s dimensions.

My new screenshot code looks like this:

from panda3d.core import FrameBufferProperties, WindowProperties, GraphicsPipe

fb_prop = FrameBufferProperties()
fb_prop.setRgbColor(True)
fb_prop.setRgbaBits(8, 8, 8, 0)
fb_prop.setDepthBits(24)

# Offscreen buffer with the same size as the background image
win_prop = WindowProperties.size(frame["width"], frame["height"])
window = base.graphicsEngine.makeOutput(
    base.pipe, "cameraview", 0, fb_prop,
    win_prop, GraphicsPipe.BFRefuseWindow
)

disp_region = window.makeDisplayRegion()
disp_region.setCamera(base.cam)
base.graphicsEngine.renderFrame()

base.screenshot(source=window)

EDIT
I also tried using window.getScreenshot() and extracting the image from the texture, but I get the same stretch for some reason… :frowning:

import matplotlib.pyplot as plt
import numpy as np
import cv2

# Grab the rendered frame as a Texture and view its RAM image as a numpy array
scr_s = window.getScreenshot()
bgr_img = np.frombuffer(scr_s.getRamImage(), dtype=np.uint8)
bgr_img = bgr_img.reshape(scr_s.getYSize(), scr_s.getXSize(), scr_s.getNumComponents())

# Panda stores the image bottom-up in BGR order, so flip and convert for display
img = cv2.cvtColor(np.flipud(bgr_img), cv2.COLOR_BGR2RGB)
plt.imshow(img)

Here’s an example of the issue. The area in the red square on the left is missing on the right.

To rule this out as a potential source of issues, does the problem go away when you set textures-power-2 none in Config.prc?
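
That setting can go into Config.prc, or be applied in code before ShowBase is created, for example:

from panda3d.core import loadPrcFileData

# Must run before ShowBase() is instantiated
loadPrcFileData('', 'textures-power-2 none')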

no…

I also tried testing with different image sizes (that’s how I set the window props) and saw that the “stretch” is inconsistent.
The only time I was able to get a screenshot similar to the onscreen window was when the window’s width was exactly twice its height.

By the way, the stretch happens to the entire rendered scene, not just to the background image, meaning that the 3D elements are also misplaced in the same way.

It’s like it took a screenshot of the whole scene, then stretched it vertically and trimmed it to fit the display dimensions.

Panda does stretch the image to fit the -1…1 coordinate range. Is the size of the image not the same as the size of the window, then?

It is, and in onscreen mode it looks fine. The problem occurs when taking a screenshot.