Buffer-related Python crashes with newer Panda versions

Hi,

It’s been a while since I last ran a Panda application on my laptop (Windows 7, 64-bit). With the newer Panda versions (the official 1.9.0 release and the 1.10.0 development builds, both 32-bit and 64-bit), I now notice two different things that crash the Python interpreter on this system, while everything works fine with Panda 1.8.1 and the early 1.9.0 development builds.

In both cases there is no Python traceback or any Panda error-related console output, only the typically cryptic Windows popup message:

ppython.exe has stopped working

The first cause of such a crash is simply this line:

loadPrcFileData('', 'window-type offscreen')
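
For reference, a minimal script along these lines should be enough to reproduce it on that machine (the bare ShowBase is only there so the offscreen window actually gets opened):

from panda3d.core import loadPrcFileData
from direct.showbase.ShowBase import ShowBase

# Request an offscreen window before ShowBase opens the default window;
# on the affected laptop this alone crashes the interpreter.
loadPrcFileData('', 'window-type offscreen')

app = ShowBase()
app.run()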

On my desktop PC with Windows 8.1 and the final Panda3D 1.9.0 release, where I normally do my development, I merely get the following warning (and the application runs normally):

:display(warning): FrameBufferProperties available less than requested.

The second thing that makes Python crash is any call to base.win.makeTextureBuffer, as in the following basic code sample:

from panda3d.core import *
from direct.showbase.ShowBase import ShowBase


class MyApp(ShowBase):

    def __init__(self):

        ShowBase.__init__(self)

        #self.tex = Texture("buffer_tex")
        self.buffer = self.win.makeTextureBuffer("buffer", 256, 256)#, self.tex)
        assert self.buffer


app = MyApp()
app.run()

No assertion error is raised, but as soon as a frame is rendered, the crash occurs.
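
For what it’s worth, a single manually rendered frame should be enough to trigger it; replacing app.run() above with something like the following could confirm that (untested sketch):

app = MyApp()
# Rendering just one frame should be enough to trigger the crash
# on the affected system (no need for the full main loop).
app.graphicsEngine.renderFrame()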

The graphics card in that laptop is an NVIDIA GeForce GT 540M by the way.

On a related note, though it perhaps has nothing to do with Panda3D, I recently encountered another weird bug that crashes the Python interpreter.

My current project is rather complex, with several packages and sub-packages, so a lot of imports are being done. At a certain point, I added some extra code to one of the imported modules. The project ran fine. Then I ran it again, without any changes, and it crashed immediately. It turned out to be related to the *.pyc files being used instead of the *.py files, because as soon as I deleted the *.pyc files, everything worked again.

So it seems that Python can, in some cases, compile working source code into faulty bytecode?
Really strange.
Anyway, I worked around this problem by importing some modules only when needed, and that seems to do the trick, for now at least…
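
If the stale bytecode theory is correct, a more systematic workaround would be to clear the compiled files before each run; a small (hypothetical) helper like this should do it:

import os

def remove_pyc_files(root_dir):
    # Walk the project tree and delete all *.pyc files, forcing Python
    # to recompile from the *.py sources on the next run.
    for dir_path, dir_names, file_names in os.walk(root_dir):
        for file_name in file_names:
            if file_name.endswith('.pyc'):
                os.remove(os.path.join(dir_path, file_name))

remove_pyc_files('.')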

I tried it on a 64-bit Windows 7 machine with a GeForce 650M, but I don’t get a crash. Maybe you could attach the output you get when you set “gl-debug true” and “notify-level-glgsg debug” in Config.prc.

Also make sure your drivers are up-to-date.
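
If you’d rather not edit Config.prc itself, those variables can also be set from the script, before ShowBase is instantiated, for example:

from panda3d.core import loadPrcFileData

# Enable extra OpenGL debug output for this run only.
loadPrcFileData('', 'gl-debug true')
loadPrcFileData('', 'notify-level-glgsg debug')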

Would this mean that the minimum OpenGL version to create a graphics buffer (at least that seems to be the issue?) has changed from Panda 1.8.1 to 1.10.0?

Ah, thank you for those configuration options.
The funny thing is that, according to the output, the application is using the integrated graphics instead of the NVIDIA card:

:display:gsg:glgsg(debug): HAS EXT WGL_ARB_pixel_format 1
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_multisample 0
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_create_context 0
:display:gsg:glgsg(debug): GL_VENDOR = Intel
:display:gsg:glgsg(debug): GL_RENDERER = Intel(R) HD Graphics
:display:gsg:glgsg(debug): GL_VERSION = 2.1.0 - Build 8.15.10.2189
:display:gsg:glgsg(debug): GL_VERSION decoded to: 2.1
:display:gsg:glgsg(debug): GL_SHADING_LANGUAGE_VERSION = 1.20  - Intel Build 8.15.10.2189
:display:gsg:glgsg(debug): Detected GLSL version: 1.20
:display:gsg:glgsg(debug): Using compatibility profile
:display:gsg:glgsg(debug): GL Extensions:
  GL_3DFX_texture_compression_FXT1       GL_ARB_color_buffer_float
  GL_ARB_depth_buffer_float              GL_ARB_depth_texture
  GL_ARB_draw_buffers                    GL_ARB_draw_instanced
  GL_ARB_fragment_program                GL_ARB_fragment_shader
  GL_ARB_framebuffer_sRGB                GL_ARB_half_float_pixel
  GL_ARB_half_float_vertex               GL_ARB_multitexture
  GL_ARB_occlusion_query                 GL_ARB_pixel_buffer_object
  GL_ARB_point_parameters                GL_ARB_point_sprite
  GL_ARB_shader_objects                  GL_ARB_shading_language_100
  GL_ARB_shadow                          GL_ARB_texture_border_clamp
  GL_ARB_texture_compression             GL_ARB_texture_compression_rgtc
  GL_ARB_texture_cube_map                GL_ARB_texture_env_add
  GL_ARB_texture_env_combine             GL_ARB_texture_env_crossbar
  GL_ARB_texture_env_dot3                GL_ARB_texture_float
  GL_ARB_texture_non_power_of_two        GL_ARB_texture_rectangle
  GL_ARB_texture_rg                      GL_ARB_transpose_matrix
  GL_ARB_vertex_array_object             GL_ARB_vertex_buffer_object
  GL_ARB_vertex_program                  GL_ARB_vertex_shader
  GL_ARB_window_pos                      GL_ATI_separate_stencil
  GL_EXT_abgr                            GL_EXT_bgra
  GL_EXT_blend_color                     GL_EXT_blend_equation_separate
  GL_EXT_blend_func_separate             GL_EXT_blend_minmax
  GL_EXT_blend_subtract                  GL_EXT_clip_volume_hint
  GL_EXT_compiled_vertex_array           GL_EXT_draw_buffers2
  GL_EXT_draw_range_elements             GL_EXT_fog_coord
  GL_EXT_framebuffer_blit                GL_EXT_framebuffer_object
  GL_EXT_multi_draw_arrays               GL_EXT_packed_depth_stencil
  GL_EXT_packed_float                    GL_EXT_packed_pixels
  GL_EXT_rescale_normal                  GL_EXT_secondary_color
  GL_EXT_separate_specular_color         GL_EXT_shadow_funcs
  GL_EXT_stencil_two_side                GL_EXT_stencil_wrap
  GL_EXT_texture3D                       GL_EXT_texture_compression_s3tc
  GL_EXT_texture_edge_clamp              GL_EXT_texture_env_add
  GL_EXT_texture_env_combine             GL_EXT_texture_filter_anisotropic
  GL_EXT_texture_lod_bias                GL_EXT_texture_rectangle
  GL_EXT_texture_sRGB                    GL_EXT_texture_shared_exponent
  GL_EXT_texture_swizzle                 GL_EXT_transform_feedback
  GL_IBM_texture_mirrored_repeat         GL_NV_blend_square
  GL_NV_conditional_render               GL_NV_texgen_reflection
  GL_SGIS_generate_mipmap                GL_SGIS_texture_edge_clamp
  GL_SGIS_texture_lod                    GL_WIN_swap_hint
  WGL_ARB_buffer_region                  WGL_ARB_extensions_string
  WGL_ARB_framebuffer_sRGB               WGL_ARB_make_current_read
  WGL_ARB_pbuffer                        WGL_ARB_pixel_format
  WGL_ARB_pixel_format_float             WGL_EXT_depth_float
  WGL_EXT_extensions_string              WGL_EXT_swap_control
:display:gsg:glgsg(debug): HAS EXT GL_KHR_debug 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_debug_output 0
:display:gsg:glgsg(debug): gl-debug enabled, but NOT supported.
:display:gsg:glgsg(debug): HAS EXT GL_ARB_point_sprite 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_ES3_compatibility 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_storage 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_texture_storage 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_clear_texture 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_texture_array 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_cube_map 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_seamless_cube_map 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_cube_map_array 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_buffer_object 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_bgra 1
:display:gsg:glgsg(debug): HAS EXT GL_EXT_rescale_normal 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_vertex_array_bgra 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_vertex_array_bgra 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_vertex_type_10f_11f_11f_rev 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_multisample 0
:display:gsg:glgsg(debug): HAS EXT GL_SGIS_generate_mipmap 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_non_power_of_two 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_framebuffer_object 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_packed_depth_stencil 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_shadow 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_fragment_program_shadow 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_env_combine 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_env_crossbar 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_env_dot3 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_map_buffer_range 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_buffer_storage 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_vertex_array_object 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_tessellation_shader 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_geometry_shader4 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_geometry_shader4 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_vertex_program 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_fragment_program 1
:display:gsg:glgsg(debug): HAS EXT GL_NV_gpu_program5 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_gpu_program4 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_vertex_program3 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_vertex_program2 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_vertex_program1_1 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_compute_shader 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_vertex_attrib_64bit 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_uniform_buffer_object 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_instanced_arrays 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_draw_instanced 1
:display:gsg:glgsg(debug): HAS EXT GL_EXT_framebuffer_object 1
:display:gsg:glgsg(debug): HAS EXT GL_EXT_framebuffer_multisample 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_framebuffer_multisample_coverage 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_framebuffer_blit 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_viewport_array 0
:display:gsg:glgsg(debug): Occlusion query counter provides 64 bits.
:display:gsg:glgsg(debug): HAS EXT GL_ARB_timer_query 0
:display:gsg:glgsg(debug): HAS EXT GL_SGIS_texture_edge_clamp 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_border_clamp 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_texture_mirrored_repeat 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_texture_mirror_clamp 0
:display:gsg:glgsg(debug): max texture dimension = 4096, max 3d texture = 128, max 2d texture array = 0, max cube map = 2048
:display:gsg:glgsg(debug): max_elements_vertices = 1024, max_elements_indices = 1024
:display:gsg:glgsg(debug): vertex buffer objects are supported.
:display:gsg:glgsg(debug): Supported compressed texture formats:
  GL_COMPRESSED_RGB_S3TC_DXT1_EXT
  GL_COMPRESSED_RGBA_S3TC_DXT3_EXT
  GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
  GL_COMPRESSED_RGBA_S3TC_DXT1_EXT
  GL_COMPRESSED_RGB_FXT1_3DFX
  GL_COMPRESSED_RGBA_FXT1_3DFX
:display:gsg:glgsg(debug): HAS EXT GL_EXT_texture_filter_anisotropic 1
:display:gsg:glgsg(debug): HAS EXT GL_ARB_shader_image_load_store 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_shader_image_load_store 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_sampler_objects 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_multi_bind 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_internalformat_query2 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_bindless_texture 0
:display:gsg:glgsg(debug): HAS EXT GL_ARB_get_program_binary 0
:display:gsg:glgsg(debug): HAS EXT GL_EXT_stencil_wrap 1
:display:gsg:glgsg(debug): HAS EXT GL_EXT_stencil_two_side 1
:display:gsg:glgsg(debug): max lights = 16
:display:gsg:glgsg(debug): max clip planes = 6
:display:gsg:glgsg(debug): max texture stages = 8
:display:gsg:glgsg(debug): HAS EXT GL_NV_gpu_program5 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_gpu_program4 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_fragment_program2 0
:display:gsg:glgsg(debug): HAS EXT GL_NV_fragment_program 0
:display:gsg:glgsg(debug): Supported Cg profiles:
:display:gsg:glgsg(debug):   arbvp1
:display:gsg:glgsg(debug):   arbfp1
:display:gsg:glgsg(debug):   glslv
:display:gsg:glgsg(debug):   glslf
:display:gsg:glgsg(debug): Cg GLSL version = CG_GL_GLSL_120
:display:gsg:glgsg(debug): Cg latest vertex profile = arbvp1
:display:gsg:glgsg(debug): Cg latest fragment profile = arbfp1
:display:gsg:glgsg(debug): Cg latest geometry profile = unknown
:display:gsg:glgsg(debug): basic-shaders-only #f
:display:gsg:glgsg(debug): Cg active vertex profile = glslv
:display:gsg:glgsg(debug): Cg active fragment profile = glslf
:display:gsg:glgsg(debug): Cg active geometry profile = unknown
:display:gsg:glgsg(debug): shader model = 2.0
:display:gsg:glgsg(debug): HAS EXT WGL_EXT_swap_control 1
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_pbuffer 1
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_pixel_format 1
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_multisample 0
:display:gsg:glgsg(debug): HAS EXT WGL_ARB_render_texture 0
:display:gsg:glgsg(debug): Creating depth stencil renderbuffer.

Maybe it’s because I’m using an external monitor? I’ve checked the NVIDIA Control Panel and made sure that “High-performance NVIDIA processor” is set as the preferred graphics processor under Manage 3D Settings → Global Settings. I even did this specifically for ppython.exe under Manage 3D Settings → Program Settings. On top of that, I created a shortcut to the Panda application and ran it by right-clicking the shortcut and choosing “Run with graphics processor → High-performance NVIDIA processor (default)” from the context menu. But the result is always the same. Perhaps there is some option in Python or Panda that I need to set?

My apologies if this is going far beyond regular Panda troubleshooting; for me personally it’s not such a big deal, as I’m not using this machine for serious development anyway (but for others with the same problem a solution might still be welcome of course).

No, it has not changed; this must be an unintended bug.

Hmm, yes, that does seem to be the case. I take it you’re using a laptop with NVIDIA Optimus; I am able to switch to the NVIDIA card on my own Optimus laptop via the NVIDIA Control Panel. Strange that it doesn’t work for you.

Nevertheless, it is supposed to work just fine on the Intel card as well. Which specific Intel HD Graphics chip is in there? 3000, 4000?

The Intel drivers do appear to be outdated. Curiously, the driver does not advertise GL_ARB_framebuffer_object support, but it does support the older GL_EXT_framebuffer_object extension. However, Panda does take advantage of that extension, so it should still work.

As for tracking down the issue, it seems a bit suspect that this is the last thing it mentions:

:display:gsg:glgsg(debug): Creating depth stencil renderbuffer.

This suggests the crash happens in the code that sets up a depth-stencil renderbuffer. To check, it would be interesting to set up the buffer differently: either create a depth texture to attach to the depth slot, or request a 32-bit depth buffer, so that Panda does not create a depth-stencil renderbuffer at all.

Let’s start with the most important thing:

Yes, you’re right, that did it! :)

For those interested, here’s some working code for the first suggestion:

from panda3d.core import *
from direct.showbase.ShowBase import ShowBase


class MyApp(ShowBase):

    def __init__(self):

        ShowBase.__init__(self)

        #self.tex = Texture("buffer_tex")
        self.buffer = self.win.makeTextureBuffer("buffer", 256, 256)#, self.tex)
        self.depth_tex = Texture("depth_texture")
        self.buffer.addRenderTexture(self.depth_tex, GraphicsOutput.RTMBindOrCopy,
                                      GraphicsOutput.RTPDepth)


app = MyApp()
app.run()

The last part of the output now looks like this, instead of ending with the line quoted above:

:display:gsg:glgsg(debug): Binding texture depth_texture to depth attachment.
:display:gsg:glgsg(debug): loading texture with NULL image depth_texture
:display:gsg:glgsg(debug): loading new texture object for depth_texture, 256 x 256 x 1, z = 0, mipmaps 0, uses_mipmaps = 0
:display:gsg:glgsg(debug):   (initializing NULL image)
:display:gsg:glgsg(debug): Creating color renderbuffer.

And here is working code for the second suggestion:

from panda3d.core import *
from direct.showbase.ShowBase import ShowBase


class MyApp(ShowBase):

    def __init__(self):

        ShowBase.__init__(self)

        props = FrameBufferProperties()
        props.setDepthBits(32)
        #props.setFloatDepth(True) # this can be used as fix instead of the line above
        self.buffer = self.win.makeTextureBuffer("buffer", 256, 256, fbp=props)


app = MyApp()
app.run()

Note that in the second solution, using props.setFloatDepth(True) instead of props.setDepthBits(32) also works.

This time, the last part of the output is:

:display:gsg:glgsg(debug): Creating depth renderbuffer.
:display:gsg:glgsg(debug): Creating color renderbuffer.

So it seems it is indeed related to the depth-stencil renderbuffer setup.

There are, however, still a couple of other related things I’d like to report.
In my own project, with the previous Panda versions, I used the following camera setup for picking objects based on their vertex color:

class PickingCamera(object):

    def __init__(self):

        self._tex = Texture("picking_texture")
        props = FrameBufferProperties()
        #props.setFloatColor(True) # makes it impossible to get a TexturePeeker
        props.setDepthBits(32)
        #props.setFloatDepth(True) # this can be used as fix instead of the line above
        self._buffer = base.win.makeTextureBuffer("picking_buffer",
                                                  1, 1,
                                                  self._tex,
                                                  to_ram=True,
                                                  fbp=props)
        self._buffer.setClearColor(VBase4())
        self._buffer.setClearColorActive(True)
        self._buffer.setSort(-100)
        self._tex_peeker = None
        self._np = base.makeCamera(self._buffer)
        self._mask = BitMask32.bit(21)
        self._pixel_color = VBase4()
        node = self._np.node()
        lens = node.getLens()
        # use the bounds of a 1-degree frustum for culling, but render with
        # a much narrower .1-degree frustum (just the pixel under the mouse)
        lens.setFov(1.)
        cull_bounds = lens.makeBounds()
        lens.setFov(.1)
        node.setCullBounds(cull_bounds)
        node.setCameraMask(self._mask)

        state_np = NodePath("flat_color_state")
        state_np.setTextureOff(1)
        state_np.setMaterialOff(1)
        state_np.setShaderOff(1)
        state_np.setLightOff(1)
        state_np.setColorOff(1)
        state_np.setColorScaleOff(1)
        state_np.setRenderModeThickness(5, 1)
        state_np.setTransparency(TransparencyAttrib.MNone, 1)
        node.setInitialState(state_np.getState())

        base.taskMgr.add(self.__checkPixel, "get_pixel_under_mouse", priority=0)


    def __checkPixel(self, task):

        if not base.mouseWatcherNode.hasMouse():
          return task.cont

        screen_pos = base.mouseWatcherNode.getMouse()
        far_point = Point3()
        base.camLens.extrude(screen_pos, Point3(), far_point)
        self._np.lookAt(far_point)

        if not self._tex_peeker:
          self._tex_peeker = self._tex.peek()
          return task.cont

        self._tex_peeker.lookup(self._pixel_color, .5, .5)
        print "pixel color:", self._pixel_color # alpha always 1.

        return task.cont

The above worked well, but with Panda 1.9.0 and later, the alpha component of the pixel color as returned by the TexturePeeker is always 1, while with older Panda versions it was either 0 or the alpha value of the vertex color (which I need in order to distinguish between different types of objects).
So it looks like something related to the TexturePeeker class has changed as well?
Also note that calling setFloatColor(True) on the FrameBufferProperties for the offscreen buffer makes the call to peek() on the associated render texture return None, so maybe there’s a bug in there too? (This is not related to my laptop, by the way.)

I then found a workaround that uses a PNMImage instead of a TexturePeeker, like this (new code is indicated with a “# workaround” comment):

class PickingCamera(object):

    def __init__(self):

        self._tex = Texture("picking_texture")
        self._img = PNMImage(1, 1) # workaround
        props = FrameBufferProperties()
        props.setFloatColor(True) # workaround
        props.setAlphaBits(32) # workaround
        props.setDepthBits(32)
        #props.setFloatDepth(True) # this can be used as fix instead of the line above
        self._buffer = base.win.makeTextureBuffer("picking_buffer",
                                                  1, 1,
                                                  self._tex,
                                                  to_ram=True,
                                                  fbp=props)
        self._buffer.setClearColor(VBase4())
        self._buffer.setClearColorActive(True)
        self._buffer.setSort(-100)
        self._np = base.makeCamera(self._buffer)
        self._mask = BitMask32.bit(21)
        self._pixel_color = VBase4()
        node = self._np.node()
        lens = node.getLens()
        lens.setFov(1.)
        cull_bounds = lens.makeBounds()
        lens.setFov(.1)
        node.setCullBounds(cull_bounds)
        node.setCameraMask(self._mask)

        state_np = NodePath("flat_color_state")
        state_np.setTextureOff(1)
        state_np.setMaterialOff(1)
        state_np.setShaderOff(1)
        state_np.setLightOff(1)
        state_np.setColorOff(1)
        state_np.setColorScaleOff(1)
        state_np.setRenderModeThickness(5, 1)
        state_np.setTransparency(TransparencyAttrib.MNone, 1)
        node.setInitialState(state_np.getState())

        base.taskMgr.add(self.__checkPixel, "get_pixel_under_mouse", priority=0)


    def __checkPixel(self, task):

        if not base.mouseWatcherNode.hasMouse():
          return task.cont

        screen_pos = base.mouseWatcherNode.getMouse()
        far_point = Point3()
        base.camLens.extrude(screen_pos, Point3(), far_point)
        self._np.lookAt(far_point)
        self._tex.store(self._img) # workaround
        self._pixel_color = self._img.getXelA(0, 0) # workaround
        print "pixel color:", self._pixel_color # always LVecBase4f(0, 0, 0, 0)

        return task.cont

This code works fine on my desktop, but again not on my laptop: the returned pixel color under the mouse is always LVecBase4f(0, 0, 0, 0).

Then about my laptop.

Processor / integrated graphics: Intel Core i3-390M / Graphics Media Accelerator HD (GMA HD)

If you’re interested, here you can find all the details of my laptop.

There’s nothing about Optimus in the BIOS, so I can’t check how it’s configured from there.
So I downloaded and ran this nVidia Optimus GPU State Viewer, and indeed it shows “NVIDIA GPU: OFF” while running Panda (or any other 3D) applications.
However, if I go to “3D Settings → Adjust image settings with preview” in the NVIDIA Control Panel, then the tool does say “NVIDIA GPU: ON” and lists nvcplui.exe as an application rendered by the NVIDIA card, so I know it’s not defective.

Updating the drivers seems like the obvious thing to do, but a previous bad experience has made me a bit reluctant to do so.

The bug is still active and has been reported here:
bugs.launchpad.net/panda3d/+bug/1495955

Well, you cannot be guaranteed to have alpha bits in the framebuffer without setAlphaBits, of course.

Instead of setAlphaBits(32) and setFloatColor(True) (a rather strange combination of options, especially given that it would yield a very slow framebuffer), you should try calling setRgbaBits(8, 8, 8, 8) to make sure you are requesting all of the bits.
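
Applied to your picking buffer, that would look roughly like this (just a sketch, not tested on your hardware):

tex = Texture("picking_texture")
props = FrameBufferProperties()
# Explicitly request 8 bits per channel, including alpha, instead of
# combining setAlphaBits(32) with setFloatColor(True).
props.setRgbaBits(8, 8, 8, 8)
props.setDepthBits(32)
buffer = base.win.makeTextureBuffer("picking_buffer", 1, 1, tex,
                                    to_ram=True, fbp=props)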

My issue is more about the loadPrcFileData('', 'window-type offscreen') line. I ran into this problem while trying to set up tobspr’s pipeline; it was not obvious why the interpreter simply crashed.

Since this isn’t really related to the original topic anymore, I’ve created a new one here.