I’ve wanted to use the threaded render pipeline, as it should produce an improvement in frame-rate on multi-core machines. However, for some time enabling it seemed to result in crashes. Today I decided to give it another shot, especially given that I’m presumably using more recent engine code than when I last tried.
To start with, it at least doesn’t seem to crash as it once did.
However, it also seems to break rendering somehow. Specifically, it appears to be unhappy with my shadow-buffers, and in particular with their frame-buffer properties. All is fine with multi-threading disabled, but when it’s enabled I get a stream of errors. With “gl-debug #t” in my PRC file, the errors are as follows (with the final three lines repeated indefinitely, it seems):
(The repetition below is as printed in the output, I believe.)
:display:gsg:glgsg(warning): Framebuffer unsupported. Framebuffer object light buffer is unsupported because the depth and stencil attachments are mismatched.
:display:gsg:glgsg(warning): Framebuffer unsupported. Framebuffer object light buffer is unsupported because the depth and stencil attachments are mismatched.
:display:gsg:glgsg(warning): Framebuffer unsupported. Framebuffer object light buffer is unsupported because the depth and stencil attachments are mismatched.
:display:gsg:glgsg(error): EXT_framebuffer_object reports non-framebuffer-completeness:
:display:gsg:glgsg(error): FRAMEBUFFER_UNSUPPORTED for light buffer
:display:gsg:glgsg(warning): Framebuffer unsupported. Framebuffer object light buffer is unsupported because the depth and stencil attachments are mismatched.
:display:gsg:glgsg(error): GL_INVALID_FRAMEBUFFER_OPERATION error generated. Operation is not valid because a bound framebuffer is not framebuffer complete.
:display:gsg:glgsg(error): GL_INVALID_FRAMEBUFFER_OPERATION error generated. Operation is not valid because a bound framebuffer is not framebuffer complete.
I’m not currently requesting stencil bits, I believe; only depth bits, more or less as follows:
frameProps = FrameBufferProperties()
frameProps.setDepthBits(1)
I’ve tried changing the “1” to “16”, and adding “setStencilBits(0)”, to no avail.
The exact results after loading a level seem to vary with which threading model I choose: “Cull” still seems to crash or force-quit, while the other two produce different graphical glitches, likely related to the game’s shadow-mapping.
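For completeness, I’m selecting the model via the “threading-model” PRC variable, along these lines (model names as I understand them from the Panda3D manual; I may have the exact spellings wrong):

```
# Config.prc -- pick one threading model
threading-model Cull/Draw
# threading-model Cull
# threading-model /Draw

# OpenGL debug output, as used for the errors above
gl-debug #t
```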
Any suggestions, anyone?