Don't know if this is a bad compilation or video issues

I don’t know if this is a problem of a bad compilation, missing libraries, a bad video driver, or something not implemented in the Linux version of Panda3D… but I tried all the tests, and some of them will not run.

“Toon Shader: Video driver does not support multiple render targets”
“Toon Shader: Video driver does not support multiple render targets”
“OpenAL support is not enabled, cannot proceed.”
“Shadow Demo: Video driver cannot create an offscreen buffer.”

Well… the “video driver” errors sound like they really are issues with my video driver, but I want to be sure they come from my video card (or drivers) and not from Panda itself, the compilation, or bad libraries.

All the other tests worked perfectly.

My video card is an NVIDIA GeForce 6200 and I’m using driver version 100.14.03.

By the way, those errors appear in the “Panda” window in the lower right corner.

I can answer all four of those. Three of those errors are actually the same error — it’s trying to create an offscreen buffer, and failing. The GeForce 6200 itself is quite capable of creating an offscreen buffer. So that leaves two possibilities. Either the driver’s no good, or the code we use to create offscreen buffers under Linux contains a bug. Since I rewrote a good chunk of that code myself, and since I’ve rarely tested under Linux, it’s entirely possible that there’s a bug. On the other hand, I’d like to hear from some of the other Linux users here — do those sample programs work for you?
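For anyone who wants to test this directly: the failure described here can be reproduced outside the samples with a single buffer request. Below is a minimal diagnostic sketch — `describe_buffer()` is my own helper, not a Panda3D API; the grounded part is that `makeTextureBuffer()` returns None when no offscreen buffer can be created.

```python
# Minimal diagnostic sketch for the offscreen-buffer failure.
# describe_buffer() is a hypothetical helper, not part of Panda3D.
def describe_buffer(buf):
    """Turn the result of a buffer-creation attempt into a verdict."""
    if buf is None:
        return "offscreen buffer creation FAILED"
    # getType().getName() reports which buffer class Panda actually chose.
    return "got a " + buf.getType().getName()

try:
    import direct.directbase.DirectStart   # opens the main Panda window
    buf = base.win.makeTextureBuffer("diag", 256, 256)
    print(describe_buffer(buf))
except Exception:
    pass  # Panda3D not installed (or no display) in this environment
```

If this prints the FAILED line on your machine but works for other Linux users with the same card, that points at the driver rather than Panda’s buffer code.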

The error about OpenAL is quite correct. I added OpenAL support to Panda3D just about a month ago. However, it seems to be unstable under Linux, so I temporarily disabled it until I find the cause of the problem.

I get the same errors on my GeForce 5200, but my own application creates lots of offscreen buffers and they work fine for me…

Pro: in your app, what kind of offscreen buffer is it creating? There are three possibilities: a WglGraphicsBuffer (ie, pbuffer), a GlGraphicsBuffer (ie, fbo), or a ParasiteBuffer (ie, a fake offscreen buffer).
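To answer that question in code: the class name that comes back from `buf.getType().getName()` tells you which of the three you got. A sketch of the mapping — note that `glxGraphicsBuffer` (the Linux pbuffer) is an assumed name, inferred from Panda’s wgl/glx naming convention, so verify it against your own output:

```python
# Map the class name from buf.getType().getName() to the three buffer
# kinds listed above.  glxGraphicsBuffer is an assumed name, inferred
# from the wgl/glx naming convention -- check it locally.
BUFFER_KINDS = {
    "wglGraphicsBuffer": "pbuffer (Windows)",
    "glxGraphicsBuffer": "pbuffer (Linux, assumed name)",
    "glGraphicsBuffer":  "FBO",
    "ParasiteBuffer":    "fake offscreen buffer",
}

def buffer_kind(type_name):
    """Classify a Panda3D graphics-buffer class name."""
    return BUFFER_KINDS.get(type_name, "unknown: " + type_name)
```

In an app, `buffer_kind(buf.getType().getName())` right after creating the buffer shows which path Panda took.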

Hi Josh,
I have just upgraded to v1.4.2 from v1.3.2, and got the same errors reported by frapell.
Using :
Ubuntu 7.04 (Feisty Fawn), Python 2.5
ATI Radeon 1950 Pro

In particular, & used to work, but now they fail with “Toon Shader: Video driver does not support multiple render targets”.
I see the only changes to those tutorials are paths in loadModel() & loadAnims(). I have not changed video drivers & have only taken Ubuntu security updates.

Seems like a Panda code change - unless it is one of those new config settings that I need to alter?

Also, I noticed in the Disco-Lights tutorial that the spotlight now has jagged edges & no longer responds to the Spotlight Exponent setting.


By the way, should there be a Sample-Programs--GUI?
My v1.4.2 installation has an empty directory.


Here are fixes for the other problems reported:

Procedural-Cube missing “envir-reeds.png”

cd /usr/share/panda3d/samples
sudo cp Sample-Programs--Fractal-Plants/envir-reeds.png Sample-Programs--Procedural-Cube/

Media-Player “OpenAL support is not enabled, cannot proceed.”

/etc/Config.prc is wrong (or else the comment should be changed!)

# Enable audio using the OpenAL audio library by default:
audio-library-name p3fmod_audio

should be

audio-library-name p3openal_audio

Unhelpful error messages are one of my pet hates, so may I recommend the following code for the tutorial?

# Imports needed if this snippet is used outside the tutorial file;
# MEDIAFILE is defined earlier in the tutorial.
from direct.showbase.DirectObject import DirectObject
from direct.gui.OnscreenText import OnscreenText
from pandac.PandaModules import TextNode

def addError(text):
    return OnscreenText(text=text, style=1, fg=(0.4, 0, 0, 1),
                        pos=(-1.0, 0.5), align=TextNode.ALeft, scale=.07)

class World(DirectObject):
  def __init__(self):
    self.tex = loader.loadTexture(MEDIAFILE)
    name = base.sfxManagerList[0].getType().getName()
    if (name != "OpenALAudioManager"):
      addError("OpenAL support is not enabled, cannot proceed.\n\n"+
               'Your sfxManager is "'+name+'", not "OpenALAudioManager".\n'+
               'To run this tutorial you need\n'+
               '     audio-library-name p3openal_audio\n'+
               'in your Panda3D Config.prc file.')
    name = self.tex.getType().getName()
    if (name != "MovieTexture"):
      addError("Cannot play this kind of media file.\n\n"+
               'Your media file texture is "'+name+'", not "MovieTexture"')

BTW, wouldn’t it be nice if there were consistent names?
audio-library-name ~ sfxManager
p3openal_audio ~ OpenALAudioManager
p3fmod_audio ~ FmodAudioManager


A quick search has not found any solution for the offscreen buffer problem. I thought I’d quickly post the above fixes, in case anyone else is suffering, before I start fumbling around trying to gather more info.

I can’t explain the buffer creation problems. I’ll look into them more this week. But I can explain the OpenAL issue - OpenAL doesn’t work under Linux. It compiles, but it crashes constantly. That’s why I enabled FMOD instead.

Offscreen buffers create fine here, though the samples say they don’t. There’s probably a bug in the place where Panda tests whether they are supported or not.

Works OK for me - Ubuntu 7.04.
Though I have only tried it on your panda AVI, plus a couple of short sample AVIs & MPGs downloaded from the web.

About the offscreen buffer problem - please let me know if there are any diagnostics I can get for you.
I’ll try ynjh_jo’s code & see what I find.

I am now (almost a year after this thread started) experiencing the same issues with multiple render targets not being supported (no error mentioning an offscreen buffer though, nor any problems with sound). In addition, one of the tutorials gives me an error of

“video card not powerful enough to do image postprocessing”

I have an FX5200 card and am running Ubuntu 8.04. Are these actual problems on Linux, or is something wrong with my setup somehow? Does anyone have all the tutorials working perfectly on Linux?

Thanks in advance. I am currently surveying several 3D libraries for a project, and Panda3D is on the shortlist. Hopefully the Linux problems mentioned above are fixable so that we can seriously consider using it.

I have the exact same card here, and am also running Ubuntu, but everything works fine here.
My guess is you are using old drivers. Try “sudo apt-get install nvidia-glx-new” to install the latest proprietary drivers. After you’ve done that, restart X or reboot.
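One way to confirm the new driver actually took effect after the restart: the driver version shows up in the OpenGL version string. A quick check (guarded in case glxinfo isn’t installed; on Ubuntu it comes from the mesa-utils package):

```shell
# Print the active OpenGL driver version; fall back gracefully if
# glxinfo (from the mesa-utils package) is not installed.
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo | grep -i "opengl version"
else
    echo "glxinfo not installed -- try: sudo apt-get install mesa-utils"
fi
```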

If that still doesn’t work, try disabling Xgl and rebooting:

mkdir ~/.config/xserver-xgl
touch ~/.config/xserver-xgl/disable

Thanks for the quick response.

I’m already using nvidia-glx-new, and I’m not running Xgl (that is, I’m not running Compiz - I presume that means I’m also not running Xgl?). So this still seems unclear.

Not sure how helpful this is, but I’ve tested a few other graphics engines (Ogre, Irrlicht, etc.), and all of their samples worked, including ones with various post-processing effects, rendering to multiple targets, etc. Of course this isn’t proof that my drivers are ok, since the other engines might be doing some of that in software, with a silent fallback.

Here is the output of ‘glxinfo’, maybe that’s helpful somehow?

name of display: :0.0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control, 
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync, 
    GLX_NV_swap_group, GLX_NV_video_out, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, 
    GLX_SGI_swap_control, GLX_NV_float_buffer, GLX_ARB_fbconfig_float, 
    GLX_EXT_fbconfig_packed_float, GLX_EXT_texture_from_pixmap, 
GLX version: 1.3
GLX extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control, 
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer, 
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce FX 5200/AGP/SSE2/3DNOW!
OpenGL version string: 2.1.2 NVIDIA 169.12
OpenGL extensions:
    GL_ARB_depth_texture, GL_ARB_fragment_program, 
    GL_ARB_fragment_program_shadow, GL_ARB_fragment_shader, 
    GL_ARB_half_float_pixel, GL_ARB_imaging, GL_ARB_multisample, 
    GL_ARB_multitexture, GL_ARB_occlusion_query, GL_ARB_pixel_buffer_object, 
    GL_ARB_point_parameters, GL_ARB_point_sprite, GL_ARB_shadow, 
    GL_ARB_shader_objects, GL_ARB_shading_language_100, 
    GL_ARB_texture_border_clamp, GL_ARB_texture_compression, 
    GL_ARB_texture_cube_map, GL_ARB_texture_env_add, 
    GL_ARB_texture_env_combine, GL_ARB_texture_env_dot3, 
    GL_ARB_texture_mirrored_repeat, GL_ARB_texture_rectangle, 
    GL_ARB_transpose_matrix, GL_ARB_vertex_buffer_object, 
    GL_ARB_vertex_program, GL_ARB_vertex_shader, GL_ARB_window_pos, 
    GL_S3_s3tc, GL_EXT_texture_env_add, GL_EXT_abgr, GL_EXT_bgra, 
    GL_EXT_blend_color, GL_EXT_blend_func_separate, GL_EXT_blend_minmax, 
    GL_EXT_blend_subtract, GL_EXT_compiled_vertex_array, GL_EXT_Cg_shader, 
    GL_EXT_draw_range_elements, GL_EXT_fog_coord, GL_EXT_framebuffer_blit, 
    GL_EXT_framebuffer_multisample, GL_EXT_framebuffer_object, 
    GL_EXT_gpu_program_parameters, GL_EXT_multi_draw_arrays, 
    GL_EXT_packed_depth_stencil, GL_EXT_packed_pixels, 
    GL_EXT_paletted_texture, GL_EXT_pixel_buffer_object, 
    GL_EXT_point_parameters, GL_EXT_rescale_normal, GL_EXT_secondary_color, 
    GL_EXT_separate_specular_color, GL_EXT_shadow_funcs, 
    GL_EXT_shared_texture_palette, GL_EXT_stencil_two_side, 
    GL_EXT_stencil_wrap, GL_EXT_texture3D, GL_EXT_texture_compression_s3tc, 
    GL_EXT_texture_cube_map, GL_EXT_texture_edge_clamp, 
    GL_EXT_texture_env_combine, GL_EXT_texture_env_dot3, 
    GL_EXT_texture_filter_anisotropic, GL_EXT_texture_lod, 
    GL_EXT_texture_lod_bias, GL_EXT_texture_object, GL_EXT_texture_sRGB, 
    GL_EXT_timer_query, GL_EXT_vertex_array, GL_IBM_rasterpos_clip, 
    GL_IBM_texture_mirrored_repeat, GL_KTX_buffer_region, GL_NV_blend_square, 
    GL_NV_copy_depth_to_color, GL_NV_depth_clamp, GL_NV_fence, 
    GL_NV_float_buffer, GL_NV_fog_distance, GL_NV_fragment_program, 
    GL_NV_fragment_program_option, GL_NV_framebuffer_multisample_coverage, 
    GL_NV_half_float, GL_NV_light_max_exponent, GL_NV_multisample_filter_hint, 
    GL_NV_occlusion_query, GL_NV_packed_depth_stencil, GL_NV_pixel_data_range, 
    GL_NV_point_sprite, GL_NV_primitive_restart, GL_NV_register_combiners, 
    GL_NV_register_combiners2, GL_NV_texgen_reflection, 
    GL_NV_texture_compression_vtc, GL_NV_texture_env_combine4, 
    GL_NV_texture_expand_normal, GL_NV_texture_rectangle, 
    GL_NV_texture_shader, GL_NV_texture_shader2, GL_NV_texture_shader3, 
    GL_NV_vertex_array_range, GL_NV_vertex_array_range2, GL_NV_vertex_program, 
    GL_NV_vertex_program1_1, GL_NV_vertex_program2, 
    GL_NV_vertex_program2_option, GL_SGIS_generate_mipmap, 
    GL_SGIS_texture_lod, GL_SGIX_depth_texture, GL_SGIX_shadow, 

Ah. Running compiz has nothing to do with whether Xgl is running.
Try disabling compiz and trying again. It’s always a good idea to temporarily turn off compiz effects before running a 3D game; with compiz, 3D applications render much slower.
If turning off compiz doesn’t work, try using Xorg instead of Xgl by disabling Xgl as I explained in my last post.
I think you can check whether Xgl is running either in gnome-system-monitor or with the following command:

ps aux | grep Xgl | grep -v grep

OK, I ran the commands you mentioned above to make sure Xgl is disabled, and also checked whether Xgl was running with the ps command. Xgl isn’t running, and the problem is exactly the same.

By the way, oddly the ‘Basic’ Cartoon-Shader tutorial fails with this error, but the ‘Advanced’ Cartoon-Shader in fact works. I guess they use different shading techniques, but what’s odd is that the basic one is the one that fails.

I just upgraded my nvidia drivers to the exact same version as you are using. I’ll test whether I have the same problems.

EDIT: Oh. I just noticed my compilation of Panda has no samples dir. I’m currently compiling Panda3D; when it’s done, I’ll let you know.

EDIT2: OK, now everything segfaults here with the newest drivers. I’m going to investigate this. I’ll let you know when I have news.

Linux: no bugs in offscreen buffers on my end (on Windows there are).

Same FX5200 here; the results on dual-boot:
[#] Windows, P3D 1.5.1, last year’s NV driver:
getSupportsRenderTexture() returns 1.
I can get a GLGraphicsBuffer, and I can create a buffer larger than the framebuffer.

[#] Ubuntu Hardy, P3D 1.5.2, latest NV driver:
getSupportsRenderTexture() returns 0.
I can only get a ParasiteBuffer, thus it’s limited by the framebuffer size.

All working RTT samples run at the same speed on both OS (visual effects disabled on Linux).
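The flag being compared above can be queried directly from the GSG. A small sketch — `report_caps()` is my own helper, while `getSupportsRenderTexture()` is the call quoted in the post:

```python
# Query the render-to-texture capability flag compared in the post above.
# report_caps() is a hypothetical helper, not a Panda3D API.
def report_caps(gsg):
    """Summarize the GSG capability relevant to this thread."""
    return {"render-to-texture": bool(gsg.getSupportsRenderTexture())}

try:
    import direct.directbase.DirectStart   # opens the main Panda window
    print(report_caps(base.win.getGsg()))
except Exception:
    pass  # Panda3D not available in this environment
```

Posting that dictionary from each OS would make the Windows-vs-Linux difference easy to compare across drivers.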

I’ve got the same problems with the newest beta drivers from NVIDIA (177.13). I’ll downgrade to another driver later to see if there is any difference.

Edit: Downgraded to 173.14.09, with the same results. I’ll give 173.14.05 a try too.

Hmm… That’s interesting. I’m having two similar-looking problems:

In the Fireflies demo: “Toon Shader: Video driver does not support multiple render targets”
In the Cartoon Shader Basic demo: “Toon Shader: Video card not powerful enough to do image postprocessing”
(Cartoon Shader Advanced works, however)

I’m using Windows XP, SP2 (if I recall correctly), and have an FX5200 graphics card. According to DXDiag, my driver is nv4_disp.dll, version 6.14.0011.6921. I’m using Panda 1.5.2.

(Up until reading this thread I had just assumed that the problem lay with my card, I think. Perhaps I give it too little credit. ^^; )

Same problems here, on all my machines: Fedora, Kubuntu and two Ubuntus.
All GeForce cards, from a GF3 Ti 200 up to an 8600 GT; older as well as newer drivers are in use. None seems to be able to create the buffers successfully.
Now that I think about it, it has never worked for me in all the time I’ve worked with Panda.
Everything else works, though. Just the buffers don’t.

The same goes here with my new rig.