With gdb, I've found that it was libGLcore that didn't like the line:
self.Ldepthmap.setFormat(Texture.FDepthComponent)
when trying to create a depth buffer.
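For context, the failing call usually sits inside a render-to-texture setup along these lines (a hedged sketch loosely based on Panda3D's shadow-mapping sample; the names LBuffer and Ldepthmap and the exact buffer parameters are assumptions, not the poster's actual code):

```python
# Sketch of a depth-buffer setup of the kind that triggers the crash.
# Era-appropriate Panda3D 1.x imports; assumes a working ShowBase session.
from direct.showbase.ShowBase import ShowBase
from pandac.PandaModules import (Texture, GraphicsPipe, GraphicsOutput,
                                 FrameBufferProperties, WindowProperties)

base = ShowBase()

winprops = WindowProperties.size(512, 512)
fbprops = FrameBufferProperties()
fbprops.setDepthBits(1)  # request a depth channel on the offscreen buffer

# Offscreen buffer sharing the main window's GSG.
LBuffer = base.graphicsEngine.makeOutput(
    base.pipe, "depth buffer", -2, fbprops, winprops,
    GraphicsPipe.BFRefuseWindow, base.win.getGsg(), base.win)

# The texture whose format is set to FDepthComponent -- the call that
# libGLcore reportedly chokes on with some drivers.
Ldepthmap = Texture()
Ldepthmap.setFormat(Texture.FDepthComponent)
LBuffer.addRenderTexture(Ldepthmap, GraphicsOutput.RTMBindOrCopy,
                         GraphicsOutput.RTPDepth)
```

Running a minimal script like this under gdb is a quick way to check whether the segfault comes from the buffer creation itself or from something else in the scene.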
I’m on Ubuntu 7.10, by the way, with an old NVIDIA graphics card using the NVIDIA driver (if that changes anything). The same code works fine in Panda3D 1.4.
The main problem probably comes from my hardware, since I use a “no-graphics” graphics card (a GeForce 6150 LE; I will change it very soon).
But in the meantime, this worked in 1.4.2!
I upgraded to new hardware (an 8800 GT) and the new Ubuntu (“Hardy Heron”), but the crash still occurs.
I’m still trying to isolate the problem, but I’m having a little trouble with it.
With pdb, I can say it’s located in ShowBase, at self.graphicsEngine.render(), which doesn’t help me much.
I would also like to chime in on this.
I have three machines running SUSE 10.2 with Panda 10.2.
One machine, an older Shuttle with an NVIDIA 7600, does not segfault when running SylHar’s depthmap example.
The other two machines, very new motherboards with Intel Q6600 quad-core processors and 8500 GT and 8600 GTS cards, both segfault on the Shadow example and SylHar’s example.
Alex S Hill
Center for Technology and Social Behavior - Northwestern University
Some buffer crashes on Linux were recently fixed in 1.6.0, so it’s very likely that this one was too.
Please let me know if anyone still runs into this problem with 1.6.0 or later.