Hello again!
I still have a problem (how unusual!) with my move to Panda3D 1.5.
As I said earlier, I now have no problem… on Windows. On Linux, though, I get a segfault when trying to activate my shadow cam:
Fatal Python error: (pygame parachute) Segmentation Fault
Aborted (core dumped)
With gdb, I've found that it's libGLcore that doesn't like the following lines, when trying to create a depth buffer:
self.Ldepthmap = Texture()
LBuffer = base.win.makeTextureBuffer('depthmap', mapsize, mapsize, self.Ldepthmap)
I'm on Ubuntu 7.10, by the way, with an old NVIDIA graphics card using the NVIDIA driver (if that changes anything). The same code works fine in Panda3D 1.4.
I have a copy of ubuntu 7.10. If you were to send me the program that crashes (or better yet, simplify it, then send it), I could fix the error.
OK, I'm trying to reduce it to a single file.
I’ve been experimenting a lot with depth buffers and other kinds of buffers lately in 1.5.0, but I haven’t experienced any problems.
The main problem probably comes from my hardware, since I use a "no-graphics" graphics card (a GeForce 6150 LE; I will change it very soon).
But in the meantime, this worked in 1.4.2!
(…still working on a one-file version…)
A little post to give some info:
I've been quite busy, obviously.
I upgraded to new hardware (an 8800 GT) and the new Ubuntu ("Hardy Heron"), but the crash still occurs.
I'm still trying to isolate the problem, but I'm having a little trouble with it.
With pdb, I can say it's located in ShowBase, at self.graphicsEngine.render(), which doesn't help me much.
I'll post the code as soon as I can.
This is it.
The code below segfaults on my machine:
from direct.directbase import DirectStart
from pandac.PandaModules import Texture
Ldepthmap = Texture()
LBuffer = base.win.makeTextureBuffer('depthmap', 1, 1024, Ldepthmap)
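For what it's worth, here is a slightly more defensive version of the same reduction (a sketch, not a fix). In Panda3D, makeTextureBuffer() returns None when the buffer cannot be created, so checking the result can at least turn some refusals into a readable error instead of a crash, though it won't help if the driver segfaults inside the call itself:

```python
# Sketch: same minimal reduction, with a guard on the returned buffer.
# Requires a display, so it must be run on a machine with a working GPU.
from direct.directbase import DirectStart
from pandac.PandaModules import Texture

mapsize = 1024  # square power-of-two size, in case 1x1024 is what upsets the driver

Ldepthmap = Texture()
LBuffer = base.win.makeTextureBuffer('depthmap', mapsize, mapsize, Ldepthmap)
if LBuffer is None:
    # The driver/window system refused the offscreen buffer.
    raise RuntimeError("makeTextureBuffer('depthmap') failed")
```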
That is odd. That code works fine here (GeForce FX 5200, Panda3D 1.5.0).
What can I say?
It's working fine under Windows here too, but crashes on (at least) two different boxes (the old 6150 and this new 8800 GT, on Panda3D 1.5.0 and 1.5.1).
Sorry to bump this message like that, but the code still crashes here.
Any news or solutions?
I would also like to chime in on this.
I have three machines running SUSE 10.2 with Panda 10.2.
One machine, an older Shuttle box with an NVIDIA 7600, does not segfault when running SylHar's depthmap example.
The other two machines, very new motherboards with Intel Q6600 quad-core processors and 8500 GT and 8600 GTS cards, both segfault on the shadow example and on SylHar's example.
Alex S Hill
Center for Technology and Social Behavior - Northwestern University
Sorry to bump this post, but I found a little more on the matter.
It works with the NVIDIA 180.22 drivers but not with the 169.12 drivers, so it seems there's an incompatibility somewhere.
Some buffer crashes on Linux were recently fixed in 1.6.0, so it’s very likely that this one is too.
Please let me know if anyone still runs into this problem with 1.6.0 or later.