Rendering and shading issues on OS X

I recently switched from Linux to OS X (10.8.2) and have been trying to get my Panda3D development environment back up and running. I need Python 2.7 and am running 64-bit, so I installed the experimental build mentioned in [OSX 10.7 Lion Error] (I tried and failed to compile it from source on my own).

Now Panda is working, which is great! Except that there seem to be some shading and rendering issues that were not present under Linux. I’m not sure if this is an issue with Panda, or perhaps with OpenGL, or what, but any help or pointers would be greatly appreciated!

  1. If I use render.setShaderAuto(), I get these weird artifacts that look a bit like “ripples”, especially on the floor (see image below). Removing the line that turns on the shaders makes the ripples go away, but then I don’t have shadows, which I would really like to have.

I noticed similar issues with the shading demos provided with Panda, so I think this might not just be isolated to my code (though the two points below might be).

  2. At the base of the tower of blocks in the above screenshot, you’ll notice some serious aliasing between the bottom block and the floor. This doesn’t go away when I turn off the shaders, so it’s something else. It almost looks like the blocks are penetrating each other or the floor, but I know that’s not the case – I think they’re just being rendered in a way that makes it look that way.

  3. The aspect ratio of the blocks looks squashed. If I maximize and then restore the window, this is fixed (see below), but ideally I would like the blocks to render properly right from the get-go. It looks like the whole scene is actually being rendered too wide, but I’m not sure what the solution to that would be.

Thanks for any help you can give, and let me know if you need any more information about these problems.

This is very odd. Are you using post-processing effects?

No, I’m not doing any sort of post-processing. I figured out the problem with #2 and #3, which seem to have been caused by the same thing:

# Position the camera
base.cam.setPos(0, -10, 1.75)
base.cam.lookAt(0, 0, 1.5)
lens = lp.PerspectiveLens()
# The following line causes the weird perspective and aliasing:
base.camLens = lens

It’s been a while since I wrote that particular piece of code so I’m not sure why (or if) it was necessary to set a perspective lens like that, but commenting out that one line fixes two of my problems and doesn’t seem to break anything else (though perhaps I will discover problems later on).

I’m still stumped by the ripples, but they are definitely related to shadows. I forgot to mention that I have a spotlight, on which I’ve called setShadowCaster(True, 2048, 2048). Changing the buffer size affects the appearance of the ripples – a smaller buffer size seems to make the ripples more regular and decreases their wavelength. This is what I get with 512x512:
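For scale, here’s a quick back-of-the-envelope estimate of how much of the scene a single shadow-map texel covers at each buffer size I tried. This is just a sketch with partly illustrative numbers: the 40-degree FOV matches my spotlight, but the 10-unit distance is a rough guess.

```python
import math

def shadow_texel_size(fov_deg, distance, buffer_size):
    """World-space width covered by one shadow-map texel at a given
    distance from the light (square buffer and frustum assumed)."""
    frustum_width = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)
    return frustum_width / buffer_size

# Illustrative numbers: a 40-degree spotlight about 10 units away.
for size in (512, 2048):
    print(size, shadow_texel_size(40.0, 10.0, size))
```

Going from 2048x2048 down to 512x512 makes each texel cover four times as much of the scene, which would change the spatial scale of any sampling artifacts.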

Perhaps it has to do with the resolution of the depth buffer? If you decrease the distance between the near and far clips of your shadow camera, does that also decrease the artifacts?

No, changing the near and far clips doesn’t decrease the artifacts. But, I did discover that they are definitely only occurring where the light from the spotlight is hitting the objects.

In the image below, I’ve turned off the textures on the blocks so you can see the artifacts on them better (and I changed the attenuation, so the shadows and therefore artifacts are stronger/easier to see). I also narrowed the vertical FOV on the spotlight camera so you can clearly see the region it’s affecting – you can see there’s nothing wrong outside of the vertical strip in the middle (except, of course, no shadows at all).

Wherever there is a shadow (that is supposed to be there), there are no artifacts (e.g., on the ground, the shadows cast by blocks onto other blocks, etc.), and there are no artifacts on the sides of the blocks that don’t get hit by the spotlight. Also, I should mention that the ripples change if I rotate the camera around the tower (though that’s not too surprising given that the issue is with shadows, which need to be re-rendered whenever the camera angle changes); if you compare the various screenshots I’ve posted you’ll notice the ripples are different in all of them.

The relevant code is:

# Create a NodePath to hold all the lights
self.lights = pm.NodePath('lights')

# ... create some other point lights and stuff ...

# Create the spotlight
slight = pm.Spotlight('slight')
slight.setShadowCaster(True, 2048, 2048)
slight.getLens().setFov(40, 20)
#slight.setAttenuation((1, 0, 0.02))
slnp = self.lights.attachNewNode(slight)
slnp.setPos((4.5315389351832494, 2.1130913087034973, 8))
slnp.lookAt(0, 0, 0)

Could you put together a minimal sample program that demonstrates this issue? Or does the Shadows sample also show the issue?

Do you have an ATI or an NVIDIA card?

Yes, I will put together a sample program and get back to you. I have an Intel HD Graphics 4000 with 512 MB (this is a MacBook Air). Thanks for your help!

Actually, yeah, no need – the shadows demo shows it. Here are some more screenshots:


I missed before that the advanced demo worked but the basic one didn’t; I’ll try to play around with using the more advanced shaders and see if that fixes the problem in my code.

Interesting update: if, in the advanced demo, I move the light far away or decrease the push factor (even only to 0.02 from 0.04), I get the same sort of rippling artifacts. I haven’t had much experience with shaders before so my intuitions might be totally off base, but perhaps this is the issue – that in the default shader settings Panda provides, this push factor is just too low for some graphics cards?

That would confirm my suspicion that these are self-shadowing artifacts. They happen when an object casts shadows on itself, which occurs when the sample from the shadow buffer deviates a tiny bit from the surface depth of the fragment rendered by the main camera. Because the resolution of your depth map is lower than that of the surface you’re rendering, this shows up as lines (or, in your case, concentric circles).
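A toy illustration of the mechanism (plain Python, nothing Panda-specific; the numbers are arbitrary): rounding the depth that the light records can leave the stored value slightly in front of the true surface depth, so the unbiased shadow comparison flags a fully lit fragment as shadowed.

```python
def quantize(depth, bits):
    """Round a depth in [0, 1] to the nearest value representable
    in an integer depth buffer with the given bit depth."""
    levels = (1 << bits) - 1
    return round(depth * levels) / levels

def in_shadow(surface_depth, map_depth, bias=0.0):
    # A fragment is shadowed if it lies behind the (biased) depth
    # that the light recorded for its shadow-map texel.
    return surface_depth > map_depth + bias

# A surface lit directly by the light: the depth it renders at is
# exactly the depth the light sees, but the buffer stores a rounded
# value that can land slightly in front of the true surface.
d = 0.1234
stored = quantize(d, 16)
print(in_shadow(d, stored))             # True: false self-shadow
print(in_shadow(d, stored, bias=1e-4))  # False: a small bias fixes it
```

Whether the rounded value lands in front of or behind the surface varies from texel to texel, which is why the false shadows form banded patterns rather than covering everything.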

This can happen more easily with a lower resolution depth buffer, which is probably why you’re seeing it and I’m not.

The way that I chose to resolve this in Panda’s shadow implementation is to only cause the back faces of an object to cast shadows. This is usually all that is needed, since any artifacts on the back faces will be hidden because the back faces are always dark. The only case where this is an issue is with double-sided surfaces, since the back faces will overlap with the front faces and therefore the back face will be trying to cast shadows on the front face. In this case, a simple setDepthOffset(1) is sufficient to offset the depth buffer by the tiniest amount, just enough to prevent overlap. (It could be argued that this should be on by default for shadowed surfaces.)

(The way that the Advanced shadow sample fixes this is a cruder way of achieving more or less the same as setDepthOffset.)

Is the plane that shows these artifacts double-sided?

Hmm, no, the objects are not double sided (and in my code, the floor is actually a cylinder rather than a plane/card). But also, I would assume the teapot and panda in the shadow demo are not double sided either, but they get these artifacts too. Am I wrong about that?

Setting setDepthOffset(1) does seem to get rid of the self-shadows, but also of the good shadows – both in the shadow demo and in my code. The objects that I set the offset on are no longer illuminated by the light; consequently the shadows cast on them don’t show up (there’s no contrast between illumination and shadow, I suppose).

Is there a way to change the resolution of the depth buffer, or is that a fixed quantity associated with the graphics card?

You can force Panda to request a number of depth bits using the depth-bits variable in your /Developer/Panda3D/etc/Config.prc file.
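For example, a line like the following (24 here is just an illustrative request; the driver may clamp it to what the hardware supports):

```
depth-bits 24
```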

Though, I’m surprised to hear that your models are not double-sided. I don’t understand how this issue can occur if they aren’t double-sided, and I’m even more surprised to hear that setDepthOffset makes the shadows disappear entirely.

Unfortunately, I cannot reproduce this issue on my Mac Mini with an NVIDIA card, so I’m bound to assume that Intel’s shadow implementation is buggy – no surprise there, really.

What do you get when you put this at the end of the init in the Advanced shadow sample?

        print "depth bits: ", LBuffer.getFbProperties().getDepthBits()

It prints out:

And the number of depth bits it prints does not seem to be affected by changing depth-bits in Config.prc.

I ran into a possibly related issue with z-fighting on my Intel HD 3000 GPU running under Windows 7. Seems like the same issue you had with #1 and #2 there.

In our game (which is first-person) 1 unit = 1 meter, so the default near clip of 1.0 was way too close. Setting the near clip to 0.01 results in terrible z-fighting. 0.1 has the same problem but it doesn’t happen until the objects are about 10 times as far away, and this ended up being our “happy medium”. Adjusting the far clip plane does not seem to have any effect on the problem, though we need to have it set to a specific value (350) anyway.
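That near-plane dependence matches how perspective depth precision works: the smallest depth separation the buffer can resolve at distance z grows roughly with z²/near. A quick sketch of that, with illustrative numbers (assumes the standard 0..1 perspective depth mapping and an integer depth buffer):

```python
def min_resolvable_step(z, near, far, depth_bits=24):
    """Smallest world-space depth separation an integer depth buffer
    can distinguish at view distance z (standard perspective 0..1
    depth mapping assumed)."""
    levels = (1 << depth_bits) - 1
    # Stored depth is d(z) = far * (z - near) / (z * (far - near)),
    # whose derivative with respect to z is:
    slope = far * near / (z * z * (far - near))
    # One quantization step in d corresponds to this much world space:
    return 1.0 / (levels * slope)

# Illustrative: resolvable separation 10 m from the camera, far = 350.
for near in (0.01, 0.1, 1.0):
    print(near, min_resolvable_step(10.0, near, 350.0))
```

Each 10x increase of the near plane buys roughly 10x more depth precision at a given distance, which lines up with the z-fighting appearing about 10 times as far away when we moved the near clip from 0.01 to 0.1.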

This only happened on the Intel GPU running on GL. The same GPU running on DX9 had no such problem, and the discrete NVIDIA GPU had no problem in DX9 or GL.

The output with “notify-level-glgsg debug” and “notify-level-display debug” set (only the first part, not the frame spam) could be helpful so that we get a better idea for the capabilities of your card.
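That is, with these lines added to your Config.prc before starting the program:

```
notify-level-glgsg debug
notify-level-display debug
```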

I’m not sure why you’re not getting the actual number of depth bits back from getDepthBits - perhaps you’re using an old version of Panda? I recall that something regarding that was recently changed, so you could try the latest devel build. Confirming that the problem is indeed the number of depth bits would explain both (1) and (2).