I checked out the shadow demo and saw that it uses a Cg shader. The demo seems to work on my machine (assuming I was supposed to see a grid-like shadow on the teapot/panda), even though I use an ATI card (HD 3800). (If I stop the teapot's rotation but keep the panda marching, the shadow on the teapot vibrates at certain positions of the panda; if I stop the panda as well, the vibration stops too.)
What I don’t understand is:
1. If these shaders are written in Cg, how come the demo still works for me (even if it's a little buggy)?
2. Isn't shadow casting a standard feature of video cards? Why is there a need for shaders?
Disclaimer:
I have never programmed shaders and have only superficial knowledge of them.
Why wouldn’t it? I assume your card supports Cg shaders…
It’s a common misconception that Cg does not work on ATI cards. Cg should work on any card that supports shaders.
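For reference, using a Cg shader in Panda3D is just a matter of loading it and assigning it to a NodePath; driver support for Cg is all that matters, not the vendor. A minimal sketch (the model and the shader file name "lighting.sha" are only placeholders):

```python
from direct.showbase.ShowBase import ShowBase

base = ShowBase()

# Load one of the stock models and apply a Cg shader to it.
# "lighting.sha" stands in for whatever Cg file you actually use.
teapot = base.loader.loadModel("teapot")
teapot.reparentTo(base.render)

shader = base.loader.loadShader("lighting.sha")
teapot.setShader(shader)

base.run()
```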
Shadow mapping is a standard feature exposed through the ARB_shadow extension (also see the shaderless shadow example I posted somewhere here on the forums), but some drivers/cards (especially ATI) seem to have a buggy implementation of it.
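The shaderless route works roughly like this: render the scene's depth from the light's point of view into a depth texture, then project that texture back onto the scene with hardware depth comparison enabled (that's what ARB_shadow gives you). Here's a rough, untested sketch of that setup in Panda3D; the buffer size, camera placement and exact flags are assumptions, so check the actual sample for a working version:

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import (FrameBufferProperties, WindowProperties,
                          GraphicsPipe, GraphicsOutput, Texture, TextureStage)

base = ShowBase()

# Offscreen buffer that only needs a depth channel.
fbprops = FrameBufferProperties()
fbprops.setDepthBits(1)
winprops = WindowProperties.size(512, 512)
shadow_buf = base.graphicsEngine.makeOutput(
    base.pipe, "shadow buffer", -2, fbprops, winprops,
    GraphicsPipe.BFRefuseWindow, base.win.getGsg(), base.win)

# Bind a depth texture to the buffer. FTShadow asks the driver to do the
# depth comparison in hardware -- this is where ARB_shadow comes in.
depthmap = Texture()
shadow_buf.addRenderTexture(depthmap, GraphicsOutput.RTMBindOrCopy,
                            GraphicsOutput.RTPDepth)
depthmap.setMinfilter(Texture.FTShadow)
depthmap.setMagfilter(Texture.FTShadow)

# Render the scene from the light's point of view into that buffer.
light_cam = base.makeCamera(shadow_buf)
light_cam.reparentTo(base.render)
light_cam.setPos(0, -20, 20)
light_cam.lookAt(0, 0, 0)

# Project the depth map back onto the scene; the comparison result
# darkens the fragments that are in shadow. No shader involved.
ts = TextureStage("shadow")
base.render.projectTexture(ts, depthmap, light_cam)

base.run()
```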