GLSL/Cg vs ATI: getting a clear picture


Over the past few days I’ve been trying to get a clear picture of which aspects of GLSL/Cg work on an ATI card. After having about 10 tabs open with different posts ranging from 2007 to 2013, I completely lost track of the timeline and decided to just start a new post to find out what I can use right now.

Firstly, it is my understanding that ATI cards only run GLSL shaders, and that when I write a Cg shader it will be automatically translated by Panda to some GLSL profile. From one of my previous posts (New ATI videocard cant use vtf) I understand that currently, due to a bug, the only Cg (and thus actually GLSL) profile that works is the most basic one. This means that texture lookup in the vertex shader cannot be achieved in Cg, nor can the INSTANCEID semantic be used. Am I right so far?

Furthermore, I read that array passing is implemented for Cg shaders, but not for GLSL. So if I want to do some instancing, I cannot use Cg because of the problem with INSTANCEID, and I cannot use GLSL because of the missing array support.

Which of these problems are bugs in Panda, and is there any idea of when they will be fixed?

Odds are I’ve mixed up some things because of an overload of info from the forum, so if someone could help me filter out the incorrect info, that would be very much appreciated!


All of these problems have been fixed in the latest development release, where you can use both GLSL and arrays, or use Cg without a problem.

Great to hear that! Downloading as we speak

I can’t seem to get the devel version to work properly. The problem lies with the shaders: if I deactivate the setShader call, it works again.
For example, the file in question only displays a light grey dragon with green eyelids and black pupils on 1.9.0-devel, but works normally on 1.8.1.

Sorry, please wait until the next buildbot build tomorrow; an issue with shaders and postprocessing effects crept in.

Thanks, you’re being a great help :)

The development version still doesn’t want to work completely (using #44):

I can run custom shaders now, but only if I set “basic-shaders-only #t”, which doesn’t allow Vertex Texture Fetch. Setting it to #f results, for the cartoon demo, in the same grey dragon with green eyelids. For my own project I just get a black screen with some objects colored brown.

Well, I suppose that’s why we had basic-shaders-only in the first place; to prevent these kinds of issues on ATI cards.

Could you try upgrading your drivers? If that doesn’t solve it, then I’m afraid you might be better off using GLSL.

Hmm, it does seem like changing to GLSL is the best option at this point, but I still have some remaining questions:

GLSL can only be used in combination with OpenGL (cross-platform).
HLSL works with DirectX (only usable on Windows, where it has slight advantages over OpenGL).
Cg can be compiled to both of these, but is technically only for Nvidia cards.

From those points it seems GLSL is the most widely supported language. So far it is my understanding that Panda3D can translate Cg to GLSL by adding these two lines on top of the Cg shader:

//Cg
//Cg profile glslv glslf

This seems to be the very best option (allowing DirectX on Windows and Nvidia, GLSL everywhere else).
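For anyone reading along, a minimal sketch of what such a Cg shader might look like, following Panda3D’s Cg conventions (the `vshader`/`fshader` entry points and the `vtx_position`/`mat_modelproj` input names are Panda conventions; treat the exact header lines and inputs as something to verify against your Panda version):

```cg
//Cg
//Cg profile glslv glslf

void vshader(float4 vtx_position : POSITION,
             uniform float4x4 mat_modelproj,
             out float4 l_position : POSITION)
{
    // Standard model-view-projection transform.
    l_position = mul(mat_modelproj, vtx_position);
}

void fshader(out float4 o_color : COLOR)
{
    // Flat red output, just to confirm the shader pipeline is running.
    o_color = float4(1.0, 0.0, 0.0, 1.0);
}
```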

My two questions are:

  1. Am I correct in all of this?
  2. Why isn’t it working if I use those lines and deactivate basic-shaders-only? rdb, in reaction to your comment “Well, I suppose that’s why we had basic-shaders-only in the first place”: isn’t the point of basic-shaders-only to make sure ATI cards don’t use any other Cg profiles? So using the profile glslv glslf should not give any problems, and should be equivalent to writing the whole shader in GLSL.


Yes, the point of “basic-shaders-only” is that all cards (but ATI cards in particular) only use ARB shaders (i.e. Cg in the “arbvp1 arbfp1” profiles), which seem to work universally but are very limited in functionality.
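For reference, this setting is an ordinary Config.prc variable; a sketch of the relevant fragment (standard prc syntax):

```
# In Config.prc: restrict Panda to the universally-supported ARB Cg profiles.
basic-shaders-only #t

# Or allow more advanced Cg profiles (the setting being tested in this thread):
basic-shaders-only #f
```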

It’s not Panda3D that translates Cg to GLSL; the Cg runtime does this, so it’s not within Panda’s control. (You can alternatively use the command-line “cgc” compiler to compile a Cg shader to GLSL code.)
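As a sketch of that command-line route (assuming Nvidia’s cgc is on your PATH, and that the shader uses the `vshader`/`fshader` entry points; check the flags against your installed Cg toolkit version):

```
# Compile the vertex and fragment entry points of myshader.cg to GLSL source:
cgc -profile glslv -entry vshader -o shader.vert myshader.cg
cgc -profile glslf -entry fshader -o shader.frag myshader.cg
```

The resulting .vert/.frag files can then be inspected or loaded as plain GLSL shaders, which is a handy way to see exactly what the Cg runtime would have fed the driver.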

To answer (2), I don’t know why it doesn’t work, but I think it’s a long-standing issue with the GLSL compiler in Cg that doesn’t sit well with ATI cards.

Thanks for the details and the tip on cgc. Off to learning GLSL now…

I just remembered there was a bug with shaders in the development builds of Panda. So perhaps it’s not a driver issue after all in this case… the latest should have it fixed, but I can’t be 100% sure.

I’m afraid I can’t get it to work properly on any of the newer versions either. I’m porting my shaders to GLSL now anyway, so it’s not a huge problem for me, but I can imagine other ATI card users are going to experience similar problems when they move to 1.9.0 (just to be clear, the same program does run on 1.8.1).