I’m trying to make the FXAA shader example work with an ATI card, so I’m trying to learn Cg. The first problem with the code is these commands:
texture2DLod(t, p, 0.0)
tex2Dlod(t, float4(p.xy + (o * rcpFrame), 0, 0))
both commands generate the error:
"cg program too complex for the driver"
whichever one I use. Is there any replacement for these commands?
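For what it’s worth, FXAA only ever reads the base mip level, so on drivers/profiles that reject `tex2Dlod` a plain `tex2D` lookup is a common workaround. A minimal sketch (the `fxaaSample` wrapper name is my own; this assumes the source texture is not mipmapped, or has a non-mipmapped minfilter, so an implicit lookup hits the same level 0 that `tex2Dlod(..., 0)` would):

```cg
// Original, rejected by some ATI drivers/profiles:
//   float4 c = tex2Dlod(t, float4(p.xy + (o * rcpFrame), 0, 0));
//
// Workaround: sample without an explicit LOD. With mipmapping
// disabled on the texture, this reads the same base level.
float4 fxaaSample(uniform sampler2D t, float2 p, float2 o, float2 rcpFrame)
{
    return tex2D(t, p + (o * rcpFrame));
}
```

This sidesteps the "program too complex" error at the cost of relying on the texture’s filter settings instead of an explicit LOD.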
When I googled the problem, I found that to make this command work with GLSL on ATI, the following extension must be added:
#extension GL_ATI_shader_texture_lod : enable
But of course this doesn’t work with Cg.
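For comparison, this is roughly what the GLSL side looks like with that extension enabled (a sketch based on the forum posts I found; the sampler and varying names are just examples):

```glsl
#version 120
#extension GL_ATI_shader_texture_lod : enable

uniform sampler2D tex;
varying vec2 uv;

void main() {
    // Explicit LOD-0 fetch in a fragment shader; on older ATI
    // drivers this reportedly only compiles with the extension above
    gl_FragColor = texture2DLod(tex, uv, 0.0);
}
```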
I have the latest driver with a high-end ATI graphics card, so I don’t think the problem is in the card itself; maybe the code is just not compatible with all ATI cards.
If I wanted a shader to run perfectly on both ATI & NVIDIA, which is better for the task: Cg or GLSL? I’d like some advice on which shading language I should learn.
Thanks in advance.
I found the shader code in GLSL, but I can’t make it work in Panda:
github.com/mitsuhiko/webgl-mein … /fxaa.glsl
geeks3d.com/20110405/fxaa-fa … geforce/3/
I really need FXAA, because AA breaks when I use filters.
You may need to specify a custom profile using “//Cg profile X Y”, where X and Y are the Cg profiles you want to use.
How can I know which profile I should use? Do you mean something like this: developer.nvidia.com/cg-profiles
So I need to add something like this at the beginning of the code?
//Cg profile arbvp1
Something like that, but you’ll also want to specify a profile for the fragment program, like “arbvp1 arbfp1”. Those are pretty outdated profiles, though; you’ll likely want one of the latest profiles, or perhaps the GLSL profiles.
The Cg documentation and man pages will contain more information about the various profiles.
This won’t work with basic-shaders-only set to #t, so make sure it is set to #f.
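Putting those two pieces together, the top of the .sha file and the prc setting would look something like this (a sketch only; the vertex shader body is the standard minimal Panda Cg example, and the profile pair is just one possible choice from the Cg docs):

```cg
//Cg
//Cg profile glslv glslf

// Requires basic-shaders-only #f in Config.prc,
// otherwise the profile comment above is not honored.
void vshader(float4 vtx_position : POSITION,
             uniform float4x4 mat_modelproj,
             out float4 l_position : POSITION)
{
    l_position = mul(mat_modelproj, vtx_position);
}
```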
I tried a lot of newer profiles, such as the following:
//Cg profile glslv glslg
but I feel like the compiler ignores this line. Is there any way to confirm that the compiler actually used the profile? I wrote this line:
//Cg profile afsf asfdasf
and the compiler didn’t complain about the bogus inputs.