1.6.1 Cel Shading Demo

On my Mac, when I start up Tut-Cartoon-Advanced.py, the ink lines don’t actually lie on the dragon itself; rather, they exist totally separately from the model.

I’m trying to understand how I can fix this, since I’m hoping to have cel shading.

Thank You

(I can post a screen shot if you can’t duplicate this)

You mean the lines look something like this?
pro-rsoft.com/screens/cartoon-shader-1.6.0.jpg

Darn, I thought this bug was fixed for 1.6.0. Are you sure you are running 1.6.1 and not any older version of Panda?

Btw. to work around this problem, put this in your Config.prc:

textures-power-2 none

I am 100% sure I’m running 1.6.1

And that fixed it, thank you very much.

So, another question that might be stupid.

I have a model with a texture, but when I load it up in this sample the texture disappears, and I end up with an all-grey-and-white model.

How do I get around this?

The shader used in the Advanced sample does not sample any of the textures assigned to the model; you just need to change a few lines in the shader.
Since the Basic sample uses the shader generator, textures work fine there.

I guess you missed it in the “1.6.1 Released” thread, but a couple of people (including myself) noticed this. It was actually first noticed in the Glow-Basic demo, because there the rendering is both zoomed in and the glow isn’t working.

Similarly, in the Cartoon-Basic demo, the outlines are on the model correctly, but both are zoomed in (like the outlines in your screenshot).

Ok, excellent, that makes sense.

I tried reading through the Cg shader tutorial, and I thought I understood what goes where, but I can’t seem to find the right place to put those few lines of code, or figure out which lines they are.

Any help on which lines of code to change would be great.

Thank You

lightingGen.sha would look something like this: (untested)

//Cg

void vshader(float4 vtx_position   : POSITION,
             float3 vtx_normal     : NORMAL,
             float2 vtx_texcoord0     : TEXCOORD0,
             out float4 l_position : POSITION,
             out float2 l_texcoord0     : TEXCOORD0,
             out float4 l_brite    : TEXCOORD1,
             uniform float4 mspos_light,
             uniform float4x4 mat_modelproj)
{
  l_position = mul(mat_modelproj, vtx_position);
  float3 N = normalize(vtx_normal);
  float3 lightVector = normalize(mspos_light - vtx_position);
  l_brite = max(dot(N,lightVector), 0);
  l_texcoord0 = vtx_texcoord0;
}


void fshader(float2 l_texcoord0     : TEXCOORD0,
             uniform sampler2D tex_0 : TEXUNIT0,
             float4 l_brite     : TEXCOORD1, 
             out float4 o_color : COLOR)
{
  if (l_brite.x<0.5) l_brite=0.8;
  else l_brite=1.2;
  o_color=l_brite * l_color * tex2D(tex_0, l_texcoord0);
}
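To see what that two-band step in the fragment shader does, here it is mirrored in plain Python (illustrative only; the 0.5 threshold and the 0.8/1.2 levels come straight from the fshader above):

```python
def quantize_brightness(brite):
    """Two-band cel quantization, mirroring the fshader:
    anything below the 0.5 threshold becomes the dark band (0.8),
    everything else the light band (1.2)."""
    return 0.8 if brite < 0.5 else 1.2

# A smooth diffuse ramp collapses into just two flat tones,
# which is what gives the cartoon look:
ramp = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
print([quantize_brightness(b) for b in ramp])
# → [0.8, 0.8, 0.8, 1.2, 1.2, 1.2]
```

Adding more `if`/`else` bands in the shader gives you more tones, at the cost of a less flat cartoon look.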

So, I changed lightingGen to be

//Cg

void vshader(float4 vtx_position   : POSITION,
             float3 vtx_normal     : NORMAL,
             float4 vtx_color      : COLOR,
             float2 vtx_texcoord0  : TEXCOORD0,
             out float4 l_position : POSITION,
             out float2 l_texcoord0     : TEXCOORD0,
             out float4 l_brite    : TEXCOORD1,
             out float4 l_color    : COLOR,
             uniform float4 mspos_light,
             uniform float4x4 mat_modelproj)
{
  l_position = mul(mat_modelproj, vtx_position);
  float3 N = normalize(vtx_normal);
  float3 lightVector = normalize(mspos_light - vtx_position);
  l_brite = max(dot(N,lightVector), 0);
  l_color = vtx_color;
  l_texcoord0 = vtx_texcoord0;
}


void fshader(float4 l_brite       : TEXCOORD1,
             float2 l_texcoord0   : TEXCOORD0,
             uniform sampler2D tex_0 : TEXUNIT0,
             float4 l_color       : COLOR,
             out float4 o_color   : COLOR)
{
  if (l_brite.x<0.5) l_brite=0.8;
  else l_brite=1.2;
  o_color=l_brite * l_color * tex2D(tex_0, l_texcoord0);
}

and get the following graphic

Huh, you mean you do want vertex colors?

But yeah, I left the vertex color calculation in the last line. In my code, replace it with:

o_color=l_brite * tex2D(tex_0, l_texcoord0); 

Even without the color in there, I still get the same blank dog.

With the shader I’ve posted, with the “* l_color” removed, I get:
pro-rsoft.com/screens/ralph-cartoon.png

Remember that the shader just uses the first texture coordinate set and the first texture that were applied to the model - maybe you don’t have one of those? Or maybe your first texture happens to be grey?

Maybe it has an alpha? To be certain, put this as the last line in the shader, before the closing }:

  o_color.a = 1.0f;

I figured out what was going on. Just in case this helps anybody I’ll explain it.

The renderer was working fine; I tested it on a different model and it was fine, so then I tested the same model with a different texture, and that worked fine too.

So the answer was that my textures were not powers of two, and they were not getting applied, because in order to get cel shading I had to turn off the automatic power-of-two conversion.

So, short answer: make sure your textures are powers of two.
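Since non-power-of-two textures silently fail to apply once `textures-power-2 none` is set, a quick sanity check on texture dimensions can save some head-scratching. A minimal sketch in plain Python (`is_power_of_two` and `check_texture_size` are hypothetical helpers, not part of Panda):

```python
def is_power_of_two(n):
    """True if n is a positive power of two. Bit trick: powers of
    two have exactly one bit set, so n & (n - 1) is zero."""
    return n > 0 and (n & (n - 1)) == 0

def check_texture_size(width, height):
    """Warn if either dimension would be left unscaled (and thus
    potentially rejected) with textures-power-2 set to none."""
    ok = is_power_of_two(width) and is_power_of_two(height)
    if not ok:
        print("Texture is %dx%d - resize to powers of two "
              "(e.g. 256x256, 512x512)" % (width, height))
    return ok
```

Running `check_texture_size(640, 480)` prints the warning, while 512x512 passes silently.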

Also, Pro-rsoft thank you for all your help, you’ve made this much less frustrating.

I’m guessing you probably already know what it looks like, but for the sake of confirmation I want to point out that while your workaround completely corrects the issues with the glow demo (i.e. both the basic and advanced demos work), it doesn’t entirely fix the basic cartoon shading demo. That is, the ink lines in the advanced demo are now fixed, and the basic demo no longer looks zoomed in, but there’s no cartoon shading in the basic demo, just standard Gouraud shading.

Ah, I see. It just lacks a setShaderAuto() call.

Does that just mean I need to add a line in the Python code or is that a fix in Panda itself?

ADDITION: Here’s the full output from the terminal session when I run Cartoon-Basic; sorry, it didn’t occur to me to post this earlier:

DirectStart: Starting the game.
Known pipe types:
osxGraphicsPipe
(all display modules loaded.)
:display:gsg:glgsg(error): Could not load Cg fragment program:created-shader (arbfp1 Error on line 2: unrecognised option (hint: ‘ATI_draw_buffers’))
:display:gsg:glgsg(error): GL error in ShaderContext destructor

ADDITION 2: Oh dear, I’ve just realized something really weird. While the above error messages are still being returned, Cartoon-Basic actually looks correct when running most of the time, but there’s no cartoon shading (only Gouraud shading) if I run it right after running Cartoon-Advanced. In other words, running Cartoon-Advanced first prevents Cartoon-Basic from working right. doh

And just to add to the weirdness, even when cartoon shading isn’t working on most of the model, it’s working on the eyelids.

ADDITION 3: Actually y’know what, it doesn’t seem to be quite as straightforward as “Cartoon-Basic is weird if run after Cartoon-Advanced.” The issue seems pretty intermittent, with cartoon shading randomly working or not working when I run the demo multiple times.

Similarly, it’s pretty much random whether or not cartoon shading is on the eyelids; sometimes the body is Gouraud while the eyelids are cartoon shaded, sometimes the other way around, and sometimes both are cartoon shaded or both Gouraud.

Intermittent problems are the worst.

The setShaderAuto call should be placed in the sample program, but I believe that’s already done; the shader simply failed to load.
Panda appends the ATI_draw_buffers hint instead of ARB_draw_buffers because of a bug in some ATI drivers.
What kind of video card do you have? More specifically, do you have an ATI card? What version of the Cg toolkit do you have installed?

I guess I can at least work around this by not appending this hint if the ATI_draw_buffers extension is not available, and in that case relying on the drivers to do their job correctly.
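That workaround - only emitting the vendor hint when the driver actually advertises the extension - could be sketched like this (illustrative logic only, in Python; the function name and return strings are assumptions, not Panda’s real C++ internals):

```python
def draw_buffers_hint(extensions):
    """Choose which draw-buffers profile option to append to a
    generated fragment program, based on the list of GL extensions
    reported by the driver."""
    if "GL_ATI_draw_buffers" in extensions:
        # Some ATI drivers only accept their vendor-specific hint,
        # hence preferring it when the extension is present.
        return "ATI_draw_buffers"
    if "GL_ARB_draw_buffers" in extensions:
        return "ARB_draw_buffers"
    # Neither extension available: emit no hint and rely on the
    # driver to do its job correctly.
    return None
```

On the poster’s Mac, the driver presumably lacks GL_ATI_draw_buffers, so the ARB hint (or none) would be chosen and the “unrecognised option” error would go away.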

About the color-switching: I think it just might be related to a bug clcheung saw the other day about vertex colors being randomly switched sometimes, when the exact same shader is applied to multiple nodes. That has already been fixed for 1.6.2.

I’ve just fixed the scaled-buffer bug when padding is enabled. Turned out to be a tiny typo.

Can I add a new question.

I’m looking at this, and I’ve looked everywhere, but I want the lines to be antialiased; right now they are very harsh.

Is there any way to do this?

I’ve tried setting render and render2d to MAuto antialiasing, but when I zoom in and look, it does nothing.

I’ve looked at trying to modify the shaders, but looking at the logic, I see no way to make the lines tail off other than what is already done with the cutoff, and that makes my model look very strange…

Thank You

You’ll need to put these commands in your Config.prc besides setting the antialiasing to MAuto:

framebuffer-multisample #t
multisamples 1

“multisamples 1” means: get as many multisamples as possible. Replace it with, say, 8 for an 8x antialiasing effect, but usually you’ll want to leave it at 1.