Help me write a shader

Hi,
I’m in need of a shader for shadows; here’s what I want it to do:

  1. It takes an image that has just been rendered by a camera.
  2. The image is desaturated (turned gray-scale).
  3. A blurred copy of the image is made.
  4. The blurred copy is multiplied by the gray-scaled image.
  5. The brightness of the image is increased.
  6. The image colors are inverted.

I imagine this can’t be all that difficult, yet I don’t even know where to start. Anyone out there willing to write a shader for me for some kudos?

Shaders aren’t that difficult to learn; you should give it a try. The more difficult part is setting up the buffers (what you describe would involve several passes of postprocessing filters for the blur alone), though this is relatively easy to achieve using FilterManager.
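To give a rough idea, a minimal single-pass setup looks something like the sketch below. It assumes a standard ShowBase app (so base.win and base.cam exist), and "my_filter.sha" is just a placeholder name for whatever postprocessing shader you want to apply:

from direct.filter.FilterManager import FilterManager
from panda3d.core import Texture, Shader

# Redirect the scene into a texture and get back a fullscreen quad
# that is drawn to the window in its place.
manager = FilterManager(base.win, base.cam)
scene_tex = Texture()
quad = manager.renderSceneInto(colortex=scene_tex)

# The shader applied to that quad is your postprocessing filter.
quad.setShader(Shader.load("my_filter.sha"))
quad.setShaderInput("tex", scene_tex)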

In fact, the process you described is fairly close to what the stock bloom postprocessing shader does.

For now I use the bloom + blur + invert filters, but I think a dedicated shader would do the job faster.
I’ve spent most of the weekend looking at the shaders in direct/filters and I just don’t get it… but if you say I need multiple passes, then I think I’m starting to.

Do I need shader #1 to do the steps before the blur, shader #2 to do the blur, and shader #3 to take the output of shaders #1 and #2 and do the rest?

You don’t strictly need multiple passes, but it is advisable for the blur. Doing the blur in a single pass takes n*n texture samples per pixel, while separate horizontal and vertical passes only take n+n samples.
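In FilterManager terms, a separable blur could be wired up roughly like the sketch below (untested). blur_x.sha and blur_y.sha are placeholder names for shaders that each blur along one axis, and "src" is a made-up input name; renderQuadInto is what creates the intermediate pass:

from direct.filter.FilterManager import FilterManager
from panda3d.core import Texture, Shader

# Rough sketch of a horizontal + vertical blur chain (assumes a ShowBase app).
manager = FilterManager(base.win, base.cam)
scene_tex = Texture()    # the scene as rendered
blur_x_tex = Texture()   # scene blurred horizontally

# Final quad: drawn to the window, gets the last pass in the chain.
final_quad = manager.renderSceneInto(colortex=scene_tex)

# Intermediate quad: horizontal blur of the scene into blur_x_tex.
quad_x = manager.renderQuadInto(colortex=blur_x_tex)
quad_x.setShader(Shader.load("blur_x.sha"))
quad_x.setShaderInput("src", scene_tex)

# Vertical blur of the horizontally blurred texture, drawn to the window.
final_quad.setShader(Shader.load("blur_y.sha"))
final_quad.setShaderInput("src", blur_x_tex)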

A shader consists of a vertex shader and a fragment shader: the vertex shader is executed for every vertex and describes how a vertex is mapped to the screen (it calculates the screen position and passes it on to the fragment shader). The fragment shader is executed for every pixel on screen, and it has to calculate the appropriate colour for that pixel.

For a blur shader, the idea is that the blur pass simply renders a quad textured with the output of the previous pass, and calculates each pixel’s colour as the average of that pixel and its surrounding pixels.

You don’t need three shader passes. You can easily do the desaturation and inversion in one of the blur passes, so you probably only need two.
(In your example, it doesn’t really matter whether the desaturation happens before or after the blur, anyway.)

I’m making some progress, but at the same time I must be doing something terribly wrong.

This sort of works:

        manager = FilterManager(self.shadowBuffer, self.shadowCamera)
        tex_blur = Texture()
        tex = Texture()
        quad1 = manager.renderSceneInto(colortex=tex_blur)
        quad1.setShader(Shader.load("gaussian_blur.sha"))
        quad1.setShaderInput("tex1", tex_blur)
        
        quad2 = manager.renderSceneInto(colortex=tex)
        quad2.setShader(Shader.load("gray_invert.sha"))
        quad2.setShaderInput("tex", tex)        
        quad2.setShaderInput("tex_blur", tex_blur)

The blur part alone works:


My wacky shader, when provided with a pre-blurred static texture

quad2.setShaderInput("tex_blur", loader.loadTexture('blur.png'))

also works:

But put together they don’t; it looks like the blurred texture never reaches my shader:

Where did I go wrong this time? :smiley:

This is what I use for my shader:

//Cg

// Standard Panda3D fullscreen-quad vertex shader: the quad spans -1..1 in
// x/z, and texpad_tex maps that into the padded texture's UV range.
void vshader(
    float4 vtx_position : POSITION,
    float2 vtx_texcoord0 : TEXCOORD0,
    out float4 l_position : POSITION,
    out float2 l_texcoord0 : TEXCOORD0,
    uniform float4 texpad_tex,
    uniform float4x4 mat_modelproj)
{
    l_position = mul(mat_modelproj, vtx_position);
    l_texcoord0 = vtx_position.xz * texpad_tex.xy + texpad_tex.xy;
}

void fshader(float2 l_texcoord0 : TEXCOORD0,
             out float4 o_color : COLOR,
             uniform sampler2D k_tex : TEXUNIT0,
             uniform sampler2D k_tex_blur : TEXUNIT1)
{
    float4 c1 = tex2D(k_tex, l_texcoord0);       // scene colour
    float4 c2 = tex2D(k_tex_blur, l_texcoord0);  // blurred copy
    float4 c = lerp(c1, c2, 0.6);                // mix the two
    float g = 1 - ((c.x + c.y + c.z) / 3.0);     // desaturate and invert
    o_color = float4(g, g, g, 0);
}

You use renderSceneInto twice, which means that both passes render the scene. Since this is a chain of shaders, one pass should render the scene and the other should render the quad that the first pass renders into. The latter is done using renderQuadInto. See CommonFilters.py for some examples of filter chains.
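In other words, the chain could look roughly like this sketch (untested), reusing your own shader files and input names; renderSceneInto gives you the final quad and renderQuadInto the intermediate blur pass:

from direct.filter.FilterManager import FilterManager
from panda3d.core import Texture, Shader

manager = FilterManager(self.shadowBuffer, self.shadowCamera)
scene_tex = Texture()   # the scene as rendered by shadowCamera
blur_tex = Texture()    # blurred copy of the scene

# The scene itself is rendered into scene_tex; the returned quad is what
# ends up in shadowBuffer, so it gets the final gray/invert shader.
final_quad = manager.renderSceneInto(colortex=scene_tex)

# Intermediate pass: blur scene_tex into blur_tex.
blur_quad = manager.renderQuadInto(colortex=blur_tex)
blur_quad.setShader(Shader.load("gaussian_blur.sha"))
blur_quad.setShaderInput("tex1", scene_tex)

# Final pass: combine the scene with its blurred copy, desaturate and invert.
final_quad.setShader(Shader.load("gray_invert.sha"))
final_quad.setShaderInput("tex", scene_tex)
final_quad.setShaderInput("tex_blur", blur_tex)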

Ok, now I’ve got that.