Advanced Render Order Control Question

Good day.

I have 2 scenes.
The first scene is a few camera-aligned planes with various encoded data packed into their textures.
This scene is rendered to a buffer first, and some post-process maths is done on it.

The main scene (the second scene) is then rendered using the new data in the packed textures from the first pass.
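
For reference, my two-pass setup is roughly along these lines (a simplified sketch; the buffer size and names are just placeholders for my own ones):

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import NodePath

base = ShowBase()

# Offscreen buffer for the first pass; the 256x256 size is just a placeholder.
data_buffer = base.win.makeTextureBuffer("data-pass", 256, 256)
data_buffer.setSort(-100)            # make sure this pass renders before the main window
data_tex = data_buffer.getTexture()  # the packed data ends up in this texture

# A separate camera with its own scene graph for the camera-aligned planes.
data_root = NodePath("data-scene")
data_cam = base.makeCamera(data_buffer)
data_cam.reparentTo(data_root)

# The main scene's self-written shaders then sample data_tex,
# e.g. passed in with setShaderInput on the relevant objects.
```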

I have a couple of questions/problems.

I need to limit the first render so that no antialiasing is ever used and the textures don't interpolate colours.
This must not affect any options set for the main render.

I can control the interpolation of the textures, but I'm not sure whether antialiasing is still affecting them slightly afterwards.

I'm using two cameras, each with its own scene graph.

Can anyone tell me the correct way to do this? I have it mostly working, but I'm getting some strange data bleeding in the resulting first-pass textures.

I am aware this is a rather unusual use case :stuck_out_tongue:

For your info, I calculate physics, animations, etc. in the first pass and embed the results into the image, which I then use to offset the vertex positions of the relevant objects in the second pass. All of this is done with self-written shaders.
Other than the weird bleeding, I can animate hundreds of thousands of objects at over 500 fps at the moment, with quite advanced physics.

I just need help fixing the texture bleeding.

You can set a particular state to be applied to a particular camera, instead of to a scene node, using cam.setInitialState. You can create a RenderState that overrides any antialiasing or texture blending settings. (However, you cannot change texture filtering settings using render states at present; this has to be done on the Texture object. If you're using a shader, though, you can simulate the effects of a particular filtering mode.)
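
Something along these lines should work (an untested sketch; data_cam and data_tex stand in for your first-pass camera NodePath and packed texture, and the override priority is arbitrary):

```python
from panda3d.core import AntialiasAttrib, RenderState, Texture

# Force antialiasing off for everything this camera renders,
# without touching the state of the main scene's camera.
no_aa = RenderState.make(AntialiasAttrib.make(AntialiasAttrib.MNone), 100)
data_cam.node().setInitialState(no_aa)

# Texture filtering has to be set on the Texture object itself:
data_tex.setMagfilter(Texture.FTNearest)
data_tex.setMinfilter(Texture.FTNearest)
```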

If you want, you can set some desired override states on a dummy NodePath using the simple NodePath interface, and then extract the underlying RenderState object using getState.
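
For instance (again, just a sketch using the same placeholder names):

```python
from panda3d.core import NodePath, AntialiasAttrib

# Build the desired override states on a throwaway NodePath...
dummy = NodePath("state-holder")
dummy.setAntialias(AntialiasAttrib.MNone, 100)  # priority 100 overrides lower-priority settings

# ...then pull out the composed RenderState and hand it to the camera.
data_cam.node().setInitialState(dummy.getState())
```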

I’m not sure what you mean by data bleeding. Could you perhaps show us a screenshot to demonstrate the problem?

I have found that if I add half a pixel to the texture look-up position in my shaders (both vertex/fragment and filters), it seems to eliminate most of the colour bleeding.
It looked like it would always interpolate towards the left/right pixel regardless of the settings I used.

I don't understand why offsetting it by half a pixel helps… but that problem is done and dusted.
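
In case it helps anyone else hitting this, the offset I'm adding is just half a texel in UV space, passed in as a shader input (the names here are placeholders for my own ones):

```python
from panda3d.core import Vec2

# Half a texel in UV space for the packed data texture.
half_texel = Vec2(0.5 / data_tex.getXSize(), 0.5 / data_tex.getYSize())
objects.setShaderInput("u_half_texel", half_texel)

# In the shader the look-up then becomes sample(data_tex, uv + u_half_texel),
# which (I assume) lands the sample on the texel centre rather than its edge.
```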

I'm currently looking into the states on the cameras to see if I can better control the antialiasing settings, so thanks for that.

Again, many thanks for the swift response.