Applying CommonFilters to objects, not whole scene

Hi,

I have looked around in the manual and the forums, but haven’t found an answer to this. I can enable CommonFilters with:

from direct.filter.CommonFilters import CommonFilters

filters = CommonFilters(base.win, base.cam)

then add filters, shaders such as:

filters.setBloom(blend=(0, 0, 0, 1), mintrigger=0.4, desat=-0.3, intensity=2.0, size="medium")

But this will apply the shader to every object in the scene.

Could someone please provide a code snippet for a setup with two objects (or two groups of objects) and two filters, where one filter applies to one object and the other filter to the other?

Thank you in advance!

This is not easily possible - you would need to change the bloom shader to distinguish between objects that should and shouldn’t get the effect. Fortunately, for bloom this is relatively easy, since it uses the alpha channel of the framebuffer to encode that information; you can prevent nodes from writing to the alpha channel to exclude them from the bloom.
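For example (an untested sketch; "rockNP" stands for whichever NodePath you want to exclude):

from panda3d.core import ColorWriteAttrib

# Write only RGB, leaving the framebuffer alpha untouched; with an
# alpha-based blend like (0, 0, 0, 1), this node then contributes
# nothing to the bloom.
rockNP.setAttrib(ColorWriteAttrib.make(ColorWriteAttrib.CRgb))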

Thanks for the reply, but this would only work for bloom, or for shaders that depend on something specific about the object (particular texture maps, the alpha channel, and so on).

I am working on a 2.5D side-scrolling shoot-em-up, and, for example, I would like to apply some blur to objects close to the camera (rocks and asteroids serving purely visual purposes), giving the impression of a depth-of-field effect.

Is there a way to set up a filter so that the shaders applied to it only affect specific objects?

So are you saying that, if I were to implement this selective blur effect, I would need to handle the selection of what to blur and what not to blur in the shader, rather than on Panda3D’s scripting side?

Meaning that I would have to write my own shader, and that this wouldn’t be possible using the built-in CommonFilters?

As an alternative, would it be possible to render the objects I am planning to blur into a texture, and apply the CommonFilters blur shader to only that texture?
If yes, how would I be able to do that?

You wouldn’t necessarily need to write your own shader in the case of the bloom effect, since the bloom effect is already selective (in your case, with blend=(0, 0, 0, 1), using the alpha channel of the framebuffer to select which nodes should receive bloom).

If you want to make a depth-of-field shader, then you could use the information in the depth buffer to find out which nodes to blur and which not to.

If you want, I can add a “source” parameter to the setBloom function that you can set to “depth” in order to take the bloom information from the depth buffer, but this is not necessary, as you can already control this by disabling alpha write on a particular object (or using the shader generator on them and disabling glow maps).
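For the shader generator route, something like this (again a sketch; "model" and the texture filename are placeholders):

from panda3d.core import TextureStage

model.setShaderAuto()  # enable the shader generator on this node

# The generated shader feeds the glow map into the bloom input; nodes
# without a glow map (or with a black one) are left out of the bloom.
ts = TextureStage('glow')
ts.setMode(TextureStage.MGlow)
model.setTexture(ts, loader.loadTexture('glow_map.png'))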

That said, the default shaders in CommonFilters can only bend so much, and if you want more, I’m afraid you’ll have to implement your own filters using FilterManager.
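The basic FilterManager pattern looks roughly like this ("my_dof.sha" stands for whatever post-process shader you write yourself):

from direct.filter.FilterManager import FilterManager
from panda3d.core import Texture, Shader

manager = FilterManager(base.win, base.cam)
colorTex = Texture()
depthTex = Texture()

# Redirect the scene into offscreen textures and get back a fullscreen
# quad that displays the result; your shader runs on that quad and can
# sample the depth buffer to decide how much to blur each pixel.
quad = manager.renderSceneInto(colortex=colorTex, depthtex=depthTex)
quad.setShader(Shader.load("my_dof.sha"))
quad.setShaderInput("color", colorTex)
quad.setShaderInput("depth", depthTex)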

Thanks for the clarification.

Bloom:
So this does sort of take care of the bloom issue. (I was hoping I could use separate filters, because in some cases I actually want to use a glow map, like the tron-guy example, while in others it would be convenient to use RGB as the source, as for the starfield/nebula background I am using.)

Blur/Dof:
On to the foreground blur/depth-of-field.

As you explained, this is not really necessary for bloom, since it can be done with the alpha channel. But do you think it would be possible for you to make something like this for blur?
Something where parameters could be specified (similar to how bloom has mintrigger, maxtrigger and others) linking the depth buffer to the “amount” parameter.

For example minZ, minAmount, maxZ, maxAmount, defining the amount of blur at depth minZ and at depth maxZ, with the amount in between lerped. That way I could specify a depth range where the blurring is in effect, which in my case would cover only the objects closest to the camera.
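In other words: amount = minAmount + (maxAmount - minAmount) * (z - minZ) / (maxZ - minZ), clamped to [minAmount, maxAmount] outside that range.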

I am not too familiar with shader programming, but I would assume the original blurSharpen filter already has the meat to make this happen; it would only take scaling the amount by a factor based on the passed parameters.

If this is not really possible or reasonable for you to do with the built-in CommonFilters blur, could you give me some directions on where I would start if I wanted to write my own depth-of-field shader to use with Panda3D?

I appreciate your help!

Well, if it were that easy, we’d already have a stock depth-of-field shader. But blurring based on depth alone is not enough, as the background will blur together with the foreground, creating halos around objects:
rdb.name/depth-of-field.jpg

GPU Gems is always a good resource:
http.developer.nvidia.com/GPUGem … _ch23.html
http.developer.nvidia.com/GPUGem … _ch28.html

I see, so it turns out proper depth of field isn’t that simple :)

In my case, however, I wouldn’t really need proper depth of field, only a group of objects that are blurred and a group that isn’t.

This would be my last shot regarding this, if this isn’t doable I guess I’ll just drop the blur thing for now.

Do you think this would be possible: rendering the objects I want blurred separately (to a texture or through a similar setup, as I asked above), and applying the blur only to that?

If yes, how would I do it?

Certainly; if the objects are always going to be in front of the rest of the scene and don’t need to interact with it in any way, this will be easy. I think you can use a second DisplayRegion on top of the main one in your window, and render your rocks into that DisplayRegion using a separate camera, created with something like:

overlayCam = base.makeCamera(base.win, camName='rock_overlay_cam', lens=base.camLens)

base.makeCamera will automatically create a DisplayRegion for you, so you don’t even need to worry about creating one yourself.

base.makeCamera will by default reparent this camera to base.camera, so the two cameras will automatically share the same viewpoint. You can then either make a separate scene graph for these rocks (using scene=myRockRootNP), or, if the rocks are integrated into your existing scene graph structure, simply share the scene with the main camera (scene=base.render) and use camera masks to hide your rocks from your main camera and vice versa.

Then, you pass the camera for that DisplayRegion to the constructor of CommonFilters, which will then look for the DisplayRegion with that camera and apply the filters to it. (You may need to enable transparency on filters.finalQuad to allow the rest of the scene to see through.)

Note that you should be able to have multiple CommonFilters objects; one for the rock overlay DisplayRegion, which will have blur effects, and one for the main DisplayRegion, which will have bloom effects or whatever you want the rest of the scene to have.
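Putting it together, I imagine something along these lines (untested; the rock node, masks and filter settings are just placeholders):

from direct.filter.CommonFilters import CommonFilters
from panda3d.core import BitMask32, TransparencyAttrib

rocks = base.render.attachNewNode("rocks")  # parent your foreground rocks here

# Second camera; sort > 0 draws its DisplayRegion after the main one,
# and clearing depth makes the overlay render on top.
overlayCam = base.makeCamera(base.win, sort=10, clearDepth=True,
                             camName='rock_overlay_cam', lens=base.camLens)

# Camera masks: the overlay camera sees only the rocks, and the main
# camera sees everything except the rocks.
MAIN_MASK = BitMask32.bit(0)
ROCK_MASK = BitMask32.bit(1)
base.cam.node().setCameraMask(MAIN_MASK)
overlayCam.node().setCameraMask(ROCK_MASK)
base.render.hide(ROCK_MASK)   # hide the whole scene from the overlay...
rocks.showThrough(ROCK_MASK)  # ...except the rocks
rocks.hide(MAIN_MASK)         # and hide the rocks from the main camera

# One CommonFilters per DisplayRegion.
mainFilters = CommonFilters(base.win, base.cam)
mainFilters.setBloom(blend=(0, 0, 0, 1))

rockFilters = CommonFilters(base.win, overlayCam)
rockFilters.setBlurSharpen(amount=0.0)  # 0.0 = full blur, 1.0 = no change

# Let the main scene show through the overlay's fullscreen quad.
rockFilters.finalQuad.setTransparency(TransparencyAttrib.MAlpha)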

I’m not 100% sure that this will work, as I’ve never tried it, but I’d be happy to help out if it doesn’t. If all this fails (it shouldn’t), you can still achieve this effect by rendering to a separate buffer altogether.

Thank you, I’ll give this a shot and let you know whether I manage to make it work or not.