New features for the shader system

Hi all,

My team at the ETC, Carnegie Mellon University, is currently working on improving Panda3D’s shader system, and we’d like to request the community’s opinions on it. We have eight or more weeks to work on it.

What features are important but missing in Panda3D’s shader system? Is there something like a priority list?

Or are there critical fixes that need to be taken care of in the shader system?

Our team really wants to do something for the community, so please do share your opinions with us on what you’d like to see added or fixed in Panda3D’s shader system. We’ll do our best to implement them.

Wei-Feng Huang
Graduate Student
Entertainment Technology Center
Carnegie Mellon University

Wow, you’re proposing to open Pandora’s box for us (inexperienced programmers).
That’s a great initiative and asking the community is also a good idea :slight_smile: Let’s start shooting :slight_smile:

A perfect idea might be some kind of natural-element kit:

  • Water effects (lots of possible applications: oceans, rivers, pools… and especially RAIN).
  • Lightning effects (thunder for skies and electricity for character magic).
  • Wind effects (for static objects like trees and grass).
  • Fire effects (for levels: lava and fire particles; for characters: fireballs).

These are ideas that I’ll definitely use but don’t know how to create myself.

I don’t think he meant the shader generator, but the shader system itself.

I guess there are only a handful of Panda users who are so advanced that they will understand the pros and cons of the shader system and so can provide feedback about it.

Personally, as your average “Panda needs a better content pipeline” end user, I’d like to see some one-click stuff. Say you want volumetric lighting on an object: you have a spherical egg model with a yellow-to-orange texture, you add one line of code, and it becomes a volumetric-light sun for a game. Want water? You do this. Want post-processing? This. Maybe a GUI shader-file generator where you combine elements in a tree and then export a shader file.

Another thing I remember is that fog could not be combined with the shader generator. I don’t know the details, and I guess I don’t care (see above), but maybe that’s an area to work on.

Not sure what you can achieve in 8 weeks, when you are probably going to spend most of that time learning your way around the system, but I’ll bite:

Most modern graphics cards support four outputs from a shader. Panda already supports having a second one (the auxiliary target you can request when creating a buffer and write to as o_aux in the shader) - any chance of some more? Just an aux2 would make me much happier :-)
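For context, a fragment shader with a second output might look like the sketch below, held in a Python string the way Panda3D Cg shaders usually are. The COLOR0/COLOR1 bindings and the o_aux name follow the conventions discussed in this thread, so treat the details as an assumption to verify against your Panda3D version, not a confirmed recipe:

```python
# A minimal Cg fragment-shader sketch with two colour outputs.  The
# o_aux / COLOR1 convention is the one described in this thread;
# double-check it against your Panda3D version before relying on it.
AUX_FSHADER = """
void fshader(in  float4 l_color : COLOR0,
             out float4 o_color : COLOR0,   // primary render target
             out float4 o_aux   : COLOR1)   // first auxiliary target
{
    o_color = l_color;                   // ordinary colour output
    o_aux   = float4(0.5, 0.5, 1, 1);    // extra data, e.g. a packed normal
}
"""
```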

The shader generator could do with some fog, just so it’s complete. I also understand that D3D support is flaky - it doesn’t affect me, but it would be good to improve that.

A clean-up of the filters system might also be an idea - the CommonFilters class is quickly descending into a mess of special cases, so replacing it with something plugin-based would be well advised, especially since normal users could then write such plugins - that would lower the barrier to getting new stuff in there.

Whoops :blush:
Sorry for spamming with my previous post. I was so glad to see someone proposing some ready-to-use shaders but…
I have never touched the shader system and most probably won’t in the next few years…

Anyway, good luck for your work.

This is already supported, and also used by various builtin postprocessing filters. I may have even used this feature in Naith (not sure though).

Good point. This is not really related to the shader system, though; I am planning an FX-based pipeline for 1.8.0 that will make this much easier.

The improvements I would like to see are mostly new examples and filters:

  • Tested and optimized deferred-shading pipeline example. Yes, we have the Firefly example, but it breaks as soon as you rotate the camera, the lights are cones rather than spheres, and there are no shadows. So: a better “Firefly” sample, but using the new filters.

  • Fuzzy particle filter. Motion-blur filter. Lens-flare filter.

  • Passing Vec3 or matrix arrays to shaders, either through textures or by other means.

  • Tested and optimized HDR example. Overusing bloom is not HDR; HDR means using full 32-bit floats for the lighting computations.

  • Parallax mapping and more advanced mapping examples.

  • Pure GPU-based bones and animation. Maybe something like: store bone weights on the vertices, render the bone positions to a texture, then in the vertex shader read which bones each vertex belongs to and fetch the current bone transforms from the texture. Make sure this is integrated with Panda3D’s CPU-based bones and animation.
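The “arrays via texture” and GPU-skinning ideas above both come down to packing float data into texels in a layout the shader can index back into. A minimal, library-free sketch of one such layout (the function names here are illustrative, not any Panda3D API):

```python
def pack_matrices(matrices):
    """Flatten 4x4 row-major matrices into a flat list of RGBA float
    texels: one matrix row per texel, so four texels per matrix."""
    texels = []
    for m in matrices:
        for row in m:
            assert len(row) == 4, "expected 4x4 matrices"
            texels.append(tuple(row))   # one RGBA texel holds one row
    return texels

def texel_index(matrix_id, row):
    """The index a vertex shader would compute to fetch row `row` of
    bone/transform matrix `matrix_id` back out of the texture."""
    return matrix_id * 4 + row
```

For example, packing two identity matrices yields eight texels, and texel_index(1, 2) points at the third row of the second matrix. The same arithmetic would be mirrored in the vertex shader’s texture lookup.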

This is a good point, actually. I think someone from CMU is working on this.

I renamed the Normal Mapping to Bump Mapping example, and it now also covers parallax.

I think this is already supported, with “hardware-animated-vertices #t”.
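For reference, that option is an ordinary prc configuration variable; a minimal sketch of enabling it, assuming it behaves as described above:

```
# In Config.prc (or via loadPrcFileData at startup) — requests the
# fixed-function hardware skinning path mentioned in this thread:
hardware-animated-vertices #t
```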

rdb: I looked for such a feature but couldn’t find it. I can imagine how to write the shader, but I don’t see how you would create the buffer to write the output to. In my search I did look through all the filters in CommonFilters and through the shader generator, which uses the auxiliary target to store the surface normal, with glow in the alpha channel, but I couldn’t see anything involving a third output. Any pointers to what I missed?

AuxBitplaneAttrib, which I think is what you’re referring to, is simply a hint to the Shader Generator. You can simply create an o_blah output that is assigned to COLOR1, COLOR2, etc. These will be bound to the aux attach points that you can specify when passing a RenderTexturePlane: … b8d349d400

Sort of, though this only uses the old fixed-function OpenGL extension, which doesn’t support very complex characters, and often has poor driver support. A system that interfaced with the shader generator and provided more power would be welcome.


Actually, I was referring to the FrameBufferProperties object you create a buffer with - I couldn’t see any way of requesting a second auxiliary. However, I just had a wander through the file and found its createBuffer method, which amply demonstrates how to do so. I hadn’t realised you could call setAuxRgba(n) to request n auxiliary buffers. The documentation for FrameBufferProperties doesn’t mention any of this (and many of the other methods in that class also have a random ‘i’ parameter with no explanation of its use), plus I’d looked down the RenderTexturePlane list and assumed those entries were for future expansion, since I saw no way of requesting their use!

Thanks though - you gave me a clue as to what I should grep the source for;-)

What everyone said, plus: document any new work so the manual gets properly updated, and eventually also document the stuff you dig into and have to figure out in the process.

All the stuff I would like in the shader system:

  • Full Cg compatibility:

    • I’m not sure how it works, but I remember problems when using structs as in or out parameters.
    • Passing more than one file as an argument to the shader (so that functions can be factored out into a shared file).
    • The possibility of using semantics to name uniform variables.
    • Probably other things I’m missing. I want to be able to take a random shader and only add a few semantics in the function declarations to make it work.
  • CgFX/ColladaFX compatibility (import/export)?

This requires Panda to have an FX-based rendering pipeline. I think I might do this for 1.8.0.

It’s strange to me that nobody has mentioned high-quality shadows, but maybe I missed it, or it’s a task already done in 1.7, or maybe it has nothing to do with this proposal.

That’s a Shader Generator thing, not a shader system feature - and yes, this is already implemented in 1.7.0.

OK, I found it in the online manual, thanks - I was looking at a WIP manual-2010.01.09-4.chm, where it isn’t mentioned yet.

Wei-Feng Huang was never heard from again…