What do you want in a shader generator/editor?

A while ago I wrote my own shader generator for Panda: https://github.com/Craig-Macomber/Panda3D-Shader-Generator
Blueprint: https://blueprints.launchpad.net/panda3d/+spec/more-customizable-shader-generator

Recently the topic came up again, and rdb pointed to this: http://www.blender.org/typo3temp/pics/923052fa19.png
(A video of a similar concept located by Anon here: moddb.com/engines/source/new … n-released )

With Panda 2 in the works: https://launchpad.net/panda3d/2.0.x
I figured I should sort out what it really is that we want.

How about this:
A graphical editor (like rdb showed), but with a separate stage for each phase of the shader (vertex, geometry, and fragment). Between the stages, the editor would graphically show the intermediate values passed from one shader stage to the next, and that list would be editable.

The editor presents a directed graph of nodes, each of which can have an arbitrary number of inputs and outputs. Collections of nodes in this graph can be packaged up into composite nodes for easy reuse and organization; double-clicking a composite node would open its contents for editing. Each node is a code generator that can use the render state and tags to choose what to generate.

Some nodes in this directed graph would be shader inputs, including tags (shader inputs with no associated value, simply placed on nodes to mark them). When applying the shader generator to a NodePath, any part of the shader graph that depends on shader inputs that are not present will be considered deactivated. To deal with this, there will be some special node types in the shader graph, such as one that takes a value that may or may not be active, plus a default to use when the input is deactivated (so that node itself never disables). This will allow it to do the things the current shader generator does: if there is a normal map, it will activate the normal-map nodes; if there is a color scale… if there is a gloss map… but it will also enable more complex stuff, including user-defined effects.
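The activation idea above can be sketched in plain Python. This is an illustrative model, not the project's actual API: all class and input names here are made up for the example.

```python
# Illustrative model: nodes deactivate when their required shader inputs
# are missing, and a Default node substitutes a fallback value.

class Node:
    """A code-generator node; active only if all required inputs exist."""
    def __init__(self, name, requires=()):
        self.name = name
        self.requires = set(requires)

    def active(self, shader_inputs):
        # Active when every required input is present on the NodePath.
        return self.requires <= set(shader_inputs)

class Default(Node):
    """Passes through `source` when it is active, else yields `default`."""
    def __init__(self, name, source, default):
        super().__init__(name)
        self.source = source
        self.default = default

    def resolve(self, shader_inputs):
        if self.source.active(shader_inputs):
            return self.source.name
        return self.default

# A normal-mapping node that needs a (hypothetical) "tex_normal" input,
# wrapped so that a flat normal is used when no normal map is applied.
normal_map = Node("normalMap", requires={"tex_normal"})
fallback = Default("normalOrFlat", normal_map, "flat_normal")

print(fallback.resolve({"tex_normal"}))  # normalMap
print(fallback.resolve(set()))           # flat_normal
```

This is the same pattern the current shader generator hard-codes (normal map present or not, gloss map present or not), but expressed as graph nodes that users can rewire.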

One example would be to tag node paths for tessellation in the geometry shader, or to write a custom vertex shader effect for wind deformations.

The editor is provided in such a way that you can import it into your game and ask it to put up a live editor. It would also be able to run standalone, or perhaps as a live editor on top of something like pview, to generate and save shader generator configurations.

Design:
The current approach is to implement a graphical shader meta-language (a language in which shader code generators can easily be written). With no conditionals, it amounts to the same thing as the tool linked above, but conditional nodes allow you to generate different code depending on which textures, shader inputs, and tags are present. Hand-written conditional logic (in the form of NodeType subclasses), custom NodeTypes with CG code from the library files, and graph files capturing the configuration from the editor together make up the source code of the meta-language. This is compiled at runtime into a ShaderBuilder object, which you can use like Panda's existing shader generator to produce the desired effects on different nodes.
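A toy sketch of that pipeline, with all names hypothetical (the real ShaderBuilder's interface may differ): conditional nodes become (condition, code-generator) pairs, and generating a shader for a particular render state runs the conditions and concatenates the code each active node emits.

```python
# Hypothetical sketch of the meta-language pipeline: a ShaderBuilder
# holds conditional code generators and emits only the active parts.

class ShaderBuilder:
    def __init__(self, generators):
        # generators: list of (condition, code_fn) pairs, where condition
        # inspects the render state and code_fn emits a line of shader code.
        self.generators = generators

    def generate(self, render_state):
        lines = []
        for condition, code_fn in self.generators:
            if condition(render_state):  # conditional-node logic
                lines.append(code_fn(render_state))
        return "\n".join(lines)

# The render state is modeled here as a set of input names.
builder = ShaderBuilder([
    (lambda rs: True,              lambda rs: "o_color = base_color;"),
    (lambda rs: "tex_gloss" in rs, lambda rs: "o_color *= gloss(tex_gloss);"),
])

print(builder.generate({"tex_gloss"}))
print(builder.generate(set()))
```

The key point is that the conditions run at generation time, not per-frame, so each NodePath gets a shader containing only the code it needs.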

I’ve designed a powerful system for efficient generation-time computation and conditionals. The links in the graph structure are given status values as the graph is processed to generate the code. This avoids generating the parts of the code that are not used, and potentially allows customizable generation-time computation, which could be used for things like pre-computing constants that get compiled right into the CG code.

Open Issues:
What about CPU-side effects? Multi-pass effects? Post-processing? Animation? The best approach for things that need to coordinate between the CPU and shaders would be to apply the effect on the CPU and include a tag on the nodes that the shader generator can pick up on and act on accordingly. An example would be billboards: they get a compass effect plus a tag so the correct shader can be generated.
Post-processing could be handled just fine by applying the shader generator to a full-screen quad, together with Panda’s existing post-process tools.
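The billboard example can be sketched as follows. The tag dict stands in for Panda3D's NodePath tag mechanism, and the tag name and comments are made up for illustration:

```python
# Illustrative model of the tag approach: the CPU applies the effect
# (e.g. a compass/billboard effect) and sets a tag; the shader generator
# reads the tag and emits matching shader code.

def shader_for(tags):
    code = ["// base shader"]
    if tags.get("billboard") == "true":
        # Generated only for nodes the CPU side tagged as billboards.
        code.append("// billboard-aware vertex transform")
    return "\n".join(code)

tags = {"billboard": "true"}  # set alongside the CPU-side compass effect
print(shader_for(tags))
```

In real Panda3D code the tag would be set with `NodePath.setTag`, and the generator would query it when walking the scene graph.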

So what do you want from a shader generator/editor? Does this cover it?
Please post comments, questions, suggestions and issues.

I think Unity has a similar tool; it could be used as a reference.

Not like I’ve used a similar tool. But if there was one, all I’d want would be the ability to load your own shader files and set how they would work with the existing effects.

A separate menu could exist for fullscreen filters, where you could set their order like layers in an image editor, and maybe change their combine modes (like Multiply, Divide, Screen, Overlay, etc.).

Filters really are a separate issue, but their shaders can be generated with this shader generator.

I just realized I can have special nodes in the shader graph that, when enabled, do something special at the Manager or Applier level, such as setting render modes. That provides at least basic toggle functionality, which could be hooked up to custom code provided through a callback system. Then, if you wanted an effect like a CPU-side vertex deformer, you could write the code, register the callback, then load the shader config that turns it on for the nodes that need it.
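The callback idea above could look something like this. All names here (`register_callback`, `apply_config`, the node names) are hypothetical, sketched only to show the shape of the mechanism:

```python
# Sketch of the callback system: special graph nodes that, when enabled,
# trigger registered CPU-side code instead of emitting shader source.

callbacks = {}

def register_callback(name, fn):
    """Associate CPU-side code with a special node name."""
    callbacks[name] = fn

def apply_config(enabled_nodes, node_path):
    """Run the callback for each enabled special node; return what ran."""
    applied = []
    for name in enabled_nodes:
        if name in callbacks:
            callbacks[name](node_path)
            applied.append(name)
    return applied

# User registers a CPU-side vertex deformer, then loads a shader config
# that enables the matching special node.
register_callback("cpu_vertex_deform",
                  lambda np: print("deforming", np))
print(apply_config(["cpu_vertex_deform", "normal_map"], "tree_node"))
```

Nodes without a registered callback (like `"normal_map"` here) would be handled by the normal code-generation path instead.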

Status report: I’ve gotten a lot of the source generator written, as well as the effect selector. There’s still tons to do, and I’m going to have to work on other projects for a while.

When implementing these ideas in C++, I was thinking of integrating it more fundamentally, in a way that’d require changing fundamental parts of Panda3D’s design (perfect for P2).

I was thinking of using the existing data graph system to implement a ShaderNode, which represents an individual node in the image you showed.
The Shader constitutes the whole of the shader: basically, it manages inputs and outputs of the shader as a whole and hooks up inputs to subnodes.
The data graph structure would be used for managing inputs and outputs.

Every shader node would override the virtual generate_glsl (etc.) method; it’d look something like this:

DivideShaderNode::DivideShaderNode() {
  define_input("dividend", Colorf::get_class_type());
  define_input("divisor", Colorf::get_class_type());
  define_output("output", Colorf::get_class_type());
}

bool DivideShaderNode::generate_glsl(ShaderCode &code) {
  code << "$output = $dividend / $divisor;";
  return true;
}

Panda would automatically parse the $'s in the code and replace them with the actual variable names it internally assigns to those outputs (in case multiple nodes use the same names).
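The $-substitution step is essentially a regex replace. A minimal sketch (the variable-name scheme here is invented for the example; the real implementation would assign unique names per node):

```python
# Minimal sketch of the $-substitution described above: replace $name
# placeholders with the unique variable names assigned internally.

import re

def substitute(code, var_names):
    # \$(\w+) matches $output, $dividend, etc.; look each up in the map.
    return re.sub(r"\$(\w+)", lambda m: var_names[m.group(1)], code)

# Hypothetical internally assigned names for one node's ports.
assigned = {"output": "v12", "dividend": "v7", "divisor": "v9"}
print(substitute("$output = $dividend / $divisor;", assigned))
# v12 = v7 / v9;
```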

Then there would be shaders composed of other shaders. For instance, you could group the set of nodes that together make up the “bloom” post-processing effect and call it the “bloom” shader node. These nodes would all be reparented to the parent “bloom” node.

Reminds me of LOVE’s tools:
youtube.com/watch?v=DPIA2g8T6Hw

(starts at 4:47)

Right, that’s the system used for shaders by a lot of people. It’s the more artist-friendly approach to writing shaders, because artists often have no idea how to write a single line of code.

Now that I think about it, it does make more sense for the shaders to be created by the artists, like textures and materials are.

Right, which is why with Panda 2 we’re aiming to support tools like Mental Mill and FX Composer.

You’re doing a great job, Craig :slight_smile: I’m a newcomer to Panda, but I’d like to incorporate the shader generator into the art pipeline of my project from the very beginning. Is there any estimate as to when some stable examples will appear? It seems the 1st May commit is broken:

D:\...Panda3D-Shader-Generator-ed1f3ae>ppython main.py
Traceback (most recent call last):
  File "main.py", line 3, in <module>
    from shadereffects.txteffects import loadEffectFromFile
  File "D:\PROJECTS\Panda3D\ShaderGenerator\Craig-Macomber-Panda3D-Shader-Generator-ed1f3ae\shadereffects\txteffects.py", line 3, in <module>
    from shadereffects import ShaderEffect,ShaderParam,ShaderEffectParam,shaderParamPlaces
ImportError: cannot import name ShaderEffect

I’m not an expert programmer and my knowledge of shaders is very shallow, but is there something I can do to speed up the process? 0:-)

As for suggestions, I think the new shader generator might make Panda’s rendering pipeline more transparent and uniform.
For example, all render attributes and effects could be generalized as just particular types of ‘shader inputs’ and ‘shading nodes’.
Perhaps some nodes may have fixed-pipeline implementations along with shader ones? (aka Ogre’s “materials/techniques” or Unity’s “shaders/passes”)
The modular approach is great! Ideally I’d wish the whole rendering could be built as a graph of gray boxes, which in turn boil down to primitive atomic bricks :smiley:

[A bit off-topic]:
By the way, Panda currently has a system of StateTags to apply specific rendering for specific objects rendered by specific cameras, but it’s not very flexible. Maybe the shader generator can cope with that too? :slight_smile:

I figured I should mention that my attempt at a fully customizable shader generator with a nice GUI is going very well. The GUI is started, but not very far along. The shader generation code is done, except that I need to port my automatic semantic matcher over from the old version. The collection of supported effects is basically nonexistent for now, and the tools to integrate with the scene graph aren’t started either. It does generate usable shaders, though.

Current progress can be viewed on this branch of the repository: https://github.com/Craig-Macomber/Panda3D-Shader-Generator/tree/v3

There are also some updated descriptions on the blueprint I linked in the first post: https://blueprints.launchpad.net/panda3d/+spec/more-customizable-shader-generator

I tried to run your test.py script, but there is only one button in the whole window, and it’s rotated along the roll axis. I don’t think that’s how it’s intended.

Using buildbot Panda, Win7 64-bit.

That’s the editor window, and it does not really do anything yet. That’s actually a stack of buttons on top of each other. Behind it is a demo scene with a generated shader being shown on a panda. I should probably make the editor window smaller than the demo window so you won’t miss it.

I forgot to push my fixes for dragging buttons, I’ll push those up tonight. (Edit: now pushed)

Thanks for testing it. I haven’t tested it on Windows yet, so it’s nice to know the file path stuff didn’t break.

OK, now it’s fixed.

Are you using DGUI for development? I think if it’s a separate window, you could use wx or tk.

Hm, maybe it’s a bit late to mention this, but wouldn’t a Blender add-on be a lot easier than writing an editor from scratch? As far as I know, you can access the shaders somehow. The drawback would be GLSL only. Just wondering.

I’m using DirectGUI. You are right that I could use other things, but I mostly need stuff that’s not standard GUI elements anyway. I think I might offer an optional add-on that uses wx for file browsers, but in general I can get by without it, and I’d rather avoid the dependencies. I want it to be a lightweight drop-in. I think it will get more users if the install process is: put this tiny folder in your app, import it, call a function, and you get a GUI to set up your effects. Pass it a NodePath and it can apply the effects, and/or preview them live.

OK.
Is this planned to be ported to C++? Is it planned to be included with the engine? Or as an “example” like the Particle Panel?

anyway, keep up the good work

I personally haven’t planned a port, but I have kept the possibility in mind. The code base for the actual shader builder is pretty simple and could be ported pretty easily, I suspect. As a Python-based developer, I don’t feel any need to port the editor portion.

This project is entirely driven by my personal needs, and I don’t do C++ coding (yet), and don’t need a C++ version, so I won’t be porting anything for now.

The main issue I see in porting the shader generator is the NodeType subclasses (they will be used for effects that are conditional on the RenderState of NodePaths, such as using normal mapping when a normal-map texture is present). Some people will probably write some in Python, and those wouldn’t work with pure C++ builds, which would split the shared community of configurations and effects.

To avoid this split, I’ll provide some fairly general, configurable NodeTypes that can handle most things, such as checking for the presence of arbitrary textures and other shader inputs. Then, as long as no one needs additional custom NodeTypes, everyone in Python and C++ land can share all of their effects/nodes and configurations.
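A data-driven condition of that kind might look like the sketch below. The class name and input names are invented; the point is that the condition is configured by a string rather than written as a custom subclass, so it can be expressed identically from Python or C++:

```python
# Sketch of a general, configurable NodeType condition: presence of a
# named shader input, configured with data instead of a custom subclass.

class HasInputCondition:
    def __init__(self, input_name):
        self.input_name = input_name

    def __call__(self, shader_inputs):
        # True when the named texture/shader input is present.
        return self.input_name in shader_inputs

needs_normal_map = HasInputCondition("tex_normal")
print(needs_normal_map({"tex_normal", "tex_color"}))  # True
print(needs_normal_map({"tex_color"}))                # False
```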

might be helpful as reference: moddb.com/engines/source/new … n-released

That’s a nice example, thanks. It does not do what my shader generator does, but the UI is the same (so it’s a great example there), and the application is related.

I’ve updated the description in the first post, and the linked source repo has been getting some decent progress lately. I’m using the shader generator in my main game project, so it will be getting some real development and use. I wouldn’t recommend it for use yet, though.

To be clear, I’m not making a tool for writing shaders with a GUI. I have a good text editor and I like writing shader code. If you have looked at the source code of Panda’s shader generator, I think you will understand the issue at hand: it’s complex, and hard to extend or customize. I’m implementing a programming language of sorts in which such code is easy to implement and extend. My language, I believe, is best classified as a declarative meta-language. It has a GUI editor in which you drag around and hook together code generators, not fragments of code.

Are you thinking of maybe using a GUI library like wx in the future? It’s not that I think DGUI can’t handle this, but I think it will require more work than is needed.

The GUI does not look too hard (I already have draggable boxes with arrows; I just need to make them editable), and if I want any sort of shader preview graphics like in the video you linked, I’d have to do it with 3D graphics anyway.