[UPDATE] - Fur - Deferred shading

absolutely :slight_smile:

So you want to change that “todo” into “done”, and in the meantime provide better sample models? Are you made of awesomeness or what? ;D Seriously, I’m more and more impressed with your work.

I am not easily impressed… but… in fact, I am impressed right now.
This looks seriously good.

while you are at it…
http://www.gamedev.net/page/resources/_/reference/programming/140/lighting-and-shading/a-simple-and-practical-approach-to-ssao-r2753?
hint**hint

I have a shader generator project ( craig.p3dp.com/MiscPics/fancy/ )

As I assume you may be noticing, having a lot of effects that differ across the objects in your scene can be a pain (writing an awesome shader is fun; writing every version of that awesome shader you need for your game is not). My shader generator is an attempt to provide a tool for managing and merging all these effects onto the various geoms in the scene. Unlike Panda’s shader generator, mine does not have major performance issues when you change the render state (I don’t force regenerating shaders automatically, and my cache only looks at the parts of the render state that are specifically needed to generate the shaders, so updating shader inputs every frame works well).

Technically my shader generator isn’t really a shader generator; it’s more of an implementation of a meta-language for writing custom shader generators, which you could use to make a customizable deferred alternative to Panda’s built-in one. An example of its use: in my game, some things have normal maps and some don’t, so I used a conditional shader input node to capture the normal map, and another type of conditional node to select between a sample of that texture and the plain vertex normal, depending on availability. The result is a generation-time switch that writes the correct shader for the requested geom. (I’ll mention that this is currently not working quite right; I think it has to do with the actor loading setup, though.)
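To give a flavor of the idea (the class names below are purely hypothetical, since the real API isn’t in this post):

```python
# Purely hypothetical class names: the real API isn't shown in this
# thread. This just illustrates a generation-time switch between a
# normal-map sample and the plain vertex normal.
class ConditionalShaderInput:
    """Captures a named shader input if the geom's state provides it."""
    def __init__(self, name):
        self.name = name

    def available(self, shader_inputs):
        return self.name in shader_inputs


class NormalSourceNode:
    """Emits Cg for a normal-map sample, or falls back to the vertex normal."""
    def __init__(self, normal_map):
        self.normal_map = normal_map

    def generate(self, shader_inputs):
        if self.normal_map.available(shader_inputs):
            return "float3 N = tex2D(normal_map, l_uv).xyz * 2.0 - 1.0;"
        return "float3 N = normalize(l_normal);"


# The branch is taken once per requested geom, at generation time:
node = NormalSourceNode(ConditionalShaderInput("normal_map"))
print(node.generate({"normal_map": "rock_norm.png"}))  # normal-mapped variant
print(node.generate({}))                               # vertex-normal variant
```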

Anyway, when my shader generator is a bit further along, it can serve as a way for you to manage and share all your effects in an easy-to-use fashion. I’m not looking for much help with the shader generator itself, but I am looking for lots of help making effects/content/samples to use with it, and you seem to be good at that. If you have any questions regarding my projects, feel free to ask. I’d really like some input from a fellow shader programmer. I want to make sure I’m not missing anything important, and that it’s actually useful.

I have also tried and failed to get 16-bit textures/buffers working.

Hey,

I’ll be happy to provide you effects and samples for your system; I need to clean the code before I post it here. I’m currently having problems deferring shadow maps. About your system: do you plan to support different lighting models? And for buffer creation, how do you plan to handle different materials? (For my game I’m planning to store material information in the unused alpha channel, as various games do, but a general-purpose generator might need something different.) Nice screens you have there. More questions: do you allow dynamic branching in shaders, and what average user specs do you target with your new generator? It would be nice for your system if you or someone else finds a solution for setting texture formats.

My generator system isn’t really tied to any set of buffers/bitplanes. It works fine for forward shading, deferred, etc. I’ll probably even use it for the lighting pass and post-process shaders in my deferred setup. It’s just a shader meta-language; the complexity of the shader generators made with it is up to the user. I’d like to provide both low- and high-end samples, and have a system for doing quality fallbacks for performance (simply dump a quality constant, perhaps as a tag on render, into the render state before generating the shaders).
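For example (illustrative only: the tag name and the emitted Cg strings are made up, and it assumes a running ShowBase so `render` exists):

```python
# Illustrative only: stash a quality constant as a tag on render, then
# have the generator read it back when building shaders for a geom.
render.setTag("shader_quality", "low")      # set once, before generating

def lighting_snippet_for(geom_np):
    # The generator reads the net tag at generation time and picks a
    # cheaper or fancier code path; these strings are placeholders.
    if geom_np.getNetTag("shader_quality") == "low":
        return "o_color = diffuse * saturate(dot(N, L));"
    return "o_color = fancy_brdf(N, L, V, diffuse, spec);"
```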

I’d love to have a big pile of lighting models to choose from. Theoretically it should be possible (once I add the feature of composite/hierarchical nodes to the generator) to implement a generic lighting-model node that lets you select the lighting model based on tags in the scene graph (e.g. provide the coefficients needed for lighting model X, and you automatically get it). Such a lighting node could be used in both forward and deferred configurations (though you might have to settle for a single lighting model for the whole scene with deferred).

Regarding dynamic branching: it’s possible to generate any shader code with my system (you can put a whole custom fshader and vshader in as single nodes if you want), so yes, you can use dynamic branching, but it’s not too practical. The current system is really great at generation-time conditionals and at handling shaders with complex data flow, but it’s not very good at runtime conditional stuff (currently any conditional logic needs to be contained within a single node in the shader graph). In short: possible, but not good to use with sub-nodes and such. I’m thinking about possible fixes for the design deficiency, and I consider it a major issue. (In fact, conditionals are currently the only major issue I see with the design, and I will find a fix, even if it adds a lot of complexity. The main/worst case is an early exit via discard: it’s not yet practical to force it to occur as early as possible.)

Anyway, while it’s not too relevant to my shader generator, it is relevant to your thread, so here is how my rendering setup works:

Currently my game renders to 3 buffers in the first pass (diffuse, normals, misc); misc holds glow and specular information. I also have a (somewhat experimental) decal rendering pass where I blend in projected decals (using the same projection approach as the lights). I couldn’t think of a good way to normal-map them, so I don’t render normals in that pass. Currently it renders to a second diffuse buffer and a second misc buffer. Those textures are then pulled in along with the normals for the lighting phase, which generates a lighting buffer that’s used along with the normals, diffuse, and misc to do the final rendering of the deferred-shaded objects, including cartoon inking and cel-shaded directional lighting. That is then rendered into the window (on a full-screen quad), the skybox is rendered next, and then particles and other transparent effects will be rendered (not implemented yet). Then comes the bloom/glow post-process, which currently uses alpha, but will use my misc buffer once I start drawing transparent things.
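In Panda terms, that first-pass setup boils down to something like this trimmed sketch (not my exact code; it assumes a running ShowBase and leaves out the camera and shader assignment):

```python
from panda3d.core import FrameBufferProperties, WindowProperties, Texture
from panda3d.core import GraphicsPipe, GraphicsOutput

# Trimmed sketch of a 3-target first pass (diffuse + normals + misc).
fbp = FrameBufferProperties()
fbp.setRgbColor(True)
fbp.setAuxRgba(2)                       # two extra color targets
fbp.setDepthBits(24)

wp = WindowProperties.size(base.win.getXSize(), base.win.getYSize())
gbuf = base.graphicsEngine.makeOutput(
    base.pipe, "gbuffer", -10, fbp, wp,
    GraphicsPipe.BFRefuseWindow, base.win.getGsg(), base.win)

diffuse, normals, misc, depth = Texture(), Texture(), Texture(), Texture()
gbuf.addRenderTexture(diffuse, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPColor)
gbuf.addRenderTexture(normals, GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPAuxRgba0)
gbuf.addRenderTexture(misc,    GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPAuxRgba1)
gbuf.addRenderTexture(depth,   GraphicsOutput.RTMBindOrCopy, GraphicsOutput.RTPDepth)
```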

I currently don’t have shadows.

I get positions using the depth texture, and do the most basic normal encoding (norm / 2 + 0.5) and stick the result in a regular 8-bit-per-channel buffer. Not ideal, but without floating-point/16-bit textures, that seems like the best choice.
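Spelled out in plain Python for reference (the real versions live in the shaders, and the depth convention varies by setup):

```python
# What the shaders do, written out as plain Python for reference.
def encode_normal(n):                 # n components in [-1, 1]
    return tuple(c * 0.5 + 0.5 for c in n)

def decode_normal(rgb):               # rgb components in [0, 1]
    return tuple(c * 2.0 - 1.0 for c in rgb)

def reconstruct_view_pos(u, v, depth, inv_proj):
    """Unproject a screen sample back to view space.
    (u, v, depth) in [0, 1]; inv_proj is the inverse projection matrix,
    row-major 4x4. Assumes a GL-style depth range; adjust as needed."""
    ndc = [u * 2.0 - 1.0, v * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0]
    x, y, z, w = [sum(inv_proj[r][c] * ndc[c] for c in range(4))
                  for r in range(4)]
    return (x / w, y / w, z / w)
```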

Example use of my decal system: projecting symbols on the ground (shown in the images). Perhaps a more interesting application is projecting damp areas (darken the color, increase the specular coefficient a ton, and the specular amount a bit), or flaming areas (project glow and red patches with no specular). I haven’t really had a chance to try anything with it.

Your code is interesting. Your setup process is very different from mine. I create a bunch of custom buffers and cameras (4 cameras) explicitly and set them all up, configure depth sharing, and make most of my own full-screen quads without the provided filter stuff. I don’t touch any AuxBitplaneAttribs. I even see a setShaderAuto() in your code, which I don’t touch. I have maybe 4-5 shaders just to do a basic rendering (skybox, models, lights, decals, inking+directional lights+assorted), and I’ll need a lot more. My module for just setting up the buffers, cameras, display regions, shaders, lights, etc. is just over 400 lines, more than double your entire code.
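For comparison, one of my hand-rolled full-screen quad passes is conceptually just this (a sketch; `post_buffer` and `lighting_tex` are stand-ins for a real offscreen buffer and an earlier pass’s texture):

```python
from panda3d.core import CardMaker, NodePath

# Sketch of one hand-rolled post-process pass (no FilterManager).
cm = CardMaker("fsquad")
cm.setFrameFullscreenQuad()             # card spanning [-1, 1] x [-1, 1]
quad = NodePath(cm.generate())
quad.setTexture(lighting_tex)

scene = NodePath("quad_scene")
quad.reparentTo(scene)
quad_cam = base.makeCamera2d(post_buffer)  # default ortho lens covers the card
quad_cam.reparentTo(scene)                 # render this little scene, not render2d
```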

By the looks of your shader, you don’t know about Cg’s ‘lit’ function. It and lots of other goodies are here: http.developer.nvidia.com/CgTuto … dix_e.html
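For anyone following along, here is what lit() computes, transcribed to Python for reference:

```python
def cg_lit(n_dot_l, n_dot_h, shininess):
    """Python transcription of Cg's lit(): returns (ambient, diffuse,
    specular, 1), with specular forced to 0 when the light is behind
    the surface."""
    diffuse = max(n_dot_l, 0.0)
    specular = max(n_dot_h, 0.0) ** shininess if n_dot_l > 0.0 else 0.0
    return (1.0, diffuse, specular, 1.0)
```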

Edit: I forgot, specifically regarding materials. On load I convert the material object from the egg files (exported from Blender) into shader inputs. I also allow a material texture to be blended over this if provided. My shader generator could potentially make it easy to support models with other approaches, such as vertex data, tags, or something else. Mostly this data just ends up packed into the misc buffer. One good approach is to have a list of materials (256 of them) and use an index and a lookup table (texture) for the data, but that requires more setup, so I didn’t bother.
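The egg-material conversion is conceptually just this (a sketch, not my actual loader; the input names are arbitrary):

```python
# Sketch of converting a loaded model's egg Material into shader inputs;
# the input names here are illustrative, not a fixed convention.
def material_to_inputs(np):
    mats = np.findAllMaterials()
    if mats.getNumMaterials() == 0:
        return
    mat = mats.getMaterial(0)
    np.setShaderInput("mat_diffuse", mat.getDiffuse())
    np.setShaderInput("mat_specular", mat.getSpecular())
    np.setShaderInput("mat_shininess", mat.getShininess())
```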

Import is working; I had to scale the normal map according to the scale of the model. I’ll look at GPU particles now, and when finished I’ll clean up the overall code and post it.


http://youtu.be/TczKcE0iEyY

A question: how do I get the maps directly from the .egg?

At the moment, I’m sending the various maps as inputs to the shaders for each model; is there a way to have a simpler system? (Importing a fully working Blender scene with multiple textures would be the goal.)

I have a rather complex model importer that does a lot of things (especially for my multi-part actors). On load, I search for all the textures and attach them to shader inputs. I have a texture naming convention, so whatever_norm goes on as a normal map, etc. Then I use my shader generator to take the existing shader inputs and tags and generate the correct shaders. I’m planning to have a converter application that does most of this work, then dumps the models to bam files. Stuff that does not fit in bam files (shaders and shader inputs) will be put in tags and fixed on load. I believe the normal maps, if set up correctly, are available as a special shader input, but I like to only use my own manually set-up ones.
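Stripped down, the naming-convention pass looks something like this (the suffixes are just my convention, not a Panda rule):

```python
# Find every texture already applied to the model and re-expose it as a
# shader input based on a name suffix.
def bind_textures_by_suffix(model):
    textures = model.findAllTextures()
    for i in range(textures.getNumTextures()):
        tex = textures.getTexture(i)
        name = tex.getName()
        if name.endswith("_norm"):
            model.setShaderInput("normal_map", tex)
        elif name.endswith("_gloss"):
            model.setShaderInput("gloss_map", tex)
        else:
            model.setShaderInput("diffuse_map", tex)
```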

Ok thanks.

I tried different particle rendering systems. The particle system seen in the Panda samples is nice, but it gets slower when particles get close to the camera (overdraw); apparently it’s a problem with all engines. So I had a look at GPU Gems 3 on the NVIDIA site, and here is the result:


(particles should ideally come from the bottom of the ship :laughing:)

I use a texture buffer for the particles at 1/8 the resolution of the main window, and I have no more slowdowns now! (from 30 fps to 120 fps). However, I couldn’t make soft particles work, maybe because I render all the particles to a single texture buffer and composite it with the final lit scene. The bloom effect helps reduce the square artifacts anyway.
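The buffer setup is basically this (a sketch, assuming a running ShowBase; `particle_root` and `scene_root` stand in for wherever the particles and the rest of the scene live):

```python
from panda3d.core import BitMask32

# Render only the particles into a 1/8-size offscreen buffer; the
# resulting texture is then composited over the lit scene on a quad.
w, h = base.win.getXSize(), base.win.getYSize()
pbuf = base.win.makeTextureBuffer("particles", w // 8, h // 8)
pbuf.setSort(-5)                        # draw before the main window

SCENE_MASK = BitMask32.bit(0)
PARTICLE_MASK = BitMask32.bit(1)
base.cam.node().setCameraMask(SCENE_MASK)

# makeCamera parents pcam under base.camera, so it follows the main view.
pcam = base.makeCamera(pbuf, lens=base.cam.node().getLens())
pcam.node().setCameraMask(PARTICLE_MASK)

particle_root.hide(SCENE_MASK)          # particles skip the main camera
scene_root.hide(PARTICLE_MASK)          # the rest skips the particle camera
```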

I’ve reused the cloud system from the author of Flock to make it work with deferred lighting. I’m using a few quads for each cloud, and the lighting changes with the direction of the sun.


(only one texture at the moment)

Also, there is a halo effect around transparent textures, and I can’t get true transparency with the deferred system.

A preview of the upcoming samples:

VIDEO:
http://youtu.be/I4qzmBOV2ig

Still need to find a way to place lights visually…

You’re doing a great job. Really hope you’ll consider contributing these to the Panda3d source in the future. :slight_smile:

If I understood you correctly, you want to move the lights around with the mouse cursor? I could help with that.
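Something along these lines should do it (untested sketch, assuming a running ShowBase; `my_light` stands in for whatever light NodePath you want to drop):

```python
from panda3d.core import (CollisionTraverser, CollisionNode, CollisionRay,
                          CollisionHandlerQueue, GeomNode)

# Cast a ray from the camera through the mouse cursor and drop the
# light at the first thing it hits.
picker = CollisionTraverser()
pq = CollisionHandlerQueue()
ray = CollisionRay()
picker_node = CollisionNode("mouse_ray")
picker_node.setFromCollideMask(GeomNode.getDefaultCollideMask())
picker_node.addSolid(ray)
picker_np = base.camera.attachNewNode(picker_node)
picker.addCollider(picker_np, pq)

def place_light_at_mouse(light_np):
    if not base.mouseWatcherNode.hasMouse():
        return
    mpos = base.mouseWatcherNode.getMouse()
    ray.setFromLens(base.camNode, mpos.getX(), mpos.getY())
    picker.traverse(render)
    if pq.getNumEntries() > 0:
        pq.sortEntries()
        light_np.setPos(render, pq.getEntry(0).getSurfacePoint(render))

base.accept("mouse1", place_light_at_mouse, [my_light])
```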

Wow, the material on those bamboos looks astonishing. Can I get a bigger render without the sliders so I can use it as a wallpaper? I’m not kidding, I’d really like to ;D.

Just like redpanda, I really hope this will make its way into Panda itself.

Keep up the good work :smiley:

Thanks :slight_smile:

I’m happy you recognized the bamboos, lol. The normal-map details are a bit too high-frequency, and the “joints” shouldn’t be so open; I need to redo the UV map, as it’s only using 1/4 of the map atm.

Yes, the code is coming; it must be ready and clean before I actually begin my game. The bamboo samples will be free to the community as well. Some animal is coming too :stuck_out_tongue: I need to make the fur.

Yes, atm my code is not clean at all, and I have to place lights somewhat randomly by code; I’d like more control, like in a 3D modeling app. When I use the place() function, I can’t move the camera anymore :frowning:

Coppertop :smiley:

Hahaha, great, thanks :smiley:.

Still WIP:

Wow! Really nice work! Keep it up!
The images should go to the gallery now! =P

Thanks man.

Update on the ears, nose geom and texture.

Orthographic view:

Update:

  • Now low-poly.

If you spot any errors or have any crazy ideas for the pose and final scene, feel free to comment.

It would be nice to have this in the Online Demos section, I think, to show that Panda is capable of using modern shaders.
Would love to see pandas travelling on the space ship to go get some bamboo. :slight_smile:

Fur tests (Blender):

It’s using the shell rendering technique; I’m not sure about the result.
And I’m having huge problems exporting the transparent materials to Panda correctly :frowning:
I think I will do the fur rendering in code instead: less control, but faster and working.
The nose and cranium are still baby-like.

Shell fur by code:


15 layers.
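For the curious, the CPU side of the shells boils down to something like this (a sketch, not my final code; `body` and `fur_root` are stand-in NodePaths, and the vertex shader is what actually pushes each copy out along its normals):

```python
# N copies of the body mesh, each one pushed further out along its
# normals by the vertex shader and drawn more transparent.
NUM_SHELLS = 15
for i in range(NUM_SHELLS):
    shell = body.copyTo(fur_root)
    t = (i + 1) / float(NUM_SHELLS)            # 0 -> skin, 1 -> fur tips
    shell.setShaderInput("layer", t)           # vshader: pos += N * fur_len * t
    shell.setShaderInput("fur_alpha", 1.0 - t) # outer shells fade out
    shell.setBin("transparent", i)             # draw inner shells first
```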