[UPDATE] - Fur - Deferred shading

My code is a lot better.

VIDEO :

http://youtu.be/A2Z_uV_r8H0

Are you using some kind of image mapping for the fur?

Hey, actually it is the same mesh rendered multiple times with an alpha texture. I will post the code soon; it is not using the deferred shading system yet, and my code for deferred is still messy.

It is called shell texturing: the mesh is copied over and over and displaced along the normals in the shader. It is an optical illusion; one strand of hair is made of multiple stacked spots/layers.
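The per-layer displacement can be sketched in plain Python (the vertex shader does the same math per vertex; the `fur_length` value and layer count here are made-up for illustration):

```python
def shell_position(vertex, normal, layer, num_layers, fur_length):
    """Displace a vertex along its normal for shell `layer`.

    Layer 0 sits on the skin; the outermost shell is fur_length away."""
    t = layer / float(num_layers - 1) if num_layers > 1 else 0.0
    return tuple(v + n * fur_length * t for v, n in zip(vertex, normal))

# Layer 0 stays on the skin, the last layer is fur_length out:
p0 = shell_position((0, 0, 0), (0, 0, 1), 0, 5, 0.2)
p4 = shell_position((0, 0, 0), (0, 0, 1), 4, 5, 0.2)
```

With the alpha texture masking each shell, the stacked spots line up into what reads as a strand of hair.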

See the effect with 5 layers :

I primarily used this: http://www.xbdev.net/directx3dx/specialX/Fur/index.php

In Panda it is a bit messy to implement, and I don't know if it is the optimal way to do it: I have a loop that loads the panda model 15 times and applies the shader to each instance with the layer id as input.

for i in range(15):
    fur = loader.loadModel("panda_fur.egg")
    fur.setShader(Shader.load("fur.cg"))
    fur.setShaderInput("layer", i)
    fur.setShaderInput("color", loader.loadTexture("panda3D_Volume1_UvSet0_color.tga"))
    fur.setShaderInput("fur", loader.loadTexture("panda_fur_alpha.png"))
    fur.setShaderInput("distance", self.distance)
    fur.reparentTo(render)
    fur.setTransparency(True)
    self.furs.append(fur)

Update task :

def update(self, task):
    for i in range(15):
        self.furs[i].setShaderInput("gravity", self.gravity)
        self.furs[i].setShaderInput("distance", self.distance)
        self.furs[i].setShaderInput("alpha", self.alpha)
        self.furs[i].setShaderInput("atmos", self.atmos)
        self.furs[i].setShaderInput("move", self.move)
        self.furs[i].setShaderInput("atmos_bool", self.atmos_bool)
    return task.cont

Atm there is no lighting on the fur; I need to implement the equations here: http://developer.amd.com/media/gpu_assets/Scheuermann_HairSketchSlides.pdf

You can also fake fur lighting with classic Phong and a proper normal map: http://www.paultosca.com/varga_hair.html

I will test it with deferred shading even though I know it won't work :stuck_out_tongue: I've read that shell texturing is really expensive with per-pixel lighting; the optimal way to render it seems to be old-style vertex lighting.

FXAA works OK with fur IMO:

[color=red]without

[color=green]with

EDIT : I can’t indent the code properly in this post :frowning:

With phong light :

A bit slow :frowning:

Grass !

http://youtu.be/TdSlg7R_Pe8

Needs a skybox now.

Both the fur and the grass have a very stylized look ATM. I don’t know if that was your intention or not. If it was, it’s definitely distinctive. Especially the grass.

I would also like to add that it looks much better in a video than in a screenshot, and the fur looks better from a larger distance (like in the last screenshot) than it does in closeups.

If I may make a suggestion, though: it would probably look better if you made the grass a little less dense somehow. Right now there’s so much going on that it looks like constant motion blur.

Anyway, I still can’t wait to put my hands on this stuff.

I’m happy to get some feedback !

Yes, you’re right about the stylized look, actually I like it, it looks more painterly than realistic imo.

Maybe it comes from the diffuse textures, because I painted them by hand (with a mouse); they are not made from photos. I think you could get more realistic results with more realistic textures, I don’t know.

I’m having some fun with the Kajiya-Kay lighting model atm, screenshots incoming.

Edit: I think the motion blur effect can be reduced but it can’t be completely avoided; this shell texturing technique is the same as in Shadow of the Colossus, and I remember the fur looked like motion blur the first time I saw it :frowning:

Yes, it does.

I wouldn’t say it came from the texture itself. I’d rather say it’s a matter of the size of the individual hairs on the fur (and thus the number of them). If they were thinner, it would look more natural. If you were able to achieve this, I guess you could also make it possible to set the level of painterly vs. realistic look.

Additionally, it would be good to reduce the visibility of the layers on the outline. Dunno if that’s possible with this technique. I guess that’s a limitation that becomes apparent under such angles.

It probably can’t be avoided completely, but I guess reducing the density of the grass, if possible, could make it look a lot better. Coupled with making the stems thinner, just like with the fur. At the same time, adding more layers could also help, but I don’t know how that would affect performance.

Note, however, that I’m quite ignorant in this field (pun unintended :wink: ), so I might be completely wrong ;D.

That all said, I guess you should make a screenshot of the grass from another angle, because profile might not be the most fortunate in the case of this technique. It seems to expose its weaknesses instead of its strengths, like on the panda’s outline.

I followed your advice, I hope.

I have tried the Kajiya-Kay lighting model; it looks a bit wrong because I’m using normals instead of tangents, as I can’t get tangents to display properly.
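For reference, the Kajiya-Kay terms are easy to prototype outside a shader. A minimal sketch in plain Python (the vectors and shininess value are made up; in the real shader the strand tangent T replaces the surface normal):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def kajiya_kay(tangent, light_dir, view_dir, shininess=40.0):
    """Kajiya-Kay diffuse and specular terms for a hair strand.

    Both terms are built from sin(angle) between the tangent and
    the light / half vector, instead of a normal-based dot product."""
    T = normalize(tangent)
    L = normalize(light_dir)
    V = normalize(view_dir)
    H = normalize(tuple(l + v for l, v in zip(L, V)))  # half vector
    t_dot_l = dot(T, L)
    t_dot_h = dot(T, H)
    diffuse = math.sqrt(max(0.0, 1.0 - t_dot_l * t_dot_l))
    specular = math.sqrt(max(0.0, 1.0 - t_dot_h * t_dot_h)) ** shininess
    return diffuse, specular

# A strand perpendicular to the light gets full diffuse:
d, s = kajiya_kay((1, 0, 0), (0, 0, 1), (0, 0, 1))
```

Feeding normals instead of tangents into T is why the result looks off: the sine terms then peak in the wrong directions.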

Different furs

[color=red]BUG :
I have a bug that you can see in the video too: when the camera moves behind the model, some shells disappear. I have tried all the z-testing modes (zLess, zEqual...), setDepthClearActive and different culling modes; I’m really out of ideas here. Btw, in some papers they do z-testing in different passes (first pass zLess, second pass zEqual, for example); how can I achieve this in Panda?

Bug (the white is normal; what is not normal is that some hairs are disappearing)

Grass

(no vomit smiley on this forum?)

[color=blue]VIDEO :
http://youtu.be/dfKQ75N1Rns

It looks nice :smiley:. It would be even better if the color on the fur was slightly more varied. That way, you could add more depth to it, make it look like hairs were casting shadows on each other and reflecting light in different directions. It seems like that’s already the case, but it would be better (I think) if it was a bit more visible.

Other than that, the Panda looks cool and I’m guessing it’s now possible to shift between more painterly or more realistic look. That’s great.

As for the stuff disappearing: this is a long shot, because I have no idea how this shell texturing works internally, but I use depth offset on my decals. Maybe that could help here too?

I have no idea, but Panda’s Decal Render Effect (panda3d.org/manual/index.php/Render_Effects) does some Z-buffer voodoo. Maybe you could get some hints from that at least.

As far as the grass goes, you could add more shell layers, make the grass stems thinner and, most importantly, make the stems become thinner with height, like in the second image here: xbdev.net/directx3dx/specialX/Fur/index.php

Use DepthTestAttrib and DepthWriteAttrib to control the z-buffer.

David

Hey, thanks for comments.

I’ve looked at some videos of real pandas; the nose and cranium I made are those of a baby panda, but the body is adult-like :frowning:

I’ll add bones in Blender to correct this, so you can have a bigger, angular, trapezoid nose and a higher, more perpendicular cranium for the adult look. With this we can easily get both a baby panda and an adult panda from the same model. (I’ve just realized their ears and nose keep growing significantly after birth, like in humans, lol :arrow_right: )

I really need to have a look at projected shadows now, as I couldn’t make them work correctly in deferred shading. Atm the lighting of the fur, grass and general objects is done with custom forward pixel-shader lighting, but I think I will try to use all of the lights and shadows from Panda3D for this scene/sample; it will be easier to customize and finish (and it will use basic shaders only).

In the end, the DOF, bloom and color correction shaders will add the modern look you were talking about, for users who have higher-end graphics cards.

[color=red]Question:

A question now: I’ve read this paper to get better lighting on the fur: http://developer.amd.com/media/gpu_assets/Scheuermann_HairSketchSlides.pdf. In this paper they use 4 passes to render geometric hair:

Optimized Scheme: Pass 1
Prime Z buffer with depth of opaque hair regions
• Enable alpha test to only pass opaque pixels
• Disable backface culling
• Enable Z writes, set Z test to Less
• Disable color buffer writes
• Use simple pixel shader that only returns alpha
• No benefits of early-Z culling in this pass, but shader is very cheap anyway

Optimized Scheme: Pass 2
Render opaque regions
• Start using full hair pixel shader
• Disable backface culling
• Disable Z writes
• Set Z test to Equal
– Z test passes only for fragments that wrote to Z in pass 1, which are the opaque regions
• This and subsequent passes don’t require alpha testing and thus benefit from early-Z culling

etc.
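One way to picture what passes 1 and 2 accomplish is a toy single-pixel simulation in plain Python (not Panda code; the fragments and alpha cutoff are made-up values):

```python
def two_pass(fragments, alpha_cutoff=0.99):
    """Simulate passes 1 and 2 of the scheme on one pixel.

    Each fragment is a (depth, alpha) pair; smaller depth is nearer."""
    depth = float("inf")
    # Pass 1: alpha test, Z writes on, Z test Less, no color writes.
    # Only opaque fragments prime the depth buffer.
    for d, a in fragments:
        if a >= alpha_cutoff and d < depth:
            depth = d
    # Pass 2: Z test Equal, Z writes off. Only the fragment that won
    # the depth buffer in pass 1 passes the test and gets shaded.
    shaded = [(d, a) for d, a in fragments if d == depth]
    return depth, shaded

# Two opaque fragments and one translucent one: the nearest opaque
# fragment primes the buffer, and only it is shaded in pass 2.
frags = [(0.5, 1.0), (0.3, 0.4), (0.7, 1.0)]
depth, shaded = two_pass(frags)
```

The remaining passes in the paper then blend the translucent fringes back-to-front and front-to-back, which this toy model does not cover.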

Do you think it is worth the pain to do this, and how can I achieve it in Panda? Do I really need to make 4 texture buffers, with the hair drawn once into each, and do the tests they are talking about?

Thanks for reading.

Have you thought of writing your own shader generator by subclassing the ShaderGenerator class? panda3d.org/dox/python/html/ … rator.html
I think it would be easier, as you wouldn’t need to write your own normal map shader and similar shaders, and you could use normal and other maps from the egg files.

Making 4 passes in Panda really does mean setting up 4 different cameras with 4 different DisplayRegions. Unless you can render your passes on top of each other, it also means 4 different offscreen buffers.

We are working on redesigning this in Panda, to greatly simplify setting up advanced techniques like this (in conjunction with adding support for CgFX files and related abstractions), but the redesign process is slow-going, I’m afraid.

In the meantime, it’s not that much trouble to do this with, say, a function to set up each pass, that you call 4 times.

David

Ok thanks drwr.

Redpanda, I will wait for the generator Craig is writing atm :smiley:

I’ve integrated the same shadow mapping as in the samples into the fur shader. Each strand of the fur can now cast shadows, but it’s a bit slow and I need to fine-tune the bias for a correct result. I think having only one panda casting shadows will be OK, with a few artifacts.

All layers casting shadows (1024x1024 map):

http://youtu.be/DpGC9QDMAJE
The lighting should be better than in the previous vid.
PS : no fxaa in the video & screen.

Was that shader generator going to be included in Panda?

Anyway, I think you shouldn’t make shadows completely black.
How about adding smooth shadows next? :slight_smile:

VIDEO:
http://youtu.be/S-7xSgAqlOQ

The fur looks so great I want to hug this panda. :smiley:

But the grass really needs more layers.

Hey dude!
This is upper-dupper-super-mega awesome :smiley:
No more words … you’re really a devil :wink: