Automatic generation of vertex normals?

So I’m procedurally creating some geometry, some terrain say, and I want to light it with a directional light, so that slopes and hills will be shaded differently to flat parts. Then I need to add the normal as well as the position for each vertex in my GeomVertexData to make the lighting work. I’m told that Panda can calculate the correct vertex normals automatically, but I’ve searched and cannot find how. Is it true?

If Panda can’t calculate the normals for me, then I guess I have to do it myself. I’m building my model out of trifans. So I guess I could take two of the edges formed by a triangle’s three points and get the surface normal of that triangle by computing their cross (or is it dot?) product, then apply that normal to the three vertices of the triangle. The problem is that many vertices belong to more than one triangle in a trifan, so then you have to somehow interpolate between surface normals to get vertex normals, which I’m guessing is not difficult either. Any help here?

Panda can calculate surface normals for you if you use the egg library to generate your geometry. If you’re creating the GeomVertexData structures yourself, you’re on your own.

You’ve got the right idea. The cross product of any two edges of the triangle gives the surface normal. To smooth the normals at a vertex shared by multiple triangles, you just have to average the surface normals of those triangles (or, in other words, add them all up, then normalize the result).
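For what it’s worth, that averaging step can be sketched in plain Python, with no Panda calls at all (the helper names here are made up for illustration):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    l = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/l, v[1]/l, v[2]/l)

def vertex_normals(vertices, triangles):
    """Average the face normal of every triangle touching each vertex.

    vertices: list of (x, y, z) tuples.
    triangles: list of (i, j, k) index triples into `vertices`.
    """
    acc = [(0.0, 0.0, 0.0)] * len(vertices)
    for i, j, k in triangles:
        # Unnormalized face normal; leaving it unnormalized here
        # weights the average by triangle area, which is often desirable.
        n = cross(sub(vertices[j], vertices[i]),
                  sub(vertices[k], vertices[i]))
        for idx in (i, j, k):
            acc[idx] = tuple(a + b for a, b in zip(acc[idx], n))
    return [normalize(n) for n in acc]
```

For a flat quad split into two counter-clockwise triangles, every vertex normal comes out as (0, 0, 1), as you’d expect.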


Using the EGG interface it would be:

EggGroupNode.recomputePolygonNormals( )
EggGroupNode.recomputeVertexNormals( ) <---
EggGroupNode.recomputeTangentBinormal( )

But of course this doesn’t help if you are writing GeomVertexData.

Thanks folks.

That’s interesting. I’ve been looking at the Egg library stuff in the API. It looks usable, and if I do use it I might even expand on the section in the documentation about it.

I didn’t realise that you could use either polygon normals for a faceted look (I assume each polygon gets illuminated differently, revealing the polygon structure of the model) or vertex normals for a smoothed look. Is there a way to use these polygon normals if you’re generating geometry with GeomVertexData etc.?

I’ve been thinking though, and the Egg library might actually not be the best thing for me. In the future I may want to dynamically deform my geometry during gameplay. I think that means changing vertex positions in a GeomVertexData and then recomputing the normals as necessary.

The Egg library seems to be an indirect way of generating a model – you create an EggData, write it out to an .egg file, then read the file in to get your renderable geometry and NodePath. I’m sure there is some way to get at the individual vertices from this geometry read from file. But if you’re going to be moving vertices around, I guess you should probably use the GeomVertexData/GeomVertexFormat/GeomPrimitive/Geom/GeomNode approach described here. This means you have to deal with individual triangles and trifans or whatever instead of just specifying your model as polygons, and you have to compute your normals yourself, but once that’s done you already have direct access to the GeomVertexData with the vertices in the order you put them there.

Am I right in thinking that if you want to deform geometry at runtime, you’re better off generating it the hard way with GeomVertexData etc. instead of with the Egg library?

Of course. After all, the egg library is just creating GeomVertexDatas at the end of the day. If you wanted to create a faceted look by hand, you would have to replicate the shared vertices between different triangles, so that each copy of a vertex can carry its own face normal. Alternatively, you could call setShadeModel() on your GeomPrimitive with either SMFlatFirstVertex or SMFlatLastVertex, and then also set an appropriate ShadeModelAttrib on the node (but this approach is more complicated, and isn’t obviously better than just replicating vertices).

That’s generally true, though you don’t actually have to write out an .egg file. You can convert your EggData directly to renderable geometry.

I would say that’s generally true. The advantage of using the egg library is that it’s good at doing all the fiddly little things necessary to create geometry (like smoothing vertex normals and putting together attributes). But it doesn’t leave you with a lot of control over the result.


Hmm… I can’t see how to do this.

Never mind:

pandaNode = loadEggData(eggData)

where eggData is your generated EggData object. loadEggData is a global function. Then parent the pandaNode to a NodePath in the scene graph.

Hi, we want to do exactly the same thing as Chombee, but we have a problem accessing the vertex data in the Egg library (to flatten some parts of the heightfield, for example).

Is it really impossible to access the vertex data in the Egg library?


What you’d call the vertex buffer, or vertex data, is called a vertex pool (the EggVertexPool class) in the egg library: a list of vertices with some information per vertex (position, normal, color…).