Automated texture mapping with geometry

Hello,

I currently have a NodePath object whose geometry is dynamically regenerated each frame via a GeomNode, which I got working by following these instructions. I then tried to add a texture to the object using the automatic texture-coordinate generation documented here; however, even when I use setTexProjector, the texture is still tied to the world position.

I believe the issue is that while my geometry’s vertices are changing (and thus the object appears to move), the node’s actual world position stays the same. Does anyone have any suggestions on how to get around this? Thank you for any assistance!

I presume that you’re attaching your geometry to a node, and have a NodePath for that node. Could you perhaps then place that NodePath at the desired location, and then build your geometry relative to it?

That should result in the geometry appearing at (effectively) the same location, but with the node that defines its position in the scene being located with it.
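Something like this minimal sketch, perhaps (where “tracked_center” and “build_face_geom_relative_to_origin” are placeholders for your tracker data and your existing geometry-building code):

from panda3d.core import GeomNode

# Place a parent node at the tracked position; the node, not the
# vertices, then carries the world-space location.
face_anchor = render.attachNewNode("face_anchor")
face_anchor.setPos(tracked_center)  # placeholder: wherever the tracker says

# Build the Geom with vertices expressed relative to the node's origin.
geom_node = GeomNode("face_geom")
geom_node.addGeom(build_face_geom_relative_to_origin())  # placeholder builder
face_anchor.attachNewNode(geom_node)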

Sadly not. I’m integrating Panda3D with face-tracking software that returns a set of vertices describing the face’s relative location. I’ve already centered the object around (0, 0, 0), but I’m having trouble converting the rotation into a NodePath transform.

On a semi-related note: since the model is also constantly changing, is there anything I have to worry about with regard to automatic UV generation? Thank you for your help!

Ah, okay, I see.

In that case, hmm… This is a bit of a guess, but looking at the texture-projection page, and specifically the example given of the use of “setTexProjector”, have you tried reversing the latter two parameters in that method?

My thinking is that the example given on that page has an object with unchanging local vertex-positions but a changing node-position. In your case you have the reverse: changing local vertex-positions and an unchanging node-position. Perhaps, then, your case calls for the projection to be reversed.
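That is, where the manual’s example does something like the first call below, you might try the second (untested, and with “face_np” standing in for your NodePath):

from panda3d.core import TextureStage, TexGenAttrib

face_np.setTexGen(TextureStage.getDefault(), TexGenAttrib.MWorldPosition)

# The manual's example projects from render's space into the object's:
face_np.setTexProjector(TextureStage.getDefault(), render, face_np)

# The reversed form that I have in mind:
face_np.setTexProjector(TextureStage.getDefault(), face_np, render)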

Otherwise, my main thought would be to examine your vertices and determine from them an average position that could be used as their node-position, but I worry that doing so might incur performance issues.
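Concretely, I’m picturing something like this each frame, which is where my performance worry comes from (a sketch, with “vdata” standing in for your GeomVertexData):

from panda3d.core import GeomVertexReader, LVector3

def average_vertex_position(vdata):
    # Walk every vertex and accumulate an average position; on a
    # large mesh rebuilt every frame, this loop is the likely cost.
    reader = GeomVertexReader(vdata, "vertex")
    total = LVector3(0, 0, 0)
    count = 0
    while not reader.isAtEnd():
        total += LVector3(reader.getData3())
        count += 1
    return total / max(count, 1)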

As for automatic UV generation, that I don’t know, I’m afraid, and so will leave to others!

Thanks for your suggestions! I looked into setTexProjector but it requires nodepath’s and not geometries. In my case both the render location and the nodepath located at 0,0,0. It’s the internal GeomNode data that is actually being changed.

I’m looking into just converting the rotation from the vertices into the nodepath. I’ll update later with how much of a performance impact it actually is.


@Thaumaturge I’ve tried doing the following, but as you predicted there was too much of a performance penalty.

Otherwise, my main thought would be to examine your vertices and determine from them an average position that could be used as their node-position, but I worry that doing so might incur performance issues.

Do you have any other suggestions? Could I manually calculate the UV map for the texture, or would that also perform poorly?

Hum. Tricky.

You might be able to do something like this in a shader without too great a performance penalty, I think.

The trick remains, of course, doing so in a way that is insensitive to the overall position of the object, while still covering the vertices of said object…
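As a very rough sketch of the sort of thing that I mean (untested; it planar-maps each vertex’s local X/Z, offset by a reference point that you would update from Python each frame, so that the mapping ignores where the vertices happen to sit):

from panda3d.core import Shader, LVector3

VERT = """
#version 150
uniform mat4 p3d_ModelViewProjectionMatrix;
uniform vec3 ref_pos;    // reference point, updated from Python each frame
uniform float uv_scale;  // model-space units per texture repeat
in vec4 p3d_Vertex;
out vec2 texcoord;
void main() {
    gl_Position = p3d_ModelViewProjectionMatrix * p3d_Vertex;
    // Map relative to the reference point, so the result is
    // insensitive to the overall position of the vertices.
    texcoord = (p3d_Vertex.xz - ref_pos.xz) * uv_scale;
}
"""

FRAG = """
#version 150
uniform sampler2D p3d_Texture0;
in vec2 texcoord;
out vec4 frag_color;
void main() {
    frag_color = texture(p3d_Texture0, texcoord);
}
"""

face_np.setShader(Shader.make(Shader.SL_GLSL, vertex=VERT, fragment=FRAG))
face_np.setShaderInput("uv_scale", 1.0)
# Each frame, after rebuilding the geometry ("first_vertex" is a placeholder):
face_np.setShaderInput("ref_pos", LVector3(first_vertex))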

Hum. One more thought, and it’s arguably a little silly, but it might be good enough: what if you take just the first vertex in the model and treat that as indicating your object’s world-space position?

This wouldn’t be strictly accurate, of course, but it might be close enough to serve!
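Reading that first vertex back out should be cheap; something like this (a sketch, with “vdata” again standing in for your GeomVertexData):

from panda3d.core import GeomVertexReader

# Read just the first vertex of the rebuilt vertex data and treat it
# as a stand-in for the object's world-space position.
reader = GeomVertexReader(vdata, "vertex")
first_vertex = reader.getData3()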

@Thaumaturge first of all thank you so much for your help with this!

Hum. One more thought, and it’s arguably a little silly, but it might be good enough: what if you take just the first vertex in the model and treat that as indicating your object’s world-space position?

This ended up being the easiest solution! My general implementation is as follows:

from panda3d.core import LVector3, NodePath, TextureStage

def normalized(*args):  # helper from the procedural-geometry sample
    vec = LVector3(*args)
    vec.normalize()
    return vec

# The nose-tip vertex (index 4 in the tracker output) stands in for the
# object's world-space position.
reference_nose_vertex = LVector3(centered_pos_x[4], centered_pos_y[4], centered_pos_z[4])
reference_nose_normal = normalized(2 * reference_nose_vertex.getX() - 1,
                                   2 * reference_nose_vertex.getY() - 1,
                                   2 * reference_nose_vertex.getZ() - 1)

# A temporary node at that position, oriented along the normal, supplies
# the texture transform relative to render.
temp_node_path = NodePath("temp")
temp_node_path.setPos(reference_nose_vertex)
temp_node_path.lookAt(reference_nose_vertex + reference_nose_normal)

self.face_model_np.setTexTransform(TextureStage.getDefault(),
                                   self.render.getTransform(temp_node_path))

I’m running into a slightly different issue where the texture doesn’t look quite right, because the model changes but the texture is static. Do you happen to know of an easy way to stretch the texture to match the updated model? No worries if that is out of the scope of Panda3D. Thanks again for your help!


I’m glad that it seems to have worked out! :)

Hmm… If you had some means of marking parts of the model as coming from locations in a “reference model” (perhaps simply the first copy of the model that you generate, or perhaps something hand-made), then you might texture according to the “reference model” rather than the actual model. That should, I think, produce a more consistent mapping.
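As a sketch of what I’m imagining (assuming that your GeomVertexFormat includes a “texcoord” column, and with “reference_vertices” standing in for that first or hand-made copy): compute the UVs from the unchanging reference positions and write them into each rebuilt mesh, so that the texture “sticks” to the surface as it deforms.

from panda3d.core import GeomVertexWriter

def write_reference_uvs(vdata, reference_vertices, uv_scale=1.0):
    # UVs are derived from the fixed reference positions, not the
    # current (deformed) ones, so the mapping stays consistent.
    uv_writer = GeomVertexWriter(vdata, "texcoord")
    for ref in reference_vertices:
        # A simple planar mapping; any fixed mapping would do.
        uv_writer.setData2(ref.getX() * uv_scale, ref.getZ() * uv_scale)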