Yes, that should be possible. First, render the object to an offscreen buffer and capture the result as a texture. See the “render to texture” sample program for how to do this.
Then, projectively apply the texture to a model of a sidewalk. I don’t think we have a sample program showing projective texturing, but it shouldn’t be too hard… I think it’s just a matter of calling setTexture on the sidewalk, putting the texture in decal mode, and applying a TexGenAttrib to generate the projective texture coordinates.
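To make the “projective” part concrete, here’s a rough sketch of the math that projective texture-coordinate generation performs, in plain NumPy rather than Panda3D (the function names `look_at`, `perspective`, and `projective_uv` are my own, not engine API): each world-space point is transformed into a projector’s clip space, perspective-divided, and remapped to a [0, 1] UV, exactly as if a slide projector were shining the texture onto the geometry.

```python
import numpy as np

def look_at(eye, target, up):
    """View matrix looking from eye toward target (right-handed)."""
    f = target - eye
    f = f / np.linalg.norm(f)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)
    u = np.cross(r, f)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = r, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def perspective(fov_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    t = 1.0 / np.tan(np.radians(fov_deg) / 2)
    m = np.zeros((4, 4))
    m[0, 0] = t / aspect
    m[1, 1] = t
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

def projective_uv(world_point, view, proj):
    """UV the projector assigns to a world point: transform into the
    projector's clip space, perspective-divide, remap [-1,1] -> [0,1]."""
    p = proj @ view @ np.append(world_point, 1.0)
    ndc = p[:3] / p[3]              # the perspective divide
    return (ndc[:2] + 1.0) / 2.0

# A projector hovering 5 units above the origin, looking straight down.
eye = np.array([0.0, 0.0, 5.0])
view = look_at(eye, np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
proj = perspective(60.0, 1.0, 0.1, 100.0)

# A point directly under the projector lands at the texture's center.
print(projective_uv(np.array([0.0, 0.0, 0.0]), view, proj))  # -> [0.5 0.5]
```

In Panda3D terms, this is roughly what happens under the hood when the projector’s lens drives the texture coordinates: the sidewalk’s vertices get UVs as seen from the projector’s point of view.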
Whoa, sounds like I need to do some research. I was planning on setting up some sensor systems to track the user’s location so that they could move around the platform, and as long as they were about the right height it would look like a hologram. Does projective texturing just (conceptually) pass a light ray through one object and land it on another object?
Maybe I misunderstood what you were trying to do. I thought you were trying to make a scene containing a sidewalk, and the sidewalk contains a picture that looks 3D if you’re standing in about the right place, but it looks like a weird distorted thing if you stand anywhere else.
In that case, the important thing to realize is that it’s just a picture on a sidewalk. Plain old chalk drawing — also known as plain old texture mapping. The challenge, however, is that painting such a texture by hand is too difficult for most people.
So what I was basically suggesting was a way to automatically generate the texture. Once you have the texture, though, it’s ordinary texture mapping.
So the idea behind the render-to-texture business is that it’s the part where you generate the texture. Then the projective texture mapping is the part where you take that texture and put it on the sidewalk in the right place.
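To make the texture-generation step concrete, here’s a small NumPy sketch of the geometry behind a chalk-drawing-style anamorphic picture (the function name `project_to_ground` and the sample coordinates are mine, purely for illustration): for each point on the virtual 3D object, find where the line from the viewer’s eye through that point hits the sidewalk plane; painting the point’s colour there makes it appear to float at the original 3D position when seen from that eye point.

```python
import numpy as np

def project_to_ground(eye, point, ground_z=0.0):
    """Where the ray from the viewer's eye through a virtual 3D point
    meets the sidewalk plane z = ground_z."""
    d = point - eye
    if abs(d[2]) < 1e-12:
        raise ValueError("ray is parallel to the ground plane")
    t = (ground_z - eye[2]) / d[2]
    if t <= 0:
        raise ValueError("ground is behind the viewer for this point")
    return eye + t * d

eye = np.array([0.0, -4.0, 1.7])   # viewer's eye, 1.7 m up on the platform
foot = np.array([0.0, 2.0, 0.0])   # base of a virtual 1 m object, on the ground
tip = np.array([0.0, 2.0, 1.0])    # top of the same object

print(project_to_ground(eye, foot))  # base stays put: [0. 2. 0.]
print(project_to_ground(eye, tip))   # tip gets smeared away from the viewer
```

This is also why the picture only works from roughly the right spot: move the eye and the rays no longer line up with where the colours were painted, so the image collapses into the distorted smear.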