Creating an off-axis projection in Panda3D

Hey everyone,

I’ve been working on this project for the past couple of months and my scouring of the internet has led me here.

What I am trying to do is create an off-axis projection which is controlled by the user's position in space.

The idea is similar in nature to this video, which is an example of what I am hoping to achieve.

vimeo.com/9977433

So I have looked into doing this using just OpenGL, but I cannot really get my head around the high-level coding to perform it. I've only been coding in C++ for around five months, so this entire project is a majorly steep learning curve for me; I just hope the curve doesn't end with a cliff :slight_smile:

I am already aware that to create this distortion of the object I will need an off-axis projection, which means constructing an asymmetric viewing frustum.

So I was wondering if someone could give me a few pointers on how best to approach this within Panda3D.
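
In case it helps show where my head is at, here is my current rough guess at what the per-frame update might look like, pieced together from the Panda3D API reference (completely untested; the screen dimensions, the coordinate conventions, and the idea of using the lens film offset are all my own assumptions):

#include "pandaFramework.h"
#include "perspectiveLens.h"

// Physical screen size, in whatever units the tracker reports (placeholders).
static const float SCREEN_W = 0.5f;
static const float SCREEN_H = 0.3f;

// Per-frame update.  head_x / head_z are the eye's offset from the screen
// centre (right / up), head_y is its distance in front of the screen plane,
// all in the screen's coordinate frame with the screen sitting at y = 0.
void update_off_axis(NodePath camera, PerspectiveLens *lens,
                     float head_x, float head_y, float head_z) {
  // Put the virtual camera where the tracked eye is.  Note: no look_at();
  // the camera stays parallel to the screen, and the film offset below is
  // what makes the frustum asymmetric.
  camera.set_pos(head_x, -head_y, head_z);

  // Make the lens's film rectangle coincide with the physical screen:
  // film size = screen size, focal length = eye-to-screen distance, and the
  // film shifted by minus the eye's in-plane offset.
  lens->set_film_size(SCREEN_W, SCREEN_H);
  lens->set_focal_length(head_y);
  lens->set_film_offset(-head_x, -head_z);
  lens->set_near_far(0.01f, 100.0f);
}

int main(int argc, char *argv[]) {
  PandaFramework framework;
  framework.open_framework(argc, argv);
  WindowFramework *window = framework.open_window();

  // Something to look at (the stock sample model; substitute your own).
  window->load_model(framework.get_models(), "environment");

  PT(PerspectiveLens) lens = new PerspectiveLens;
  window->get_camera(0)->set_lens(lens);

  // Example: eye 0.1 to the right of the screen centre, 0.6 in front of it.
  update_off_axis(window->get_camera_group(), lens, 0.1f, 0.6f, 0.0f);

  framework.main_loop();
  framework.close_framework();
  return 0;
}

The idea would be to call update_off_axis() every frame with the latest tracker reading, but I have no idea whether this is the intended way to do it in Panda3D.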

This may sound daft, but I have literally been attempting to get a working concept of this for months using OpenGL.

Another solution to perform this, a bit of a cheat but one I think might work, would be to:

Place a camera at the user's XYZ position, pointing at an object.

Take this frame and project it as a texture onto a plane (the plane and the projecting camera are in the same positions as the above camera and object).

An orthographic camera vertically above the plane captures the projected image to be rendered to the screen.

I believe either system will work, and for the sake of this project I don't think I mind which one. The second, although more processor-intensive, is likely to be the easier to implement, as I can actually get my head around it :smiley:.
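
For what it's worth, here is roughly how I imagine the second approach would be wired up in Panda3D, sketched from the API reference (untested; the buffer size, card dimensions, and camera placement are placeholders, and setup_projected_cheat / head_pos / target are just names I've made up for illustration):

#include "pandaFramework.h"
#include "cardMaker.h"
#include "orthographicLens.h"

// Wires up the render-to-texture "cheat" once at start-up.  head_pos is the
// tracked eye position and target the object being looked at.
void setup_projected_cheat(WindowFramework *window,
                           const LPoint3f &head_pos, const NodePath &target) {
  NodePath render = window->get_render();

  // 1. Off-screen buffer rendered from a camera at the user's position.
  PT(GraphicsOutput) buffer =
      window->get_graphics_output()->make_texture_buffer("head_view", 1024, 1024);
  PT(Camera) head_cam = new Camera("head_cam");
  NodePath head_cam_np = render.attach_new_node(head_cam);
  buffer->make_display_region()->set_camera(head_cam_np);
  head_cam_np.set_pos(head_pos);
  head_cam_np.look_at(target);

  // 2. A flat card standing in for the physical screen; the buffer's texture
  //    is projected onto it from the same camera position.
  CardMaker cm("screen_plane");
  cm.set_frame(-1, 1, -1, 1);
  NodePath plane = render.attach_new_node(cm.generate());
  plane.set_p(-90);  // lay the card flat so it can be viewed from above
  PT(TextureStage) stage = new TextureStage("proj");
  plane.project_texture(stage, buffer->get_texture(), head_cam_np);

  // 3. Swap the default lens for an orthographic one and park the main
  //    camera directly above the plane, looking straight down.
  PT(OrthographicLens) ortho = new OrthographicLens;
  ortho->set_film_size(2, 2);
  window->get_camera(0)->set_lens(ortho);
  NodePath main_cam = window->get_camera_group();
  main_cam.set_pos(0, 0, 5);
  main_cam.set_hpr(0, -90, 0);
}

The head_cam position would then be updated every frame from the tracker output, but again, this is just how I picture it rather than something I have running.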

Hoping someone can help me figure this out. I'm currently working on completing the tracking system, which should output an XYZ coordinate of the user anywhere within the scene.

Thanks!!

Hi,
Interesting.
Are you tracking the user's precise eye position?

Hi Jean-Claude,

The system is much like Track-IR and FreeTrack, so it will be tracking the user's head movements using a marker-based headset. I am still working to get proper values from this system; however, the formulas are the same ones employed by both of the above systems, so I am confident they will be sufficient.

Khaled

So basically you'll provide the head's 3D position in the room.
BTW, how will you cope with lighting?

I am testing my system in a light-isolated area, so the only light is from the screen and the IR markers. Although this is far from practical, I can look to improve the image sensing once I have proved the concept and that everything else works.

Hi again,

(1) If you want to use C++, I suggest you have a look at OpenGL's glFrustum():

void glFrustum(GLdouble left,
               GLdouble right,
               GLdouble bottom,
               GLdouble top,
               GLdouble nearVal,
               GLdouble farVal);

You'll have to specify the near plane, the far plane, and the coordinates of the projection-area corners with respect to the observer. This is the way to get an off-axis projection in OpenGL.
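
For example, something along these lines (just a sketch to illustrate the idea; the screen half-extents, and the convention that the screen sits in the z = 0 plane with the head at distance head_z in front of it, are assumptions you would adapt to your set-up):

#include <GL/gl.h>

// Physical screen half-extents, in the same units as the head position
// (placeholder values).
static const double SCREEN_HALF_W = 0.25;
static const double SCREEN_HALF_H = 0.15;

// head_x / head_y: eye offset from the screen centre (right / up),
// head_z: eye distance in front of the screen plane (z = 0).
void apply_off_axis_projection(double head_x, double head_y, double head_z) {
  const double near_val = 0.01;
  const double far_val  = 100.0;

  // Scale the screen edges back from the screen plane to the near plane
  // (similar triangles), measured relative to the eye.
  const double s = near_val / head_z;
  const double left_v   = (-SCREEN_HALF_W - head_x) * s;
  const double right_v  = ( SCREEN_HALF_W - head_x) * s;
  const double bottom_v = (-SCREEN_HALF_H - head_y) * s;
  const double top_v    = ( SCREEN_HALF_H - head_y) * s;

  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glFrustum(left_v, right_v, bottom_v, top_v, near_val, far_val);

  // glFrustum assumes the eye sits at the origin looking down -Z, so the
  // modelview matrix must translate the scene by minus the eye position.
  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();
  glTranslated(-head_x, -head_y, -head_z);
}

The important point is that the camera never rotates towards the object: it stays parallel to the screen, and the asymmetry of the frustum does all the work.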

(2) Now, let me ask: what is specific about anamorphosis transformations? Would you have a reference paper on it, or a thesis?

jean-claude

Thanks for your reply, Jean-Claude,

(1)…

Everything I have managed to read seems to lead to using OpenGL and glFrustum().

I have previously posted on the OpenGL forum asking for a practical explanation of glFrustum(), as I'm finding it quite hard to get my head around the implementation.

I have a theoretical understanding of how it could be manipulated to create an asymmetric viewing frustum (required for off-axis projection), but I have not managed to manipulate any sample code to effectively get the view to distort.

(2)…
I'm not sure I understand what you are asking. I believe the anamorphic distortion is a result of the off-axis projection; the image distortion is essential in providing the corrected view from the user's real-time perspective.

Thanks again :slight_smile: