Tracking Data Strategy

Greetings, I have been developing a tracking system using the new spatial Phidget (1056 3/3/3) for user orientation and WiFi tracking for user position.

When it comes to choosing a strategy for feeding the 6DoF data into Panda, I am puzzled about how to do it in a thread-safe way.

Does anyone have a tip (or, better yet, documentation) on how I should get my (yaw, pitch, roll) and (x, y, z) data into Panda in real time?

I should mention that once this real-time data stream is accessible in Panda, I do understand how to apply it to cameras and other objects for transforms.

So far I have been looking into VRUI. I read the thread about VRPN input and that is a possible option too.

This is for a grad-school project at Indiana University; feel free to private message me too. Thank you for reading this.

I am really sorry, but I have no clue how this works. Did you try to contact the Phidgets authors directly?

I have talked with the Phidget guys, and their device works very well for generating real-time orientation data. That data gets paired with position data generated by a separate system I created that triangulates the user’s position by measuring signal strength from pre-positioned WiFi transmitters.

The above system is refined and in a testing phase.
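
For anyone curious about the position side, here is a rough, illustrative sketch of the kind of RSSI-based triangulation I am describing. It is not my actual code; the path-loss constants, anchor coordinates, and helper names are just placeholders:

```python
# Illustrative only: estimate (x, y) from WiFi signal strength using a
# log-distance path-loss model and a linearized least-squares trilateration.
# The constants (p0, n) and the anchor layout below are assumed values.
import numpy as np

def rssi_to_distance(rssi_dbm, p0=-40.0, n=2.5):
    # p0: expected RSSI at 1 m; n: path-loss exponent (environment-dependent)
    return 10.0 ** ((p0 - rssi_dbm) / (10.0 * n))

def trilaterate(anchors, distances):
    # Subtract the first circle equation from the rest to get a linear system
    (x1, y1), d1 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2.0 * (xi - x1), 2.0 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    (x, y), *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return x, y

# Example: three anchors at known positions and their measured RSSI values
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rssi = [-55.0, -62.0, -60.0]
print(trilaterate(anchors, [rssi_to_distance(r) for r in rssi]))
```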

My question is really: “How do I best import real-time tracking data into Panda3D in a thread-safe way?”

What I have tested in Panda so far: I directly fed the “current” (x, y, z) measurement into the camera’s setPos() and the “current” (yaw, pitch, roll) measurement into its setHpr(), so that the camera’s position and orientation would be linked to the user. I assumed this would simply update every time the Panda3D script naturally looped. When the script is run, it works for a few seconds and then throws a runtime error. On one of the other threads about Phidgets I read that this is because directly importing real-time data into Panda with the Phidget library is not thread-safe.
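
To be concrete about what I mean by “applying it to the camera”, the per-frame update itself is just something like the sketch below. Here get_latest_pose() is only a stand-in that returns a fixed pose, not a real Phidget call; the thread-safety problem is about how the real data gets into it:

```python
from direct.showbase.ShowBase import ShowBase
from direct.task import Task

def get_latest_pose():
    # Stand-in for the real tracker read; returns a fixed pose for this sketch.
    return 0.0, -10.0, 2.0, 0.0, 0.0, 0.0

class TrackerDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        self.disableMouse()  # take manual control of the camera
        self.taskMgr.add(self.update_camera, "update-camera")

    def update_camera(self, task):
        # Runs once per frame on Panda's main thread
        x, y, z, h, p, r = get_latest_pose()
        self.camera.setPos(x, y, z)
        self.camera.setHpr(h, p, r)
        return Task.cont

TrackerDemo().run()
```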

I had a feeling that this method wouldn’t work. Generally in VR we use a “tracking server” that sends the tracking data over TCP/IP or the LAN to a client daemon on the PC running the VR/game graphics engine; the daemon places the data into shared memory so that the graphics engine can reference it and redraw the scene graph accordingly. I have never done this in Panda3D yet.
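
As a sketch of that pattern in Panda3D (purely illustrative; the UDP port and the “x y z h p r” packet format are assumptions, not any real protocol), the idea would be to let a background thread receive the data and only touch plain Python variables, while a Panda task on the main thread applies the latest pose to the camera:

```python
import socket
import threading

from direct.showbase.ShowBase import ShowBase
from direct.task import Task

class PoseReceiver:
    """Background thread that only writes plain Python data; it never
    touches the scene graph, so only Panda's main thread does that."""
    def __init__(self, port=5555):
        self._lock = threading.Lock()
        self._pose = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
        self._sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self._sock.bind(("", port))
        threading.Thread(target=self._listen, daemon=True).start()

    def _listen(self):
        while True:
            data, _addr = self._sock.recvfrom(1024)
            try:
                values = tuple(float(v) for v in data.decode().split())
            except ValueError:
                continue  # ignore malformed packets
            if len(values) == 6:
                with self._lock:
                    self._pose = values

    def latest(self):
        with self._lock:
            return self._pose

class TrackedCamera(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        self.disableMouse()  # take manual control of the camera
        self.receiver = PoseReceiver()
        self.taskMgr.add(self.apply_pose, "apply-pose")

    def apply_pose(self, task):
        # Scene-graph calls happen only here, on Panda's main thread.
        x, y, z, h, p, r = self.receiver.latest()
        self.camera.setPos(x, y, z)
        self.camera.setHpr(h, p, r)
        return Task.cont

TrackedCamera().run()
```

I imagine the same structure would work if the data source is the Phidget event handler instead of a socket: the handler just stores the latest sample under the lock, and the task applies it each frame.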