Rendering a pixel stream with python in panda?


Just started using Panda3D and it’s really neat, but I was wondering: what’s the best practice for rendering a rapidly changing pixel stream using the Python API?

Specifically, I’m using the Kinect Python wrapper to read a constant stream of data (RGB + depth for each pixel). Effectively I have a memoryview with the pixel data in it, which is updated constantly by a thread running in the background. (Yes, I’m running Panda under Python 2.7; is that bad? It seems to work perfectly.)
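Since that buffer is being mutated by a background thread, it’s worth taking an atomic snapshot of each frame before handing it to the renderer, so a half-written frame never gets drawn. A minimal sketch (the class and its names are my own illustration, not part of the Kinect wrapper):

```python
import threading


class FrameBuffer(object):
    """Holds the latest frame; the writer thread calls push(),
    the render loop calls snapshot()."""

    def __init__(self):
        self._lock = threading.Lock()
        self._latest = None

    def push(self, mv):
        # Writer thread: copy the memoryview so later in-place
        # mutation by the Kinect thread can't race the reader.
        with self._lock:
            self._latest = bytes(mv)

    def snapshot(self):
        # Render thread: immutable copy of the most recent frame (or None).
        with self._lock:
            return self._latest


fb = FrameBuffer()
fb.push(memoryview(bytearray(b"\x01\x02\x03")))
frame = fb.snapshot()  # b"\x01\x02\x03"
```

The copy in `push` is the key point: a `bytes` object can be handed to the renderer without worrying about the writer thread changing it underneath.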

Anyhow, what’s the best way to render this?

I see there’s an API for this, but I can’t see any examples of how to use it with a pixel stream. I can render it frame by frame, generating a new texture each time, but that doesn’t seem to be a good solution (and it’s slow).

I was wondering if there is some way to use the 2D rendering capabilities to draw this as a background?

This is obviously for an augmented reality application, so the 3D rendering has to take place on top of the rendered background image.

Any help, or suggestions on where to look for examples of this sort of thing, would be appreciated.


Hi, welcome to Panda!

If your pixel stream registers as a webcam, you can use Panda’s built-in webcam access; in the case of the Kinect, I’m not sure whether the driver does that.
Instead, you may find this technique more useful: