FFGL for Panda. Is this possible?

I would ideally like to be able to embed instances of the Panda engine in FFGL plugins… is this possible?

If not, could Panda3D be made to support FFGL output?
There are a number of open-source examples of this being done; here are a few:
vvvv.org/contribution/directx-freeframegl-bridge
ni-mate.com/ni-mate-v1-1-rel … -bpm-2012/ (unclear where source is)

In the end I want to integrate Panda3D with Isadora, troikatronix.com/

any guidance would be great… thanks.

DusX

Perhaps you could explain a bit more about what these systems are and do and how they interact with other applications?

These systems are media control packages; probably the best known is Apple's Quartz Composer, but many others exist and are widely used for creating interactive video for art, theatre, concert visuals, and more.

FFGL is a plugin standard that is widely used by these software packages… freeframe.sourceforge.net/
Syphon is a Mac-only option for sharing video between software packages… syphon.v002.info/

I use a software package called Isadora, which can communicate via OSC, MIDI, serial, HID, TCP/IP, FFGL, and Syphon. However, I am on a Windows machine, so I cannot use Syphon.

The link I included is to vvvv, another visual programming package that supports similar I/O. vvvv is Windows only… and has seen a lot of FFGL development. The link provides information and source files for an FFGL plugin that shares video from vvvv to Resolume (a popular package for live visuals at nightclubs and concerts), and outlines the changes needed to support other software.

I hope that helps you help me :wink:

So what role would Panda3D play in this? Would it use the FFGL API to render content so that it can be used by these systems? Or is it the other way around - would the FFGL API control Panda3D’s rendering features?

In what environment/circumstances would Panda3D run? Would it run as its own process as normal, or would it be embedded in some way in some sort of plugin architecture?

Sorry for the questions - maybe someone else understands these types of systems better and might be better able to help, because I don’t understand the majority of the terms you’re using. :frowning:

I primarily imagined Panda3D outputting video via one of these methods.
In the case of FFGL, it could also accept some control parameters so that, for instance, Panda3D's render could be controlled via data from a Microsoft Kinect (via Isadora).
However, with FFGL it is also possible to input a video stream (more common than not) and then process it before output. In that case the incoming video might be mapped by Panda3D onto a series of 3D objects as a texture, and the rendered video would then be output back to Isadora (a rough sketch of that idea follows below).
My understanding is that Panda3D has very few dependencies, so I hoped this would be possible with Panda.
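Roughly what I have in mind on the Panda side, as a minimal sketch: an ordinary video file stands in for the FFGL input, and "input.avi" and "scene.egg" are just placeholder names.

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import MovieTexture

base = ShowBase()

# Stand-in for the video feed coming from Isadora: a video file read
# through Panda's movie-texture machinery.
tex = MovieTexture("isadora-feed")
if not tex.read("input.avi"):
    raise IOError("could not open video")

# Map the video onto a 3D object as a texture.
model = base.loader.loadModel("scene.egg")
model.reparentTo(base.render)
model.setTexture(tex, 1)

tex.setLoop(True)
tex.play()

base.run()
```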

So FFGL is mainly a bridge library that would be used to pass video data into Panda3D? And you would imagine that Panda3D acts as some sort of host for plugins that process the video data via OpenGL?

Normally, video input would be implemented by subclassing MovieVideoCursor (like FfmpegVideoCursor or OpenCVVideoCursor), but in this case, it would seem to require interaction with the OpenGL API.
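For reference, that normal route looks roughly like this from Python (the file name is a placeholder; the point is that MovieVideo picks an appropriate cursor subclass behind the scenes):

```python
from panda3d.core import MovieVideo

# MovieVideo.get() returns a MovieVideo implementation for the file, and
# open() hands back the matching MovieVideoCursor subclass
# (FfmpegVideoCursor for most formats).
video = MovieVideo.get("input.avi")
cursor = video.open()
print(cursor.sizeX(), cursor.sizeY(), cursor.length())
```

An FFGL source has no such cursor, though, because the frames only exist as the result of OpenGL rendering.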

Hmm, then you would need to put your code into the draw thread so that it can make OpenGL calls. Panda has a feature for this called “draw callbacks”, which would allow this perfectly, though it is typically implemented at the node or DisplayRegion level, and not at the texture level. Well, perhaps that would make the most sense anyway - to implement some sort of draw callback on the texture level that allows you to call arbitrary OpenGL code before a texture is used.
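For example, a node-level draw callback looks roughly like this (a minimal sketch; the callback body is where the raw OpenGL or FFGL host code would go):

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import CallbackNode, PythonCallbackObject

base = ShowBase()

def draw_ffgl(cbdata):
    # This runs in Panda's draw thread with the GL context current, so raw
    # OpenGL calls (or an FFGL host) could be issued here.
    cbdata.upcall()  # then let Panda carry on with its normal drawing

node = CallbackNode("ffgl-draw-hook")
node.setDrawCallback(PythonCallbackObject(draw_ffgl))
base.render.attachNewNode(node)

base.run()
```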

Perhaps we should add some sort of virtual draw_callback method to Texture that takes a TextureContext and would allow a specific implementation (FFGLTexture?) to do preprocessing on the texture before Panda renders it. In this case, you would then use that opportunity to take the FFGL input and let the plugins perform all their operations on the texture.
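Purely as a sketch of that proposed (currently non-existent) interface, such an FFGLTexture might look something like this:

```python
from panda3d.core import Texture

# Hypothetical: Texture has no draw_callback hook today.  This only
# illustrates the shape the subclass might take if one were added.
class FFGLTexture(Texture):
    def draw_callback(self, texture_context):
        # An implementation would hand the texture to the FFGL plugin chain
        # here, letting it render into the texture with raw OpenGL before
        # Panda uses it.
        pass
```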

I still only have a vague understanding of this system, and I'm just thinking out loud here, but I hope I made some sense. I'm willing to help you wherever I'm able, though.