Hello, I have a question about Panda3D: can I use 2 GPUs?
For example, could I load my first model on my AMD GPU and my second model on my Intel graphics card?
The question may be stupid, but with PyOpenCL, for example, I can run computations in parallel on multiple GPUs.
That can’t be done with any engine (that I know of).
OK, and is it possible with SLI or Crossfire?
Sure, you should be able to use multiple GPUs in Panda. There's no inherent limitation in Panda that would prevent you from creating two different graphics contexts (windows or offscreen buffers) on different graphics cards. In Panda, you can even use different graphics APIs (DirectX and OpenGL) simultaneously if you were so inclined.
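To illustrate, here is a rough, untested sketch of opening a second window on a separate graphics pipe from Python. The pipe module name ("pandadx9") assumes a Windows build of Panda3D with the DirectX backend compiled in, and note that which physical GPU ends up servicing each context is ultimately decided by the OS and driver, not by Panda:

```python
# Sketch (untested): two windows on two different graphics pipes in Panda3D.
from direct.showbase.ShowBase import ShowBase
from panda3d.core import GraphicsPipeSelection, WindowProperties


class TwoPipeDemo(ShowBase):
    def __init__(self):
        # The first window opens on the default pipe (typically OpenGL).
        ShowBase.__init__(self)

        # Ask for a second pipe from a different backend module.
        # "pandadx9" is an assumption; it only exists on Windows builds.
        selection = GraphicsPipeSelection.get_global_ptr()
        dx_pipe = selection.make_module_pipe("pandadx9")
        if dx_pipe:
            props = WindowProperties()
            props.set_size(640, 480)
            props.set_title("Second context (DirectX)")
            # openWindow() accepts an explicit pipe, so this window gets
            # its own GSG created on that pipe.
            self.second_win = self.openWindow(props=props, pipe=dx_pipe)

        # Each window can then be given its own scene graph and models.


if __name__ == "__main__":
    TwoPipeDemo().run()
```

Whether the two contexts actually land on different physical GPUs depends on how the drivers expose them; Panda itself only sees the pipes the system offers.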
I have never used SLI/crossfire, so I wouldn’t know about that.
This should work fine. SLI and Crossfire work at the driver level, and the most common form is AFR (alternate frame rendering), which does what it sounds like: it gives alternate frames to each GPU in turn.
From a programming perspective, your code does not need to know about SLI or Crossfire, although you can optimize things if you do. In DX11 you create a single ID3D11Device object and swap chain as usual. Most likely Panda3D already does this normally and it just works.
You don’t need to know about DXGI adapters unless you want to use the GPUs independently, outside of the normal SLI/Crossfire frameworks. You’d do that for the Intel+DiscreteGPU case, for example, but you can’t do this when SLI is enabled, because the driver takes control of the second GPU.
Maybe Panda3D will support this once Vulkan multi-GPU support arrives.