Panda3D and its use cases in GUI

Greetings! It has been a while since I last posted in the community! Recently, I have been trying to develop an OS in Python. Operating systems generally require low-level programming languages such as C or C++, but, due to a lack of interest in learning those languages, I’ve decided to start from Linux and create an autorun script that launches my Python scripts at boot. I tried using Pygame to build a GUI and window manager, but I struggled because Pygame’s rendering is CPU-bound and does not use the GPU. My question today is whether Panda3D (which I know uses the GPU) can handle such tasks: creating surfaces, making low-level use of the GPU, and building interactive graphical and 2D elements for my OS’s GUI.

Of course, Panda3D can handle the task. It all depends on what your requirements are; for example, whether you need a huge number of buttons. Without details it’s hard to say more. However, there is an easier way.
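For a sense of what this looks like, a minimal 2D GUI in Panda3D can be built with DirectGUI, which renders through the GPU like the rest of the scene. A rough sketch (the button label and callback are just placeholders; Panda3D must be installed, and the imports are deferred so the function can be defined without it):

```python
def build_demo_gui():
    """Sketch of a minimal Panda3D 2D GUI (requires panda3d installed).

    Imports are deferred so the function can be defined without Panda3D
    present; calling it actually opens a window.
    """
    from direct.showbase.ShowBase import ShowBase
    from direct.gui.DirectGui import DirectButton, DirectFrame

    app = ShowBase()

    # A flat 2D panel parented to aspect2d, Panda3D's 2D scene graph.
    panel = DirectFrame(frameColor=(0.2, 0.2, 0.2, 1),
                        frameSize=(-0.6, 0.6, -0.4, 0.4))

    # An interactive button; command= is called on click.
    DirectButton(parent=panel,
                 text="Click me",        # placeholder label
                 scale=0.1,
                 pos=(0, 0, 0),
                 command=lambda: print("button pressed"))

    app.run()  # starts the main loop


if __name__ == "__main__":
    build_demo_gui()
```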

It’s great to see you again, Serega! Thanks for explaining! However, Panda3D has limitations for my needs and might require additional code in other languages such as C. Therefore, I may switch to PyQt5, with some workarounds and Panda3D integration. Thanks for the help!

By the way, Pygame also supports using the GPU.

Thanks, Serega! While the Pygame shaders you pointed me to run on the GPU, other parts of Pygame, such as event handling and certain drawing operations, remain CPU-bound. But I am still grateful for the information!

Keep in mind that without preliminary setup code running on the CPU, it is not possible to get results from the GPU. This is also true for Panda3D.
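To illustrate that point: even a pure-GPU effect needs CPU-side setup. In Panda3D, a sketch of that setup might look like this (the shader file names and the `tint` input are hypothetical placeholders; Panda3D must be installed, so the import is deferred):

```python
def apply_shader_to_node(node_path):
    """Sketch: CPU-side code that hands work to the GPU (requires panda3d).

    The shader itself runs on the GPU, but loading it, attaching it to a
    node, and uploading its inputs all happen on the CPU first.
    """
    from panda3d.core import Shader

    # Hypothetical shader files; the paths are placeholders.
    shader = Shader.load(Shader.SL_GLSL,
                         vertex="my_shader.vert",
                         fragment="my_shader.frag")
    node_path.set_shader(shader)

    # Uniform inputs are also uploaded from the CPU side.
    node_path.set_shader_input("tint", (1.0, 0.5, 0.5, 1.0))
```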

You could maybe give panda3d-kivy a try:

GitHub - Cheaterman/panda3d-kivy: Panda3D add-on for Kivy integration.

It’s simple and powerful for building a nice GUI.

@serega-kkz That is true. However, PyQt5 renders its output using both the CPU and the GPU, while Pygame is a wrapper around SDL that only touches the GPU in small amounts; it was only in later versions of SDL that the GPU was used more. Thanks!
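Since I mentioned switching to PyQt5, for reference, a minimal PyQt5 window with an interactive button looks roughly like this (a sketch; PyQt5 must be installed, the import is deferred, and the window title and button label are placeholders):

```python
import sys


def run_qt_demo():
    """Sketch of a minimal PyQt5 GUI (requires PyQt5 installed).

    The PyQt5 import is deferred so this module loads without PyQt5 present.
    """
    from PyQt5.QtWidgets import (QApplication, QPushButton,
                                 QVBoxLayout, QWidget)

    app = QApplication(sys.argv)

    window = QWidget()
    window.setWindowTitle("Demo")  # placeholder title

    layout = QVBoxLayout(window)
    button = QPushButton("Click me")  # placeholder label
    button.clicked.connect(lambda: print("button pressed"))
    layout.addWidget(button)

    window.show()
    sys.exit(app.exec_())


if __name__ == "__main__":
    run_qt_demo()
```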

@Melan Thanks. I have used Kivy before, but I need a more widely known library with a richer feature set, and Kivy doesn’t seem to offer that.