Hi, I’m running into an issue incorporating multiprocessing into my game. Almost everything in the game is procedurally generated. It’s quite CPU-heavy, and I don’t want to run it on the main thread, where it would block the UI. Instead I implemented a background job manager that receives requests from the main thread. Each requested task is stored in a multiprocessing.Queue, from which the child processes fetch their work. When a child completes a task, it returns the result along with the associated task id to the manager via an output Queue. The whole thing works very well.
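To make the setup concrete, here is a stripped-down sketch of the manager pattern I described (the names `worker`, `run_jobs`, and the doubling "work" are made up for illustration; the real tasks are procedural generation):

```python
from multiprocessing import Process, Queue

def worker(task_queue, result_queue):
    # Each child process loops, fetching (task_id, payload) tasks;
    # a None task_id is the shutdown sentinel.
    while True:
        task_id, payload = task_queue.get()
        if task_id is None:
            break
        # Stand-in for the real procedural-generation work.
        result_queue.put((task_id, payload * 2))

def run_jobs(n_tasks, n_workers=2):
    tasks, results = Queue(), Queue()
    procs = [Process(target=worker, args=(tasks, results), daemon=True)
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    for i in range(n_tasks):
        tasks.put((i, i))
    # Collect results keyed by task id (they may arrive out of order).
    out = dict(results.get() for _ in range(n_tasks))
    for _ in procs:
        tasks.put((None, None))
    for p in procs:
        p.join()
    return out

if __name__ == '__main__':
    print(run_jobs(4))  # {0: 0, 1: 2, 2: 4, 3: 6}
```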
I was trying to share an ad-hoc build, so I ran python3 setup.py bdist_apps. When I launched the resulting binary, every time the game spawned subprocesses, instead of running their task loop they each executed the if __name__ == '__main__': logic and started a new instance of the game (one per subprocess). Each of those then spawned its own subprocesses, recursing until my laptop crashed.
Reproducible example (have a pkill -f ready in another terminal if you build and run the binary):
```python
from time import sleep
from direct.showbase.ShowBase import ShowBase
from multiprocessing import Process

def subprocess():
    while True:
        sleep(5)

class App(ShowBase):
    def __init__(self):
        super().__init__()
        self.subprocess = Process(target=subprocess, daemon=True)
        self.subprocess.start()

if __name__ == '__main__':
    App().run()
```
Is this a known issue, and can I somehow work around it?
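For what it's worth, the stdlib docs mention multiprocessing.freeze_support() for exactly this symptom in frozen executables (a spawned child re-running the main script). I'm not sure whether bdist_apps binaries support it, but would guarding the entry point like this be the right direction? (subprocess here is the worker loop from my example above.)

```python
from multiprocessing import Process, freeze_support
from time import sleep

def subprocess():
    while True:
        sleep(5)

if __name__ == '__main__':
    # freeze_support() must come first under the guard: in a frozen
    # binary it detects that this process was launched as a
    # multiprocessing child and runs the child's code, instead of
    # falling through and starting the whole game again.
    freeze_support()
    p = Process(target=subprocess, daemon=True)
    p.start()
    # App().run() would go here in the real game.
```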