UDP server sucks all available CPU time

I’m pretty new to Panda3D, but I’ve been working on a UDP connection manager and got a basic version working: read/send tasks on the server and the clients. The problem is that the server consumes all available CPU while a client is “connected”. I started simplifying the code to see what causes this. Below are overly simplified versions:

Server:

from panda3d.core import QueuedConnectionManager
from direct.directbase.DirectStart import *

# Just open a UDP port; no reader, no read task, nothing else.
manager = QueuedConnectionManager()
connection = manager.openUDPConnection(10001)

run()
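
For completeness, the read task I stripped out follows the usual QueuedConnectionReader pattern, roughly like this (a sketch rather than my exact code; the names are illustrative):

from panda3d.core import QueuedConnectionManager, QueuedConnectionReader, NetDatagram
from direct.distributed.PyDatagramIterator import PyDatagramIterator
from direct.directbase.DirectStart import *
from direct.task import Task

manager = QueuedConnectionManager()
reader = QueuedConnectionReader(manager, 0)
connection = manager.openUDPConnection(10001)
reader.addConnection(connection)  # watch this socket for incoming datagrams

def readTask(task):
    # Drain every datagram that arrived since the last frame.
    while reader.dataAvailable():
        datagram = NetDatagram()
        if reader.getData(datagram):
            iterator = PyDatagramIterator(datagram)
            print(iterator.getString())
    return Task.cont

taskMgr.add(readTask, "serverReadTask")

run()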

Client:

from panda3d.core import QueuedConnectionManager, ConnectionWriter, NetAddress
from direct.distributed.PyDatagram import PyDatagram
from direct.directbase.DirectStart import *

manager = QueuedConnectionManager()
writer = ConnectionWriter(manager, 0)

datagram = PyDatagram()
datagram.addString("Hello, world!")

server = NetAddress()
server.setHost("127.0.0.1", 10001)

# Send one datagram to the server from an ephemeral local UDP port.
writer.send(datagram, manager.openUDPConnection(), server)

run()

So even without any functionality on the server, it consumes all the CPU once the client sends the datagram. When I quit the client, usage drops back to the normal ~10%. I also noticed that instead of the usual ~60 FPS, the task loop runs at around 300 FPS.

Is this normal behavior? Should I expect that my game server will always max out the CPU?

By coincidence, there’s a conversation in another thread right now that may give you the answer: [client-sleep]
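
In short, the trick discussed there is the “client-sleep” config variable, which tells ShowBase to yield a little CPU each frame instead of spinning. Something like this (a sketch; you can also put the same line in your Config.prc):

from panda3d.core import loadPrcFileData

# Sleep ~1 ms per frame instead of busy-waiting; the value is in seconds.
loadPrcFileData('', 'client-sleep 0.001')

# Set the variable before starting ShowBase so it is picked up at init.
from direct.directbase.DirectStart import *

run()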

David

Perfect! That’s exactly what I was looking for. Thanks :)