I had to recollect my memories about this problem (it's still fuzzy, so please correct me if I say something wrong):
Panda3D never supported HiDPI on macOS; it relied on Cocoa/AppKit to perform the default upscaling on Retina screens (usually 2x, or a fractional factor like 1.8x). And starting with Catalina, the default value of the configuration flag changed, triggering the problem and the fix mentioned above.
If an application does not enable HiDPI support, the following mechanism occurs: the application writes to the framebuffer as if one point in the framebuffer were one pixel on screen; Cocoa then maps the framebuffer onto the backing store using the requested backing scale factor (as set on the window object), and finally scales the store again to the physical screen resolution.
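The non-HiDPI path can be sketched with plain arithmetic (this is an illustration of the mechanism, not actual Cocoa or Panda3D code; the function names and the 2.0 scale factor are assumptions for the example):

```python
# Hypothetical model of the non-HiDPI path: the app renders as if one
# point were one pixel; Cocoa then upscales by the backing scale factor.

def app_framebuffer(window_points):
    # HiDPI support disabled: framebuffer size == window size in points.
    return window_points

def backing_store(fb, scale):
    # Cocoa maps the framebuffer onto the backing store by multiplying
    # with the window's backing scale factor (2.0 on most Retina screens).
    w, h = fb
    return (int(w * scale), int(h * scale))

win = (800, 600)              # window size in view units (points)
fb = app_framebuffer(win)     # 800x600: the app is unaware of Retina
bs = backing_store(fb, 2.0)   # 1600x1200: upscaled, hence the soft look
print(bs)
```

The second scaling step (backing store to physical screen resolution) works the same way, which is why a non-HiDPI app never renders at the panel's native pixel density.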
If HiDPI support is enabled, AppKit allocates the framebuffer (and, it seems, can resize it!) according to the actual screen resolution and backing scale factor. Also, depending on the API, backing store units or view units must be used; otherwise you get magnified or distorted rendering. NSView methods use view units, but some OpenGL functions use backing store units (though not all).
So, to support HiDPI, `wantsBestResolutionOpenGLSurface` must be set to YES, but then some calls to OpenGL functions, e.g. `glViewport`, `glScissor`, …, must use coordinates converted from view units to backing units. And if I understand the doc correctly, this conversion can not be cached, nor done the other way around (using backing store units everywhere and converting to view units when interacting with AppKit), as the size and scale of the framebuffer can change when the display configuration changes or when the window is moved across screens.
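The conversion itself is just a multiplication by the backing scale factor, which is essentially what `convertRectToBacking:` does. A minimal sketch, modeled without AppKit (the function and variable names here are illustrative, not real API):

```python
# Hypothetical sketch of the view -> backing conversion that must precede
# glViewport/glScissor when wantsBestResolutionOpenGLSurface is YES.
# convertRectToBacking: essentially multiplies by the backing scale factor;
# we model it with plain arithmetic.

def convert_rect_to_backing(rect, backing_scale):
    x, y, w, h = rect
    s = backing_scale
    return (x * s, y * s, w * s, h * s)

# The scale must be re-queried every frame: it changes when the display
# configuration changes or the window moves to another screen, so the
# converted rect cannot be cached.
backing_scale = 2.0               # e.g. the window's backingScaleFactor
view_bounds = (0, 0, 800, 600)    # NSView bounds, in view units (points)
px = convert_rect_to_backing(view_bounds, backing_scale)
# glViewport(*map(int, px))       # pass pixels (backing units), not points
print(px)
```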
When using Qt, you are actually bypassing all the view and window management of Panda3D, as you render the scene to a texture that is then mapped using the correct coordinate units by Qt.
See the Apple Developer Documentation, in particular "Optimizing OpenGL for High Resolution".