[updated] there is no retina support on mac (catalina / big sur)

I’m in the “replace debug panels with pretty stuff” phase of my project, so it’s time to get retina support working. My buffer’s not getting set up at the native screen resolution – is there something I need to set to make that happen?

I see that base.win.supports_pixel_zoom()=False

Image: full resolution on window frame, but reduced resolution in window content (osx_bad_buffer_resolution)

We are currently disabling HiDPI on Cocoa. See related issues:

I think what we need to do is offer a switch to have the same behaviour as we do on Windows (i.e. with dpi-aware)—rendering at full resolution by default while allowing an opt-in to DPI scaling. @el-dee may also wish to weigh in on this.

Feel free to open a new enhancement request in the issue tracker for this.

OK… so short-term: if I do a local build that comments out [_view setWantsBestResolutionOpenGLSurface:NO]; (and figure out how to make a distributable app with a local build of Panda3D…), I’d get full resolution, but potentially have issues when switching to and from fullscreen mode?

Issue opened: add opt-in for retina/hiDPI support on OSX · Issue #1186 · panda3d/panda3d · GitHub

Looking at my test, I have a workaround: use QPanda3D and subclass QPanda3DWidget.

Code: subclass of QPanda3DWidget with an explicit scale factor
from panda3d.core import Texture, Point2
from PyQt5.QtCore import QRectF
from PyQt5.QtGui import QImage
from QPanda3D.QPanda3DWidget import QPanda3DWidget

class QPan(QPanda3DWidget):
    def __init__(self, panda3DWorld, scale=2):
        # Render the offscreen buffer at scale x the widget size
        # (2 matches the usual Retina backing scale factor).
        self.explicit_scale = scale
        super().__init__(panda3DWorld)

    def resizeEvent(self, evt):
        # Keep the lens film size proportional to the widget size,
        # but make the offscreen buffer explicit_scale times larger.
        lens = self.panda3DWorld.cam.node().get_lens()
        lens.set_film_size(
            self.initial_film_size.width() * evt.size().width() / self.initial_size.width(),
            self.initial_film_size.height() * evt.size().height() / self.initial_size.height())
        self.panda3DWorld.buff.setSize(
            evt.size().width() * self.explicit_scale,
            evt.size().height() * self.explicit_scale)

    # Use the paint event to pull the contents of the panda texture to the widget
    def paintEvent(self, event):
        if self.panda3DWorld.screenTexture.mightHaveRamImage():
            self.panda3DWorld.screenTexture.setFormat(Texture.FRgba32)
            data = self.panda3DWorld.screenTexture.getRamImage().getData()
            img = QImage(data,
                         self.panda3DWorld.screenTexture.getXSize(),
                         self.panda3DWorld.screenTexture.getYSize(),
                         QImage.Format_ARGB32).mirrored()
            self.paintSurface.begin(self)
            # Draw the oversized texture back down to widget size, so the
            # widget shows a full-resolution image on a Retina display.
            sz = Point2(self.panda3DWorld.screenTexture.getXSize(),
                        self.panda3DWorld.screenTexture.getYSize()) / self.explicit_scale
            self.paintSurface.drawImage(QRectF(0, 0, *sz), img)
            self.paintSurface.end()
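For what it’s worth, the arithmetic in resizeEvent above boils down to two independent scalings, which can be sketched without Qt (the function and tuple layout below are my own, for illustration only):

```python
def resize_sizes(initial_film, initial_size, new_size, explicit_scale=2):
    """Mirror the arithmetic of resizeEvent: the lens film size tracks
    the widget size (keeping the view proportional), while the offscreen
    buffer is explicit_scale times the widget size, so the scene is
    rendered at Retina resolution."""
    film = (initial_film[0] * new_size[0] / initial_size[0],
            initial_film[1] * new_size[1] / initial_size[1])
    buff = (new_size[0] * explicit_scale, new_size[1] * explicit_scale)
    return film, buff
```

So halving the widget from 800x600 to 400x300 halves the film size, while the buffer stays at twice the widget dimensions.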


I had to dig up my memories of the problem (it’s still fuzzy, so please correct me if I say something wrong):

Panda3D never supported HiDPI on macOS; it relied on Cocoa/AppKit to perform the default upscaling on Retina screens (usually 2x, or 1.8x). (And, starting with Catalina, the default value of the configuration flag changed, triggering the problem and the fix mentioned above.)

If an application does not enable HiDPI support, the following happens: the application writes to the framebuffer as if one point in the framebuffer were one pixel on screen; Cocoa then maps the framebuffer onto the backing store using the requested backing scale factor (as set on the window object), and finally scales the store again to the physical screen resolution.
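As a concrete illustration of that path (numbers are made up; Cocoa does this internally, the helper below only mirrors the arithmetic):

```python
def non_hidpi_backing_size(view_size, backing_scale):
    """When HiDPI is NOT enabled, the app draws at view size
    (1 point = 1 pixel) and Cocoa stretches that image onto the
    backing store by the backing scale factor, losing sharpness."""
    return (round(view_size[0] * backing_scale),
            round(view_size[1] * backing_scale))
```

An 800x600 view on a 2x Retina screen is thus rendered at 800x600 and stretched to a 1600x1200 backing store, which is exactly the reduced-resolution window content shown in the screenshot above.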

If HiDPI support is enabled, AppKit allocates the framebuffer (and, it seems, can later resize it!) according to the actual screen resolution and backing scale factor. Also, depending on the API, either backing store units or view units must be used, otherwise you get magnified or distorted rendering: NSView methods use view units, while OpenGL functions typically use backing store units (but not all of them).

So, to support HiDPI, wantsBestResolutionOpenGLSurface must be set to YES, but then some OpenGL calls, e.g. glViewport, glScissor, …, must use coordinates converted from view units to backing units. And if I understand the docs correctly, this conversion cannot be cached, nor done the other way around (using backing store units everywhere and converting to view units when interacting with AppKit), as the size and scale of the framebuffer can change when the display configuration changes or when the window is moved across screens.
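In Cocoa this conversion is what convertRectToBacking: does; in plain arithmetic it is just a multiplication by the backing scale factor. A minimal sketch for the glViewport case (helper name is hypothetical, not a Panda3D or Cocoa API):

```python
def view_rect_to_backing(x, y, w, h, backing_scale):
    """Convert a rectangle from view units to backing-store units, as
    needed for glViewport/glScissor when wantsBestResolutionOpenGLSurface
    is YES. Per the point above, this must be recomputed on every use:
    backing_scale can change at runtime when the window moves between
    screens or the display configuration changes."""
    return (int(x * backing_scale), int(y * backing_scale),
            int(w * backing_scale), int(h * backing_scale))
```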

When using Qt, you are actually bypassing all of Panda3D’s view and window management: you render the scene to a texture, which Qt then maps to the screen using the correct coordinate units.

See Apple Developer Documentation and Optimizing OpenGL for High Resolution