spherical display advice needed (fisheye rendering)

Sorry, I had intended to write some code snippet for you, but I have not been able to find the time for it.

Hmm, yeah, I’d forgotten about that base.saveSphereMap. It appears to render to a cube map (using a camera with six lenses, the typical cube map rendering set-up) and then uses a FisheyeMaker, which creates a card with 3D (UVW) texture coordinates that map the cube map onto a circular card with a fisheye projection applied, similar to what you described. This would not require a shader.

I was not aware of this functionality. It might make things easier for you.
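To make that mapping concrete, here is a minimal pure-Python sketch of a standard equidistant fisheye mapping, assuming a 180-degree field of view and Panda's Z-up, Y-forward convention. The exact parameterization FisheyeMaker uses may differ, so treat this as an illustration of the principle rather than its implementation:

```python
import math

def fisheye_uvw(x, y, fov_degrees=180.0):
    """Map a point (x, y) on the unit disc of a fisheye card to a 3D
    direction vector suitable for sampling a cube map. Equidistant
    fisheye: distance from the disc center is proportional to the
    angle away from the view axis."""
    r = math.hypot(x, y)
    if r > 1.0:
        return None  # outside the fisheye circle
    if r == 0.0:
        return (0.0, 1.0, 0.0)  # disc center looks straight ahead (+Y)
    angle = r * math.radians(fov_degrees) / 2.0  # angle off the view axis
    s = math.sin(angle) / r
    return (x * s, math.cos(angle), y * s)
```

With a 180-degree FOV, the rim of the disc (r = 1) maps to directions 90 degrees off axis, so the whole card covers a hemisphere.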

Thank you. I’m trying to learn more but it seems like a grey area.

One last thing: can the cube map camera positions be customized? For most scenes on an inverse sphere display like mine, a setup like this would make more sense:

than this:

(top and bottom cameras intentionally hidden to make the illustrations simpler)

I looked in the ShowBase source code to see what saveSphereMap actually does. Turns out it’s pretty simple; here is a complete but simple example:

from panda3d.core import *
from direct.showbase.ShowBase import ShowBase

size = 512
numVertices = 1000

base = ShowBase()

# Load a simple scene.
env = loader.loadModel("environment")
env.reparentTo(render)

# Now make the cube map buffer.
rig = base.camera.attachNewNode("rig")
buffer = base.win.makeCubeMap("test", size, rig)
assert buffer

# Generate the fisheye card and apply the cube map to it.
fm = FisheyeMaker('card')
fm.setNumVertices(numVertices)
fm.setSquareInscribed(1, 1.1)
fm.setReflection(1)
card = base.render2d.attachNewNode(fm.generate())
card.setTexture(buffer.getTexture())

# Disable the scene render on the normal 'render' graph.
base.camNode.setActive(False)

base.run()
This attaches the rig to base.camera, which you can simply point upward as you wish.

You may want to change the setReflection flag depending on whether you want an inverted or a regular view.

As for customizing the camera positions: sure, you can change the individual positions of the cameras that makeCubeMap attaches to the rig, although I’m not sure how you could get a continuous image with the set-up you described.


Does the FisheyeMaker do anything other than generate a subdivided circle with correct UVs?
I think I’d like to use my own subdivided and UV-mapped circle loaded from an egg file and attached to aspect2d, as I would be able to do geometric correction in my 3D editor without writing shader code. Unless the FisheyeMaker does some other magic.
EDIT: I did try loading my own egg, but the UVs are not right; I can’t figure out how they are laid out on the FisheyeMaker-generated geometry. I even dumped a bam file from the NodePath generated by FisheyeMaker, converted it to egg, then obj, loaded it in my 3D editor and re-exported to egg. The UVs get messed up at some point, as this doesn’t look right either.

As for the code, a screenshot of a frame looks nice in the visualizer, although I did do some geometric correction by editing the UVs.

As for the real deal, I’ll need to process the globe some to make it more diffuse (i.e. a brighter projection surface).

I think, although it will be clearer when I see it myself, that a radial blur shader might be needed to make the resolution on the edges of the globe and the top look similar. By radial I mean getting blurrier away from the center of the screen.

Yeah, me neither. Even with the side cameras being seamless, something different will need to be done with the top and bottom cameras for the globe display to make sense. For example, rendering a planet would be fine, but with a landscape the top part will display the ground, which won’t make sense. I’ll need to think about this some more.

And BTW when changing the background color and moving the camera this happens:

Something isn’t getting cleared. Maybe I found a general Panda bug, or it could be that something extra needs to be done on the buffer.

You should probably set a color clear on your display region, otherwise the color buffer won’t get cleared. I think it’s called setClearColorActive and setClearColor.

Which display region?

Some update.
I was thinking of having a way to play equirectangular YouTube video (pre-rendered animations) from inside Panda as well.
It looks surprisingly good.

Scrolling/rotating left-right was easy, like here

but up/down seemed impossible with this kind of texture.

Then I thought of this dead simple solution: instead of wrapping a rectangular texture to a circle, map it to an inverse UV sphere and put the fisheye camera inside the sphere. To rotate the camera angle, rotate the sphere.
Simple, huh?

You can also use setTexHpr to rotate the texture:

card.setTexHpr(TextureStage.getDefault(), (H, P, 0))

For example, add this to the code example I gave earlier to scroll horizontally:

def rotateTask(task):
    time = globalClock.getFrameTime() * 36
    card.setTexHpr(TextureStage.getDefault(), (time, 0, 0))
    return task.cont

base.taskMgr.add(rotateTask, 'rotateTask')

You could then just scroll vertically by altering the P instead of the H.

This works because cube map texture coordinates are 3D coordinates containing direction vectors. They can be transformed like any other vector.
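As an illustration of that point, here is a minimal pure-Python sketch of rotating such a direction vector around the vertical axis, which is essentially what the H component of setTexHpr does (the sign convention here is an assumption; Panda's may differ):

```python
import math

def rotate_heading(uvw, degrees):
    """Rotate a cube-map texture coordinate (a 3D direction vector)
    around the vertical (Z) axis, leaving its Z component untouched."""
    x, y, z = uvw
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    return (x * c - y * s, x * s + y * c, z)
```

Changing P instead would be the same idea with a rotation about a horizontal axis.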

That’s what I tried at first, but of course it didn’t work with pre-rendered equirectangular textures. In an equirectangular texture the right side ends where the left side starts, but that’s not the case with the top and bottom.
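That asymmetry can be seen directly in the equirectangular mapping itself. A quick pure-Python sketch (the exact UV convention is an assumption): longitude covers a full circle, so the U coordinate is periodic and wraps, while latitude only covers half a circle and the V coordinate has hard ends at the poles.

```python
import math

def equirect_uv(direction):
    """Map a 3D direction vector to equirectangular (u, v) in [0, 1].
    Longitude (u) wraps around; latitude (v) does not."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, y)       # -pi .. pi, periodic
    lat = math.asin(z / length)  # -pi/2 .. pi/2, clamped at the poles
    return lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5
```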

This is turning out pretty nice. I appreciate all the help; the community is great here.

A couple of finishing touches:

  1. Does anyone have a radial blur filter somewhere (blurring depending on the distance from the screen’s center point)? That would help keep the “focus” uniform across the globe.

  2. Some way to do geometric correction on the circle the fisheye texture is rendered onto. I tried making my own in my 3D editor, but I don’t understand how the Panda one is UV-mapped.

  1. Hmm, do you actually mean to radially blur, like this?

Or do you want a regular gaussian blur, but lessen the effect depending on the distance?

  2. I’m not sure what kind of geometric correction you are talking about. The coordinates on the FisheyeMaker card are UVW coordinates; most 3D editing programs can only work with UV coordinates. As said earlier, the coordinates for sampling a cube map are vectors, otherwise described as points on a unit sphere.

No, I meant something like this:

I don’t know if it has a standard definition; it was called “circle blur” in one place.

I’ll try to explain it in a simple way: if you have a circle with multiple rings on which the fisheye texture is applied, “geometric correction” on it could mean ‘resizing’ the rings.
If this wasn’t clear, I can make a simple animation to illustrate what I mean.
EDIT: Here’s an illustration:

This could easily be done in a 3D editor if I could import the FisheyeMaker card into it, edit it a bit and re-export. But like I said, bam2egg -> egg2whatever -> 3D editor destroys the UVs. And I don’t know how the UVs are laid out, so I can’t use my own egg for the fisheye texture. I tried one mapping which worked great in the 3D editor, but that’s not how FisheyeMaker does it, as I see rubbish instead of the proper texture.
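For what it’s worth, the “resizing the rings” operation doesn’t necessarily need a 3D editor. Here is a pure-Python sketch of the idea: move each vertex of the subdivided disc radially by an arbitrary remap function while leaving its angle (and hence its UVW texture coordinate) untouched. The correction curve shown is a made-up example, not a real calibration:

```python
import math

def correct_vertex(x, y, remap):
    """'Resize the rings' of a subdivided fisheye disc: move the vertex
    at (x, y) radially to distance remap(r) from the center, keeping
    its angle (and therefore its texture coordinate) unchanged."""
    r = math.hypot(x, y)
    if r == 0.0:
        return (0.0, 0.0)
    r2 = remap(r)
    return (x * r2 / r, y * r2 / r)

# Hypothetical correction curve: push outer rings slightly outward.
corrected = correct_vertex(0.3, 0.4, lambda r: r ** 0.8)
```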

  1. Hmm, it seems like you can get this effect with a standard Gaussian blur (like Panda’s blur filter), lerping between the original and the blurred version based on a factor computed from the distance to the center. Or by scaling the radius of the kernel based on this factor.

  2. Hmm, seems tricky; I can’t say I have a good idea as to how this could be done. Perhaps you can use FisheyeMaker to generate a mesh, call writeBamFile(), use bam2egg, and then manipulate the vertices or the UVWs in some way.

Exporting to a 3D program would indeed likely destroy the UVWs. Perhaps it would be easier to make a Python script to distort UVWs based on a function of their W value, or something of the sort.
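The lerp idea from point 1 above, reduced to a single pixel value, looks something like this pure-Python sketch (a shader would do this per fragment; the linear falloff is an assumption, any falloff curve works):

```python
import math

def radial_mix(sharp, blurred, x, y, max_r=1.0):
    """Blend a sharp and a blurred pixel value based on the distance of
    (x, y) from the screen center: fully sharp in the middle, fully
    blurred at (and beyond) radius max_r."""
    t = min(math.hypot(x, y) / max_r, 1.0)
    return sharp * (1.0 - t) + blurred * t
```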

I have an idea. How about rendering twice, once with no filter and once with a blur filter, applying both as textures and using a grayscale texture as a mask? I don’t know exactly how to do this without destroying the framerate, but it might work.

You only render it once to a texture, then use that texture as an input to a blur shader (these tend to be two-stage shaders, one horizontal and one vertical, and you can also downscale the texture to 1/2 or 1/4 to give the shader less work). The final stage takes the original texture, the blurred one, and some extra texture telling it how to mix them (or a point in screen space and some extra math).

It’s not that much work for the GPU.
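The two-stage structure described above can be sketched in pure Python on a grayscale image stored as a list of rows. In practice both passes run as fragment shaders on the GPU; the edge handling here clamps, and the example kernel is a made-up 3-tap approximation of a Gaussian:

```python
def blur_1d(row, kernel):
    """One pass of a separable blur along a sequence, clamping at the
    edges; kernel weights are assumed to sum to 1."""
    half = len(kernel) // 2
    n = len(row)
    return [
        sum(row[min(max(i + k - half, 0), n - 1)] * w
            for k, w in enumerate(kernel))
        for i in range(n)
    ]

def separable_blur(img, kernel):
    """Horizontal pass over each row, then vertical pass over each
    column: the two-stage blur described above."""
    horiz = [blur_1d(row, kernel) for row in img]
    cols = [blur_1d(list(col), kernel) for col in zip(*horiz)]
    return [list(row) for row in zip(*cols)]
```

A uniform image stays uniform under a normalized kernel, which is a quick sanity check for the weights.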

How can I get current rendered frame and apply it as texture?

FilterManager is your friend; there’s a good page on that in the manual:
www.panda3d.org/manual/index.php/Genera … ge_Filters

I don’t think shaders are necessary. There’s a buffer.getTexture() method. If I could somehow hide the default buffer, I could create a new buffer/scene/camera and apply buffer.getTexture() to a custom circle I made.
Currently the fisheye geometry is attached to aspect2d; what this would do is attach it to an offscreen buffer node and attach a custom fisheye geometry to aspect2d instead, so I could work with the UVs on that.
Not sure how, though.

Hi! Just read through this thread. Can’t say I can help much,
but wanted to wish you success in your endeavours! :slight_smile:

If you could provide an actual output image of your work and simply use paint.net (www.getpaint.net, highly recommended) to illustrate what you want, it would be easier to understand. It would most likely be much faster to do in shaders, either as a post-processing effect or on the viewport, before you apply it onto the sphere.

Your example blur image looks like the opposite of what your words described. You either want the center of attention blurred (your example) or you want a fake depth-of-field effect (rdb’s example). Either way, you would be better off using a shader, because you most likely won’t be very happy with moving around the vertices of the sphere, both performance-wise and in terms of visual quality.

If you could create an example from your program’s output, that’d be helpful. :slight_smile: