Sync camera animation speed to video

Hi,

I’m trying to play a camera animation that matches the speed of a video playing in the background.

For the video I’m using:

video = OnscreenImage(parent=aspect2d, image=video_path)
video.setScale(2, 1, 2)
video.setPos(0, 1, 1)
base.cam.node().getDisplayRegion(0).setSort(20)

I’m not sure if that’s the right way of doing it? ‘OnscreenImage’ is supposed to be used for images, but it apparently works with videos too. The only issue I have is that the scale/position doesn’t match my window. If I use an image like a .png, the aspect is fine, so there seems to be a problem getting the right aspect with videos. That’s why I added the ‘setScale’ and ‘setPos’ lines, but the video still doesn’t perfectly fill my window. Is there a better, cleaner way of doing this? (My window has the same aspect ratio as the video.)

For the camera animation, it’s a bit more custom. I have a list of positions, and I’m reading the animation this way:

reading_speed = 1.5
cam_index = round(round(task.time * 100) / reading_speed)

cam = positions[0][cam_index]

It works well for playing the animation, but I don’t have precise control over the speed. How can I play both at exactly the same FPS, so that the video’s frame number matches the camera position index?

Thank you for your help!

For these purposes it is better to use CardMaker or just a mesh, but you also need to set the correct aspect ratio, which you can get like this:

base.getAspectRatio()

video = OnscreenImage(parent=aspect2d, image=video_path)
# In aspect2d, x runs from -aspect to +aspect, so stretch the card horizontally:
video.setScale(base.getAspectRatio(), 1, 1)

As for the second problem, you can get the current playback position of the video with .getTime().

Something like this:

from direct.showbase.ShowBase import ShowBase
from panda3d.core import NodePath, CardMaker, MovieTexture

class Game(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)

        video_logo = "PandaSneezes.ogv"
        
        tex = MovieTexture("logo")
        
        assert tex.read(video_logo)
        
        cm = CardMaker("screen")
        cm.setFrameFullscreenQuad()
        cm.setUvRange(tex)
        
        card = NodePath(cm.generate())
        card.reparentTo(render2d)
        card.setTexture(tex)
        
        self.sound = loader.loadSfx(video_logo)
        tex.synchronizeTo(self.sound)
        self.sound.play()
        
        taskMgr.add(self.getTime, 'getTime')
        
    def getTime(self, task):
        print(self.sound.getTime())
        if self.sound.status() != 2:  # 2 == AudioSound.PLAYING; stop once playback ends
            return task.done
        return task.cont

game = Game()
game.run()

No sound:

from direct.showbase.ShowBase import ShowBase
from panda3d.core import NodePath, CardMaker, MovieTexture

class Game(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)

        video_logo = "PandaSneezes.ogv"
        
        self.tex = MovieTexture("logo")
        
        assert self.tex.read(video_logo)
        
        cm = CardMaker("screen")
        cm.setFrameFullscreenQuad()
        cm.setUvRange(self.tex)
        
        card = NodePath(cm.generate())
        card.reparentTo(render2d)
        card.setTexture(self.tex)

        taskMgr.add(self.getTime, 'getTime')
        
        self.length_video = self.tex.getVideoLength()
        
    def getTime(self, task):
        print(self.tex.getTime())
        if self.tex.getTime() >= self.length_video:
            self.tex.stop()
            return task.done
        return task.cont

game = Game()
game.run()
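To tie this back to the original question: once you can read the video’s playback time, mapping it to a camera position index is just arithmetic. Here’s a minimal sketch in pure Python, assuming (hypothetically) that your positions list was sampled at a known fixed rate, e.g. 60 samples per second:

```python
def camera_index_for_time(video_time, sample_rate, num_positions):
    """Map the video's playback time (seconds) to an index into the
    recorded camera positions, clamping at the last entry so the
    camera never runs past the end of the animation."""
    index = int(video_time * sample_rate)
    return min(index, num_positions - 1)

# Inside your task, something like (tex being the MovieTexture):
#   cam_index = camera_index_for_time(tex.getTime(), 60, len(positions[0]))
#   cam = positions[0][cam_index]
```

Because the index is derived from the video clock rather than task.time, the camera stays locked to the video even if the frame rate stutters.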

Thanks a lot for your quick reply and the code! It fully answers my question and works perfectly :ok_hand: