ARToolKit problem under Windows XP

Hello. I recently wanted to try ARToolKit with Panda because I found it interesting. I searched the whole forum, got ThomasEgi's sample from this thread https://discourse.panda3d.org/viewtopic.php?t=6069, and modified it to run under my Windows XP. The problem is that the virtual object doesn't appear at the center of the pattern; there is a screenshot below. I also found some threads that talk about this problem, e.g. https://discourse.panda3d.org/viewtopic.php?t=6508, but no solution. Another poster used a tricky method to adjust the camera's position in https://discourse.panda3d.org/viewtopic.php?t=6069. I don't understand it: I've tried both the original C version of the ARToolKit sample and the OSGART (ARToolKit for OpenSceneGraph) sample, and neither has this problem with the default config. Can anyone explain this?
My other question: the ARToolKit object in Panda only exposes five methods to us. Is that sufficient for more complex things, like detecting multiple patterns? (A sketch of what I mean follows the code below.)
Here is my code:

from pandac.PandaModules import *
loadPrcFileData("", "auto-flip 1") #usualy the drawn texture lags a bit behind the calculted positions. this is a try to reduce the lag.
loadPrcFileData("", "win-size 640 480")

from direct.directbase import DirectStart
from direct.task import Task

option = WebcamVideo.getOption(4) # 640x480 @ 15fps; my camera is a Logitech QuickCam Mini for Notebooks

cursor = option.open()

tex = Texture('movie') 

tex.setTexturesPower2(0)
cursor.setupTexture(tex) 

#create a card which shows the image captured by the webcam.
cm = CardMaker("background-card")
cm.setFrame(-1, 1, 1, -1)
card = render2d.attachNewNode(cm.generate())
card.setTexture(tex)

#set the rendering order manually so the card with the webcam image is rendered behind the scene.
base.cam.node().getDisplayRegion(0).setSort(20)

#load a model to visualize the tracking
axis = loader.loadModel("yup-axis")
axis.reparentTo(render)
axis.setScale(.2)

#initialize artoolkit. base.cam is our camera.
#camera_para.dat is the configuration file for your camera; this one comes with the artoolkit installation.
#the last parameter is the size of the pattern in panda units.
ar = ARToolKit.make(base.cam, "./camera_para.dat", 1)

#attach the model to a pattern so that each call to analyze() updates the model's position relative to the camera
ar.attachPattern("./patt.kanji", axis)
#update the model's position each frame.
def updatePatterns(task):
    if cursor.ready():
        cursor.fetchIntoTexture(0, tex, 0)
        ar.analyze(tex, False)
    return Task.cont

taskMgr.add(updatePatterns, "update-patterns")

run()
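
For my second question about multiple patterns: my guess is that the same attachPattern call can simply be repeated with another trained pattern file and another node, somewhere before run(). Here is an untested sketch of what I mean (patt.hiro is only an example file name from the ARToolKit installation):

#untested sketch: attach a second model to a second pattern file.
smiley = loader.loadModel("smiley")
smiley.reparentTo(render)
smiley.setScale(.2)
ar.attachPattern("./patt.hiro", smiley)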

No one has replied? I still can't figure it out. There must be someone who has had the same problem. Please help.

I had the same problem when I tried it some time ago…

The “tricky method” you described was the only way I got it to work (Thread: discourse.panda3d.org/viewtopic.php?t=6069, Post 10).

Try to move the “videocard” as described and see how it affects the position of your model relative to your marker. Also move the marker closer to and further away from the camera: does your model slide away from your card? And so on. It was trial and error for me and needed some patience, but in the end it worked perfectly!
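
Something along these lines is what I mean (an untested sketch; card is the background card from your code). Bind a few keys so you can nudge the card while watching the marker:

#untested sketch: nudge the background card with the arrow keys and watch how
#the offset between the model and the marker changes.
def nudgeCard(dx, dz):
    card.setPos(card.getX() + dx, 0, card.getZ() + dz)
    print("card offset: %s %s" % (card.getX(), card.getZ()))

base.accept("arrow_left",  nudgeCard, [-0.01, 0.0])
base.accept("arrow_right", nudgeCard, [ 0.01, 0.0])
base.accept("arrow_up",    nudgeCard, [ 0.0,  0.01])
base.accept("arrow_down",  nudgeCard, [ 0.0, -0.01])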

Sorry, I can't explain it better.

About the ARToolKit wrapper:
I'm also very interested in more methods/features that could be wrapped. Do you know of interesting stuff that's missing in Panda? I have seen a diagnostic screen in an AR Flash app where you can see how the image looks with the threshold applied; something like that would be neat…
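
Something like this is what I was thinking of: a rough, untested sketch that binarizes the current webcam frame with PNMImage and loads it into a second texture that could be shown on another card. Reading the texture back in Python is far too slow to do every frame, but it should be fine for an occasional debug view:

#rough, untested sketch of a threshold preview.  sourceTex is the webcam
#texture from the code above (it has a ram image after fetchIntoTexture);
#previewTex is a second Texture applied to a preview card.
from pandac.PandaModules import PNMImage

THRESHOLD = 100   #roughly the same idea as artoolkit's threshold value

def makeThresholdPreview(sourceTex, previewTex):
    img = PNMImage()
    if not sourceTex.store(img):           #copy the texture's ram image out
        return
    for y in range(img.getYSize()):
        for x in range(img.getXSize()):
            v = 1.0 if img.getBright(x, y) > THRESHOLD / 255.0 else 0.0
            img.setXel(x, y, v)            #write back a black/white pixel
    previewTex.load(img)                   #show the binarized frame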

greetingz & good luck

Thank you again, man. It's good to know I'm not alone :) Although there is a workaround, I still want to know why this happens; maybe I should dig into the C++ code.

Have you tested to see if it works with a square window?

Thank god, I was waiting for you, pro. :laughing: Do you mean setting it like

loadPrcFileData("", "win-size 640 640")

It still can't get the right position. Before, the offset seemed to be only horizontal; now it is both horizontal and vertical. I hope you can give an explanation for this. Is it the same on Linux? I haven't tested it on Linux. Thank you, pro.
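
One untested guess I still want to try: the background card stretches the 4:3 video to fill the window, so maybe forcing the 3D lens to the same aspect ratio as the video removes the mismatch. Something like:

#untested guess: make the 3d lens aspect match the 640x480 video shown on the
#fullscreen card, in case the offset comes from an aspect-ratio mismatch.
lens = base.cam.node().getLens()
lens.setAspectRatio(640.0 / 480.0)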

I know this post is old, but hopefully someone can help. I set up my AR just as weihuan did above, but it is super slow; it takes almost a minute for the image to catch up when something moves. Is this a hardware problem on my computer, or is there some variable I can change?
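
Here is a simple untested timing sketch (using the names from the code above) to see whether the capture or the ARToolKit analysis is the slow part:

#untested sketch: time the capture and the analysis separately each frame.
import time

def updatePatternsTimed(task):
    if cursor.ready():
        t0 = time.time()
        cursor.fetchIntoTexture(0, tex, 0)
        t1 = time.time()
        ar.analyze(tex, False)
        t2 = time.time()
        print("fetch %.3fs  analyze %.3fs" % (t1 - t0, t2 - t1))
    return Task.cont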

Hey man, I am looking at your code and I have one question: in what part do you set the fiducial (the path), e.g. the kanji or hiro paths?
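
For reference, in the code above the pattern path only appears in the attachPattern call:

ar.attachPattern("./patt.kanji", axis)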