
I'm attempting to use camera tracking on two similar photos to determine the focal length and orientation and, subsequently, to reconstruct the scene with geometry. I tracked all points manually, since the perspective change is too large and there are only two frames. I appear to have gotten a very good solve, with an average error of 0.1334. However, after orienting the camera (setting the origin, X axis, etc.) and trying to match geometry, I realized the focal length was slightly off.

The focal length that Blender calculated matches the one obtained from fSpy. I also tried a few different focal lengths to verify that I had the lowest error value, and I tried adjusting the focal length and re-solving to get the trackers to match the geometry (it's a simple shape with right angles), but with no luck.

I believe I've had this same issue after almost every successful camera solve I've gotten in the past. In fact, I think I've only ever gotten one decent camera match out of numerous attempts over the last few years.
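(As a side note on why a "slightly off" focal length can still give a low solve error: the solver effectively estimates focal length in pixels, and the millimetre value shown depends on the assumed sensor width, so the same lens can be described by different mm/sensor-width pairs. A minimal sketch of that conversion in plain Python — the function names are illustrative, not Blender API, and it assumes Blender's default 36 mm sensor width:)

```python
# Focal length in mm only has meaning relative to the sensor width:
#   f_px = f_mm / sensor_width_mm * image_width_px
# so different (f_mm, sensor_width) pairs can describe the same camera.

def focal_mm_to_px(f_mm, sensor_width_mm, image_width_px):
    """Convert a focal length in mm to pixels for a given sensor/image width."""
    return f_mm / sensor_width_mm * image_width_px

def equivalent_35mm(f_mm, sensor_width_mm, full_frame_width_mm=36.0):
    """35mm-equivalent focal length (horizontal crop factor)."""
    return f_mm * full_frame_width_mm / sensor_width_mm

# A 24 mm lens on an APS-C sensor (~23.6 mm wide) behaves like ~36.6 mm on full frame:
print(round(equivalent_35mm(24.0, 23.6), 1))  # → 36.6
```

This is why leaving the sensor width at Blender's 36 mm default is usually fine: the solve is still consistent, but the mm number it reports is the 35mm-equivalent rather than the physical focal length.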

Would anyone be willing to take a look at the file for me and tell me what I'm doing wrong? I would really appreciate it! Here are the images I used, as well as some screenshots.

image 01 image 02 screenshot 01 screenshot 02

  • Are the two images made with the same lens? Do you know the sensor size? Why only two images?
    – susu
    Commented Aug 9, 2020 at 22:51
  • The issue here is not the correct lens size, but the placement of the camera (with all of its markers as constraints) in the 3D scene. Since you don't have a tracker set as the origin, and others to determine walls and floor (or X and Y), it is hard to line up geometry with the existing image. Also try using trackers on the ground; those will help you anchor the geometry.
    – no-can-do
    Commented Aug 10, 2020 at 7:35
  • @susu I don't know for a fact that they use the same lens, but I have every reason to believe they do: they appear to have been taken at almost exactly the same time, based on the positions of the clouds, the flag, and the foliage. Also, the lighting seems identical. I don't know the sensor size, but I was under the impression that I could just leave it at 35mm and have Blender (or fSpy) calculate the 35mm-equivalent.
    – gasshadow
    Commented Aug 10, 2020 at 7:46
  • @susu Still images are all I have to go on, and there are only two similar shots of the front of the house. I've seen this use case demoed by Sebastian König himself in an early camera tracking tutorial, so it should be possible.
    – gasshadow
    Commented Aug 10, 2020 at 7:54
  • @no-can-do I do have a tracker set as the origin (the top-right corner of the right-front, second-story window) and I do have the X axis set (the tops of the front windows). There are no suitable features on the ground to track; the ground is barely visible in the second image.
    – gasshadow
    Commented Aug 10, 2020 at 7:59

1 Answer


Delete the trackers with large errors (like the one that tracks a reflection in the window, and the one that has only one keyframe).

Then set the origin, X and Y axes, and solve again.

Set the tracking scene again.

You can always fine-tune placement manually by grabbing any of the tracking markers and moving them so that they align with the grid (presuming, of course, that the house is really square, which most of them are not). Keep in mind that the camera solve acts as a constraint, so moving one element moves everything, including the camera; there is therefore no danger of misplacing any of the tracked elements relative to the camera.
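(The "moving one element moves everything" point can be made concrete: reprojection is invariant under a rigid transform applied to the camera and all tracked points together, which is why re-orienting the solved scene is safe. A small numpy sketch, illustrative only, not Blender API:)

```python
import numpy as np

def project(points_w, R, t, f=1000.0):
    """Pinhole projection of world points given camera rotation R and translation t."""
    cam = (R @ points_w.T).T + t          # world -> camera coordinates
    return f * cam[:, :2] / cam[:, 2:3]   # perspective divide

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3)) + np.array([0, 0, 10.0])  # points in front of the camera
R, t = np.eye(3), np.zeros(3)

# Apply one rigid transform (rotation Rw + translation tw) to the whole scene:
th = 0.3
Rw = np.array([[np.cos(th), -np.sin(th), 0],
               [np.sin(th),  np.cos(th), 0],
               [0, 0, 1]])
tw = np.array([1.0, -2.0, 0.5])
X2 = (Rw @ X.T).T + tw          # move every tracked point
R2 = R @ Rw.T                   # move the camera with them
t2 = t - R2 @ tw
# The 2D projections are unchanged: the solve is preserved under re-orientation.
assert np.allclose(project(X, R, t), project(X2, R2, t2))
```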

[screenshot]

Try to line up markers that belong on the same axis (along a wall, for example).

Then just add a cube and try to match the features. Start with the edge of the wall that is closest to the camera, then move faces to match the height, width and length.

With a minimum of guesswork and some manual re-orienting of the markers, you will get your camera and geometry into the correct place.

For more detail, read: How to align the camera in a solved motion tracked scene?

[screenshot]

  • I deleted the two markers you mentioned. (FWIW, the one in the window wasn't a reflection, but another window seen through this one.) I had already done all the other steps you mentioned, but I tried them again for good measure after deleting those two markers. Unfortunately, the perspective is still off, which is especially apparent when switching frames. I don't mean to sound ungrateful for your help, because I really appreciate it, but it looks like the perspective is slightly off in your screenshot as well. Perhaps I'm asking for too much and have reached Blender's limits here.
    – gasshadow
    Commented Aug 11, 2020 at 13:23
  • I forgot to mention: when I set the axis, it doesn't line up properly. The 'origin' marker does, but not the 'axis' marker; it's always a little bit off of the designated axis. The deviation appears to represent a clockwise rotation of the camera about the axis NOT set, if that makes sense. If I set the X axis, the rotation of the camera/constraint is off by a number of degrees on the Y axis, and vice versa. This appears to be true across multiple camera tracking scenes, and the number of degrees and the direction of rotation do not appear to be consistent. I wonder what this could mean...
    – gasshadow
    Commented Aug 11, 2020 at 14:07
