
Whenever I do camera tracking in Blender, the reconstruction is always zooming out or off axis. I have a good quality camera, but the end result of the camera tracking always has the model sliding. If you have a solution, please tell me.

2 Answers


Here are a few pointers for camera tracking (for more details follow the links in blue text):

1. Prepare your scene carefully before shooting to make tracking and reconstruction easier

Avoid sudden camera movements to prevent blurry footage and rolling shutter artifacts. Blurry, shaky or otherwise distorted video is very hard to track and will result in inaccurate 3D reconstruction.

If your camera has a zoom lens, do not change the focal length during the shot. Blender cannot yet solve shots with a varying focal length.

Include in your scene distinctive, trackable features, and make sure they stay sharp and recognizable throughout the time they are on the screen.

Tracking markers are most effective when they are well distributed and give you a good idea of perspective. There should be some in the foreground and some in the background. Reconstruction is calculated from how different objects move within the frame according to their distance to the camera: objects that are close to the camera will move faster than those far away. Motion tracking works best when this difference in the tracked objects' movement is clear.
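To see why nearby markers move more than distant ones, here is a toy pinhole-projection sketch in Python (the focal length and distances are made-up illustration values, not anything Blender uses):

```python
# Toy pinhole-camera model: horizontal pixel position of a point as the
# camera translates sideways. All names and numbers are illustrative only.

def projected_x(point_x, point_depth, camera_x, focal_px=1000.0):
    """Project a 3D point's x coordinate to pixels for a camera at camera_x."""
    return focal_px * (point_x - camera_x) / point_depth

# Two markers: one 2 m from the camera, one 20 m away.
near_depth, far_depth = 2.0, 20.0

# Move the camera 0.1 m to the right and measure apparent motion in pixels.
near_shift = projected_x(0.0, near_depth, 0.0) - projected_x(0.0, near_depth, 0.1)
far_shift  = projected_x(0.0, far_depth, 0.0) - projected_x(0.0, far_depth, 0.1)

print(near_shift)  # 50.0 px — the near marker moves 10x more
print(far_shift)   # 5.0 px
```

The bigger this difference between foreground and background motion, the more depth information the solver has to work with.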

If your scene has large areas of flat or homogeneous surfaces with few elements to track, or with features that repeat and might confuse the tracker, then make your own tracking markers and place them in the scene. Small pieces of tape or stickers will work wonders, for grass you can use ping pong or golf balls.

Make sure that the tracking points are not all bunched up in just one area of the frame, and that they are placed on different axes; for instance, don't track only the floor but the walls as well.

A common mistake is making very large and featureless markers like these:

(image)

instead of small and widely distributed like these:

(image)

As for the texture on the tracking points, a quick google image search will give you an idea of what kind of patterns other people use:

(image)

(for more details on this topic read this link)

2. Make the Tracking process as accurate as you can

Even though Blender has an option to detect features to be tracked, you'll get better results by placing your own.

Start the tracking procedure by examining the video footage and finding which objects or features of the image are present in most of the shot. Track those first. Then go through the shot and identify features that are sharp and have good contrast or distinctive colors, and track those as well.

If the tracking process stops before the object has disappeared or before the end of the shot, it means that blender cannot accurately track an element. In other words, tracking fails when the tracked element cannot be found accurately within the search box area.

(image)

To display the search box, enable this:

(image)

If the tracked element moves too much from frame to frame and falls outside the search area, you can resize the search box (note that this will make the process slower and use more RAM).

(image)
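As a rough mental model of what happens inside the search box, the tracker slides the pattern over the search area and keeps the best-matching position; if the feature has moved outside that area, no good match exists and tracking stops. This toy Python version uses a simple sum-of-squared-differences match (real trackers are far more sophisticated):

```python
# Minimal sketch of pattern matching inside a search box: slide the pattern
# over the search area and pick the position with the lowest sum of squared
# differences. Purely illustrative, not Blender's actual tracker.

def best_match(search, pattern):
    """Return (row, col) in `search` where `pattern` fits best."""
    ph, pw = len(pattern), len(pattern[0])
    best, best_pos = float("inf"), None
    for r in range(len(search) - ph + 1):
        for c in range(len(search[0]) - pw + 1):
            ssd = sum((search[r + i][c + j] - pattern[i][j]) ** 2
                      for i in range(ph) for j in range(pw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# A bright 2x2 feature sitting inside a small search area.
search = [
    [0, 0, 0, 0, 0],
    [0, 0, 9, 9, 0],
    [0, 0, 9, 9, 0],
    [0, 0, 0, 0, 0],
]
pattern = [[9, 9], [9, 9]]
print(best_match(search, pattern))  # (1, 2)
```

Enlarging the search box in Blender simply gives this search a bigger area to scan, which is why fast-moving features need it, and why it costs more time and memory.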

Read What can you do when the tracking stops? for possible solutions

Once you track some points, check that none of your markers are sliding around. Go through each one of them, maybe some are not locking properly.

You can check the accuracy of a tracker by selecting it and playing back the scene while looking at the small track window on the right of the screen (if it is not visible, press N to open the side panel).

This tracker for example is sliding:

(image)

Ideally, the tracked feature should stay fixed on the track window and not dance around. It should be rock solid as in the next image:

(image)

Carefully examine each and every tracked point, one by one, for accuracy.

You need at least 8 successful trackers to reconstruct a scene, but don't limit yourself to that number only.

It's better to have a few accurate markers than lots of inaccurate ones.

If your tracked points are sliding or giving large error averages, try different tracking motion models other than Loc. Use LocRotScale or Perspective for example (More info on this link)

(image)

Re-track existing markers using "Refine" to increase accuracy.

(image)

Elements that change in size or distort with perspective shifts are better tracked using Match "Previous Frame" instead of "Keyframe".

(image)

If the object you are tracking gets temporarily blocked, or goes out of the frame and comes back, you can either offset the tracker, or track to the frame where it disappears with one tracker, then track from the moment the element is visible again with a new tracker, and then join the two of them together. That way Blender knows that it is dealing with the same object and not different ones.

(image)
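The joining step above can be sketched as merging two per-frame position tables into one, with a gap during the occlusion (the data layout here is invented for illustration; it is not Blender's internal format):

```python
# Sketch of joining two trackers that follow the same feature: tracker A
# covers frames 1-40, tracker B picks the feature up again at frame 60
# after an occlusion. Joining them tells the solver it is one feature.

def join_tracks(track_a, track_b):
    """Merge two {frame: (x, y)} tracks; they must not overlap in time."""
    if set(track_a) & set(track_b):
        raise ValueError("tracks overlap in time; cannot join")
    merged = dict(track_a)
    merged.update(track_b)
    return merged

track_a = {f: (0.1 * f, 0.5) for f in range(1, 41)}    # frames 1-40
track_b = {f: (0.1 * f, 0.5) for f in range(60, 101)}  # frames 60-100
joined = join_tracks(track_a, track_b)
print(len(joined))  # 81 tracked frames, with a gap during the occlusion
```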


3. Set up the camera data properly

Set the sensor size and focal length as accurately as you can, using information from the camera and lenses used for the shot. If you don't have that information, you can have Blender estimate the lens parameters from the tracked data by using the Refine option when you solve the camera motion.

(image)

All lenses in the real world create some kind of optical distortion. To integrate images from a real camera into a virtual 3D environment correctly, it's important to determine the values for lens distortion, or have Blender calculate and refine them for you.
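As an illustration of what a radial distortion model does, here is a sketch of a polynomial model similar in spirit to Blender's Polynomial K1/K2/K3 option; the coefficient values are made up:

```python
# Sketch of a radial (polynomial) distortion model: each undistorted image
# point is pushed along its radius by a polynomial in r^2. The coefficients
# below are invented for illustration, not calibrated values.

def distort(x, y, k1, k2, k3):
    """Apply radial distortion to normalized coordinates (optical center at 0, 0)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    return x * scale, y * scale

# Negative k1 pulls points toward the center (barrel-like distortion).
xd, yd = distort(0.5, 0.5, k1=-0.1, k2=0.01, k3=0.0)
print(xd, yd)  # slightly less than (0.5, 0.5)
```

Solving against footage without undistorting it (or without letting the solver refine these coefficients) is a common source of sliding, because straight lines in the image are not where the pinhole model expects them.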


4. Set a proper range of keyframes for the camera solution

(image)

The solver can get better tracking data from some sections of the video than from others. The idea is to use the section of the video that gives Blender the best description of the space, based on the difference in the movement of the trackers. To do that, the keyframe range should include at least 8 successfully tracked points, and they should all be visible in every frame of the selected range. If you are unsure of what the best keyframe range is, let Blender choose it automatically by enabling "Keyframe".

One important thing to understand is that this range of keyframes is not the only section that will get solved. It only means the optimal range where there is "reliable" information (with at least 8 common markers). Blender will try to solve the rest of the scene, for example in places where the common (bundled) tracks are not present.
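The "at least 8 common markers" rule can be illustrated with a toy visibility check over per-tracker frame ranges (Blender's automatic keyframe selection also weighs parallax, not just visibility):

```python
# Sketch of finding frames where enough trackers are visible at once.
# Each tracker is reduced to a (first_frame, last_frame) range; the data
# below is invented for illustration.

def frames_with_enough_tracks(tracks, min_tracks=8):
    """Return the frames on which at least `min_tracks` trackers are visible."""
    all_frames = range(min(f for f, _ in tracks), max(l for _, l in tracks) + 1)
    return [f for f in all_frames
            if sum(first <= f <= last for first, last in tracks) >= min_tracks]

# 10 trackers: 6 cover frames 1-100, but 4 disappear after frame 50.
tracks = [(1, 100)] * 6 + [(1, 50)] * 4
good = frames_with_enough_tracks(tracks)
print(good[0], good[-1])  # 1 50: only the first half qualifies
```

In this toy case only frames 1-50 have 8 common trackers, so that is where the keyframe range should go; the rest of the shot is what Blender then tries to extrapolate.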


5. Choose the correct solver for your camera movement

The default solver in Blender presumes some parallax, or perspective shift, meaning that as the camera moves, the perspective of the objects in the scene changes as well. For this to be true the camera has to have some displacement (side to side, up and down, etc.). With this kind of motion, objects that are close to the camera will move at different speeds than those far away, and parallel lines will converge at different points depending on where the camera is.

Example of camera displacement: (image)

Shots with no camera displacement, where the camera stays in the same place and just rotates (panning and/or tilting), cannot be solved with the default solver.

Example of stationary Camera or Tripod shot:

(image)

These kinds of shots can only be solved as Tripod:

(image)

When using Tripod solve it's not possible to determine proper 3D information. Blender has no way to know what is closer or further away from the camera, so the tracking information gets projected from the camera in a spherical way.

(image)

Some of those shots might be easier to reconstruct using Blam or Fspy.


6. Do whatever it takes to have a low Solve Error

If your solve error is more than 0.3, or you are getting "data failed to reconstruct" errors, then you really need to work on marker accuracy and the other elements outlined here. The tracking error is calculated in pixels; an error larger than one-third of a pixel is considered too high.
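To make the pixel units concrete: the reprojection error of a marker is the pixel distance between where it was tracked and where the solved camera re-projects its reconstructed 3D position, and the solve error averages this over markers. A toy calculation with invented numbers:

```python
# Sketch of how a solve error is measured: average pixel distance between
# tracked marker positions and their re-projections through the solved
# camera. All coordinates below are made up for illustration.
from math import dist

tracked     = [(100.0, 200.0), (640.0, 360.0), (300.0, 50.0)]
reprojected = [(100.2, 200.1), (639.9, 360.3), (300.1, 49.8)]

errors = [dist(t, r) for t, r in zip(tracked, reprojected)]
avg_error = sum(errors) / len(errors)
print(round(avg_error, 3))  # well under the ~0.3 px target
```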

To find out which trackers have issues or high average errors use the graphs and dopesheet tools.

(images: per-track error graph and dope sheet)

If the error is still too high, go back and revisit some of the previous steps, re-track, delete inaccurate trackers, etc.

There are times when Blender will only be able to solve part of the shot, no matter how accurate the tracking process is. That can happen in sections where none of the bundled tracks are present: maybe the camera moved past them, or maybe there are sections of the shot where there aren't 8 common tracking points anymore. When that happens you'll still get a "some data failed to reconstruct" message, and the frames where the reconstruction failed will be marked in red.

UPDATE:

To further refine your track after solving you can use the Script made by @StephenLeger. See this link for more information. It sets the tracking weight according to the reprojection error so that the information from bad trackers can also have a meaningful contribution to the camera solution.
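The idea of error-based weights can be sketched with a simple linear falloff (this is an illustration of the concept, not the exact formula used by the linked script):

```python
# Sketch of weighting trackers by reprojection error: instead of deleting
# bad trackers, scale their influence down so they still contribute a
# little. The thresholds and linear falloff are illustrative assumptions.

def track_weight(error_px, good=0.2, bad=3.0):
    """1.0 for errors <= good, 0.0 for errors >= bad, linear in between."""
    if error_px <= good:
        return 1.0
    if error_px >= bad:
        return 0.0
    return (bad - error_px) / (bad - good)

for e in (0.1, 1.6, 5.0):
    print(e, round(track_weight(e), 2))
```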


7. Check for reprojection errors

Once you have a camera solution, check the reprojection errors and further refine the optical center of your camera.


8. Correct the orientation on the scene

If, after all of the pain you've been through, the orientation of the reconstructed scene is incorrect, you can manually re-orient or re-scale the camera in the 3D viewport and all of the tracking points will follow:

(image)

  • wow for detailed answer – Shadi, Dec 23, 2020
  • In 6, if I have 500 tracks and see that one of them is deviating in the graph, how can I figure out which track it is (by name) that stands out? – Jul 31, 2021
  • @JohanWalles Just click on the curve in the graph editor. – Jan 8

Under Orientation there is a panel in the editor that allows you to set the floor by selecting markers.

That means you don't have to orient the scene manually.

  • Sometimes the floor / ground is uneven, like on a beach. Manually orienting might be a great solution for those circumstances? – ngerbens, Jun 11, 2020
