
I've started working through this series to teach myself image processing, and I'm wondering: when is the camera's projection matrix (i.e., the intrinsic and extrinsic matrices) calculated?

Is camera calibration a one-time task at the factory, or does it need to be performed regularly for some reason?


1 Answer


Theoretically, the intrinsic parameters of the camera (the focal length, the principal point, and the lens distortion coefficients) should not change, and thus can be estimated once. In reality, however, they may change over time because of heat or mechanical stress.

Heat can easily warp the plastic parts of the camera, changing the relative positions of the lens and the imaging sensor. It can also warp the lens itself, if it is plastic, changing its distortion characteristics. And if your camera is mounted on the front of a robot that repeatedly bumps into walls, that too can knock the lens out of alignment with the imaging sensor.

The extrinsic parameters of the camera are its 3D rotation and translation relative to something, so depending on what that something is, the extrinsics change whenever the camera moves. For example, if a camera is mounted on a mobile robot or a robot arm, you need to know its extrinsics relative to the robot's origin. Again, those should not change as long as the camera is rigidly attached to the robot or the arm, but in reality things can always shift, and then you need to recalibrate.
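To make the two sets of parameters concrete: the intrinsics form a 3x3 matrix K, the extrinsics a rotation R and translation t, and together they give the 3x4 projection matrix P = K [R | t]. A minimal NumPy sketch (all numbers are made-up illustrative values for a hypothetical camera, not from any real calibration):

```python
import numpy as np

# Intrinsics K: focal lengths (fx, fy) in pixels and principal point (cx, cy).
# Hypothetical values for an imaginary 640x480 camera.
fx, fy = 800.0, 800.0
cx, cy = 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Extrinsics: rotation R and translation t of the world frame
# relative to the camera. Here the camera sits at the world origin
# looking down the Z axis.
R = np.eye(3)
t = np.array([[0.0], [0.0], [0.0]])

# Projection matrix P = K [R | t]  (3x4)
P = K @ np.hstack([R, t])

# Project a 3D world point (homogeneous coordinates) to pixel coordinates.
X = np.array([0.1, 0.05, 2.0, 1.0])   # a point 2 m in front of the camera
u, v, w = P @ X
pixel = (u / w, v / w)
print(pixel)   # -> (360.0, 260.0)
```

Calibration is exactly the process of estimating K (and the distortion coefficients, which this linear model omits) and, when needed, R and t.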

So, in practice, you calibrate once and hope for the best. If the performance of your vision system degrades over time, recalibrating should be one of the first things you try.
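To see why a mechanical bump forces recalibration, it helps to quantify how sensitive projection is to a small extrinsic error. A hedged NumPy sketch, reusing the same hypothetical 800-pixel-focal-length camera as above (the 1-degree bump is an arbitrary illustrative value): a 1-degree rotation of the camera moves a point's image by roughly 14 pixels.

```python
import numpy as np

# Hypothetical intrinsics (illustrative values, not a real camera).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def project(K, R, t, X):
    """Project a 3D world point X to pixel coordinates via K [R | t]."""
    u, v, w = K @ (R @ X + t)
    return np.array([u / w, v / w])

X = np.array([0.0, 0.0, 2.0])   # a point 2 m straight ahead
t = np.zeros(3)

# Calibrated extrinsics: identity rotation.
p_good = project(K, np.eye(3), t, X)

# The camera gets bumped: a 1-degree rotation about the y axis.
theta = np.deg2rad(1.0)
R_bumped = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                     [ 0.0,           1.0, 0.0          ],
                     [-np.sin(theta), 0.0, np.cos(theta)]])
p_bumped = project(K, R_bumped, t, X)

shift = np.linalg.norm(p_bumped - p_good)
print(shift)   # ~14 pixels of reprojection error from a 1-degree bump
```

Whether that matters depends on the application, which is why knowing your algorithm's sensitivity to calibration error (as the comments below discuss) is worthwhile.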

  • Having an idea of how sensitive your algorithms are to camera calibration is probably a good idea, too. Stereo vision systems that try to naively reconstruct a 3D image from known camera calibrations can be very sensitive to changes in camera calibration. OTOH, a 2D camera driving a perception algorithm, or a robust 3D sensor (lidar, flash lidar, structured-light camera, or whatever they come up with next) will generate a point cloud with all the right ranges, but with everything rotated a bit.
    – TimWescott, Dec 25, 2022 at 17:47
  • Can you recommend DSLR calibration software (preferably free) that would generate the extrinsics?
    – Dec 26, 2022 at 20:14
  • Extrinsics relative to what? Do you have a single camera or a stereo pair? Generally, you use a calibration target (e.g., a checkerboard), and you get extrinsics relative to that. But I don't think this is what you are looking for.
    – Dima, Dec 27, 2022 at 13:30
  • Single camera used in a computer vision system to assist with calibration of a flat display. So I'm thinking the checkerboard will be displayed on the display itself.
    – Dec 28, 2022 at 2:00
  • MATLAB's Computer Vision Toolbox has an app for camera calibration. There is also an example showing how to compute camera extrinsics relative to a plane: mathworks.com/help/vision/ug/…
    – Dima, Dec 29, 2022 at 14:01
