Slide - 3DP - 06 - Camera Calibration

The document discusses 3D data processing with a focus on camera calibration, including linear and nonlinear least squares problems. It covers methods such as Gauss-Newton and Levenberg-Marquardt for optimizing parameters, as well as the process of estimating intrinsic and extrinsic camera parameters using known patterns and image data. The document also addresses the Perspective-n-Point problem for estimating rigid body transformations from 3D-2D correspondences.

3D Data Processing

Camera Calibration
Alberto Pretto
Linear Least Squares Problems
Consider the following overdetermined linear
system (A an m × n matrix with m > n):

Ax = b

Nonhomogeneous system
(if b = 0 → homogeneous system).

We aim to find a solution for the following
least squares problem:

min_x ||Ax − b||²

2
Linear Least Squares Problems
● The n columns of A span a subspace S_A ⊆
R^m with dimension up to n.
● Let x̄ ∈ R^n be a solution, with y = Ax̄ ∈ S_A: y is
the point in S_A closest to b.

3
Linear Least Squares Problems
[Figure: the vector b and its projection y = Ax̄ onto the subspace S_A]

y − b should be perpendicular to each of
the columns of A:

A^T (Ax̄ − b) = 0

4
Pseudoinverse

New linear system, with A^T A an n × n symmetric matrix,
with the same solution as the original least squares problem:

A^T A x = A^T b

It defines the so-called normal equations of a least
squares problem:
– If A^T A is invertible, the solution is just:

x̄ = (A^T A)^-1 A^T b = A^+ b

with A^+ = (A^T A)^-1 A^T defined as the pseudoinverse of A.
– Otherwise, solve the normal equations.
5
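A minimal NumPy sketch of these two options; the matrix A and the right-hand side b below are synthetic, purely for illustration:

import numpy as np

# Overdetermined system: m = 100 equations, n = 3 unknowns
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.normal(size=100)    # noisy right-hand side

# Solution via the pseudoinverse A+ = (A^T A)^-1 A^T (assumes A^T A is invertible)
x_pinv = np.linalg.inv(A.T @ A) @ A.T @ b

# Equivalent, numerically safer alternatives: solve the normal equations
# directly, or let lstsq handle the least squares problem
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)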
Nonlinear Least Squares Problems
Let f be a nonlinear model function:

y = f(x; a)

f depends on a set of parameters:

a = [a_1, ..., a_p]^T

The dataset is composed of n pairs {x_i, y_i} ("data
points"):
– x_i is an independent variable (input)
– y_i is an observation given x_i (output)
6
Nonlinear Least Squares Problems
"Goodness" of the model parameters with
respect to the i-th data point is defined by:
Residual

Nonlinear least-squares methods finds the


parameters vector a that minimizes the sum
of squared residuals:
Cost
function
7
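To make this concrete, here is a small sketch with a hypothetical model f(x; a) = a0 · exp(a1 · x) (not a model taken from the slides, just an illustration of residuals and cost):

import numpy as np

def model(x, a):
    # Hypothetical nonlinear model: f(x; a) = a0 * exp(a1 * x)
    return a[0] * np.exp(a[1] * x)

def residuals(a, x, y):
    # r_i(a) = y_i - f(x_i; a)
    return y - model(x, a)

def cost(a, x, y):
    # E(a) = sum_i r_i(a)^2
    r = residuals(a, x, y)
    return r @ r

# Synthetic data points {x_i, y_i}
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = model(x, [2.0, -1.3]) + 0.05 * rng.normal(size=x.size)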
Nonlinear Least Squares Problems
● We assume we have a "good" initial estimate a_0 of the
parameter vector.
● The goal of an iterative least squares method is to choose,
at each iteration, an update δa of the current estimate a_k
that reduces the value of the cost function.

● Stopping criteria:
– Number of iterations
– (Relative) step size
– ...
8
Gauss-Newton
Gauss-Newton is an efficient iterative
method used to minimize a sum of squared
function values.

9
Gauss-Newton
We can rewrite the error function as:

E(a) = r(a)^T r(a),  with r(a) = [r_1(a), ..., r_n(a)]^T

Derive the derivatives as:

∇E(a) = 2 J^T r(a)

with J the Jacobian matrix of the residuals:

J_ij = ∂r_i / ∂a_j
10
Gauss-Newton
If we assume that the model function f is
locally linear, the residuals can be approximated
to first order:

r(a + δa) ≈ r(a) + J δa

and the error function becomes:

E(a + δa) ≈ ||r(a) + J δa||²
11
Gauss-Newton
We want to find an update δa that
minimizes the error function E: setting the
derivative of E(a + δa) with respect to δa to
zero gives

J^T J δa = −J^T r(a)

This yields the Gauss-Newton update:

δa = −(J^T J)^-1 J^T r(a),   a_{k+1} = a_k + δa
12
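A compact Gauss-Newton sketch for the same hypothetical exponential model, with the Jacobian of the residuals computed analytically and the update obtained by solving J^T J δa = −J^T r:

import numpy as np

def residuals(a, x, y):
    return y - a[0] * np.exp(a[1] * x)           # r_i = y_i - f(x_i; a)

def jacobian(a, x):
    # J_ij = d r_i / d a_j  for f(x; a) = a0 * exp(a1 * x)
    e = np.exp(a[1] * x)
    return np.column_stack((-e, -a[0] * x * e))

def gauss_newton(a0, x, y, n_iter=20, tol=1e-10):
    a = np.asarray(a0, dtype=float)
    for _ in range(n_iter):
        r = residuals(a, x, y)
        J = jacobian(a, x)
        # Gauss-Newton update: solve J^T J delta = -J^T r
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        a = a + delta
        if np.linalg.norm(delta) < tol:          # stopping criterion: step size
            break
    return a

# Usage with synthetic data
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(-1.3 * x) + 0.05 * rng.normal(size=x.size)
a_hat = gauss_newton([1.0, -1.0], x, y)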
Gauss-Newton
● Pros: Gauss-Newton generally ensures fast
convergence, even for nonlinear model
functions
● Cons: convergence is not guaranteed: an
update may increase the cost function.

13
Gradient Descent
Idea: choose an update δa in
the opposite direction of the
gradient of the cost function.
Remembering that ∇E(a) = 2 J^T r(a),
the update is:

δa = −λ J^T r(a)

with λ the step length (the constant
factor is absorbed into λ).
14
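For comparison, a gradient descent step under the same assumptions (this reuses residuals(), jacobian(), and the data x, y from the Gauss-Newton sketch above; the fixed step length is arbitrary):

lam = 1e-3                                       # fixed step length, illustrative only
a = np.array([1.0, -1.0])
for _ in range(500):
    r = residuals(a, x, y)
    grad = 2.0 * jacobian(a, x).T @ r            # gradient of E(a) = sum_i r_i(a)^2
    a = a - lam * grad                           # step opposite to the gradient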
Gradient Descent
● Pros: Unless a stationary point has
been reached, the error can always
be decreased by reducing the step
length
● Cons: slower than Gauss-Newton:
the negative gradient direction can
oscillate rapidly (zig-zag behaviour),
wasting time near a minimum point,
and the gradient can vanish near
stationary points
15
Levenberg-Marquardt
The Levenberg-Marquardt (LM) algorithm combines Gauss-Newton
and Gradient Descent, trying to exploit the strengths of both
approaches. The update is obtained from a "damped" version of the
Gauss-Newton normal equations:

(J^T J + λ I) δa = −J^T r(a)

Levenberg–Marquardt tunes the "damping factor" λ at each
iteration to ensure rapid progress even where Gauss–Newton fails,
e.g.:
– Multiply λ by 10 if the error increases => the update tends toward a
(short) Gradient Descent step
– Divide λ by 10 if the error decreases => the update tends toward the
Gauss-Newton step
16
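A minimal Levenberg-Marquardt sketch under the same assumptions as the previous examples; note that some variants damp with λ·diag(J^T J) instead of the plain λ·I used here:

import numpy as np

def residuals(a, x, y):
    return y - a[0] * np.exp(a[1] * x)

def jacobian(a, x):
    e = np.exp(a[1] * x)
    return np.column_stack((-e, -a[0] * x * e))

def levenberg_marquardt(a0, x, y, lam=1e-3, n_iter=50):
    a = np.asarray(a0, dtype=float)
    cost = residuals(a, x, y) @ residuals(a, x, y)
    for _ in range(n_iter):
        r = residuals(a, x, y)
        J = jacobian(a, x)
        # Damped normal equations: (J^T J + lam * I) delta = -J^T r
        delta = np.linalg.solve(J.T @ J + lam * np.eye(J.shape[1]), -J.T @ r)
        a_new = a + delta
        new_cost = residuals(a_new, x, y) @ residuals(a_new, x, y)
        if new_cost < cost:
            a, cost = a_new, new_cost
            lam /= 10.0                          # error decreased -> toward Gauss-Newton
        else:
            lam *= 10.0                          # error increased -> toward gradient descent
    return a

# a_hat = levenberg_marquardt([1.0, -1.0], x, y)  # x, y as in the previous sketches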
Pinhole Camera Calibration

17
Intrinsic vs. Extrinsic Parameters
Camera calibration estimates intrinsic (i.e.
camera-specific) and extrinsic parameters
of a given camera.
Intrinsic (or internal) parameters are the
focal length, coordinates of the principal
point, distortion parameters, etc...
Extrinsic parameters identify the pose (i.e.
rotation and translation) of the camera,
e.g. in a world coordinate system.
18
Calibration Method Overview
● Use a known planar pattern (e.g., a checkerboard)
● Collect a sequence of images of the pattern in several positions,
and extract all corners from images
● Exploit the scene-to-image homography constraints to extract an
initial guess of the (linear) intrinsic parameters matrix K’
– We assume for now no distortion
● Normalize all pixel projections with K’ and, for each image i, extract
the homography and recover an initial guess of the extrinsic
parameters (Ri, ti)
● Recover the final intrinsic and extrinsic parameters by solving an
iterative least squares problem, using K’ and (Ri, ti) as initial guesses.
– The distortion parameters are also optimized here
19
Data
● Collect k images, each one framing a
checkerboard with m internal corners, from
different points of view.
● The m 3D corner positions (in meters) are fixed and
computed once from the checkerboard geometry,
e.g. (square size 5 cm):
P0 : [0, 0, 0]’
P1 : [0.05, 0, 0]’
P2 : [0.1, 0, 0]’
P3 : [0.15, 0, 0]’
...
Pm-1

20
Image Corners and 3D-2D Matches
● For each image i, extract the m 2D corner
projections: pi,0, …, pi,m-1
● For each view i, the m 2D corner positions (in pixels)
extracted from the image:
pi,0 : [233, 436]’
pi,1 : [256, 449]’
...
pi,m-1
● Given the ordering of the corner extraction, it is
trivial to associate P0 with pi,0, P1 with pi,1, P2 with
pi,2, …, Pm-1 with pi,m-1, for each i

21
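In practice, this data-collection step can be prototyped with OpenCV; a hedged sketch, where the pattern size, square size, and image folder are placeholders (cv2.calibrateCamera then performs, internally, a pipeline roughly corresponding to the following slides):

import glob
import cv2
import numpy as np

pattern = (9, 6)                  # inner corners per row/column (placeholder)
square = 0.05                     # square size in meters (5 cm, as in the example)

# Fixed 3D corner positions P_0 ... P_{m-1} on the Z = 0 plane of the checkerboard
obj = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):     # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        # Corners are returned in a fixed order, so the 3D-2D association
        # P_j <-> p_{i,j} comes for free
        obj_points.append(obj)
        img_points.append(corners)

# Zhang-style calibration: intrinsics, distortion, and one (R_i, t_i) per view
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)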
Initial Guess for K
● We need to estimate an initial guess for K, i.e.:

K = [ fx  s   cx
      0   fy  cy
      0   0   1 ]

with s the skew between the axes of the image plane.

● For each image i, use the m 3D-2D matches to estimate
(e.g., by using RANSAC) a scene-to-image homography
Hi:

pi,j ≃ Hi Pj

● Remembering the derivation (the checkerboard lies on the
Z = 0 plane):

Hi ≃ K [ri,1 ri,2 ti]
22
Initial Guess for K
● From the orthonormality of the rotation matrix Ri
(with Hi = [h1 h2 h3]):

h1^T K^-T K^-1 h2 = 0
h1^T K^-T K^-1 h1 = h2^T K^-T K^-1 h2

– Two constraints for each homography!

● Define the symmetric matrix:

B = K^-T K^-1
23
Initial Guess for K
Rearrange each constraint of the form hi^T B hj as:

hi^T B hj = vij^T b

where:

b = [B11, B12, B22, B13, B23, B33]^T
vij = [hi1 hj1, hi1 hj2 + hi2 hj1, hi2 hj2,
       hi3 hj1 + hi1 hj3, hi3 hj2 + hi2 hj3, hi3 hj3]^T

The two constraints can be rewritten as:

[ v12^T ; (v11 − v22)^T ] b = 0

that is a 2 × 6 system of equations.

24
Initial Guess for K
● Given k images (i.e., homographies), arrange a 2k × 6
homogeneous linear system:

V b = 0

where V stacks the two constraint rows of each homography.
● Solve it, e.g., by using SVD, and get B.
– We applied the Direct Linear Transformation (DLT) algorithm,
which solves a set of variables from a set of similarity relations.
● Extract the five intrinsic camera parameters plus one
unknown scale factor from the 6 equations defined by:

B = λ K^-T K^-1
25
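A sketch of this DLT step following Zhang's formulation; Hs is assumed to be a list of the k estimated 3×3 homographies, and the extraction of the five intrinsics plus the scale factor from B is left to the usual closed-form expressions:

import numpy as np

def v_ij(H, i, j):
    # Row vector such that h_i^T B h_j = v_ij^T b, with b = [B11, B12, B22, B13, B23, B33]
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0] * hj[0],
                     hi[0] * hj[1] + hi[1] * hj[0],
                     hi[1] * hj[1],
                     hi[2] * hj[0] + hi[0] * hj[2],
                     hi[2] * hj[1] + hi[1] * hj[2],
                     hi[2] * hj[2]])

def estimate_B(Hs):
    # Stack the two constraints of each homography into a 2k x 6 system V b = 0
    V = []
    for H in Hs:
        V.append(v_ij(H, 0, 1))                   # h1^T B h2 = 0
        V.append(v_ij(H, 0, 0) - v_ij(H, 1, 1))   # h1^T B h1 = h2^T B h2
    V = np.array(V)
    # Solve V b = 0 with SVD: b is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(V)
    B11, B12, B22, B13, B23, B33 = Vt[-1]
    return np.array([[B11, B12, B13],
                     [B12, B22, B23],
                     [B13, B23, B33]])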
Initial Guesses for Ri, ti
● For each checkerboard position, we
normalize all the image points using the
initial-guess camera intrinsics computed in
the previous step.
● Remembering that:

Hi ≃ K [ri,1 ri,2 ti]

we can extract Ri, ti (see last lecture).
26
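A minimal sketch of this extraction, assuming H is the scene-to-image homography of one view and K the initial intrinsics; the final SVD re-orthogonalization is a common safeguard against noise, not necessarily the exact procedure of the lecture:

import numpy as np

def pose_from_homography(H, K):
    # Normalize by the intrinsics: K^-1 H ~ [r1 r2 t] up to scale
    A = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(A[:, 0])           # scale factor, from ||r1|| = 1
    r1 = lam * A[:, 0]
    r2 = lam * A[:, 1]
    r3 = np.cross(r1, r2)                         # complete the rotation basis
    t = lam * A[:, 2]
    R = np.column_stack((r1, r2, r3))
    # Re-project R onto SO(3): closest rotation matrix in the Frobenius norm
    U, _, Vt = np.linalg.svd(R)
    R = U @ Vt
    if np.linalg.det(R) < 0:                      # guard against an improper rotation
        R = -R
    return R, t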
Refined Calibration
At each optimization iteration, given the
current estimate of the transformations Ti and
the calibration parameters, we can project
the m 3D corners into each image…

… and compare them with the
corresponding corner pixels …

… and optimize again, looking for better Ti
and calibration parameters that minimize the
cost function.

Ideas for minimization:
● Use LM to solve the iterative problem
● Parametrize the rotations with the axis–angle representation
27
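One possible way to prototype this refinement (not necessarily the lecture's exact parametrization): pack the intrinsics, the distortion coefficients, and one axis-angle/translation block per view into a single vector, define the reprojection residuals, and hand them to an LM solver. Here cv2.projectPoints and scipy.optimize.least_squares do the heavy lifting, and the skew is dropped for simplicity:

import numpy as np
import cv2
from scipy.optimize import least_squares

def reproj_residuals(params, obj_pts, img_pts, n_views, n_dist=5):
    # params = [fx, fy, cx, cy, dist..., (axis-angle_i, t_i) for each view i]
    fx, fy, cx, cy = params[:4]
    K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
    dist = params[4:4 + n_dist]
    res = []
    for i in range(n_views):
        o = 4 + n_dist + 6 * i
        rvec, tvec = params[o:o + 3], params[o + 3:o + 6]
        # Project the m 3D corners into image i with the current estimate
        proj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
        res.append((proj.reshape(-1, 2) - img_pts[i]).ravel())
    return np.concatenate(res)

# x0 packs the initial guesses: [fx, fy, cx, cy], zero distortion, and one
# (cv2.Rodrigues(R_i), t_i) block per view.  obj_pts is the (m, 3) array of
# checkerboard corners, img_pts a list of k (m, 2) arrays of detected corners.
# sol = least_squares(reproj_residuals, x0, method='lm', args=(obj_pts, img_pts, k))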
PnP: Perspective-n-Point Problem
Given a rigid object and a set of
3D-2D correspondences, how to
estimate the rigid body
transformation R,t with a
calibrated camera?
● Find an initial guess for R,t
– E.g., if the 3D points are coplanar,
estimate the homography and
extract R, t
● Solve with LM a nonlinear least-
squares problem with parameters
R,t, using the initial guess as the
starting point.

[Figure: camera frame (x, y, O) and object frame (X, Y, Z), related by the unknown transformation R,t]
28
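With OpenCV the same problem can be solved by cv2.solvePnP; the object points, image points, K, and distortion below are placeholders, and the SOLVEPNP_ITERATIVE flag refines an initial guess with a Levenberg-Marquardt style minimization of the reprojection error:

import numpy as np
import cv2

# Placeholder data: (N, 3) object points and the matching (N, 2) pixel projections
object_points = np.random.rand(8, 3).astype(np.float32)
image_points = np.random.rand(8, 2).astype(np.float32) * 640
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Initial guess + iterative LM refinement of R, t for a calibrated camera
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)    # axis-angle back to a 3x3 rotation matrix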
