Computer Vision: Lecture 7
Anders Heyden

2016-02-10


Today's Lecture Outline
- Repetition
- A First Reconstruction System
- Outliers and RANSAC
- Minimal Solvers


Repetition: Computing the Camera Matrix

Known: Image points $x_i$ and scene points $X_i$.

Estimate: A camera matrix $P$ such that $\lambda_i x_i = P X_i$.

Solved using DLT in lecture 3; a sketch of the DLT step follows below.
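As a reminder of how the DLT step works in practice, here is a minimal, unnormalized sketch (the lecture's version also normalizes the coordinates first; the function name and NumPy formulation are my own):

import numpy as np

def camera_dlt(x, X):
    """Estimate a 3x4 camera matrix P from n >= 6 correspondences.

    x: (n, 2) image points, X: (n, 3) scene points.
    Each correspondence lambda_i x_i = P X_i gives two linear
    equations in the entries of P; the null vector of the stacked
    system is found via SVD."""
    rows = []
    for (u, v), Xs in zip(x, X):
        Xh = np.append(Xs, 1.0)                      # homogeneous scene point
        rows.append(np.hstack([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.hstack([np.zeros(4), Xh, -v * Xh]))
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 4)                      # smallest singular vector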


Repetition: Relative Orientation

Known: Two corresponding point sets $\{\bar{x}_i\}$ and $\{x_i\}$.

Sought: Scene points $\{X_i\}$ and cameras $P_1$, $P_2$, such that
$$\lambda_i x_i = P_1 X_i, \qquad \bar{\lambda}_i \bar{x}_i = P_2 X_i.$$


Repetition: Relative Orientation - The Fundamental Matrix (see lecture 5)

For cameras $P_1 = [I\ 0]$ and $P_2 = [A\ t]$, the corresponding image points $x_i$ and $\bar{x}_i$ satisfy
$$\bar{x}_i^T F x_i = 0, \quad \text{where } F = [t]_\times A.$$
The scene point $X_i$ has been eliminated. Solve for $F$ using the 8-point algorithm and compute the cameras (lecture 5); a sketch follows below.

Problem: Projective ambiguity.
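A minimal sketch of the 8-point algorithm, without the coordinate normalization a practical implementation should use (function name mine):

import numpy as np

def eight_point(x, xb):
    """Estimate F from n >= 8 correspondences (unnormalized sketch).

    x, xb: (n, 2) arrays, x_i in the first image and the corresponding
    xbar_i in the second. Each pair turns xbar^T F x = 0 into one
    linear equation in the nine entries of F."""
    rows = []
    for (u, v), (ub, vb) in zip(x, xb):
        rows.append([ub*u, ub*v, ub, vb*u, vb*v, vb, u, v, 1.0])
    _, _, Vt = np.linalg.svd(np.array(rows))
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2 by zeroing the smallest singular value of F.
    U, S, Vt2 = np.linalg.svd(F)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt2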


Repetition: Relative Orientation - The Essential Matrix (see lecture 5)

For cameras $P_1 = [I\ 0]$ and $P_2 = [R\ t]$, the corresponding image points $x_i$ and $\bar{x}_i$ satisfy
$$\bar{x}_i^T E x_i = 0, \quad \text{where } E = [t]_\times R.$$
The scene point $X_i$ has been eliminated. Solve for $E$ using the modified 8-point algorithm and compute the cameras (lecture 6).

No projective ambiguity.


Repetition: Triangulation

Known: Image points $\{x_{ij}\}$ and camera matrices $P_j$.

Sought: 3D points $X_i$ such that $\lambda_{ij} x_{ij} = P_j X_i$. See lecture 4; a sketch follows below.
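A minimal DLT-style triangulation sketch for a single scene point (again unnormalized; the function name is my own):

import numpy as np

def triangulate(xs, Ps):
    """Triangulate one 3D point from two or more views.

    xs: list of (2,) image points, Ps: list of (3, 4) camera matrices.
    From lambda x = P X, each view gives two linear equations in X:
    u (p3 . X) - (p1 . X) = 0 and v (p3 . X) - (p2 . X) = 0."""
    rows = []
    for (u, v), P in zip(xs, Ps):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.array(rows))
    X = Vt[-1]
    return X[:3] / X[3]                              # dehomogenize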


A First Reconstruction System - Sequential Reconstruction

Given lots of images... how do we compute the entire reconstruction? (A sketch of the loop follows the list.)

1. For an initial pair of images, compute the cameras and the visible scene points, using the 8-point algorithm.
2. For a new image viewing some of the previously reconstructed scene points, find the camera matrix using DLT.
3. Compute new scene points using triangulation.
4. If there are more cameras, go to step 2.
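A minimal sketch of this loop, reusing the eight_point, camera_dlt and triangulate sketches above. The helpers find_matches, cameras_from_F, points_seen_by and triangulate_new_points are hypothetical placeholders for the matching and bookkeeping, not functions from the course code:

import numpy as np

def sequential_reconstruction(images):
    # Step 1: initial pair via the 8-point algorithm.
    x1, x2 = find_matches(images[0], images[1])      # hypothetical helper
    F = eight_point(x1, x2)
    P = [np.hstack([np.eye(3), np.zeros((3, 1))]),   # P1 = [I 0]
         cameras_from_F(F)]                          # hypothetical helper
    X = [triangulate([a, b], P) for a, b in zip(x1, x2)]
    for img in images[2:]:
        # Step 2: resection the new camera from known 3D points.
        x_seen, X_seen = points_seen_by(img, X)      # hypothetical helper
        P.append(camera_dlt(x_seen, X_seen))
        # Step 3: triangulate scene points that became visible.
        X += triangulate_new_points(img, P, X)       # hypothetical helper
    return P, X                                      # step 4: loop until done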


A First Reconstruction System - Issues
- Outliers.
- Noise sensitivity.
- How to select the initial pair.
- Unreliable 3D points.

We will get back to these issues later in the course.


The Outlier Problem

What is an outlier? An outlier is a measurement that does not fulfill the noise assumption. Outliers arise, for example, from mismatched correspondences.


RANdom SAmple Consensus - RANSAC

Idea: If the number of outliers is small, pick a random subset of the measurements. With high probability this set will be outlier free.

Algorithm (a generic sketch follows the list):
1. Randomly select a small number of the measurements, and fit the model to these.
2. Evaluate the error with respect to the estimated model for the rest of the measurements. The consensus set is the set of measurements with error less than some predefined threshold.
3. Repeat a number of times and select the model fit that gives the largest consensus set.
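A generic sketch of the loop, assuming the data sits in an (n, d) NumPy array and that the caller supplies model-specific fit and error callbacks (all names are my own):

import numpy as np

def ransac(data, fit, error, sample_size, threshold, iters=1000):
    """Return the model with the largest consensus set.

    fit(subset) -> model; error(model, data) -> (n,) array of errors."""
    rng = np.random.default_rng()
    best_model, best_inliers = None, np.zeros(len(data), dtype=bool)
    for _ in range(iters):
        # Step 1: fit the model to a minimal random sample.
        idx = rng.choice(len(data), size=sample_size, replace=False)
        model = fit(data[idx])
        # Step 2: consensus set = points with error below the threshold.
        inliers = error(model, data) < threshold
        # Step 3: keep the model with the largest consensus set so far.
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

For the line-fitting example below, sample_size would be 2, fit would return the line through the two sampled points, and error the point-to-line distances.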


Line Fitting Example

[Figures: real data; noisy measurements and real data]

Line Fitting Example

Select 2 points and fit a line. First iteration:

[Figure]


Line Fitting Example

More iterations:

[Figure]


Line Fitting Example

Final result:

[Figure]


RANSAC - How Many Iterations Do We Need?

Depends on:
- the ratio of outliers,
- the number of points needed for estimating the model parameters,
- the desired probability of getting an all-inlier set.

Hint (the steps are combined into a formula below):
1. Calculate the probability of only getting inliers in the set: $P$.
2. Calculate the probability of getting at least one outlier in the set: $1 - P$.
3. Calculate the probability of getting at least one outlier in each of $K$ iterations: $\prod_{k=1}^{K}(1 - P) = (1 - P)^K$.

See lecture notes...
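Combining the three steps (a standard derivation; the notation $\varepsilon$ for the outlier ratio, $s$ for the sample size and $p$ for the desired success probability is mine):

% With outlier ratio \varepsilon and sample size s, one random sample
% is all inliers with probability P = (1 - \varepsilon)^s.
% Requiring that the probability of K consecutive failures is at most
% 1 - p gives the needed number of iterations:
\begin{align*}
  (1 - P)^K \le 1 - p
  \quad\Longrightarrow\quad
  K \ge \frac{\log(1 - p)}{\log\bigl(1 - (1 - \varepsilon)^s\bigr)}.
\end{align*}
% Example: \varepsilon = 0.5, s = 8 (8-point algorithm), p = 0.99
% gives K >= log 0.01 / log(1 - 0.5^8), i.e. roughly 1177 iterations.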


Minimal Solvers - Algorithm
- Select a monomial basis.
- Apply the mapping $T_x$ to the monomial basis and reduce the expressions until the result consists of monomials from the basis.
- Construct the action matrix $M_x^T$.
- Compute eigenvalues and eigenvectors of $M_x^T$.
- Extract the solutions from the eigenvectors.

Note: The theory guarantees that the solutions will be among the eigenvectors, but not all eigenvectors are solutions. You may need to test the results against the original equations.


Minimal Solvers

Consider the following non-linear system of equations:
$$\begin{cases} x^2 - y - 3 = 0 \\ xy - x = 0 \end{cases} \qquad (1)$$

Consider the operator $T_x$ that maps a polynomial $p(x, y)$ to $x\,p(x, y)$, and assume that $(x_0, y_0)$ is a solution to the system above:
$$\begin{aligned}
1 &\mapsto x_0 && (2)\\
x_0 &\mapsto x_0^2 && (3)\\
y_0 &\mapsto x_0 y_0 && (4)\\
x_0^2 &\mapsto x_0^3 = x_0(y_0 + 3) = x_0 y_0 + 3 x_0 && (5)\\
x_0 y_0 &\mapsto x_0^2 y_0 = x_0^2 && (6)\\
y_0^2 &\mapsto x_0 y_0^2 = x_0 y_0. && (7)
\end{aligned}$$

The reductions use the relations $x_0^2 = y_0 + 3$ and $x_0 y_0 = x_0$ from (1), so every image is again a combination of basis monomials.

Minimal Solvers - What degree of monomials should we choose?

Monomials of order 2, i.e. the basis $\{1, x_0, y_0, x_0^2, x_0 y_0, y_0^2\}$:
$$\begin{aligned}
1 &\mapsto x_0 && (8)\\
x_0 &\mapsto x_0^2 && (9)\\
y_0 &\mapsto x_0 y_0 && (10)\\
x_0^2 &\mapsto x_0^3 = x_0(y_0 + 3) = x_0 y_0 + 3 x_0 && (11)\\
x_0 y_0 &\mapsto x_0^2 y_0 = x_0^2 && (12)\\
y_0^2 &\mapsto x_0 y_0^2 = x_0 y_0. && (13)
\end{aligned}$$

Minimal Solvers - What degree of monomials should we choose?

Action matrix (the columns hold the images of the basis monomials $\{1, x_0, y_0, x_0^2, x_0 y_0, y_0^2\}$):
$$M_x = \begin{pmatrix}
0 & 0 & 0 & 0 & 0 & 0 \\
1 & 0 & 0 & 3 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 1 & 0 \\
0 & 0 & 1 & 1 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 0
\end{pmatrix}. \qquad (14)$$

$M_x^T$ has eigenvalues $\lambda = -2, 0, 2$ (the eigenvalue 0 has multiplicity 4), matching the $x$-coordinates of the solutions $(\pm 2, 1)$ and $(0, -3)$.

The degree needs to be large enough that the image of every basis monomial under $T_x$ can be reduced back to monomials in the basis. (A numerical check follows below.)
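A quick numerical sanity check of the eigenvalue claim, sketching the extraction step. As the algorithm slide warns, eigenvectors from the multiple eigenvalue 0 need not be solutions, so each candidate is tested against the system:

import numpy as np

# Action matrix M_x in the basis {1, x, y, x^2, x*y, y^2}, eq. (14).
Mx = np.array([[0, 0, 0, 0, 0, 0],
               [1, 0, 0, 3, 0, 0],
               [0, 0, 0, 0, 0, 0],
               [0, 1, 0, 0, 1, 0],
               [0, 0, 1, 1, 0, 1],
               [0, 0, 0, 0, 0, 0]], dtype=float)

vals, vecs = np.linalg.eig(Mx.T)
print(np.round(vals, 6))          # -2, 2 and 0 (multiplicity 4)

# A true solution gives an eigenvector proportional to
# (1, x0, y0, x0^2, x0*y0, y0^2): normalize the first entry to 1,
# read off x0 and y0, then test the candidate in the system.
for v in vecs.T:
    if abs(v[0]) > 1e-8:
        w = np.real(v / v[0])
        x0, y0 = w[1], w[2]
        ok = abs(x0**2 - y0 - 3) < 1e-6 and abs(x0 * y0 - x0) < 1e-6
        print(f"candidate ({x0:.3f}, {y0:.3f}) solution: {ok}")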


Minimal Solvers

Other mappings than $T_x$ also work, e.g. $T_y$ (mapping $p(x, y)$ to $y\,p(x, y)$):
$$\begin{aligned}
1 &\mapsto y_0 && (15)\\
x_0 &\mapsto x_0 y_0 && (16)\\
y_0 &\mapsto y_0^2 && (17)\\
x_0^2 &\mapsto x_0^2 y_0 = x_0^2 && (18)\\
x_0 y_0 &\mapsto x_0 y_0^2 = x_0 y_0 && (19)\\
y_0^2 &\mapsto y_0^3 = y_0^2(x_0^2 - 3) = x_0^2 - 3 y_0^2. && (20)
\end{aligned}$$

Minimal Solvers

$$M_y = \begin{pmatrix}
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 \\
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 1 \\
0 & 1 & 0 & 0 & 1 & 0 \\
0 & 0 & 1 & 0 & 0 & -3
\end{pmatrix}. \qquad (21)$$

$M_y^T$ has eigenvalues $-3, 1, 0$. Since two solutions have $y_0 = 1$, the multiplicity of the eigenvalue 1 will always be at least 2.


The 5-point Solver

Finding an essential matrix can be done with five correspondences by solving
$$\bar{x}_i^T E x_i = 0, \quad i = 1, \ldots, 5, \qquad (22)$$
$$\det(E) = 0, \qquad (23)$$
$$2 E E^T E - \operatorname{trace}(E E^T) E = 0. \qquad (24)$$

Form the $M$-matrix with the five correspondences. $M$ has a 4-dimensional null-space:
$$M(\alpha_1 v_1 + \alpha_2 v_2 + \alpha_3 v_3 + \alpha_4 v_4) = 0. \qquad (25)$$

Reshaping the null-space vectors into matrices:
$$\bar{x}_i^T(\alpha_1 E_1 + \alpha_2 E_2 + \alpha_3 E_3 + \alpha_4 E_4) x_i = 0, \quad i = 1, \ldots, 5. \qquad (26)$$
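A minimal sketch of computing this null space with NumPy (function name mine; $M$ is the same 5 x 9 constraint matrix as in the 8-point construction, one row per correspondence):

import numpy as np

def essential_nullspace(x, xb):
    """Return E1..E4 spanning the null space of M (eqs. 25-26).

    x, xb: (5, 2) arrays of corresponding image points."""
    rows = []
    for (u, v), (ub, vb) in zip(x, xb):
        # One row of M per correspondence, from xbar^T E x = 0.
        rows.append([ub*u, ub*v, ub, vb*u, vb*v, vb, u, v, 1.0])
    M = np.array(rows)                         # 5 x 9
    _, _, Vt = np.linalg.svd(M)
    # The last four right singular vectors span the 4-dim null space.
    return [Vt[i].reshape(3, 3) for i in range(5, 9)]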

The 5-point Solver

Use the remaining equations to determine $\alpha_1, \alpha_2, \alpha_3, \alpha_4$.

Nine equations:
$$2 E E^T E - \operatorname{trace}(E E^T) E = \sum_{i=1}^{4} \sum_{j=1}^{4} \sum_{k=1}^{4} \alpha_i \alpha_j \alpha_k \left( 2 E_i E_j^T E_k - \operatorname{trace}(E_i E_j^T) E_k \right). \qquad (27)$$

One equation:
$$\det(E) = \sum_{i=1}^{4} \sum_{j=1}^{4} \sum_{k=1}^{4} \alpha_i \alpha_j \alpha_k \left( e_{11}^i e_{22}^j e_{33}^k + e_{12}^i e_{23}^j e_{31}^k + e_{13}^i e_{21}^j e_{32}^k - e_{11}^i e_{23}^j e_{32}^k - e_{12}^i e_{21}^j e_{33}^k - e_{13}^i e_{22}^j e_{31}^k \right). \qquad (28)$$

The 5-point Solver

$\alpha_1$ can be assumed to be one (scale ambiguity). Monomial order:
$$\{\alpha_4^3,\ \alpha_3\alpha_4^2,\ \alpha_3^2\alpha_4,\ \alpha_3^3,\ \alpha_2\alpha_4^2,\ \alpha_2\alpha_3\alpha_4,\ \alpha_2\alpha_3^2,\ \alpha_2^2\alpha_4,\ \alpha_2^2\alpha_3,\ \alpha_2^3,\ \alpha_4^2,\ \alpha_3\alpha_4,\ \alpha_3^2,\ \alpha_2\alpha_4,\ \alpha_2\alpha_3,\ \alpha_2^2,\ \alpha_4,\ \alpha_3,\ \alpha_2,\ 1\}.$$

Gaussian elimination gives reductions for all third order terms.


The 5-point Solver

[Figure: Histograms of the size of the consensus set in each iteration of RANSAC (1000 iterations), using 5 points, 8 points and 10 points respectively.]
