Image Segmentation by Fitting a Model
What is image segmentation?
Technically speaking, image segmentation refers to the decomposition of a scene into different components, which facilitates higher-level tasks such as object detection and recognition.
Scientifically speaking, segmentation is a hypothetical middle-level vision task performed by neurons between low-level and high-level cortical areas.
Fitting or Grouping (based on fitting a geometrical model):
Here we have a set of distinct data items, and we collect sets of data items that make sense together according to our model, e.g.:
a) Collecting together tokens that, taken together, form a line or other geometric primitive.
b) Collecting together tokens that seem to share a fundamental matrix.
The key issue here is to determine what representation is suitable for the problem at hand (a supervised approach).
Hough transform
Image spatial space vs. Hough parameter space.
The Hough transform maps the image from the spatial plane to the Hough parameter plane, i.e. it converts the image from the spatial coordinate domain (x, y) to:
- the (m, b) Hough parameter plane, corresponding to a line represented as y = mx + b (slope-intercept representation), or
- the (r, θ) Hough parameter plane, corresponding to a line represented as x cos θ + y sin θ = r (polar representation).
(m, b) Hough parameter space:
- A line in the image corresponds to a point in Hough space.
- Where is the line that contains both (x0, y0) and (x1, y1)?
At the intersection of the lines b = -x0·m + y0 and b = -x1·m + y1 in Hough space.
- What does a point (x0, y0) in the image space map to in the Hough space?
There are many lines passing through the point (x0, y0). What they have in common is that they satisfy y0 = m·x0 + b for some set of parameters (m, b), i.e. the solutions of b = -x0·m + y0, which is a line in Hough space.
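As a quick numerical illustration of this duality (a minimal sketch, not from the slides; the helper name is made up), intersecting the two Hough-space constraint lines recovers the image line through both points:

import numpy as np

def line_through_points(p0, p1):
    # Intersect the Hough-space constraint lines b = -x*m + y of the
    # two image points, i.e. solve y_i = m*x_i + b for (m, b).
    (x0, y0), (x1, y1) = p0, p1
    A = np.array([[x0, 1.0],
                  [x1, 1.0]])
    rhs = np.array([y0, y1])
    m, b = np.linalg.solve(A, rhs)   # fails for vertical lines (x0 == x1)
    return m, b

print(line_through_points((0.0, 1.0), (2.0, 5.0)))   # -> (2.0, 1.0): y = 2x + 1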
(r, θ) Hough parameter space:
Problems with the line equation y = mx + b in (m, b) space:
- Unbounded parameter domain.
- Vertical lines require infinite m, so how can we represent the accumulator array?
The alternative: polar representation.
The polar (also called normal) representation of straight lines is x cos θ + y sin θ = r.
Each point (xi, yi) in the xy-plane gives a sinusoid in the (r, θ) parameter space (or plane).
- Each curve in the figure represents the family of lines that pass through a particular point (xi, yi) in the xy-plane.
(r, θ) Hough parameter space:
N collinear points lying on a line give N curves that intersect at a common point (ri, θj) in the parameter space (or plane), i.e. sinusoids corresponding to collinear points intersect at a unique point.
e.g.
Line: 0.6x + 0.8y = 2.4 (already in normal form, since 0.6² + 0.8² = 1)
Sinusoids intersect at: r = 2.4, θ = 0.9273 rad (where cos θ = 0.6, sin θ = 0.8)
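As a hedged numerical check (not from the slides), evaluating r(θ) = x cos θ + y sin θ at θ = 0.9273 for a few points on this line shows that all of their sinusoids pass through (r, θ) ≈ (2.4, 0.9273):

import numpy as np

# points chosen to lie on the line 0.6*x + 0.8*y = 2.4
points = [(4.0, 0.0), (0.0, 3.0), (2.0, 1.5)]
theta = 0.9273                      # atan2(0.8, 0.6): the line's normal direction

for x, y in points:
    r = x * np.cos(theta) + y * np.sin(theta)
    print(f"({x}, {y}) -> r = {r:.4f}")   # each prints r ≈ 2.4000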
Hough Transform Algorithm
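A minimal sketch of the accumulator-voting procedure in Python/NumPy (illustrative; it assumes a binary edge image as input, and the function name and bin counts are assumptions, not from the slides):

import numpy as np

def hough_lines(edges, n_theta=180, n_r=200):
    # Vote each edge pixel into a quantized (r, theta) accumulator.
    h, w = edges.shape
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta)
    r_max = np.hypot(h, w)
    rs = np.linspace(-r_max, r_max, n_r)
    acc = np.zeros((n_r, n_theta), dtype=int)

    ys, xs = np.nonzero(edges)                         # edge pixel coordinates
    for x, y in zip(xs, ys):
        r = x * np.cos(thetas) + y * np.sin(thetas)    # this point's sinusoid
        r_idx = np.round((r + r_max) / (2 * r_max) * (n_r - 1)).astype(int)
        acc[r_idx, np.arange(n_theta)] += 1            # one vote per theta bin
    return acc, rs, thetas

# usage: a diagonal line of edge pixels
img = np.zeros((50, 50), dtype=bool)
img[np.arange(50), np.arange(50)] = True
acc, rs, thetas = hough_lines(img)
r_i, t_i = np.unravel_index(acc.argmax(), acc.shape)
print(rs[r_i], np.degrees(thetas[t_i]))   # strongest peak near r = 0, theta = -45 deg

Peaks of the accumulator correspond to lines x cos θ + y sin θ = r in the image.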
Given the following points and discrete values of θ ∈ {-45°, 0°, 45°, 90°}, the value of r = x cos θ + y sin θ is calculated for each point.

Data points:

S.No.   (x, y)    θ = -45°   θ = 0°   θ = 45°   θ = 90°
1       (2, 0)      1.4        2        1.4        0
2       (1, 1)      0          1        1.4        1
3       (2, 1)      0.7        2        2.1        1
4       (1, 3)     -1.4        1        2.8        3
5       (2, 3)     -0.7        2        3.5        3
6       (4, 3)      0.7        4        4.9        3
7       (3, 4)     -0.7        3        4.9        4

Accumulator matrix (number of votes for each (r, θ) cell):

S.No.   r        θ = -45°   θ = 0°   θ = 45°   θ = 90°
1      -1.4         1          0        0          0
2      -0.7         2          0        0          0
3       0           1          0        0          1
4       0.7         2          0        0          0
5       1           0          2        0          2
6       1.4         1          0        2          0
7       2           0          3        0          0
8       2.1         0          0        1          0
9       2.8         0          0        1          0
10      3           0          1        0          3
11      3.5         0          0        1          0
12      4           0          1        0          1
13      4.9         0          0        2          0

The two equal largest values occur at (r, θ) = (2, 0°) and (3, 90°). The corresponding lines are:
x cos 0° + y sin 0° = 2, i.e. x = 2.
x cos 90° + y sin 90° = 3, i.e. y = 3.
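A small script (illustrative, not part of the original example) reproduces this accumulator and its two peaks:

import numpy as np
from collections import Counter

points = [(2, 0), (1, 1), (2, 1), (1, 3), (2, 3), (4, 3), (3, 4)]
thetas_deg = [-45, 0, 45, 90]

votes = Counter()
for x, y in points:
    for t in thetas_deg:
        r = x * np.cos(np.radians(t)) + y * np.sin(np.radians(t))
        votes[(round(r, 1), t)] += 1              # quantize r to one decimal place

for (r, t), n in votes.most_common(2):
    print(f"r = {r}, theta = {t} deg: {n} votes")
# -> (2.0, 0 deg) and (3.0, 90 deg), with 3 votes each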
Figures: data without outliers or noise; data in the presence of outliers or noise; random data points.
Least Square method to fit a line
Fitting aim: to determine values for the slope m and the intercept b in the equation
y = mx + b
Fitting requires the definition of some measure of the error between the data and the line. The overall measure of error is
E(m, b) = Σ [yi - (m·xi + b)]²
where the sum runs over all data points (xi, yi); this is the best fit when the errors follow a Gaussian distribution.
Now find the m and b values that minimize this error. To get the minimum, set the derivatives of E with respect to m and b to zero.
Least Square method to fit a line
Derivative with respect to m:
∂E/∂m = -2 Σ xi [yi - (m·xi + b)] = 0          ... Eq. 1
Least Square method to fit a line
Derivative with respect to b:
∂E/∂b = -2 Σ [yi - (m·xi + b)] = 0             ... Eq. 2
Least Square method to fit a line
In standard notation, these two equations can be written as:
m Σ xi² + b Σ xi = Σ xi·yi
m Σ xi + b·n = Σ yi
Now the values for m and b are given by:
m = [n Σ xi·yi - (Σ xi)(Σ yi)] / [n Σ xi² - (Σ xi)²]
b = [Σ yi - m Σ xi] / n
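A minimal sketch of this closed-form fit (illustrative; assumes the data are stored as NumPy arrays):

import numpy as np

def fit_line_least_squares(x, y):
    # Closed-form least-squares fit of y = m*x + b using the normal equations.
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    sx, sy = x.sum(), y.sum()
    sxy, sxx = (x * y).sum(), (x * x).sum()
    m = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - m * sx) / n
    return m, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0 + np.array([0.1, -0.1, 0.05, -0.05])   # noisy samples of y = 2x + 1
print(fit_line_least_squares(x, y))                      # m close to 2, b close to 1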
RANSAC (RANdom SAmple Consensus)
RANSAC is a robust method for fitting a line (or another model) in the presence of a large proportion of outliers.
View estimation as a two-stage process:
- Classify data points as outliers or inliers.
- Fit the model to the inliers.
RANSAC is a re-sampling technique that generates candidate solutions by using the minimum number of observations (data points) required to estimate the underlying model parameters.
Developed by M. A. Fischler and R. C. Bolles.
Outline of the RANSAC algorithm
1. Randomly select a sample of s data points from S and instantiate the model from this subset.
2. Determine the set of data points Si that lie within a distance threshold t of the model. This set Si is the consensus set of the sample and defines the inliers of S.
3. If the size of Si (the number of inliers) is greater than some threshold T, re-estimate the model using all the points in Si and terminate.
4. If the size of Si is less than T, select a new subset and repeat the above.
5. After N trials the largest consensus set Si is selected, and the model is re-estimated using all the points in that subset Si, as in the sketch below.
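A minimal sketch of this procedure for 2D line fitting (illustrative only; the function and parameter defaults are assumptions, not from the slides):

import numpy as np

def ransac_line(points, s=2, t=0.1, T=None, N=100, rng=None):
    # RANSAC line fit: repeatedly fit a line to s random points and keep
    # the hypothesis with the largest consensus (inlier) set.
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, float)
    best_inliers = np.zeros(len(pts), dtype=bool)

    for _ in range(N):
        (x0, y0), (x1, y1) = pts[rng.choice(len(pts), size=s, replace=False)]
        # implicit line through the two sample points: a*x + b*y + c = 0
        a, b, c = y1 - y0, x0 - x1, x1 * y0 - x0 * y1
        norm = np.hypot(a, b)
        if norm == 0:                                  # degenerate sample, skip
            continue
        dist = np.abs(a * pts[:, 0] + b * pts[:, 1] + c) / norm
        inliers = dist < t
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
        if T is not None and inliers.sum() > T:
            break

    # re-estimate the line on the best consensus set with least squares
    m, b_ls = np.polyfit(pts[best_inliers, 0], pts[best_inliers, 1], 1)
    return m, b_ls, best_inliers

# usage: mostly points on y = 2x + 1, plus a few gross outliers
xs = np.linspace(0, 1, 20)
data = np.c_[xs, 2 * xs + 1]
data[::7, 1] += 5.0                                    # inject three outliers
print(ransac_line(data, t=0.05, rng=0)[:2])            # slope ~2, intercept ~1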
Example of RANSAC
(Figure sequence showing successive random samples, the line fitted to each sample, and its consensus set.)
- Best consensus as per all the sampling in the complete process.
- Again low consensus due to further random sampling.
Comparison of Least Square & RANSAC
(Figure: least-square based fitting vs. RANSAC based fitting.)
How Many Samples are Necessary (N)?
Using all possible samples is often infeasible. Instead, pick N to ensure a probability p that at least one sample (containing s points) consists of all inliers.
Let
a) the probability that any selected data point is an inlier be u, and
b) the probability of observing an outlier be v = 1 - u.
Then, after N iterations of sampling:
1 - p = (1 - u^s)^N
or
N = log(1 - p) / log(1 - u^s)
Analysis of RANSAC
Example: N for the line-fitting problem
n = 12 points (total number of points).
Minimal sample size s = 2.
With 2 outliers, v = 2/12 ≈ 17%; taking v = 20% as the proportion of outliers gives u = 0.8.
For probability p = 0.99 (giving us a 99% chance of getting a pure-inlier sample):
N = log(1 - 0.99) / log(1 - 0.8²) = log(0.01) / log(0.36) ≈ 4.5, so N = 5.
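A one-line check of this formula (illustrative; the function name is made up):

import math

def ransac_trials(p=0.99, v=0.2, s=2):
    # Number of RANSAC samples N such that, with probability p,
    # at least one sample of s points is outlier-free.
    u = 1.0 - v                                   # inlier probability
    return math.ceil(math.log(1 - p) / math.log(1 - u ** s))

print(ransac_trials(p=0.99, v=0.2, s=2))          # -> 5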