Digital Image Processing
Lecture 1
Introduction & Fundamentals
Dr. Pabitra Pal
Assistant Professor,
Department of Computer Applications,
School of Information Science & Technology
MAULANA ABUL KALAM AZAD UNIVERSITY OF TECHNOLOGY
Introduction to the course
► Canvas: link
► Office Hours: Room 220, Acad Building
► Contact: [email protected]
► Textbooks
Weeks 1 & 2 2
Introduction to the course
► Grading
Article Reading and Presentation: 15%
Homework: 20%
Exam: 15%
Project: 50%
Total: 100%
Extra Credit: 50%. If the method and
experimental results of your project achieve the
state of the art, you will earn the extra 50%
credit.
Journals & Conferences
in Image Processing
► Journals:
— IEEE Transactions on Image Processing
— IEEE Transactions on Medical Imaging
— International Journal of Computer Vision
— IEEE Transactions on Pattern Analysis and Machine Intelligence
— Pattern Recognition
— Computer Vision and Image Understanding
— Image and Vision Computing
… …
► Conferences:
— CVPR: Comp. Vision and Pattern Recognition
— ICCV: Intl Conf on Computer Vision
— ACM Multimedia
— ICIP: Intl Conf on Image Processing
— SPIE
— ECCV: European Conf on Computer Vision
— CAIP: Intl Conf on Comp. Analysis of Images and Patterns
… …
Introduction
► What is Digital Image Processing?
Digital Image
— a two-dimensional function f(x, y)
x and y are spatial coordinates
The amplitude of f is called intensity or gray level at the point (x, y)
Digital Image Processing
— processing digital images by means of a computer; it covers low-, mid-, and
high-level processes
low-level: inputs and outputs are images
mid-level: outputs are attributes extracted from input images
high-level: making sense of an ensemble of recognized objects
Pixel
— the elements of a digital image
Origins of Digital Image Processing
One of the first applications of digital images was in the newspaper
industry, when pictures were first sent by submarine cable between
London and New York.
Introduction of the Bartlane cable picture transmission system in the
early 1920s reduced the time required to transport a picture across the
Atlantic from more than a week to less than three hours.
Specialized printing equipment coded pictures for cable transmission
and then reconstructed them at the receiving end.
Origins of Digital Image Processing
The figure was transmitted in this way
and reproduced on a telegraph printer
fitted with typefaces simulating a
halftone pattern.
The initial problems in improving the
visual quality of these early digital
pictures were related to the selection
of printing procedures and the
distribution of intensity levels.
A digital picture produced in 1921 from a
coded tape by a telegraph printer with
special typefaces
Sources for Images
► Electromagnetic (EM) energy spectrum
► Acoustic
► Ultrasonic
► Electronic
► Synthetic images produced by computer
Electromagnetic (EM) energy
spectrum
Major uses
Gamma-ray imaging: nuclear medicine and astronomical observations
X-rays: medical diagnostics, industry, and astronomy, etc.
Ultraviolet: lithography, industrial inspection, microscopy, lasers, biological imaging,
and astronomical observations
Visible and infrared bands: light microscopy, astronomy, remote sensing, industry,
and law enforcement
Microwave band: radar
Radio band: medicine (such as MRI) and astronomy
Examples: Gamma-Ray Imaging
Examples: X-Ray Imaging
Examples: Ultraviolet Imaging
Examples: Light Microscopy Imaging
Examples: Visible and Infrared Imaging
Examples: Infrared Satellite Imaging (USA, 1993 and 2003)
Examples: Automated Visual Inspection
(Figure: the area in which the imaging system detected the plate, and the results of automated reading of the plate content by the system)
Example of Radar Image
Examples: MRI (Radio Band)
Examples: Ultrasound Imaging
Applications of DIP
► Applications
The field of image processing has applications in medicine and the
space program.
Computer procedures are used to enhance the contrast or code the
intensity levels into color for easier interpretation of X-rays and
other images used in industry, medicine, and the biological sciences.
Geographers use the same or similar techniques to study pollution
patterns from aerial and satellite imagery.
Image enhancement and restoration procedures are used to
process degraded images of unrecoverable objects
Fundamental Steps in DIP
(Diagram) Key ideas from the block diagram:
— extracting image components
— improving the appearance (the result is more suitable than the original)
— partitioning an image into its constituent parts or objects
— representing an image for computer processing
Image acquisition
► Image acquisition is the first process in DIP.
► Acquisition could be as simple as being given an
image that is already in digital form.
► Generally, the image acquisition stage involves
preprocessing, such as scaling.
Image enhancement
► Image enhancement is the process of manipulating
an image so the result is more suitable than the
original for a specific application.
► The word specific is important here, because it
establishes at the outset that enhancement
techniques are problem oriented.
► Thus, for example, a method that is quite useful for
enhancing X-ray images may not be the best
approach for enhancing satellite images taken in
the infrared band of the electromagnetic spectrum.
Image Restoration
► Image restoration is an area that also deals
with improving the appearance of an image.
► However, unlike enhancement, which is
subjective, image restoration is objective, in
the sense that restoration techniques tend to
be based on mathematical or probabilistic
models of image degradation.
► Enhancement, on the other hand, is based on
human subjective preferences regarding what
constitutes a “good” enhancement result.
Color image processing
► Color image processing is an area that has
been gaining in importance because of the
significant increase in the use of digital
images over the internet.
► This chapter covers a number of fundamental
concepts in color models and basic color
processing in the digital domain.
► Color is used also as the basis for extracting
features of interest in an image.
Wavelets and other image
transforms
Wavelets are the foundation for representing
images in various degrees of resolution.
In addition to wavelets, we will also discuss
a number of other transforms that are used
routinely in image processing.
Compression and
watermarking
► Compression, as the name implies, deals with techniques
for reducing the storage required to save an image, or the
bandwidth required to transmit it.
► Although storage technology has improved significantly
over the past decade, the same cannot be said for
transmission capacity.
► This is true particularly in uses of the internet, which are
characterized by significant pictorial content.
► Image compression is familiar (perhaps inadvertently) to
most users of computers in the form of image file
extensions, such as the jpg file extension used in the JPEG
(Joint Photographic Experts Group) image compression
standard.
Morphological processing
► Morphological processing deals with tools
for extracting image components that are
useful in the representation and description
of shape.
Segmentation
► Segmentation partitions an image into its constituent
parts or objects.
► In general, autonomous segmentation is one of the most
difficult tasks in digital image processing.
► A rugged segmentation procedure brings the process a
long way toward successful solution of imaging problems
that require objects to be identified individually.
► On the other hand, weak or erratic segmentation
algorithms almost always guarantee eventual failure.
► In general, the more accurate the segmentation, the more
likely automated object classification is to succeed.
Feature extraction
► Feature extraction almost always follows the output of a
segmentation stage, which usually is raw pixel data,
constituting either the boundary of a region (i.e., the set of
pixels separating one image region from another) or all the
points in the region itself.
► Feature extraction consists of feature detection and feature
description.
► Feature detection refers to finding the features in an image,
region, or boundary.
► Feature description assigns quantitative attributes to the
detected features. For example, we might detect corners in a
region, and describe those corners by their orientation and
location; both of these descriptors are quantitative attributes.
Image pattern classification
► Image pattern classification is the process that
assigns a label (e.g., “vehicle”) to an object based
on its feature descriptors.
► Here, we will discuss methods of image pattern
classification ranging from “classical” approaches
such as minimum-distance, correlation, and Bayes
classifiers, to more modern approaches
implemented using deep neural networks.
► In particular, we will discuss in detail deep
convolutional neural networks, which are ideally
suited for image processing work.
Light and EM Spectrum
λ = c/ν and E = hν, where h is Planck's constant, c is the speed of light, and ν is the frequency.
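The relation above is easy to check numerically; the constants below are standard physical values, and 550 nm (green light) is just an illustrative wavelength.

```python
# Photon energy E = h*nu, with nu = c/lambda, for green light (~550 nm).
h = 6.626e-34        # Planck's constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 550e-9  # wavelength in metres

nu = c / wavelength  # frequency, Hz (~5.45e14)
E = h * nu           # energy per photon, joules (~3.6e-19)
print(f"nu = {nu:.3e} Hz, E = {E:.3e} J")
```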
Light and EM Spectrum
► The colors that humans perceive in an
object are determined by the nature of the
light reflected from the object.
e.g., green objects reflect light with wavelengths
primarily in the 500 to 570 nm range while
absorbing most of the energy at other wavelengths
Light and EM Spectrum
► Monochromatic light: void of color
Intensity is the only attribute, from black to white
Monochromatic images are referred to as gray-
scale images
► Chromatic light bands: 0.43 to 0.79 μm
The quality of a chromatic light source:
Radiance: total amount of energy
Luminance (lm): the amount of energy an observer
perceives from a light source
Brightness: a subjective descriptor of light perception
that is practically impossible to measure. It embodies the
achromatic notion of intensity and is one of the key factors
in describing color sensation.
Image Acquisition
(Diagram) Image sensors transform illumination energy into digital images.
Image Acquisition Using a Single Sensor
Image Acquisition Using Sensor Strips
Image Acquisition Process
A Simple Image Formation Model
f(x, y) = i(x, y) · r(x, y)
f(x, y): intensity at the point (x, y)
i(x, y): illumination at the point (x, y)
(the amount of source illumination incident on the scene)
r(x, y): reflectance/transmissivity at the point (x, y)
(the amount of illumination reflected/transmitted by the object)
where 0 < i(x, y) < ∞ and 0 < r(x, y) < 1
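The model can be sketched numerically. The illumination gradient and the checkerboard of reflectance values below (0.93 for snow-like patches, 0.01 for black velvet, taken from the reflectance table later in this lecture) are illustrative, not from the slides.

```python
import numpy as np

# f(x, y) = i(x, y) * r(x, y): illumination times reflectance.
M, N = 4, 4
# Illumination: a horizontal gradient from 100 to 400 lm/m^2 (0 < i < inf).
i = np.linspace(100.0, 400.0, N).reshape(1, N).repeat(M, axis=0)
# Reflectance: alternating bright/dark patches (0 < r < 1).
r = np.where((np.add.outer(np.arange(M), np.arange(N)) % 2) == 0, 0.93, 0.01)
f = i * r  # resulting intensity image
print(f.round(1))
```

Note that the intensity f can never exceed the illumination i, since r < 1.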
Some Typical Ranges of illumination
► Illumination
Lumen — a unit of light flow or luminous flux
Lumen per square meter (lm/m²) — the metric unit of
measure for illuminance of a surface
On a clear day, the sun may produce in excess of 90,000 lm/m² of
illumination on the surface of the Earth
On a cloudy day, the sun may produce less than 10,000 lm/m² of
illumination on the surface of the Earth
On a clear evening, the moon yields about 0.1 lm/m² of illumination
The typical illumination level in a commercial office is about 1000 lm/m²
Some Typical Ranges of Reflectance
► Reflectance
0.01 for black velvet
0.65 for stainless steel
0.80 for flat-white wall paint
0.90 for silver-plated metal
0.93 for snow
Image Sampling and Quantization
Sampling — digitizing the coordinate values
Quantization — digitizing the amplitude values
Representing Digital Images
► The representation of an M×N
numerical array as

             [ f(0,0)     f(0,1)     ...  f(0,N-1)   ]
    f(x,y) = [ f(1,0)     f(1,1)     ...  f(1,N-1)   ]
             [ ...        ...        ...  ...        ]
             [ f(M-1,0)   f(M-1,1)   ...  f(M-1,N-1) ]
Representing Digital Images
► The representation of an M×N
numerical array in matrix notation

        [ a_{0,0}     a_{0,1}     ...  a_{0,N-1}   ]
    A = [ a_{1,0}     a_{1,1}     ...  a_{1,N-1}   ]
        [ ...         ...         ...  ...         ]
        [ a_{M-1,0}   a_{M-1,1}   ...  a_{M-1,N-1} ]
Representing Digital Images
► The representation of an M×N
numerical array in MATLAB (1-based indexing)

             [ f(1,1)   f(1,2)   ...  f(1,N) ]
    f(x,y) = [ f(2,1)   f(2,2)   ...  f(2,N) ]
             [ ...      ...      ...  ...    ]
             [ f(M,1)   f(M,2)   ...  f(M,N) ]
Representing Digital Images
► Discrete intensity interval [0, L-1], with L = 2^k
► The number of bits b required to store an
M × N digitized image:
b = M × N × k
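The storage formula is easy to sanity-check in code; the 1024 × 1024, 8-bit example size below is just an illustration.

```python
def storage_bits(M, N, k):
    """Bits needed to store an M x N image with 2**k intensity levels."""
    return M * N * k

# A 1024 x 1024 image with 256 gray levels (k = 8):
b = storage_bits(1024, 1024, 8)
print(b, "bits =", b // 8, "bytes =", b // (8 * 1024 ** 2), "MB")
```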
Spatial and Intensity Resolution
► Spatial resolution
— A measure of the smallest discernible detail in an
image
— stated with line pairs per unit distance, dots
(pixels) per unit distance, dots per inch (dpi)
► Intensity resolution
— The smallest discernible change in intensity level
— stated with 8 bits, 12 bits, 16 bits, etc.
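Reducing the intensity resolution of an 8-bit image to k bits can be sketched by dropping the least significant bits; the sample array is illustrative, and note the surviving levels are not rescaled to the full [0, 255] range.

```python
import numpy as np

def reduce_bit_depth(img, k):
    """Requantize an 8-bit image to 2**k intensity levels by dropping
    the (8 - k) least significant bits of each pixel."""
    shift = 8 - k
    return ((img >> shift) << shift).astype(np.uint8)

img = np.array([[0, 100], [150, 255]], dtype=np.uint8)
print(reduce_bit_depth(img, 1))  # only two levels remain: 0 or 128
```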
Image Interpolation
► Interpolation — Process of using known
data to estimate unknown values
e.g., zooming, shrinking, rotating, and geometric correction
► Interpolation (sometimes called
resampling) — an imaging method to increase
(or decrease) the number of pixels in a digital image.
Some digital cameras use interpolation to produce a larger
image than the sensor captured or to create digital zoom
http://www.dpreview.com/learn/?/key=interpolation
Image Interpolation:
Nearest Neighbor Interpolation
f1(x2, y2) = f(round(x2), round(y2)) = f(x1, y1)
f1(x3, y3) = f(round(x3), round(y3)) = f(x1, y1)
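The rounding rule can be turned into a small zoom routine. This is a sketch: round-half-up is used explicitly (Python's built-in round is half-to-even), and the integer zoom-factor handling is a simplification.

```python
import numpy as np

def nn_zoom(img, factor):
    """Nearest-neighbor zoom: each output pixel copies the input pixel
    whose coordinates are closest to the back-projected location."""
    M, N = img.shape
    out_M, out_N = int(M * factor), int(N * factor)
    out = np.empty((out_M, out_N), dtype=img.dtype)
    for i in range(out_M):
        for j in range(out_N):
            src_i = min(int(i / factor + 0.5), M - 1)  # round half up, clamp
            src_j = min(int(j / factor + 0.5), N - 1)
            out[i, j] = img[src_i, src_j]
    return out

img = np.array([[10, 20], [30, 40]], dtype=np.uint8)
print(nn_zoom(img, 2))
```

Because whole pixels are copied, nearest-neighbor zooming produces the blocky artifacts visible in the interpolation examples later in this lecture.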
Image Interpolation:
Bilinear Interpolation
f2(x, y) = (1 − a)(1 − b)·f(l, k) + a(1 − b)·f(l + 1, k)
         + (1 − a)b·f(l, k + 1) + ab·f(l + 1, k + 1)
where l = floor(x), k = floor(y), a = x − l, b = y − k.
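The formula transcribes directly into code; the clamping at the image border is an added detail not in the slide's equation.

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at real-valued coordinates (x, y)."""
    l, k = int(np.floor(x)), int(np.floor(y))
    a, b = x - l, y - k                 # fractional offsets in [0, 1)
    l1 = min(l + 1, img.shape[0] - 1)   # clamp neighbors at the border
    k1 = min(k + 1, img.shape[1] - 1)
    return ((1 - a) * (1 - b) * img[l, k] + a * (1 - b) * img[l1, k]
            + (1 - a) * b * img[l, k1] + a * b * img[l1, k1])

img = np.array([[0.0, 1.0], [2.0, 3.0]])
print(bilinear(img, 0.5, 0.5))  # average of the four neighbors: 1.5
```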
Image Interpolation:
Bicubic Interpolation
► The intensity value assigned to point (x,y) is
obtained by the following equation
f3(x, y) = Σ_{i=0}^{3} Σ_{j=0}^{3} a_{ij} x^i y^j
► The sixteen coefficients are determined by using
the sixteen nearest neighbors.
http://en.wikipedia.org/wiki/Bicubic_interpolation
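In practice the sixteen coefficients are rarely solved for explicitly; a common realization, sketched here (this is the Keys cubic-convolution approach, not the coefficient derivation on the slide), weights the 4×4 neighborhood with a cubic kernel (a = −0.5):

```python
import numpy as np

def cubic_weight(t, a=-0.5):
    """Keys' cubic convolution kernel (piecewise cubic, support |t| < 2)."""
    t = abs(t)
    if t <= 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0

def bicubic(img, x, y):
    """Bicubic interpolation at (x, y) using the 4x4 nearest neighbors."""
    M, N = img.shape
    l, k = int(np.floor(x)), int(np.floor(y))
    val = 0.0
    for i in range(l - 1, l + 3):
        for j in range(k - 1, k + 3):
            ii = min(max(i, 0), M - 1)  # clamp at borders
            jj = min(max(j, 0), N - 1)
            val += img[ii, jj] * cubic_weight(x - i) * cubic_weight(y - j)
    return val

# On a linear ramp the interpolant reproduces the exact value.
img = np.add.outer(np.arange(6, dtype=float), np.arange(6, dtype=float))
print(bicubic(img, 2.5, 2.5))  # exact: 2.5 + 2.5 = 5.0
```

Because it uses sixteen neighbors instead of four, bicubic interpolation preserves fine detail better than bilinear, at higher computational cost.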
Examples: Interpolation