FUNDAMENTAL OF REMOTE SENSING AND
DIGITAL IMAGE PROCESSING
DR. SUWIT ONGSOMWANG
SCHOOL OF REMOTE SENSING
INSTITUTE OF SCIENCE
SURANAREE UNIVERSITY OF TECHNOLOGY
2007
Preface
Digital Image Analysis and Interpretation (106 602) is one of the three main core subjects for
graduate students at the School of Remote Sensing, Institute of Science, Suranaree University of
Technology. This course provides general knowledge of digital image processing techniques to
assist the interpretation and analysis of remotely sensed data. Emphasis is given to image
enhancement techniques using various standard image enhancement programs and to the analysis of
data from multispectral sensors. The main reference textbook used in the course is John R.
Jensen (2005), Introductory Digital Image Processing: A Remote Sensing Perspective.
However, since the technology of remote sensing and the techniques of digital image processing
change rapidly, the knowledge in the science of remote sensing and digital image processing must
be reviewed and updated regularly. Therefore, this textbook, entitled "Fundamental of Remote
Sensing and Digital Image Processing", has been prepared for the course (Digital Image
Analysis and Interpretation: 106 602). The major sources of this textbook were the following
remote sensing and digital image processing books:
* Foody, G. M. 2004. Sub-Pixel Methods in Remote Sensing. In: De Jong, S. M. and van der Meer, F. D. (eds.) Remote Sensing Image Analysis: Including the Spatial Domain. Kluwer Academic Publishers, the Netherlands. pp. 37-49.
* Schowengerdt, R. A. 1997. Remote Sensing: Models and Methods for Image Processing. Academic Press, Inc., New York. 522 p.
* Jensen, J. R. 2005. Introductory Digital Image Processing: A Remote Sensing Perspective. 3rd Edition. Prentice Hall. 526 p.
* Lillesand, T. M., R. W. Kiefer, and J. W. Chipman. 2004. Remote Sensing and Image Interpretation. John Wiley & Sons, Inc., New York. 763 p.
* McCloy, K. R. 2006. Resources Management Information Systems: Remote Sensing, GIS and Modeling. 2nd Edition. CRC Press, Taylor & Francis Group, FL. 575 p.
The main objectives of this textbook are as follows:
1. To provide the concepts and principles of remote sensing and sensor technology;
2. To review basic knowledge of aerial photographs and visual interpretation;
3. To provide fundamental and advanced knowledge of digital image processing, including
digital change detection.
The content of the textbook, which consists of nine chapters, can be summarized as follows.
Chapter 1: Concepts and Fundamentals of Remote Sensing. This chapter first reviews
the definition of remote sensing and the development of remote sensing. Then the principles of
remote sensing systems are explained, including (a) ideal and real remote sensing systems, (b) the
electromagnetic remote sensing system, (c) electromagnetic radiation principles, (d) energy
interactions in the atmosphere, (e) atmospheric windows, (f) interactions with earth surface
features, and (g) radiometric concepts, terminology and units.
Chapter 2: Sensor in Remote Sensing. This chapter provides important information
about aircraft and satellite remote sensing systems, including framing and scanning systems. In
addition, the characteristics of selected remote sensor systems used for multispectral and hyperspectral
data collection are summarized in terms of spectral, spatial, radiometric, and temporal resolution.
Chapter 3: Aerial Photography and Visual Interpretation. The characteristics of aerial
photographs, consisting of resolution, photographic scale and relief displacement, are first
summarized. The fundamentals of visual interpretation are then demonstrated in detail, including (a)
image interpretation tasks, (b) elements of image interpretation, (c) image interpretation strategies,
and (d) methods of search.
Chapter 4: Satellite Data and Digital Image Processing. In this chapter, the framework
of digital image processing is introduced, including (a) characteristics of satellite data, (b) digital
image resolution, (c) digital image processing, and (d) an ideal sequence of digital image processing
steps. In addition, digital image processing systems are summarized, with an overview of commercial
and public software.
Chapter 5: Preprocessing in Digital Image Processing. Three main preprocessing
operations, (1) image quality assessment and statistical evaluation, (2) radiometric
correction and (3) geometric correction, are demonstrated in detail in this chapter. Basic operations
of image quality assessment and statistical evaluation are summarized, including (a) histogram
characteristics of remote sensor data, (b) image metadata, (c) viewing individual pixel brightness
values, (d) univariate descriptive image statistics, (e) multivariate descriptive image statistics and (f)
feature space plots. The principles of radiometric and geometric error in remote sensing systems
are explained in detail, and basic and advanced techniques for radiometric and geometric correction
are discussed and demonstrated.
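The univariate descriptive image statistics mentioned above can be sketched in a few lines of code. The brightness values below are hypothetical, not the textbook's sample dataset:

```python
import math

# Hypothetical 3 x 3 single-band image of 8-bit brightness values
band = [
    [36, 40, 45],
    [42, 47, 50],
    [38, 44, 52],
]

pixels = [bv for row in band for bv in row]  # flatten to one list
n = len(pixels)

mean = sum(pixels) / n                                   # mean brightness value
var = sum((bv - mean) ** 2 for bv in pixels) / (n - 1)   # sample variance
std = math.sqrt(var)                                     # standard deviation

print(f"mean={mean:.2f} variance={var:.2f} std={std:.2f}")
```

The same per-band statistics, computed for every band pair, are the building blocks of the variance-covariance and correlation matrices discussed in the chapter.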
Chapter 6: Image Enhancement in Digital Image Processing. Selected image
enhancement operations that have proven valuable for visual analysis and/or subsequent digital
image processing are reviewed and discussed in this chapter. They fall into three categories:
(1) radiometric enhancement, (2) spatial enhancement and (3) spectral enhancement. Radiometric
enhancement deals with the individual pixel values in the image and consists of (a) linear
contrast enhancement and (b) nonlinear contrast stretch. Spatial enhancement modifies pixel
values based on the values of surrounding pixels and includes (a) spatial convolution filtering, (b) Fourier
transformation, (c) CRISP, (d) resolution merge, (e) adaptive filtering and (f) texture transform.
Spectral enhancement modifies more than one band of spectral data and consists of (a) band ratioing,
(b) principal component analysis, (c) indices and (d) RGB to IHS transformation and back again.
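As a minimal illustration of the minimum-maximum linear contrast stretch named above, the following sketch rescales a hypothetical low-contrast band to the full 0-255 display range (the values are illustrative, not from the textbook):

```python
def minmax_stretch(band, out_min=0, out_max=255):
    """Minimum-maximum linear contrast stretch of a single band.

    Rescales the observed brightness range [bv_min, bv_max] so that it
    spans the full display range [out_min, out_max].
    """
    flat = [bv for row in band for bv in row]
    bv_min, bv_max = min(flat), max(flat)
    scale = (out_max - out_min) / (bv_max - bv_min)
    return [
        [round(out_min + (bv - bv_min) * scale) for bv in row]
        for row in band
    ]

# Hypothetical low-contrast band occupying only part of the 0-255 range
band = [[60, 70], [80, 100]]
print(minmax_stretch(band))  # -> [[0, 64], [128, 255]]
```

Percentage linear and standard deviation stretches follow the same formula, only with bv_min and bv_max replaced by percentile or mean ± k·σ cut-off values.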
Chapter 7: Image Classification in Digital Image Processing. Fundamental image
classification concepts, logic and algorithms are reviewed and discussed in this chapter.
Selected common image classification algorithms applied for multispectral classification are
explained separately in three groups: (1) parametric, (2) nonparametric and (3) nonmetric
classification. Parametric classification includes (a) maximum likelihood classifiers and (b)
clustering. Nonparametric classification consists of (a) the level-slice classifier, (b) the parallelepiped
classifier, (c) the minimum distance to means classifier, (d) nearest-neighbor classifiers, (e) sub-pixel
classification and (f) the artificial neural network (ANN) classifier. Nonmetric classification consists of
expert systems.
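The minimum distance to means classifier listed above can be sketched as follows. The two-band class mean vectors are hypothetical, not taken from the textbook:

```python
import math

def minimum_distance_classify(pixel, class_means):
    """Assign a pixel (a vector of band values) to the class whose
    training mean vector is nearest in Euclidean distance."""
    def dist(mean):
        return math.sqrt(sum((p - m) ** 2 for p, m in zip(pixel, mean)))
    # min over class names, keyed by distance to each class mean
    return min(class_means, key=lambda name: dist(class_means[name]))

# Hypothetical two-band (e.g. red, NIR) training means for three classes
means = {"water": (20.0, 10.0), "forest": (40.0, 60.0), "urban": (90.0, 85.0)}

print(minimum_distance_classify((38, 55), means))  # nearest mean is "forest"
```

The parallelepiped and maximum likelihood classifiers differ only in the decision rule: box limits per band in the first case, a probability density built from the class mean vector and covariance matrix in the second.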
Chapter 8: Accuracy Assessment. Sources of error in remote sensing-derived
thematic products are first reviewed and discussed. Then the fundamentals of the error matrix, sample
size, sampling design and evaluation of error matrices are explained. Three types of evaluation of error
matrices are included: (a) descriptive evaluation of error matrices, (b) the discrete multivariate analytical
technique (Kappa analysis) and (c) fuzzification of the error matrix.
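The descriptive evaluation of an error matrix and the Kappa coefficient of agreement can be sketched as follows, using a hypothetical 3 x 3 error matrix (not one of the textbook's tables):

```python
# Hypothetical error matrix: rows = classified data, columns = reference data
matrix = [
    [45,  4,  1],
    [ 6, 38,  6],
    [ 2,  5, 43],
]

n = sum(sum(row) for row in matrix)                           # total samples
observed = sum(matrix[i][i] for i in range(len(matrix))) / n  # overall accuracy

# Chance agreement expected from the row and column marginal totals
row_totals = [sum(row) for row in matrix]
col_totals = [sum(col) for col in zip(*matrix)]
expected = sum(r * c for r, c in zip(row_totals, col_totals)) / (n * n)

# Kappa: agreement beyond chance, normalized by its maximum possible value
kappa = (observed - expected) / (1 - expected)
print(f"overall accuracy={observed:.3f} kappa={kappa:.3f}")
```

Producer's and user's accuracies per class come from the same matrix by dividing each diagonal element by its column total and row total, respectively.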
Chapter 9: Digital Change Detection. In this chapter, the principles of digital change
detection are first described, and then change detection algorithms are reviewed and explained. The
digital change detection algorithms included here are (a) write function memory insertion, (b) multi-
date composite image, (c) image algebra, (d) post-classification comparison, (e) binary change
mask applied to date 2, (f) ancillary data source as date 1, (g) spectral change vector analysis, (h)
chi-square transformation, (i) cross-correlation and (j) knowledge-based vision systems.
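The image algebra approach listed above can be sketched with a simple NDVI differencing example. The (red, NIR) values and the 0.2 change threshold are illustrative assumptions, not from the textbook:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

def ndvi_difference_change(date1, date2, threshold=0.2):
    """Image-algebra change detection: difference the per-pixel NDVI of
    two dates and flag pixels whose absolute change exceeds a threshold."""
    change = []
    for (r1, n1), (r2, n2) in zip(date1, date2):
        change.append(abs(ndvi(r2, n2) - ndvi(r1, n1)) > threshold)
    return change

# Hypothetical (red, NIR) pairs for three pixels on two dates
date1 = [(30, 90), (50, 55), (40, 80)]
date2 = [(30, 88), (20, 95), (70, 75)]
print(ndvi_difference_change(date1, date2))  # -> [False, True, True]
```

The choice of threshold is the critical design decision in all image algebra methods; too low a value flags radiometric noise as change, too high a value misses real change.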
Dr. Suwit Ongsomwang
School of Remote Sensing
Institute of Science
Suranaree University of Technology
December 2007

Table of Contents

Preface
Chapter 1. Concepts and Fundamentals of Remote Sensing
Definition of Remote Sensing
The Development of Remote Sensing
An Ideal and Real Remote Sensing Systems
Electromagnetic Remote Sensing System
The Particle Model
Stefan-Boltzmann Law
Kirchhoff's Law
Wien's Displacement Law
Planck's Law
Energy Interactions in the Atmosphere
Atmospheric Windows
Interactions with the Earth Surface Feature
The interactions of electromagnetic radiation with vegetation
The interactions of electromagnetic radiation with water
The interactions of electromagnetic radiation with soil
Radiometric Concepts, Terminology and Units
Radiometric Characteristics of Radiation Measurement
Radiometric Terminology and Units
Chapter 2. Sensor in Remote Sensing
Framing system
Scanning system
Cross-Track Scanning System
Circular Scanning System
Along-Track Scanning System
Side Scanning System
Chapter 3. Aerial Photography and Visual Interpretation
Characteristics of Aerial Photographs
Resolution
Photographic Scale
Relief Displacement
Visual Interpretation
Image Interpretation Task
Element of Image Interpretation
Image Interpretation Strategies
Method of Search
Chapter 4. Satellite Data and Digital Image Processing
Characteristics of Satellite Data
Digital Image Resolution
Digital Image Processing
An Ideal Step of Digital Image Processing
Chapter 5. Preprocessing in Digital Image Processing
Image Quality Assessment and Statistical Evaluation
Histogram Characteristics of Remote Sensor Data
Image Metadata
Viewing Individual Pixel Brightness Value
Univariate Descriptive Image Statistics
Multivariate Descriptive Image Statistics
Feature Space Plots
Radiometric Correction
Source of Image Radiometry Error
Type of Radiometric Correction
Sensor Calibration
Atmospheric Correction
Solar and Topographic Correction
Geometric Correction
Source of Image Geometry Error
Image to Map Rectification
Image to Image Registration
Hybrid Approach to Image Rectification/Registration
Image to Map Geometric Rectification Logic
Spatial Interpolation ..
Intensity Interpolation.
Image Enhancement in Digital Image Processing
Radiometric Enhancement...
Linear Contrast Enhancement..
Nonlinear Contrast Stretch.
Spatial Enhancement.
Spatial Convolution Filtering
Low-frequency Filtering in the Spatial Domain ..
High-frequency Filtering in the Spatial Domain.
Edge Enhancement in the Spatial Domain
Fourier Transformation.
Spatial Filter in Frequency Domain
CRISP.
Resolution Merge ..
Principal Components Merge.
Multiplicative Algorithm.
Spectral Enhancement
Band Ratioing
Principal Component Analysis (PCA)
Indices.
RGB to IHS Transformation and Back Again..
Land Use and Land Cover Classification Scheme.
Training Site Selection and Feature Selection
Training Site Selection...
Feature selection
Graphic Methods of Feature Selection
Statistical Measures of Feature Selection
Image Classification Algorithm..
Parametric Classification
Maximum Likelihood Classifiers.
Clustering.
Nonparametric Classification
Level-Slice Classifier .
Parallelepiped Classifier
Minimum Distance to Means Classifier.
Nearest-Neighbor Classifiers.
Sub-Pixel Classification ..
Artificial Neural Network (ANN) Classifier
Nonmetric Classification
Expert Systems...
Accuracy Assessment...
Source of Error in Remote Sensing-derived Thematic Products,
General Steps to Assess the Accuracy of Thematic Information Derived
from Remotely Sensed Data
The Error Matrix.
Sample Size
Sample Size Based on Binomial Probability Theory
Sample Size Based on Multinomial Distribution
Sampling Design
Simple Random Sampling
Systematic Sampling
Stratified Random Sampling
Stratified Systematic Unaligned Sampling.
Clustering Sampling
Evaluation of Error Matrices
Descriptive Evaluation of Error Matrices..
Discrete Multivariate Analytical Technique: Kappa Analysis
Fuzzification of the Error Matrix
Chapter 9. Digital Change Detection
Steps Required to Perform Change Detection
Remote Sensor System Considerations
Environmental Characteristics Considerations
Selection of a Change Detection Algorithm
Change Detection Using Write Function Memory Insertion
Multidate Composite Image Change Detection
Image Algebra Change Detection
Post-classification Comparison Change Detection
Change Detection Using a Binary Change Mask Applied to Date 2
Ancillary Data Source as Date 1
Spectral Change Vector Analysis
Chi-square Transformation Change Detection
Cross-correlation Change Detection
Knowledge-based Vision Systems for Detecting Change
Visual On-screen Detection and Digitization

List of Tables

Chapter 1 Concepts and Fundamentals of Remote Sensing
Table 1-1 Comparison of the two major time periods in remote sensing development
Table 1-2 Milestones in the history of remote sensing
Table 1-3 List of spectral regions in descending usefulness for monitoring green vegetation
Table 1-4 Radiometric term, symbol, measure, units
Chapter 2 Sensor in Remote Sensing
Table 2-1 Landsat MSS, Landsat TM and Landsat ETM+ sensor system characteristics
Table 2-2 NOAA AVHRR sensor system characteristics
Table 2-3 Characteristics of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS)
Table 2-4 System characteristics of the AMS and ATLAS
Table 2-5 SPOT HRV, SPOT HRVIR and SPOT Vegetation sensor system characteristics
Table 2-6 Indian Remote Sensing (IRS) satellite characteristics
Table 2-7 NASA ASTER sensor system characteristics
Table 2-8 Sensor characteristics of IKONOS, OrbView-3, and QuickBird satellites
Table 2-9 THEOS sensor system characteristics
Table 2-10 Leica Geosystems Airborne Digital Sensor 40 (ADS-40) characteristics
Table 2-11 Characteristics of AVIRIS and CASI hyperspectral remote sensing systems
Table 2-12 Characteristics of TERRA satellite: MODIS
Table 2-13 Leica Geosystems Emerge Digital Sensor System (DSS) characteristics
Table 2-14 Selected remote sensing systems and their characteristics
Chapter 3 Aerial Photography and Visual Interpretation
Table 3-1 Minimum ground separation on typical aerial photographs
Table 3-2 Elements of image interpretation
Table 3-3 Multidisciplinary scientists bring their unique training to the image interpretation process
Chapter 4 Satellite Data and Digital Image Processing
Table 4-1 Relationship between digitizer scanning spot size (IFOV) and the pixel ground resolution at various scales of aerial photography or imagery
Table 4-2 Image processing functions in quality digital image processing systems
Table 4-3 Major functions of selected commercial digital image processing systems
Table 4-4 Major functions of selected public digital image processing systems
Chapter 5 Preprocessing in Digital Image Processing
Table 5-1 Hypothetical dataset of brightness values
Table 5-2 Variance-covariance matrix of the sample data
Table 5-3 Correlation matrix of the sample data
Table 5-4 Pre-launch measurement of the TM calibration gain and offset coefficients
Table 5-5 Values of LMin and LMax for the Landsat MSS sensor system
Table 5-6 Examples of atmospheric calibration techniques
Table 5-7 Radiometric variables used in remote sensing
Table 5-8 Bilinear interpolation
Table 5-9 Cubic convolution interpolation
Chapter 6 Image Enhancement in Digital Image Processing
Table 6-1 Various kernels for low-frequency filtering
Table 6-2 Various kernels for high-frequency filtering
Table 6-3 Various kernels for linear edge enhancement
Table 6-4 Various kernels for line detection
Table 6-5 Catalog of local convolution filtering
Table 6-6
Chapter 7 Image Classification in Digital Image Processing
Table 7-1 USGS Land-Use/Land-Cover Classification System for use with remote sensor data
Table 7-2 Various distance measure methods for separability analysis
Table 7-3 Advantages and disadvantages of the maximum likelihood decision rule
Table 7-4 Advantages and disadvantages of the ISODATA decision rule
Table 7-5 Advantages and disadvantages of RGB clustering
Table 7-6 Advantages and disadvantages of the parallelepiped decision rule
Table 7-7 Advantages and disadvantages of the minimum distance decision rule
Table 7-8 The input values, activation and output values for a simple two-channel ANN
Table 7-9 Advantages and disadvantages of the ANN algorithm
Table 7-10 Examples of the use of expert systems in remotely sensed data classification
Table 7-11 Hierarchical levels, classes and criteria
Table 7-12 A hypothesis (class), variables and conditions necessary to extract white fir
Table 7-13 Advantages and disadvantages of expert systems
Chapter 8 Accuracy Assessment
Table 8-1 Error matrix and its accuracy assessment by simple descriptive statistics
Table 8-2 Error matrix and its accuracy assessment by Kappa analysis
Table 8-3 Error matrix and its accuracy assessment with fuzzy logic rules
Table 8-4 Error matrix and its accuracy assessment without fuzzy logic rules
Table 8-5
Chapter 9 Digital Change Detection
Table 9-1 Sector code definitions for change vector analysis using three bands

List of Figures

Chapter 1 Concepts and Fundamentals of Remote Sensing
Figure 1-1 Components of an ideal remote sensing system
Figure 1-2 Electromagnetic remote sensing system of earth resources
Figure 1-3 A conceptual view of an Earth observational system
Figure 1-4 Remote sensing process
Figure 1-5 Types of energy transfer: conduction, convection and radiation
Figure 1-6 An electromagnetic wave
Figure 1-7 Electromagnetic spectrum diagrams
Figure 1-8 Spectral distribution of energy radiated from blackbodies at various temperatures
Figure 1-9 Interrelationship between temperature, wavelength, frequency, radiant energy, radiant exitance and the point of maximum spectral radiant exitance
Figure 1-10 Atmospheric scattering
Figure 1-11 Atmospheric absorption by various gases.
Figure 1-12 Atmospheric refraction.
Figure 1-13 Spectral characteristics of energy sources, atmospheric effects,
and sensing systems
Figure 1-14 Basic interactions between electromagnetic energy and the earth surface feature.
Figure 1-15 Typical spectral reflectance curve for vegetation, soil and water
Figure 1-16 The nature of specular and diffuse reflectance
Figure 1-17. Hemispherical, directional and bi-directional reflectance
Figure 1-18 The bi-directional reflectance function (BRDF) of a recently ploughed bare field
at 25° solar elevation for 662 nm wavelength
Figure 1-19 The bi-directional reflectance function (BRDF) of a sand shinnery oak
rangeland community at 31° solar zenith angle at 662 nm
Figure 1-20 The bi-directional reflectance function (BRDF) of a sand shinnery oak
rangeland community at 31° solar zenith angle at 862 nm
Figure 1-21 The bi-directional reflectance effects..
Figure 1-22 Atmospheric effects influencing the measurement of reflected solar energy
Figure 1-23. Effects of seasonal change on solar elevation angle ..
Figure 1-24 Significant spectral responses characteristics of green vegetation
Figure 1-25 Water bodies receive irradiance from the Sun and atmosphere...
Figure 1-26
Figure 1-27
Figure 1-28 Soils and rocks receive irradiance from the Sun and atmosphere.
Figure 1-29 Representative spectra for the reflectance of soil
Figure 1-30 Reflectance of a typical soil with changes in moisture content.
Figure 1-31 Measurement of incoming and outgoing radiation
Figure 1-32. Projected area cosine in a viewing direction other than normal
Figure 1-33 Defining angles used in radiation measurement
Figure 1-34 Relationship among the various terms in hemispherical and directional
radiation measurement.
Figure 1-35 Characteristics of radiant flux density.
Figure 1-36 Concept of radiance

Chapter 2 Sensor in Remote Sensing
Figure 2-1 Framing system for acquiring remote sensing images
Figure 2-2 Cross-Track Scanner
Figure 2-3 Circular Scanner
Figure 2-4 Along-Track Scanner
Figure 2-5 Side Scanning System
Figure 2-6 Six types of remote sensor systems
Chapter 3 Aerial Photography and Visual Interpretation
Figure 3-1 Resolution and detection targets with high contrast ratio
Figure 3-2 Resolution and detection targets with low contrast ratio
Figure 3-3 Ground resolution and minimum ground resolution on aerial photographs
Figure 3-4 Geometry of relief displacement on a vertical aerial photograph
Figure 3-5 Element of Image Interpretation - Tone
Figure 3-6 Element of Image Interpretation - Texture
Figure 3-7 Element of Image Interpretation - Shadow
Figure 3-8 Element of Image Interpretation - Texture
Figure 3-9 Element of Image Interpretation - Site, Situation and Association
Figure 3-10 Element of Image Interpretation - Shape
Figure 3-11 Element of Image Interpretation - Size
Chapter 4 Satellite Data and Digital Image Processing
Figure 4-1 Digital remote sensor data
Figure 4-2 Multispectral concept and data representation
Figure 4-3 Analog-to-digital conversion process
Figure 4-4 Schematic of flatbed densitometer
Figure 4-5 Schematic of drum densitometer
Figure 4-6 Video densitometer
Figure 4-7 Video densitometer
Figure 4-8 Characteristics of spectral resolution
Figure 4-9 Characteristics of spatial resolution
Figure 4-10 Characteristics of radiometric resolution
Figure 4-11 Characteristics of temporal resolution based on Landsat
Figure 4-12 Four types of resolution of Landsat TM Band 2
Figure 4-13 Spatial and temporal resolution considerations for selected applications
Figure 4-14 A typical digital image processing laboratory
Figure 4-15 Idealized sequence for digital image analysis
Figure 4-16 Analog (visual) and digital image processing of remotely sensed data use the fundamental elements of image processing
Chapter 5 Preprocessing in Digital Image Processing
Figure 5-1 Histograms of symmetric and skewed distributions
Figure 5-2 Histogram of a single band of Landsat Thematic Mapper data
Figure 5-3 Cursor evaluation of individual pixel brightness values
Figure 5-4 The areas under the normal curve for various standard deviations
Figure 5-5 Feature space plot of Landsat TM bands 3 and 4
Figure 5-6 Data flow for calibration of remote sensing images to physical units
Figure 5-7 Radiometric response function for an individual TM channel
Figure 5-8 Inverse of the radiometric response function for an individual TM channel
Figure 5-9 Various paths of radiance received by a remote sensing system
Figure 5-10 An example of absolute atmospheric correction using ATCOR
Figure 5-11 Absolute atmospheric correction using empirical line calibration
Figure 5-12 Single-image normalization using histogram adjustment
Figure 5-13 Multiple-date image normalization using regression
Figure 5-14 Representation of the Sun's angle of incidence and solar zenith angle
Figure 5-15 Image offset (deskew) caused by the earth rotation effect
Figure 5-16 The geometric configuration of scanning system-induced variation in ground resolution cell size
Figure 5-17 One-dimensional relief displacement and tangential scale distortion in scanning systems
Figure 5-18 Geometric modification of remotely sensed data caused by changes in platform altitude and attitude
Figure 5-19 Example of image-to-map rectification
Figure 5-20 Example of image-to-image hybrid registration
Figure 5-21 Concept of how different-order transformations fit a hypothetical surface
Figure 5-22 The logic of filling a rectified output matrix with values from an unrectified input image matrix
Figure 5-23 Linear transformation
Figure 5-24 Non-linear transformation
Figure 5-25 Spatial interpolation algorithm
Chapter 6 Image Enhancement in Digital Image Processing
Figure 6-1 Histograms of radiometrically enhanced data
Figure 6-2 Graph of a lookup table
Figure 6-3 Enhancement with lookup tables
Figure 6-4 Minimum-maximum contrast stretching
Figure 6-5 Percentage linear and standard deviation contrast stretching
Figure 6-6 Piecewise linear contrast stretch
Figure 6-7 Nonlinear radiometric enhancement
Figure 6-8 Histogram equalization
Figure 6-9 Original histogram data
Figure 6-10 Equalized histogram data
Figure 6-11 Comparison of radiometric enhancement
Figure 6-12 Example of brightness inversion
Figure 6-13 Spatial frequencies
Figure 6-14 Applying a convolution kernel
Figure 6-15 Application of various linear enhancements to Landsat TM
Figure 6-16 Application of various non-linear enhancements to Landsat TM
Figure 6-17 One-dimensional Fourier analysis
Figure 6-18 Application of the Fourier transform to three different sub-images
Figure 6-19 Two examples of stationary periodic noise and their Fourier transforms
Figure 6-20 A filtering algorithm that uses the Fourier transform to compute a spatial domain convolution
Figure 6-21 Spatial filter in the frequency domain using a Fourier transform
Figure 6-22 Application of the Fourier transform to a portion of Landsat TM
Figure 6-23 Application of crisp filtering to Landsat TM
Figure 6-24 Resolution merge between SPOT and Landsat TM
Figure 6-25 Application of adaptive filtering to Landsat TM
Figure 6-26 The eight nearest neighbors of pixel X according to angle used in the creation of spatial-dependency matrices for the measurement of image texture
Figure 6-27 5 x 5 window of pixels and their image values
Figure 6-28 The ratio of various Landsat TM bands
Figure 6-29 Principal Component Analysis
Figure 6-30 Distribution of total image variance across the original spectral bands and across the principal components
Figure 6-31 Scene dependence in the percentage of total TM image variance captured by
each eigenvalue. The TIR band is excluded ..
Figure 6-32. Six principal component images of Landsat-TM
Figure 6-33. Various vegetation indices of Landsat T™M ..
Figure 6-34 Intensity-hue-saturation (IHS) color coordinate system
Figure 6-35 Relationship between the intensity-hue-saturation (IHS) color coordinate system and
RGB coordinate system...
Figure 6-36 Resolution merge between SPOT and Landsat TM data based on
modified IHS resolution merge ...
Chapter 7 Image Classification in Digital Image Processing..
Figure 7-1 The data flow in a classification process...
Figure 7-2. Comparison between a traditional hard and soft classification logic.
Figure 7-3 Relationship between the level of detail required and the spatial resolution
of representative remote sensing system for vegetation inventories...
Figure 7-4 Coincident spectral plots for training data obtained in five bands
for six cover type ..
Figure 7-5 Cospectral mean vector plot of 49 clusters
Figure 7-6 Ellipse Evaluation of Signatures...
Figure 7-7 Simple parallelepiped displayed in pseudo three-dimensional space.
Figure 7-8 How the maximum likelihood decision rule functions
Figure 7-9 Land use and land cover classification using maximum likelihood decision rule...
Figure 7-10 An idealized data distribution during three iterations of the K-means clustering
algorithms with the nearest-mean decision criterion...
Figure 7-11 Typical behavior of the net mean migration from one iteration to the next
in the K-mean algorithm ..
Figure 7-12 An idealized data distribution during initial stage up to n iterations of
the ISODATA clustering algorithms.
Figure 7-13. Land use and land cover classification using ISODATA clustering algorithms.
Figure 7-14 RGB clustering algorithm.
Figure 7-15. Land use and land cover classification using RGB clustering algorithms.
Figure 7-16 Level-slice decision boundaries for three classes in two dimensions
Figure 7-17 Parallelepiped classification using ± two standard deviations as limits
Figure 7-18 Land use and land cover classification using parallelepiped decision rule.
Figure 7-19. Calculation of spectral distance to means,
Figure 7-20 Distances used in a minimum distance to means classification algorithm
Figure 7-21 Land use and land cover classification using the minimum distance decision rule
Figure 7-22 Hypothetical example of nearest-neighbor classification
Figure 7-23 Some common origins of mixed pixel problems
Figure 7-24 The linear mixing model for a single GIFOV
Figure 7-25 Three possible choices of endmembers for three classes
Figure 7-26 An example of a typical sub-pixel analysis
Figure 7-27 The logic of fuzzy classification
Figure 7-28 Basic structure of a three-layer Artificial Neural Network
Figure 7-29 The components of a processing element
Figure 7-30 The sigmoid activation function
Figure 7-31 Mathematical model of a neuron
Figure 7-32 Two thresholding functions used to transform the activation to output values at a synapse
Figure 7-33 The data space and the decision surface for a simple two-channel ANN
Figure 7-34 The components of a typical rule-based expert system
Figure 7-35 Example of a decision tree branch
Figure 7-36 Split rule decision tree branch
Figure 7-37 A human-derived decision-tree expert system with a rule and conditions
Figure 7-38 Land use and land cover classification using expert classification
Chapter 8 Accuracy Assessment
Figure 8-1 Sources of error in remote sensing-derived thematic products
Figure 8-2 General steps to assess the accuracy of thematic information derived from remotely sensed data
Figure 8-3 Geographic sampling methods
Chapter 9 Digital Change Detection
Figure 9-1 General data processing elements for a remote sensing change detection application project or scientific study
Figure 9-2 Sequential steps outlining the project formulation process for a remote sensing change detection project
Figure 9-3 The general steps used to perform digital change detection of remotely sensed data
Figure 9-4 Phenological cycle of cattails and water in Par Pond in SC
Figure 9-5 Algorithm, advantages and disadvantages of write function memory insertion change detection
Figure 9-6 An example of change detection using write function memory insertion
Figure 9-7 Algorithm, advantages and disadvantages of multiple-date composite image change detection
Figure 9-8 Principal components derived from a multiple-date dataset of Landsat TM in 1999 and 2004
Figure 9-9 Algorithm, advantages and disadvantages of image algebra change detection
Figure 9-10 Image differencing change detection scaling alternatives and placement of
Figure 9-11 Image differencing change detection
Figure 9-12 Vegetation cover change detection based on differencing NDVI
Figure 9-13 Algorithm, advantages and disadvantages of post-classification comparison change detection
Figure 9-14 Algorithm, advantages and disadvantages of change detection using a binary change mask applied to date 2
Figure 9-15 Algorithm, advantages and disadvantages of change detection using ancillary data as date 1
Figure 9-16 Schematic diagram of the spectral change detection method
Figure 9-17 Possible change sector codes for a pixel measured in three bands on two dates
Figure 9-18 Algorithm, advantages and disadvantages of cross-correlation change detection
Figure 9-19 Visual on-screen change detection of the Tsunami's effects
Figure 9-20
Chapter 1: Concepts and Fundamentals of Remote Sensing
1.1 Definition of Remote Sensing
Linz and Simonett (1976) defined remote sensing as the acquisition of data about an object
without touch or contact.
Lillesand and Kiefer (1979) defined remote sensing as the science and art of obtaining
information about an object, area, or phenomenon through the analysis of data acquired by a
device that is not in contact with the object, area, or phenomenon under investigation.
Barrett and Curtis (1982) defined remote sensing as the observation of a target by a device
some distance away from it.
Colwell (1983) defined remote sensing as the measurement or acquisition of information about
some property of an object or phenomenon, by a recording device that is not in physical or intimate
contact with the object or phenomenon under study.
Curran (1985) defined remote sensing as the use of electromagnetic radiation sensors to
record images of the environment which can be interpreted to yield useful information.
Sabins (1987) broadly defined remote sensing as collecting and interpreting information
about a target without being in physical contact with the object.
Campbell (1987) defined remote sensing as the science of deriving information about the
earth's land and water areas from images acquired at a distance.
Colwell (1997) defined remote sensing as the art, science, and technology of obtaining
reliable information about physical objects and the environment, through the process of recording,
measuring and interpreting imagery and digital representations of energy patterns derived from
non-contact sensor systems.
In conclusion, remote sensing can be defined as the art, science, and technology of
obtaining reliable information about objects, areas, and phenomena of the earth's resources and
environment, through the process of recording, measuring and interpreting imagery and digital
representations of energy patterns derived from non-contact sensor systems.
1.2 The Development of Remote Sensing
Simonett (1983) reported that Colwell (1979) divided the development of remote sensing
into two general eras. Prior to about 1960, aerial photography was the sole system used in remote
sensing. With the advent of the space program in the early 1960s and the first photographs from the
Mercury missions, the pace of technological development for remote sensing accelerated. Table 1-1 shows a
comparison of the two time periods.
By Dr. Suwit Ongsomwang, School of Remote Sensing, Suranaree University of Technology, 2007.

Table 1-1: Comparison of the two major time periods in remote sensing development (Colwell, 1979)

Prior to the space age (1860-1960):
A. Only one kind and date of photography.
B. Heavy reliance on human analysis of unenhanced images.
C. Extensive use of photo interpretation keys.
D. Relatively good military/civil relations with respect to remote sensing.
E. Few problems, untold opportunities.
F. Minimal applicability of the "multi" concept.
G. Equipment simple and inexpensive, readily operated and maintained by resource-oriented workers.
H. Little concern about the renewability of resources, environmental protection, global resource information systems, and associated problems related to "signature extension", "complexity of an area's structure", and/or the threat imposed by "economic weaponry".
I. Heavy resistance to "technology acceptance" by potential users of remote sensing-derived information.

Since 1960:
A. Many kinds and dates of remote sensing data.
B. Heavy reliance on machine analysis and enhancement of images.
C. Minimal use of photo interpretation keys.
D. Relatively poor military/civil relations with respect to remote sensing.
E. Many problems, untold opportunities.
F. Extensive applicability of the "multi" concept.
G. Equipment complex and expensive, not readily operated and maintained by resource-oriented workers.
H. Great concern about the renewability of resources, environmental protection, global resource information systems, and associated problems related to "signature extension", "complexity of an area's structure", and/or the threat imposed by "economic weaponry".
I. Continuing heavy resistance to "technology acceptance" by potential users of remote sensing-derived information.

From: Simonett (1983)
Elachi (1987) also outlined the development of remote sensing, summarized as
follows:
The early development of remote sensing as a scientific field was clearly tied to
developments in photography. The first photographs were reportedly taken by Daguerre and
Niepce in 1839. The following year, Arago, Director of the Paris Observatory, advocated the use of
photography for topographic purposes. In 1849 Colonel Aimé Laussedat, an officer in the French
Corps of Engineers, embarked on an exhaustive program to use photography in topographic
mapping. In 1858 balloons were being used to acquire photography of large areas. This was
followed by the use of kites in the 1880s and pigeons in the early 1900s to carry cameras to
altitudes of many hundreds of meters. The advent of the airplane made aerial photography a very useful tool
because acquisition of data over specific areas and under controlled conditions became possible. The
first recorded photographs taken from an airplane were acquired in 1909 over Centocelle, Italy, from a
plane piloted by Wilbur Wright.
Color photography became available in the mid-1930s. At the same time, work was
continuing on the development of films that were sensitive to near-infrared radiation. Near-infrared
photography was particularly useful for haze penetration. During World War II, research was
conducted on the spectral reflectance properties of natural terrain and on the availability of
photographic emulsions for aerial color infrared photography. The main incentive was to develop
techniques for camouflage detection.
In 1956, Colwell performed some of the early experiments on the use of special-purpose
aerial photography for the classification and recognition of vegetation types and the detection of
diseased and damaged vegetation. Beginning in the mid-1960s, a large number of multispectral
photography experiments were undertaken under the sponsorship of NASA, leading to the launch of
multispectral imagers on the Landsat satellites in the 1970s.
At the long-wavelength end of the spectrum, active microwave systems have been used
since early this century, and particularly after World War II, to detect and track moving objects such
as ships and, later, planes. More recently, active microwave sensors have been developed that provide
two-dimensional images that look very similar to regular photography, except that the image brightness
is a reflection of the scattering properties of the surface in the microwave region. Passive
microwave sensors were also developed to provide photographs of the microwave emission of
natural objects.
The tracking and ranging capabilities of radio systems were known as early as 1889, when
Heinrich Hertz showed that solid objects reflected radio waves. In the first quarter of this century, a
number of investigations were conducted in the use of radar systems for the detection and tracking of
ships and planes and for the study of the ionosphere. Radar work expanded dramatically during
World War II. Today, the diversity of applications for radar is truly startling. It is being used to
study ocean surfaces, lower and upper atmospheric phenomena, subsurface and surface land
structures, and surface cover. Radar sensors exist in many different configurations. These include
altimeters to provide topographic measurements, scatterometers to measure surface roughness,
and imagers.
In the mid-1950s extensive work took place in the development of real-aperture airborne
imaging radars. At about the same time, work was ongoing in developing synthetic aperture
imaging radar (SAR), which uses coherent signals to achieve high resolution; this capability became
available to the scientific community in the mid-1960s. Since then, work has continued at a number of
institutions to develop the capability of radar sensors to study natural surfaces. This work led to the
orbital flight around the Earth of the Seasat SAR (1978) and the Shuttle Imaging Radar (1981, 1984).
The most recently introduced remote sensing instrument is the laser, which was first
developed in 1960. It is mainly being used for atmospheric studies, topographic mapping, and
surface studies by fluorescence.
In addition, de Jong et al. (2004) reviewed the historic development of remote sensing in
each decade as follows:
In 1859 Gaspard Tournachon took an oblique photograph of a small village near Paris from
a balloon. With this picture the era of earth observation and remote sensing had started. Other
people all over the world soon followed his example. During the Civil War in the United States,
aerial photography from balloons played an important role in revealing the defence positions in
Virginia. As with other scientific and technical developments, the Civil War period in the United
States sped up the development of photography, lenses and the applied airborne use of this
technology. Although the space era of remote sensing was still far away after the Civil War, already
in 1891 patents were granted in Germany to successful designs of rockets with imaging systems
under the title "new or improved apparatus for obtaining bird's-eye photographic views of the
earth". The design comprised a rocket-propelled camera system that was recovered by a parachute.
The next period of fast developments in earth observation took place in Europe and not in
the United States. It was during World War I that airplanes were used on a large scale for
photoreconnaissance. Aircraft proved to be more reliable and more stable platforms for earth
observation than balloons. In the period between World War I and World War II a start was made
with the civilian use of aerial photos. Application fields of airborne photos at that time included
geology, forestry, agriculture and cartography. These developments led to improved cameras,
films and interpretation equipment. The most important developments of aerial photography and
photo interpretation took place during World War II. During this time span the development of
other imaging systems, such as near-infrared photography, thermal sensing and radar, took place.
Near-infrared photography and thermal infrared proved very valuable in separating real vegetation
from camouflage. The first successful airborne imaging radar was not used for civilian purposes but
proved valuable for nighttime bombing. As such, the system was called by the military "plan
position indicator" and was developed in Great Britain in 1941.
After the wars, in the 1950s, remote sensing systems continued to evolve from the systems
developed for the war effort. Color infrared (CIR) photography was found to be of great use for the
plant sciences. In 1956 Colwell conducted experiments on the use of CIR for the classification and
recognition of vegetation types and the detection of diseased and damaged or stressed vegetation.
It was also in the 1950s that significant progress in radar technology was achieved. Two types of
radar were developed at that time: SLAR (side-looking airborne radar) and SAR (synthetic aperture
radar). Both developments aimed at the acquisition of images at the highest possible resolution.
Crucial to the SAR development was the ability to finely resolve the Doppler frequencies using a
frequency analysis algorithm on the returning radar signal, achieved by the US Air Force research centre.
In the early 1960s the US started placing remote sensors in space for weather observation
and later for land observations. TIROS (Television Infrared Observation Satellite) was the first
meteorological satellite. A long series of meteorological satellites followed this one. 1960 was also
the beginning of a famous US military space imaging reconnaissance program called Corona.
Unfortunately, much of this program remained classified until 1995. In 1970 the TIROS program
was renamed NOAA (National Oceanic and Atmospheric Administration). Until today the NOAA
Advanced Very High Resolution Radiometer (AVHRR) has been orbiting the globe and collecting
information on weather patterns in visible, near-infrared and thermal wavelengths. NOAA-17 was
launched on June 24, 2002. The 1950s and 1960s were also important for the organizational
development of remote sensing. Various civil research organizations and universities became highly
interested in these new technologies. This resulted in the start of various professional organizations
and the publishing of remote sensing journals such as the IEEE Transactions on Geoscience and
Remote Sensing, International Journal of Remote Sensing, Remote Sensing of Environment and
Photogrammetric Engineering & Remote Sensing. Today remote sensing is taught not only at the
university level but also at high schools.
In the early 1970s the first satellite specifically designed to collect data on the earth's surface
and its resources was developed and launched: ERTS-1, the Earth Resources Technology Satellite.
Later, in 1975, this program was renamed Landsat. This first earth resources satellite was in
fact a modified Nimbus weather satellite carrying two types of sensors: a four-waveband multi-
spectral scanner (MSS) and three return beam vidicon television cameras (RBV). The sensors
aboard this satellite proved able to collect high-quality images at a reasonable spatial
resolution. These images gave remote sensing worldwide recognition as a valuable technology.
The main advantages recognized at that time were: ready availability of images for most of the
world; lack of political, security and copyright restrictions; low cost; repetitive multi-spectral
coverage; and minimal image distortion. Landsat 2 and 3 were launched in 1975 and 1978,
respectively, and carried the same payload as the first satellite of this series. The payload was
changed in 1982 with Landsat 4: the technically more advanced Thematic Mapper (TM) sensor
replaced the RBV. An improved design of the TM, the ETM+ (Enhanced Thematic Mapper Plus), was
mounted aboard Landsat 7 and launched in 1999. The Landsat series is a very successful program;
various MSS and TM sensors exceeded their design lifetimes by far, and its imagery is probably the
most widely used data in the Earth sciences. One black spot on its history record is the "failure
upon launch" of Landsat 6 in 1993.
Various other successful earth observation missions carried out by other countries followed
the Landsat program. In 1978 the French government decided to develop its own earth
observation program. This program resulted in the launch of the first SPOT satellite in 1986. To the
original SPOT design of three spectral bands, a new sensor called Vegetation was added aboard
SPOT-4 in 1998. Other earth observation missions are the Indian Remote Sensing program (IRS)
started in 1988, the Russian Resurs series first launched in 1985, and the Japanese ADEOS
(Advanced Earth Observing Satellite) put in orbit in 1996. The European Space Agency (ESA)
launched its first remote sensing satellite, ERS-1, in the year 1991. ERS carries various types of
sensors aboard, among which is the AMI, a C-band (5 cm) active microwave instrument. The
main focus of the ERS program is oceanographic applications, although it is also widely used for
monitoring tropical forests. In 1995 ERS-2 was successfully launched. In March 2002 ESA launched
Envisat-1, an earth observation satellite with an impressive payload of 10 instruments such as a
synthetic aperture radar (ASAR) and a Medium Resolution Imaging Spectrometer (MERIS). An
important recent development is the launch of high-resolution earth observation systems such as
IKONOS and QuickBird. These systems have multi-spectral sensors collecting information in 4
bands (blue, green, red and near-infrared) at a spatial resolution of 4 meters or better. IKONOS
also has a panchromatic mode (0.45-0.90 µm) with a spatial resolution of 1 m. With IKONOS,
QuickBird and similar systems, spaceborne remote sensing approaches the quality of airborne
photography. Table 1-2 shows a few important dates in the development of remote sensing.
Table 1-2: Milestones in the history of remote sensing

1800 | Discovery of Infrared by Sir W. Herschel
1839 | Beginning of Practice of Photography
1847 | Infrared Spectrum Shown by J.B.L. Foucault
1859 | Photography from Balloons
1873 | Theory of Electromagnetic Spectrum by J.C. Maxwell
1909 | Photography from Airplanes
1916 | World War I: Aerial Reconnaissance
1935 | Development of Radar in Germany
1940 | WWII: Applications of Non-Visible Part of EMS
1950 | Military Research and Development
1959 | First Space Photograph of the Earth (Explorer-6)
1960 | First TIROS Meteorological Satellite Launched
1972 | Launch of Landsat-1 (ERTS-1): MSS Sensor
1972 | Rapid Advances in Digital Image Processing
1973 | Skylab Remote Sensing Observations from Space
1978 | Launch of Seasat (First Spaceborne L-band Radar)
1982 | Launch of Landsat-4: New Generation of Landsat Sensors with TM
1986 | French Commercial Earth Observation Satellite SPOT
1986 | Development of Hyperspectral Sensors
1990 | Development of High-Resolution Spaceborne Systems
1990 | First Commercial Developments in Remote Sensing
1991 | Launch of the First European Remote Sensing Satellite ERS-1 (Active Radar)
1998 | Towards Cheap One-Goal Satellite Missions
1999 | Launch of EOS-TERRA: NASA Earth Observing Mission
1999 | Launch of IKONOS, Very High Spatial Resolution Sensor System
1999 | Launch of Landsat-7 with New ETM+ Sensor
2001 | Launch of QuickBird, Very High Spatial Resolution Sensor System
2002 | Launch of ESA's Envisat with 10 Advanced Instruments

From: de Jong et al., 2004
1.3 Ideal and Real Remote Sensing Systems
Lillesand and Kiefer (1979) introduced the basic components of an ideal remote sensing
system (Figure 1-1) and contrasted them with real remote sensing systems. The ideal remote sensing
system would include:
1. A Uniform Energy Source. This source would provide energy over all wavelengths, at a
constant, known, high level of output, irrespective of time and place.
2. A Noninterfering Atmosphere. This would be an atmosphere that would not modify
the energy from the source in any manner, whether that energy were on its way to the earth's
surface or coming from it. Again, ideally, this would hold irrespective of the wavelength, time, place,
and sensing altitude involved.
3. A Series of Unique Energy/Matter Interactions at the Earth's Surface. These
interactions would generate reflected and/or emitted signals that are not only selective with
respect to wavelength, but also are known, invariant, and unique to each and every earth surface
feature type and subtype of interest.
4. A Super Sensor. This would be a sensor highly sensitive to all wavelengths, yielding
spatially detailed data on the absolute brightness (or radiance) from a scene as a function of
wavelength, throughout the spectrum. This super sensor would be simple and reliable, require virtually
no power or space, and be accurate and economical to operate.
5. A Real-Time Data Handling System. In this system, the instant the radiance-versus-wavelength
response over a terrain element were generated, it would be processed into an
interpretable format and recognized as being unique to the particular terrain element from which it
came. This process would be performed nearly instantaneously (in real time), providing timely
information.
6. Multiple Data Users. These people would have knowledge of great depth, both of their
respective disciplines and of remote sensing data acquisition and analysis techniques. The same set
of data would become various forms of information for different users, because of their wealth of
knowledge about the particular earth resource being sensed. With this information, the various
users would make profound, wise decisions about how best to manage the earth resource under
scrutiny, and these management decisions would be implemented.
[Figure: a plot of radiance versus wavelength showing a unique response for each feature, with labeled components (1) uniform energy source, (2) noninterfering atmosphere, (3) unique energy interactions at earth surface features, and (4) super sensor, linked by reflected and emitted energy]
Figure 1-1: Components of an ideal remote sensing system (From Lillesand and Kiefer, 1979)
Unfortunately, an ideal remote sensing system as described above does not exist (Lillesand
and Kiefer, 1979). Regarding the elements of the ideal system suggested above, the
following general shortcomings of real systems should be recognized:
1. The Energy Source. All passive remote sensing systems rely on energy that is either
reflected and/or emitted from earth surface features. Solar energy levels obviously vary with
respect to time and location, and different earth surface materials emit energy with varying degrees
of efficiency. While we have some control over the nature of the sources of energy for active systems,
the sources of energy used in all real systems are generally non-uniform with respect to
wavelength, and their properties vary with time and location. Consequently, we normally must
calibrate for source characteristics on a mission-by-mission basis, or deal with relative energy units
sensed at any given time and location.
2. The Atmosphere. The atmosphere normally compounds the problems introduced by
energy source variation. To some extent, the atmosphere always modifies the strength and
spectral distribution of the energy received by a sensor. It restricts "where we can look" spectrally,
and its effects vary with wavelength, time and place. Elimination of, or compensation for,
atmospheric effects via some form of calibration is particularly important in those applications
where repetitive observations of the same geographical area are involved.
PY Or. Suit Ongsonwang, School of Remote Sensing, Suranaree University of Technology, 2007.Fundamentals of Remote Sensing and Digtal Image Processing Page 10
3. The Energy/Matter Interactions at the Earth's Surface. Remote sensing would be
simple if each and every material reflected and/or emitted energy in a unique, known way.
Although spectral signatures play a central role in detecting, identifying, and analyzing earth
surface materials, the spectral world is full of ambiguity. Radically different material types can have
great spectral similarity, making differentiation difficult. Furthermore, the general understanding of
the energy/matter interactions for earth surfaces is at an elementary level for some materials and
virtually nonexistent for others.
4. The Sensor. No single sensor is sensitive to all wavelengths. All real sensors have
fixed limits of spectral sensitivity. They also have a limit on how small an object on the earth's
surface can be and still be seen by a sensor as being separate from its surroundings. This limit,
called the spatial resolution of a sensor, is an indication of how well a sensor can record spatial
detail.
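The text defines spatial resolution only qualitatively. As a hedged illustration (the altitude and IFOV values below are hypothetical examples, not taken from the text), the ground-projected size of a detector's instantaneous field of view (IFOV) at nadir is commonly approximated as altitude times IFOV:

```python
def ground_resolution(altitude_m: float, ifov_rad: float) -> float:
    """Small-angle approximation of the ground-projected size (meters)
    of a sensor's instantaneous field of view (IFOV) at nadir: D = H * beta."""
    return altitude_m * ifov_rad

# Hypothetical Landsat-like numbers: a 705 km altitude and a
# 0.043 mrad IFOV give a ground cell of roughly 30 m.
print(round(ground_resolution(705_000, 0.043e-3), 1))  # -> 30.3
```

Objects much smaller than this ground cell contribute to, but cannot be separated within, a single measurement, which is the limit the paragraph above describes.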
5. The Data Handling System. The capability of current remote sensors to generate data
far exceeds the current capacity to handle these data. This is generally true whether we consider
manual image interpretation procedures or computer-assisted analyses. Processing sensor data into
an interpretable format can be an effort entailing considerable thought, instrumentation, time,
experience, and reference data.
6. The Multiple Data Users. Central to the successful application of any remote sensing
system is the person (or persons) using the remote sensor data from that system. The "data" generated by
remote sensing procedures become "information" only if and when someone understands their
generation, knows how to interpret them, and knows how best to use them. A thorough
understanding of the problem at hand is paramount to the productive application of any remote
sensing methodology. Also, no single combination of data acquisition and analysis procedures will
satisfy the needs of all data users.
1.4 Electromagnetic Remote Sensing System
A remote sensing system can be conceived in many respects, but the one currently
being operated to assist in inventorying, mapping, and monitoring earth resources is the
electromagnetic remote sensing system. Lillesand and Kiefer (1979) schematically illustrated the
generalized processes and elements involved in electromagnetic remote sensing of earth resources
(Figure 1-2).
The two basic processes involved are data acquisition and data analysis. The elements of
the data acquisition process are:
(a) Sources of energy;
(b) Propagation of energy through the atmosphere;
(c) Energy interactions with earth surface features;
(d) Airborne and/or spaceborne sensors;
(e) Data products, i.e., the generation of sensor data in pictorial and/or numerical form.
In short, we use sensors to record variations in the way earth surface features reflect and
emit electromagnetic energy.
The elements of the data analysis process are:
(f) Interpretation;
(g) Information products;
(h) Users.
The data analysis process involves examining the data using various viewing and
interpretation devices to analyze pictorial data, and/or a computer to analyze numerical sensor
data. Reference data about the resources being studied (such as soils maps, crop statistics, or
field-check data) are used when and where available to assist in the data analysis. With the aid of
the reference data, the analyst extracts information (f) about the type, extent, location, and
condition of the various resources over which the sensor data were collected. This information is
then presented (g), generally in the form of maps, tables, and a written discussion or report.
Typical information products are such things as land use maps and crop area statistics. Finally, the
information is presented to users (h), who apply it to their decision-making process.
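As a minimal sketch (the labels below are illustrative paraphrases, not an API from any of the cited texts), the elements (a) through (h) can be laid out in the order the energy, and then the data, flow:

```python
# Elements of the data acquisition (a-e) and data analysis (f-h)
# processes, in the order described by Lillesand and Kiefer (1979).
ACQUISITION = [
    ("a", "sources of energy"),
    ("b", "propagation of energy through the atmosphere"),
    ("c", "energy interactions with earth surface features"),
    ("d", "airborne and/or spaceborne sensors"),
    ("e", "data products in pictorial and/or numerical form"),
]
ANALYSIS = [
    ("f", "interpretation, aided by reference data"),
    ("g", "information products such as maps, tables and reports"),
    ("h", "users applying the information to decisions"),
]

def remote_sensing_elements():
    """Yield all eight elements, acquisition first, then analysis."""
    yield from ACQUISITION
    yield from ANALYSIS

for letter, description in remote_sensing_elements():
    print(f"({letter}) {description}")
```

The point of the ordering is that everything downstream of (e) operates on recorded data rather than on the scene itself, which is why reference data enter only at the interpretation stage.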
[Figure: DATA ACQUISITION (elements a-e, from sources of energy through sensor data products) followed by DATA ANALYSIS (elements f-h: interpretation with reference data, information products, and users)]
Figure 1-2: Electromagnetic remote sensing of earth resources (From Lillesand and Kiefer,
1979)
Curran (1985) stated that a remote sensing system using electromagnetic radiation has
four components: (1) a source, (2) interaction with the earth's surface, (3) interaction with the
atmosphere, and (4) a sensor.
Source: the source of electromagnetic radiation may be natural, like the sun's reflected light or
the earth's emitted heat, or man-made, like microwave radar.
Earth's surface interaction: the amount and characteristics of radiation emitted or
reflected from the earth's surface depend upon the characteristics of the objects on the earth's
surface.
Atmospheric interaction: electromagnetic energy passing through the atmosphere is
distorted and scattered.
Sensor: the electromagnetic radiation that has interacted with the surface of the earth and the
atmosphere is recorded by a sensor, for example a radiometer or camera.
Landgrebe (2003) divided an Earth observational system into three basic parts: (1) the
scene, (2) the sensor, and (3) the processing system as shown in Figure 1-3.
[Figure: block diagram of the scene, the sensor, and the processing system, with ancillary data as input and information utilization as output]
Figure 1-3: A conceptual view of an Earth observational system (From Landgrebe, 2003).
Scene: The scene consists of the Earth's surface and the intervening atmosphere. For
systems in the optical portion of the spectrum that rely on solar illumination, the sun is included.
This portion of the system is characterized by not being under human control, neither during system
design nor during operation. However, its most defining characteristic is that it is by far the most complex
and dynamic part of the system. It is easy to underestimate this complexity and dynamism.
Sensor: The sensor is the part of the system that collects the main body of data to be used.
The sensor system is usually characterized by being under human control during the system
design phase, but less so or not at all during system operation.
Processing System: The processing system is the part of the system where the data analyst
is able to exercise the most control and have the most choice. One might divide the spectrum of
possible degrees of human/machine participation in the analysis process into four parts: fully
manual methods, machine-aided manual methods, manually aided machine methods, and
automatic analysis.
Jensen (2005) explained that the remote sensing data-collection and analysis procedures
used for Earth resource applications are often implemented in a systematic fashion referred to as the
remote sensing process. The procedures in the remote sensing process are summarized here and
shown in Figure 1-4.
• The hypothesis to be tested is defined using a specific type of logic (e.g., inductive or
deductive) and an appropriate processing model (e.g., deterministic, stochastic).
• In situ and collateral data necessary to calibrate the remote sensor data and/or judge
their geometric, radiometric, and thematic characteristics are collected.
• Remote sensor data are collected passively or actively using analog or digital remote
sensing instruments, ideally at the same time as the in situ data.
• In situ and remotely sensed data are processed using a) analog image processing, b)
digital image processing, c) modeling, and d) n-dimensional visualization.
• Metadata, processing lineage, and the accuracy of the information are provided, and the results are
communicated using images, graphs, statistical tables, GIS databases, Spatial Decision
Support Systems (SDSS), etc.
Figure 1-4: Remote sensing process when extracting information from remotely sensed data (From Jensen, 2005).
[Figure: a flow diagram of four stages. Statement of the Problem: formulate hypothesis (if appropriate); select appropriate logic (inductive and/or deductive, technological). Data Collection: in situ measurements; collateral data (e.g., soils maps); remote sensing with passive analog and digital systems, multispectral and hyperspectral scanners, videography, active microwave (RADAR), laser (LIDAR) and acoustic (SONAR) systems. Data-to-Information Conversion: analog (visual) image processing using the elements of image interpretation; digital image processing (radiometric correction, geometric correction, enhancement, photogrammetric analysis, parametric classifiers such as maximum likelihood, nonparametric classifiers such as neural networks, nonmetric classifiers such as expert systems, decision-tree classifiers and machine learning, hyperspectral analysis, change detection); modeling (spatial modeling using GIS data, scene modeling based on physics); hypothesis testing (accept or reject hypothesis). Information Presentation: analog and digital images (unrectified, orthoimages, orthophotomaps), statistics and graphs in 2 and 3 dimensions, image metadata and lineage.]
1.5 Electromagnetic Radiation Principles
Remotely sensed images record the interaction of electromagnetic energy with the earth's
surface. To develop proficiency in the interpretation of these images, it is necessary to understand
the behavior of this energy as it is reflected or emitted from the earth's surface and passes through the
atmosphere to the sensor. Therefore, knowledge of the properties and behavior of electromagnetic
energy forms a foundation of remote sensing.
From basic physics, energy is the ability to do work (Figure 1-5). It is usually transferred from one point to another by:
(a) Conduction. This involves atomic or molecular collisions.
(b) Convection. This is a corpuscular mode of transfer in which bodies of energetic material are themselves physically moved.
(c) Radiation. This is the only form in which electromagnetic energy may be transmitted, either through a medium or a vacuum.
Figure 1-5: Types of energy transfer: conduction, convection, and radiation (From Jensen, 2007)
In remote sensing we are primarily concerned with energy transfer by means of radiation. Electromagnetic radiation is one of the most useful force fields for remote sensing, forming a high-speed communications link between the sensor and remotely located substances (Suits, 1983).
In the case of electromagnetic radiation, two models are necessary to describe and elucidate its most important characteristics: (a) the wave model and (b) the particle model (Lillesand and Kiefer, 1979; Barrett and Curtis, 1982; Jensen, 2007).
1.5.1 The Wave Model. This typifies radiation through regular oscillatory variations in the electric and magnetic fields surrounding a charged particle. Wave-like perturbations emanate from the source at the speed of light (3 × 10^8 m s^-1). They are generated by the oscillation of the particle itself. The two associated force fields are mutually orthogonal, and both are perpendicular to the direction of advancement (Figure 1-6).
Figure 1-6: An electromagnetic wave. Components include a sinusoidal electric wave (E) and a similar magnetic wave (M) at right angles, both being perpendicular to the direction of propagation; the frequency is the number of cycles per second passing a fixed point. (From Lillesand and Kiefer, 1979)
1.5.2 The Particle Model. This emphasizes aspects of the behavior of radiation which suggest that it is comprised of many discrete units, called "quanta" or "photons". These carry from the source some particle-like properties such as energy and momentum, but differ from all other particles in having zero rest mass. It has consequently been hypothesized that the photon is a kind of "basic particle".
In terms of the wave model, waves obey the general equation from basic physics:

    c = f λ    (1-1)

where:
    c    is the speed of light, a constant (3 × 10^8 m s^-1);
    f    is the wave frequency, in cycles per second (hertz, Hz);
    λ    is the wavelength (micrometers, μm).

Since c is constant, wave frequency is inversely proportional to wavelength: the shorter the wavelength, the higher the frequency.
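Because c is a fixed constant, converting between wavelength and frequency is a one-line computation. The sketch below (Python; the function name and the example wavelength are illustrative choices, not values from the text) applies c = fλ:

```python
# Frequency from wavelength via c = f * lambda (illustrative sketch).
C = 3.0e8  # speed of light, m s^-1

def frequency_hz(wavelength_m):
    """Return the wave frequency (Hz) for a wavelength given in meters."""
    return C / wavelength_m

# Green visible light at 0.55 micrometers (0.55e-6 m):
f = frequency_hz(0.55e-6)
print(f"{f:.2e} Hz")  # roughly 5.45e+14 Hz
```
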
In the other term, the particle model, the energy of a quantum is given as

    E = h f    (1-2)

where:
    E    is the energy of a quantum, in joules (J);
    h    is Planck's constant, 6.626 × 10^-34 J s.
We can relate the wave and particle models of electromagnetic radiation by substituting the frequency from equation (1-1) into equation (1-2) to obtain:
    E = h c / λ    (1-3)
This tells us that the energy of a quantum is inversely proportional to its wavelength: the longer the wavelength involved, the lower its energy content. These relationships are fundamental to an appreciation of the behavior of electromagnetic radiation.
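The inverse relationship in equation (1-3) can be checked numerically. The Python sketch below uses the constants quoted above; the function name and the two example wavelengths are illustrative assumptions, not values from the text:

```python
# Photon energy via E = h * c / lambda (illustrative sketch).
H = 6.626e-34  # Planck's constant, J s
C = 3.0e8      # speed of light, m s^-1

def photon_energy_j(wavelength_m):
    """Energy (joules) of one quantum at the given wavelength (meters)."""
    return H * C / wavelength_m

# A blue photon (0.45 um) carries far more energy than a thermal-infrared
# photon (10 um):
e_blue = photon_energy_j(0.45e-6)   # ~4.4e-19 J
e_tir = photon_energy_j(10.0e-6)    # ~2.0e-20 J
assert e_blue > e_tir  # longer wavelength -> lower energy content
```
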
In remote sensing, electromagnetic radiation, which occurs as a continuum of wavelengths and frequencies from short wavelength/high frequency to long wavelength/low frequency, is commonly categorized as the electromagnetic spectrum. The boundaries between its regions are expressed in several different ways. Figure 1-7 shows the division proposed by Suits (1983): the extent of the electromagnetic spectrum, the various named bands, the transmittance of the earth's atmosphere to electromagnetic radiation, and the effects caused by its interaction or presence.
The sun is the most obvious source of electromagnetic radiation for remote sensing. However, all matter at temperatures above absolute zero (0 K, or -273°C) continuously emits electromagnetic radiation. Thus, terrestrial objects are also sources of radiation, though of considerably different magnitude and spectral composition than the sun (Lillesand and Kiefer, 1979).
Barrett and Curtis (1982) stated that a useful concept widely used by physicists in radiation studies is that of the blackbody, a model (perfect) absorber and radiator of electromagnetic radiation. A blackbody is conceived to be an object or substance which absorbs all the radiation incident upon it, and emits the maximum amount of radiation at all temperatures. Although there is no known substance in the natural world with such a performance, the blackbody concept is invaluable for the formulation of laws by comparison with which the behavior of actual radiators may be assessed. They also summarized the principal laws that relate to this concept as follows:
1.5.3 Stefan-Boltzmann's Law. This states that the total emissive power of a blackbody is proportional to the fourth power of its absolute temperature (T). This can be expressed as:

    M = σ T^4    (1-4)

where:
    M    is the radiant exitance, in W m^-2;
    σ    is the Stefan-Boltzmann constant, 5.6697 × 10^-8 W m^-2 K^-4;
    T    is the absolute temperature (K) of the emitting material.

This relationship applies to all wavelengths shorter than the microwave region; in the microwave region, radiant exitance varies as a direct function of T (K). The essence of the Stefan-Boltzmann Law is that hot radiators emit more energy per unit area than cooler ones.
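The fourth-power dependence in equation (1-4) can be made concrete with a short sketch (Python; the round-number temperatures are illustrative, not values from the text):

```python
# Radiant exitance via the Stefan-Boltzmann law, M = sigma * T**4 (sketch).
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k):
    """Total emissive power (W m^-2) of a blackbody at absolute temperature temp_k (K)."""
    return SIGMA * temp_k ** 4

# A Sun-like blackbody (~6000 K) versus an Earth-like one (~300 K):
m_sun = radiant_exitance(6000.0)
m_earth = radiant_exitance(300.0)
# A 20x temperature ratio gives a 20**4 = 160,000x exitance ratio --
# hot radiators emit far more energy per unit area than cooler ones.
assert abs(m_sun / m_earth - 20.0 ** 4) < 1e-3
```
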
[Figure: Part of the electromagnetic spectrum, showing the ultraviolet, visible, and infrared (reflected IR and thermal IR) regions.]
Figure 1-20: The bidirectional reflectance distribution function (BRDF) of a sand shinnery oak rangeland community at a 31° solar zenith angle at λ = 862 nm. (From McCloy, 2006)
Figure 1-21: The bi-directional reflectance effects, from the backward-scattering view through the nadir view to the forward-scattering view. (From Jensen, 2007)
B. Directional reflectance. If the observer views the surface under overcast conditions, when the main source of illumination is scattered skylight, then the illumination is approximately equal from all directions, but the sensor views from only one direction relative to the surface. Reflectance of this form is called directional reflectance. The directional reflectance is the ratio of the energy reflected in a single direction to the total irradiance incident on the surface, giving the hemispherical directional reflectance factor as:

    Hemispherical directional reflectance factor = L(Ω_r) / ∫_2π E(Ω_i) dΩ_i    (1-15)
C. Hemispherical reflectance. Hemispherical reflectance occurs when both the incident and reflected energy are measured over the whole hemisphere. Hemispherical reflectance is of little concern in relation to sensors, but is of great interest in the modeling of light interactions within the surface and in assessing the impact of reflectance on aerial photography and other imagery taken with sensors that have a wide FOV. Hemispherical reflectance should be identical to bi-directional reflectance for Lambertian surfaces, and provides an average reflectance value for specular surfaces. Comparison of hemispherical and bi-directional reflectance values can thus indicate how close surfaces are to the Lambertian model. For canopy modeling and inversion, hemispherical reflectance is defined as the ratio of the total (hemispherical) reflected energy from a location to the total (hemispherical) incident energy. This bi-hemispherical reflectance factor is given by:

    Bi-hemispherical reflectance factor = ∫_2π L(Ω_r) dΩ_r / ∫_2π E(Ω_i) dΩ_i    (1-16)
In conclusion, Jensen (2007) stated that the amount of electromagnetic radiance, L (W m^-2 sr^-1; watts per square meter per steradian), recorded within the IFOV of an optical remote sensing system (e.g., a picture element in a digital image) is a function of:

    L = f(λ, x,y,z, t, θ, P, Ω)    (1-17)

where:
    λ        is wavelength (spectral response measured in various bands or at specific frequencies). Wavelength (λ) and frequency (ν) may be used interchangeably, based on their relationship with the speed of light (c), where c = λ × ν;
    x,y,z    is the x, y, z location of the picture element and its size (x, y);
    t        is temporal information, i.e., when and how often the information was acquired;
    θ        is the set of angles that describe the geometric relationships among the radiation source (e.g., the Sun), the terrain target of interest (e.g., a corn field), and the remote sensing system;
    P        is the polarization of back-scattered energy recorded by the sensor;
    Ω        is the radiometric resolution (precision) at which the data (e.g., reflected, emitted, or back-scattered radiation) are recorded by the remote sensing system.
Lillesand et al. (2004) illustrated the basic components of the radiation received by the sensor when it records reflected solar energy, as shown in Figure 1-22. This figure provides an initial frame of reference for understanding how the atmosphere affects the spectral response
pattern. They stated that the atmosphere affects the "brightness," or radiance, recorded over a given point on the ground in two almost contradictory ways. First, it attenuates (reduces) the energy illuminating a ground object and being reflected from it. Second, the atmosphere acts as a reflector itself, adding a scattered, extraneous path radiance to the signal detected by the sensor. By expressing these two atmospheric effects mathematically, the total radiance recorded by the sensor may be related to the reflectance of the ground object and the incoming radiation or irradiance using the equation:
    L_tot = ρ E T / π + L_p    (1-18)

where:
    L_tot    is the total spectral radiance measured by the sensor;
    ρ        is the reflectance of the object;
    E        is the irradiance on the object (incoming energy);
    T        is the transmission of the atmosphere;
    L_p      is the path radiance, from the atmosphere and not from the object.
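Equation (1-18) is simple to evaluate once the four quantities are known. The Python sketch below uses hypothetical illustrative values (not measurements) to show how the path-radiance term inflates the signal independently of the target:

```python
# Total spectral radiance at the sensor, L_tot = rho * E * T / pi + L_p
# (after Lillesand et al., 2004). All numeric values below are hypothetical.
import math

def total_radiance(rho, irradiance, transmission, path_radiance):
    """Radiance (W m^-2 sr^-1) from a Lambertian ground element plus path radiance."""
    return rho * irradiance * transmission / math.pi + path_radiance

# A 30%-reflectance target, E = 1000 W m^-2, atmospheric transmission 0.8,
# and 10 W m^-2 sr^-1 of path radiance:
l_tot = total_radiance(0.30, 1000.0, 0.8, 10.0)
l_target_only = total_radiance(0.30, 1000.0, 0.8, 0.0)
# The difference is purely atmospheric, unrelated to the ground object:
assert abs(l_tot - l_target_only - 10.0) < 1e-9
```
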
Figure 1-22: Atmospheric effects influencing the measurement of reflected solar energy. (From Lillesand et al., 2004)
It should be noted that all of the above factors depend on wavelength. Also, the irradiance (E) stems from two sources: (1) directly reflected "sunlight" and (2) diffuse "skylight," which is sunlight that has been previously scattered by the atmosphere. The relative dominance of sunlight versus skylight in any given image is strongly dependent on weather conditions (e.g., sunny vs.
hazy vs. cloudy). Likewise, irradiance varies with the seasonal changes in solar elevation angle (Figure 1-23) and the changing distance between the Earth and Sun.
Figure 1-23: Effects of seasonal change on solar elevation angle. (The solar zenith angle is equal to 90° minus the solar elevation angle.) (From Lillesand et al., 2004)
Lillesand et al. (2004) mentioned that the earth-sun distance correction is applied to normalize for the seasonal changes in the distance between the Earth and the Sun. The earth-sun distance is usually expressed in astronomical units. (An astronomical unit is equivalent to the mean distance between the Earth and the Sun, approximately 149.6 × 10^6 km.) The irradiance from the Sun decreases as the square of the earth-sun distance. If we ignore atmospheric effects, the combined influence of solar zenith angle and earth-sun distance on the irradiance incident on the Earth's surface can be expressed as:

    E = E_0 cos θ_0 / d^2    (1-19)

where:
    E      is the normalized solar irradiance;
    E_0    is the solar irradiance at mean earth-sun distance;
    θ_0    is the sun's angle from the zenith;
    d      is the earth-sun distance, in astronomical units.
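Equation (1-19) can be applied directly. The Python sketch below normalizes irradiance for zenith angle and earth-sun distance; the function name and the solar-constant value of 1367 W m^-2 are illustrative assumptions, not values from the text:

```python
# Normalized solar irradiance, E = E0 * cos(theta0) / d**2 (Eq. 1-19; sketch).
import math

def normalized_irradiance(e0, zenith_deg, d_au):
    """Irradiance incident on the surface, ignoring atmospheric effects.

    e0: solar irradiance at the mean earth-sun distance (W m^-2)
    zenith_deg: solar zenith angle, in degrees from the vertical
    d_au: earth-sun distance, in astronomical units
    """
    return e0 * math.cos(math.radians(zenith_deg)) / d_au ** 2

# An overhead sun at the mean distance returns e0 unchanged:
assert abs(normalized_irradiance(1367.0, 0.0, 1.0) - 1367.0) < 1e-9
# A 60-degree zenith angle halves the incident irradiance:
assert abs(normalized_irradiance(1367.0, 60.0, 1.0) - 1367.0 / 2.0) < 1e-6
```
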
1.8.1 The interactions of electromagnetic radiation with vegetation
The complex assemblage of the earth's surface consists of biological and physical features. Both kinds of feature are directly involved in the remote sensing system. Vegetation is one of the biological features of the earth's surface that plays an important role in the remote sensing system. Therefore, a fundamental understanding of the interaction of electromagnetic radiation with vegetation, especially the spectral reflectance of vegetation and its affecting factors, is important.
Knipling (1970) concluded that the reflectance of a plant canopy is similar to that of a single leaf, but is modified by the nonuniformity of incident solar radiation, plant structure, leaf area, shadows, and background reflectivity. Sensors receive an integrated view of all these effects, and each crop or vegetation type tends to have a characteristic signature which permits its discrimination.
Colwell (1974) suggested the main factors that can be very important in influencing vegetation canopy reflectance in certain situations: (a) leaf hemispherical reflectance and transmittance, (b) leaf area, (c) leaf orientation, (d) hemispherical reflectance and transmittance of supporting structures (stalks, trunks, limbs, petioles), (f) solar zenith angle, (g) look angle, and (h) azimuth angle.
Hoffer (1987) summarized the distinct differences in the reflectance of vegetation, and their affecting factors, found among the visible, near-infrared, and middle-infrared portions of the spectrum. In the visible wavelengths, the pigmentation of the leaves is the dominant factor: most of the incident energy is absorbed and the remainder is reflected. The internal structure of the leaves controls the level of reflectance in the near infrared, where about half of the incident energy is reflected, nearly half is transmitted, and very little is absorbed by the leaf. The total moisture content of the vegetation controls the middle-infrared reflectance, with much of the incident energy being absorbed by the water in the leaf and the remainder being reflected. The characteristic spectral reflectance curve of vegetation is shown in Figure 1-24.
Furthermore, Curran (1985) divided the spectral reflectance of vegetation and its affecting factors into two different types of reflectance.
1. The hemispherical reflectance of vegetation. By definition, hemispherical reflectance implies that the angles of incidence and collection of reflectance are hemispherical. It is usually measured in the laboratory. Each of the three features of a leaf (pigmentation, physiological structure, and water content) has an effect on the reflectance, absorptance, and transmittance properties of a green leaf, as indicated in Figure 1-24.
2. The bidirectional reflectance of vegetation. By definition, bidirectional reflectance implies that the angles of incidence and collection of reflectance are directional, as would be the case with satellite sensor measurements of radiance on a sunny day. With constant hemispherical reflectance of the individual leaves, the bidirectional reflectance can vary appreciably due to the effects of the soil background, the presence of senescent vegetation, the angular elevation of the sun and sensor, the canopy geometry, and certain episodic and phenological canopy changes.