
EE 392B Course Introduction

• About EE392B

◦ Goals
◦ Topics
◦ Schedule
◦ Prerequisites

• Course Overview

◦ Digital Imaging System
◦ Image Sensor Architectures
◦ Nonidealities and Performance Measures
◦ Color Imaging
◦ Recent Developments and Trends

EE 392B: Course Introduction Intro-1


Motivation

• Image sensors are all around us:
◦ Cell phones
◦ Digital still and video cameras
◦ Optical mice
◦ Cars
◦ Security cameras
◦ PC and Web cameras
◦ Scientific and industrial

• Digital cameras are replacing film and analog cameras for capture
• CMOS image sensors are making it possible to integrate capture and
processing on the same chip, providing new capabilities for
◦ Machine vision
◦ Man-machine interface
◦ Biometrics
◦ Biological applications



• Image sensors are quite different from other types of sensors, e.g.,
pressure, temperature, . . .
◦ They comprise a massive array of detectors
◦ They can detect (see) over very long distances (most other sensors
are local)
• Several important issues beyond physics and fabrication:
◦ How do we read out a very large number of signals quickly?
◦ What are the spatial and temporal nonidealities that limit the
performance of image sensors?
◦ How do we quantify their performance?
• So, to understand image sensors, we need to use tools from several areas
in EE: device physics and fabrication, optics, circuits, signals, and systems



Course Goals

• Provide an introduction to the design and analysis of visible range image sensors
• Develop basic understanding of the signal path through an image sensor
• Develop an understanding of the nonidealities, performance measures, and
tradeoffs involved in the design of image sensors
• Discuss recent developments and future trends in this area
• The course can be used as part of an MSEE Image Systems Engineering depth
sequence



Course Topics

• Silicon photodetectors: photodiode, photogate, and pinned diode;
photocurrent, quantum efficiency, and dark current; direct integration
• CCD and CMOS image sensors: architectures and readout circuits, well
capacity, conversion gain, readout speed
• Image sensor technologies, including color filters and microlenses
• Temporal noise
• Fixed pattern noise (FPN), DSNU, PRNU
• SNR and dynamic range
• Spatial resolution and Modulation Transfer Function (MTF)
• Pixel optics
• High dynamic range extension schemes
• Technology scaling and modification issues



Course Schedule

March 29 Overview – El Gamal.
March 31 Photodetection in silicon, photodiode operation – El Gamal.
April 5 Photocurrent and dark current – Wong. HW1.
April 7 Photogate and direct integration – Wong.
April 12 CCDs – Wong. HW1 due, HW2.
April 14 CCDs – Wong.
April 19 CCDs – Wong. HW2 due, HW3.
April 21 CMOS image sensors – El Gamal. Project HO.
April 26 CMOS image sensors – El Gamal. HW3 due, HW4.
April 28 Process and layout issues. – Wong. Project Groups due.
May 3 Noise analysis in circuits – El Gamal. HW4 due, HW5.
May 5 Noise analysis in image sensors – El Gamal.
May 10 Fixed pattern noise – El Gamal. HW5 due.
May 12 vCam – Farrell. Take Home Midterm.



Course Schedule Contd.

May 17 SNR and dynamic range – El Gamal. HW6, Project information.
May 19 Spatial resolution, MTF – El Gamal. Project information.
May 24 Pixel optics – Catrysse. HW6 due.
May 26 HDR schemes.
May 31 Course Summary. Project Progress reports due.
June 7 Projects due.

Project format:
• We plan to propose two mini-project topics for you to choose from: one in the
device and technology area and the other in the sensor design and analysis area
• The projects will be done in two-student groups
• We are open to project proposals other than the recommended ones; however,
you need to tell us early



Course Prerequisites

• Understanding image sensors requires basic knowledge in several areas of EE
• You need to have undergraduate (preferably MSEE) level knowledge in:
◦ Device physics and fabrication
◦ CMOS circuits
◦ Basic signals and systems
◦ Optics

• We will try to be as self-contained as possible and review some of the
necessary concepts and derivations
• However, depending on your background and interest, there may be some
material that you will not completely understand
◦ We do not expect you to have complete understanding of everything
◦ As in studying any interdisciplinary field, it is more important to
develop some level of understanding of all aspects of the field before
going deeply into any particular aspect
Reading and References

• The course has no required or recommended textbook. We will hand out
lecture notes and some papers
• Here are some books that may be useful:
◦ CCDs:
A.J.P. Theuwissen, Solid-State Imaging with Charge-Coupled Devices
J. D. E. Beynon, D. R. Lamb, CCD Operation, Fabrication and Limitations
◦ Device physics and fabrication:
Muller and Kamins, Device Electronics for Integrated Circuits
Pierret, Semiconductor Device Fundamentals
◦ Circuits:
A.S. Sedra and K.C. Smith, Microelectronic Circuits
P. Gray and R. Meyer, Analysis and Design of Analog Integrated Circuits
◦ Signals and systems:
B.P. Lathi, Signal Processing and Linear Systems.
A. El Gamal, EE278 Class Notes.

• We will hand out a fairly comprehensive list of references



Digital Imaging System

[Block diagram: scene light passes through the Lens and Color Filter Array
(CFA) to the Image Sensor, then through AGC and ADC to Color Processing and
Image Enhancement & Compression; Auto Focus, Auto Exposure, and Control &
Interface blocks complete the system]



Image Sensors

• An area image sensor consists of:
◦ An n × m array of pixels, each comprising
∗ a photodetector that converts incident light (photons) to
photocurrent
∗ one or more devices for readout
◦ Peripheral circuits for readout and processing of pixel signals and
sensor timing and control
• Sensor size ranges from 320×240 (QVGA) for low-end PC digital cameras
to 7000×9000 for scientific/astronomy applications
• Pixel size ranges from 15×15 µm down to 1.5×1.5 µm
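The readout burden implied by these array sizes follows from simple arithmetic; the sketch below is a back-of-envelope illustration:

```python
# Back-of-envelope readout rate: an n x m sensor at a given frame rate
# must read n * m * fps pixel values per second.
def pixel_rate(n, m, fps):
    return n * m * fps

# e.g., a 640x480 sensor at 30 frames/s:
# 640 * 480 * 30 = 9,216,000 pixel values per second
```

Even a modest VGA-class sensor at video rates must move on the order of ten million samples per second, which is why readout architecture matters.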



Brief History of Image Sensors

1965-1970 Bipolar, MOS photodiode arrays developed
(Westinghouse, IBM, Plessey, Fairchild)
1970 CCD invented at Bell Labs
1970-present CCDs dominate
1980-1985 Several MOS sensors reported
1985-1991 CMOS PPS developed (VVL)
1990s CMOS APS developed (JPL, . . .)
1994-present CMOS DPS developed (Stanford, Pixim)
2000-present CMOS image sensors become a commercial reality

See reference [11] of the Bibliography for more details



CCD Image Sensors (Interline Transfer)

[Diagram: interline-transfer CCD, in which columns of photodetectors alternate
with vertical CCDs (analog shift registers) that feed a horizontal CCD and an
output amplifier]



CCD Image Sensors

• Advantage: High quality
◦ optimized photodetectors — high QE, low dark current
◦ low noise and nonuniformity — CCDs do not introduce noise or
cause nonuniformity
• Disadvantages:
◦ difficult to integrate other camera functions on same chip
◦ high power — high speed shifting clocks
◦ limited frame rate — serial readout



CMOS Image Sensors

[Diagram: CMOS image sensor; a row decoder drives word lines to select pixels
(each a photodetector plus access devices), whose values are read out on bit
lines through column amplifiers, a column decoder, and an output amplifier]

The most popular type, called the Active Pixel Sensor (APS), has a photodiode
and 3 transistors per pixel
CMOS Image Sensors

• Advantages:
◦ can integrate other camera functions on same chip
◦ lower power consumption than CCDs (10X)
◦ very high frame rates can be achieved
◦ very high dynamic range can be achieved
• Disadvantages: lower quality at low light than CCDs
◦ higher dark current (the CMOS process is usually modified to optimize the
photodetector and reduce transistor leakage, but it is still difficult to
match the low dark current of CCDs)
◦ lower QE (higher stack above photodetector reduces incident light)
◦ higher noise and nonuniformity due to multiple levels of amplification
(pixel, column, and chip)



Signal Path Through an Image Sensor
[Diagram: photon flux (ph/cm²·sec) is mapped by the quantum efficiency to a
current density (A/cm²), integrated over space/time to a charge (C), converted
with a conversion gain to a voltage (V), and digitized by the ADC to a digital
number (DN)]

• Quantum efficiency determined by pixel characteristics


• Due to the small photocurrent levels, the photocurrent is integrated over
exposure time into charge
• Charge is converted into voltage for readout using linear amplifier(s)
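The signal path can be sketched numerically. All parameter values below (quantum efficiency, pixel area, integration time, conversion gain, ADC range) are illustrative assumptions, not course figures:

```python
# Illustrative signal-path model: photon flux -> photocurrent -> charge
# -> voltage -> digital number (DN). All parameter values are assumptions.
Q_E = 1.602e-19  # electron charge (C)

def sensor_signal_path(photon_flux, qe=0.4, pixel_area_cm2=25e-8,
                       t_int=0.03, conv_gain=40e-6, adc_bits=10, v_swing=1.0):
    """Map a photon flux (ph/cm^2/s) at one pixel to an ADC output code."""
    i_ph = qe * photon_flux * pixel_area_cm2 * Q_E   # photocurrent (A)
    charge_e = i_ph * t_int / Q_E                    # integrated charge (e-)
    voltage = charge_e * conv_gain                   # conversion gain (V/e-)
    code = int(voltage / v_swing * (2**adc_bits - 1))
    return min(code, 2**adc_bits - 1)                # ADC clips at full scale
```

With this model, doubling the integration time doubles the integrated charge, and hence the output code, until the ADC saturates.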



Quantum Efficiency – Example

[Plot: quantum efficiency (e−/ph) vs. wavelength (nm) over 350–750 nm; QE
ranges from about 0.15 to 0.65]



Image Sensor Non-idealities

• Temporal noise
• Fixed pattern noise (FPN)
• Dark current
• Spatial sampling and low pass filtering



Temporal Noise

• Caused by photodetector and MOS transistor thermal, shot, and 1/f noise
• Can be lumped into two additive components:
◦ Read noise
◦ Integration noise (due to photodetector shot noise)
• Noise increases with signal, but so does the signal-to-noise ratio (SNR)
• Noise under dark conditions (read noise) presents a fundamental limit on
sensor dynamic range (DR)
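The two noise components can be combined in a simple SNR model; the read-noise and dark-charge values below are assumptions for illustration:

```python
import math

def snr_db(signal_e, read_noise_e=20.0, dark_e=10.0):
    """SNR (dB) for a signal in electrons: shot-noise variance equals the
    collected charge (signal plus dark), and read noise adds in quadrature."""
    noise_var = signal_e + dark_e + read_noise_e**2
    return 10 * math.log10(signal_e**2 / noise_var)
```

With these numbers the total noise grows only as the square root of the signal, so SNR still improves with illumination.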



Fixed Pattern Noise (FPN)

• FPN (also called nonuniformity) is the spatial variation in pixel outputs
under uniform illumination due to device and interconnect mismatches
over the sensor
• Two FPN components: offset and gain (called Pixel Response
Nonuniformity or PRNU)
• Most visible at low illumination (offset FPN more important than gain
FPN)
• Worse for CMOS image sensors than for CCDs
• Offset FPN can be reduced using correlated double sampling (CDS)
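A toy sketch of how CDS cancels offset FPN (all values illustrative): each pixel is sampled once just after reset and once after integration, and the two samples carry the same fixed offset.

```python
# Correlated double sampling: subtracting the reset sample from the signal
# sample cancels each pixel's fixed offset (values are illustrative).
def cds(signal_samples, reset_samples):
    return [s - r for s, r in zip(signal_samples, reset_samples)]

offsets = [0.10, -0.05, 0.20]        # per-pixel fixed offsets (V)
signal = [1.0 + o for o in offsets]  # true signal of 1.0 V plus offset
reset = offsets                      # reset level carries the same offset
print(cds(signal, reset))            # offsets cancel (up to float rounding)
```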



Dark Current

• Dark current is the photodetector leakage current, i.e., current not
induced by photogeneration
• It limits the photodetector (and the image sensor) dynamic range
◦ introduces unavoidable shot noise
◦ varies substantially across the image sensor array causing
nonuniformity (called Dark Signal Nonuniformity or DSNU) that
cannot be easily removed
◦ reduces signal swing
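These effects can be folded into a rough dynamic-range estimate; the well capacity, dark current, and read noise below are assumed values, not sensor data:

```python
import math

def dynamic_range_db(well_capacity_e, dark_e_per_s, t_int, read_noise_e):
    """Rough DR estimate: dark charge shrinks the usable signal swing and
    its shot noise raises the noise floor (inputs in electrons / seconds)."""
    dark_e = dark_e_per_s * t_int
    swing = well_capacity_e - dark_e              # usable signal swing (e-)
    floor = math.sqrt(read_noise_e**2 + dark_e)   # read + dark shot noise
    return 20 * math.log10(swing / floor)
```

In this model a longer integration time collects more dark charge, so dynamic range drops.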



Sampling and Low Pass Filtering

• The image sensor is a spatial (as well as temporal) sampling device —
frequency components above the Nyquist frequency cause aliasing
• It is not a point sampling device — signal low pass filtered before
sampling by
◦ spatial integration (of current density over photodetector area)
◦ crosstalk between pixels
• Resolution below the Nyquist frequency measured by Modulation Transfer
Function (MTF)
• Imaging optics also limit spatial resolution (due to diffraction)
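As a small numeric sketch (the 5 µm pitch is an assumed example): a sensor with pixel pitch p samples the image at rate 1/p, so spatial frequencies above the Nyquist frequency 1/(2p) alias.

```python
# Nyquist frequency of a sensor with pixel pitch p: f_N = 1 / (2p),
# expressed here in line pairs per mm for a pitch given in micrometers.
def nyquist_lp_per_mm(pixel_pitch_um):
    pitch_mm = pixel_pitch_um * 1e-3
    return 1.0 / (2.0 * pitch_mm)

# e.g., a 5 um pitch gives a Nyquist frequency of 100 lp/mm
```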



Color Imaging

• To capture color images, each pixel needs to output three values
(corresponding, for example, to R, G, and B)
• The most common approach is to deposit color filters on the sensor in
some regular pattern, e.g., the RGB Bayer pattern

• A lot of processing is needed to obtain three colors for each pixel with the
right appearance
Color Processing

• Interpolation used to reconstruct missing color components
• Correction and balancing used to improve appearance of color
• Correction and balancing used to improve appearance of color
• Gamma correction and color space conversion needed before image
enhancement and compression
• Color processing very computationally demanding — over 300 MOPS
needed for a 640×480 sensor operating at 30 frames/s
• We do not discuss color processing and other digital image processing
that take place in a digital camera in this course
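As a small illustration of the interpolation step only, a bilinear sketch for the green channel, assuming a GRBG Bayer layout (actual camera pipelines are far more elaborate):

```python
# Bilinear green-channel demosaicking sketch for a GRBG Bayer mosaic:
# green samples sit where (row + col) is even; elsewhere, average the
# available green neighbors (layout and values are illustrative).
def interp_green(mosaic):
    h, w = len(mosaic), len(mosaic[0])
    green = [row[:] for row in mosaic]
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 1:  # no green sample at this pixel
                nbrs = [mosaic[y + dy][x + dx]
                        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= y + dy < h and 0 <= x + dx < w]
                green[y][x] = sum(nbrs) / len(nbrs)
    return green
```

The red and blue channels need analogous (and sparser) interpolation, which is part of why color processing is so computationally demanding.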
The vCam Camera Simulator

• Set of MATLAB routines modeling the light source, the object, the optics, the sensor, and the
ADC
• Parameters of the scene, the sensor, and the camera can be set and the corresponding output
image obtained
• Allows us to visualize the effects of different sensor parameters and nonidealities
• Allows us to explore the sensor design space

• Will be used in the last homework set and in the course project



Recent Developments and Future Trends

• CMOS image sensor technology scaling and process modifications:
◦ approach CCD quality
◦ reduce pixel size
◦ increase pixel counts

• Integration of image capture and processing:
◦ most commercial CMOS image sensors today integrate A/D conversion, AGC, and
sensor control logic on the same chip
◦ some also integrate, e.g., exposure control and color processing

• Per-pixel integration is being exploited to provide new capabilities:
◦ High dynamic range sensors
◦ Computational sensors
◦ 3D sensors
◦ Lab-on-chip

• Vertical integration promises higher levels of per-pixel integration

