CHAPTER FOUR
Fundamental Concepts in Video

Outline
• Types of video
• Analog video
• Digital video
• Types of color video signal
• Video broadcasting standards / TV standards
Analog Video
Analog technology requires the information representing images and sound to be a real-time, continuous-scale electric signal between sources and receivers.
It is used throughout the television industry. Distortion of images and noise are common problems for analog video.
In an analog video signal, each frame is represented by a fluctuating voltage signal. This is known as an analog waveform.
One of the earliest analog formats was composite video. Quality loss is also possible from one generation to another: this type of loss is like photocopying, in which a copy of a copy is never as good as the original. Most TV is still sent and received as an analog signal. Once the electrical signal is received, we may assume that brightness is at least a monotonic function of voltage.
An analog signal f(t) samples a time-varying image. So-called
progressive scanning traces through a complete picture (a frame) row-
wise for each time interval.
A high-resolution computer monitor typically uses a time interval of 1/72 second. In TV and in some monitors and multimedia standards, another system, interlaced scanning, is used. Here, the odd-numbered lines are traced first, then the even-numbered lines. This results in "odd" and "even" fields – two fields make up one frame.
In fact, the odd lines (starting from 1) end up at the middle of a line at the
end of the odd field, and the even scan starts at a halfway point. The
following figure shows the scheme used. First the solid (odd) lines are
traced – P to Q, then R to S, and so on, ending at T – then the even field
starts at U and ends at V. The scan lines are not horizontal because a small voltage is applied, moving the electron beam down over time.
Digital Video
Digital technology is based on images represented in the form of bits. A digital video signal is actually a pattern of 1's and 0's that represents the video image. With a digital video signal, there is no variation in the original signal once it is captured on computer disc. Therefore, the image does not lose any of its original sharpness and clarity; it is an exact copy of the original. A computer is the most common form of digital technology. The limitations of analog video led to the birth of digital video.
Digital video is just a digital representation of the analog video signal. Unlike analog video, which degrades in quality from one generation to the next, digital video does not degrade: each generation of digital video is identical to the parent. Even though the data is digital, virtually all digital formats are still stored on sequential tapes.
There are two significant advantages to using computers for digital video:
the ability to randomly access the video storage, and
repeated recording without degradation of image quality.
Computer-based digital video is defined as a series of individual images
and associated audio. These elements are stored in a format in which
both elements (pixel and sound sample) are represented as a series of
binary digits (bits). Almost all digital video uses component video.
The advantages of digital representation for video are many. It permits:
• Storing video on digital devices or in memory, ready to be processed (noise removal, cut and paste, and so on) and integrated into various multimedia applications
• Direct access, which makes nonlinear video editing simple
• Repeated recording without degradation of image quality
• Ease of encryption and better tolerance to channel noise
Analog vs. Digital Video
An analog copy can be very similar to the original video, but it is not identical. Digital copies will always be identical and will not lose their sharpness and clarity over time. However, digital video is limited by the amount of RAM available, whereas this is not a factor with analog video. Digital technology also allows for easy editing.
Displaying Video
There are two ways of displaying video on screen:
Progressive scan
Interlaced scan
Progressive scan
Progressive scanning draws all the lines of the picture in a single pass, from top to bottom. Today, all PC screens write a picture this way.
Figure 4.1 Progressive scan
Interlaced Scanning
Interlaced scanning writes every second line of the picture during one scan and writes the other half during the next sweep. This way, only 25/30 full pictures per second are needed. The idea of splitting up the image into two parts became known as interlacing, and the split-up pictures as fields. Graphically, a field is basically a picture with every second line blank. Figure 4.2 shows interlacing so that you can better imagine what happens.
Figure 4.2 Interlaced Scanning
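The field-splitting described above can be sketched in a few lines of Python. This is only an illustration: plain lists of strings stand in for scan lines, and the example frame is made up.

```python
def split_fields(frame):
    """Split a full frame into two fields: the odd-numbered
    scan lines (1, 3, 5, ... -> indices 0, 2, 4) and the
    even-numbered scan lines (2, 4, 6, ...)."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave_fields(odd_field, even_field):
    """Interleave the two fields back into one complete frame
    (assumes an even total line count, as in real TV systems)."""
    frame = []
    for o, e in zip(odd_field, even_field):
        frame.extend([o, e])
    return frame

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
odd, even = split_fields(frame)
print(odd)                      # ['line1', 'line3', 'line5']
print(weave_fields(odd, even))  # reconstructs the original frame
```

Weaving the two fields back together is exactly what a display (or deinterlacing filter) does to reconstruct the full frame.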
Types of Color Video Signals
1. Component video/3 Signals: each primary is sent as a separate video signal. The primaries can be either RGB or a luminance-chrominance transformation of them (e.g., YIQ, YUV). This gives the best color reproduction but requires more bandwidth and good synchronization of the three components. Component video takes the different components of the video and breaks them into separate signals. Improvements to component video have led to many video formats.
Component video – higher-end video systems make use of three separate video signals for the red, green, and blue image planes. Each color channel is sent as a separate video signal. Most computer systems use component video, with separate signals for R, G, and B. For any color-separation scheme, component video gives the best color reproduction, since there is no "crosstalk" between the three channels. This is not the case for S-video or composite video. Component video, however, requires more bandwidth and good synchronization of the three components.
2. Composite video/1 Signal: color (chrominance) and luminance
signals are mixed into a single carrier wave. Some interference between
the two signals is inevitable. Composite analog video has all its
components (brightness, color, synchronization information, etc.)
combined into one signal. Due to the compositing (or combining) of the
video components, the quality of composite video is marginal at best. The
results are color bleeding, low clarity and high generational loss.
In NTSC TV, for example, I and Q are combined into a chroma signal,
and a color subcarrier then puts the chroma signal at the higher
frequency end of the channel shared with the luminance signal. The
chrominance and luminance components can be separated at the
receiver end, and the two color components can be further recovered.
When connecting to TVs or VCRs, composite video uses only one wire (and hence one connector, such as a BNC connector at each end of a coaxial cable or an RCA plug at each end of an ordinary wire), and the video color signals are mixed, not sent separately.
The audio signal is another addition to this one signal. Since
color information is mixed and both color and intensity are
wrapped into the same signal, some interference between the
luminance and chrominance signals is inevitable.
3. S-Video/2 Signals (separated video): a compromise between component analog video and composite video. It uses two lines, one for luminance and another for a composite chrominance signal.
As a compromise, S-video (separated video, or super-video, e.g., in S-VHS) uses two wires: one for luminance and another for a composite chrominance signal. As a result, there is less crosstalk between the color information and the crucial gray-scale information. The reason for placing luminance in its own part of the signal is that black-and-white (luminance) information is most important for visual perception.
Humans are able to differentiate spatial resolution in the grayscale ("black-and-white") part of an image much better than in the color part. Therefore, the color information sent can be much less accurate than the intensity information. We can see only large blobs of color, so it makes sense to send less color detail.
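This observation is what chroma subsampling exploits: the chrominance plane is stored at a lower resolution than the luminance plane. A minimal sketch, averaging each 2x2 block of a chroma plane into one value (as in 4:2:0 sampling); the plane values here are made up for illustration:

```python
def subsample_2x2(plane):
    """Average each 2x2 block of a chroma plane into one value,
    halving the resolution in both directions (4:2:0-style).
    Assumes even width and height."""
    h, w = len(plane), len(plane[0])
    return [[(plane[y][x] + plane[y][x + 1] +
              plane[y + 1][x] + plane[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

chroma = [[10, 10, 20, 20],
          [10, 10, 20, 20],
          [30, 30, 40, 40],
          [30, 30, 40, 40]]
print(subsample_2x2(chroma))  # [[10.0, 20.0], [30.0, 40.0]]
```

Because only the chroma planes are reduced, the perceptually important luminance detail is preserved while the data rate drops.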
Table 4.1 Types of Color Video Signals
Video Broadcasting Standards/ TV standards
There are three different video broadcasting standards: PAL, NTSC,
and SECAM
PAL (Phase Alternate Line)
PAL is a TV standard originally invented by German scientists. It uses 625 horizontal lines at a field rate of 50 fields per second (or 25 frames per second). It is used in Australia, New Zealand, the United Kingdom, and much of Europe.
Scans 625 lines per frame, 25 frames per second
Interlaced; each frame is divided into 2 fields, 312.5 lines/field
For color representation, PAL uses the YUV (YCbCr) color model
SECAM (Sequential Color with Memory)
SECAM (Séquentiel Couleur à Mémoire) uses the same bandwidth as PAL but transmits the color information sequentially. It is very similar to PAL and specifies the same number of scan lines and frames per second: 625 scan lines per frame, at 25 frames per second. It is the broadcast standard for France, Russia, Eastern Europe, and parts of Africa.
SECAM and PAL are similar, differing slightly in their color-coding scheme.
In SECAM, the U and V signals are modulated using separate color subcarriers at 4.25 MHz and 4.41 MHz, respectively. They are sent on alternate lines; that is, only one of the U or V signals is sent on each scan line.
NTSC (National Television Standards Committee)
The NTSC TV standard is mostly used in North America and Japan. NTSC is
a black-and-white and color compatible 525-line system that scans a
nominal 30 interlaced television picture frames per second. Used in USA,
Canada, and Japan.
525 scan lines per frame, 30 frames per second (or, to be exact, 29.97 fps, 33.37 ms/frame)
Interlaced; each frame is divided into 2 fields, 262.5 lines/field
20 lines reserved for control information at the beginning of each field
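The figures quoted above for NTSC (and for PAL earlier) follow from simple arithmetic; a quick sketch, using 30/1.001 as the exact NTSC frame rate:

```python
def field_stats(lines_per_frame, fps):
    """Derive per-field line count, frame duration (ms), and
    field rate for an interlaced system (2 fields per frame)."""
    return {
        "lines_per_field": lines_per_frame / 2,
        "frame_ms": 1000 / fps,
        "fields_per_sec": fps * 2,
    }

ntsc = field_stats(525, 30 / 1.001)   # ~29.97 fps
pal = field_stats(625, 25)

print(ntsc["lines_per_field"])         # 262.5
print(round(ntsc["frame_ms"], 2))      # 33.37
print(pal["lines_per_field"])          # 312.5
print(pal["fields_per_sec"])           # 50
```

The half-integer lines-per-field values are exactly why the odd field ends, and the even field begins, at the middle of a scan line.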
Table 4.2 Comparison of analog broadcast TV systems.
HDTV (High Definition Television)
First-generation HDTV was based on an analog technology developed by
Sony and NHK in Japan in the late 1970s. HDTV successfully broadcast the
1984 Los Angeles Olympic Games in Japan. Multiple sub-Nyquist Sampling
Encoding (MUSE) was an improved NHK HDTV with hybrid analog/digital
technologies that was put in use in the 1990s. It has 1,125 scan lines,
interlaced (60 fields per second), and a 16:9 aspect ratio. It uses satellite broadcasting, which is quite appropriate for Japan, since the country can be covered with one or two satellites.
The Direct Broadcast Satellite (DBS) channels used have a bandwidth of 24 MHz.
High-Definition television (HDTV) means broadcast of television signals
with a higher resolution than traditional formats (NTSC, SECAM, PAL)
allow. Except for early analog formats in Europe and Japan, HDTV is broadcast digitally; therefore its introduction sometimes coincides with the introduction of digital television (DTV).
Modern plasma televisions use HDTV. It consists of 720-1,080 lines and a higher number of pixels per line (as many as 1,920). Having a choice between progressive and interlaced scanning is one advantage of HDTV; many people have their preferences.
Table 4.3 Advanced Digital TV Formats Supported by ATSC
HDTV vs Existing Signals (NTSC, PAL, or SECAM)
The HDTV signal is digital, resulting in crystal-clear, noise-free pictures and CD-quality sound. It has many viewer benefits, like choosing between interlaced and progressive scanning.
Standard Definition TV (SDTV) – the current NTSC TV, or higher
Enhanced Definition TV (EDTV) – 480 active lines or higher
High Definition TV (HDTV) – 720 active lines or higher. So far, the popular choices are 720P (720 lines, progressive, 30 fps) and 1080I (1,080 lines, interlaced, 30 fps or 60 fields per second). The latter provides slightly better picture quality but requires much higher bandwidth.
Video File Formats
File formats on the PC platform are indicated by the three-letter filename extension.
.mov = QuickTime movie file
.avi = Windows movie file
.mpg = MPEG file
.mp4 = MPEG-4 video file
.flv = Flash video file
.rm = RealMedia file
.3gp = 3GPP multimedia file (used in mobile phones)
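The extension-to-format mapping above is easy to express as a lookup table; a small sketch (the filenames are made up, and real applications would inspect the file contents rather than trust the extension):

```python
# Map the three-letter extensions listed above to format names.
VIDEO_FORMATS = {
    ".mov": "QuickTime Movie",
    ".avi": "Windows movie (AVI)",
    ".mpg": "MPEG",
    ".mp4": "MPEG-4",
    ".flv": "Flash Video",
    ".rm":  "RealMedia",
    ".3gp": "3GPP multimedia",
}

def identify(filename):
    """Look up a video format from a filename's extension
    (case-insensitive); returns 'unknown' if unrecognized."""
    ext = "." + filename.rsplit(".", 1)[-1].lower()
    return VIDEO_FORMATS.get(ext, "unknown")

print(identify("lecture.MP4"))  # MPEG-4
print(identify("notes.txt"))    # unknown
```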
Four Factors of Digital Video
With digital video, four factors have to be kept in mind. These are:
Frame Rate
The standard for displaying any type of non-film video is 30 frames per second (film is 24 frames per second). This means that the video is made up of 30 (or 24) pictures, or frames, for every second of video. Additionally, these frames are split in half (odd lines and even lines) to form what are called fields.
Color Resolution
Color resolution refers to the number of colors displayed on the screen at
one time. Computers deal with color in an RGB (red-green-blue) format,
while video uses a variety of formats. One of the most common video
formats is called YUV. Although there is no direct correlation between RGB
and YUV, they are similar in that they both have varying levels of color
depth (maximum number of colors).
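The relationship between RGB and YUV can be made concrete with one common definition, the classic BT.601 form. The coefficients below are the standard luma weights; the sample pixel values are made up for illustration:

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (components in the 0.0-1.0 range)
    to YUV using the BT.601 luma weights and the analog
    U/V scale factors."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance
    u = 0.492 * (b - y)                    # blue color difference
    v = 0.877 * (r - y)                    # red color difference
    return y, u, v

# Pure white carries full luminance and no chrominance:
y, u, v = rgb_to_yuv(1.0, 1.0, 1.0)  # y ~ 1.0, u and v ~ 0.0
print(y, u, v)
```

Note that the mapping is invertible, so no color information is lost by the transformation itself; savings come only when the U and V planes are later subsampled or quantized more coarsely than Y.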
Spatial Resolution
The third factor is spatial resolution – or in other words, “How big is the
picture?” Since PC and Macintosh computers generally have resolutions in
excess of 640 by 480, most people assume that this resolution is the video
standard. A standard analog video signal displays a full, overscanned image without the borders common to computer screens. The National Television Standards Committee (NTSC) standard used in North American and Japanese television uses a 768 by 484 display.
The Phase Alternate Line (PAL) standard for European television is slightly larger, at 768 by 576. Most countries endorse one or the other, but never both.
Since the resolutions of analog video and computers differ, conversion of analog video to digital video must at times take this into account. This can often result in the down-sizing of the video and the loss of some resolution.
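The down-sizing arithmetic can be sketched in a few lines, assuming a simple width-driven scale that preserves the aspect ratio (the target width of 640 is just an example):

```python
def downsize(src_w, src_h, dst_w):
    """Scale a source resolution to a target width, preserving
    the aspect ratio (height rounded to the nearest line)."""
    scale = dst_w / src_w
    return dst_w, round(src_h * scale)

print(downsize(768, 576, 640))  # PAL capture scaled down: (640, 480)
print(downsize(768, 484, 640))  # NTSC capture scaled down: (640, 403)
```

As the second example shows, the NTSC frame does not land on a standard computer resolution, which is one reason capture hardware often resamples to 640 by 480 directly.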
Image Quality
The last and most important factor is image quality.
END CHAPTER 4