Subject Code: KCS072
Roll No: 0 0 0 0 0 0 0 0 0 0 0 0 0
BTECH
(SEM VII) THEORY EXAMINATION 2024-25
NATURAL LANGUAGE PROCESSING
TIME: 3 HRS M.MARKS: 100
Note: Attempt all Sections. In case of any missing data, choose suitably.
SECTION A
1. Attempt all questions in brief. 2 x 10 = 20
Q no. Question CO Level
a. How does context influence error detection? 1 K1, K2
b. A Hidden Markov Model (HMM) is used for PoS tagging. Explain the backward algorithm used to compute tag probabilities. 1 K1, K2
c. Provide an example of unification of feature structures for agreement in number and gender. 2 K1, K2
d. Discuss the limitations of CFGs in modeling natural language syntax. 2 K1, K2
e. Compare and contrast dictionary-based and distributional methods for word similarity measurement. 3 K2
f. Define selectional restrictions. 3 K2
g. How are speech sounds classified? 4 K1, K2
h. How do vocal tract shape and size affect the spectrum of speech sounds? 4 K1, K2
i. Demonstrate the Viterbi algorithm. 5 K3, K4
j. Compare and contrast LPC and PLP coefficients for speech feature extraction. 5 K3, K4
SECTION B
2. Attempt any three of the following: 10 x 3 = 30
Q no. Question CO Level
a. Consider the regular expression (ab)*c. Draw the corresponding finite-state automaton and explain how it recognizes the language. 1 K1, K2
b. Given the sentence "The dog saw the man with the telescope," illustrate the ambiguity in parsing using dependency grammar. Propose a resolution. 2 K1, K2
c. Discuss how syntax-driven semantic analysis works. Create a semantic attachment for the sentence: "John gave Mary a book." 3 K2
d. Compare filter-bank and LPC methods in speech feature extraction. Provide numerical examples where possible. 4 K1, K2
e. What are likelihood distortions in speech recognition? Provide examples and their perceptual impact. 5 K3, K4
SECTION C
3. Attempt any one part of the following: 10 x 1 = 10
Q no. Question CO Level
a. A word processor uses a minimum edit distance algorithm to suggest corrections for misspelled words. If the word "intention" is misspelled
as "execution," calculate the minimum edit distance and outline the
alignment steps.
b. Compare and contrast interpolation and backoff smoothing techniques. How are these applied to n-gram models? 1 K1, K2
4. Attempt any one part of the following: 10 x 1 = 10
Q no. Question CO Level
a. Discuss the concept of treebanks in NLP. How do they facilitate training syntactic parsers? Illustrate with an example. 2 K1, K2
b. Explain the CYK parsing algorithm with a worked example of parsing the sentence "He saw a cat." using a given CFG. 2 K1, K2
5. Attempt any one part of the following: 10 x 1 = 10
Q no. Question CO Level
a. Given a sentence with multiple possible word senses (e.g., "bank"), outline how Word Sense Disambiguation (WSD) is performed using supervised learning. 3 K2
b. Implement a bootstrapping method for WSD using a small set of seed words. Illustrate with examples. 3 K2
6. Attempt any one part of the following: 10 x 1 = 10
Q no. Question CO Level
a. Explain the mathematical basis of the log-spectral distance measure. Compute it for two spectral frames with given power spectra. 4 K1, K2
b. Derive and explain the LPC coefficients for a given speech frame. 4 K1, K2
7. Attempt any one part of the following: 10 x 1 = 10
Q no. Question CO Level
a. Discuss spectral distortion measures in speech analysis. Calculate the cepstral distance for given cepstral coefficients. 5 K3, K4
b. Discuss the role of HMMs in speech recognition. Explain the forward and backward procedures with an example. 5 K3, K4