NLP Exam Paper Analysis

Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)
(MAY 23)

Time: 3 hours                                                Max. Marks: 80

N.B. (1) Question No. 1 is compulsory
     (2) Assume suitable data if necessary
     (3) Attempt any three questions from the remaining questions

Q.1 Solve any Four out of Five (5 marks each)

a Explain the challenges of Natural Language processing.

b Explain how N-gram model is used in spelling correction.
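As background for this question: a bigram language model scores each candidate correction by how likely it is to follow the previous word. A minimal sketch (the toy corpus and the candidate set for the misspelling are assumptions for illustration, not part of the paper):

```python
from collections import Counter

# Toy corpus; real systems estimate counts from a large corpus (assumption here).
corpus = "he went to the store he went to the park she went to the store".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev, word):
    """Maximum-likelihood P(word | prev); no smoothing, for brevity."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

def rank_candidates(prev, candidates):
    """Pick the candidate correction that best fits the left context."""
    return max(candidates, key=lambda w: bigram_prob(prev, w))

# For the misspelling "teh" in "went to teh ...", candidates {"the", "ten"}:
print(rank_candidates("to", {"the", "ten"}))  # "the"
```

A fuller answer would combine this language-model score with a channel model of likely typos (noisy-channel correction).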

c Explain three types of referents that complicate the reference resolution problem.

d Explain Machine Translation Approaches used in NLP.

e Explain the various stages of Natural Language processing.


Q.2 (10 marks each)

a What is Word Sense Disambiguation (WSD)? Explain the dictionary based approach to Word Sense Disambiguation.

b Represent the output of morphological analysis for a regular verb, irregular verb, singular noun, and plural noun. Also explain the role of FST in Morphological Parsing with an example.

Q.3 (10 marks each)

a Explain the ambiguities associated at each level with example for Natural Language processing.

b Explain Discourse reference resolution in detail.


Q.4 (10 marks each)

a Apply Hidden Markov Model and do POS tagging for the given statement.
  Rules:
  N: Noun [Martin, Justin, Will, Spot, Pat]
  M: Modal verb [can, will]
  V: Verb [watch, spot, pat]
  For the given corpus above, create the Transition Matrix & Emission Probability Matrix.
  Statement is “Justin will spot Will”

b For a given grammar, parse the statement “The man read this book” using the CYK (CKY) algorithm.

Q.5 (10 marks each)

a Explain Porter Stemmer algorithm with rules.

b Explain Maximum Entropy Model for POS Tagging.

Q.6 (10 marks each)

a Describe in detail Centering Algorithm for reference resolution.

b Explain information retrieval versus information extraction systems.

30651                                                Page 2 of 2
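A minimal CYK recognizer for the parsing question above; since this copy does not reproduce the paper's grammar, the CNF grammar here is an assumption chosen to cover the sentence:

```python
# Minimal CYK recognizer over a Chomsky-normal-form grammar.
# Grammar and lexicon are illustrative assumptions, not the paper's grammar.
grammar = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
lexicon = {
    "the": {"Det"}, "this": {"Det"},
    "man": {"N"}, "book": {"N"},
    "read": {"V"},
}

def cyk(words):
    n = len(words)
    # table[i][j] = set of nonterminals deriving words[i..j] inclusive
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(lexicon.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # try every split point
                for b in table[i][k]:
                    for c in table[k + 1][j]:
                        table[i][j] |= grammar.get((b, c), set())
    return "S" in table[0][n - 1]

print(cyk("the man read this book".split()))  # True
```

The accepted parse is S → NP(the man) VP(read NP(this book)); a full answer would also show the filled triangular table.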
Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)
(DEC 22)

Time: 3 Hours                                                Max. Marks: 80

=====================================================================

N.B. (1) Question No. 1 is compulsory
     (2) Assume suitable data if necessary
     (3) Attempt any three questions from the remaining questions

Q.1 Any Four [20M]

a Differentiate between Syntactic ambiguity and Lexical Ambiguity. [5M]

b Define affixes. Explain the types of affixes. [5M]

c Describe open class words and closed class words in English with examples. [5M]

d What is rule-based machine translation? [5M]

e Explain with suitable example the following relationships between word meanings: [5M]
  Homonymy, Polysemy, Synonymy, Antonymy


f Explain perplexity of any language model. [5M]
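For reference, the perplexity of a test sequence w_1…w_N under a model is PP = P(w_1…w_N)^(-1/N). A sketch computed in log space for numerical stability (the uniform toy model is an assumption for illustration):

```python
import math

def perplexity(probs):
    """Perplexity from per-word model probabilities P(w_i | history):
    PP = (prod p_i) ** (-1/N), computed in log space."""
    n = len(probs)
    log_sum = sum(math.log(p) for p in probs)
    return math.exp(-log_sum / n)

# A uniform model over a 4-word vocabulary assigns 1/4 to every word,
# so its perplexity on any sequence is exactly the vocabulary size, 4.
print(perplexity([0.25, 0.25, 0.25, 0.25, 0.25, 0.25]))  # 4.0
```

This matches the usual intuition that perplexity is the model's effective branching factor: lower perplexity means the model is less "surprised" by the test data.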

Q.2 a) Explain the role of FSA in morphological analysis?

Q.2 b) Explain different stages involved in the NLP process with suitable example. [10M]
Q.3 a) Consider the following corpus [5M]
<s> I tell you to sleep and rest </s>
<s> I would like to sleep for an hour </s>
<s> Sleep helps one to relax </s>


List all possible bigrams. Compute the conditional probabilities and predict the next word for the word “to”.
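The requested computation can be sketched directly from the three sentences above:

```python
from collections import Counter

sentences = [
    "<s> I tell you to sleep and rest </s>",
    "<s> I would like to sleep for an hour </s>",
    "<s> Sleep helps one to relax </s>",
]

bigrams = Counter()
unigrams = Counter()
for s in sentences:
    words = s.lower().split()  # case-folded so "Sleep" and "sleep" match
    unigrams.update(words)
    bigrams.update(zip(words, words[1:]))

# Conditional probability P(w | "to") = count("to" w) / count("to")
followers = {w2: c / unigrams["to"] for (w1, w2), c in bigrams.items() if w1 == "to"}
print(followers)   # {'sleep': 0.666..., 'relax': 0.333...}

prediction = max(followers, key=followers.get)
print(prediction)  # 'sleep'
```

"to" occurs 3 times and is followed by "sleep" twice and "relax" once, so P(sleep | to) = 2/3 and the predicted next word is "sleep".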


Q.3 b) Explain Yarowsky bootstrapping approach of semi-supervised learning [5M]

Q.3 c) What is POS tagging? Discuss various challenges faced by POS tagging. [10M]

Q.4 a) What are the limitations of Hidden Markov Model? [5M]

Q.4 b) Explain the different steps in text processing for Information Retrieval [5M]

Q.4 c) Compare top-down and bottom-up approach of parsing with example. [10M]
Q.5 a) What do you mean by word sense disambiguation (WSD)? Discuss dictionary-based approach for WSD. [10M]

Q.5 b) Explain Hobbs algorithm for pronoun resolution. [10M]

Q.6 a) Explain Text summarization in detail. [10M]


Q.6 b) Explain Porter Stemming algorithm in detail [10M]
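The full Porter algorithm applies five ordered steps of suffix rules with measure conditions on the stem; a sketch of step 1a (plural suffixes) alone, to show the rule style:

```python
def porter_step_1a(word):
    """Step 1a of the Porter stemmer (plural handling) — a sketch only;
    the full algorithm has several more steps of ordered suffix rules."""
    if word.endswith("sses"):
        return word[:-2]   # SSES -> SS : caresses -> caress
    if word.endswith("ies"):
        return word[:-2]   # IES  -> I  : ponies   -> poni
    if word.endswith("ss"):
        return word        # SS   -> SS : caress   -> caress
    if word.endswith("s"):
        return word[:-1]   # S    ->    : cats     -> cat
    return word

for w in ["caresses", "ponies", "caress", "cats"]:
    print(w, "->", porter_step_1a(w))
```

Rules are tried in order and only the first matching suffix fires, which is the pattern the later steps (e.g. -ational → -ate, -ization → -ize) follow as well.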


*****************

16298
Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)
(DEC 23)

Duration: 3hrs                                               [Max Marks: 80]

N.B.: (1) Question No 1 is Compulsory.
      (2) Attempt any three questions out of the remaining five.
      (3) All questions carry equal marks.
      (4) Assume suitable data, if required and state it clearly.

1 Attempt any FOUR [20]

a What are rule-based and stochastic part-of-speech taggers?

b Explain Good-Turing Discounting.
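For reference, Good-Turing re-estimates a count c as c* = (c+1)·N_{c+1}/N_c, where N_c is the number of types occurring exactly c times. A sketch with a hypothetical frequency-of-frequencies table (the numbers are assumptions for illustration):

```python
def good_turing_adjusted_count(c, freq_of_freq):
    """Good-Turing discounting: c* = (c + 1) * N_{c+1} / N_c,
    where N_c = number of types seen exactly c times."""
    return (c + 1) * freq_of_freq[c + 1] / freq_of_freq[c]

# Hypothetical frequency-of-frequencies table: N_1 = 120, N_2 = 40, N_3 = 24.
N = {1: 120, 2: 40, 3: 24}

c_star_1 = good_turing_adjusted_count(1, N)  # 2 * 40 / 120 = 0.666...
print(c_star_1)
```

The mass freed by discounting seen counts (here, singletons drop from 1 to about 0.67) is what the method reserves for unseen events, whose total probability is estimated as N_1 divided by the total token count.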
c Explain statistical approach for machine translation.

d Explain with suitable example the following relationships between word meanings:
  Hyponymy, Hypernymy, Meronymy, Holonymy

e What is reference resolution?
2 a Explain FSA for nouns and verbs. Also design a Finite State Automaton (FSA) for the words of English numbers 1-99. [10]
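One way to sketch the number-word FSA for 1–99 is as a token-level transition table; the state names here are assumptions of this sketch, and a written answer would draw the equivalent state diagram:

```python
# FSA over word tokens for English numbers 1-99.
# States: "q0" (start); "unit", "teen", "tens" (all accepting);
# from "tens" an optional unit word is allowed ("twenty one" ... "ninety nine").
UNITS = {"one", "two", "three", "four", "five", "six", "seven", "eight", "nine"}
TEENS = {"ten", "eleven", "twelve", "thirteen", "fourteen", "fifteen",
         "sixteen", "seventeen", "eighteen", "nineteen"}
TENS  = {"twenty", "thirty", "forty", "fifty", "sixty", "seventy", "eighty", "ninety"}

ACCEPTING = {"unit", "teen", "tens"}

def step(state, token):
    if state == "q0":
        if token in UNITS: return "unit"
        if token in TEENS: return "teen"
        if token in TENS:  return "tens"
    elif state == "tens" and token in UNITS:
        return "unit"      # e.g. "twenty" followed by "one"
    return None            # no transition defined: reject

def accepts(tokens):
    state = "q0"
    for t in tokens:
        state = step(state, t)
        if state is None:
            return False
    return state in ACCEPTING

print(accepts(["twenty", "one"]))  # True
print(accepts(["seventeen"]))      # True
print(accepts(["one", "twenty"]))  # False
```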


b Discuss the challenges in various stages of natural language processing. [10]

3 a Consider the following corpus [10]


<s> the/DT students/NN pass/V the/DT test/NN <\s>
<s> the/DT students/NN wait/V for/P the/DT result/NN <\s>
<s> teachers/NN test/V students/NN <\s>

Compute the emission and transition probabilities for a bigram HMM. Also decode the following sentence using the Viterbi algorithm.

“The students wait for the test”
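A sketch of the requested computation on the corpus above: maximum-likelihood transition and emission counts, plus a compact, unsmoothed Viterbi decoder. This is one illustrative solution, not the paper's model answer:

```python
from collections import Counter

corpus = [
    [("the", "DT"), ("students", "NN"), ("pass", "V"), ("the", "DT"), ("test", "NN")],
    [("the", "DT"), ("students", "NN"), ("wait", "V"), ("for", "P"), ("the", "DT"), ("result", "NN")],
    [("teachers", "NN"), ("test", "V"), ("students", "NN")],
]

trans, emit, tag_count = Counter(), Counter(), Counter()
for sent in corpus:
    prev = "<s>"
    for word, tag in sent:
        trans[(prev, tag)] += 1
        emit[(tag, word)] += 1
        tag_count[tag] += 1
        prev = tag
    trans[(prev, "</s>")] += 1  # sentence-final transition (not used below)
tag_count["<s>"] = len(corpus)

def P_trans(a, b):
    return trans[(a, b)] / tag_count[a]

def P_emit(t, w):
    return emit[(t, w)] / tag_count[t]

def viterbi(words, tags=("DT", "NN", "V", "P")):
    # V maps each tag to (probability of the best path ending in it, that path).
    V = {t: (P_trans("<s>", t) * P_emit(t, words[0]), [t]) for t in tags}
    for w in words[1:]:
        V = {t: max(((p * P_trans(pt, t) * P_emit(t, w), path + [t])
                     for pt, (p, path) in V.items()), key=lambda x: x[0])
             for t in tags}
    return max(V.values(), key=lambda x: x[0])[1]

print(viterbi("the students wait for the test".split()))
# ['DT', 'NN', 'V', 'P', 'DT', 'NN']
```

For example, every DT in the corpus is followed by NN, so P(NN | DT) = 1, and P(students | NN) = 3/6 = 0.5; the decoded tag sequence is DT NN V P DT NN.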


b What are five types of referring expressions? Explain with the help of example. [10]

4 a Explain dictionary-based approach (Lesk algorithm) for word sense disambiguation (WSD) with suitable example. [10]
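A sketch of the simplified Lesk overlap idea behind the dictionary-based approach; the two "bank" glosses below are illustrative assumptions, not actual dictionary text:

```python
def simplified_lesk(context, senses):
    """Simplified Lesk: choose the sense whose gloss shares the most
    words with the sentence context (bag-of-words overlap)."""
    context_words = set(context.lower().split())

    def overlap(gloss):
        return len(set(gloss.lower().split()) & context_words)

    return max(senses, key=lambda s: overlap(senses[s]))

# Hypothetical glosses for the ambiguous word "bank":
senses = {
    "bank#1": "sloping land beside a body of water such as a river",
    "bank#2": "a financial institution that accepts deposits and lends money",
}
print(simplified_lesk("he sat on the bank of the river fishing", senses))
# bank#1
```

Here the context shares "river" (and "of") with the first gloss and nothing with the second, so the water-side sense wins.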


b Explain the various challenges in POS tagging. [10]

5 a Explain Porter Stemming algorithm in detail. [10]

b Explain the use of Probabilistic Context Free Grammar (PCFG) in natural language processing with example. [10]

6 a Explain Question Answering system (QAS) in detail. [10]

b Explain how Conditional Random Field (CRF) is used for sequence labeling. [10]
**************

41703
Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)
(MAY 24)

Duration: 3hrs                                               [Max Marks: 80]

N.B.: (1) Question No 1 is Compulsory.
      (2) Attempt any three questions out of the remaining five.
      (3) All questions carry equal marks.
      (4) Assume suitable data, if required, and state it clearly.

Q1a) Explain the applications of Natural Language processing. 5M

Q1b) Illustrate the concept of tokenization and stemming in Natural Language processing. 5M
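A minimal illustration of tokenization followed by crude suffix stripping; the stripper here is a deliberately simple stand-in for a real stemmer such as Porter's:

```python
import re

def tokenize(text):
    """Lowercase word tokenization via a simple word regex
    (punctuation is discarded)."""
    return re.findall(r"[a-z]+", text.lower())

def strip_suffix(token):
    """Crude suffix stripping to illustrate stemming;
    not the full Porter algorithm."""
    for suf in ("ing", "ed", "s"):
        if token.endswith(suf) and len(token) > len(suf) + 2:
            return token[: -len(suf)]
    return token

tokens = tokenize("The cats were playing in the garden.")
print(tokens)                             # ['the', 'cats', 'were', ...]
print([strip_suffix(t) for t in tokens])  # ['the', 'cat', 'were', 'play', ...]
```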

Q1c) Discuss the challenges in part of speech tagging. 5M

Q1d) Describe the semantic analysis in Natural Language processing. 5M

Q2a) Explain inflectional and derivational morphology with an example 10M

Q2b) Illustrate the working of Porter stemmer algorithm 10M

Q3a) Explain hidden Markov model for POS based tagging. 10M

Q3b) Demonstrate the concept of Conditional Random Field in NLP 10M

Q4a) Explain the Lesk algorithm for Word Sense Disambiguation. 10M

Q4b) Demonstrate lexical semantic analysis using an example 10M

Q5a) Illustrate the reference phenomena for solving the pronoun problem 10M

Q5b) Explain Anaphora Resolution using Hobbs and Centering Algorithm 10M

Q6a) Demonstrate the working of machine translation systems 10M

Q6b) Explain the Information retrieval system 10M


______________________

56153                                                Page 1 of 1