NLP SEM QB
5 Marks:
2022 1
None
2022 2
1. Differentiate between Syntactic ambiguity and Lexical Ambiguity.
2. Define affixes. Explain the types of affixes.
3. Describe open class words and closed class words in English with examples.
4. What is rule-based machine translation?
5. Explain with suitable examples the following relationships between word meanings:
Homonymy, Polysemy, Synonymy, Antonymy.
6. Explain the perplexity of a language model.
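A worked numeric illustration can anchor an answer to this question: for a test sequence W of N tokens, PP(W) = P(W)^(-1/N). The sketch below uses a hypothetical unigram model (the probabilities are invented purely for illustration):

```python
import math

# Perplexity of a toy unigram model on a test sequence:
# PP(W) = P(w1..wN) ** (-1/N), computed in log space for numerical stability.
probs = {"i": 0.4, "like": 0.3, "nlp": 0.3}  # hypothetical unigram probabilities
test = ["i", "like", "nlp"]

log_prob = sum(math.log(probs[w]) for w in test)
perplexity = math.exp(-log_prob / len(test))
print(perplexity)  # roughly 3.03: the model is about as "surprised" as a
                   # uniform choice among ~3 words at each step
```

A lower perplexity on held-out text indicates a better language model.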
7. Consider the following corpus
<s> I tell you to sleep and rest </s>
<s> I would like to sleep for an hour </s>
<s> Sleep helps one to relax </s>
List all possible bigrams. Compute the conditional probabilities and predict
the next word for the word “to”.
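The counting in this question can be sketched programmatically; the snippet below keeps the sentence markers as tokens and lowercases the text, which are assumptions about the intended tokenization:

```python
from collections import Counter

corpus = [
    "<s> i tell you to sleep and rest </s>",
    "<s> i would like to sleep for an hour </s>",
    "<s> sleep helps one to relax </s>",
]

unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    toks = sent.split()
    unigrams.update(toks)
    bigrams.update(zip(toks, toks[1:]))

# Conditional probability P(w2 | w1) = count(w1 w2) / count(w1)
after_to = {w2: c / unigrams["to"]
            for (w1, w2), c in bigrams.items() if w1 == "to"}
print(after_to)                         # {'sleep': 0.666..., 'relax': 0.333...}
print(max(after_to, key=after_to.get))  # 'sleep' is the predicted next word
```

Since "to" occurs 3 times and is followed by "sleep" twice, P(sleep | to) = 2/3 and the model predicts "sleep".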
8. Explain Yarowsky's bootstrapping approach to semi-supervised learning.
9. What are the limitations of Hidden Markov Model?
10. Explain the different steps in text processing for Information Retrieval.
2023 1
11. Explain the challenges of Natural Language processing.
12. Explain how the N-gram model is used in spelling correction.
13. Explain three types of referents that complicate the reference resolution problem.
14. Explain Machine Translation Approaches used in NLP.
15. Explain the various stages of Natural Language processing.
2023 2
16. What are rule-based and stochastic part-of-speech taggers?
17. Explain Good-Turing discounting.
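A small numeric sketch of the Good-Turing estimate c* = (c + 1) · N_{c+1} / N_c may help; the count table below is invented for illustration:

```python
from collections import Counter

# Hypothetical observed counts for five event types
counts = {"a": 3, "d": 2, "b": 1, "c": 1, "e": 1}
N_c = Counter(counts.values())   # frequency of frequencies: N_1=3, N_2=1, N_3=1
N = sum(counts.values())         # total observations = 8

def good_turing(c):
    """Discounted count c* = (c + 1) * N_{c+1} / N_c."""
    return (c + 1) * N_c[c + 1] / N_c[c]

print(good_turing(1))   # 2 * N_2 / N_1 = 2/3: singletons are discounted
print(N_c[1] / N)       # 3/8: probability mass reserved for unseen events
```

The mass removed from seen events (driven by the singleton count N_1) is what gets redistributed to unseen events.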
18. Explain statistical approach for machine translation.
19. Explain with suitable example the following relationships between word meanings:
Hyponymy, Hypernymy, Meronymy, Holonymy.
20. What is reference resolution?
2024
21. Explain the applications of Natural Language processing.
22. Illustrate the concept of tokenization and stemming in Natural Language processing.
23. Discuss the challenges in part of speech tagging.
24. Describe the semantic analysis in Natural Language processing.
10 Marks:
2022 1
1. What are information retrieval and machine translation as applications of NLP? Give a brief
answer on both.
2. What is Word Sense Disambiguation? Illustrate with an example how the Dictionary-based
approach identifies the correct sense of an ambiguous word.
3. Explain derivational and inflectional morphology in detail with suitable example.
4. Why is it important to preprocess text data in natural language processing? Explain the steps of
preprocessing in detail with examples.
5. What are the five types of referring expressions? Explain with example.
6. Write a note on text summarization.
7. What is a language model? Explain the N-gram model.
8. How is an HMM used for POS tagging? Explain in detail.
9. What are a lexicon and a lexeme? Explain the different types of relations that hold between
lexemes, with examples.
2022 2
10. Explain the different stages involved in the NLP process with suitable examples.
11. What is POS tagging? Discuss various challenges faced by POS tagging.
12. Compare top-down and bottom-up approach of parsing with example.
13. What do you mean by word sense disambiguation (WSD)? Discuss the dictionary-based
approach to WSD.
14. Explain Hobbs algorithm for pronoun resolution.
15. Explain Text summarization in detail.
16. Explain Porter Stemming algorithm in detail.
2023 1
17. What is Word Sense Disambiguation (WSD)? Explain the dictionary-based approach to
Word Sense Disambiguation.
18. Represent the output of morphological analysis for a regular verb, an irregular verb, a singular
noun, and a plural noun. Also explain the role of FSTs in morphological parsing with an example.
19. Explain the ambiguities associated with each level of Natural Language processing, with
examples.
20. Explain Discourse reference resolution in detail.
21. For the given corpus,
N: Noun [Martin, Justin, Will, Spot, Pat]
M: Modal verb [can, will]
V: Verb [watch, spot, pat]
create the transition matrix and emission probability matrix.
The statement is “Justin will spot Will”.
Apply the Hidden Markov Model and do POS tagging for the given statement.
22. Describe in detail Centering Algorithm for reference resolution.
23. For the given grammar, parse the statement “The man read this book” using the CYK (CKY)
algorithm.
Rules:
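Since the grammar rules did not carry over into this QB, the sketch below assumes a small hypothetical CNF grammar that covers the sentence, and fills the CYK table bottom-up:

```python
# Hypothetical CNF grammar covering "The man read this book"
# (the original QB rules were not reproduced, so these are assumed)
grammar = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
lexicon = {
    "the": {"Det"}, "this": {"Det"},
    "man": {"N"}, "book": {"N"},
    "read": {"V"},
}

def cyk(words):
    n = len(words)
    # table[i][j]: set of nonterminals deriving words[i..j] inclusive
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(lexicon.get(w.lower(), ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):              # try every split point
                for B in table[i][k]:
                    for C in table[k + 1][j]:
                        table[i][j] |= grammar.get((B, C), set())
    return table[0][n - 1]

print(cyk("The man read this book".split()))  # {'S'}: the sentence parses
```

The sentence is accepted exactly when the start symbol S appears in the top-right cell of the table.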
24. Explain Porter Stemmer algorithm with rules.
25. Explain information retrieval versus information extraction systems.
26. Explain Maximum Entropy Model for POS Tagging.
2023 2
27. Explain FSAs for nouns and verbs. Also design a Finite State Automaton (FSA) for the words of
English numbers 1-99.
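A sketch of such an automaton follows; it assumes hyphenated numerals are pre-split into tokens ("twenty-one" becomes ["twenty", "one"]), and encodes the states explicitly:

```python
# FSA for English number words 1-99.
# States: 0 = start, 1 = complete number, 2 = after a tens word
# (state 2 is also accepting, since "twenty" alone is a valid number).
UNITS = {"one", "two", "three", "four", "five", "six", "seven", "eight", "nine"}
TEENS = {"ten", "eleven", "twelve", "thirteen", "fourteen", "fifteen",
         "sixteen", "seventeen", "eighteen", "nineteen"}
TENS  = {"twenty", "thirty", "forty", "fifty", "sixty", "seventy",
         "eighty", "ninety"}

def accepts(tokens):
    state = 0
    for t in tokens:
        if state == 0 and (t in UNITS or t in TEENS):
            state = 1
        elif state == 0 and t in TENS:
            state = 2
        elif state == 2 and t in UNITS:
            state = 1
        else:
            return False            # no valid transition: reject
    return state in (1, 2)          # accepting states

print(accepts(["twenty", "one"]))   # True
print(accepts(["eleven"]))          # True
print(accepts(["one", "twenty"]))   # False
```

The three word classes (units, teens, tens) correspond to the three arc labels an exam diagram of this FSA would use.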
28. Discuss the challenges in various stages of natural language processing.
29. Consider the following corpus
<s> the/DT students/NN pass/V the/DT test/NN </s>
<s> the/DT students/NN wait/V for/P the/DT result/NN </s>
<s> teachers/NN test/V students/NN </s>
Compute the emission and transition probabilities for a bigram HMM. Also decode the
following sentence using the Viterbi algorithm.
“The students wait for the test”
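One way to organize an answer is to derive the counts programmatically. The sketch below estimates the bigram HMM parameters from the three tagged sentences and runs a small Viterbi decoder; it uses no smoothing, so unseen transitions and emissions get probability 0 (an assumption that is fine for this toy corpus):

```python
from collections import Counter

tagged = [
    [("the", "DT"), ("students", "NN"), ("pass", "V"), ("the", "DT"),
     ("test", "NN")],
    [("the", "DT"), ("students", "NN"), ("wait", "V"), ("for", "P"),
     ("the", "DT"), ("result", "NN")],
    [("teachers", "NN"), ("test", "V"), ("students", "NN")],
]

trans, emit, tag_count = Counter(), Counter(), Counter()
for sent in tagged:
    prev = "<s>"
    tag_count[prev] += 1
    for word, tag in sent:
        trans[(prev, tag)] += 1
        emit[(tag, word)] += 1
        tag_count[tag] += 1
        prev = tag
    trans[(prev, "</s>")] += 1

def P_t(prev, tag):   # transition probability P(tag | prev)
    return trans[(prev, tag)] / tag_count[prev]

def P_e(tag, word):   # emission probability P(word | tag)
    return emit[(tag, word)] / tag_count[tag]

def viterbi(words, tags=("DT", "NN", "V", "P")):
    # Each cell holds (best score, tag path including the "<s>" marker)
    col = {t: (P_t("<s>", t) * P_e(t, words[0]), ["<s>"]) for t in tags}
    for w in words[1:]:
        nxt = {}
        for t in tags:
            best = max(col, key=lambda p: col[p][0] * P_t(p, t))
            nxt[t] = (col[best][0] * P_t(best, t) * P_e(t, w),
                      col[best][1] + [best])
        col = nxt
    last = max(col, key=lambda t: col[t][0] * P_t(t, "</s>"))
    return col[last][1][1:] + [last]    # drop the "<s>" marker

print(P_t("<s>", "DT"), P_e("NN", "students"))   # 2/3 and 1/2
print(viterbi("the students wait for the test".split()))
# ['DT', 'NN', 'V', 'P', 'DT', 'NN']
```

The decoded tag sequence matches the tagging seen for the nearly identical second training sentence, as expected.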
30. What are five types of referring expressions? Explain with the help of example.
31. Explain dictionary-based approach (Lesk algorithm) for word sense disambiguation (WSD)
with suitable example.
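A minimal sketch of the simplified Lesk algorithm may help here: it picks the sense whose dictionary gloss shares the most words with the context of the ambiguous word. The glosses below are hypothetical, abbreviated entries for "bank", not taken from any real dictionary:

```python
# Hypothetical abbreviated glosses for two senses of "bank"
glosses = {
    "bank_financial": "an institution that accepts deposits and lends money",
    "bank_river": "sloping land beside a body of water such as a river",
}

def simplified_lesk(context, glosses):
    """Return the sense whose gloss has the largest word overlap with context."""
    ctx = set(context.lower().split())
    best, best_overlap = None, -1
    for sense, gloss in glosses.items():
        overlap = len(ctx & set(gloss.lower().split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(simplified_lesk("he deposited money in the bank", glosses))
# 'bank_financial' (the context shares "money" with the financial gloss)
```

The full Lesk algorithm also compares glosses of the surrounding words' senses; in practice stopwords are removed before computing the overlap.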
32. Explain the various challenges in POS tagging.
33. Explain Porter Stemming algorithm in detail.
34. Explain the use of Probabilistic Context Free Grammar (PCFG) in natural language processing
with example.
35. Explain Question Answering system (QAS) in detail.
36. Explain how Conditional Random Field (CRF) is used for sequence labeling.
2024
37. Explain inflectional and derivational morphology with an example.
38. Illustrate the working of Porter stemmer algorithm.
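The full Porter algorithm applies five ordered steps of suffix rules with measure-based conditions; the toy sketch below implements only the Step 1a rules, to show the longest-match mechanics an answer should illustrate:

```python
# Porter stemmer Step 1a suffix rules (a small, faithful subset of the
# full algorithm): the first matching rule, longest suffix first, applies.
RULES_1A = [("sses", "ss"), ("ies", "i"), ("ss", "ss"), ("s", "")]

def step1a(word):
    for suffix, repl in RULES_1A:
        if word.endswith(suffix):
            return word[: len(word) - len(suffix)] + repl
    return word

for w in ["caresses", "ponies", "caress", "cats"]:
    print(w, "->", step1a(w))
# caresses -> caress, ponies -> poni, caress -> caress, cats -> cat
```

Note that the "ss" -> "ss" rule exists only to shield words like "caress" from the bare "s"-deletion rule below it, which is why rule order matters.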
39. Explain the Hidden Markov Model for POS tagging.
40. Demonstrate the concept of Conditional Random Fields in NLP.
41. Explain the Lesk algorithm for Word Sense Disambiguation.
42. Demonstrate lexical semantic analysis using an example.
43. Illustrate the reference phenomena for solving the pronoun problem.
44. Explain Anaphora Resolution using the Hobbs and Centering algorithms.
45. Demonstrate the working of machine translation systems.
46. Explain the Information retrieval system.