Department of Computer Engineering
(Mock Interview Questions)
Subject: Natural Language Processing (22PCCO7010T) Class: Final Year B Tech
Name of Faculty: Dr. V. M. Patil, Dr. S. M. Pardeshi and Dr. M. M. Saiyyad.
1. What is the primary goal of Natural Language Processing (NLP)?
   To enable computers to understand, interpret, and generate human language. NLP bridges the gap between human communication and computer understanding.
2. Which level of NLP deals with grammar and sentence structure?
   The syntax level: it focuses on how words are arranged in sentences to form correct grammatical structures.
3. What is ambiguity in NLP? Give one example.
   Ambiguity is when a word or sentence has more than one meaning. Example: “I saw the man with the telescope.” It is unclear who has the telescope.
4. Which stage of NLP converts spoken input into text?
   Speech recognition: it transforms audio speech into textual data.
5. Name any one real-life application of NLP.
   Chatbots, spam mail detection, sentiment analysis, etc.
6. What is the purpose of a regular expression in NLP?
   To search, match, or manipulate text using specific patterns. Example: \d+ matches one or more digits.
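For illustration (not part of the original sheet), the \d+ example can be tried with Python's re module; the sample sentence is made up:

```python
import re

text = "Order 42 shipped in 7 days"
# \d+ matches one or more consecutive digits
matches = re.findall(r"\d+", text)
print(matches)  # ['42', '7']
```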
7. Which automaton is used to model lexical analyzers in NLP?
   A finite automaton: it is used in pattern matching and lexical analysis.
8. What does the Porter Stemmer do?
   It removes suffixes from words to find their root form. Example: “running” → “run”.
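To make the idea concrete, here is a drastically simplified suffix-stripper in the spirit of (but not identical to) the Porter algorithm, which actually applies five phases of rules:

```python
def simple_stem(word):
    """Toy suffix-stripper; the real Porter stemmer is far more elaborate."""
    for suffix in ("ing", "ed", "es", "s"):
        # only strip when a reasonably long stem remains
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            word = word[: -len(suffix)]
            # undouble a trailing double consonant, e.g. 'runn' -> 'run'
            if len(word) >= 2 and word[-1] == word[-2] and word[-1] not in "aeiou":
                word = word[:-1]
            break
    return word

print(simple_stem("running"))  # run
print(simple_stem("walked"))   # walk
```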
9. What is the output of tokenization?
   A list of tokens (words or punctuation) from the input text. Example: “I love NLP.” → [‘I’, ‘love’, ‘NLP’, ‘.’]
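A minimal regex-based tokenizer (one of many possible approaches) reproduces this example:

```python
import re

def tokenize(text):
    # words (\w+) or single non-space punctuation characters
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("I love NLP."))  # ['I', 'love', 'NLP', '.']
```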
10. What is the minimum edit distance between "kitten" and "sitting"?
    3. Three operations: substitute ‘k’→‘s’, substitute ‘e’→‘i’, and insert ‘g’.
11. Which model is used to compute the probability of a word sequence?
    The n-gram model: it predicts the next word using the previous n−1 words.
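For the bigram case (n = 2), the maximum-likelihood estimate is P(w2 | w1) = count(w1, w2) / count(w1); a sketch on a made-up toy corpus:

```python
from collections import Counter

corpus = "i love nlp i love parsing".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(w1, w2):
    # MLE estimate: P(w2 | w1) = count(w1, w2) / count(w1)
    return bigrams[(w1, w2)] / unigrams[w1]

print(bigram_prob("i", "love"))    # 1.0  ('i' is always followed by 'love')
print(bigram_prob("love", "nlp"))  # 0.5
```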
12. Which smoothing technique adds 1 to all counts to avoid zero probability?
    Add-one (Laplace) smoothing: it ensures no word has zero probability, even if unseen.
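With add-one smoothing the bigram estimate becomes P(w2 | w1) = (count(w1, w2) + 1) / (count(w1) + V), where V is the vocabulary size; a sketch on the same kind of toy corpus:

```python
from collections import Counter

corpus = "i love nlp i love parsing".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def laplace_prob(w1, w2):
    # add-one smoothing: unseen bigrams still get a small probability
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)

print(laplace_prob("nlp", "parsing"))  # unseen bigram, yet probability > 0
```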
13. Which tagging approach uses predefined rules and lexicons?
    Rule-based POS tagging: it uses dictionaries and manually crafted rules for tagging.
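A toy sketch of the idea, using a hypothetical mini-lexicon and one handcrafted suffix rule (real rule-based taggers use large dictionaries and many rules):

```python
# Hypothetical mini-lexicon for illustration only
LEXICON = {"the": "DET", "dog": "NOUN", "cat": "NOUN", "runs": "VERB"}

def tag(tokens):
    tags = []
    for tok in tokens:
        if tok.lower() in LEXICON:
            tags.append(LEXICON[tok.lower()])   # dictionary lookup
        elif tok.endswith("ly"):
            tags.append("ADV")                  # handcrafted suffix rule
        else:
            tags.append("NOUN")                 # default guess
    return list(zip(tokens, tags))

print(tag(["The", "dog", "runs", "quickly"]))
```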
14. What does POS tagging assign to each word in a sentence?
    A grammatical category such as noun, verb, or adjective. Example: “book” can be a noun or a verb.
15. Which parsing algorithm handles left recursion and ambiguity?
    The Earley parser: it is efficient for all context-free grammars, including ambiguous ones.
16. What does HMM stand for in the context of NLP?
    Hidden Markov Model: a statistical model used for POS tagging and speech recognition.
17. Which grammar rule type allows derivation of a sentence in a language?
    Context-free grammar (CFG): it specifies the valid syntax rules of a language.
18. Which parser works by starting from the root and expanding productions?
    A top-down parser: it begins with the start symbol and applies rules to match the input.
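A minimal recursive-descent sketch of top-down parsing, over a made-up toy grammar (note that plain recursive descent loops forever on left-recursive rules, which is one reason the Earley parser exists):

```python
# Hypothetical toy grammar:  S -> NP VP,  NP -> Det N,  VP -> V NP | V
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["sees"]],
}

def parse(symbol, tokens, pos):
    """Try to expand `symbol` at tokens[pos]; return the new position or None."""
    for production in GRAMMAR.get(symbol, []):
        p = pos
        for sym in production:
            if sym in GRAMMAR:                          # non-terminal: recurse
                p = parse(sym, tokens, p)
                if p is None:
                    break
            elif p is not None and p < len(tokens) and tokens[p] == sym:
                p += 1                                  # terminal: consume token
            else:
                p = None
                break
        if p is not None:
            return p
    return None

tokens = "the dog sees the cat".split()
print(parse("S", tokens, 0) == len(tokens))  # True: derivable from S
```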
19. What is synonymy?
    Different words with similar meanings. Example: ‘big’ and ‘large’.
20. Which traditional method is used to convert text into numeric features?
    TF-IDF (term frequency–inverse document frequency): it measures how important a word is to a document.
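A bare-bones sketch of the computation on a made-up three-document corpus, using the common tf·idf form tf = count/len(doc) and idf = log(N/df) (several variants exist):

```python
import math

docs = [["nlp", "is", "fun"],
        ["nlp", "models", "text"],
        ["text", "is", "data"]]

def tf_idf(term, doc, docs):
    tf = doc.count(term) / len(doc)              # term frequency in this doc
    df = sum(1 for d in docs if term in d)       # document frequency
    idf = math.log(len(docs) / df)               # rarer terms score higher
    return tf * idf

# 'fun' occurs in only one document, so it outweighs the common word 'is'
print(tf_idf("fun", docs[0], docs))
print(tf_idf("is", docs[0], docs))
```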
21. Which model in Word2Vec predicts the current word using surrounding words?
    CBOW (Continuous Bag of Words): it uses context words to predict a target word.
22. Which algorithm uses dictionary definitions for disambiguating word senses?
    The Lesk algorithm: it chooses a word's sense by comparing dictionary definitions with the context.
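A simplified-Lesk sketch using a hypothetical two-sense mini-dictionary (the real algorithm draws glosses from a resource such as WordNet):

```python
# Hypothetical sense inventory for illustration only
SENSES = {
    "bank": {
        "finance": "institution that accepts deposits and lends money",
        "river":   "sloping land beside a body of water",
    }
}

def simplified_lesk(word, context):
    """Pick the sense whose gloss shares the most words with the context."""
    ctx = set(context)
    best, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(ctx & set(gloss.split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(simplified_lesk("bank", "he sat on the land beside the water".split()))
# river
```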
23. Which relation describes an ‘is-a’ hierarchy among words?
    Hyponymy: a ‘rose’ is a hyponym of ‘flower’.
24. What is the purpose of anaphora resolution in NLP?
    To link pronouns to the correct nouns (antecedents). Example: in “John loves his dog.”, ‘his’ refers to ‘John’.
25. Which algorithm resolves pronominal anaphora by searching syntactic paths?
    The Hobbs algorithm: it traverses parse trees to find the antecedents of pronouns.
26. What is meant by cohesion in discourse?
    Cohesion refers to how the parts of a text are connected, for example through pronouns and conjunctions.
27. What is co-reference resolution?
    Identifying all expressions that refer to the same real-world entity. Example: “John” and “he”.
28. Which transformer-based model is bidirectional and used for language understanding?
    BERT: it processes input in both directions for better context understanding.
29. Which NLP task involves translating text from one language to another?
    Machine translation. Example: English → Hindi translation.
30. Which model architecture does GPT-3 use?
    An autoregressive transformer: it predicts the next word based on the previous ones.