1. Explain Single-Pass and Multi-Pass Compilers?
Ans- 🔹 Single Pass Compiler
A single pass compiler processes the source code
only once, going through it from beginning to end
in a single scan.
✅ Characteristics:
• Translates the source code in one pass.
• Faster than multi-pass compilers.
• Requires less memory.
• Suitable for simple and small languages.
• Cannot handle forward references easily (like
using a variable before declaring it).
📌 Example:
Early versions of Pascal compilers, or simple C
compilers.
🧠 Use Case:
• Embedded systems
• Simple assembly languages
Fig- Single Pass compiler
🔹 Multi-pass Compiler
A multi-pass compiler processes the source code
in multiple passes — each pass performs a specific
task such as syntax analysis, semantic analysis,
optimization, etc.
✅ Characteristics:
• Goes through the program multiple times.
• Can optimize code more effectively.
• Can handle complex programming
languages.
• Requires more memory and is slower than
single pass.
📌 Example:
Modern C++, Java, or FORTRAN compilers.
🧠 Use Case:
• General-purpose high-level languages
• Large-scale software systems
2. What is Lex? Explain with a suitable example.
Ans- Lex is a lexical analyzer generator tool used
in compiler design to create a lexical analyzer
(scanner).
~ The lexical analyzer is the first phase of a
compiler, which scans the source code and breaks
it into a sequence of tokens such as keywords,
identifiers, numbers, operators, etc.
~ Lex uses regular expressions to define the
patterns of tokens and generates C code that can
recognize these tokens in an input stream.
🔍 Purpose of Lex:
• To identify the basic elements (tokens) of a
programming language.
• To simplify the process of writing a lexical
analyzer.
• Works closely with YACC (parser generator),
which handles syntax analysis.
📌 Structure of a Lex Program:
A Lex program has three sections.
I. Definition Section
II. Rules Section (Token Patterns)
III. User Code (main function)
✅ Lex Program Example:
Output- Number: 123 and Number: 45 (for an input containing the numbers 123 and 45)
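The program itself was not reproduced above; a minimal Lex program consistent with that output (the exact patterns and actions are assumptions) could look like this:

```lex
%{
#include <stdio.h>
%}
%option noyywrap
%%
[0-9]+    { printf("Number: %s\n", yytext); /* digits form a Number token */ }
.|\n      { /* ignore all other characters */ }
%%
int main(void) {
    yylex();      /* scan standard input until end of file */
    return 0;
}
```

Running it on an input such as x = 123 + 45 would print the two Number lines shown above.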
3. Explain Compiler Writing Tools?
Ans-Compiler writing tools are software tools or
programs that help in building a compiler easily
and efficiently. They reduce the manual work of
writing each phase of a compiler by automatically
generating parts like lexical analyzer, parser, etc.
🔧 Main Compiler Writing Tools:
1. Lexical Analyzer Generator: used to create the scanner (tokenizer). Example: Lex
2. Parser Generator: used to create the parser (syntax checker). Example: YACC
3. Syntax-Directed Translation Engines: handle semantic rules during parsing. Example: built into YACC
4. Code Generators: convert intermediate code to target code. Example: GCC backend
5. Data Flow Engines: used for code optimization using data flow analysis. Example: LLVM, GCC
6. Debugger / Profiler: helps in testing, debugging, and performance tuning. Example: GDB, Valgrind
📌 Popular Examples:
🔹 Lex:
Generates lexical analyzer (scanner).
Recognizes tokens like keywords, identifiers,
numbers.
🔹 YACC (Yet Another Compiler Compiler):
Generates syntax parser from grammar rules.
Works with Lex to build full front-end of a
compiler.
4. Explain Role of Lexical Analyzer?
Ans- The Lexical Analyzer is the first phase of a
compiler.
Its main job is to read the source program and
convert it into a stream of tokens, which are then
passed to the syntax analyzer. It acts as a bridge
between the source code and the parser.
🔍 Main Roles of Lexical Analyzer:
I. Token Generation:
It identifies valid tokens like keywords,
identifiers, operators, constants, and
symbols.
II. Removes Whitespaces and Comments:
It ignores blank spaces, tabs, and comments.
III. Error Reporting:
Detects invalid symbols or illegal characters.
IV. Provides Input to Parser:
Sends the stream of tokens to the syntax analyzer.
V. Symbol Table Handling:
Collects and stores identifiers in the symbol table.
📌 Example:
Source Code:
int x = 10;
Lexical Analyzer Output (Tokens):
Keyword → int
Identifier → x
Operator → =
Number → 10
Symbol → ;
Each word or symbol is converted into a token
and passed to the parser.
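As an illustrative sketch (not the compiler's actual implementation), the same tokenization can be mimicked with regular expressions; the token classes and patterns here are simplified assumptions:

```python
import re

# Simplified token patterns; order matters ("int" must match as a
# Keyword before the Identifier pattern can claim it).
TOKEN_SPEC = [
    ("Keyword",    r"\bint\b"),
    ("Identifier", r"[A-Za-z_]\w*"),
    ("Number",     r"\d+"),
    ("Operator",   r"="),
    ("Symbol",     r";"),
    ("Skip",       r"\s+"),
]
PATTERN = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    # Return (token type, lexeme) pairs, dropping whitespace.
    return [(m.lastgroup, m.group()) for m in PATTERN.finditer(source)
            if m.lastgroup != "Skip"]
```

Calling tokenize("int x = 10;") yields exactly the five tokens listed above.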
A compiler is a program that translates high-level
source code into machine code.
5. Discuss different phases of compiler?
Ans- 🔻 Phases of a Compiler (in order):
1. Lexical Analysis (Scanner)
• Breaks source code into tokens
• Removes whitespace and comments
🔹 Example:
int a = 5; → Tokens: int, a, =, 5, ;
2. Syntax Analysis (Parser)
• Checks the structure of code using grammar
rules
• Builds a parse tree
🔹 Example:
Checks whether int a = 5; follows the grammar rules for a declaration statement.
3. Semantic Analysis
• Checks meaning of code
• Ensures correct data types, variable usage,
etc.
🔹 Example:
Reports an error for a = "hello"; if a was declared as int (type mismatch).
4. Intermediate Code Generation
• Converts the code into a middle-level
language (independent of machine)
• Makes optimization easier
🔹 Example:
a = b + c; → t1 = b + c, a = t1
5. Code Optimization
• Improves the code to run faster or use less
memory
🔹 Example:
Removes unused variables or simplifies
expressions
6. Code Generation
• Converts intermediate code into machine code
or assembly
7. Symbol Table Management
• Stores information about identifiers
(variables, functions, etc.)
8. Error Handling
• Detects and reports errors at each phase
(syntax, semantic, etc.)
Fig – Different Phases of Compiler
6. What do you mean by regular expression and
Finite automata.
Ans- 🔹 Regular Expression (RE)
A Regular Expression is a sequence of characters
that defines a pattern for matching strings.
In compiler design, regular expressions are used to
define the lexical rules for tokens such as
identifiers, numbers, keywords, etc.
🔹 Finite Automata (FA)
A Finite Automaton is a mathematical model
used to recognize patterns defined by regular
expressions.
It reads an input string one symbol at a time and
moves between states to decide if the string is
accepted or rejected.
It is used in lexical analysis to identify tokens by
implementing regular expressions.
🔹 Types of Finite Automata:
• DFA (Deterministic Finite Automaton): exactly one transition from a state for each input symbol.
• NFA (Non-deterministic Finite Automaton): can have multiple transitions from a state for the same input symbol.
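To connect the two ideas, here is a hand-coded sketch (an illustration, not a standard routine) of a DFA for the regular expression letter (letter | digit)*, which describes identifiers:

```python
def is_identifier(s):
    # State 0 is the start state; state 1 is the accepting state.
    state = 0
    for ch in s:
        if state == 0 and ch.isalpha():
            state = 1              # first symbol must be a letter
        elif state == 1 and (ch.isalpha() or ch.isdigit()):
            state = 1              # letters/digits stay in the accepting state
        else:
            return False           # no valid transition: reject (dead state)
    return state == 1              # accept only if we ended in state 1
```

Here "sum1" is accepted, while "1sum" is rejected because the DFA has no transition on a digit from the start state.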
7. Explain Recognition of Tokens.
Ans- Recognition of tokens refers to the process of
identifying valid and meaningful components
(called tokens) from the source program during
the lexical analysis phase of a compiler.
The lexical analyzer (or scanner) performs token
recognition using regular expressions and passes
these tokens to the syntax analyzer.
🔍 Common Types of Tokens:
Token Type Examples
Keyword int, if, while
Identifier x, sum, value1
Number 123, 45.67, 0
Operator +, -, *, =
Symbol ;, (, ), {, }
▶️ Example:
Source Code:
int sum = 100;
Recognized Tokens:
• int → Keyword
• sum → Identifier
• = → Operator
• 100 → Number
• ; → Symbol
8. Explain Transition Diagram?
Ans- A Transition Diagram is a graphical
representation of a finite automaton, used to
show how the system moves from one state to
another on reading input characters.
UNIT – 2
9. Difference between TOP-DOWN and BOTTOM-UP parsing?
Ans-
• Top-down parsing builds the parse tree from the root (start symbol) down to the leaves; bottom-up parsing builds it from the leaves up to the root.
• Top-down parsing traces a leftmost derivation; bottom-up parsing traces a rightmost derivation in reverse.
• Top-down parsers: recursive descent, predictive LL(1); bottom-up parsers: shift-reduce, LR, SLR, LALR.
• Top-down parsing cannot handle left-recursive grammars directly; bottom-up parsing can.
10. Explain Shift-Reduce Parser?
Ans- A Shift-Reduce Parser is a type of bottom-up parser that builds the parse tree from leaves to root using two main operations:
• Shift: push the next input symbol onto the stack.
• Reduce: replace a handle (the right side of a production) on top of the stack with its non-terminal.
It is commonly used in LR parsing techniques.
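A toy trace, under an assumed grammar E → E + E | id, can make the two operations concrete (this greedy sketch handles only this tiny grammar):

```python
def shift_reduce(tokens):
    # Record each shift/reduce action while parsing with an explicit stack.
    stack, actions, rest = [], [], list(tokens)
    while rest or stack != ["E"]:
        if stack[-1:] == ["id"]:
            stack[-1] = "E"                      # reduce E -> id
            actions.append("reduce E -> id")
        elif stack[-3:] == ["E", "+", "E"]:
            stack[-3:] = ["E"]                   # reduce E -> E + E
            actions.append("reduce E -> E + E")
        elif rest:
            stack.append(rest.pop(0))            # shift next input symbol
            actions.append("shift " + stack[-1])
        else:
            raise SyntaxError("input cannot be reduced to E")
    return actions
```

For the input id + id, it performs: shift id, reduce, shift +, shift id, reduce, reduce.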
11. What is Predictive Parsing? Explain its
components, drawbacks.
Ans- Predictive Parsing is a type of top-down
parsing technique in which the parser uses a
lookahead symbol to predict which production
rule to use without backtracking.
It works only for LL(1) grammars, where:
• The input is read Left to right (L)
• The derivation is Leftmost (L)
• It uses 1 lookahead symbol
🔧 Components of Predictive Parser:
1. Input Buffer:
2. Stack:
3. Parsing Table:
4. Parser Driver Program
❌ Drawbacks of Predictive Parsing:
1. Only works with LL(1) grammars (very simple
grammars).
2. Cannot handle left-recursive grammars.
3. Cannot parse ambiguous grammars.
4. Parsing table creation can be difficult for
large grammars.
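The components above can be sketched with a toy LL(1) grammar S → a S | b (the grammar and parsing table are assumptions chosen for illustration):

```python
# Parsing table: (non-terminal, lookahead) -> production body.
TABLE = {
    ("S", "a"): ["a", "S"],   # on lookahead 'a', expand S -> a S
    ("S", "b"): ["b"],        # on lookahead 'b', expand S -> b
}

def ll1_parse(tokens):
    stack, i = ["S"], 0                     # stack starts with the start symbol
    while stack:
        if i >= len(tokens):
            return False                    # input exhausted too early
        top = stack.pop()
        if top in ("a", "b"):               # terminal: must match the input
            if top != tokens[i]:
                return False
            i += 1
        elif (top, tokens[i]) in TABLE:     # non-terminal: consult the table
            stack.extend(reversed(TABLE[(top, tokens[i])]))
        else:
            return False                    # no table entry: syntax error
    return i == len(tokens)                 # accept only if all input consumed
```

Note there is no backtracking: the single lookahead token always selects exactly one table entry.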
12. What is a Parser Generator? Explain YACC.
Ans- Parser Generator:
A Parser Generator is a software tool that
automatically generates the syntax analyzer
(parser) of a compiler from a given context-free
grammar (CFG).
Instead of writing the parsing code manually, we
write grammar rules, and the tool creates a
working parser that checks whether input follows
the grammar.
✅ YACC (Yet Another Compiler Compiler):
🔹 Definition:
YACC is a widely used parser generator tool that
produces C code for a syntax analyzer based on
grammar rules written in a YACC-specific format.
It is usually used together with Lex, which
handles lexical analysis (token generation).
❌ Drawbacks of YACC:
• Only supports LALR(1) grammars
• Requires good understanding of C language
• Cannot directly handle ambiguous grammars
13. Explain Context-Free Grammars (CFGs).
Ans- A Context-Free Grammar (CFG) is a set of
rules or productions used to describe the syntax
of a programming language.
Each rule tells how a non-terminal symbol can be
replaced by a group of terminals and/or non-
terminals.
It is called “context-free” because the rule can be
applied regardless of the surrounding symbols
(context).
📌 Basic Elements of CFG:
1. Terminals:
Actual symbols of the language (e.g., a, +, id)
2. Non-terminals:
Variables that represent patterns or groups (e.g., E, S)
3. Productions:
Rules of the form non-terminal → string of terminals and/or non-terminals
4. Start Symbol:
The non-terminal from which derivation begins
▶️ Example:
Grammar:
E → E + E
E → E * E
E → id
Here E is a non-terminal; +, *, and id are terminals. The string id + id * id can be derived from E.
14. Explain Precedence Parsing?
Ans- Precedence Parsing is a type of bottom-up
parsing technique used for parsing expressions
based on operator precedence and associativity
(like +, *, /).
It uses precedence relations between operators
to decide when to shift or reduce during parsing.
🧠 In Simple Words:
Precedence parsing is a way to parse
mathematical or logical expressions by following
the rules of operator priority (e.g., * before +).
▶️ Example:
Input Expression:
id + id * id
Operator Precedence: * has higher precedence than +, so id * id is reduced first and the expression is parsed as id + (id * id).
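A sketch of the idea, using numbers in place of id (the token list and precedence table are illustrative assumptions):

```python
PREC = {"+": 1, "*": 2}          # '*' binds tighter than '+'

def eval_expr(tokens):
    values, ops = [], []

    def reduce_top():
        # Pop one operator and two operands, push the result (a "reduce").
        b, a, op = values.pop(), values.pop(), ops.pop()
        values.append(a + b if op == "+" else a * b)

    for tok in tokens:
        if isinstance(tok, int):
            values.append(tok)                   # shift an operand
        else:
            while ops and PREC[ops[-1]] >= PREC[tok]:
                reduce_top()                     # reduce higher-precedence ops first
            ops.append(tok)                      # shift the operator
    while ops:
        reduce_top()
    return values[0]
```

For [2, "+", 3, "*", 4] the multiplication is reduced before the addition, giving 14, not 20.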
✅ Advantages:
• Efficient for parsing arithmetic expressions
• Simple and fast for expression parsing
❌ Drawbacks:
• Works only for operators with precedence
• Not suitable for full programming language
syntax
UNIT – 3
15. Describe Boolean Expressions and Procedure
Calls with proper definition and example.
Ans- ✅ 1. Boolean Expression
🔹 Definition:
A Boolean Expression is an expression that
results in a Boolean value, i.e., either true or
false.
It is used to represent logical conditions in control
statements like if, while, for, etc.
Boolean expressions are formed using:
• Relational operators: ==, !=, >, <, >=, <=
• Logical operators: && (AND), || (OR), ! (NOT)
📌 Example:
(a > b) && (b != 0)
✅ Use in Programming:
Used in:
• if statements
• while and for loops
• Decision-making processes
✅ 2. Procedure Call
🔹 Definition:
A Procedure Call (also called Function Call) is the
process of calling a subprogram or function to
perform a specific task.
When a procedure is called, control transfers to the procedure body, the arguments are passed to it, the body executes, and control returns to the caller.
📌 Example in C:
int result = add(10, 20);
• add is called with the arguments 10 and 20
• The returned value is stored in result
✅ Use in Programming:
• Improves modularity
• Avoids code repetition
• Helps in structured programming
16. Explain Synthesized and Inherited Attributes with proper definition and example.
Ans- A Synthesized Attribute is an attribute whose value at a parse-tree node is computed from the attribute values of its children.
Example: in E → E1 + T with the rule E.val = E1.val + T.val, E.val is synthesized.
An Inherited Attribute is an attribute whose value at a node is computed from the attributes of its parent and/or siblings.
Example: in D → T L, passing the type T.type down to the identifier list L (as L.in) is an inherited attribute.
17. Explain S-attributed and L-attributed definitions.
Ans- An S-attributed definition uses only synthesized attributes; it can be evaluated bottom-up (for example, during LR parsing).
An L-attributed definition may use synthesized attributes plus inherited attributes that depend only on the parent or on siblings to the left; it can be evaluated in a single left-to-right pass.
Every S-attributed definition is also L-attributed.
18. Explain Quadruples, Triples, and Indirect Triples with proper definition and example.
Ans- These are three ways of representing three-address intermediate code, e.g. for t1 = b + c:
• Quadruples: each instruction has four fields (operator, arg1, arg2, result), e.g. (+, b, c, t1).
• Triples: three fields (operator, arg1, arg2); a result is referred to by the position of the instruction that computes it, e.g. (0): (+, b, c).
• Indirect Triples: a list of pointers to triples, so instructions can be reordered during optimization without changing the triples themselves.
19. Explain Translation of Assignment
Statements Strategies in Compiler Design?
Ans. Translation of assignment means converting
high-level assignment statements (like a = b + c)
into intermediate code that the machine or
lower-level compiler phases can understand.
The compiler uses different strategies to do this
step-by-step.
✅ Common Strategies for Assignment
Translation:
1. Using Temporary Variables
Break the expression into small parts
2. Postfix or Stack-Based Code (Polish Notation)
3. Direct Translation (Simple assignments)
🧠 In Simple Words:
Translating assignments means breaking down
big formulas into small steps using temp
variables or stack, so the compiler can handle
them.
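Strategy 1 can be sketched as follows; this uses Python's ast module purely as a convenient expression parser, and the temporary-naming scheme (t1, t2, …) is an assumption:

```python
import ast
import itertools

def three_address(expr_src, target):
    # Translate `target = expr` into three-address code with temporaries.
    temps = itertools.count(1)
    code = []

    def gen(node):
        if isinstance(node, ast.Name):
            return node.id                         # plain variable: no temp needed
        assert isinstance(node, ast.BinOp)         # sketch handles only + and *
        op = "+" if isinstance(node.op, ast.Add) else "*"
        left, right = gen(node.left), gen(node.right)
        t = f"t{next(temps)}"
        code.append(f"{t} = {left} {op} {right}")  # one small step per temporary
        return t

    code.append(f"{target} = {gen(ast.parse(expr_src, mode='eval').body)}")
    return code
```

For a = b + c * d this produces t1 = c * d, then t2 = b + t1, then a = t2.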
20. Explain control flow?
Ans. Control Flow refers to the order in which
statements or instructions are executed in a
program.
In compiler design, it helps the compiler
understand:
• Which code runs first
• What happens next
• Where the program jumps, loops, or makes
decisions
🧠 In Simple Words:
Control flow tells how the program moves from
one part to another — like following a path in a
game based on choices.
📌 Examples of Control Flow Statements:
1. Sequential Flow
2. Looping Flow (for, while)
3. Jump Flow
✅ Conclusion:
Control flow decides how and in what order
instructions are executed in a program.
It is important for the compiler to generate
correct and efficient machine code, especially for
conditions, loops, and jumps.
UNIT-4
21. What is Static and Dynamic memory allocation?
Ans. In static memory allocation, memory is allocated at compile time; its size is fixed and cannot change during execution (used for global and static variables).
In dynamic memory allocation, memory is allocated at runtime, either on the stack (activation records of function calls) or on the heap (data created with malloc/new); its size can change during execution.
22. What is an Activation Tree in Compiler
Design?
Ans- An Activation Tree is a tree structure that
represents the hierarchy and sequence of
function (or procedure) calls made during the
execution of a program.
Each node in the tree represents one activation (call) of a function.
Each edge represents a call from one function to another.
📌 Example:
✅ Activation Tree Diagram:
    A
   / \
  B   C
 /
D
• A → root node
• B and C → functions called by A
• D → function called by B
✅ Conclusion:
An Activation Tree is a tool used in compiler
design to represent the function call structure of
a program.
It helps in visualizing and managing how
procedures are called and returned, especially
during recursion and nested calls.
This is important for runtime memory
organization and stack management.
23. Explain Different parameter passing
methods?
Ans ✅ Different Parameter Passing Methods:
• Call by Value: a copy of the actual parameter is passed; changes inside the procedure do not affect the caller.
• Call by Reference: the address of the actual parameter is passed; changes inside the procedure affect the caller's variable.
• Call by Name: the argument expression is substituted into the procedure body and re-evaluated at each use.
• Call by Result: the formal parameter's final value is copied back to the actual parameter when the procedure returns.
• Call by Value-Result: a combination of call by value and call by result (copy in, copy back).
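Python always passes object references, but the difference between the first two methods can be mimicked as follows (this mapping is an illustration, not how compilers implement the mechanisms):

```python
def by_value(x):
    x = x + 1              # rebinds the local copy only; caller is unaffected

def by_reference(cell):
    cell[0] = cell[0] + 1  # mutates through the shared reference; caller sees it

a = 10
by_value(a)                # a stays 10: value-like behaviour
box = [10]
by_reference(box)          # box[0] becomes 11: reference-like behaviour
```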
24. What is an Activation Record? Explain its
different fields
Ans. An Activation Record (also called a Stack
Frame) is a block of memory created on the call
stack whenever a function or procedure is called.
It stores all the information required to:
• Execute the function
• Track the return point
• Store local data and temporary values.
✅ Fields in Activation Record (with
Explanation):
Actual Parameters
Return Address
Access Link
Local Variables
Temporary Variables
Saved Registers
25. Difference between static, stack, and heap allocation strategies?
Ans
• Static Allocation: memory is allocated at compile time; the size is fixed and cannot be changed during execution. Used for global variables, constants, and static variables.
• Stack Allocation: memory is managed using the call stack; it grows and shrinks with function calls. Used for local variables and function calls.
• Heap Allocation: memory is allocated at runtime, as needed; the size is flexible and can grow or shrink during runtime. Used for dynamic data structures (lists, trees, objects).
26. What is the use of a Symbol Table? Explain
ways to implement it and its fields.
Ans. A Symbol Table is a data structure used by a
compiler to store information about variables,
functions, objects, etc. in a program.
✅ Uses of Symbol Table:
• Stores names of variables, functions, classes
• Keeps track of data types, scope, and
memory location
• Helps in type checking, code generation, and
error detection
✅ Different Ways to Implement Symbol Table:
Linear List
Hash Table
Binary Search Tree
Trie (Prefix Tree)
✅ Fields in Symbol Table:
Name, Type, Memory Location, Size, and Parameters (for functions).
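A minimal hash-table implementation sketch (the exact field set here is an assumption based on the list above):

```python
symbol_table = {}   # name -> attributes; a hash table gives O(1) average lookup

def insert(name, dtype, size, address):
    symbol_table[name] = {"type": dtype, "size": size, "address": address}

def lookup(name):
    return symbol_table.get(name)   # None means "undeclared identifier"

insert("x", "int", 4, 1000)         # e.g. recorded when `int x;` is scanned
```

A failed lookup is how the compiler detects the use of an undeclared variable.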
UNIT – 5
27. What is loop optimization? Explain its techniques.
Ans. Loop Optimization is a compiler technique
used to increase the efficiency of loops in a
program.
Since loops often run many times, optimizing
them helps improve execution speed, reduce
memory usage, and lower CPU workload.
Loop optimization is performed during the code
optimization phase of the compiler.
✅ Main Objectives:
• Reduce unnecessary computations
• Minimize loop overhead
• Use registers and memory efficiently
• Improve CPU performance
✅ Loop Optimization Techniques:
1. Loop Invariant Code Motion
2. Loop Unrolling
3. Loop Fusion (Combining)
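Technique 1 (loop-invariant code motion) can be shown with a before/after pair (the function names are illustrative):

```python
def before(items, x, y):
    out = []
    for it in items:
        out.append(it + x * y)   # x * y is recomputed on every iteration
    return out

def after(items, x, y):
    t = x * y                    # invariant computation hoisted out of the loop
    out = []
    for it in items:
        out.append(it + t)       # the loop body now does less work
    return out
```

Both functions produce the same result; the second computes x * y once instead of once per iteration.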
28. Explain in brief the issues in Design of Code
Generation in Compiler Design.
Ans. Code generation is the final phase of a
compiler where intermediate code is converted
into machine code or assembly code.
While generating this code, the compiler faces
some important design issues.
✅ Main Issues in Code Generation (in simple words):
1. Input to the Code Generator
2. Target Language
3. Instruction Selection
4. Register Allocation
5. Optimization Level
29. Explain Global data flow analysis?
Ans Global Data Flow Analysis is a process used
in the optimization phase of a compiler to collect
information about how variables are defined,
used, and modified across an entire program or
function, not just in a single basic block.
✅ Uses of Global Data Flow Analysis:
• Eliminate unnecessary code (dead code)
• Optimize memory and CPU usage
• Improve overall program performance
• Identify where variables get their values
(definitions)
• Track where those values are used
• Determine if values are live or dead
• Support various code optimizations
✅ Example:
Consider the following code:
1: int a = 5;
2: int b = a + 3;
The definition of a in line 1 reaches line 2, where a is used. Global data flow analysis records such definition-use information across all basic blocks.
30. Explain Register Allocation?
Ans- Register Allocation is the process in compiler
design where the compiler decides which
variables or temporary values should be stored in
CPU registers during program execution.
Since registers are limited and much faster than
memory, the goal is to store the most frequently
used values in registers to make the program run
faster.
✅ Techniques Used in Register Allocation:
Local Allocation
Global Allocation
✅ Example:
Consider this simple code:
int a = 5;
int b = 10;
int c = a + b;
If a and b are kept in registers, the addition can be performed without any memory access.
31. Explain Peephole optimization?
Ans. Peephole Optimization is a local
optimization technique used in the final phase of
a compiler.
It examines a small window (peephole) of
generated assembly or machine code instructions
and tries to replace inefficient patterns with
better, faster, or shorter alternatives without
changing the program's output.
✅ Main Purpose:
• Improve the quality of the generated code
• Remove unnecessary instructions
• Make code run faster and take less space
✅ Common Types of Peephole Optimizations:
Redundant Load/Store
Algebraic Simplification
Strength Reduction
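The last two patterns can be sketched over a list of hypothetical three-address instruction strings:

```python
def peephole(code):
    out = []
    for ins in code:
        dst, expr = ins.split(" = ", 1)
        if expr in (f"{dst} + 0", f"{dst} * 1"):
            continue                              # algebraic identity: delete it
        if expr.endswith(" * 2"):
            v = expr[: -len(" * 2")]
            out.append(f"{dst} = {v} + {v}")      # strength reduction: * -> +
        else:
            out.append(ins)
    return out
```

Here a = b * 2 becomes the cheaper a = b + b, and the useless c = c + 0 is removed, without changing the program's output.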