
Synthesis of Digital Systems

Part 4: FSM Encoding

Instructor: Preeti Ranjan Panda


Department of Computer Science and Engineering
Indian Institute of Technology Delhi
HLS Output – FSM + Datapath
[Figure: block diagram of HLS output — an FSM (states S1, S2, S3) drives
control signals (mux selects) to a datapath built from multiplexers and adders
operating on values a, b, c, d, e, f, g; status signals from the datapath feed
back into the FSM.]


FSM Encoding / State Encoding / State Assignment

• Assign codes to symbolic states in an FSM
  – no 2 states should have the same code

[Figure: a 3-state FSM (S1, S2, S3) with transitions labelled input/output
(10/00, 0-,-1/01, 11/11, 10/01, 0-/01, 1-/00); state encoding maps S1 → 00,
S2 → 01, S3 → 10, giving the same diagram with binary codes in place of the
symbolic state names.]


FSM and State Table

• One row in State Table for every FSM Transition

[Figure: the 3-state FSM (S1, S2, S3) drawn next to its state table.]

  Inputs   Present State   Next State   Outputs
  1 0      S1              S2           0 0
  0 -      S1              S1           0 1
  - 1      S1              S1           0 1
  1 -      S2              S2           0 0
  0 -      S2              S3           0 1
  1 1      S3              S2           1 1
  0 -      S3              S3           0 1
  1 0      S3              S1           0 1
Why Encode an FSM?

• So we can implement it
• #bits in encoding → DFFs in State Register
• Truth Table → Next State and Output Logic

[Figure: implementation structure — Inputs and Present State Lines feed the
Next State and Output Logic block, which produces the Outputs and the Next
State Lines driving the State Register.]

  State Table               Truth Table
  I/P  PS  NS  O/P          I/P  PS  NS  O/P
  1 0  S1  S2  0 0          1 0  00  01  0 0
  0 -  S1  S1  0 1          0 -  00  00  0 1
  - 1  S1  S1  0 1          - 1  00  00  0 1
  1 -  S2  S2  0 0          1 -  01  01  0 0
  0 -  S2  S3  0 1          0 -  01  10  0 1
  1 1  S3  S2  1 1          1 1  10  01  1 1
  0 -  S3  S3  0 1          0 -  10  10  0 1
  1 0  S3  S1  0 1          1 0  10  00  0 1
Encoding an FSM

• Convert State Table to Truth Table (sketched below)
• Truth table can then be optimised using standard logic minimisation techniques
• Problem: Find encoding that minimises area of final circuit

  State Table                 Encoding        Truth Table
  inp  PS   NS   outp         st1: 00         inp  PS  NS  outp
  00   st1  st2  101          st2: 01         00   00  01  101
  01   st1  st3  --1          st3: 10         01   00  10  --1
  1-   st1  st1  111                          1-   00  00  111
  -1   st2  st3  0--                          -1   01  10  0--
  -0   st2  st1  000                          -0   01  00  000
  --   st3  st1  11-                          --   10  00  11-
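A minimal sketch of this conversion step, using the example above; the tuple-based row format and function name are assumptions, not part of the slides:

```python
# Apply a chosen state encoding to a symbolic state table, producing the
# truth table. Row format and names are illustrative assumptions.

state_table = [
    # (inputs, present_state, next_state, outputs)
    ("00", "st1", "st2", "101"),
    ("01", "st1", "st3", "--1"),
    ("1-", "st1", "st1", "111"),
    ("-1", "st2", "st3", "0--"),
    ("-0", "st2", "st1", "000"),
    ("--", "st3", "st1", "11-"),
]
encoding = {"st1": "00", "st2": "01", "st3": "10"}

def encode_state_table(rows, enc):
    """Replace symbolic present/next states with their binary codes."""
    return [(inp, enc[ps], enc[ns], outp) for inp, ps, ns, outp in rows]

for row in encode_state_table(state_table, encoding):
    print(" ".join(row))   # e.g. "00 00 01 101"
```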
FSM Implementation Target: PLA

[Figure: PLA structure — Inputs and the present-state outputs of the State
Register (flip-flops) feed the AND-plane; the OR-plane produces the Outputs
and the next-state inputs of the State Register.]
FSM Implementation Target: Multi-level Logic

[Figure: multi-level logic structure — Inputs and the State Register
(flip-flops) feed a network of gates producing the Outputs and the next-state
inputs of the State Register.]


FSM Encoding: What to Optimise?

• PLA implementation
  – #rows × #cols in PLA
• Multi-level logic implementation
  – Number of gates
    • number of literals (ac’ + bc + a’b → 6 literals — see the sketch below)
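As a tiny illustration of this metric (not from the slides, and assuming single-letter variable names with ' for complement), the literal count of a sum-of-products expression can be computed as:

```python
# Count literals in a sum-of-products expression given as a string.
# Assumes single-letter variables, "'" for complement, "+" between terms.
expr = "ac' + bc + a'b"
literals = sum(len(term.strip().replace("'", "")) for term in expr.split("+"))
print(literals)   # 6, as in the example above
```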


State Assignment Options

• Use minimum number of bits (⌈log2 n⌉)
  – s0 (00), s1 (01), s2 (10), s3 (11)
  – minimum number of flip-flops
• One-hot encoding
  – s0 (0001), s1 (0010), s2 (0100), s3 (1000)
  – more flip-flops, but may simplify logic
    • no need for decoding, possibly faster
• Any number of bits between the minimum (⌈log2 n⌉) and n
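A small sketch (function names are illustrative, not from the slides) contrasting the two extremes for n = 4 states:

```python
# Minimum-bit binary codes vs. one-hot codes for n states.
from math import ceil, log2

def binary_codes(n):
    """Minimum-width encoding: ceil(log2 n) flip-flops."""
    width = max(1, ceil(log2(n)))
    return [format(i, f"0{width}b") for i in range(n)]

def one_hot_codes(n):
    """One flip-flop per state: exactly one bit set in each code."""
    return [format(1 << i, f"0{n}b") for i in range(n)]

print(binary_codes(4))   # ['00', '01', '10', '11']
print(one_hot_codes(4))  # ['0001', '0010', '0100', '1000']
```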
Brute-force Search

• Generate all possible encodings
• Synthesise each case
• Choose the best

How many encodings for an n-state FSM?
  Let 2^(m−1) < n ≤ 2^m
  Assume m bits in the state register (use the minimum #bits)
  # encodings = 2^m × (2^m − 1) × (2^m − 2) × ... × (2^m − n + 1)
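A quick sketch (not from the slides) of this count, showing why exhaustive search is infeasible:

```python
# Number of distinct encodings with m = ceil(log2 n) state bits:
# 2^m * (2^m - 1) * ... * (2^m - n + 1).
from math import ceil, log2

def num_encodings(n):
    m = max(1, ceil(log2(n)))
    count = 1
    for i in range(n):
        count *= (1 << m) - i
    return count

for n in (4, 8, 16, 32):
    print(n, num_encodings(n))
# 24, 40320, ~2.1e13, ~2.6e35 -- far too many to synthesise one by one.
```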


FSM Encoding for Multi-level Logic
• Given a State Table
• Generate Truth Table
• Minimise expected gate count in implementation
– without actually implementing
– predict the effect of logic optimisation



Typical Multi-level Logic Optimisations

• Factoring:
    ab + bc + d  →  b (a + c) + d

• Common Sub-expression (with s = a + b):
    ace + bce + de  →  sce + de
    ade + bde + af  →  sde + af

• Common Cube (cube = product term; with t = bef, u = ae):
    ace + bcef + de  →  tc + uc + de
    ade + bdef + af  →  td + ud + af

Objective: Maximise the number of available common cubes
(extracted later by logic optimisation)
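A minimal sketch of cube intersection, assuming a cube (product term) is represented as a set of literals; complemented literals would need distinct symbols such as "a'":

```python
# Cube = product term, modelled here as a frozenset of literals.
# The common cube of two cubes is the intersection of their literal sets.

def common_cube(cube1, cube2):
    return cube1 & cube2

ace, ade = frozenset("ace"), frozenset("ade")
bcef, bdef = frozenset("bcef"), frozenset("bdef")

print("".join(sorted(common_cube(ace, ade))))    # ae  -> the cube u above
print("".join(sorted(common_cube(bcef, bdef))))  # bef -> the cube t above
```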


Common Next State

Let state bits = “abcde”
Encoding: S0 – 10011, S1 – 11100, S2 – 10101

[Figure: fragment of the state diagram — S0 reaches S1 on 01/- and S2 reaches
S1 on 00/1: two transitions sharing a common next state.]

State Table
  I(xy)  PS  NS  O(z)
  -0     S0  S0  0
  11     S0  S0  0
  01     S0  S1  -
  0-     S1  S1  1
  11     S1  S0  0
  10     S1  S2  1
  1-     S2  S2  1
  00     S2  S1  1
  01     S2  S3  1
  0-     S3  S3  1
  11     S3  S2  1

Truth Table (rows for the two transitions into S1)
  xy  abcde  ABCDE  z
  ...
  01  10011  11100  -
  00  10101  11100  1
  ...
Logic Equations after State Assignment

[Figure: implementation structure — the State Register holds present-state
bits abcde; combinational logic computes next-state bits ABCDE and output z
from inputs xy and abcde.]

State Table rows: 01 S0 S1 -   and   00 S2 S1 1

Truth Table (excerpt)
  xy  abcde  ABCDE  z
  ...
  01  10011  11100  -
  00  10101  11100  1
  ...

Resulting logic equations:
  A = x’yab’c’de + x’y’ab’cd’e + ...
  B = x’yab’c’de + x’y’ab’cd’e + ...
  C = x’yab’c’de + x’y’ab’cd’e + ...
  D = ...
  E = ...
  z = x’y’ab’cd’e + ...


Identifying Common Cubes

  A = x’yab’c’de + x’y’ab’cd’e + ...
  B = x’yab’c’de + x’y’ab’cd’e + ...
  C = x’yab’c’de + x’y’ab’cd’e + ...
  D = ...
  E = ...
  z = x’y’ab’cd’e + ...

Extracting the common cube s = ab’e:
  A = x’yc’ds + x’y’cd’s + ...
  B = x’yc’ds + x’y’cd’s + ...
  C = x’yc’ds + x’y’cd’s + ...
  D = ...
  E = ...
  z = x’y’ab’cd’e + ...

Encoding (N bits): S0 – 10011, S1 – 11100, S2 – 10101
[Figure: S0 and S2 both transition to S1 (on 01/- and 00/1 respectively).]

Hamming Distance H(S0, S2) = 2
Lines for A, B, C have a common cube of size N − H = 5 − 2 = 3
Similar observation for O/Ps
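A small sketch (helper names assumed) confirming the arithmetic for these two codes — Hamming distance 2, hence a common present-state cube of size 5 − 2 = 3, namely ab'e:

```python
# Relate code distance to common-cube size for the encodings above.
BITS = "abcde"

def hamming(code1, code2):
    return sum(b1 != b2 for b1, b2 in zip(code1, code2))

def common_state_cube(code1, code2):
    """Literals on which the two present-state codes agree."""
    lits = []
    for name, b1, b2 in zip(BITS, code1, code2):
        if b1 == b2:
            lits.append(name if b1 == "1" else name + "'")
    return "".join(lits)

S0, S2 = "10011", "10101"
H = hamming(S0, S2)
print(H, len(S0) - H, common_state_cube(S0, S2))   # 2 3 ab'e
```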


Common Present State

Let state bits = “abcde”
Encoding: S0 – 10011, S1 – 11100, S2 – 10101

[Figure: fragment of the state diagram — S1 goes to S0 on 11/0 and to S2 on
10/1: two transitions sharing a common present state.]

State Table
  I(xy)  PS  NS  O(z)
  -0     S0  S0  0
  11     S0  S0  0
  01     S0  S1  -
  0-     S1  S1  1
  11     S1  S0  0
  10     S1  S2  1
  1-     S2  S2  1
  00     S2  S1  1
  01     S2  S3  1
  0-     S3  S3  1
  11     S3  S2  1

Truth Table (rows for the two transitions out of S1)
  xy  abcde  ABCDE  z
  ...
  11  11100  10011  0
  10  11100  10101  1
  ...
Common Cubes after State Assignment

[Figure: S1 goes to S0 on 11/0 and to S2 on 10/1 — two transitions sharing a
common present state.]

Truth Table (excerpt)
  xy  abcde  ABCDE  z
  ...
  11  11100  10011  0
  10  11100  10101  1
  ...

  A = xyabcd’e’ + xy’abcd’e’ + ...
  B = ...
  C = xy’abcd’e’ + ...
  D = xyabcd’e’ + ...
  E = xyabcd’e’ + xy’abcd’e’ + ...
  z = xy’abcd’e’ + ...

Extracting the common cube s = abcd’e’:
  A = xys + xy’s + ...
  B = ...
  C = xy’s + ...
  D = xys + ...
  E = xys + xy’s + ...
  z = xy’s + ...

Encoding (N bits): S0 – 10011, S1 – 11100, S2 – 10101
Hamming Distance (S0, S2) = H
Up to N − H = 3 Next State Lines (A, E in this example) have a common cube of
size N = 5 (irrespective of the Encoding for S1)
Similar observation for I/Ps
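A small sketch (variable names assumed) checking which next-state lines receive both product terms in the example above — exactly the bit positions that are 1 in both destination codes:

```python
# For two transitions out of the same present state S1, the next-state lines
# that get both product terms are the positions set to 1 in both destination
# codes (S0 = 10011 and S2 = 10101 here).
LINES = "ABCDE"
S0, S2 = "10011", "10101"
shared_lines = [LINES[i] for i, (p, q) in enumerate(zip(S0, S2)) if p == q == "1"]
print(shared_lines)   # ['A', 'E'] -- matching "A, E in this example"
```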
Strategy

• For an FSM, we can identify many such relationships
  – distance of codes vs. size/number of common cubes
• Estimate reduction in literals (or “gain”) if two states are assigned close codes
• Assign codes that maximise the number/size of common cubes
• Assume that logic optimiser will find and exploit the common cubes


Formulation

• Build Undirected Graph
  – Nodes: states
  – Edge Weight: gain if states are assigned close codes
• Assign an encoding that maximises total gain


Building the Graph: a Fanout-oriented Algorithm

• PS’s with the same NS and O/P get high weights
  – leads to close codes eventually
  – maximise size of common cubes

  ab  PS  NS  t
  01  s0  s1  0        edge (s0, s2) gets a high weight,
  00  s2  s1  1        so s0 and s2 get close codes, e.g. s0: 0000, s2: 0001


Fanout: Constructing Output Sets

• If 2 PS’s assert the same o/p, a common cube can be extracted
• Construct No different o/p sets (one per output)
• Assign weight = #occurrences of the PS for the same o/p

  I(xy)  PS  NS  O(z)
  -0     S0  S0  0
  11     S0  S0  0
  01     S0  S1  -
  0-     S1  S1  1
  11     S1  S0  0
  10     S1  S2  1
  1-     S2  S2  1
  00     S2  S1  1
  01     S2  S3  1
  0-     S3  S3  1
  11     S3  S2  1

  OUTPUT_SET_z = {S1 (2), S2 (3), S3 (2)}
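A minimal sketch (row format assumed) of building the output set for z by counting, per present state, the transitions that assert z = 1:

```python
# Build OUTPUT_SET_z from the state table: for each present state, count the
# transitions whose output z is 1. Row format is an illustrative assumption.
from collections import Counter

rows = [  # (inputs_xy, present_state, next_state, output_z)
    ("-0", "S0", "S0", "0"), ("11", "S0", "S0", "0"), ("01", "S0", "S1", "-"),
    ("0-", "S1", "S1", "1"), ("11", "S1", "S0", "0"), ("10", "S1", "S2", "1"),
    ("1-", "S2", "S2", "1"), ("00", "S2", "S1", "1"), ("01", "S2", "S3", "1"),
    ("0-", "S3", "S3", "1"), ("11", "S3", "S2", "1"),
]
output_set_z = Counter(ps for _, ps, _, z in rows if z == "1")
print(dict(output_set_z))   # {'S1': 2, 'S2': 3, 'S3': 2}
```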


Fanout: Constructing NS Sets

• If 2 PS’s produce the same NS, a common cube can be extracted
  – intersection of the 2 PS encodings
• How many common cubes?
  – # 1’s in the Next State encoding
  – ≅ Nb / 2
• Construct Ns different NS sets (one per next state)
• Assign weight = #occurrences of the PS for the same NS

  I(xy)  PS  NS  O(z)
  -0     S0  S0  0
  11     S0  S0  0
  01     S0  S1  -
  0-     S1  S1  1
  11     S1  S0  0
  10     S1  S2  1
  1-     S2  S2  1
  00     S2  S1  1
  01     S2  S3  1
  0-     S3  S3  1
  11     S3  S2  1

  NS_SET_S0 = {S0 (2), S1 (1)}
  NS_SET_S1 = {S0 (1), S1 (1), S2 (1)}
  NS_SET_S2 = {S1 (1), S2 (1), S3 (1)}
  NS_SET_S3 = {S2 (1), S3 (1)}
Fanout: Constructing the Graph

• Node = state
• Edge Weight (a–b)
  – multiply node weights in each OUTPUT_SET
    • if a, b are in the same set
    • each pair of transitions has a common cube
  – multiply node weights in each NS_SET
    • if a, b are in the same set
    • scale by Nb / 2

  OUTPUT_SET_z = {S1 (2), S2 (3), S3 (2)}
  NS_SET_S0 = {S0 (2), S1 (1)}
  NS_SET_S1 = {S0 (1), S1 (1), S2 (1)}
  NS_SET_S2 = {S1 (1), S2 (1), S3 (1)}
  NS_SET_S3 = {S2 (1), S3 (1)}

For Nb = 2:
  Edge Weight (S2–S3) = 3×2 + (1×1 + 1×1) × 2/2 = 8 (computed in the sketch below)

[Figure: the resulting weighted graph on S0–S3, with edge weights
S0–S1 = 3, S0–S2 = 1, S0–S3 = 0, S1–S2 = 8, S1–S3 = 5, S2–S3 = 8.]
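A sketch of this edge-weight rule; the dictionary representation of the sets and the function name are assumptions, not from the slides:

```python
# Fanout-oriented edge weight: product of per-state weights in every shared
# OUTPUT_SET, plus product of per-state weights in every shared NS_SET
# scaled by Nb / 2.

OUTPUT_SETS = {
    "z": {"S1": 2, "S2": 3, "S3": 2},
}
NS_SETS = {
    "S0": {"S0": 2, "S1": 1},
    "S1": {"S0": 1, "S1": 1, "S2": 1},
    "S2": {"S1": 1, "S2": 1, "S3": 1},
    "S3": {"S2": 1, "S3": 1},
}
NB = 2  # number of state bits

def edge_weight(a, b):
    w = 0
    for s in OUTPUT_SETS.values():      # common-output contribution
        if a in s and b in s:
            w += s[a] * s[b]
    for s in NS_SETS.values():          # common-next-state contribution
        if a in s and b in s:
            w += s[a] * s[b] * NB / 2
    return w

print(edge_weight("S2", "S3"))   # 3*2 + (1*1 + 1*1) * 2/2 = 8.0
print(edge_weight("S0", "S1"))   # (2*1 + 1*1) * 2/2 = 3.0
```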
Graph Embedding

• Start from the Graph we constructed
  – fanin
  – fanout
• Assign codes to minimise Cost

  Cost = Σ over edges (a, b) of wt(a, b) × HD(a, b),  HD = Hamming Distance

[Figure: weighted graph on S0–S3 with edge weights
S0–S1 = 3, S0–S2 = 1, S0–S3 = 0, S1–S2 = 8, S1–S3 = 5, S2–S3 = 2.]

  Encoding 1:  S0: 00, S1: 01, S2: 10, S3: 11
    Cost = 3×1 + 5×1 + 2×1 + 1×1 + 0×2 + 8×2 = 27
  Encoding 2:  S0: 00, S1: 11, S2: 01, S3: 10
    Cost = 3×2 + 5×1 + 2×2 + 1×1 + 0×1 + 8×1 = 24
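The cost comparison can be reproduced mechanically; a short sketch using the edge weights of the graph above:

```python
# Cost of an encoding: sum over edges of weight times Hamming distance.
EDGES = {
    ("S0", "S1"): 3, ("S0", "S2"): 1, ("S0", "S3"): 0,
    ("S1", "S2"): 8, ("S1", "S3"): 5, ("S2", "S3"): 2,
}

def hamming(c1, c2):
    return sum(a != b for a, b in zip(c1, c2))

def cost(encoding):
    return sum(w * hamming(encoding[a], encoding[b])
               for (a, b), w in EDGES.items())

enc1 = {"S0": "00", "S1": "01", "S2": "10", "S3": "11"}
enc2 = {"S0": "00", "S1": "11", "S2": "01", "S3": "10"}
print(cost(enc1), cost(enc2))   # 27 24
```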


Solution to Graph Embedding Problem

• Well studied problem
• Practical observation
  – for large FSMs, clusters of nodes with high edge weights exist
• Heuristic – in what order to choose nodes for assignment?
  – identify heavy clusters and assign codes to their states
  – remove these states from consideration
  – proceed with remaining graph
Embedding Algorithm

1. Select node n with the maximum sum of its Nb incident edge weights
2. Assign some code to n and minimally distant codes to the Nb adjacent nodes
   – allows all Nb nodes to be at distance 1 from n
3. Remove node n (and all incident edges) from the graph
4. Repeat (Step 1) with the smaller graph

Example (Nb = 3):
[Figure: weighted graph on nodes a, b, c, d, e; b’s three heaviest incident
edges (weights 4, 6, 2) give it the maximum sum.]

  Select b (max sum: 4 + 6 + 2)
  b – 000
  a – 001 (distance from b = 1)
  d – 010 (distance from b = 1)
  e – 100 (distance from b = 1)
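A minimal sketch of this greedy procedure; tie-breaking and the exact choice of "minimally distant" codes are simplifications of my own, not specified on the slides:

```python
# Greedy embedding heuristic following the four steps above.
from itertools import product

def greedy_embed(nodes, edges, nbits):
    """Greedy code assignment; edges maps frozenset({a, b}) -> weight."""
    all_codes = ["".join(bits) for bits in product("01", repeat=nbits)]
    hd = lambda x, y: sum(p != q for p, q in zip(x, y))

    def top_sum(n, graph_nodes):
        # Sum of the nbits heaviest edges from n within the current graph.
        ws = sorted((edges.get(frozenset((n, m)), 0)
                     for m in graph_nodes if m != n), reverse=True)
        return sum(ws[:nbits])

    assignment, graph_nodes = {}, set(nodes)
    while graph_nodes and len(assignment) < len(nodes):
        # Step 1: node with the maximum sum of its heaviest incident edges.
        n = max(graph_nodes, key=lambda x: top_sum(x, graph_nodes))
        if n not in assignment:                      # Step 2: code for n ...
            assignment[n] = next(c for c in all_codes
                                 if c not in assignment.values())
        # ... and minimally distant free codes for its heaviest neighbours.
        neighbours = sorted((m for m in graph_nodes if m != n),
                            key=lambda m: -edges.get(frozenset((n, m)), 0))[:nbits]
        for m in neighbours:
            if m not in assignment:
                assignment[m] = min((c for c in all_codes
                                     if c not in assignment.values()),
                                    key=lambda c: hd(c, assignment[n]))
        graph_nodes.discard(n)                       # Step 3: remove n; Step 4: repeat
    return assignment

# Hypothetical usage on the 4-state graph from the earlier slides (Nb = 2):
edges = {frozenset(p): w for p, w in [
    (("S0", "S1"), 3), (("S0", "S2"), 1), (("S0", "S3"), 0),
    (("S1", "S2"), 8), (("S1", "S3"), 5), (("S2", "S3"), 2),
]}
print(greedy_embed(["S0", "S1", "S2", "S3"], edges, 2))
# -> {'S1': '00', 'S2': '01', 'S3': '10', 'S0': '11'}, which costs 24 -- the
#    same cost as the better of the two encodings on the Graph Embedding slide.
```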


Embedding Algorithm - Example

[Figure: the example graph before and after the first step — on the left, the
full graph on a, b, c, d, e with b selected; on the right, the remaining graph
on a, c, d, e with the codes assigned so far (a – 001, d – 010, e – 100).]

  Select b (max sum: 4 + 6 + 2)        Select a (max sum: 3 + 2 + 4)
  b – 000                              c – 011 (distance from a = 1)
  a – 001
  d – 010
  e – 100
