
AVL Trees

Course Instructor: Maheen Zulfiqar


Introduction
• The AVL Tree is a type of Binary Search Tree named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who introduced it in 1962.
• AVL Trees are self-balancing, which means that the tree height is kept to a minimum so that a very fast runtime is guaranteed for searching, inserting and deleting nodes, with time complexity O(log n).
• The only difference between a regular Binary Search Tree and an AVL Tree is that AVL Trees additionally perform rotation operations to keep the tree balanced.
• A Binary Search Tree is in balance when, for every node, the difference in height between its left and right subtrees is less than 2.
• By keeping balance, the AVL Tree ensures a minimum tree height, which means that search, insert, and delete operations can be done very fast.
The two trees above are both Binary Search Trees: they have the same nodes and the same in-order traversal (alphabetical), but their heights are very different because the AVL Tree has balanced itself.
The Balance Factor

• A node's balance factor is the difference in subtree heights.


• In an AVL Tree, the subtree height is stored at each node, and the balance factor is calculated from these subtree heights to check whether the tree has become out of balance.
• The height of a subtree is the number of edges between the root node of the subtree and the leaf
node farthest down in that subtree.
• The Balance Factor (BF) for a node (X) is the difference in height between its right and left subtrees (a code sketch of this calculation follows after this list).
• BF(X) = height(rightSubtree(X)) − height(leftSubtree(X))
• Balance factor values
• 0: The node is in balance.
• more than 0: The node is "right heavy".
• less than 0: The node is "left heavy".
• If the balance factor is less than -1, or more than 1, for one or more nodes in the tree, the tree is
considered not in balance, and a rotation operation is needed to restore balance.
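As a minimal sketch (not from the slides), the node structure and the balance factor calculation could look like this in Python; the names AVLNode, height, update_height, and balance_factor are illustrative, and the later sketches in these notes reuse them:

# A minimal AVL node that stores the height of its own subtree.
class AVLNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 0          # a single leaf node has height 0 (no edges below it)

def height(node):
    # An empty subtree is given height -1 so that a leaf ends up with height 0.
    return node.height if node is not None else -1

def update_height(node):
    # Recompute the stored height from the children's stored heights.
    node.height = 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # BF(X) = height(rightSubtree(X)) - height(leftSubtree(X))
    return height(node.right) - height(node.left)

For example, balance_factor(node) returning -2 means the node is left heavy and out of balance, so a rotation is needed.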
The Four "out-of-balance" Cases
When the balance factor of just one node is less than -1, or more than 1, the tree is regarded
as out of balance, and a rotation is needed to restore balance.
• Left-Left (LL): the unbalanced node and its left child node are both left heavy. Restore balance with a single right rotation.
• Right-Right (RR): the unbalanced node and its right child node are both right heavy. Restore balance with a single left rotation.
• Left-Right (LR): the unbalanced node is left heavy, and its left child node is right heavy. First do a left rotation on the left child node, then do a right rotation on the unbalanced node.
• Right-Left (RL): the unbalanced node is right heavy, and its right child node is left heavy. First do a right rotation on the right child node, then do a left rotation on the unbalanced node.
Cont.
1. The Left-Left (LL) Case
• The node where the unbalance is discovered is left heavy, and the node's left child node
is also left heavy.
• When this LL case happens, a single right rotation on the unbalanced node is enough to
restore balance.
• The example below shows the LL case, and how the balance is restored by a single right rotation (the rotation itself is sketched in code after this list).
• As the nodes are inserted, two LL cases happen:
• When D is added, the balance factor of Q becomes -2, which means the tree is unbalanced. This is an LL case because both the unbalanced node Q and its left child node P are left heavy (negative balance factors). A single right rotation at node Q restores the tree balance.
• After nodes L, C, and B are added, P's balance factor is -2, which means the tree is out of
balance. This is also an LL case because both the unbalanced node P and its left child
node D are left heavy. A single right rotation restores the balance.
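A sketch of the single right rotation used in the LL case, reusing the helpers from the earlier sketch (the function name rotate_right is illustrative, not from the slides); it returns the new subtree root so that the parent can re-link it:

def rotate_right(z):
    # z is the unbalanced, left-heavy node; its left child y becomes the new subtree root.
    y = z.left
    z.left = y.right         # y's right subtree is handed over to z
    y.right = z
    update_height(z)         # z is now below y, so update z first
    update_height(y)
    return y                 # the parent of the old z should now point to y

Applied to the first LL case above, rotate_right(Q) would make P the new root of that subtree.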
Cont.
2. The Right-Right (RR) Case
• A Right-Right case happens when a node is unbalanced and right heavy, and
the right child node is also right heavy.
• A single left rotation at the unbalanced node is enough to restore balance in
the RR case.
• In the example below, the RR case happens two times (the left rotation is sketched in code after this list):
• When node D is inserted, A becomes unbalanced, and both A and B are right heavy. A left rotation at node A restores the tree balance.
• After nodes E, C and F are inserted, node B becomes unbalanced. This is an
RR case because both node B and its right child node D are right heavy. A left
rotation restores the tree balance.
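A sketch of the single left rotation used in the RR case, mirroring the right rotation above (rotate_left is again an illustrative name):

def rotate_left(z):
    # z is the unbalanced, right-heavy node; its right child y becomes the new subtree root.
    y = z.right
    z.right = y.left         # y's left subtree is handed over to z
    y.left = z
    update_height(z)         # z is now below y, so update z first
    update_height(y)
    return y                 # the parent of the old z should now point to y

In the first RR case above, rotate_left(A) would make B the new root of that subtree.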
Cont.
3. The Left-Right (LR) Case
• The Left-Right case is when the unbalanced node is left heavy, but its left child node is right
heavy.
• In this LR case, a left rotation is first done on the left child node, and then a right rotation is
done on the original unbalanced node.
• The example below shows how the Left-Right case can happen, and how the rotation operations are done to restore balance.
• While the AVL Tree is being built, the Left-Right case happens two times, and rotation operations are required to restore balance (the combined double rotation is sketched in code after this list):
• When K is inserted, node Q gets unbalanced with a balance factor of -2, so it is left heavy,
and its left child E is right heavy, so this is a Left-Right case.
• After nodes C, F, and G are inserted, node K becomes unbalanced and left heavy, with its left
child node E right heavy, so it is a Left-Right case.
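A sketch of the LR fix, combining the two rotations in the order described above (fix_left_right is an illustrative name; it assumes the rotate_left and rotate_right sketches from the previous slides):

def fix_left_right(z):
    # LR case: z is left heavy and z.left is right heavy.
    z.left = rotate_left(z.left)   # first, a left rotation on the left child
    return rotate_right(z)         # then, a right rotation on the unbalanced node

In the first LR case above, this corresponds to rotate_left(E) followed by rotate_right(Q).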
Cont.
4. The Right-Left (RL) Case
• The Right-Left case is when the unbalanced node is right heavy, and its right child node is left
heavy.
• In this case we first do a right rotation on the unbalanced node's right child, and then we do
a left rotation on the unbalanced node itself.
• The example below shows how the Right-Left case can occur, and how the rotations are done to restore the balance (sketched in code after this list).
• After inserting node B, we get a Right-Left case because node A becomes unbalanced and
right heavy, and its right child is left heavy. To restore balance, a right rotation is first done
on node F, and then a left rotation is done on node A.
• The next Right-Left case occurs after nodes G, E, and D are added. This is a Right-Left case
because B is unbalanced and right heavy, and its right child F is left heavy. To restore
balance, a right rotation is first done on node F, and then a left rotation is done on node B.
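A sketch of the RL fix, which mirrors the LR fix (fix_right_left is an illustrative name):

def fix_right_left(z):
    # RL case: z is right heavy and z.right is left heavy.
    z.right = rotate_right(z.right)  # first, a right rotation on the right child
    return rotate_left(z)            # then, a left rotation on the unbalanced node

In the first RL case above, this corresponds to rotate_right(F) followed by rotate_left(A).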
Time Complexity for AVL Trees

• In the worst case, algorithms like search, insert, and delete must run through the whole height of the tree. This means that keeping the height (h) of the tree low, as we do with AVL Trees, gives us a lower runtime.
• The BST is not self-balancing. This means that a BST can be very unbalanced, almost like a long chain, where the height is nearly the same as the number of nodes. This makes operations like searching, deleting and inserting nodes slow, with time complexity O(h) = O(n).
• The AVL Tree however is self-balancing. That means that the height of the tree is kept to a minimum so that operations like searching, deleting and inserting nodes are much faster, with time complexity O(h) = O(log n).
Cont.
• The fact that the time complexity is O(h) = O(log n) for search, insert, and delete on an AVL Tree with height h and n nodes can be explained like this:
• Imagine a perfect Binary Search Tree where all nodes have two child nodes except on the lowest level. Such a tree with height h contains n = 2^(h+1) − 1 nodes, so h is roughly log2(n).
• The time complexity is therefore O(h) = O(log n) for search, insert, and delete for such a tree, and the AVL Tree's balancing keeps its height close to this minimum.
Remember!
• To check whether a tree is balanced, check the balance factor of each node.
• The search method is the same as for a BST (and an in-order traversal still visits the keys in sorted order).
• The delete method is the same as for a BST, but the AVL balancing property must be restored after a delete.
• The insertion (construction) method is the same as for a BST, but the AVL balancing property must be restored after each insert, as in the sketch below.
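As a rough sketch of the last point, an AVL insert can do the ordinary BST insertion and then rebalance on the way back up, reusing the earlier helpers (insert is an illustrative name, and duplicate keys are simply sent to the right here):

def insert(node, key):
    # Ordinary BST insertion...
    if node is None:
        return AVLNode(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)

    # ...followed by updating the stored height and rebalancing this node if needed.
    update_height(node)
    bf = balance_factor(node)

    if bf < -1:                                # left heavy
        if balance_factor(node.left) > 0:      # LR case: left child is right heavy
            node.left = rotate_left(node.left)
        return rotate_right(node)              # LL case (or second half of LR)
    if bf > 1:                                 # right heavy
        if balance_factor(node.right) < 0:     # RL case: right child is left heavy
            node.right = rotate_right(node.right)
        return rotate_left(node)               # RR case (or second half of RL)
    return node

A caller keeps track of the root with root = insert(root, key).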
