DSA Algorithms Expanded Cheat Sheet

The document provides a comprehensive cheat sheet of various algorithms categorized into sorting, searching, divide & conquer, greedy, dynamic programming, graph, string, math & number theory, tree algorithms, and advanced/miscellaneous. Each algorithm is briefly explained along with its time complexity. This serves as a quick reference for understanding and comparing different algorithms.

Uploaded by

fun tym
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
6 views4 pages

DSA Algorithms Expanded CheatSheet

The document provides a comprehensive cheat sheet of various algorithms categorized into sorting, searching, divide & conquer, greedy, dynamic programming, graph, string, math & number theory, tree algorithms, and advanced/miscellaneous. Each algorithm is briefly explained along with its time complexity. This serves as a quick reference for understanding and comparing different algorithms.

Uploaded by

fun tym
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 4

Expanded DSA Algorithms Cheat Sheet (with Short Explanations)

Sorting Algorithms
• Bubble Sort: Repeatedly swaps adjacent elements if they are in the wrong order. Time: O(n²).

• Selection Sort: Finds the minimum element and places it at the beginning. Time: O(n²).

• Insertion Sort: Builds the sorted list one item at a time. Good for small datasets. Time: O(n²).

• Merge Sort: Divide and conquer: split, sort recursively, merge. Stable. Time: O(n log n).

• Quick Sort: Partition around a pivot, sort subarrays recursively. Avg O(n log n), worst O(n²).

• Heap Sort: Uses a binary heap to repeatedly extract the max. Time: O(n log n).

• Counting Sort: Counts occurrences, suitable for integers in small range. Time: O(n+k).

• Radix Sort: Sorts numbers digit by digit using counting sort. Time: O(nk).

• Bucket Sort: Distributes elements into buckets and sorts each. Good for uniform distribution.

• Shell Sort: Improves insertion sort with a gap sequence. Time depends on the gaps; O(n log² n) for common sequences.
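As a sketch of one entry above, merge sort's split-recurse-merge structure might look like this in Python (the function name `merge_sort` is just illustrative):

```python
def merge_sort(a):
    """Sort a list by divide and conquer: split, sort halves, merge. O(n log n), stable."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in original order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])       # one side is empty; append the leftovers of the other
    merged.extend(right[j:])
    return merged
```

Note the `<=` in the merge step: using `<` instead would still sort correctly but lose stability.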

Searching Algorithms
• Linear Search: Scans each element sequentially. Time: O(n).

• Binary Search: Checks middle element in sorted array, divides search space. Time: O(log n).

• Jump Search: Jumps ahead by √n steps, then linear search. Time: O(√n).

• Interpolation Search: Estimates position using values. Best for uniform data. Avg O(log log n).

• Exponential Search: Exponential jumps, then binary search. Time: O(log n).

• Fibonacci Search: Uses Fibonacci numbers to divide search space. Time: O(log n).
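A minimal iterative binary search, the workhorse of this section, could be sketched as follows (names are illustrative):

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent. O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # target lies in the right half
        else:
            hi = mid - 1   # target lies in the left half
    return -1
```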

Divide & Conquer


• Binary Search: Divide array in half each step. Time: O(log n).

• Merge Sort: Split and merge arrays. Time: O(n log n).

• Quick Sort: Partition and recurse. Avg O(n log n), worst O(n²).

• Strassen’s Matrix Multiplication: Faster than the standard O(n³) matrix multiply. Time: O(n^2.81).

• Closest Pair of Points: Divide space and check recursively. Time: O(n log n).
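As a concrete divide-and-conquer example, here is an in-place quick sort sketch using the Lomuto partition scheme (one of several common partition strategies; the names are illustrative):

```python
def quick_sort(a, lo=0, hi=None):
    """Sort list a in place: partition around a pivot, then recurse on both sides."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        pivot = a[hi]                  # Lomuto scheme: last element is the pivot
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:          # move elements <= pivot to the left block
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]      # place pivot at its final position
        quick_sort(a, lo, i - 1)       # recurse on the two subarrays
        quick_sort(a, i + 1, hi)
    return a
```

Picking the last element as pivot is simple but gives the O(n²) worst case on already-sorted input; a random pivot avoids that in practice.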

Greedy Algorithms
• Activity Selection: Selects maximum non-overlapping activities.

• Fractional Knapsack: Takes items based on value/weight ratio. Time: O(n log n).

• Huffman Coding: Builds prefix codes for compression. Time: O(n log n).
• Job Scheduling: Schedules jobs to maximize profit with deadlines.

• Prim’s MST: Builds MST by adding cheapest edge to tree. Time: O(E log V).

• Kruskal’s MST: Sorts edges, uses union-find to build MST. Time: O(E log E).

• Dijkstra’s: Shortest path in graph with non-negative weights. Time: O((V+E) log V).
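Activity selection shows the greedy pattern in its purest form: sort by finish time, then repeatedly take the earliest-finishing activity that doesn't overlap the last one chosen. A sketch (names illustrative):

```python
def select_activities(intervals):
    """Return a maximum set of non-overlapping (start, finish) activities.
    Greedy: sort by finish time, keep each activity that starts after the last finish."""
    chosen, last_finish = [], float('-inf')
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:       # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```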

Dynamic Programming
• Fibonacci Sequence: Stores results to avoid recomputation. Time: O(n).

• 0/1 Knapsack: Max value with weight limit. Time: O(nW).

• Coin Change: Ways to make change or minimum coins. Time: O(n·amount).

• Longest Common Subsequence: Finds LCS between strings. Time: O(mn).

• Longest Increasing Subsequence: Finds LIS. Time: O(n log n).

• Matrix Chain Multiplication: Optimal order to multiply matrices. Time: O(n³).

• Edit Distance: Minimum edits to transform one string. Time: O(mn).

• Egg Dropping: Minimum trials to find threshold floor. Time: O(k n²).

• Bellman-Ford: Shortest paths, detects negatives. Time: O(VE).

• Floyd-Warshall: All-pairs shortest paths. Time: O(V³).

• Optimal BST: Minimizes search cost. Time: O(n³).

• Catalan Numbers: Counts parenthesizations, BSTs, paths, etc.
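The longest common subsequence entry above is a good illustration of the table-filling DP pattern; a minimal sketch (function name illustrative):

```python
def lcs_length(s, t):
    """Length of the longest common subsequence of s and t via an O(mn) DP table."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]   # dp[i][j] = LCS of s[:i] and t[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s[i - 1] == t[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1  # extend the common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop a char from s or t
    return dp[m][n]
```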

Graph Algorithms
• BFS: Level-order graph traversal. Time: O(V+E).

• DFS: Depth-first traversal. Time: O(V+E).

• Topological Sort: Orders DAG nodes. Time: O(V+E).

• Dijkstra’s Algorithm: Shortest paths with priority queue. Time: O((V+E) log V).

• Bellman-Ford Algorithm: Shortest paths with negative weights. Time: O(VE).

• Floyd-Warshall Algorithm: All-pairs shortest paths. Time: O(V³).

• Johnson’s Algorithm: All-pairs shortest path with reweighting. Time: O(V² log V + VE).

• Prim’s Algorithm: MST with greedy approach. Time: O(E log V).

• Kruskal’s Algorithm: MST using union-find. Time: O(E log E).

• Tarjan’s Algorithm: Finds strongly connected components. Time: O(V+E).

• Kosaraju’s Algorithm: SCC with two DFS passes. Time: O(V+E).

• Kahn’s Algorithm: Topological sort using in-degree. Time: O(V+E).

• A* Search: Heuristic-based shortest path algorithm.
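A sketch of Dijkstra's algorithm with Python's `heapq` as the priority queue, matching the O((V+E) log V) bound quoted above (graph format and names are assumptions of this sketch):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source. graph maps node -> list of (neighbor, weight).
    Requires non-negative weights; O((V+E) log V) with a binary heap."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                       # stale heap entry; a shorter path was found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd               # relax the edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist
```

Rather than a `decrease-key` operation (which `heapq` lacks), this pushes duplicates and skips stale entries on pop, a common idiom in Python.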

String Algorithms
• Naive Pattern Searching: Check for pattern at every position. Time: O(nm).

• KMP: Preprocess pattern using LPS to skip checks. Time: O(n+m).


• Rabin-Karp: Hash-based pattern matching. Avg O(n+m).

• Z Algorithm: Pattern matching using Z-values. Time: O(n+m).

• Aho-Corasick: Multi-pattern matching using automaton.

• Boyer-Moore: Skips comparisons using bad-character/good-suffix rules. Best case O(n/m); worst case O(nm).

• Suffix Array: Array of suffixes for substring search. Time: O(n log n).

• Suffix Tree: Compressed trie for suffixes. Time: O(n).

• Manacher’s Algorithm: Finds longest palindrome in O(n).
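KMP's LPS (longest proper prefix that is also a suffix) table is the key idea in this section; a combined build-and-search sketch (names illustrative):

```python
def kmp_search(text, pattern):
    """Return all start indices where pattern occurs in text. O(n+m) via the LPS table."""
    # Build the LPS (failure) table for the pattern.
    lps = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = lps[k - 1]                 # fall back to the next shorter border
        if pattern[i] == pattern[k]:
            k += 1
        lps[i] = k
    # Scan the text; matched prefix characters are never re-examined.
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = lps[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)         # full match ending at i
            k = lps[k - 1]                 # continue searching for overlapping matches
    return hits
```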

Math & Number Theory


• Euclidean GCD: Repeated division until remainder is zero.

• LCM: Computed as a·b / gcd(a, b).

• Sieve of Eratosthenes: Marks non-primes up to n. Time: O(n log log n).

• Modular Exponentiation: Fast power using divide & conquer. Time: O(log n).

• Matrix Exponentiation: Computes powers of matrices quickly.

• Chinese Remainder Theorem: Solves simultaneous congruences.

• Karatsuba Multiplication: Fast recursive multiplication. Time: O(n^1.58).
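The Sieve of Eratosthenes from the list above fits in a few lines; a sketch (function name illustrative):

```python
def sieve(n):
    """Return all primes up to n (inclusive) via the Sieve of Eratosthenes. O(n log log n)."""
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Start at p*p: smaller multiples were already marked by smaller primes.
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, ok in enumerate(is_prime) if ok]
```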

Tree Algorithms
• Traversals: Inorder, Preorder, Postorder, Level Order.

• Binary Search Tree Ops: Insert, delete, search. Avg O(log n).

• AVL Tree: Self-balancing BST. O(log n).

• Segment Tree: Range queries and updates. O(log n).

• Fenwick Tree: Efficient prefix sums. O(log n).

• Trie: Prefix search in O(L) time.

• Lowest Common Ancestor: Finds the common ancestor in O(log n) per query after preprocessing (e.g. binary lifting).

• Red-Black Tree: Balanced BST with O(log n) operations.

• B-Tree: Balanced search tree for disk storage.
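A Fenwick (binary indexed) tree sketch showing the low-bit trick behind its O(log n) prefix sums (class and method names are illustrative):

```python
class FenwickTree:
    """Prefix sums with O(log n) point update and prefix query; 1-indexed internally."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def update(self, i, delta):
        """Add delta to the element at 0-based index i."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)          # jump to the next node covering index i

    def prefix_sum(self, i):
        """Sum of elements in [0, i] (0-based, inclusive)."""
        i += 1
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & (-i)          # strip the lowest set bit
        return total
```

`i & (-i)` isolates the lowest set bit of `i`, which is what lets both operations touch only O(log n) nodes.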

Advanced / Miscellaneous
• Union-Find (DSU): Set operations with path compression. Amortized O(α(n)).

• KMP Automaton: Finite automaton version of KMP.

• Ford-Fulkerson: Max flow using augmenting paths. O(E * maxflow).

• Edmonds-Karp: BFS version of Ford-Fulkerson. O(VE²).

• Hopcroft-Karp: Max bipartite matching. O(E√V).

• Suffix Automaton: Minimal automaton for all substrings.

• Monte Carlo Algorithms: Probabilistic with small error chance.

• Las Vegas Algorithms: Probabilistic but always correct result.


• Bloom Filter: Probabilistic membership testing.

• Skip List: Probabilistic data structure with O(log n) operations.
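A union-find (DSU) sketch with path compression and union by size, matching the amortized O(α(n)) entry above (names illustrative):

```python
class DSU:
    """Disjoint-set union with path halving and union by size.
    Amortized O(α(n)) per operation (α = inverse Ackermann, effectively constant)."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        """Return the representative of x's set, compressing the path as we go."""
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        """Merge the sets of a and b; return False if already in the same set."""
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.size[ra] < self.size[rb]:   # attach the smaller tree under the larger
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True
```

This is the same structure Kruskal's MST algorithm uses to detect whether an edge would form a cycle.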
