The document discusses algorithm analysis, focusing on asymptotic notation, including Big-O, Big-Ω, and Big-Θ, to evaluate algorithm performance with examples like linear search. It also covers the brute force approach to the traveling salesman problem, detailing its inefficiency and time complexity of O(n! × n). Additionally, it explains the divide and conquer strategy using merge sort, providing its recurrence relation and time complexity of Θ(n log n), and describes the decrease and conquer strategy through a recursive factorial function with a time complexity of O(n).

Uploaded by

elafhamoada
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as DOCX, PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
4 views8 pages

Algorithms

The document discusses algorithm analysis, focusing on asymptotic notation, including Big-O, Big-Ω, and Big-Θ, to evaluate algorithm performance with examples like linear search. It also covers the brute force approach to the traveling salesman problem, detailing its inefficiency and time complexity of O(n! × n). Additionally, it explains the divide and conquer strategy using merge sort, providing its recurrence relation and time complexity of Θ(n log n), and describes the decrease and conquer strategy through a recursive factorial function with a time complexity of O(n).

Uploaded by

elafhamoada
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as DOCX, PDF, TXT or read online on Scribd
You are on page 1/ 8

Design and Analysis of Algorithms

Assignment No. (1)
Name: Elaf Gamal Hamouda Hamad Elneel
Seat number: 212-18
Department: Statistics - Computer

1. Explain the significance of asymptotic notation in algorithm analysis. Discuss Big-O, Big-Ω, and Big-Θ notations and explain their use in evaluating algorithm performance. Provide one example algorithm and analyze its time complexity using Big-O.
Significance of Asymptotic Notation
Asymptotic notation helps analyze algorithm efficiency by describing how running time or space grows with input size, ignoring machine-specific constants and lower-order terms.

Types:
Big-O (O): Upper bound; the running time grows no faster than the bound. Commonly used to describe the worst case.
Example: O(n²) → the algorithm takes at most on the order of n² steps.
Big-Omega (Ω): Lower bound; the running time grows at least as fast as the bound. Commonly used to describe the best case.
Example: Ω(n) → at least on the order of n steps.
Big-Theta (Θ): Tight bound; upper and lower bounds match, giving the exact growth rate.
Example: Θ(n log n) → the running time grows proportionally to n log n.

Example – Linear Search:

function linearSearch(arr, target):
    for i from 0 to length(arr) - 1:
        if arr[i] == target:
            return i
    return -1

Best case: target at the start → Ω(1)
Worst case: target not found → O(n)
Average case: about half the array is scanned → Θ(n)
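The pseudocode above translates directly into a runnable sketch in Python (the function and variable names here are illustrative choices, not prescribed by the assignment):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i  # best case: target at index 0 -> Omega(1)
    return -1         # worst case: n comparisons made -> O(n)
```

Calling `linear_search([3, 1, 4], 4)` returns 2 after three comparisons, while a missing target forces a full scan of all n elements.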

2. Describe the brute force approach to solving problems. Then, write the brute force solution for the travelling salesman problem, and analyze its time complexity.
Brute Force and TSP
Brute Force:
Tries all possible solutions and keeps the best one. It is simple but inefficient for large inputs.

Travelling Salesman Problem (TSP):
Find the shortest route that visits every city exactly once and returns to the starting city.

Brute Force Idea:
Generate all n! possible orders of the cities, compute the cost of each tour, and return the cheapest.

Pseudocode (simplified):
function TSP(cities, dist):
    min_cost = ∞
    for each permutation p of cities:
        cost = sum of distances along path p + return to start
        if cost < min_cost:
            min_cost = cost
    return min_cost
Time Complexity:
n! permutations × O(n) per path
Total: O(n! × n) → exponential growth, not
scalable for large n
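A minimal Python sketch of this brute-force idea, fixing city 0 as the start so only the (n−1)! orderings of the remaining cities are tried (the function name and matrix layout are my own assumptions for illustration):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Brute-force TSP: dist[i][j] is the distance from city i to city j.
    Fix city 0 as the start and try every ordering of the other cities."""
    n = len(dist)
    best_cost, best_path = float("inf"), None
    for perm in permutations(range(1, n)):     # (n-1)! orderings
        path = (0,) + perm + (0,)              # close the tour back to city 0
        cost = sum(dist[path[k]][path[k + 1]]  # O(n) work to cost one tour
                   for k in range(n))
        if cost < best_cost:
            best_cost, best_path = cost, path
    return best_cost, best_path
```

Even with the fixed start the running time is still factorial: around n = 12 this already requires tens of millions of tours, which is why brute force is only practical for very small instances.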
3. Describe the Divide and Conquer paradigm and explain how it is used in the Merge Sort algorithm. Provide the recurrence relation for Merge Sort and solve it to find the time complexity.
Divide and Conquer and Merge Sort
Divide and Conquer Strategy:
This strategy breaks a problem into smaller subproblems, solves them recursively, and combines their results. It is efficient for many problems.

Merge Sort Overview:
1. Divide the array into two halves.
2. Recursively sort both halves.
3. Merge the sorted halves.

Pseudocode:
function mergeSort(arr):
    if length(arr) ≤ 1:
        return arr
    mid = length(arr) / 2
    left = mergeSort(arr[0:mid])
    right = mergeSort(arr[mid:end])
    return merge(left, right)
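A runnable Python version of the same scheme, including the merge step the pseudocode leaves implicit (the helper name `merge` matches the pseudocode; everything else is an illustrative sketch):

```python
def merge_sort(arr):
    """Divide and conquer: split, sort each half, then merge."""
    if len(arr) <= 1:
        return arr                   # base case: already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # T(n/2)
    right = merge_sort(arr[mid:])    # T(n/2)
    return merge(left, right)        # O(n) combine step

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])             # at most one of these
    out.extend(right[j:])            # two tails is non-empty
    return out
```

Each level of recursion does O(n) total merging work across all subarrays, and there are log₂ n levels, which is exactly what the recurrence below captures.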
Recurrence Relation:
Let T(n) be the time to sort n elements:
T(n) = 2T(n/2) + O(n)

Solving the recurrence using the Master Theorem:
a = 2, b = 2, f(n) = O(n)
Since f(n) = Θ(n) = Θ(n^log_b(a)) = Θ(n¹), case 2 of the Master Theorem applies:
⇒ T(n) = Θ(n log n)
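The same answer can be checked without the Master Theorem by expanding the recurrence level by level (a standard verification, sketched here with an arbitrary constant c for the O(n) merge cost):

```latex
\begin{aligned}
T(n) &= 2\,T(n/2) + cn \\
     &= 4\,T(n/4) + 2cn \\
     &= 2^{k}\,T\!\left(n/2^{k}\right) + k\,cn .
\end{aligned}
```

At k = log₂ n the subproblems reach size 1, giving T(n) = n·T(1) + cn·log₂ n = Θ(n log n): each of the log₂ n levels contributes cn work in total.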
4. Explain the Decrease and Conquer strategy. Illustrate your explanation by applying it to compute the factorial of a number using recursion. What is the time complexity?
Decrease and Conquer and Factorial
Decrease and Conquer Strategy:
This strategy solves a problem by reducing it to a smaller instance of the same problem, usually by decreasing the input size by 1.

Example: Recursive Factorial
function factorial(n):
    if n == 0 or n == 1:
        return 1
    return n × factorial(n - 1)

Explanation:
To compute factorial(5), we compute 5 × factorial(4), and so on. The problem size decreases by 1 at each step until reaching the base case.

Time Complexity:
Each call makes exactly one recursive call and does O(1) work, so there are n calls in total.
The time complexity is O(n).
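In Python the pseudocode above runs almost unchanged; the input-validation check is an addition of mine, not part of the original pseudocode:

```python
def factorial(n):
    """Decrease and conquer: shrink the problem by 1 each call.
    n calls, O(1) work per call -> O(n) time (and O(n) call stack)."""
    if n < 0:
        raise ValueError("n must be a non-negative integer")
    if n <= 1:
        return 1                     # base case: 0! = 1! = 1
    return n * factorial(n - 1)      # exactly one smaller subproblem
```

Note the O(n) cost also applies to stack depth: very large n would hit Python's recursion limit, which is one practical downside of decrease-by-one recursion compared to an iterative loop.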
