
Algorithms

CS A262 – Discrete Structures


What is an Algorithm?

▶ An algorithm is…

▶ A logical sequence of discrete steps that describe a


complete solution to a given problem computable in a
finite amount of time.

▶ In other words…

▶ It is a process that solves a problem step by step by


finding a solution in a finite amount of time.

3
Pieces of an Algorithm

▶ Each algorithm needs


▶ A name
▶ A brief description of the task performed
▶ A description of the input
▶ A description of the output
▶ A sequence of steps to follow
▶ Sometimes written in pseudocode

4
Defining Algorithms

● Pseudocode is often used to describe


algorithms
● Pseudocode:
○ Written in English, NOT code
○ May contain mathematical expressions
○ Uses indentation to indicate which
statements are part of a loop, if, or
procedure

5
Example: Linear Search

● Name: Linear Search


● Summary: Searches a list, L, of size N for a
particular element
● Input: a list and a value, val, to search for
● Output: True if the element is in the list,
False if the element is not in the list

7
Example: Linear Search -
Procedure

● Name: Linear Search


● Procedure:
found := False
i := 1
while i <= N
    if L[i] = val
        found := True
    i := i + 1
return found

8
Example: Linear Search -
Another Procedure

● Name: Linear Search


● Procedure:
i := 1
while i <= N
    if L[i] = val
        return True
    i := i + 1
return False
9
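The two procedures above can be sketched in Python (a hedged translation; the names `linear_search_v1` and `linear_search_v2` are chosen here, not from the slides):

```python
def linear_search_v1(L, val):
    """Procedure 1: scans the entire list even after a match."""
    found = False
    for item in L:              # corresponds to: while i <= N
        if item == val:
            found = True        # no early exit
    return found

def linear_search_v2(L, val):
    """Procedure 2: returns as soon as val is found."""
    for item in L:
        if item == val:
            return True         # early exit
    return False
```

Both behave identically from the outside; the difference is only in how many operations they perform, which is the subject of the next slides.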
Choosing Algorithms

● Which version of Linear Search is better?


● How do we even define “better”?

10
EFFICIENCY OF
ALGORITHMS

11
Efficiency of Algorithms

▶ We say the algorithm that uses fewer operations is


more efficient
▶ Operations are measured as a function of the
input size, N
▶ For small values of N, the efficiency doesn’t
matter very much
▶ As N gets larger, the more efficient algorithm
will scale better and perform better

12
Efficiency of Algorithms
(cont.)
▶ How do we analyze a particular algorithm?
▶ We count the number of operations the
algorithm executes
▶ We do NOT focus on the actual computer time
to execute the algorithm
▶ Why not?
▶ A particular algorithm can be implemented
on a variety of computers and the speed of
the computer can affect the execution time
▶ But the number of operations performed by
an algorithm would be the same
13
Efficiency of Linear Search
Procedure 1
▶ How many times is each statement executed (assuming
all values in the list are unique and val is contained in
the list)?
found := False        // 1 time
i := 1                // 1 time
while i <= N          // N + 1 times
    if L[i] = val     // N times
        found := True // 1 time
    i := i + 1        // N times
return found          // 1 time

Total: T(N) = 3N + 5
15
Efficiency of Linear Search
Procedure 2
▶ The statements marked ? depend on where val is in the
list, or whether it is in the list at all

i := 1               // 1 time
while i <= N         // ? times
    if L[i] = val    // ? times
        return True  // 0 or 1 time
    i := i + 1       // ? times
return False         // 0 or 1 time

Total: T(N) = ? + 2
18
Efficiency of Algorithms

▶ Efficiency can be measured for


▶ Best case
▶ Searching a key in a list:
▶ The key is immediately found
▶ Worst case
▶ Searching a key in a list:
▶ The key is at the end of the list, or the key
is not in the list
▶ Average case
▶ Difficult to determine
19
Efficiency of Linear Search
Procedure 2
▶ Best case: val is the first element in the list

i := 1               // 1 time
while i <= N         // 1 time
    if L[i] = val    // 1 time
        return True  // 1 time
    i := i + 1       // 0 times
return False         // 0 times

Total (Best Case): T(N) = 4


20
Efficiency of Linear Search
Procedure 2
▶ Worst case: val is not in the list at all
▶ When comparing algorithms, we compare the Worst
Case complexity

i := 1               // 1 time
while i <= N         // N + 1 times
    if L[i] = val    // N times
        return True  // 0 times
    i := i + 1       // N times
return False         // 1 time

Total (Worst Case): T(N) = 3N + 3


21
Which Linear Search is
better?
● Linear Search 1 has a runtime of T(N) = 3N + 5

● Linear Search 2 has a runtime of T(N) = 3N + 3

22
Which Linear Search is
better?
● Linear Search 1 has a runtime of T(N) = 3N + 5

● Linear Search 2 has a runtime of T(N) = 3N + 3

Note: Practically, Linear Search 2 would be
considered more efficient: it never runs worse
than 3N + 3 and usually executes fewer
instructions, whereas Linear Search 1 ALWAYS
executes 3N + 5 instructions

23
Lecture Problems
What is the execution time, T(n), for the following code segment?
cout << "Enter two numbers: ";
cin >> num1 >> num2;

if (num1 > num2)


largest = num1;
else
largest = num2;

cout << "The largest number is" << largest << endl;

a) N
b) N+5
c) 5
d) 5N

24
Lecture Problems

Answer: c) 5 — each of the five statements executes exactly once,
regardless of input.

25
Lecture Problems
What is the execution time, T(n), for the following code segment?
int sum = 0;
int num;
cout << "Enter positive integers "
     << "or a negative integer to stop:" << endl;
cin >> num;
while(num > 0){
    sum += num;
    cin >> num;
}
cout << "The sum is " << sum << endl;

a) N
b) 18
c) 3N + 6
d) 3N

26
Lecture Problems

Answer: c) 3N + 6 — the while test runs N + 1 times, the two
loop-body statements run N times each, and the five remaining
statements run once: (N + 1) + 2N + 5 = 3N + 6.

27
Lecture Problems
What is the execution time, T(n), for the following code segment?
for(int i = 0; i < N; i++){
// Simple Statement
}
for(int i = 0; i < N; i++){
for(int j = 0; j < N; j++){
// Simple Statement
}
}

a) 4 + 7N + 3N²
b) 3 + N + N²
c) N³
d) 3N

28
Lecture Problems

Answer: a) 4 + 7N + 3N² — the single loop costs 3N + 2 operations
and the nested pair costs 3N² + 4N + 2, for a total of 3N² + 7N + 4.

29
Which Linear Search is
better?
● Linear Search 1 has a runtime of T(N) = 3N + 5

● Linear Search 2 has a runtime of T(N) = 3N + 3

● These run times depend on this particular


implementation of these algorithms.
● Isn’t there a way to generalize how well these
algorithms scale?

30
Yes!
Big-O Notation
▶ Big-O notation:
▶ Mathematical measuring tool to quantitatively
evaluate algorithms
▶ Categorizes algorithms with respect to
execution time
▶ in terms of growth rate, not speed

31
Formal Definition of Big-O

• A function T(n) is O(g(n)) if
  • there exist two constants c and k
  • such that T(n) ≤ c·g(n) for all n > k

n: represents list size
T(n): count function
(for example, the number of comparisons in a
search algorithm)
c: units of computer time to execute one operation;
constant c depends on computer speed (varies)
c·g(n): computer time to execute g(n) operations

32
Formal Definition of Big-O
(cont.)
• A function T(n) is O(g(n)) if
  • there exist two constants c and k
  • such that T(n) ≤ c·g(n) for all n > k

In other words, as n gets sufficiently large (larger than k),
there is some constant c for which the processing time will
always be less than or equal to c·g(n).

33
Formal Definition of Big-O
(cont.)
• A function T(n) is O(g(n)) if
  • there exist two constants c and k
  • such that T(n) ≤ c·g(n) for all n > k

c·g(n) provides an upper bound for the growth rate of T(n)

34
Formal Definition of Big-O
(cont.)
▶ The growth rate of T(n) will be determined by the
fastest-growing term, which is the one with the
largest exponent.

▶ In the example, an algorithm of T(N) = 3N + 3


▶ is O(N)
▶ In general, it is safe to ignore all constants and to
drop the lower-order terms when determining the
order of magnitude.
Formal Proof
● Want to prove that T(N) = 3N + 3 is O(N)
▶ This means for the function g(N) = N, we need
to find a constant c and a value k such that
T(N) ≤ c·g(N) for N ≥ k
▶ If N > 3, then 3N + 3 < 3N + N (because 3 < N)
▶ If we let k = 3 and c = 4, then
▶ T(N) = 3N + 3 < 4N
▶ when N > 3
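The chosen constants can be spot-checked numerically; this is only a sanity check, not a substitute for the proof:

```python
# Check T(N) = 3N + 3 < 4N (i.e., c = 4, g(N) = N) for all N > k = 3.
def bound_holds(N):
    return 3 * N + 3 < 4 * N

assert all(bound_holds(N) for N in range(4, 10_000))  # holds above k = 3
assert not bound_holds(3)                             # fails at N = 3, which is why k matters
```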
Another Example

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        simple statement            // this nested loop executes a
    }                               // simple statement n² times
}
for (int i = 0; i < n; i++) {
    simple statement 1
    simple statement 2
    simple statement 3              // this loop executes 5 simple
    simple statement 4              // statements n times (5n)
    simple statement 5
}
simple statement 6
simple statement 7                  // finally, 25 simple statements
...                                 // are executed
simple statement 30

We can conclude that the
relationship between processing
time and n (the number of items
processed) is:

T(n) = n² + 5n + 25
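The statement count for this structure can be verified by instrumenting the loops directly (a sketch; `count_ops` is a name chosen here, not from the slides):

```python
def count_ops(n):
    count = 0
    for i in range(n):          # nested loops: n * n statements
        for j in range(n):
            count += 1
    for i in range(n):          # single loop: 5 statements, n times
        count += 5
    count += 25                 # 25 straight-line statements
    return count

# Matches T(n) = n^2 + 5n + 25 for any n
assert count_ops(10) == 10**2 + 5 * 10 + 25
assert count_ops(0) == 25
```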
What is the Big-Oh runtime?

● Given T(n) = n² + 5n + 25
○ Need a function g(n), a constant c, and a “large
enough” value k
○ Intuitively, this function is O(n²)
○ Can we prove this?
○ Short Proof:
■ Because n² ≥ n ≥ 1 for n ≥ 1, we have 5n ≤ 5n² and 25 ≤ 25n²
■ So n² + 5n + 25 ≤ n² + 5n² + 25n² = 31n²
■ Let c = 31, k = 1, g(n) = n²
■ Then T(n) ≤ 31·g(n) for n ≥ 1
Note the structure of this proof; you can use the same structure for any
function. Hence, for any polynomial, you can let c be the sum of the positive
coefficients, g(n) be the highest power of n, and k = 1.
46
Growth Rates

▶ Growth rates of various functions.

 size | logarithmic | linear | log-linear | quadratic | cubic  | exponential
  n   |   log₂ n    |   n    |  n·log₂ n  |    n²     |   n³   |     2ⁿ
  1   |      0      |   1    |     0      |     1     |    1   |      2
  2   |      1      |   2    |     2      |     4     |    8   |      4
  4   |      2      |   4    |     8      |    16     |   64   |     16
  8   |      3      |   8    |    24      |    64     |  512   |    256
  16  |      4      |  16    |    64      |   256     | 4096   | 65,536
  32  |      5      |  32    |   160      |  1024     | 32,768 | 4,294,967,296

47
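The table rows can be regenerated (or extended to larger sizes) with a few lines of Python; `growth_row` is a helper name chosen here:

```python
import math

def growth_row(n):
    """Return (n, log2 n, n, n*log2 n, n^2, n^3, 2^n) for a power of two."""
    lg = int(math.log2(n))
    return (n, lg, n, n * lg, n**2, n**3, 2**n)

for n in (1, 2, 4, 8, 16, 32):
    print(growth_row(n))        # reproduces the table above
```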
Growth Rates (cont.)

[graph comparing the growth-rate curves]

48
Growth Rates (cont.)

▶ Algorithms with exponential and factorial growth rates


have an effective practical limit on the size of the
problem they can be used to solve.

▶ With an O(2ⁿ) algorithm, if 100 inputs take an hour, then
▶ 101 inputs will take 2 hours
▶ 105 inputs will take 32 hours
▶ 114 inputs will take 16,384 hours (almost 2 years!)

49
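The figures above follow from the doubling rule: under O(2ⁿ), each additional input doubles the running time, so m extra inputs multiply it by 2^m. A minimal sketch (function name and defaults are mine):

```python
# If 100 inputs take 1 hour, n inputs take 2**(n - 100) hours.
def hours(n, base_inputs=100, base_hours=1):
    return base_hours * 2 ** (n - base_inputs)

assert hours(101) == 2        # one extra input: double
assert hours(105) == 32       # five extra inputs: 2^5
assert hours(114) == 16_384   # fourteen extra inputs: almost 2 years
```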
Growth Rates (cont.)

▶ Encryption algorithms take advantage of the fact that


some growth rates have limits.

▶ Some cryptographic algorithms can be broken in O(2ⁿ)


time, where n is the number of bits in the key.
▶ A key length of 40 bits is considered breakable by a modern
computer, but a key length of 100 bits will take a
billion-billion (10¹⁸) times longer than a key length of 40.

50
Lecture Problems
What is the Big-O growth rate for the following code segment?
for(int i = 1; i < N; i *=2){
// Simple Statement
}
for(int i = 0; i < N; i++){
for(int j = 1; j < N; j*=2){
// Simple Statement
}
}

a) O(n)
b) O(n²)
c) O(n·log n)
d) O(log n)

51
Lecture Problems

Answer: c) O(n·log n) — the first loop runs about log₂N times;
the nested pair runs about N·log₂N times, which dominates.

52
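The answer can be checked by instrumenting the loops and counting simple-statement executions (the counter is mine, not from the slides):

```python
def count(N):
    ops = 0
    i = 1
    while i < N:                # about log2(N) iterations
        ops += 1
        i *= 2
    for _ in range(N):          # N iterations...
        j = 1
        while j < N:            # ...each doing about log2(N) work
            ops += 1
            j *= 2
    return ops

# For N = 1024: log2(N) = 10, so 10 + 1024 * 10 executions.
assert count(1024) == 10 + 1024 * 10
```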
BIG-O, BIG-Ω,
BIG-ϴ

53
Big-O

▶ Remember, Big-O gives an upper bound for a function

▶ f(x) = x² + 3x + 34 is O(x²)
▶ f(x) is also O(x³)
▶ f(x) is also O(x⁴)

▶ When using Big-O notation, you want to use the tightest


upper bound, but you still don’t know how tight the
bound is

54
Big-O

▶ f(x) = x² + 3x + 34
▶ f(x) ≤ c·x² with c = 10 (for x ≥ 3)
▶ f(x) ≤ c·x³ with c = 1 (for x ≥ 4)
▶ f(x) ≤ c·x⁴ with c = 1 (for x ≥ 3)

55
Big-Ω

▶ Big-Ω gives a lower bound for T(n)

▶ T(n) is Ω(g(n)) if
▶ there exist constants L and k
▶ such that T(n) ≥ L·g(n) for n > k

T(n) grows at least as fast as g(n)

57
Big-Ω Example

▶ Let T(x) = 8x³ + 5x² + 7

▶ Need a function g(x) and a constant L s.t. T(x) ≥ L·g(x)
▶ 8x³ + 5x² + 7 ≥ 8x³ for all x > 0
▶ Hence, T(x) is Ω(x³)

T(n) is Ω(g(n)) iff g(n) is O(T(n))

59
Big-ϴ

▶ Big-ϴ gives an upper and lower bound for T(n)

▶ T(n) is ϴ(g(n)) if
▶ T(n) is O(g(n)) and
▶ T(n) is Ω(g(n))

T(n) grows at the same rate as g(n)

60
Big-ϴ

▶ Big-ϴ gives an upper and lower bound for T(n)

▶ T(n) is ϴ(g(n)) if
▶ there exist constants C₁, C₂, and k
▶ such that C₁·g(n) ≤ T(n) ≤ C₂·g(n) for n > k

T(n) grows at the same rate as g(n)

61
Big-ϴ Example

▶ Show that T(n) = 3n² + 8n·log n is ϴ(n²)

▶ Have to show T(n) is O(n²) and n² is O(T(n))

▶ Say n ≥ 1
▶ 3n² + 8n·log n ≤ 3n² + 8n²
▶ since log n < n ≤ n²

▶ 3n² + 8n·log n ≤ 11n²

▶ Hence, T(n) is O(n²)

62
Big-ϴ Example

▶ Show that T(n) = 3n² + 8n·log n is ϴ(n²)

▶ Have to show n² is O(T(n))

▶ Say n ≥ 1
▶ n² ≤ 3n² + 8n·log n
▶ Hence, n² is O(T(n))

▶ Therefore, T(n) is ϴ(n²)

Note: In general, if T(N) is a polynomial of degree k, T(N) is Θ(Nᵏ).


64
Summary

▶ For n > k,
▶ Big-O: g(n) is an upper bound
▶ T(n) ≤ c*g(n)

▶ Big-Ω: g(n) is a lower bound


▶ T(n) ≥ L*g(n)

▶ Big-ϴ: on the order of g(n)


▶ C1*g(n) ≤ T(n) ≤ C2*g(n)
66
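The three bounds can be spot-checked numerically for the earlier example T(n) = 3n² + 8n·log₂ n, using c = 11 from the Θ proof and L = 1 (a sanity check of the constants, not a proof):

```python
import math

def T(n):
    return 3 * n**2 + 8 * n * math.log2(n)

# Big-O (upper) and Big-Omega (lower) together give Big-Theta, for n >= 1:
for n in range(1, 2000):
    assert n**2 <= T(n) <= 11 * n**2
```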
Lecture Problems
Let f(x) = 17x⁶ − 45x³ + 2x + 8.

Choose the function g(x) such that f(x) is O(g(x)).

a) g(x) = 1
b) g(x) = x
c) g(x) = x³
d) g(x) = x⁶

67
Lecture Problems

Answer: d) g(x) = x⁶ — the highest-order term dominates.

68
Dealing with Non-Polynomial
Functions

Let f, g, and h be functions from R+ to R+:


● If f = O(h) AND g = O(h), then f+g = O(h).
● If f = Ω(h) OR g = Ω(h), then f+g = Ω(h).
● If f = O(g) and c is a constant greater than 0, then c⋅f = O(g).
● If f = Ω(g) and c is a constant greater than 0, then c⋅f = Ω(g) .

69
Lecture Problems
Let f(x) = x·log(x) + log(x) + x.

Choose the function g(x) such that f(x) is ϴ(g(x)).

a) g(x) = x
b) g(x) = log x
c) g(x) = x·log x
d) g(x) = x⁴

70
Lecture Problems

Answer: c) g(x) = x·log x — x·log x dominates both log x and x.

71
LINEAR VS.
BINARY SEARCH

72
Search Algorithms

▶ Search algorithms are very common


▶ They search a list
▶ Look at each item in the list and compare to the search
item

▶ We will consider two ways to search:


▶ Linear
▶ Also called sequential search
▶ Can be done in an ordered/unordered list
▶ Binary
▶ Can be done only in an ordered list

73
Binary Search

▶ Binary search is faster than linear search


▶ BUT assumes array is sorted

▶ Breaks the list in half


▶ Determines if item in 1st or 2nd half
▶ Then searches again just that half
▶ Can be done recursively

74
Binary Search Algorithm

● Name: Binary Search


● Summary: Searches for an element, val in a sorted
list, L of length N
● Input: An element, val, to look for; A sorted list L to
search through
● Output: The index of the first occurrence of val in the
list, or -1 if the element does not exist
● Procedure:

76
Binary Search Algorithm
▶ Procedure:

lower := 1
upper := N
while lower <= upper
    mid := (lower + upper) div 2
    if L[mid] = val
        return mid
    if L[mid] < val
        lower := mid + 1
    else
        upper := mid - 1
return -1
77
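A direct Python translation of the procedure, using 0-based indices to match the array diagrams that follow (the pseudocode itself is 1-based):

```python
def binary_search(L, val):
    """Return an index of val in sorted list L, or -1 if absent."""
    lower, upper = 0, len(L) - 1
    while lower <= upper:
        mid = (lower + upper) // 2   # integer division truncates
        if L[mid] == val:
            return mid
        if L[mid] < val:
            lower = mid + 1          # val must be in the upper half
        else:
            upper = mid - 1          # val must be in the lower half
    return -1

data = [15, 20, 35, 41, 57, 63, 75, 80, 85, 90]
assert binary_search(data, 63) == 5
assert binary_search(data, 99) == -1
```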
Execution of Binary Search

▶ A sorted array of 10 elements


▶ Search for 63

[0] [1] [2] [3] [4] [5] [6] [7] [8] [9]

15 20 35 41 57 63 75 80 85 90

78
Execution of Binary Search
(cont.)
▶ A sorted array of 10 elements
▶ Search for 63
Find the middle ➔ (0 + 9) / 2 = 4
(since indices are type int, the result is truncated)

[0] [1] [2] [3] [4] [5] [6] [7] [8] [9]

15 20 35 41 57 63 75 80 85 90

                mid

Is [4] equal to 63? No
Is 63 > or < than [4]? >
Check between [5] and [9]
80
Execution of Binary Search
(cont.)
▶ A sorted array of 10 elements
▶ Search for 63
Find the middle ➔ (5 + 9) / 2 = 7

[0] [1] [2] [3] [4] [5] [6] [7] [8] [9]

15 20 35 41 57 63 75 80 85 90

                            mid

Is [7] equal to 63? No
Is 63 > or < than [7]? <
Check between [5] and [6]
82
Execution of Binary Search
(cont.)
▶ A sorted array of 10 elements
▶ Search for 63
Find the middle ➔ (5 + 6) / 2 = 5

[0] [1] [2] [3] [4] [5] [6] [7] [8] [9]

15 20 35 41 57 63 75 80 85 90

                    mid

Is [5] equal to 63? Yes
Stopping case → 63 found
Location = mid
84
Execution of Binary Search
(cont.)
▶ A sorted array of 10 elements
▶ Search for 63

Number of comparisons ➔ 3

[0] [1] [2] [3] [4] [5] [6] [7] [8] [9]

15 20 35 41 57 63 75 80 85 90

mid

85
Binary Search Algorithm
● What’s the runtime?
○ What is the maximum number of times the while loop
will iterate? log₂N
▶ Procedure: T(N) = 5 + 4·log₂N → O(log₂N)

lower := 1                        // 1 time
upper := N                        // 1 time
while lower <= upper              // log₂N + 1 times
    mid := (lower + upper) div 2  // log₂N times
    if L[mid] = val               // log₂N times
        return mid                // 1 time
    if L[mid] < val               // log₂N times
        lower := mid + 1          // A times
    else
        upper := mid - 1          // B times
                                  // A + B = log₂N times
return -1                         // 1 time
88
Implementation

▶ Binary search can be implemented


▶ Iteratively
▶ Using a while loop

▶ Recursively
▶ IF/ELSE statement and call to itself

▶ Stopping case for both implementations:


▶ if (first > last) → no elements between them,
▶ so key cannot be there!
▶ if (key == a[mid]) → found!

89
Efficiency of Binary Search

▶ Binary search is very efficient


▶ Extremely fast, compared to sequential search

▶ Half of array is eliminated at start!


▶ Then a quarter, then 1/8, etc.
▶ Essentially eliminate half with each call

▶ Example: Array of 100 elements


▶ In this case, a binary search
▶ never needs more than 7 compares!

90
Big-O Notation for Search
Algorithms
▶ Linear search growth rate is O(n)
▶ Worst case scenario, you will be making n comparisons.

▶ Binary search growth rate is O(log2 n)


▶ Worst case scenario, you will be making log2 n
comparisons.
▶ In more general terms, the logarithm base 2 is dropped
and the growth rate is written as O(log n)

91
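Worst-case comparison counts for both searches can be computed directly; for binary search this uses the ⌈log₂ n⌉ approximation from the slides (the exact count varies slightly by implementation):

```python
import math

def linear_worst(n):
    return n                                 # examine every element

def binary_worst(n):
    return max(1, math.ceil(math.log2(n)))   # halvings down to one element

assert linear_worst(209) == 209
assert binary_worst(1024) == 10
```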
Lecture Problems
How many comparisons are needed for a linear search in the worst
case if searching an array of 209 numbers?

a) 1
b) 8
c) 105
d) 209

92
Lecture Problems

Answer: d) 209 — in the worst case, linear search examines every element.

93
Lecture Problems
How many comparisons are needed for a binary search in the
worst case if searching an array of 1024 numbers?

a) 1
b) 10
c) 512
d) 1024

94
Lecture Problems

Answer: b) 10 — log₂(1024) = 10.

95
ALGORITHMS (END)

96
