
Atharva Hande D12B/67

Experiment 08

Title: Implement a Local Search algorithm for optimization: Hill Climbing / Genetic Algorithm.

Theory:

Local Search Algorithm:


Local Search is an optimization technique used for solving computational problems where the
goal is to find an optimal or near-optimal solution by iteratively improving a candidate
solution. It is widely used in artificial intelligence, operations research, and combinatorial
optimization problems.

Key Features of Local Search Algorithm

1. Incremental Improvement
   ○ Starts with an initial solution and iteratively improves it by exploring nearby solutions.
2. Explores Neighboring Solutions
   ○ Moves to solutions in the "neighborhood" of the current solution, modifying small parts at a time.
3. Greedy Approach
   ○ Often selects the best available neighboring solution (greedy strategy) to improve performance.
4. Memory Efficient
   ○ Uses minimal memory compared to exhaustive search techniques, as it does not store all possible solutions.
5. Local Optimization
   ○ Focuses on optimizing within a small region of the search space rather than exploring the entire space.
6. Can Get Stuck in Local Optima
   ○ May settle on suboptimal solutions if no better neighboring solution is found.

Hill Climbing Algorithm

Hill Climbing is a local search algorithm used for optimization problems. It starts with an
initial solution and iteratively moves to a better neighboring solution until no further
improvements can be made.

Algorithm Steps

1. Start with an Initial Solution
   ○ Choose a random or heuristic-based solution.
2. Evaluate the Current Solution
   ○ Compute the objective function value (fitness) of the current solution.
3. Generate Neighboring Solutions
   ○ Modify the current solution slightly to create neighboring solutions.
4. Select the Best Neighbor
   ○ If a neighboring solution has a better fitness value, move to that solution.
5. Repeat Until No Improvement
   ○ Continue the process until no better neighboring solution is found (a local optimum). A minimal sketch of this loop is shown below.
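To make these steps concrete, here is a minimal, self-contained Python sketch of the hill climbing loop. The toy objective f(x) = -(x - 3)^2, the step size, and the search bounds are assumptions chosen only for illustration and are not part of the experiment code that follows.

import random

def f(x):
    # Assumed toy objective: a single peak at x = 3
    return -(x - 3) ** 2

def hill_climbing(max_steps=100, step_size=0.1):
    current = random.uniform(-10, 10)         # Step 1: random initial solution
    for _ in range(max_steps):
        # Step 3: generate neighbors by small moves left and right
        neighbors = [current - step_size, current + step_size]
        best = max(neighbors, key=f)          # Step 4: pick the best neighbor
        if f(best) <= f(current):             # Step 5: stop when no neighbor improves
            break
        current = best
    return current, f(current)

x, fx = hill_climbing()
print(f"Reached x = {x:.2f} with f(x) = {fx:.2f}")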

Problems in Hill Climbing and Their Solutions

1. Local Optima Problem

Problem:

●​ Hill Climbing may get stuck in a local optimum where no neighboring solution is
better, even though a better global solution exists.

Solution:

Random Restarts:

●​ Run the algorithm multiple times with different starting points to increase the chances
of finding the global optimum.

Simulated Annealing:

● Occasionally accept worse solutions to escape local optima by introducing randomness (see the sketch after this list).

Tabu Search:

● Keep track of previously visited solutions to avoid cycling back to suboptimal solutions.
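As a concrete illustration of the simulated annealing idea (occasionally accepting worse moves), here is a hedged sketch; the objective function, neighbor move, starting temperature, and cooling rate are all assumptions made for this example, not a prescribed implementation.

import math
import random

def objective(x):
    # Assumed toy objective with several local optima
    return math.sin(5 * x) + 0.5 * math.cos(3 * x)

def simulated_annealing(iterations=1000, temperature=1.0, cooling=0.995):
    current = random.uniform(-3, 3)
    best = current
    for _ in range(iterations):
        candidate = current + random.uniform(-0.1, 0.1)        # small random move
        delta = objective(candidate) - objective(current)
        # Always accept improvements; accept worse moves with probability e^(delta / T)
        if delta > 0 or random.random() < math.exp(delta / temperature):
            current = candidate
        if objective(current) > objective(best):
            best = current
        temperature *= cooling                                  # slowly reduce randomness
    return best, objective(best)

x, fx = simulated_annealing()
print(f"Best found: x = {x:.3f}, f(x) = {fx:.3f}")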

2. Plateau Problem (Flat Region)

Problem:

● The algorithm may reach a flat region where all neighboring solutions have the same fitness value, causing it to get stuck.

Solution:

Increase Step Size:

●​ Adjust the step size to explore farther away neighbors.

Use Random Jumps:

● Introduce occasional large jumps to explore new regions of the search space (a sketch of this idea follows this list).

Momentum-Based Approach:

●​ Keep moving in the same direction until a significant change is detected.
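The random-jump idea can be sketched as follows: when no neighbor improves (a plateau or local optimum), the search occasionally jumps to a random point and keeps the best solution seen so far. The toy objective, jump probability, and bounds below are assumptions for illustration only.

import random

def f(x):
    # Assumed toy objective: flat (value 0) for |x| > 2, single peak at x = 0
    return 0.0 if abs(x) > 2 else 4 - x * x

def hill_climb_with_jumps(max_steps=200, step_size=0.1, jump_prob=0.2):
    current = random.uniform(-10, 10)
    best = current
    for _ in range(max_steps):
        candidate = max([current - step_size, current + step_size], key=f)
        if f(candidate) > f(current):
            current = candidate                        # normal hill climbing move
        elif random.random() < jump_prob:
            current = random.uniform(-10, 10)          # random jump off the plateau
        if f(current) > f(best):
            best = current
    return best, f(best)

x, fx = hill_climb_with_jumps()
print(f"Best found: x = {x:.2f}, f(x) = {fx:.2f}")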

3. Ridge Problem

Problem:

●​ The algorithm struggles with high-dimensional problems where the best path to the
optimal solution requires moving diagonally, but the algorithm only considers direct
neighbors.​

Solution:

Use Multiple Neighboring Moves:

●​ Instead of moving in one direction at a time, allow diagonal moves and multi-step
exploration.​

Hybrid with Genetic Algorithms:

●​ Combine Hill Climbing with evolutionary methods to improve search efficiency.

Code:

graph = {
    'S': [('A', 9), ('B', 11)],
    'A': [('C', 7)],
    'B': [('D', 9), ('E', 2)],
    'C': [('F', 7), ('G', 7), ('H', 7)],
    'F': [('I', 7), ('J', 5)],
    'J': [('K', 0)],  # Goal Node
}
values = {
    'S': 10, 'A': 9, 'B': 11, 'C': 7, 'D': 9, 'E': 2,
    'F': 7, 'G': 7, 'H': 7, 'I': 7, 'J': 5, 'K': 0
}

def hill_climb(start):
    current = start
    print("Path:", end=" ")

    while True:
        print(f"Current Node: {current} (Value: {values[current]})")
        if current not in graph:
            print("No more children. Stopping.")
            break  # No children to explore

        neighbors = graph[current]
        best_choice = max(neighbors, key=lambda x: values[x[0]], default=(None, 0))

        if values[best_choice[0]] <= values[current]:
            print(f"Plateau reached at {current}. No better choice available.")
            break  # Plateau condition met

        print(f"Choosing {best_choice[0]} (Value: {values[best_choice[0]]}) because it's "
              f"the highest among {[f'{n[0]}({values[n[0]]})' for n in neighbors]}")
        current = best_choice[0]
    print("End")

hill_climb('S')

Output:

Genetic Algorithm:

Genetic Algorithm (GA) is an optimization technique inspired by the process of natural selection. It is used to solve complex problems by evolving solutions over generations.

Genetic Algorithm works in the following steps-

Step-01:

● Randomly generate a set of possible solutions to a problem.
● Represent each solution as a fixed-length character string.

Step-02:

Using a fitness function, test each possible solution against the problem to evaluate it.

Step-03:

● Keep the best solutions.
● Use the best solutions to generate new possible solutions.

Step-04:

Repeat the previous two steps until-

● Either an acceptable solution is found,
● Or the algorithm has completed a given number of cycles / generations (see the sketch below).
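Here is a compact, self-contained sketch of these four steps on a toy problem (maximizing the number of 1s in a bit string). The population size, string length, crossover point, and mutation rate are assumptions for illustration and differ from the experiment code later in this document.

import random

STRING_LEN, POP, GENS = 8, 6, 20

def fitness(chrom):
    # Toy fitness: number of 1-bits in the string
    return chrom.count('1')

def random_chrom():
    return ''.join(random.choice('01') for _ in range(STRING_LEN))

population = [random_chrom() for _ in range(POP)]               # Step-01: random solutions
for gen in range(GENS):                                         # Step-04: repeat for N cycles
    ranked = sorted(population, key=fitness, reverse=True)      # Step-02: evaluate fitness
    parents = ranked[:POP // 2]                                  # Step-03: keep the best
    children = []
    while len(children) < POP - len(parents):                    # Step-03: generate new solutions
        a, b = random.sample(parents, 2)
        cut = random.randint(1, STRING_LEN - 1)
        child = a[:cut] + b[cut:]                                 # one-point crossover
        if random.random() < 0.1:                                 # occasional bit-flip mutation
            i = random.randrange(STRING_LEN)
            child = child[:i] + ('1' if child[i] == '0' else '0') + child[i + 1:]
        children.append(child)
    population = parents + children
    if max(fitness(c) for c in population) == STRING_LEN:         # acceptable solution found
        break

print("Best chromosome:", max(population, key=fitness))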

Basic Operators-

The basic operators of Genetic Algorithm are-

1. Selection (Reproduction)-

● It is the first operator applied to the population.
● It selects chromosomes from the parent population to cross over and produce offspring.
● It is based on Darwin's principle of "survival of the fittest".

There are many techniques for the reproduction or selection operator, such as-

●​ Tournament selection
●​ Ranked position selection
●​ Steady state selection etc.

2. Cross Over-

● The population gets enriched with better individuals after the reproduction phase.
● Reproduction, however, only makes clones of good strings; it does not create new ones.
● The crossover operator is then applied to the mating pool to create new, potentially better strings.
● By recombining good individuals, the process is likely to create even better individuals.

3. Mutation-

● Mutation is a background operator.
● Mutation of a bit involves flipping it, changing a 0 to a 1 and vice versa.
● After crossover, the mutation operator subjects the strings to mutation.
● It facilitates a sudden change in a gene within a chromosome.
● Thus, it allows the algorithm to search for solutions far away from the current ones.
● It helps keep the search from becoming trapped in a local optimum.
● Its purpose is to prevent premature convergence and maintain diversity within the population (a short sketch of crossover and mutation follows this list).
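Since the experiment code below implements only the selection operator, here is a hedged sketch of one-point crossover and bit-flip mutation on 5-bit strings (the same representation the code uses). The cut point and mutation rate shown here are assumptions, not part of the experiment.

import random

def crossover(parent1, parent2):
    # One-point crossover: swap the tails after a random cut point
    point = random.randint(1, len(parent1) - 1)
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

def mutate(chromosome, rate=0.1):
    # Bit-flip mutation: each bit flips (0 <-> 1) with the given probability
    return ''.join(
        ('1' if bit == '0' else '0') if random.random() < rate else bit
        for bit in chromosome
    )

child1, child2 = crossover('11010', '00111')
print(child1, child2)      # e.g. '11011' and '00110' if the cut point is 4
print(mutate('11010'))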

Code:

import random

# Problem parameters
POP_SIZE = 5         # Population size
GENE_LENGTH = 5      # Length of binary representation (5 bits for numbers 0-31)
MUTATION_RATE = 0.1  # Probability of mutation
GENERATIONS = 5      # Number of generations

# Function to maximize: f(x) = x^2
def fitness_function(x):
    return x ** 2

# Generate random population (binary strings)
def generate_population():
    return [format(random.randint(0, 31), f'0{GENE_LENGTH}b') for _ in range(POP_SIZE)]

# Convert binary string to integer
def decode(binary_str):
    return int(binary_str, 2)

# Selection: Roulette Wheel Selection
def select(population):
    fitness_values = [fitness_function(decode(ind)) for ind in population]
    total_fitness = sum(fitness_values)
    probabilities = [f / total_fitness for f in fitness_values]
    expected_counts = [p * POP_SIZE for p in probabilities]
    actual_counts = [0] * POP_SIZE

    selected = random.choices(population, weights=probabilities, k=POP_SIZE)
    for sel in selected:
        actual_counts[population.index(sel)] += 1

    # Display table-like output
    print("\nSelection Table")
    print("{:<5} {:<10} {:<5} {:<8} {:<8} {:<8} {:<8} {:<8}".format(
        "ID", "Chromosome", "X", "Fitness", "Prob", "%Prob", "Exp Cnt", "Act Cnt"))
    print("-" * 75)
    for i in range(POP_SIZE):
        print("{:<5} {:<10} {:<5} {:<8} {:.4f} {:<8.2f} {:<8.4f} {:<8}".format(
            i + 1, population[i], decode(population[i]), fitness_values[i],
            probabilities[i], probabilities[i] * 100, expected_counts[i], actual_counts[i]))
    print("-" * 75)

    return selected

def genetic_algorithm():
    population = generate_population()
    print(f"Initial Population: {population} (Decoded: {[decode(ind) for ind in population]})")

    for generation in range(GENERATIONS):
        print(f"\nGeneration {generation + 1}")

        # Evaluate current population (also recomputed inside select for the table)
        fitness_values = [fitness_function(decode(ind)) for ind in population]

        # Selection Process
        new_population = select(population)

        # Replace population with new one
        population = new_population

        print(f"Best Individual: {max(population, key=lambda ind: fitness_function(decode(ind)))}")
        print("=" * 75)

genetic_algorithm()

Output:

Conclusion: Both Hill Climbing and Genetic Algorithm (GA) are optimization techniques,
but they differ in approach. Hill Climbing is a greedy local search algorithm that iteratively
improves a solution but can get stuck in local optima. In contrast, GA is a global search
algorithm inspired by evolution, using selection, crossover, and mutation to explore a broader
solution space, reducing the risk of local optima. While Hill Climbing is faster for simple
problems, GA is more robust for complex, multi-modal optimization tasks.
