DAA Report
Genetic Algorithms
Among the various metaheuristic techniques, genetic algorithms (GAs) are computational search
and optimization methods inspired by the principles of natural selection and evolutionary
biology. They are widely used to solve problems where traditional methods may struggle,
particularly in cases involving complex, multidimensional, or poorly understood solution spaces.
By emulating the processes of selection, reproduction, crossover, and mutation observed in
biological systems, GAs aim to find optimal or near-optimal solutions through iterative
improvement over successive generations.
The history of Genetic Algorithms (GAs) is rooted in the intersection of biology, computer
science, and mathematics, dating back to the mid-20th century. The foundational concepts of
GAs were inspired by Charles Darwin's theory of natural selection, which emphasizes survival of
the fittest. John Holland and his students formalized these ideas as a computational framework
in the 1960s and 1970s, and the biological principles have since been adapted into general
methods for solving optimization problems.
Genetic Algorithms (GAs) are optimization techniques inspired by biological evolution. They
solve problems by iteratively improving a population of candidate solutions using principles like
natural selection, crossover, and mutation. Below, we explain the key components, structure, and
terminology central to GAs.
1. Fitness Function
The fitness function evaluates how well a potential solution (or chromosome) solves the given
problem.
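A fitness function is ordinarily just a plain function from chromosome to score. As a minimal sketch (using a toy problem, maximizing f(x) = x² over 5-bit integers, chosen here for illustration and not taken from the report):

```python
# Toy fitness function: maximize f(x) = x^2, where a chromosome
# is a 5-bit binary string encoding an integer in [0, 31].
# (Illustrative problem chosen for this sketch.)
def fitness(chromosome: str) -> int:
    x = int(chromosome, 2)   # decode the bit string to an integer
    return x * x             # higher value = fitter solution

print(fitness("11111"))  # 31^2 = 961
```

The GA never needs to know how the score is computed; it only compares scores, which is why GAs apply to problems without gradient information.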
2. Population
The population is the set of candidate solutions (chromosomes) maintained in each generation;
its size is a key parameter of the algorithm.
3. Chromosomes
A chromosome encodes a single candidate solution as a string of genes (parameters).
Structure:
For a problem with N_par parameters, a chromosome might be encoded as:
chromosome = [p_1, p_2, ..., p_N_par]
where p_i represents a specific parameter.
Encoding: Chromosomes can be binary strings (e.g., 010101), real numbers,
permutations, or other formats depending on the problem.
Representation Flexibility: For binary solutions, parameters are concatenated into a
single bit string.
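The concatenated-bit-string representation can be sketched as follows; the choice of 4 bits per parameter is an illustrative assumption:

```python
# Decode a chromosome that concatenates N_par parameters into one
# bit string, assuming a fixed width of 4 bits per parameter
# (an illustrative choice, not prescribed by the report).
def decode(chromosome: str, bits_per_param: int = 4) -> list[int]:
    return [int(chromosome[i:i + bits_per_param], 2)
            for i in range(0, len(chromosome), bits_per_param)]

# Three 4-bit parameters packed into one 12-bit string:
print(decode("010110110001"))  # [5, 11, 1]
```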
4. Selection
Selection determines which chromosomes are chosen to reproduce, favoring those with higher
fitness.
5. Crossover (Recombination)
Mechanism:
Two parent chromosomes exchange parts of their structure to produce offspring.
o Example: Parent 1: 11010111001000, Parent 2: 01011101010010
o After a crossover at the 4th bit: Offspring 1: 11011101010010, Offspring 2: 01010111001000
Importance: Combines good traits from parents to explore new regions of the solution
space.
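A single-point crossover like the one above can be sketched in a few lines (the cut-point convention used here, keeping the first `point` bits of each parent and swapping the tails, matches the example):

```python
# Single-point crossover: keep each parent's head up to the cut
# point and swap the tails to form two offspring.
def crossover(p1: str, p2: str, point: int) -> tuple[str, str]:
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

o1, o2 = crossover("11010111001000", "01011101010010", 4)
print(o1)  # 11011101010010
print(o2)  # 01010111001000
```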
6. Mutation
Mutation applies small, random changes to genes in a chromosome, maintaining genetic
diversity in the population.
The overall GA cycle then proceeds through the following steps:
1. Initialization:
o The first generation is typically created randomly or based on heuristic rules.
o Each individual in the population (chromosome) represents a potential solution.
2. Fitness Evaluation:
o Each chromosome is evaluated using the fitness function to determine its quality
or suitability as a solution.
3. Selection:
o Chromosomes with higher fitness are more likely to be selected for reproduction,
ensuring that better solutions have a higher chance of passing their traits to the
next generation.
4. Crossover:
o Selected chromosomes exchange segments to create offspring, simulating
biological reproduction. This introduces new combinations of traits.
5. Mutation:
o Small, random changes are applied to some chromosomes to maintain genetic
diversity and prevent premature convergence.
6. Replacement:
o The new population replaces the old one, and the cycle repeats until a stopping
condition is met (e.g., number of generations, convergence to a solution).
The process aims to refine the population, increasing the overall fitness and driving the solutions
toward optimality.
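The six-step cycle can be sketched as one compact loop. The problem solved here ("OneMax": maximize the number of 1-bits in a chromosome) and all parameter values are illustrative choices, not from the report:

```python
import random

random.seed(0)
N_BITS, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 40, 0.02

def fitness(ind):                      # OneMax: count the 1-bits
    return sum(ind)

# 1. Initialization: a random population of bit-lists
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # 2-3. Fitness evaluation + selection (tournament of size 3)
    def select():
        return max(random.sample(pop, 3), key=fitness)
    new_pop = []
    while len(new_pop) < POP_SIZE:
        p1, p2 = select(), select()
        # 4. Crossover at a random point
        cut = random.randint(1, N_BITS - 1)
        child = p1[:cut] + p2[cut:]
        # 5. Mutation: flip each bit with small probability
        child = [b ^ 1 if random.random() < MUTATION_RATE else b for b in child]
        new_pop.append(child)
    # 6. Replacement: the new generation replaces the old
    pop = new_pop

best = max(pop, key=fitness)
print(fitness(best))  # approaches N_BITS as generations pass
```

The stopping condition here is a fixed generation count; a fitness threshold or convergence test could be substituted.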
Genotype and phenotype are concepts borrowed from biology to represent the internal encoding
of a solution and its external manifestation, respectively. For example, the bit string 10110 is a
genotype, while the decoded value it represents (here, the integer 22) is the phenotype.
A Genetic Algorithm (GA) is a search heuristic inspired by the process of natural selection. It is
used to find approximate solutions to optimization and search problems. Here's a detailed
explanation of the steps involved in how a genetic algorithm works:
1. Initialization
Purpose: To create an initial population of candidate solutions (chromosomes), typically
generated at random.
Example: If you're solving a traveling salesman problem (TSP), a chromosome could represent a
possible route through the cities.
2. Fitness Evaluation
Purpose: To evaluate how good each solution (chromosome) is in the current population.
Explanation: The fitness function is used to assign a fitness score to each individual in
the population based on how well it solves the problem. A higher fitness score indicates a
better solution.
o Fitness Function: This is problem-specific and tells us how good a solution is.
For example, in the TSP, the fitness function could calculate the total distance of
the route represented by the chromosome.
o Scaling: The fitness values may be scaled or normalized to ensure they are
comparable across the population.
Example: For a traveling salesman problem, the fitness could be the inverse of the total distance,
meaning that shorter paths have higher fitness.
3. Selection
Purpose: To select individuals from the population to form the mating pool, based on
their fitness scores.
Explanation: In the selection step, individuals are chosen for reproduction based on their
fitness. The better the fitness, the higher the chance an individual has of being selected.
o Roulette Wheel Selection: The probability of an individual being selected is
proportional to its fitness. Individuals with higher fitness scores will have a larger
slice of the "wheel" and thus a greater chance of being selected.
o Tournament Selection: A group of individuals is randomly selected, and the best
individual from this group is chosen.
o Rank Selection: Individuals are ranked according to their fitness, and selection is
based on their rank rather than absolute fitness.
Example: A chromosome with a fitness of 90% has a higher chance of being selected for
reproduction than one with a fitness of 50%.
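The roulette wheel and tournament schemes can be sketched as follows; the population and fitness values are illustrative:

```python
import random

random.seed(1)

# Roulette wheel selection: probability proportional to fitness.
def roulette(population, fitnesses):
    total = sum(fitnesses)
    r = random.uniform(0, total)       # spin the wheel
    acc = 0.0
    for ind, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return population[-1]

# Tournament selection: best individual from a random group of size k.
def tournament(population, fitnesses, k=3):
    group = random.sample(list(zip(population, fitnesses)), k)
    return max(group, key=lambda pair: pair[1])[0]

pop = ["11001", "10010", "01101"]      # illustrative chromosomes
fits = [90, 50, 70]                    # illustrative fitness scores
print(roulette(pop, fits), tournament(pop, fits))
```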
4. Crossover (Recombination)
Example: For binary representation, if Parent 1 = 110010 and Parent 2 = 101101, a crossover
at the 3rd bit could produce offspring as:
Offspring 1 = 110101
Offspring 2 = 101010
5. Mutation
Example: If an offspring chromosome is 110101 and a mutation is applied at the 2nd bit, the
resulting chromosome could be 100101.
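A bit-flip mutation at a chosen position can be sketched as follows (positions are 0-indexed here, so the "2nd bit" above is index 1):

```python
# Flip the bit at the given index of a binary-string chromosome.
def mutate(chromosome: str, index: int) -> str:
    flipped = "1" if chromosome[index] == "0" else "0"
    return chromosome[:index] + flipped + chromosome[index + 1:]

print(mutate("110101", 1))  # 100101
```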
6. Termination Criteria
Purpose: To stop the algorithm once a solution is found or a specific condition is met.
Explanation: Genetic algorithms typically iterate for a set number of generations or until
a certain threshold of fitness is reached. Some common termination criteria include:
o Maximum Generations: A predefined number of generations after which the
algorithm stops.
o Convergence: If the population reaches a state where the fitness no longer
improves, the algorithm might stop.
o Target Fitness: If a solution with a fitness above a certain threshold is found, the
algorithm terminates early.
The Knapsack Problem is a classical optimization problem, often used to illustrate the power of
genetic algorithms (GA) for solving complex problems.
1. Problem Statement:
You are given a set of items, each with a weight and a value, and a knapsack with a weight
capacity. The task is to determine which items to include in the knapsack so that the total value is
maximized without exceeding the knapsack's capacity.
Mathematical Formulation:
Let there be n items, where each item has a weight w_i and a value v_i.
The total weight capacity of the knapsack is W.
The goal is to select a subset of items that maximizes the total value V = Σ v_i x_i, where
x_i = 1 if item i is selected and x_i = 0 otherwise, subject to the constraint Σ w_i x_i ≤ W.
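The formulation can be checked directly on a small instance by enumerating all 2^n subsets; the item values, weights, and capacity below are hypothetical and used only for illustration:

```python
from itertools import product

# Hypothetical tiny instance (illustrative, not the one used later).
values  = [20, 5, 10, 12, 25]
weights = [10, 4, 8, 6, 12]
W = 30

best_value, best_x = 0, None
for x in product([0, 1], repeat=len(values)):          # all 2^n selections
    total_w = sum(w * xi for w, xi in zip(weights, x)) # Σ w_i x_i
    total_v = sum(v * xi for v, xi in zip(values, x))  # Σ v_i x_i
    if total_w <= W and total_v > best_value:          # feasible and better
        best_value, best_x = total_v, x

print(best_value, best_x)  # 57 (1, 0, 0, 1, 1)
```

Exhaustive enumeration is only feasible for small n; the exponential growth in subsets is exactly why a GA is attractive for larger instances.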
To solve the Knapsack problem using a genetic algorithm, we need to represent potential
solutions (individuals) as chromosomes. In the case of the knapsack problem, each chromosome
is a binary string of length n, where bit i is 1 if item i is included and 0 otherwise.
Let's go through the steps of solving the Knapsack problem using a genetic algorithm.
Step 1: Initialization (Create Initial Population)
Population Size: Choose the number of individuals in the population. Let’s assume the
population size is 6.
Chromosome Representation: Each individual in the population is a binary string
representing whether an item is selected or not.
EXAMPLE
Chromosome 1: 11001
Chromosome 2: 10010
Chromosome 3: 01101
Chromosome 4: 10110
Chromosome 5: 11100
Chromosome 6: 01011
Where each bit position corresponds to an item: a 1 means the item is included, a 0 means it is
excluded.
Step 2: Fitness Evaluation
Evaluate each chromosome by summing the values of the items it selects, for a knapsack
capacity of 50; chromosomes whose total weight exceeds the capacity receive zero (or heavily
penalized) fitness.
Step 3: Selection
Select individuals for reproduction based on their fitness using a selection method. Here,
we can use Roulette Wheel Selection or Tournament Selection.
For example, if Chromosome 1 has the highest fitness, it has a higher chance of being
selected for the next generation.
Step 4: Crossover
Perform crossover between two selected parent chromosomes. Typically, a random point
is selected, and the segments after that point are exchanged between the chromosomes.
Example:
o Parent 1: 11001
o Parent 2: 10010
o After crossover at point 3:
Offspring 1: 11010
Offspring 2: 10001
Step 5: Mutation
Flip individual bits of the offspring with a small probability to maintain genetic diversity.
Step 6: Replacement
Replace the old population with the new population generated from crossover and
mutation.
Repeat the process for a fixed number of generations or until the termination condition is
met (such as reaching a maximum fitness level).
Step 7: Termination
The algorithm terminates after a fixed number of generations or if the fitness reaches a
satisfactory value.
4. Fitness Function for the Knapsack Problem
As mentioned earlier, the fitness function for the Knapsack problem needs to penalize solutions
where the total weight exceeds the knapsack's capacity.
Chromosome: 11001
Selected items: 1, 2, and 5
Total value: 20 + 5 + 25 = 50
Total weight: 10 + 4 + 12 = 26
Fitness = 50 (since weight ≤ 50)
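The worked calculation can be reproduced in code. The values and weights of items 1, 2, and 5 come from the example above; the entries for items 3 and 4 are hypothetical placeholders (they do not affect this chromosome, whose bits for those items are 0):

```python
# Items 1, 2, 5 (values 20, 5, 25; weights 10, 4, 12) are from the
# worked example; items 3 and 4 are hypothetical placeholders.
values  = [20, 5, 15, 10, 25]
weights = [10, 4, 7, 9, 12]
CAPACITY = 50

def knapsack_fitness(chromosome: str) -> int:
    total_v = sum(v for v, bit in zip(values, chromosome) if bit == "1")
    total_w = sum(w for w, bit in zip(weights, chromosome) if bit == "1")
    return total_v if total_w <= CAPACITY else 0  # penalize infeasible solutions

print(knapsack_fitness("11001"))  # 50 (value 20+5+25, weight 26 <= 50)
```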
Over successive generations, this cycle raises the population's average fitness as better item
combinations survive and recombine.
Genetic Algorithms (GAs) are a class of optimization algorithms inspired by the principles of
natural evolution. They are used to solve complex problems where traditional methods may
struggle or fail. The primary reasons for using GAs include:
1. Exploration of Large Search Spaces: GAs can search large, complex spaces without
requiring specific knowledge of the problem's structure, unlike traditional methods.
2. Global Optimization: GAs are less likely to get stuck in local optima, making them
suitable for global optimization problems.
3. Adaptability: They can be used for various problem types, from combinatorial
optimization to machine learning model tuning.
4. Robustness: GAs can handle noisy, dynamic, and uncertain environments, making them
robust in real-world applications.
Advantages of GAs over Traditional Algorithms:
1. Flexibility:
o Traditional algorithms (e.g., gradient descent, dynamic programming) often
require a deep understanding of the problem's mathematical structure.
o GAs, on the other hand, do not require such detailed problem knowledge. They
are highly flexible and can be applied to a wide range of problems.
2. No Need for Gradient Information:
o Many traditional optimization methods require gradient or derivative information,
which may be unavailable or computationally expensive to calculate. GAs do not
require gradient information, making them suitable for problems where such
data is difficult to obtain.
3. Parallelism:
o GAs naturally lend themselves to parallelism as multiple solutions (individuals)
can be evaluated and evolved simultaneously. Traditional algorithms often
require sequential operations, which can be slower.
4. Handling Non-linear, Non-convex, and Multi-modal Problems:
o Traditional optimization techniques may struggle with complex, multi-modal
(having many local optima) or non-linear problems.
o GAs are effective at dealing with these problems, as they explore the search space
more globally and are less likely to get stuck in local minima.
Scalability and Flexibility in Comparison:
1. Scalability:
o GAs scale well with problem size and complexity. They can be applied to
problems with very large search spaces and many variables, and their
performance remains relatively good as the problem size increases. This is
particularly advantageous for real-world optimization problems with thousands or
millions of variables, such as in machine learning or design optimization.
o Traditional algorithms may face challenges when scaling up, particularly with
combinatorial problems (e.g., traveling salesman, knapsack problem), where the
number of possible solutions grows exponentially.
2. Flexibility:
o GAs are highly flexible and can be adapted to different types of problems with
minimal modification. This includes problems involving continuous or discrete
optimization, multi-objective optimization, and even problems with constraints.
o Traditional algorithms are often tailored for specific problem types (e.g., linear
programming for linear problems, or gradient descent for differentiable
functions), which makes them less flexible.
The efficiency of Genetic Algorithms (GAs) can be analyzed in terms of time complexity and
space complexity. While GAs are often not as efficient as traditional algorithms in terms of raw
computational time, their global optimization capabilities and flexibility in handling complex
problems often make them more suitable for certain tasks. Let's look at their time and space
complexities and how they are analyzed in the context of real-world problems.
Time Complexity of Genetic Algorithms:
Each generation requires evaluating the fitness of, and applying genetic operators to, every
individual, which takes O(P × n), where P is the population size and n is the length of an
individual (e.g., the number of genes in a chromosome).
If the GA runs for G generations, the overall time complexity is:
O(G × P × n)
In real-world applications, P, G, and n are typically adjusted based on the complexity of the
problem and available computational resources.
Example:
For an optimization problem with a population size of 1000 individuals, running for 100
generations with individual solutions of length 50 (n = 50), the algorithm performs on the order
of 100 × 1000 × 50 = 5,000,000 basic operations. This is manageable for many problems but can
still be computationally expensive, depending on the cost of the fitness function.
Space Complexity of Genetic Algorithms:
Space complexity refers to the amount of memory used by the algorithm. The main contributors
to space complexity are:
1. Population Storage:
o The space required to store the population of individuals, where each individual
represents a potential solution. The space complexity for storing the population is
O(P × n), where P is the population size and n is the length of each individual.
2. Auxiliary Data Structures:
o GAs may use additional data structures such as temporary arrays for crossover
and mutation operations, as well as fitness scores and selection probabilities.
These add some additional memory requirements but are usually proportional to
the population size, i.e., O(P).
Thus, the overall space complexity is dominated by the space needed to store the population,
and can be expressed as:
O(P × n)
When applying Genetic Algorithms to real-world problems, efficiency often comes down to how
well the algorithm can balance exploration and exploitation of the solution space, and how well it
adapts to the complexity of the problem. Here are some key points to consider:
1. Convergence Speed:
o While GAs can be computationally intensive, they are often more efficient than
traditional optimization algorithms in exploring a large, complex search space.
For example, problems with multiple local optima (e.g., traveling salesman
problem, job-shop scheduling) can benefit from GAs because they are less likely
to get trapped in local minima and instead search for global solutions.
2. Scalability:
o In larger, more complex real-world problems (e.g., machine learning, robotics),
the scalability of GAs is an advantage. By adjusting population size, crossover,
and mutation rates, GAs can be tailored to optimize complex systems, even when
the number of variables is large. This adaptability is a major strength over
traditional methods.
3. Parallelism:
o Many GA operations (like fitness evaluation) can be performed in parallel. This is
an advantage in real-world applications where computational resources (e.g.,
multi-core processors) are available. Parallelism can dramatically improve the
efficiency of GAs in solving real-time or large-scale problems, such as
optimizing parameters in deep learning models or scheduling problems.
4. Handling Constraints:
o Real-world problems often come with a variety of constraints (e.g., resource
constraints, budget limits). GAs can be adapted to handle these constraints
efficiently, using penalty functions or constraint-handling techniques, which
allows them to work well in constrained optimization problems.
5. Exploration vs Exploitation:
o GAs strike a balance between exploration (searching new areas of the solution
space) and exploitation (focusing on areas that have already produced good
solutions). In many real-world problems, especially those with many local optima,
this balance helps GAs outperform other optimization methods.
Genetic Algorithms (GAs) are widely used in real-world optimization problems because they
excel in finding near-optimal solutions in large and complex search spaces. Below are several
important areas where GAs are applied, along with examples of specific problems and how GAs
help solve them.
1. Optimization Problems
Knapsack Problem
Problem: The knapsack problem involves selecting a subset of items with given weights
and values to maximize the total value while staying within a weight limit.
GA Application: GAs are used to evolve solutions by encoding possible combinations of
items (genes) and using crossover and mutation to find the best set of items that
maximize value within the weight constraint.
Example:
A warehouse wants to determine which items to load into a truck such that the total value
is maximized without exceeding the weight capacity of the truck. Using a GA, multiple
combinations of items can be evaluated iteratively, with the best solutions evolving over
generations.
Job Shop Scheduling
Problem: The job shop scheduling problem involves scheduling jobs on machines with
constraints, such as time and machine capacity, to minimize total processing time or
costs.
GA Application: A GA can represent each solution as a chromosome encoding the order
of jobs and machines. Through crossover and mutation, the algorithm explores various
job and machine combinations to find an optimal schedule.
Example:
In a manufacturing plant, multiple machines need to process a set of tasks with varying
durations. A GA can help optimize the machine schedules to minimize the total
production time, considering resource constraints and job dependencies.
2. Machine Learning
Feature Selection
Problem: In machine learning, feature selection aims to identify the most relevant
features from a large dataset to improve the performance of models by reducing
dimensionality and avoiding overfitting.
GA Application: GAs are used to search through subsets of features by encoding a
selection of features into chromosomes. Fitness is evaluated by training and testing a
machine learning model (e.g., classification accuracy) using the selected features.
Example:
A dataset containing hundreds of features needs to be reduced to a smaller, more
informative set for a classification task. A GA helps by evaluating different combinations
of features and selecting the subset that provides the best accuracy for the classification
algorithm.
3. Real-World Examples
Vehicle Routing
Problem: Routing problems involve finding the most efficient route for vehicles, which
could involve multiple destinations and constraints (e.g., time windows, delivery
capacities).
GA Application: The GA encodes possible routes and evaluates them based on distance,
time, or cost. By evolving these routes over multiple generations, GAs can find near-
optimal solutions.
Example:
A logistics company that needs to deliver goods to multiple locations uses a GA to
optimize the delivery route. The algorithm considers factors like delivery time windows,
traffic conditions, and fuel consumption to determine the best route that minimizes costs.
Disaster Management
Problem: In disaster management, resources (e.g., medical supplies, rescue teams) need
to be allocated to affected areas to maximize the impact of relief efforts while considering
constraints such as transportation capacity, response times, and priority areas.
GA Application: GAs can model resource allocation and routing to optimize the delivery
of aid to different regions. Solutions are encoded as individuals representing different
resource distribution plans.
Example:
After a natural disaster like an earthquake, relief teams use GAs to optimize the
allocation of resources (e.g., food, water, medical aid) and personnel to the affected areas,
ensuring that help arrives quickly where it is most needed.
Resource Allocation
Example:
In project management, a company wants to allocate its resources (employees,
machinery) to different tasks in a way that minimizes the total cost or time. A GA can be
used to optimize the allocation of these resources based on constraints like project
deadlines, budget limits, and task dependencies.
Genetic Algorithms vs. Simulated Annealing (SA)
Basic Principle:
GA: Genetic Algorithms are based on the principles of natural selection and genetics,
simulating the process of evolution. Solutions are represented as chromosomes, and
genetic operations such as crossover (recombination), mutation, and selection are applied
iteratively to evolve the population towards better solutions.
SA: Simulated Annealing mimics the physical process of cooling a material (annealing)
to find a global optimum. It starts with a high "temperature" (high acceptance of
solutions) and gradually decreases it (lowering the acceptance of new solutions), allowing
the algorithm to escape local minima early and converge to an optimal solution.
Exploration vs. Exploitation:
GA: GAs maintain a balance between exploration (searching through the space) and
exploitation (refining the solutions). This is done by using crossover and mutation, which
allows GAs to explore multiple solutions in parallel.
SA: SA tends to explore the solution space globally at the beginning, but over time, it
shifts towards exploitation. As the temperature decreases, the acceptance of worse
solutions decreases, focusing on local optimization.
Convergence:
GA: GAs can converge prematurely if there is insufficient diversity in the population or
if mutation rates are too low. However, with proper parameters, GAs can converge to a
near-optimal solution.
SA: SA is less prone to premature convergence since it allows occasional uphill moves
(worse solutions), which can help avoid local minima. However, it may take longer to
converge to an optimal solution, depending on the cooling schedule.
Parameter Tuning:
GA: GAs require careful tuning of parameters such as population size, crossover rate,
mutation rate, and selection pressure.
SA: SA requires tuning of the temperature schedule, which controls how the temperature
decreases over time.
Advantages:
GA:
o Can solve complex, multi-modal problems.
o Suitable for combinatorial and continuous optimization.
o Robust with respect to noise and uncertainty.
SA:
o More efficient than GAs in single-solution-based problems.
o Can escape local minima due to its probabilistic acceptance of worse solutions.
o Simple to implement and computationally inexpensive.
Disadvantages:
GA:
o May require more computation due to its population-based approach.
o Needs good diversity maintenance to avoid premature convergence.
SA:
o Can be slow to converge if the cooling schedule is not carefully chosen.
o Less suitable for multi-modal or large-scale problems compared to GAs.
Example Applications:
GA: Optimizing a vehicle routing problem (multiple solutions can be processed at once).
SA: Optimizing a single-component design or a simple function with many local optima
(e.g., temperature control in material processing).
Genetic Algorithms vs. Particle Swarm Optimization (PSO)
Basic Principle:
GA: Genetic Algorithms use the principles of natural evolution to search for optimal
solutions. A population of candidate solutions is evolved over generations using genetic
operators (selection, crossover, mutation).
PSO: Particle Swarm Optimization simulates the social behavior of birds flocking or fish
schooling. Each particle (candidate solution) adjusts its position based on its previous
best position and the best position found by its neighbors, converging towards an optimal
solution.
Exploration vs. Exploitation:
GA: GAs are strong at exploring the solution space because of crossover and mutation,
which are designed to create new, diverse solutions. However, they can struggle with
exploitation if diversity is not properly managed.
PSO: PSO is often better at exploiting the best solutions it finds since particles move
towards the global best position. However, this can lead to premature convergence if the
particles are too similar.
Convergence:
GA: GAs may converge to an optimal or near-optimal solution, but they are susceptible
to stagnation if the population diversity is lost.
PSO: PSO converges quickly compared to GAs due to its more direct approach of
updating positions based on historical and global best positions. However, like GAs, it
may get stuck in local optima without careful parameter tuning.
Parameter Tuning:
GA: Requires careful tuning of population size, crossover rate, mutation rate, and other
parameters.
PSO: Requires tuning of parameters such as the cognitive (personal best) and social
(global best) factors, along with the inertia weight, which determines the trade-off
between exploration and exploitation.
Advantages:
GA:
o Suitable for both combinatorial and continuous optimization problems.
o Good for complex and multi-modal optimization problems.
o Can be applied in parallel, processing multiple solutions at once.
PSO:
o Computationally efficient, with less parameter tuning required compared to GAs.
o Can converge faster to a global optimum in continuous problems.
o Simple to implement and easy to understand.
Disadvantages:
GA:
o Population-based methods can require more computational resources.
o Tuning of parameters can be complex and time-consuming.
PSO:
o Tends to struggle with combinatorial optimization problems.
o May converge prematurely, especially in high-dimensional spaces.
Example Applications:
GA: Solving optimization problems in robotics, circuit design, and feature selection.
PSO: Applied to continuous function optimization, parameter tuning in machine learning
models, and optimization of network design.
While Genetic Algorithms (GAs) have been successfully applied to a wide range of optimization
problems, they come with their own set of challenges and limitations. Here are the key
challenges and limitations of GAs:
1. Premature Convergence
Issue: One of the primary challenges with GAs is premature convergence. This happens
when the population becomes too similar (low diversity) in early generations, leading the
algorithm to converge to a suboptimal solution before exploring the entire search space.
Cause: This typically occurs when the selection pressure is too high or when crossover
and mutation rates are not balanced properly.
Solution: Maintaining diversity in the population is crucial. Techniques like crowding,
fitness sharing, and adaptive mutation rates can help mitigate premature convergence.
2. Computational Cost
Issue: GAs are computationally expensive, especially for large-scale problems. Since
GAs work with populations of solutions, the number of fitness evaluations increases with
both the population size and the number of generations.
Cause: The algorithm performs multiple fitness evaluations in each generation, making it
less efficient for problems that require large amounts of computation or have a large
search space.
Solution: Optimizing the GA’s parameters (such as population size, selection rate, and
generation count) and using parallelization techniques can help reduce computational
demands.
3. Selection of Parameters
Issue: GA performance depends heavily on parameter choices such as population size,
crossover rate, and mutation rate; poor settings lead to slow or premature convergence,
and there are no universal rules for choosing them.
4. No Guarantee of Global Optimality
Issue: GAs are heuristic search methods, and there is no guarantee that they will find the
global optimum solution, especially for complex, multi-modal, or poorly defined
problems.
Cause: The GA's exploration of the search space is probabilistic, which means it can get
trapped in local optima, particularly when the search space is rugged or has many peaks
and valleys.
Solution: Using hybrid approaches (combining GAs with other algorithms like Simulated
Annealing or Local Search) can improve the chances of finding better solutions and avoid
local minima.
5. Sensitivity to Encoding
Issue: The choice of how the problem is encoded into chromosomes (the representation
of solutions) has a significant impact on the performance of a GA.
Cause: Poor encoding can lead to loss of solution quality, slow convergence, or difficulty
in applying genetic operators like crossover and mutation.
Solution: Careful consideration of the problem representation and choosing the
appropriate encoding scheme (binary, real-valued, or others) is essential for GA success.
6. Lack of Theoretical Foundation
Issue: While GAs have been empirically successful in many areas, they lack a strong
theoretical foundation compared to other optimization techniques.
Cause: The evolutionary processes involved in GAs are probabilistic, and understanding
how they will perform in different situations is often difficult.
Solution: Ongoing research in evolutionary computation and hybrid models aims to build
better theoretical frameworks for GAs and improve their understanding and applicability.
7. Risk of Overfitting
Issue: In cases where GAs are used for machine learning or data-driven applications,
there is a risk of overfitting the model to the training data.
Cause: The GA may converge to solutions that fit the training data too well but fail to
generalize effectively to unseen data.
Solution: Regularization techniques, cross-validation, and careful control of the search
process can help reduce the risk of overfitting.
8. Handling Constraints
Issue: GAs may struggle when the optimization problem includes complex constraints,
especially if the constraints are not handled efficiently.
Cause: GAs typically work on a population of candidate solutions, and if the solution
space is constrained, they may end up with infeasible solutions.
Solution: Special techniques like penalty functions, repair methods, or using constraint-
handling techniques can help address constraints effectively in GA-based approaches.
Conclusion
Genetic Algorithms (GAs) are a versatile and powerful optimization technique inspired by
natural evolution. They are particularly effective for solving complex, non-linear problems
where traditional methods may fall short. By iterating through generations with processes like
selection, crossover, and mutation, GAs can evolve solutions toward optimal or near-optimal
outcomes.
Despite challenges such as premature convergence and computational complexity, GAs are
widely used in real-world applications like machine learning, optimization, scheduling, and
resource allocation. Their scalability and flexibility make them a valuable tool for solving
problems in diverse fields.
While not always the best option, GAs continue to be an essential tool for tackling complex
optimization problems, with ongoing improvements in their efficiency and performance.