
(IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 15, No. 11, 2024

Optimizing Energy Efficient Cloud Architectures for Edge Computing: A Comprehensive Review

TA Gamage*, Indika Perera
Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka

Abstract—Nowadays, edge computing and cloud computing are increasingly combined to produce computing solutions that are more effective, scalable, and adaptable. The proliferation of cloud infrastructures has drastically increased energy consumption, leading to the need for more research on optimizing energy efficiency for sustainable and efficient systems with reduced operational costs. In addition, the edge computing paradigm has gained wide attention during the last few decades due to the rise of Internet of Things (IoT) devices, the emergence of applications that require low latency, and the widespread demand for environmentally friendly computing. Moreover, lowering the energy footprint of cloud-edge systems is essential for fostering sustainability in light of growing concerns about environmental effects. This research presents a comprehensive review of strategies aimed at optimizing energy efficiency in cloud architectures designed for edge computing environments. Various strategies, including workload optimization, resource allocation, virtualization technologies, and adaptive scaling methods, have been identified as techniques widely utilized by contemporary research to reduce energy consumption while maintaining high performance. Furthermore, the paper investigates how advancements in machine learning and AI can be leveraged to dynamically manage resource distribution and energy-efficient enhancements in cloud-edge systems. In addition, challenges to the approaches for energy optimization are discussed in detail to provide further insights for future research. The conducted comprehensive review provides valuable insights for future research in the edge computing paradigm, particularly emphasizing the critical importance of enhancing energy efficiency in these systems.

Keywords—Cloud computing; edge computing; energy efficiency; sustainability

I. INTRODUCTION

Cloud computing is a concept for providing computing resources such as servers, storage, databases, networking, software, and analytics over the internet. Users can obtain physical infrastructure and data centers on demand from cloud service providers, negating the need to own and manage them and providing more flexibility, scalability, and cost-efficiency. Without the upfront expenses and hassles of maintaining traditional IT systems, cloud computing allows businesses and individuals to scale their IT needs as required, covering everything from data storage to sophisticated application development [1], [2], [3].

The adaptability of cloud technology is one of its most intriguing features. Businesses can simply reallocate their computing resources to meet changing demands, allowing for ongoing scaling in response to changing business requirements.

Global accessibility is yet another fundamental benefit of cloud computing. Moreover, cloud services reduce geographical barriers, enabling real-time collaboration across several regions and supporting the delivery of services to a global clientele with less idle time [2].

Edge computing is a distributed computing paradigm that moves data storage and computation closer to the point of demand, which is generally at the network's edge, close to the data-generating source. Edge computing reduces latency, bandwidth utilization, and reaction times by processing data locally on IoT sensors, gateways, or edge servers rather than depending on a centralized cloud [4], [5]. This method is more efficient for time-sensitive applications where rapid decisions are essential, such as real-time analytics, industrial automation, and driverless cars. Edge computing improves performance, security, and dependability in a variety of applications through the use of decentralized computing [6], [7].

Cloud computing has revolutionized the way businesses and organizations operate, offering scalable, on-demand computing resources. Nevertheless, the growth of cloud infrastructures has increased their energy consumption, which has become a major concern. Therefore, the demand for energy-efficient cloud architectures has become a critical area of research due to environmental concerns, operational costs, and the need for sustainability. In addition, the proliferation of edge computing, where data is processed closer to the data source to reduce latency and bandwidth usage, adds another dimension to cloud energy optimization [8], [9].

The rapid growth of energy consumption by edge-cloud computing infrastructures has been a significant focus of contemporary research, considering the operational costs and environmental concerns and aiming at the sustainability of cloud computing architectures. Among the approaches for a sustainable edge computing paradigm, dynamic resource allocation, AI-driven energy optimization, energy-aware scheduling and load balancing, green data centres, and virtualization and containerization are acquiring significant focus from researchers [10], [11], [12], [13].

Nevertheless, the exponential growth of cloud computing infrastructures has resulted in an inflated demand for further research on optimizing energy-efficient cloud architectures in order to move towards a sustainable edge computing paradigm. In addition, maintaining performance and scalability while reducing energy consumption has also been a controversial topic that is further researched [14], [15]. Therefore, further research that enhances green cloud architectures for edge

The open-access publication fee of this paper is supported by the SRC publication fee scheme of the University of Moratuwa.


computing is required for the improved sustainability of the cloud computing paradigm.

The rest of the paper is laid out as follows. Section II discusses approaches that have been widely considered to achieve sustainability in the edge computing paradigm, focusing on energy efficiency, while Section III broadly explores recent related work that makes use of different strategies for energy efficiency in the edge computing paradigm. Section IV provides a discussion of the conducted review work along with challenges and further insights into energy-efficient approaches in the edge computing domain. Finally, Section V concludes the research findings along with future directions for the conducted research work.

II. STRATEGIES FOR ENERGY OPTIMIZATION

The following strategies were identified as the main approaches that are widely focused on in research aiming at energy optimization in edge computing. Nevertheless, the techniques utilized for edge computing, and even in the cloud computing paradigm, in contemporary research are typically a combination of the approaches discussed below.

A. Energy-Aware Scheduling and Load Balancing

Energy-aware scheduling and load balancing in edge computing aim to optimize resource allocation while reducing energy consumption. In the context of an edge environment, resources are distributed across a number of small, geographically separated devices, requiring efficient scheduling algorithms to manage tasks without overloading nodes or consuming excessive amounts of energy [16], [17]. Energy-aware scheduling approaches prioritize tasks based on their energy demands and resource requirements, ensuring that high-priority tasks are executed on energy-efficient nodes while low-priority tasks are deferred to less critical resources. By optimizing task scheduling, systems can minimize energy wastage, extend device lifespans, and ensure seamless service delivery [18], [19].

Load balancing complements energy-aware scheduling by distributing computational workloads evenly across available nodes to prevent resource overuse and reduce energy consumption. In edge computing, load balancing techniques must consider not only the computational capacity of each node but also its current energy state. Dynamic load balancing ensures that tasks are allocated to nodes with sufficient energy reserves, minimizing the risk of node failure due to energy depletion. These approaches improve the efficiency and sustainability of edge computing setups, enabling them to support more applications and services without overwhelming the energy resources of distributed nodes [20], [21].
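As a rough illustration of energy-aware scheduling combined with load balancing, the following sketch assigns each task to the node with the lowest estimated energy cost among nodes whose remaining battery stays above a reserve threshold, and defers tasks that cannot be placed. The node model, cost estimate, and threshold are illustrative assumptions, not a scheme proposed in the reviewed papers.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    battery_wh: float          # remaining energy reserve
    joules_per_mcycle: float   # energy cost per mega-cycle on this node
    load_mcycles: float = 0.0  # work already queued on the node

@dataclass
class Task:
    name: str
    mcycles: float             # computational demand
    priority: int = 0          # higher value = more urgent

RESERVE_WH = 5.0               # never drain a node below this reserve

def schedule(tasks: list[Task], nodes: list[Node]) -> dict[str, str]:
    """Greedy energy-aware placement: cheapest feasible node per task."""
    placement = {}
    # serve high-priority tasks first
    for task in sorted(tasks, key=lambda t: -t.priority):
        feasible = []
        for node in nodes:
            energy_wh = task.mcycles * node.joules_per_mcycle / 3600.0
            if node.battery_wh - energy_wh >= RESERVE_WH:
                # penalize already-loaded nodes to keep the load balanced
                score = energy_wh * (1.0 + node.load_mcycles / 1000.0)
                feasible.append((score, energy_wh, node))
        if not feasible:
            placement[task.name] = "deferred"
            continue
        _, energy_wh, best = min(feasible, key=lambda x: x[0])
        best.battery_wh -= energy_wh
        best.load_mcycles += task.mcycles
        placement[task.name] = best.name
    return placement

if __name__ == "__main__":
    nodes = [Node("edge-1", 20.0, 0.8), Node("edge-2", 50.0, 1.2)]
    tasks = [Task("sensor-fusion", 900, priority=2), Task("log-rotation", 400)]
    print(schedule(tasks, nodes))
```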
B. AI-Driven Energy Optimization

AI-driven energy optimization approaches utilized in edge computing incorporate artificial intelligence strategies to improve the energy efficiency of edge networks and devices, which are frequently decentralized and resource-constrained. AI algorithms, in particular deep learning and machine learning, are able to track and forecast edge device energy consumption and task trends [22]. By evaluating real-time data, AI systems are able to make decisions about how best to assign jobs to nodes, modify power settings, and dynamically manage resource utilization based on needs. With this real-time optimization, energy conservation is achieved without compromising the necessary performance and service levels.

AI can also provide intelligent load balancing and scheduling, which optimizes the energy consumption of edge computing systems. AI models, for instance, can anticipate high-load periods and move workloads to nodes that use less energy or have unused resources. AI algorithms can also allow edge devices to transition into low-power modes while they are inactive or when processing demands drop, which cuts down unnecessary energy consumption. By continually learning from and reacting to the system, AI-driven energy optimization may dramatically increase the lifespan of edge devices, slash operating costs, and lessen the environmental effect of edge computing infrastructures [23], [24].
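The idea of predicting demand and switching idle nodes into a low-power mode can be illustrated with a very simple forecaster. The sketch below uses a moving-average prediction and a hypothetical low-power threshold; real deployments described in the literature rely on far richer ML models, so this is only a minimal, assumption-laden example.

```python
from collections import deque

WINDOW = 12                   # number of recent samples used for the forecast
LOW_POWER_THRESHOLD = 0.15    # predicted utilization below which we sleep the node

class NodePowerManager:
    """Toy predictive power manager for a single edge node."""

    def __init__(self):
        self.history = deque(maxlen=WINDOW)
        self.low_power = False

    def record(self, utilization: float) -> None:
        """Store the latest CPU/network utilization sample (0.0 - 1.0)."""
        self.history.append(utilization)

    def predicted_load(self) -> float:
        """Moving-average forecast; a stand-in for an ML predictor."""
        if not self.history:
            return 1.0  # be conservative until we have data
        return sum(self.history) / len(self.history)

    def decide(self) -> str:
        """Enter or leave low-power mode based on the forecast."""
        forecast = self.predicted_load()
        if forecast < LOW_POWER_THRESHOLD:
            self.low_power = True
            return "enter-low-power"
        self.low_power = False
        return "stay-active"

if __name__ == "__main__":
    mgr = NodePowerManager()
    for sample in [0.30, 0.22, 0.10, 0.08, 0.05, 0.04]:
        mgr.record(sample)
    print(mgr.predicted_load(), mgr.decide())
```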
C. Virtualization and Containerization

Virtualization and containerization play a significant role in improving the resource usage, flexibility, and scalability of edge networks. Several Virtual Machines (VMs) can operate on a single physical server owing to virtualization, which makes it possible to abstract hardware resources. This makes it possible to consolidate edge resources and execute various applications or services in separate environments, making the best use of the hardware that is available. Virtualization assists in the management of various workloads and offers the flexibility to dynamically scale resources in response to variations in demand in edge computing [25], [26]. In addition, it makes it possible to handle software upgrades and system maintenance effectively without interfering with ongoing services, which in turn increases system reliability.

Nevertheless, containerization, which bundles applications and their dependencies into containers, is a less complex substitute for virtualization. Since containers share the host operating system kernel, they are more efficient than VMs in terms of startup times and overhead. Moreover, containerization facilitates the quick deployment and scalability of applications among dispersed nodes in edge computing. Due to its great degree of portability, moving applications between various edge devices or contexts is much simpler. This is especially critical in edge environments, which are dynamic and heterogeneous and have a wide range of device capabilities [27], [28], [29]. Therefore, it can be stated that containerization is a good fit for the changing needs of edge computing, since it allows for quicker application development cycles, better system responsiveness, and more effective resource management.
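One practical way to keep containerized edge workloads within an energy and resource budget is to cap their CPU and memory at deployment time. The sketch below assumes the Docker Python SDK (docker-py) and a local container runtime are available; the image name and limits are placeholders, not values from the reviewed studies.

```python
import docker  # pip install docker; assumes a reachable Docker daemon

def run_capped_workload(image: str, command: str) -> str:
    """Start a container with CPU and memory limits to bound its resource draw."""
    client = docker.from_env()
    container = client.containers.run(
        image,
        command,
        detach=True,
        mem_limit="256m",          # hard memory cap
        nano_cpus=500_000_000,     # 0.5 CPU core (units of 1e-9 CPUs)
        restart_policy={"Name": "on-failure", "MaximumRetryCount": 3},
    )
    return container.id

if __name__ == "__main__":
    # Hypothetical analytics job; replace with an image available locally.
    print(run_capped_workload("python:3.11-slim", "python -c 'print(42)'"))
```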
D. Network Optimization

Network optimization in edge computing primarily aims to reduce latency, enhance bandwidth efficiency, and improve overall performance by bringing data processing and storage closer to end devices, such as IoT sensors or mobile users. By decentralizing computing power, edge computing reduces the amount of data that needs to be sent to central cloud data centers, minimizing delays in data transmission [30], [31]. Techniques like adaptive routing allow the network to choose the optimal path for data, considering real-time conditions such as congestion, which improves response times [32].


Moreover, Dynamic Voltage and Frequency Scaling (DVFS) is another approach to network optimization for energy efficiency: it reduces energy consumption when full performance is not needed by adjusting the power consumption of networking devices depending on real-time demands [33], [34]. Network Function Virtualization (NFV) is also widely researched as an efficient approach to network optimization for energy efficiency in edge computing, as it enables the operation of several network services on a single physical server, which eliminates the need for specialized hardware [35]. Adaptive transmission power control [36], [37] and the application of energy-efficient communication protocols [38], such as the enhanced energy efficiency features of 5G [39], are other approaches through which wireless networks can reduce overall power consumption. Therefore, by following the network optimization techniques discussed above, the energy footprint of networks can be lowered significantly without sacrificing performance.
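As a simplified picture of the DVFS decision itself, the sketch below picks the lowest frequency step that still covers the observed utilization plus a safety margin. The frequency table and margin are invented for illustration; on a Linux host the chosen value could be applied through the cpufreq interface (for example with the userspace governor), which this sketch deliberately does not do.

```python
# Available frequency steps in MHz (illustrative values, lowest to highest).
FREQ_STEPS_MHZ = [600, 1200, 1800, 2400]
SAFETY_MARGIN = 0.2  # keep 20% headroom above the measured demand

def pick_frequency(utilization: float, current_mhz: int) -> int:
    """Return the lowest frequency step that satisfies demand with headroom.

    `utilization` is the busy fraction at the *current* frequency,
    so the absolute demand is utilization * current_mhz.
    """
    demand_mhz = utilization * current_mhz * (1.0 + SAFETY_MARGIN)
    for step in FREQ_STEPS_MHZ:
        if step >= demand_mhz:
            return step
    return FREQ_STEPS_MHZ[-1]  # saturate at the maximum step

if __name__ == "__main__":
    # A lightly loaded device running at 2400 MHz can drop to 1200 MHz.
    print(pick_frequency(utilization=0.35, current_mhz=2400))
```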
E. Green Data Centers

Edge data centers are typically smaller and distributed, which makes traditional data center energy efficiency techniques less applicable. By utilizing sustainable methods such as renewable energy sources, cutting-edge cooling systems, and energy-efficient hardware, green data center designs aim to reduce the environmental impact of data centers [40], [41]. The transition towards greener operations decreases carbon footprints and operational expenses, making these data centers a crucial element of sustainable IT infrastructure.

In contemporary research, besides the shift towards renewable energy sources in data centers, which assists in reducing dependency on the grid, automation techniques, low-power electrical equipment, and devices with extended thermal limits have also been incorporated in order to improve sustainability and to move towards green data centers [42]. In addition, thermal energy harvesting, kinetic energy, and hybrid systems have also been noted in data centers to improve resilience and contribute to more sustainable and energy-efficient infrastructures.
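A common yardstick for how green a data center's operation is, is Power Usage Effectiveness (PUE): the ratio of total facility energy to the energy consumed by IT equipment alone, with an ideal value of 1.0. The numbers in the sketch below are purely illustrative.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    # Example: 1,580 kWh drawn by the whole facility for 1,000 kWh of IT load.
    print(round(pue(1580.0, 1000.0), 2))  # -> 1.58; closer to 1.0 is better
```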
III. RECENT WORK

The authors in study [43] have conducted a comprehensive analysis of energy consumption across various cloud-related architectures, including cloud, fog, and edge computing. It introduces a taxonomy that categorizes these architectures based on their characteristics, such as the number and role of data centers and their connectivity. The authors propose a generic energy model that accurately estimates and compares the energy consumption of these infrastructures, taking into account factors like cooling systems and network devices. The findings of this research work indicate that fully distributed architectures can consume significantly less energy, between 14% and 25%, compared to centralized and partly distributed architectures, highlighting the importance of energy efficiency in the design and deployment of modern computing solutions. In summary, the study aims to provide a foundational framework for future research in energy consumption analysis within the evolving landscape of cloud computing technologies. However, while the proposed model effectively highlights architectural differences and provides insights into energy efficiency, it may benefit from more consideration of real-world variables, such as the uneven distribution of end users and the impact of varying application workloads on energy consumption. In addition, the reliance on existing simulators, which have their limitations, could affect the accuracy of the results.

Another group of researchers in study [13] have presented a novel framework that employs Deep Reinforcement Learning (DRL) to optimize workflow scheduling in edge-cloud computing environments, specifically targeting the challenges introduced by the proliferation of IoT devices. Traditional cloud architectures often struggle with the demands of IoT applications due to issues like high latency and limited bandwidth. This study aims to address these challenges by balancing the conflicting objectives of minimizing energy consumption and execution time while ensuring that workflow deadlines are met. The proposed DRL technique demonstrates significant improvements over baseline algorithms, achieving 56% better energy efficiency and 46% faster execution times. Key innovations include a hierarchical action space that distinguishes between edge and cloud nodes, as well as a multi-actor framework that enhances the learning process by allowing separate networks to manage task allocation. The results indicate that this approach is particularly effective for latency-sensitive applications, such as video surveillance, where efficient resource management is critical. Overall, the research highlights the potential of DRL in optimizing resource allocation and scheduling in edge-cloud environments, providing valuable insights for future advancements in this rapidly evolving field.
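To make the reinforcement-learning framing used by studies such as [13] and [47] concrete, the following is a deliberately tiny tabular Q-learning sketch for a single edge-versus-cloud offloading decision, with a reward that trades off energy against delay. The state space, reward weights, and transition model are invented for illustration and bear no relation to the actual formulations in those papers.

```python
import random

ACTIONS = ("edge", "cloud")
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration
W_ENERGY, W_DELAY = 0.5, 0.5            # weights of the two objectives

def simulate(state: int, action: str) -> tuple[int, float]:
    """Toy environment: state = local queue-length bucket (0-4)."""
    if action == "edge":
        energy, delay = 1.0, 1.0 + state          # cheap but slow when busy
        next_state = min(4, state + (1 if random.random() < 0.5 else 0))
    else:
        energy, delay = 3.0, 1.5                  # costly transmission, stable delay
        next_state = max(0, state - 1)
    reward = -(W_ENERGY * energy + W_DELAY * delay)
    return next_state, reward

def train(episodes: int = 2000) -> dict[tuple[int, str], float]:
    q = {(s, a): 0.0 for s in range(5) for a in ACTIONS}
    for _ in range(episodes):
        state = random.randint(0, 4)
        for _ in range(20):                       # short episode horizon
            if random.random() < EPSILON:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt, reward = simulate(state, action)
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
            state = nxt
    return q

if __name__ == "__main__":
    q = train()
    policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(5)}
    print(policy)  # typically prefers the cloud once the local queue grows
```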
The researchers of study [11] address the crucial issue of resource allocation in cloud computing, particularly in the context of increasing energy consumption and performance demands on data centers. The authors propose a hybrid model that combines Genetic Algorithms (GA) and Random Forest (RF) techniques to optimize the allocation of VMs to physical machines (PMs). The GA is employed to generate an optimized training dataset that maps VMs to PMs, which is then utilized by the RF for classification and allocation tasks. This approach aims to minimize power consumption while maximizing resource utilization and maintaining load balance across the data center. The effectiveness of the proposed model is evaluated using real-time workload traces from PlanetLab, showing significant improvements in energy efficiency and execution time compared to traditional methods. The study contributes to the existing body of knowledge by demonstrating the potential of hybrid optimization techniques in enhancing cloud infrastructure management. However, the authors acknowledge the need for further research to assess the model's adaptability to diverse workloads and its scalability in heterogeneous cloud environments.

The research in [10] introduces an Energy-Efficient Task Offloading Strategy (ETOS) aimed at enhancing energy efficiency in Mobile Edge Computing (MEC) environments for resource-intensive mobile applications. The study formulates the task offloading problem as a non-linear optimization challenge, proposing a hybrid approach that combines Particle Swarm Optimization (PSO) and Grey Wolf Optimizer (GWO)


to effectively allocate resources while considering capacity and latency constraints. The proposed ETOS leverages the collaborative capabilities of MEC servers to minimize energy consumption during task execution. Extensive simulations demonstrate that ETOS outperforms existing baseline methods in terms of energy utilization, response delay, and offloading utility, particularly under limited resource conditions. Despite its promising results, the research highlights the need for real-world validation and acknowledges the potential complexity of implementing the hybrid optimization approach in practical scenarios, suggesting directions for future work to enhance applicability and effectiveness in real-world MEC systems.
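Swarm-based metaheuristics such as the PSO component used in [10] search over candidate offloading decisions by iteratively moving particles toward good solutions. The sketch below is a generic PSO over per-task offloading fractions with a made-up energy-plus-delay cost; it is not the ETOS formulation and omits the GWO hybridization entirely.

```python
import random

N_TASKS, N_PARTICLES, ITERS = 5, 20, 100
W, C1, C2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

def cost(x: list[float]) -> float:
    """Toy objective: local energy grows with retained work, delay with offloaded work."""
    local_energy = sum((1.0 - xi) * 2.0 for xi in x)
    offload_delay = sum(xi * 1.2 for xi in x)
    return 0.6 * local_energy + 0.4 * offload_delay

def pso() -> tuple[list[float], float]:
    pos = [[random.random() for _ in range(N_TASKS)] for _ in range(N_PARTICLES)]
    vel = [[0.0] * N_TASKS for _ in range(N_PARTICLES)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(N_PARTICLES), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(ITERS):
        for i in range(N_PARTICLES):
            for d in range(N_TASKS):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (W * vel[i][d]
                             + C1 * r1 * (pbest[i][d] - pos[i][d])
                             + C2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

if __name__ == "__main__":
    best, best_cost = pso()
    print([round(x, 2) for x in best], round(best_cost, 3))
```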
The research in study [12] focuses on developing a novel multi-classifier algorithm aimed at optimizing energy-efficient task offloading in Fog Computing environments for IoT applications. As the number of connected devices increases, efficient resource management becomes crucial to minimize energy consumption and enhance service quality. The proposed algorithm evaluates various attributes related to tasks, network conditions, and the processing capabilities of Fog nodes to determine the most suitable node for task execution. By leveraging machine learning techniques, the algorithm aims to improve decision-making processes regarding task offloading, thereby reducing execution time and energy usage. The study emphasizes the importance of balancing energy efficiency with performance metrics, demonstrating that the multi-classifier approach can significantly enhance Quality of Service (QoS) parameters. In summary, this research contributes to the ongoing efforts to optimize Fog Computing frameworks, making them more effective in handling the computational demands of IoT applications while addressing energy constraints.

The authors of study [44] propose a novel framework to optimize energy consumption and computational efficiency in IoT environments. It introduces a three-layer architecture comprising sensor, edge, and cloud layers, facilitating effective task offloading and resource management. The study employs Long Short-Term Memory (LSTM) networks for accurate workload prediction, enabling the system to adapt to dynamic conditions. Additionally, it utilizes Lyapunov optimization methods to address the non-convex nature of resource allocation problems. Simulation results demonstrate significant improvements in energy efficiency and processing rates. However, the research acknowledges limitations, including concerns about scalability, the assumptions made regarding user behavior, and a lack of focus on security aspects. Therefore, the paper contributes valuable insights into mobile edge computing, highlighting the potential for enhanced performance in IoT networks while suggesting areas for further exploration and refinement.
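LSTM-based workload prediction of the kind used in [44] can be prototyped in a few lines with a generic sequence model. The sketch below assumes TensorFlow/Keras and NumPy are available, uses a synthetic load trace, and says nothing about the actual architecture or data in that study.

```python
import numpy as np
from tensorflow import keras  # assumes TensorFlow is installed

WINDOW = 16  # number of past samples used to predict the next one

def make_windows(series: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Slice a 1-D load trace into (window, next-value) training pairs."""
    x = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
    y = series[WINDOW:]
    return x[..., np.newaxis], y  # LSTM expects (samples, timesteps, features)

if __name__ == "__main__":
    # Synthetic diurnal-looking load trace in [0, 1].
    t = np.arange(2000)
    series = 0.5 + 0.4 * np.sin(2 * np.pi * t / 96) + 0.05 * np.random.randn(2000)
    x, y = make_windows(series.astype("float32"))

    model = keras.Sequential([
        keras.layers.Input(shape=(WINDOW, 1)),
        keras.layers.LSTM(32),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(x, y, epochs=3, batch_size=64, verbose=0)

    next_load = model.predict(x[-1:], verbose=0)[0, 0]
    print(f"predicted next load: {next_load:.3f}")
```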
The research of [22] presents the Edge Intelligent Energy Consumption Model (ECMS) aimed at optimizing energy usage in MEC environments. As energy consumption in edge data centers becomes increasingly critical, the ECMS model provides a framework for predicting and managing energy needs based on varying workloads, including CPU-intensive, Web-transactional, and I/O-intensive tasks. The authors validate the model through experimental results, demonstrating its superior performance in accuracy and training time compared to existing models like TW_BP_PM and FSDL. The study categorizes energy consumption modeling into two main approaches: system resource utilization and Performance Monitor Counter (PMC)-based modeling. The ECMS model leverages a simpler network topology and lower input dimensions, resulting in reduced training time and CPU workload during execution. The findings indicate a strong correlation between energy consumption and CPU utilization, emphasizing the need for precise energy models to inform optimization algorithms. The research concludes with future directions, suggesting the extension of the ECMS model to mixed workloads and integration with advanced AI/ML techniques, such as reinforcement learning, to further enhance energy efficiency in edge computing environments, ultimately contributing to sustainable practices in the industry.
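The reported correlation between energy consumption and CPU utilization is often approximated, in utilization-based modeling generally, by a linear power model between idle and peak power. The sketch below shows that generic model with invented wattages; it is not the ECMS model from [22].

```python
P_IDLE_W = 70.0   # power draw at 0% CPU utilization (illustrative)
P_MAX_W = 180.0   # power draw at 100% CPU utilization (illustrative)

def server_power_w(cpu_utilization: float) -> float:
    """Linear utilization-based power model: P(u) = P_idle + (P_max - P_idle) * u."""
    u = min(1.0, max(0.0, cpu_utilization))
    return P_IDLE_W + (P_MAX_W - P_IDLE_W) * u

def energy_kwh(cpu_utilization: float, hours: float) -> float:
    """Energy consumed at a constant utilization over a time window."""
    return server_power_w(cpu_utilization) * hours / 1000.0

if __name__ == "__main__":
    print(server_power_w(0.4))        # 114.0 W at 40% utilization
    print(energy_kwh(0.4, 24.0))      # ~2.736 kWh over a day
```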


A technical analysis of service placement approaches in the context of edge computing has been the focus of the authors in [45], with the aim of addressing energy efficiency in the edge computing paradigm in IoT systems. The main objective of this research work has been to identify effective and efficient strategies for service placement in IoT environments, along with a taxonomy to categorize the studies in this field in terms of the cloud-edge service placement approaches and algorithms that have been utilized. In addition to the technical analysis, a statistical analysis has also been provided by the authors along with evaluation factors, through which the research findings provide further insights for future research in this paradigm. The results suggest that server placement approaches fall under three categories, namely decentralized, centralized, and hierarchical, while the Genetic Algorithm has been widely utilized by researchers compared to other algorithms such as Greedy, Markov, BSAP, TOPSIS, and polynomial algorithms. Furthermore, time, cost, and latency have been identified as the evaluation metrics of most concern in the context of service placement. Finally, the authors provide insights into open issues and challenges in the context of service placement that are vital for future research focusing on service placement.

The research in study [19] presents a heterogeneous cluster-based wireless sensor network (WSN) model aimed at optimizing task allocation to minimize energy consumption and balance load. The network consists of clusters, each with a cluster center and several sensor nodes, where the cluster center collects data from the nodes and communicates with a central processor. The study establishes a task model where complex tasks are divided into independent subtasks, each requiring specific resources, data sizes, and computation times. To address the task allocation problem, the authors propose a Fusion Algorithm (FA) that integrates Genetic Algorithm (GA) and Ant Colony Optimization (ACO) techniques. This algorithm features a novel mutation operator and a new population initialization method, enhancing its effectiveness in reducing energy consumption and balancing the load across the network. Experimental results demonstrate that the FA outperforms the traditional GA, achieving 8.1% lower energy consumption and significantly reducing the load on both sensor nodes and cluster centers. The proposed approach ensures all sensors remain operational throughout task execution, thereby increasing the reliability and longevity of the WSN. Therefore, the research contributes valuable insights into efficient task allocation strategies in edge computing environments.

The researchers of study [46] investigate an RIS-assisted Non-Orthogonal Multiple Access (NOMA) Mobile Edge Computing (MEC) network, focusing on minimizing energy consumption for users. The authors propose a joint optimization approach that includes RIS phase shifts, data transmission rates, power control, and transmission time. Due to the non-convex nature of the problem, the authors decompose it into two sub-problems: the first utilizing a dual method for a closed-form solution with a fixed RIS phase vector, and the other employing a penalty method for suboptimal power control solutions. The optimization process alternates between these sub-problems until convergence is achieved. To demonstrate the effectiveness of their proposed NOMA-MEC scheme, the authors compare it against three benchmark schemes: TDMA-MEC partial offloading, full local computing, and full offloading. These researchers have also introduced an alternating 1-D search method for optimizing RIS phase shifts in the TDMA-MEC scheme. Numerical results indicate that the proposed scheme significantly reduces overall energy consumption and highlight the impact of user distance on performance. The paper concludes by acknowledging the potential for future work on robust transmission design to address channel state information estimation errors.

Another group of researchers in [47] have worked on a deep reinforcement learning (DRL) approach for delay-aware and energy-efficient computation offloading in dynamic MEC networks with multiple users and servers. The primary objective is to maximize the number of tasks completed before their deadlines while minimizing energy consumption. The proposed DRL model operates in an end-to-end manner, eliminating the need for post-action optimization functions, and can handle a large action space without relying on traditional optimization methods. The study formulates the offloading problem as a Markov Decision Process (MDP), capturing the complexity of the MEC system by incorporating time-varying channel conditions and various task profiles. Extensive simulations demonstrate that the proposed DRL model significantly outperforms existing DRL models and greedy algorithms in terms of task completion and energy efficiency. The results indicate that the DRL model learns optimal policies over time, effectively managing the trade-off between exploration and exploitation during training. The conducted research highlights the potential of DRL in enhancing the performance of MEC systems, making it a valuable contribution to the field of IoT and edge computing.

The research in [48] presents a novel approach to enhance task offloading in dynamic vehicular environments. It addresses the limitations of existing centralized and decentralized Deep Reinforcement Learning (DRL) algorithms, which often struggle with computational constraints and coordination issues. The proposed framework introduces a multi-layer Vehicular Edge Computing (VEC) architecture that optimizes task management across vehicles, edge servers, and cloud resources. Key contributions include the development of an energy-efficient VEC framework that considers the diverse computing capabilities of network entities and introduces a utility function to enhance energy efficiency. Additionally, a decentralized Multi-Agent Deep Reinforcement Learning (MADRL) algorithm is proposed, which effectively adapts to changing conditions while minimizing latency and maximizing task completion rates. The research also provides a comprehensive performance evaluation using simulations in a realistic environment, demonstrating the effectiveness of the proposed solution in managing varying traffic densities. The conducted research highlights the potential of decentralized approaches in improving the efficiency and responsiveness of vehicular networks, paving the way for advancements in autonomous vehicle applications and real-time data processing.

The research in [24] presents a novel cloud-edge cooperative content-delivery strategy aimed at minimizing network latency in asymmetrical Internet of Vehicles (IoV) environments. By leveraging Deep Reinforcement Learning (DRL), the authors propose a Deep Q Network (DQN) policy that optimizes content caching and request routing based on perceived request history and current network states. The study formulates the joint allocation of heterogeneous resources as a queuing-theory-based delay minimization objective, addressing the challenges of computation complexity and dynamic network conditions. Extensive simulations demonstrate that the proposed strategy significantly reduces network latency compared to existing solutions, showcasing its adaptability to varying user requirements and network states. The findings indicate that the DQN model achieves fast convergence and improved performance across different scenarios. The paper concludes with a discussion of future work, including the exploration of end-user mobility and deeper collaboration among mobile users, edge, and cloud networks to enhance overall Quality of Experience (QoE) while balancing network delay and energy consumption. This research contributes valuable insights into optimizing resource allocation in IoT-edge-cloud systems, particularly in the context of intelligent transportation systems.

Table I presents a summary of the recent work discussed in detail, organized by objective, approach, key findings, and remarks of the particular research work.


TABLE I. SUMMARY OF THE LITERATURE REVIEW

[43]
Objective: Evaluate and compare the energy consumption of cloud, fog, and edge computing infrastructures.
Approach: A taxonomy of different cloud architectures and a generic energy model.
Findings: Fully distributed architectures consume 14%-25% less energy than centralized ones.
Remarks: The model may not account for real-world variables and simulator constraints.

[13]
Objective: Energy-efficient resource scheduling in an edge-cloud environment.
Approach: Deep Reinforcement Learning.
Findings: 56% improvement in energy consumption; 46% improvement in execution time.
Remarks: The proposed reinforcement learning framework is designed to operate in a centralized manner.

[11]
Objective: Optimize virtual machine allocation in cloud infrastructure, aiming to minimize power consumption while maintaining load balance and maximizing resource utilization.
Approach: Genetic Algorithm (GA) and Random Forest (RF) techniques.
Findings: 37% reduction in execution time; 11% improvement in resource utilization.
Remarks: Limited generalizability due to specific workload traces, complexity in real-world implementation, and potential oversight of other critical factors.

[10]
Objective: Propose a task offloading scheme that minimizes overall energy consumption while satisfying capacity and delay requirements.
Approach: Hybrid approach based on Particle Swarm Optimization (PSO) and Grey Wolf Optimizer (GWO).
Findings: The proposed strategy considerably outperforms other baseline approaches, such as OEOS, ROA-DPH, ATO, and local execution, in terms of energy consumption, execution time, and offloading utility.
Remarks: Lacks real-world validation and may face implementation complexity in resource-constrained environments.

[12]
Objective: Propose a novel energy-efficient task offloading method for IoT, Fog, and Cloud computing paradigms.
Approach: Multi-classifier-based approach.
Findings: 11.36% reduction in energy consumption (cloud-only) and 9.30% (edge-ward); 67% reduction in network usage (cloud-only) and 96% (edge-ward).
Remarks: Lacks extensive empirical validation and does not address decentralized approaches for dynamic environments.

[44]
Objective: Propose a hierarchical communication and computation framework for jointly optimizing energy consumption and computation rate.
Approach: Long Short-Term Memory (LSTM) network.
Findings: The proposed method can greatly improve system performance by saving energy costs and achieving a high processing rate.
Remarks: Scalability concerns, assumptions about user behavior, limited security focus, and complexity in practical implementation.

[22]
Objective: Predict energy consumption and monitor edge servers.
Approach: Intelligent energy modeling approach that combines an Elman Neural Network (ENN).
Findings: The proposed ECMS outperforms the baseline power models (FSDL, CMP, TW_BP_PM, AEC, CUBIC, power regression).
Remarks: The research primarily focuses on specific workloads, limiting its applicability to broader, mixed-workload scenarios in MEC environments.

[45]
Objective: Identify studies related to service placement strategies and categorize the relevant studies as a knowledge source for further research.
Approach: Technical analysis of cloud-edge service placement approaches.
Findings: Methods and algorithms of existing service placement approaches; evaluation metrics for service placement approaches; tools and environments developed for service placement approaches.
Remarks: Lacks any further implementation towards the service placement paradigm.

[19]
Objective: Develop an efficient task allocation strategy for heterogeneous wireless sensor networks that minimizes energy consumption and balances load to extend network lifetime and enhance reliability.
Approach: Fusion Algorithm combining Genetic Algorithm and Ant Colony Optimization for effective task allocation in WSNs.
Findings: Load on sensors was reduced by 58% with the FA, while the load on cluster centers decreased by 30.8%; the FA achieved an 8.1% reduction in energy consumption compared to the traditional GA.
Remarks: The research may not address complex task dependencies and real-world scenarios requiring dynamic resource allocation and execution ordering.

[46]
Objective: Minimize energy consumption in RIS-assisted NOMA-MEC networks.
Approach: Jointly optimize RIS phase shifts, transmission rates, and power control.
Findings: Significant energy savings compared to benchmark schemes.
Remarks: Non-convex optimization and CSI estimation errors affect performance.

[47]
Objective: Maximize task completion before deadlines while minimizing energy consumption in MEC systems.
Approach: Deep Reinforcement Learning model.
Findings: The proposed model outperforms existing methods in task completion and energy efficiency through extensive simulations.
Remarks: The model's performance may be affected by the complexity of real-world environments.

[48]
Objective: Optimize task offloading and resource management in dynamic vehicular environments using a decentralized framework.
Approach: A multi-layer edge computing architecture and a decentralized multi-agent deep reinforcement learning algorithm.
Findings: Significant reduction in energy consumption and improved task completion rates compared to existing algorithms in simulations.
Remarks: Scalability issues in highly dynamic vehicular networks.

[24]
Objective: Minimize network latency in asymmetrical IoV environments through optimized resource allocation.
Approach: Deep Reinforcement Learning.
Findings: 44% reduction in network delay; 39% improvement in reward per episode.
Remarks: Future work is needed on end-user mobility and deeper collaboration among network components; the trade-off between network delay and energy consumption has not been addressed to compromise among multiple network indicators.


IV. DISCUSSION

The conducted literature review provides insights for future research that aims at energy efficiency in both the cloud and edge computing paradigms, along with the approaches utilized for the reduction of energy consumption. Nevertheless, it is also noted that researchers utilize a combination of techniques in their work towards optimizing energy efficiency for sustainable and efficient systems with reduced operational costs. Moreover, it is noted that the techniques have widely spread towards AI-driven approaches with the advancements in machine learning.

Furthermore, it is also vital to identify the challenges encountered when focusing on the energy-efficient aspects of edge computing due to the proliferation of IoT devices and advancements in modern computing. Among the challenges to optimizing energy efficiency in the edge computing paradigm, distributed dynamic workloads among edge nodes, with their wide range of heterogeneity and their distributed, resource-constrained nature, are the most prominent, since accurate modeling is complicated by workloads that fluctuate with user demand. In addition, diverse and unpredictable workloads have imposed many issues on the edge computing paradigm. Nevertheless, the task scheduling techniques employed by contemporary research are oriented mostly towards independent task workloads, while few studies have focused on complex workloads [13].

Moreover, delay-critical applications that run on devices in the mobile edge computing paradigm also impose challenges in this arena [10], which directly lead to insufficient quality of experience for end users and high costs of energy and bandwidth utilization, both of which are unfavourable. Moreover, limited power sources and limited processing and storage capabilities also impose challenges on energy efficiency optimization strategies for a sustainable edge computing paradigm [49], [50]. Resource management is key to optimizing energy efficiency and has also been a challenging task owing to the highly dynamic nature of IoT traffic. Furthermore, many tasks have dependencies that dictate the order of execution. Managing these dependencies while optimizing resource allocation adds another level of complexity to the task allocation process.

Ensuring that the network meets specific QoS requirements, such as latency and reliability, while performing task allocation is a critical challenge that must be addressed. In addition, striking a balance between energy efficiency and QoS awareness has also been a challenging aspect, since QoS is highly impacted by the majority of mechanisms utilized in distributed computing environments [51]. Furthermore, balancing energy efficiency with application performance and user experience is crucial, as overly aggressive energy-saving measures may negatively affect user satisfaction. Addressing these challenges is essential for advancing research and developing effective energy management strategies in MEC environments.

Additionally, the need for extensive collection of relevant energy-related data can introduce overhead and impact performance, while selecting the most pertinent features for modeling remains a challenge. Real-time processing requirements further complicate the development of accurate models, and integrating advanced AI/ML techniques introduces complexities in training and deployment.

Fig. 1. Mind map with key findings from the conducted review.


With the scalability concerns in both edge and cloud computing, as the number of nodes and tasks increases, maintaining efficient communication and coordination among nodes becomes more difficult. Algorithms must be scalable to handle larger networks without significant performance degradation. Moreover, interoperability could also be a major challenge, where the diverse range of IoT devices and platforms can lead to compatibility issues, making it difficult to implement a unified mechanism. Fig. 1 provides a mind map of the key findings from the conducted literature review, comprising energy optimization techniques, cloud architecture adaptations, and major considerations and challenges for the energy efficiency aspect of the edge computing paradigm, which provides further insights for research in this domain.

Therefore, when the above aspects are considered, it is apparent that there is a lack of unified metrics and benchmarks for assessing energy efficiency across different edge computing environments, which makes it difficult to compare solutions. In addition, the use of AI/ML models for decision-making in edge computing often introduces significant energy overhead, which creates the need for optimized, lightweight AI/ML algorithms for edge environments that do not compromise accuracy or efficiency. The heterogeneity of edge devices also necessitates scalable, device-agnostic optimization models that can adapt to heterogeneous edge environments. Moreover, real-time applications such as autonomous vehicles and healthcare monitoring systems have strict latency and reliability requirements, which complicate energy optimization efforts. Therefore, developing solutions that balance energy efficiency with real-time performance guarantees has also been a critical consideration in this research paradigm.
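One example of the kind of simple, reproducible metric that the gap noted above calls for is the energy-delay product (EDP), long used in computer architecture to weigh energy against latency in a single number. The sketch below is an illustrative computation only, not a benchmark proposed in the reviewed literature.

```python
def energy_delay_product(energy_j: float, delay_s: float, delay_exponent: int = 1) -> float:
    """Energy-delay product: lower is better; exponent > 1 weights latency more heavily."""
    return energy_j * (delay_s ** delay_exponent)

if __name__ == "__main__":
    # Comparing two hypothetical offloading policies on the same workload.
    local_only = energy_delay_product(energy_j=120.0, delay_s=2.5)
    offloaded = energy_delay_product(energy_j=90.0, delay_s=1.8)
    print(local_only, offloaded)  # 300.0 vs 162.0 -> offloading wins on EDP here
```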
V. CONCLUSION

The research findings are indicative of the current approaches to energy-efficient edge computing systems with reduced energy consumption and costs, and they provide further insights for future research in this arena. In addition, the different strategies identified as the key techniques for energy-efficient systems in cloud computing further assist future research, while a combination of different strategies has also been noted in contemporary research. Furthermore, AI-driven energy optimization techniques have been widely focused on and researched, and this approach has been able to provide better mechanisms for optimizing energy efficiency in both the edge and cloud computing paradigms.

In conclusion, this comprehensive review on optimizing energy efficiency for edge computing highlights the growing importance of balancing performance and sustainability in modern cloud systems: as edge computing gains prominence in reducing latency and improving data processing efficiency, the need for energy optimization becomes critical. Moreover, various approaches, including workload distribution, resource allocation, and hardware improvements, have demonstrated the potential to significantly reduce energy consumption while balancing quality of service. As future research, the authors intend to refine these strategies and explore innovative methods to further enhance energy efficiency, ensuring sustainable cloud-edge ecosystems that meet the rising demands of modern applications.


REFERENCES
[1] A. Sunyaev and A. Sunyaev, 'Cloud computing', Internet computing: Principles of distributed systems and emerging internet-based technologies, pp. 195–236, 2020.
[2] H. K. Mistry, C. Mavani, A. Goswami, and R. Patel, 'The Impact Of Cloud Computing And Ai On Industry Dynamics And Competition', Educational Administration: Theory and Practice, vol. 30, no. 7, pp. 797–804, 2024.
[3] A. K. Y. Yanamala, 'Emerging Challenges in Cloud Computing Security: A Comprehensive Review', International Journal of Advanced Engineering Technologies and Innovations, vol. 1, no. 4, pp. 448–479, 2024.
[4] K. Cao, Y. Liu, G. Meng, and Q. Sun, 'An overview on edge computing research', IEEE Access, vol. 8, pp. 85714–85728, 2020.
[5] L. Kong et al., 'Edge-computing-driven internet of things: A survey', ACM Computing Surveys, vol. 55, no. 8, pp. 1–41, 2022.
[6] Q. Luo, S. Hu, C. Li, G. Li, and W. Shi, 'Resource scheduling in edge computing: A survey', IEEE Communications Surveys & Tutorials, vol. 23, no. 4, pp. 2131–2165, 2021.
[7] T. Qiu, J. Chi, X. Zhou, Z. Ning, M. Atiquzzaman, and D. O. Wu, 'Edge computing in industrial internet of things: Architecture, advances and challenges', IEEE Communications Surveys & Tutorials, vol. 22, no. 4, pp. 2462–2488, 2020.
[8] Y. Chen, S. Ye, J. Wu, B. Wang, H. Wang, and W. Li, 'Fast multi-type resource allocation in local-edge-cloud computing for energy-efficient service provision', Information Sciences, pp. 120502–120502, Mar. 2024, doi: https://doi.org/10.1016/j.ins.2024.120502.
[9] K. Sadatdiynov, L. Cui, L. Zhang, J. Z. Huang, S. Salloum, and M. S. Mahmud, 'A review of optimization methods for computation offloading in edge computing networks', Digital Communications and Networks, vol. 9, no. 2, pp. 450–461, 2023.
[10] M. P. J. Mahenge, C. Li, and C. A. Sanga, 'Energy-efficient task offloading strategy in mobile edge computing for resource-intensive mobile applications', Digital Communications and Networks, Apr. 2022, doi: https://doi.org/10.1016/j.dcan.2022.04.001.
[11] M. H. S, S. Kumar T, S. M. F. D. S. Mustapha, P. Gupta, and R. P. Tripathi, 'Hybrid Approach for Resource Allocation in Cloud Infrastructure Using Random Forest and Genetic Algorithm', Scientific Programming, vol. 2021, pp. 1–10, Oct. 2021, doi: https://doi.org/10.1155/2021/4924708.
[12] M. K. Alasmari, S. S. Alwakeel, and Y. A. Alohali, 'A Multi-Classifiers Based Algorithm for Energy Efficient Tasks Offloading in Fog Computing', Sensors, vol. 23, no. 16, pp. 7209–7209, Aug. 2023, doi: https://doi.org/10.3390/s23167209.
[13] A. Jayanetti, S. Halgamuge, and R. Buyya, 'Deep reinforcement learning for energy and time optimized scheduling of precedence-constrained tasks in edge-cloud computing environments', Future Generation Computer Systems, Jun. 2022, doi: https://doi.org/10.1016/j.future.2022.06.012.
[14] D. Alsadie, 'Efficient Task Offloading Strategy for Energy-Constrained Edge Computing Environments: A Hybrid Optimization Approach', IEEE Access, vol. 12, pp. 85089–85102, 2024, doi: https://doi.org/10.1109/access.2024.3415756.
[15] J. A. Ansere et al., 'Optimal computation resource allocation in energy-efficient edge IoT systems with deep reinforcement learning', IEEE Transactions on Green Communications and Networking, vol. 7, no. 4, pp. 2130–2142, 2023.
[16] S. Sangeetha, J. Logeshwaran, M. Faheem, R. Kannadasan, S. Sundararaju, and L. Vijayaraja, 'Smart performance optimization of energy-aware scheduling model for resource sharing in 5G green communication systems', The Journal of Engineering, vol. 2024, no. 2, p. e12358, 2024.
[17] A. Asghari, H. Azgomi, A. A. Zoraghchian, and A. Barzegarinezhad, 'Energy-aware server placement in mobile edge computing using trees social relations optimization algorithm', The Journal of Supercomputing, vol. 80, no. 5, pp. 6382–6410, 2024.
[18] F. Ramezani Shahidani, A. Ghasemi, A. Toroghi Haghighat, and A. Keshavarzi, 'Task scheduling in edge-fog-cloud architecture: a multi-objective load balancing approach using reinforcement learning algorithm', Computing, vol. 105, no. 6, pp. 1337–1359, 2023.
[19] J. Wen, J. Yang, T. Wang, Y. Li, and Z. Lv, 'Energy-efficient task allocation for reliable parallel computation of cluster-based wireless sensor network in edge computing', Digital Communications and Networks, vol. 9, no. 2, pp. 473–482, 2023.
[20] M. Raeisi-Varzaneh, O. Dakkak, A. Habbal, and B.-S. Kim, 'Resource scheduling in edge computing: Architecture, taxonomy, open issues and future research directions', IEEE Access, vol. 11, pp. 25329–25350, 2023.
[21] H. Huang, W. Zhan, G. Min, Z. Duan, and K. Peng, 'Mobility-aware computation offloading with load balancing in smart city networks using MEC federation', IEEE Transactions on Mobile Computing, 2024.
[22] Z. Zhou, M. Shojafar, J. Abawajy, H. Yin, and H. Lu, 'ECMS: An Edge Intelligent Energy Efficient Model in Mobile Edge Computing', IEEE Transactions on Green Communications and Networking, vol. 6, no. 1, pp. 238–247, 2022.
[23] K. Sathupadi, 'AI-driven energy optimization in SDN-based cloud computing for balancing cost, energy efficiency, and network performance', International Journal of Applied Machine Learning and Computational Intelligence, vol. 13, no. 7, pp. 11–37, 2023.
[24] T. Cui, R. Yang, C. Fang, and S. Yu, 'Deep reinforcement learning-based resource allocation for content distribution in IoT-edge-cloud computing environments', Symmetry, vol. 15, no. 1, p. 217, 2023.
[25] Y. Mansouri and M. A. Babar, 'A review of edge computing: Features and resource virtualization', Journal of Parallel and Distributed Computing, vol. 150, pp. 155–183, 2021.
[26] C. Jian, L. Bao, and M. Zhang, 'A high-efficiency learning model for virtual machine placement in mobile edge computing', Cluster Computing, vol. 25, no. 5, pp. 3051–3066, 2022.
[27] L. Urblik, E. Kajati, P. Papcun, and I. Zolotová, 'Containerization in Edge Intelligence: A Review', Electronics, vol. 13, no. 7, p. 1335, 2024.
[28] J. Zhang, X. Zhou, T. Ge, X. Wang, and T. Hwang, 'Joint task scheduling and containerizing for efficient edge computing', IEEE Transactions on Parallel and Distributed Systems, vol. 32, no. 8, pp. 2086–2100, 2021.
[29] S. Hu, W. Shi, and G. Li, 'CEC: A containerized edge computing framework for dynamic resource provisioning', IEEE Transactions on Mobile Computing, vol. 22, no. 7, pp. 3840–3854, 2022.
[30] F. Zhou and R. Q. Hu, 'Computation Efficiency Maximization in Wireless-Powered Mobile Edge Computing Networks', IEEE Transactions on Wireless Communications, vol. 19, no. 5, pp. 3170–3184, 2020.
[31] X. Zhou, X. Yang, J. Ma, I. Kevin, and K. Wang, 'Energy-efficient smart routing based on link correlation mining for wireless edge computing in IoT', IEEE Internet of Things Journal, vol. 9, no. 16, pp. 14988–14997, 2021.
[32] T. Vaiyapuri, V. S. Parvathy, V. Manikandan, N. Krishnaraj, D. Gupta, and K. Shankar, 'A novel hybrid optimization for cluster-based routing protocol in information-centric wireless sensor networks for IoT based mobile edge computing', Wireless Personal Communications, vol. 127, no. 1, pp. 39–62, 2022.
[33] A. Javadpour et al., 'An energy-optimized embedded load balancing using DVFS computing in cloud data centers', Computer Communications, vol. 197, pp. 255–266, 2023.
[34] S. K. Panda, M. Lin, and T. Zhou, 'Energy-efficient computation offloading with DVFS using deep reinforcement learning for time-critical IoT applications in edge computing', IEEE Internet of Things Journal, vol. 10, no. 8, pp. 6611–6621, 2022.
[35] A. Cañete, M. Amor, and L. Fuentes, 'HADES: An NFV solution for energy-efficient placement and resource allocation in heterogeneous infrastructures', Journal of Network and Computer Applications, vol. 221, p. 103764, 2024.
[36] X. Cao, G. Zhu, J. Xu, Z. Wang, and S. Cui, 'Optimized power control design for over-the-air federated edge learning', IEEE Journal on Selected Areas in Communications, vol. 40, no. 1, pp. 342–358, 2021.
[37] X. Cao, G. Zhu, J. Xu, and S. Cui, 'Transmission power control for over-the-air federated averaging at network edge', IEEE Journal on Selected Areas in Communications, vol. 40, no. 5, pp. 1571–1586, 2022.
[38] X. Mo and J. Xu, 'Energy-efficient federated edge learning with joint communication and computation design', Journal of Communications and Information Networks, vol. 6, no. 2, pp. 110–124, 2021.
[39] H. Koumaras et al., '5G-enabled UAVs with command and control software component at the edge for supporting energy efficient opportunistic networks', Energies, vol. 14, no. 5, p. 1480, 2021.
[40] X. Shao, Z. Zhang, P. Song, Y. Feng, and X. Wang, 'A review of energy efficiency evaluation metrics for data centers', Energy and Buildings, vol. 271, p. 112308, 2022.
[41] Q. Zhang et al., 'A survey on data center cooling systems: Technology, power consumption modeling and control strategy optimization', Journal of Systems Architecture, vol. 119, p. 102253, 2021.
[42] M. Manganelli, A. Soldati, L. Martirano, and S. Ramakrishna, 'Strategies for improving the sustainability of data centers via energy mix, energy conservation, and circular energy', Sustainability, vol. 13, no. 11, p. 6114, 2021.
[43] E. Ahvar, A.-C. Orgerie, and A. Lebre, 'Estimating Energy Consumption of Cloud, Fog, and Edge Computing Infrastructures', IEEE Transactions on Sustainable Computing, vol. 7, no. 2, pp. 277–288, 2022.
[44] Q. Wang, L. T. Tan, R. Q. Hu, and Y. Qian, 'Hierarchical Energy-Efficient Mobile-Edge Computing in IoT Networks', IEEE Internet of Things Journal, vol. 7, no. 12, pp. 11626–11639, 2020.
[45] L. Heng, G. Yin, and X. Zhao, 'Energy aware cloud-edge service placement approaches in the Internet of Things communications', International Journal of Communication Systems, vol. 35, no. 1, p. e4899, 2022.
[46] Z. Li et al., 'Energy Efficient Reconfigurable Intelligent Surface Enabled Mobile Edge Computing Networks With NOMA', IEEE Transactions on Cognitive Communications and Networking, vol. 7, no. 2, pp. 427–440, 2021.
[47] L. Ale, N. Zhang, X. Fang, X. Chen, S. Wu, and L. Li, 'Delay-Aware and Energy-Efficient Computation Offloading in Mobile-Edge Computing Using Deep Reinforcement Learning', IEEE Transactions on Cognitive Communications and Networking, vol. 7, no. 3, pp. 881–892, 2021.
[48] M. Fardad, G.-M. Muntean, and I. Tal, 'Decentralized vehicular edge computing framework for energy-efficient task coordination', in 2024 IEEE 99th Vehicular Technology Conference (VTC2024-Spring), 2024, pp. 1–7.
[49] K. Kaur, S. Garg, G. S. Aujla, N. Kumar, J. J. P. C. Rodrigues, and M. Guizani, 'Edge Computing in the Industrial Internet of Things Environment: Software-Defined-Networks-Based Edge-Cloud Interplay', IEEE Communications Magazine, vol. 56, no. 2, pp. 44–51, 2018.
[50] J. Lin, W. Yu, N. Zhang, X. Yang, H. Zhang, and W. Zhao, 'A Survey on Internet of Things: Architecture, Enabling Technologies, Security and Privacy, and Applications', IEEE Internet of Things Journal, vol. 4, no. 5, pp. 1125–1142, 2017.
[51] U. M. Malik, M. A. Javed, S. Zeadally, and S. ul Islam, 'Energy-Efficient Fog Computing for 6G-Enabled Massive IoT: Recent Trends and Future Opportunities', IEEE Internet of Things Journal, vol. 9, no. 16, pp. 14572–14594, 2022.

