Navigating the Complexities of Fog Computing Task Scheduling
Fog computing has emerged as a transformative paradigm, extending the capabilities of cloud computing by bringing processing and data analysis closer to the network’s edge. This proximity enables real-time processing and application deployment for Internet of Things (IoT) environments, serving the diverse array of interconnected devices generating data at the periphery. Within this fog computing landscape, effective task scheduling is a paramount challenge: computing resources must be allocated to tasks so as to optimize performance metrics such as makespan, energy consumption, and resource utilization.
Achieving optimal task scheduling in fog computing environments, however, proves inherently intricate due to the dynamic nature of the network, the diverse array of heterogeneous computing resources available, and the stringent constraints imposed by edge devices. In light of these complexities, this comprehensive review delves into the advancements in heuristic task scheduling methodologies tailored for fog computing systems.
Through a systematic analysis of relevant literature, a spectrum of heuristic approaches is scrutinized, encompassing priority-based strategies, greedy heuristics, metaheuristic algorithms, learning-based approaches, hybrid heuristics, and nature-inspired methodologies. This review critically assesses the strengths, limitations, and practical applications of each approach within the context of fog computing environments.
By synthesizing insights from existing literature and delineating key challenges and prospective research trajectories, this article aims to propel the field of fog computing task scheduling forward, catalyzing the development of more resilient, adaptable, and efficient solutions tailored to meet the demands of real-world applications.
Methodology
This systematic literature review on heuristic-based task scheduling in fog computing follows the guidelines established in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) (Page et al., 2021). The subsequent sections detail the specific steps and methodology employed in this review.
The search criteria encompassed terms such as heuristics, optimization, and nature-inspired methods for task scheduling in fog computing. Selected studies were required to address task scheduling, energy optimization, or resource management. In addition, strict eligibility criteria for report characteristics were applied: publication in English, classification as a scientific article or review, and a publication year between 2019 and 2024.
Electronic databases like IEEE Xplore, ScienceDirect, and SpringerLink were utilized for the search, along with consideration of highly cited articles from ACM, MDPI, De Gruyter, Hindawi, and Wiley, focusing on task scheduling in fog computing environments, specifically heuristic-based methods.
The reviewers examined each article retrieved from the search, assessed its eligibility, and finalized the selection of included studies by majority agreement. Data collection was facilitated through a Google spreadsheet, where information from the selected studies was systematically compiled, allowing for a comprehensive analysis of the cutting-edge developments within the field.
Taxonomy of Heuristic Approaches for Task Scheduling in Fog Computing
In the realm of fog computing environments, effective task scheduling stands as a pivotal factor in optimizing resource usage and elevating the performance of IoT applications. This section introduces a taxonomy outlining heuristic methods for task scheduling in fog computing, organizing them according to their fundamental principles and attributes.
Priority-based Heuristics
Priority-based heuristics prioritize task execution based on predefined criteria (Fahad et al., 2022; Tang et al., 2023). Static Priority Scheduling assigns fixed priorities to tasks, typically determined by factors such as deadlines, importance, or resource requirements. In contrast, Dynamic Priority Scheduling adjusts task priorities dynamically during runtime in response to real-time system conditions, workload characteristics, or user-defined policies (Shi et al., 2020).
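To make the distinction concrete, the minimal Python sketch below contrasts a static priority queue with a dispatcher that recomputes priorities at every dispatch. The `Task` fields and the dynamic urgency rule (deadline scaled by current load) are illustrative assumptions, not a rule prescribed by the cited works.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    priority: float                            # lower value = higher priority
    name: str = field(compare=False)
    deadline: float = field(compare=False)     # time remaining until the deadline
    demand: float = field(compare=False)       # abstract resource requirement

def static_priority_schedule(tasks):
    """Static priorities: dispatch strictly in the fixed order assigned at submission."""
    heap = list(tasks)
    heapq.heapify(heap)
    return [heapq.heappop(heap).name for _ in range(len(heap))]

def dynamic_priority_schedule(tasks, load_factor):
    """Dynamic priorities: re-rank pending tasks before every dispatch,
    here with an illustrative urgency rule mixing deadline and node load."""
    pending, order = list(tasks), []
    while pending:
        pending.sort(key=lambda t: t.deadline / (1.0 + load_factor * t.demand))
        order.append(pending.pop(0).name)
    return order
```

The static variant is cheap and predictable, whereas the dynamic variant re-evaluates priorities each round and therefore adapts to load at the cost of extra computation per dispatch.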
Greedy Heuristics
Greedy heuristics make locally optimal decisions at each step with the aim of achieving a globally optimal solution (Azizi et al., 2022). Earliest deadline first (EDF) schedules tasks based on their earliest deadlines, prioritizing those with imminent deadlines to minimize lateness. Shortest processing time (SPT) selects tasks with the shortest estimated processing time, aiming to minimize overall completion time and improve system throughput. Minimum remaining processing time (MRPT) prioritizes tasks based on their remaining processing time, with shorter tasks given precedence to expedite completion.
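The three greedy rules can be expressed as sorting keys over a task list, as in the sketch below; the dictionary fields `deadline`, `proc_time`, and `remaining` are assumed names used only for illustration.

```python
def earliest_deadline_first(tasks):
    """EDF: dispatch the task whose deadline is closest."""
    return sorted(tasks, key=lambda t: t["deadline"])

def shortest_processing_time(tasks):
    """SPT: dispatch the task with the smallest estimated processing time."""
    return sorted(tasks, key=lambda t: t["proc_time"])

def minimum_remaining_processing_time(tasks):
    """MRPT: dispatch the task with the least processing time still outstanding."""
    return sorted(tasks, key=lambda t: t["remaining"])

jobs = [
    {"id": "t1", "deadline": 40, "proc_time": 12, "remaining": 5},
    {"id": "t2", "deadline": 25, "proc_time": 8,  "remaining": 8},
    {"id": "t3", "deadline": 60, "proc_time": 3,  "remaining": 2},
]
print([t["id"] for t in earliest_deadline_first(jobs)])  # ['t2', 't1', 't3']
```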
Metaheuristic Approaches
Metaheuristic approaches are high-level strategies that guide the search for optimal solutions in a solution space (Wu et al., 2022; Keshavarznejad, Rezvani & Adabi, 2021). Genetic algorithms (GA) employ genetic operators to evolve a population of candidate solutions towards an optimal task schedule. Particle swarm optimization (PSO) mimics the collective behavior of a swarm of particles to iteratively explore the solution space and converge towards an optimal task schedule. Ant colony optimization (ACO) draws inspiration from the foraging behavior of ants, utilizing pheromone trails and heuristic information to navigate towards an optimal task schedule on a global scale. Simulated annealing (SA) simulates the gradual cooling of a material to find the global optimum by accepting probabilistic changes in the solution (Dev et al., 2022).
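As one representative example, the sketch below applies simulated annealing to a task-to-node assignment. The `cost` callback (for instance, an estimated makespan or energy figure) and the cooling parameters are assumptions chosen for illustration rather than settings from the cited studies.

```python
import math
import random

def simulated_annealing(tasks, nodes, cost, t0=100.0, cooling=0.95, steps=500):
    """Search task-to-node assignments, occasionally accepting worse moves
    with a temperature-dependent probability to escape local optima."""
    assign = {t: random.choice(nodes) for t in tasks}
    best, best_cost = dict(assign), cost(assign)
    temp = t0
    for _ in range(steps):
        # perturb the current schedule by reassigning one random task
        task = random.choice(tasks)
        candidate = dict(assign)
        candidate[task] = random.choice(nodes)
        delta = cost(candidate) - cost(assign)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            assign = candidate
            if cost(assign) < best_cost:
                best, best_cost = dict(assign), cost(assign)
        temp *= cooling  # gradually reduce the acceptance of worse moves
    return best, best_cost
```

GA, PSO, and ACO follow the same pattern of iteratively improving candidate schedules against such a cost function, but differ in how new candidates are generated.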
Learning-based Heuristics
Learning-based heuristics leverage machine learning to derive task scheduling policies from data and experience (Wang et al., 2024). Reinforcement learning (RL) discovers scheduling policies through trial-and-error interactions with the environment and feedback on task completion. Q-learning learns an optimal action-selection strategy by iteratively updating a Q-table based on rewards obtained from task scheduling decisions (Gao et al., 2020; Yeganeh, Sangar & Azizi, 2023). Deep Q-networks (DQN) extend Q-learning by employing deep neural networks to approximate the Q-function, enabling more complex and scalable task scheduling policies.
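The sketch below shows the tabular Q-learning update that DQN generalizes. How states encode queue lengths or node loads, how actions map to node choices, and the `step` environment callback are all problem-specific assumptions made here for illustration.

```python
import random
from collections import defaultdict

def q_learning_scheduler(episodes, states, actions, step, alpha=0.1, gamma=0.9, eps=0.2):
    """Tabular Q-learning sketch: `states` could encode queue lengths or node loads,
    `actions` the node chosen for the next task, and `step(s, a)` must return
    (next_state, reward, done) for the scheduling environment being modeled."""
    q = defaultdict(float)
    for _ in range(episodes):
        s = random.choice(states)
        done = False
        while not done:
            # epsilon-greedy action selection: mostly exploit, sometimes explore
            if random.random() < eps:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda x: q[(s, x)])
            s_next, reward, done = step(s, a)
            # update the estimate toward the reward plus the best future value
            target = reward + gamma * max(q[(s_next, x)] for x in actions)
            q[(s, a)] += alpha * (target - q[(s, a)])
            s = s_next
    return q
```

A DQN replaces the `q` table with a neural network trained on the same target, which is what makes the approach scale to large state spaces.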
Hybrid Heuristics
Hybrid heuristics integrate multiple heuristic approaches to exploit their complementary strengths and improve solution quality (Agarwal et al., 2023; Yadav, Tripathi & Sharma, 2022a). This includes combinations of greedy and metaheuristic approaches, as well as the fusion of learning-based and metaheuristic approaches (Leena, Divya & Lilian, 2020).
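A common hybrid pattern is to seed the schedule with a greedy constructive phase and then refine it with a search phase. The sketch below uses a simple local search as a stand-in for the metaheuristic or learning-based refinement found in the cited works; the `cost` callback must tolerate partial assignments and is an assumption of this illustration.

```python
import random

def hybrid_schedule(tasks, nodes, cost, refine_steps=200):
    """Hybrid sketch: greedy construction followed by local-search refinement."""
    # Greedy phase: place each task on the node that currently increases cost least
    assign = {}
    for t in tasks:
        assign[t] = min(nodes, key=lambda n: cost({**assign, t: n}))
    # Refinement phase: try single-task reassignments and keep any improvement
    for _ in range(refine_steps):
        t, n = random.choice(tasks), random.choice(nodes)
        candidate = {**assign, t: n}
        if cost(candidate) < cost(assign):
            assign = candidate
    return assign
```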
Nature-inspired Heuristics
Nature-inspired heuristics draw inspiration from natural phenomena to develop efficient task-scheduling strategies (Mishra et al., 2021; Usman et al., 2019). This encompasses biologically inspired algorithms, such as genetic evolution and swarm intelligence, as well as physics-based heuristics derived from principles in physics to optimize task allocation and resource utilization in fog environments.
Heuristic Approaches for Task Scheduling
The taxonomy of heuristic methods provides a comprehensive framework for understanding the diverse approaches to task scheduling in fog computing, each offering unique advantages and applications in optimizing the performance of IoT applications.
Priority-based Heuristics
Priority-based heuristics provide a straightforward method for task scheduling in fog computing environments, where tasks are prioritized based on predefined criteria such as deadlines, importance, or resource requirements (Sharma & Thangaraj, 2024; Choudhari, Moh & Moh, 2018). Static priority scheduling assigns fixed priorities to tasks, offering simplicity but lacking adaptability to dynamic environments. In contrast, dynamic priority scheduling adjusts priorities based on real-time conditions, offering flexibility at the cost of complexity.
Fahad et al. (2022) introduced a preemptive task scheduling strategy tailored for fog computing environments, known as multi-queue priority (MQP) scheduling. This approach aims to address the challenge of task starvation among less critical applications while ensuring balanced task allocation for both latency-sensitive and less latency-sensitive tasks. Simulation outcomes showcased significant reductions in latency compared to alternative scheduling algorithms.
Madhura, Elizabeth & Uthariaraj (2021) introduced an innovative task scheduling algorithm for fog computing environments, focusing on minimizing makespan and computation costs. The algorithm’s three-phase approach, including level sorting, task prioritization, and task selection, demonstrated superior performance compared to existing methods.
Movahedi, Defude & Hosseininia (2021) addressed the task scheduling challenge within fog computing environments by introducing a novel method termed OppoCWOA, which harnesses the Whale Optimization Algorithm (WOA). This approach integrates opposition-based learning and chaos theory to augment the efficacy of WOA in optimizing task scheduling.
Hoseiny et al. (2021) introduced a scheduling algorithm named PGA, designed to optimize overall computation time, energy consumption, and the percentage of tasks meeting deadlines in fog-cloud computing environments.
Choudhari, Moh & Moh (2018) proposed a task scheduling algorithm within the fog layer, employing priority levels to accommodate the growing number of IoT devices while enhancing performance and reducing costs.
Greedy Heuristics
Greedy heuristics provide a rapid and straightforward approach to task scheduling, prioritizing tasks based on specific criteria without guaranteeing the best solution (Azizi et al., 2022). Standard methods like EDF, SPT, and MRPT exemplify this approach, aiming to find satisfactory solutions quickly.
Azizi et al. (2022) developed a mathematical formulation for the task scheduling problem aimed at minimizing the overall energy consumption of fog nodes (FNs) while ensuring the fulfillment of Quality of Service (QoS) criteria for IoT tasks and minimizing deadline violations. They introduced two semi-greedy-based algorithms, namely priority-aware semi-greedy (PSG) and PSG with a multistart procedure (PSG-M), designed to efficiently allocate IoT tasks to FNs.
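The semi-greedy idea behind such algorithms can be sketched with a restricted candidate list (RCL) and a multistart wrapper, as below. The `score` and `total_cost` callbacks and the RCL size are illustrative assumptions and do not reproduce the exact PSG or PSG-M procedures.

```python
import random

def semi_greedy_assign(tasks, nodes, score, rcl_size=3):
    """Semi-greedy placement: for each task, build a restricted candidate list (RCL)
    of the best-scoring nodes and pick one at random, trading greediness for diversity."""
    assign = {}
    for t in tasks:
        ranked = sorted(nodes, key=lambda n: score(t, n, assign))
        assign[t] = random.choice(ranked[:rcl_size])
    return assign

def multistart(tasks, nodes, score, total_cost, starts=20):
    """Multistart wrapper: repeat the semi-greedy construction and keep the best schedule."""
    best, best_cost = None, float("inf")
    for _ in range(starts):
        cand = semi_greedy_assign(tasks, nodes, score)
        c = total_cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost
```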
Zavieh et al. (2023) introduced a novel methodology termed the Fuzzy Inverse Markov Data Envelopment Analysis Process (FIMDEAP) to tackle the task scheduling and energy consumption challenges prevalent in cloud computing environments. By integrating the advantages of Fuzzy Inverse Data Envelopment Analysis (FIDEA) and Fuzzy Markov Decision Process (FMDP) techniques, this approach adeptly selected physical and virtual machines while operating under fuzzy conditions.
Tang et al. (2023) conducted a study on AI-driven IoT applications within a collaborative cloud-edge environment, leveraging container technology. They introduced a novel container-based task scheduling algorithm dubbed PGT, which integrates a priority-aware greedy strategy with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) multi-criteria approach.
Metaheuristic Approaches
Metaheuristic approaches provide adaptable and versatile solutions by heuristically exploring the search space (Aron & Abraham, 2022; Gupta & Singh, 2023). Examples include genetic algorithms, particle swarm optimization, simulated annealing, and ant colony optimization.
Hosseinioun et al. (2020) proposed an energy-aware method by employing the Dynamic Voltage and Frequency Scaling (DVFS) technique to reduce energy consumption. Additionally, to construct valid task sequences, a hybrid approach combining the Invasive Weed Optimization and Culture (IWO-CA) evolutionary algorithm was utilized.
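The energy lever that DVFS exploits can be summarized with the textbook dynamic-power model, in which power scales roughly as C·V²·f while execution time scales as cycles/f. The constants in the sketch below are placeholders, not values from the cited study.

```python
def dvfs_energy(cycles, freq, voltage, capacitance=1e-9):
    """Approximate dynamic energy under DVFS: power ~ C * V^2 * f and
    execution time = cycles / f, so energy ~ C * V^2 * cycles.
    The quadratic voltage model and the capacitance value are simplifications."""
    exec_time = cycles / freq                    # seconds
    power = capacitance * voltage ** 2 * freq    # watts (dynamic component only)
    return power * exec_time                     # joules

# Lowering frequency and voltage for a slack-rich task cuts energy at the cost of latency
print(dvfs_energy(2e9, 2.0e9, 1.2))   # full speed
print(dvfs_energy(2e9, 1.0e9, 0.9))   # scaled down
```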
Hosseini, Nickray & Ghanbari (2022) presented a scheduling algorithm called PQFAHP, leveraging a combination of Priority Queue, Fuzzy logic, and Analytical Hierarchy Process (AHP). The PQFAHP algorithm integrated diverse priorities and ranked tasks according to multiple criteria, including dynamic scheduling parameters such as completion time, energy consumption, RAM usage, and deadlines.
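A simple weighted-sum ranking conveys the multi-criteria flavor of such schemes; the cited work uses fuzzy AHP rather than this simplification, and the criteria names and weights below are illustrative assumptions.

```python
def weighted_multicriteria_rank(tasks, weights):
    """Weighted-sum stand-in for multi-criteria task ranking.
    `tasks` maps task id -> {criterion: value}; a lower aggregate score runs earlier."""
    def norm(c):
        vals = [t[c] for t in tasks.values()]
        lo, hi = min(vals), max(vals)
        return lambda v: 0.0 if hi == lo else (v - lo) / (hi - lo)
    scalers = {c: norm(c) for c in weights}
    score = lambda t: sum(w * scalers[c](t[c]) for c, w in weights.items())
    return sorted(tasks, key=lambda tid: score(tasks[tid]))

ranking = weighted_multicriteria_rank(
    {"t1": {"completion": 12, "energy": 5, "ram": 256},
     "t2": {"completion": 8,  "energy": 9, "ram": 128}},
    weights={"completion": 0.5, "energy": 0.3, "ram": 0.2})
print(ranking)
```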
Abdel-Basset et al. (2020) introduced a novel approach to task scheduling in fog computing, aiming to enhance the QoS for IoT applications by offloading tasks from the cloud. They proposed an energy-aware model for task scheduling in fog computing (TSFC) based on the marine predators algorithm (MPA).
Learning-based Heuristics
Learning-based heuristics represent a paradigm shift in task scheduling within fog computing, harnessing the power of machine learning to create dynamic and adaptable algorithms (Memari et al., 2022; Fahimullah et al., 2023; Ibrahim & Askar, 2023).
Fellir et al. (2020) proposed a multi-agent-based model for task scheduling in cloud-fog computing environments to address the challenges of managing large volumes of data generated by IoT devices. The model prioritized tasks based on factors such as task priority, wait time, status, and resource requirements.
Wang et al. (2024) introduced a novel Deep Reinforcement Learning-based IoT application Scheduling algorithm, DRLIS, designed to optimize the response time of heterogeneous IoT applications and balance the load on edge and fog servers efficiently.
Gao et al. (2020) introduced a collaborative computing framework integrating local computing (mobile device), edge cloud (MEC), and central cloud (MCC) components to enhance resource allocation and computation offloading for tasks with high computational requirements. Within this framework, they devised a novel Q-learning-based computation offloading (QLCOF) policy.
Hybrid Heuristics
Hybrid heuristics present a promising avenue for addressing the intricate challenges of task scheduling in fog computing environments (Dubey & Sharma, 2023; Kaushik & Al-Raweshidy, 2022). By amalgamating the strengths of individual heuristic approaches, these hybrid approaches offer heightened adaptability, solution quality, and efficiency.
Agarwal et al. (2023) introduced a novel methodology termed Hybrid Genetic Algorithm and Energy Conscious Scheduling (Hgecs) to tackle the challenges of multiprocessor task scheduling in fog-cloud computing systems. The method integrates a genetic algorithm with energy-conscious scheduling to optimize task allocation.
Leena, Divya & Lilian (2020) introduced a task scheduling algorithm tailored for fog nodes with a focus on enhancing energy efficiency. They introduced a hybrid heuristics approach to optimize the utilization of fog nodes with limited computational resources and energy.
Mtshali et al. (2019) presented an application scheduling technique based on virtualization technology to optimize energy consumption and average delay of real-time applications in fog computing networks.
Nature-inspired Heuristics
Nature-inspired heuristics represent a fascinating avenue in the quest for efficient and adaptable task-scheduling algorithms tailored for fog computing environments. Drawing inspiration from natural phenomena and biological processes, these approaches offer novel problem-solving strategies.
Usman et al. (2019) presented a comprehensive review of nature-inspired algorithms aimed at addressing energy issues in cloud datacenters, following a taxonomy organized along virtualization, consolidation, and energy-awareness dimensions.
Chhabra et al. (2022) introduced a novel approach termed h-DEWOA, which integrates chaotic maps, opposition-based learning (OBL), and differential evolution (DE) with the standard WOA, aiming to enhance exploration, convergence speed, and the balance between exploration and exploitation.
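The opposition-based learning component can be illustrated in isolation: for each random candidate, its "opposite" within the search bounds is also evaluated and the fitter of the two is kept. The sketch below shows only this OBL initialization (assuming a minimization objective), not the full h-DEWOA algorithm.

```python
import random

def opposition_based_init(pop_size, dim, lb, ub, fitness):
    """OBL initialization: pair every random candidate x with its opposite
    lb + ub - x and keep whichever has the better (lower) fitness."""
    population = []
    for _ in range(pop_size):
        x = [random.uniform(lb, ub) for _ in range(dim)]
        x_opp = [lb + ub - xi for xi in x]
        population.append(min((x, x_opp), key=fitness))
    return population
```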
Dabiri, Azizi & Abdollahpouri (2022) addressed the task scheduling challenge in fog-cloud computing environments by proposing a system model aimed at optimizing both total deadline violation time and energy consumption. They introduced two nature-inspired optimization methods, namely grey wolf optimization and grasshopper optimization algorithm, to efficiently tackle the scheduling problem.
Open Challenges and Future Directions
Despite the notable progress in heuristic approaches for task scheduling in fog computing, several unresolved challenges persist, signaling promising avenues for future research.
Dynamic and Uncertain Environment
Fog environments exhibit dynamic and uncertain characteristics, marked by fluctuating resource availability, evolving user demands, and unpredictable task arrival patterns. Existing heuristic approaches may struggle to adapt and optimize scheduling decisions in real-time to address these dynamic challenges effectively.
Heterogeneity and Scalability
The heterogeneous nature of fog environments, comprising diverse resources with varying capabilities and constraints, presents scalability hurdles. Future research should concentrate on devising scalable scheduling solutions capable of efficiently managing the extensive and diverse array of devices and tasks encountered in real-world IoT deployments.
Limited Resource Constraints
Edge devices and fog nodes typically operate under stringent resource constraints, including limited processing power, memory, and energy resources. Future heuristic approaches should prioritize efficiency and resource optimization while ensuring timely task completion within specified deadlines.
Security and Privacy Concerns
Data security and user privacy represent critical concerns in fog computing environments. Heuristic approaches must integrate features to ensure secure and confidential task execution while maintaining efficient scheduling operations and addressing emerging security and privacy threats.
Explainability and Transparency
The decision-making process behind scheduling decisions, especially within complex learning-based methods, may lack transparency and explainability. Future research should explore methods to enhance the interpretability and transparency of heuristic algorithms, fostering trust and understanding in automated scheduling systems.
Future Research Directions
Prominent future research directions to address these challenges include:
Real-time scheduling with machine learning: Integrating machine learning models with online learning capabilities can empower heuristic approaches to adapt in real-time to changing environmental conditions, enhancing the responsiveness and agility of fog computing systems.
Federated learning for collaborative scheduling: Leveraging federated learning approaches enables collaborative learning from distributed data sources across multiple fog nodes, improving scheduling