
A New Approach to the Resource Allocation Problem in Fog Computing Based on Learning Automata

CYBERNETICS AND SYSTEMS (2022)

Abstract
With the rapid development of Internet of Things (IoT) devices, IoT applications require real-time, low-latency responses. Fog computing is a suitable platform for processing IoT applications. However, fog computing devices are distributed, dynamic, and resource-constrained, so allocating fog resources to execute heterogeneous, delay-sensitive IoT tasks is a significant challenge. In this paper, we mathematically formulate the resource allocation problem to minimize the makespan while meeting the quality of service (QoS) requirements of IoT tasks. We then propose two learning automata, an automaton for task selection (A_TF) and an automaton for virtual machine selection (A_VF), to efficiently map IoT tasks to fog nodes (FNs). In this approach, a task is selected from the action set of A_TF, and a fog node is then selected from the action set of A_VF. If the requirements for executing the task on the fog node are met, the resource is allocated to the task. The efficiency of the proposed algorithm is evaluated through several simulation experiments under different fog configurations, using an extension of iFogSim to simulate a realistic fog environment. The experimental results indicate that, as the number of submitted tasks increases, the proposed algorithm outperforms existing algorithms in terms of makespan, response time, delay, processing time, and cost.
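The abstract describes the mapping loop only at a high level. The sketch below illustrates one way such a two-automata scheme could work, assuming a standard linear reward-inaction (L_R-I) update and toy task/node attributes (length, deadline, MIPS, latency) that are not taken from the paper; it is an illustrative assumption, not the authors' exact algorithm.

```python
import random

class LearningAutomaton:
    """Variable-structure automaton with a linear reward-inaction (L_R-I) update."""
    def __init__(self, n_actions, learning_rate=0.1):
        self.p = [1.0 / n_actions] * n_actions   # action probabilities
        self.a = learning_rate

    def select(self):
        # Sample an action according to the current probability vector.
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def reward(self, chosen):
        # L_R-I: shift probability mass toward the rewarded action;
        # on a penalty the probabilities are simply left unchanged.
        for i in range(len(self.p)):
            if i == chosen:
                self.p[i] += self.a * (1.0 - self.p[i])
            else:
                self.p[i] -= self.a * self.p[i]

def allocate(tasks, nodes, iterations=1000):
    """Map tasks to fog nodes using one automaton over tasks and one over nodes."""
    a_tf = LearningAutomaton(len(tasks))    # task-selection automaton (A_TF)
    a_vf = LearningAutomaton(len(nodes))    # node-selection automaton (A_VF)
    mapping = {}
    for _ in range(iterations):
        t = a_tf.select()
        n = a_vf.select()
        task, node = tasks[t], nodes[n]
        # Toy feasibility check: the node must finish the task before its deadline.
        exec_time = task["length"] / node["mips"] + node["latency"]
        if exec_time <= task["deadline"]:
            mapping[t] = n
            a_tf.reward(t)
            a_vf.reward(n)
    return mapping

# Example usage with made-up task and node parameters.
tasks = [{"length": 4000, "deadline": 2.0}, {"length": 1000, "deadline": 0.5}]
nodes = [{"mips": 2000, "latency": 0.1}, {"mips": 8000, "latency": 0.05}]
print(allocate(tasks, nodes))
```

Under this reading, a rewarded task/node pair becomes more likely to be re-selected, so the mapping gradually concentrates on allocations that satisfy the QoS check.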
Key words
Fog computing, learning automata, makespan, resource allocation, task scheduling