Fog computing extends cloud computing to end devices in order to better support time-dependent, location-dependent, massive-scale, and latency-sensitive applications. In this paper, we propose a fog computing ecosystem, implement a real testbed, and evaluate it under diverse usage scenarios. More specifically, we study three usage scenarios and optimize our fog computing platform for them: (i) content dissemination in challenged networks, (ii) crowd-sourced fog computing, and (iii) programmable Internet-of-Things analytics. We leverage and enhance open source projects to realize our fog computing platform. We then solve an optimization problem in each usage scenario with a novel algorithm. Sample results show that our proposed algorithms outperform baseline algorithms by at least 30.3%, 20.0%, and 89.4% in terms of the main performance metrics of the three usage scenarios, respectively. Several ongoing tasks aim to improve our fog computing platform for: (i) network resource provisioning, (ii) adaptation to system dynamics, and (iii) device availability prediction.