Fog Computing proposes a highly distributed platform that relies on nodes at the edge of the network to provide low-latency services. Its true potential lies in its use as a generic scalable platform, running multiple IoT applications simultaneously.

Summary

Fog Computing proposes a highly distributed platform with nodes at the edge of the network. These nodes offer resources such as computing, storage, and communication to the applications running on this infrastructure. Fog is not a substitute for the Cloud but rather an extension of it, since Fog nodes remain connected to the Cloud. An application's requirements determine whether it runs on the Fog or whether it has to go to the Cloud, and ideally this placement is completely transparent to the end user. Its main advantages are:

  • Processing data close to where it is generated, achieving low latency and reducing network bandwidth usage
  • A common infrastructure for multi-tenant applications, eliminating the silo problem and thus optimizing deployments
  • Support for large-scale distributed systems
  • Support for critical applications that require security and real-time capabilities
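To make the Fog-or-Cloud placement described above more concrete, the sketch below shows one simple way a Fog node could decide whether to run a workload locally or forward it to the Cloud, based on the application's declared latency and resource requirements. This is a minimal sketch in Python; the names `FogNode`, `AppRequirements`, and `place`, as well as the specific fields and thresholds, are illustrative assumptions and not part of this project's actual design.

```python
from dataclasses import dataclass


@dataclass
class FogNode:
    """A single Fog node and the resources it can offer locally (hypothetical model)."""
    cpu_cores_free: int
    storage_gb_free: float
    latency_to_device_ms: float   # round-trip latency to nearby IoT devices
    latency_to_cloud_ms: float    # round-trip latency to the Cloud backend


@dataclass
class AppRequirements:
    """Requirements declared by an IoT application (hypothetical model)."""
    max_latency_ms: float
    cpu_cores: int
    storage_gb: float


def place(app: AppRequirements, node: FogNode) -> str:
    """Return 'fog' if the node can satisfy the application locally,
    otherwise 'cloud' (the node forwards the workload upstream)."""
    fits_resources = (app.cpu_cores <= node.cpu_cores_free
                      and app.storage_gb <= node.storage_gb_free)
    meets_latency = node.latency_to_device_ms <= app.max_latency_ms
    return "fog" if (fits_resources and meets_latency) else "cloud"


# Example: a real-time analytics task that tolerates at most 20 ms of latency.
node = FogNode(cpu_cores_free=4, storage_gb_free=32.0,
               latency_to_device_ms=5.0, latency_to_cloud_ms=80.0)
app = AppRequirements(max_latency_ms=20.0, cpu_cores=2, storage_gb=1.0)
print(place(app, node))  # -> 'fog'
```

In a real deployment this decision would involve many more factors (security constraints, current load, mobility), but the sketch captures the basic idea of keeping latency-critical work on the edge and pushing the rest to the Cloud.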

Objectives

  1. Fog Computing architectures for IoT workloads
  2. Fog node architectures, focused on the first aggregation levels
  3. Analyze and exploit the hierarchy that Fog naturally introduces to improve network bandwidth usage (see the aggregation sketch after this list)
  4. Exploit IoT mobility through the distributed Fog nodes
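To illustrate the bandwidth argument behind objective 3, the sketch below shows first-level aggregation on a Fog node: raw sensor readings are summarised locally so that only compact records travel upstream toward the Cloud. This is a minimal Python illustration; the function name `aggregate_at_fog`, the window size, and the summary fields are assumptions, not part of the project's actual design.

```python
from statistics import mean


def aggregate_at_fog(readings: list[float], window: int = 60) -> list[dict]:
    """First-level aggregation: summarise each window of raw sensor readings
    into a single record (min/mean/max/count) before forwarding it upstream."""
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append({
            "min": min(chunk),
            "mean": mean(chunk),
            "max": max(chunk),
            "count": len(chunk),
        })
    return summaries


# Example: 3600 one-second temperature samples become 60 per-minute summaries,
# so the upstream link carries ~60x fewer records than the raw stream.
raw = [20.0 + (i % 60) * 0.01 for i in range(3600)]
upstream = aggregate_at_fog(raw, window=60)
print(len(raw), "raw readings ->", len(upstream), "summaries sent upstream")
```

Repeating this kind of summarisation at each aggregation level of the Fog hierarchy is what reduces the bandwidth needed between the edge and the Cloud.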