IoT, the internet of things, has been transforming IT landscapes all across the world and is already seen as a key technology for many future-facing projects. Traditional IoT architectures, where data is centrally collected and processed, cannot scale indefinitely due to limitations such as bandwidth. In the field of fog computing, possible solutions are being developed to address such issues associated with implementing IoT.


What is fog computing? A definition

Fog computing is a cloud technology in which data generated by end devices isn’t loaded directly into the cloud but is instead preprocessed in decentralized mini data centers. The concept involves a network structure that extends from the network’s outer perimeter (where data is generated by IoT devices) to a central data endpoint in a public cloud or in a private data center (private cloud).

The aim of “fogging” is to shorten communication distances and reduce data transmission through external networks. Fog nodes form an intermediate layer in the network where it is decided which data is processed locally and which is forwarded to the cloud or to a central data center for further analysis or processing.

The following schematic illustration shows the three layers of fog computing architecture:

Image: Schematic diagram of an IoT architecture’s edge, fog, and cloud layers
In fog computing, data storage and preprocessing resources are available in a decentralized manner across the network. Instead of having to rely solely on a public cloud or a central data center, these resources can be accessed through fog nodes on an intermediate layer within the network.
  • Edge layer: The edge layer includes all of an IoT architecture’s “smart” devices (edge devices). Data generated in the edge layer is either processed directly on the device or transmitted to a server (fog node) in the fog layer.
  • Fog layer: The fog layer consists of a number of powerful servers that receive data from the edge layer, preprocess it, and upload it to the cloud as needed.
  • Cloud layer: The cloud layer is the central data endpoint of a fog computing architecture.
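The division of labor between the three layers can be sketched in a few lines of Python. This is a minimal illustration, not a real fog framework: the threshold, function, and field names are all assumptions made for the example.

```python
from statistics import mean

# Hypothetical threshold: readings below it count as routine data
ALERT_THRESHOLD = 80.0

def fog_node_process(readings):
    """Preprocess raw edge-layer readings on a fog node.

    Routine readings are reduced to a local summary; only the summary
    and any anomalous readings are forwarded to the cloud layer.
    """
    anomalies = [r for r in readings if r >= ALERT_THRESHOLD]
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "anomalies": anomalies,  # anomalous values are forwarded in full
    }
    return summary  # this summary is all that travels to the cloud

# Edge layer: raw sensor data generated by IoT devices
raw = [21.5, 22.0, 85.3, 21.8]
payload = fog_node_process(raw)
print(payload)
```

Four raw readings are reduced to a small summary plus one anomaly, which is exactly the traffic reduction the fog layer is meant to provide.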

A reference architecture for fog systems was developed by the OpenFog Consortium (now the Industry IoT Consortium, IIC). You can find more white papers on fog computing on the IIC website.

How is fog computing different from cloud computing?

What sets fog and cloud computing apart is how resources are provisioned and how data is processed. Cloud computing usually takes place in centralized data centers. Resources such as processing power and storage are bundled by backend servers and made available to clients through the network. Communication between two or more end devices always takes place via a server in the background.

Systems like the ones used in smart manufacturing require data to be continuously exchanged between countless end devices, pushing such an architecture beyond its limits. Fog computing makes use of intermediate processing close to the data source in order to reduce data throughput to the data center.

How is fog computing different from edge computing?

It’s not only the data throughput of large-scale IoT architectures that pushes cloud computing to its limits, though. Another problem is latency. Centralized data processing is always associated with a time delay due to long transmission paths. End devices and sensors have to communicate with each other through the server in the data center, resulting in a delay in the external processing of the request as well as the response. Such latency times become problematic in IoT-supported production processes where real-time information processing is a must for machines to react immediately when an incident occurs.

One solution to the latency problem is edge computing, a concept within the framework of fog computing in which data processing is not only decentralized but takes place directly in the end device at the edge of the network. Each smart device is equipped with its own microcontroller, enabling basic data processing and communication with other IoT devices and sensors. This not only reduces latency but also the data throughput at the central data center.

While fog computing and edge computing are closely related, they are not the same thing. The crucial difference lies in where and when the data is processed. With edge computing, data is processed where it is generated, and in most cases, the data is sent immediately after it’s processed. In contrast, fog computing collects and processes raw data from multiple sources in a data center that is located between the data source and a centralized data center. Processing the data in this way makes it possible to avoid forwarding irrelevant data or results to the central data center. Whether edge computing, fog computing, or a combination of both is best depends heavily on the individual use case.
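The distinction can be made concrete with a short sketch: an edge function handles one device's own reading on the spot, while a fog function sees raw data from several devices at once and filters out what the central data center doesn't need. All names and the 50.0 threshold are illustrative assumptions, not part of any standard.

```python
def edge_process(raw_reading):
    """Edge computing: a single reading, handled on the device that made it."""
    # e.g. a local alarm decision, sent on immediately after processing
    return {"device_result": raw_reading > 50.0}

def fog_process(raw_readings_by_device):
    """Fog computing: raw data collected from multiple sources on a
    shared node between the devices and the cloud, filtered so that
    irrelevant data never reaches the central data center."""
    relevant = {dev: r for dev, r in raw_readings_by_device.items() if r > 50.0}
    return relevant  # only this subset is forwarded upstream

print(edge_process(72.0))                     # {'device_result': True}
print(fog_process({"s1": 12.0, "s2": 72.0}))  # {'s2': 72.0}
```

The edge function never sees other devices' data; the fog function exists precisely because it does, which is why the two approaches complement rather than replace each other.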

What are the advantages of fog computing?

Fog computing offers solutions to a variety of problems associated with cloud-based IT infrastructures. It prioritizes short communication paths and keeps uploading to the cloud to a minimum. Here are the most important advantages:

  1. Less network traffic: Fog computing reduces traffic between IoT devices and the cloud.
  2. Cost savings on third-party networks: Network providers bear high costs for high-speed uploads to the cloud. Fog computing reduces these costs by minimizing upstream traffic.
  3. Offline availability: In a fog computing architecture, IoT devices remain functional even when the connection to the cloud is unavailable.
  4. Lower latency: Fog computing shortens communication paths, accelerating automated analysis and decision-making processes.
  5. Data security: In fogging, device data is often preprocessed within the local network. This enables an implementation where sensitive data can remain within the company or be encrypted or anonymized before being uploaded to the cloud.
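The last point, anonymizing data before it leaves the local network, can be illustrated with a minimal sketch. The record layout and field names are assumptions for the example; a real deployment would follow its own data protection requirements.

```python
import hashlib

def anonymize(record):
    """Replace a direct identifier with a one-way hash on the fog node,
    before the record is uploaded to the cloud."""
    out = dict(record)
    # SHA-256 is one-way: the cloud can still group records per machine,
    # but cannot recover the original identifier from the hash
    out["machine_id"] = hashlib.sha256(record["machine_id"].encode()).hexdigest()[:12]
    return out

record = {"machine_id": "press-07", "temp_c": 64.2}
print(anonymize(record))
```

Only the pseudonymized record travels through the external network; the mapping back to "press-07" never leaves the company.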

What are the disadvantages of fog computing?

Decentralized processing in mini data centers also comes with its own set of disadvantages, chief among them the cost and complexity of maintaining and managing a distributed system. The main disadvantages of fog computing systems are:

  1. Higher hardware costs: Fog computing requires that IoT devices and sensors be equipped with additional processing units to enable local data processing and device-to-device communication.
  2. Increased maintenance requirements: Decentralized data processing requires more maintenance, since processing and storage locations are distributed across the entire network and, unlike cloud solutions, can’t be maintained or administered centrally.
  3. Additional network security requirements: Every fog node is an additional point of attack, making fog architectures vulnerable to threats such as man-in-the-middle attacks.