From a practitioner's perspective, I often see the need for computing to be more available and more distributed. When I started integrating the Internet of Things with OT and IT systems, the first problem I faced was that the volume of data the devices sent to our server was too large. I work in factory automation: we integrate 400 sensors, and each sensor sends three sets of readings every second.
Most of the sensor data generated is completely useless five seconds after it is produced. With 400 sensors, multiple gateways, multiple processes, and multiple systems, we need to process all of this data almost simultaneously.
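To put those numbers in perspective, here is a quick back-of-the-envelope calculation. The 100-byte payload size is an assumption I am making for illustration; the sensor count and rate are from the scenario above:

```python
# Rough data-volume estimate for the factory scenario described above.
# The 100-byte payload size is an assumption, not a measured figure.
SENSORS = 400
READINGS_PER_SECOND = 3          # each sensor sends 3 readings per second
PAYLOAD_BYTES = 100              # assumed size of one reading on the wire

def messages_per_day() -> int:
    return SENSORS * READINGS_PER_SECOND * 60 * 60 * 24

def gigabytes_per_day() -> float:
    return messages_per_day() * PAYLOAD_BYTES / 1e9

print(messages_per_day())             # 103,680,000 messages every day
print(round(gigabytes_per_day(), 2))  # ~10.37 GB/day before protocol overhead
```

Over a hundred million messages a day from a single site makes it clear why "send everything upstream" quickly stops scaling.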
Most proponents of data processing favor the cloud model, in which you always send everything to the cloud. This is also the first foundational computing model for the IoT.
1. Internet of Things Cloud Computing
With the IoT-plus-cloud model, you push sensor data to the cloud and process it there. A receiving module ingests the data and stores it in a data pool (a very large storage space), parallel processing is then applied to it (with Spark, Azure HDInsight, Hive, or similar), and the results are used to make decisions.
Since I started building IoT solutions, many new products and services have appeared that make this very easy:
1) If you are committed to AWS, you can use AWS Kinesis together with Lambda and its big data services.
2) You can use the Azure ecosystem, which makes building big data pipelines very easy.
3) Alternatively, you can use Google Cloud products such as Cloud IoT Core.
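The ingest-then-batch-process flow described above can be sketched in a few lines. This is a minimal in-memory sketch for clarity only; a real deployment would use a managed ingestion service (Kinesis, Event Hubs) and a distributed engine (Spark, Hive) instead of Python lists:

```python
from collections import defaultdict
from statistics import mean

# Sketch of the cloud pipeline: receiving module -> data pool -> batch job.
data_pool = []  # stands in for the "very large storage space"

def ingest(sensor_id: str, value: float, timestamp: float) -> None:
    """Receiving module: append raw readings to the data pool."""
    data_pool.append({"sensor": sensor_id, "value": value, "ts": timestamp})

def batch_process() -> dict:
    """Parallel-processing stage, reduced here to a per-sensor average."""
    groups = defaultdict(list)
    for reading in data_pool:
        groups[reading["sensor"]].append(reading["value"])
    return {sensor: mean(values) for sensor, values in groups.items()}

ingest("s1", 20.0, 0.0)
ingest("s1", 22.0, 1.0)
ingest("s2", 5.0, 0.5)
print(batch_process())  # {'s1': 21.0, 's2': 5.0}
```

Note that every raw reading crosses the network before any value is extracted from it, which is exactly the cost the later models try to avoid.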
Some of the challenges I have faced with cloud computing in the Internet of Things are:
1) Enterprises are unwilling to store their data on Google's, Microsoft's, or Amazon's platforms.
2) Latency and network-outage issues.
3) Growing storage costs, plus data security and durability concerns.
4) The usual big data frameworks are often not enough to build an ingestion module that can keep up with the incoming data rate.
2. Fog Computing for the Internet of Things
Fog computing makes us stronger: instead of sending data all the way to the cloud and waiting for a server to process it and respond, we use local processing units or computers.
When we implemented this four or five years ago, wireless options such as Sigfox and LoRaWAN were not available, and BLE had neither mesh networking nor long-range capability. We therefore had to use more costly networking solutions to establish a secure and durable connection to the data processing unit. That central unit was the core of our solution, and there were few dedicated solution providers.
My first fog computing implementation was an oil and gas pipeline project. The pipeline generated several terabytes of data, and we built a fog network with appropriately placed fog nodes to process it.
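The essential job of a fog node is to aggregate raw readings locally and forward only a compact summary upstream. A minimal sketch, assuming a fixed-size summary window (the window size and summary fields are my illustrative choices, not details from the pipeline project):

```python
from statistics import mean

# Sketch of a fog node: buffer raw readings locally, forward only
# min/max/avg summaries upstream. Window size is an assumed parameter.
class FogNode:
    def __init__(self, window_size: int = 60):
        self.window_size = window_size   # readings per summary window
        self.buffer = []                 # raw readings held locally
        self.outbox = []                 # summaries queued for the cloud

    def on_reading(self, value: float) -> None:
        self.buffer.append(value)
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        self.outbox.append({
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "avg": mean(self.buffer),
        })
        self.buffer.clear()

node = FogNode(window_size=3)
for v in (10.0, 12.0, 14.0):
    node.on_reading(v)
print(node.outbox)  # one summary record instead of three raw readings
```

With a 60-reading window, one summary replaces a minute of per-second traffic, which is the bandwidth saving that justified the fog network in the first place.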
Here is what I have learned from implementing fog networks since then:
1) It is not simple; there is a great deal you need to know and understand. Ordinary software development, and even most of our IoT work, is far more straightforward and open. In addition, when the Internet sits between you and your data as a barrier, it slows you down.
2) An implementation of this kind requires a very large team and multiple suppliers.
OpenFog and its impact on fog computing
The OpenFog Consortium (https://www.openfogconsortium.org/) maintains a framework for fog computing architectures, and among other things it provides a reference architecture.
3. Edge Computing for the Internet of Things
The Internet of Things is about capturing micro-interactions and responding as quickly as possible. Edge computing brings us closest to the data source and lets us apply machine learning right at the sensor nodes. The difference between edge and fog computing is that edge computing places the intelligence entirely in the sensor nodes themselves, while fog computing keeps it in a local network that provides computing power for data-heavy operations.
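"Intelligence at the sensor node" can be as simple as an on-device anomaly check that reacts locally instead of waiting for a cloud round trip. A sketch using a rolling z-score, where the window size, warm-up length, and threshold are all assumed values chosen for illustration:

```python
from collections import deque
from statistics import mean, stdev

# Sketch of edge-node intelligence: flag a reading that deviates more
# than `threshold` standard deviations from the recent rolling window.
# Window, warm-up count (5), and threshold are illustrative assumptions.
class EdgeDetector:
    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def is_anomaly(self, value: float) -> bool:
        anomalous = False
        if len(self.history) >= 5:  # wait for a small warm-up sample
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.history.append(value)
        return anomalous

det = EdgeDetector()
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 50.0]
flags = [det.is_anomaly(v) for v in readings]
print(flags)  # only the 50.0 spike is flagged
```

The node can trip an alarm or actuator on the flagged reading immediately and send only the event, not the raw stream, which is the behavior that distinguishes the edge model.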
Industry giants such as Microsoft and Amazon have released Azure IoT Edge and AWS Greengrass to bring machine intelligence to IoT gateways and sensor nodes with sufficient computing capability. These are excellent solutions that make the work very easy, but they have significantly shifted what we practitioners mean by edge computing.
4. Mist Computing for the Internet of Things
We have seen that we can use the following models to bring data processing and intelligence to the Internet of Things:
1) Cloud-based computing models
2) Fog-based computing models
3) Edge computing models
Mist computing goes a step further: we use the networking capability of the IoT devices themselves and distribute workloads across them, in a dynamic, intelligent model that neither fog nor edge computing provides. This kind of computing complements fog and edge computing and makes both better.
This new model makes it possible to achieve high-speed data processing and intelligence extraction on devices with as little as 256 KB of memory and data transfer rates of about 100 kb per second.
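On a device with 256 KB of memory there is no room for buffers of raw history or an ML runtime, so mist-style logic tends to be tiny stateful filters. One common pattern is a dead-band filter that transmits a reading only when it moves meaningfully; this sketch assumes a 0.5-unit band, which is an illustrative value:

```python
# Dead-band filter: a mist-style node transmits a reading only when it
# differs from the last transmitted value by more than a fixed band.
# State is a single float, so it fits MCU-class memory budgets.
# The 0.5 default band is an assumed value for illustration.
class DeadBandFilter:
    def __init__(self, band: float = 0.5):
        self.band = band
        self.last_sent = None  # last value actually transmitted

    def should_send(self, value: float) -> bool:
        if self.last_sent is None or abs(value - self.last_sent) > self.band:
            self.last_sent = value
            return True
        return False

f = DeadBandFilter(band=0.5)
sent = [v for v in (20.0, 20.1, 20.3, 21.0, 21.2, 25.0) if f.should_send(v)]
print(sent)  # [20.0, 21.0, 25.0]
```

Six readings become three transmissions here, and on a real sensor stream the reduction is usually far larger, which is what makes a roughly 100 kb/s radio link workable.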
I would not claim that this model is mature enough to carry our IoT computing needs yet. But for mesh networks in particular, we will definitely see it act as an enabler of this kind of computing model.
Personally, I have spent some time implementing mist-based proofs of concept in the lab, and the challenge we still have to solve is the distributed computing model and its governance. However, I am 100% sure that before long someone will come up with a better mist-based model that all of us can easily adopt and use.