Having covered the overview and the different forms of edge computing in the whitepaper 'Edge Computing in Industrial Environment', we discuss here in more detail how edge is deployed in an industrial environment, considering its scalability, reliability, latency, and data/bandwidth requirements, along with the supporting IEEE standards.
As discussed in the section "What is Edge computing?" of the whitepaper, the answer to squaring this circle is to embrace edge computing as part of an Industry 4.0 strategy. Edge computing introduces an intermediate layer that connects the central cloud and the edges of the network, providing specialised services through dedicated hardware and software.
Deploying Edge
The key components of edge computing are the cloud (a public or private cloud that holds the repository for container-based workloads and also hosts and runs applications), edge devices (equipment with attached sensors and limited compute resources), edge nodes (any edge device, edge server, or edge gateway on which edge computing can be performed), edge servers (typically used to run enterprise application workloads and shared services), and edge gateways (able to host enterprise applications and provide network services).
Edge devices are physical hardware situated at the network's edge with enough memory, processing power, and compute resources to gather, analyse, and act on data in near real time with only limited support from other parts of the network. Different edge devices offer different levels of processing; they can also filter data so that only significant changes are sent to the cloud for further analysis. The edge is also more fault tolerant, since corrections can be made locally and immediately, and even with weak signal strength an edge node can continue to support its devices independently.
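To make this filtering behaviour concrete, the minimal sketch below (a hypothetical edge-node loop, not any particular vendor's implementation) forwards a sensor reading to the cloud only when it differs from the last forwarded value by more than a threshold; the simulated sensor read and the print-based publish are placeholders for a real device driver and uplink.

```python
import random
import time

CHANGE_THRESHOLD = 0.5   # forward a reading only if it moved by more than this
SAMPLE_PERIOD_S = 1.0    # local sampling interval

def read_sensor() -> float:
    """Stand-in for a real device driver (e.g. a Modbus or OPC UA read)."""
    return 20.0 + random.uniform(-1.0, 1.0)

def publish_to_cloud(value: float) -> None:
    """Stand-in for the uplink to the central cloud (e.g. an MQTT or HTTPS publish)."""
    print(f"forwarding {value:.2f} to the cloud")

def edge_filter_loop(iterations: int = 30) -> None:
    last_sent = None
    for _ in range(iterations):
        value = read_sensor()                  # gather data locally on the edge node
        if last_sent is None or abs(value - last_sent) > CHANGE_THRESHOLD:
            publish_to_cloud(value)            # only significant changes leave the edge
            last_sent = value
        # other readings are handled (or discarded) locally, saving bandwidth
        time.sleep(SAMPLE_PERIOD_S)

if __name__ == "__main__":
    edge_filter_loop()
```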
Containerization is one of the most common ways to package applications for cloud and edge use. The application is bundled with all the operating-system libraries it requires using technology from suppliers such as Docker, and the complete container is moved from server to server as needs change. This placement is usually handled by orchestration tools such as Kubernetes, which monitor hardware availability and other factors to decide when and where containers should run. At the same time, the centralised cloud computing model is increasingly inefficient for processing and analysing the enormous volumes of data gathered from IoT devices, because that data must be transferred over networks with limited capacity. Since edge computing offloads computing tasks from the centralised cloud to the edge, close to the IoT devices, preprocessing at the edge greatly reduces the amount of data that has to be transported.
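As a hedged illustration of how a containerised workload might be started programmatically on an edge server, the sketch below uses the Docker SDK for Python; the registry, image name, container name, and environment variable are hypothetical, and in practice an orchestrator such as Kubernetes would normally make these placement decisions rather than a hand-written script.

```python
import docker

# Connect to the Docker daemon running on this edge server
client = docker.from_env()

# Hypothetical image held in a (public or private) cloud registry
IMAGE = "registry.example.com/edge/anomaly-detector:1.0"

# Start the workload as a long-lived container on this edge node;
# the name, environment variable, and restart policy are illustrative.
container = client.containers.run(
    IMAGE,
    name="anomaly-detector",
    detach=True,                            # run in the background
    environment={"CLOUD_ENDPOINT": "https://cloud.example.com/ingest"},
    restart_policy={"Name": "always"},      # come back up after failures or reboots
)

print(container.short_id, container.status)
```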
As shown in the figure, the IoT Connect® platform enables the system to quickly connect devices, collect data, and generate valuable insights across the enterprise. The IoT Connect platform consists of various components such as tools, technologies, SDKs, APIs, and protocols, and it supports a matrix of devices, sensors, gateways, actuators, and other modules. These devices collect different types of data at different intervals, which can then be monitored, filtered, and processed in real time to provide actionable insights. New revenue streams and service models can also be created by quickly deploying solutions that scale across production environments.
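As a generic, hedged sketch of the device-to-platform data path described above (not the IoT Connect SDK itself, whose API is not shown here), the snippet below posts periodic telemetry samples from an edge gateway to a hypothetical HTTPS ingestion endpoint; the URL, field names, and sampling interval are assumptions, and a real deployment would use the platform's own SDK, credentials, and message format, commonly over MQTT.

```python
import json
import random
import time
import urllib.request

# Hypothetical ingestion endpoint; a real deployment would use the
# platform's own SDK, endpoint, and authentication instead.
INGEST_URL = "https://cloud.example.com/api/telemetry"

def publish(sample: dict) -> None:
    """POST one JSON telemetry sample to the (assumed) cloud endpoint."""
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(sample).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()

# Collect a few samples at a fixed interval and send them upstream
for _ in range(10):
    sample = {
        "ts": time.time(),
        "temperature_c": round(75.0 + random.uniform(-2.0, 2.0), 2),
        "vibration_mm_s": round(4.0 + random.uniform(-0.5, 0.5), 2),
    }
    publish(sample)
    time.sleep(1.0)
```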