
Low-latency, high-performance control systems

How much data is necessary? Where should it be stored and processed? Local processing can be key.

One of the biggest problems with IoT applications is the sheer volume of data they generate, which makes it hard to decide where to store and analyze it. Sending that information to the cloud for processing introduces delay, hence the need for low-latency, high-performance control systems at the local level. This is the niche in which SDS started many years ago, so we know it well.

When we design an IoT system for a client, we start by mapping out their data needs. We help them decide what data to record, how much of it to store, and what to do with it based on the project’s requirements and goals.

In a traditional IoT system, sensors collect data such as temperatures and vibrations. An on-property gateway gathers these readings and sends them to the cloud, where they are stored and then processed and analyzed with big data analytics or machine learning.
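As a rough illustration of that pipeline, here is a minimal sketch of a gateway that reads sensor values and ships every reading to the cloud. The sensor values, payload shape, and cloud endpoint are placeholders, not a real integration.

```python
"""Minimal sketch of a traditional sensor -> gateway -> cloud pipeline.

The sensor values, payload shape, and cloud endpoint are hypothetical;
a real gateway would read actual hardware and authenticate with your cloud.
"""
import json
import time
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # placeholder URL

def read_sensors() -> dict:
    # Stand-in for real sensor drivers (temperature in degrees C, vibration in g).
    return {"timestamp": time.time(), "temperature_c": 71.3, "vibration_g": 0.02}

def forward_to_cloud(reading: dict) -> None:
    # The gateway ships every raw reading upstream; all analysis happens in the cloud.
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=5)

if __name__ == "__main__":
    forward_to_cloud(read_sensors())
```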

Become more powerful and energy-efficient

IoT systems become more powerful and energy-efficient when edge computing and Fog-based machine learning (ML) are properly leveraged. Edge computing means locating data processing closer to the devices operating “at the edge” of the network, rather than up in the cloud or in a central application server. This minimizes the amount of data sent back and forth to the cloud for processing, device control, and reporting; instead, more of these activities happen at the edge.
Processing data at the edge of the network can be much faster, providing real-time insight into your application. The processing units that sit between the edge and the cloud make up the “Fog.” These processors operate close to the data source, but have enough power and bandwidth to perform heavier data processing and analytics.

Fog processing may be a better approach, especially for certain industrial settings where connectivity to the cloud may be limited. Data that is less time-sensitive (such as information for preventive maintenance) can then be sent to the cloud for further analysis.
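A minimal sketch of that split follows: time-sensitive readings are checked and acted on at the edge, while less time-sensitive data is queued for later upload to the cloud. The vibration threshold and the local action are illustrative assumptions, not a reference design.

```python
"""Sketch of splitting work between the edge and the cloud.

The threshold, the act_locally() stub, and the queue-then-upload policy
are illustrative assumptions.
"""
from collections import deque

VIBRATION_LIMIT_G = 1.5             # assumed safety threshold
maintenance_queue: deque = deque()  # less time-sensitive data, uploaded to the cloud later

def act_locally(reading: dict) -> None:
    # Placeholder for a time-critical response, e.g. stopping a motor.
    print(f"local shutdown triggered: vibration={reading['vibration_g']} g")

def handle_reading(reading: dict) -> None:
    # The time-sensitive check runs at the edge, with no round trip to the cloud.
    if reading["vibration_g"] > VIBRATION_LIMIT_G:
        act_locally(reading)
    # Everything is also queued for preventive-maintenance analysis in the cloud.
    maintenance_queue.append(reading)

handle_reading({"vibration_g": 2.1, "temperature_c": 80.0})
```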

It’s critically important to make the right decisions as to where to process the data in an IoT system based on your latency needs (a trade-off sketched in the example after this list):

  • Local processing can be done in microseconds.
  • Fog processing requires milliseconds.
  • Cloud processing can be accomplished in seconds.
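To make that trade-off concrete, here is a small sketch that picks the slowest tier still meeting a response-time budget. The tier latencies are the order-of-magnitude figures above, not measured values from any real deployment.

```python
"""Sketch of choosing a processing tier from a latency budget.

The per-tier latencies mirror the order-of-magnitude list above; real
numbers would come from measuring your own network and devices.
"""
TIER_LATENCY_SECONDS = {
    "local": 1e-6,   # microseconds
    "fog":   1e-3,   # milliseconds
    "cloud": 1.0,    # seconds
}

def choose_tier(latency_budget_s: float) -> str:
    # Prefer the slowest tier that still meets the budget, since it usually
    # has the most compute available and the lowest cost per device.
    for tier in ("cloud", "fog", "local"):
        if TIER_LATENCY_SECONDS[tier] <= latency_budget_s:
            return tier
    raise ValueError("No tier can meet this latency budget")

print(choose_tier(0.002))  # -> "fog": a 2 ms budget rules out the cloud round trip
print(choose_tier(10.0))   # -> "cloud": plenty of time for a round trip
```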

How much data do you need to send?

The big questions are how much data you need to send up to the cloud and where to process the data within these three tiers. Many IoT systems (particularly remote Industrial IoT systems) lack the capability and connectivity to send all of the data collected to the cloud fast enough to be useful, making the Fog the ideal place for the heavy lifting of machine learning.

How fast do you need to send it?

There are many instances where speed of processing is critically important. The financial services industry is one area where large sums of money can be gained or lost without real-time data processing.

How fast does the system need to respond?

Autonomous cars are another example. These vehicles produce an enormous amount of data that needs to be processed and analyzed instantly. There is no time to send all of the data to the cloud for processing. Edge computing allows the data to be processed locally, so the vehicle can respond quickly, keeping passengers and pedestrians safe.

Where can the data be processed?

The solution is to segment data analysis and processing. Data analysis that is needed in real time can be done locally on the device or at the Fog level, depending on the exact timing requirements and the processing capabilities of the sensors being used. Only some of this data needs to go up to the cloud, where it can be combined with data from other systems and devices for more powerful, historical analysis.
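One common way to do that segmentation is to act on raw samples locally and send only periodic summaries upstream. The sketch below assumes a simple per-window summary; the window length and summary fields are chosen purely for illustration.

```python
"""Sketch of reducing raw readings to a compact summary before they go to the cloud.

The one-minute window and the summary fields are illustrative choices.
"""
import statistics
import time

def summarize(window: list) -> dict:
    # Only this compact summary travels to the cloud for historical analysis;
    # the raw samples are handled (and acted on) at the edge or in the Fog.
    temps = [r["temperature_c"] for r in window]
    return {
        "window_end": time.time(),
        "samples": len(window),
        "temp_mean_c": statistics.mean(temps),
        "temp_max_c": max(temps),
    }

# Fake one-minute window of readings taken every half second.
raw_window = [{"temperature_c": 70.0 + i * 0.05} for i in range(120)]
print(summarize(raw_window))
```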

Making the right decisions

We help clients make the right decisions for data analysis and edge computing based on the answers to such questions as:

  • Do you need real-time data processing?
  • Do you have connectivity or bandwidth issues that edge computing can help resolve?

If edge computing is the right choice for you, we can assist with all stages of edge implementation:

Proof of concept (POC) and prototype

We will build a POC, followed by a prototype that delivers the required functionality. Future iterations will use a combination of off-the-shelf and custom hardware to stay within any cost limitations on the end product.

Design of network architecture

This includes your cloud servers, routers, access and edge nodes, gateways, and end devices.

Sensor selection

Edge sensors are becoming more powerful. This is what makes local and Fog processing possible—but these sensors also cost more. The design of the system must balance functionality with cost. This is where an expert design team is important.

Development of data governance policies

These should outline which data will be processed at the edge and which will go to the cloud. Policies should also document who has access to the data. Backup and data-recovery procedures are critical and should be documented as well.
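As an illustration of the kinds of decisions such a policy records, here is a sketch of one expressed as configuration. The stream names, roles, locations, and retention periods are hypothetical examples, not a standard schema.

```python
"""Sketch of a data-governance policy captured as configuration.

Every field and value here is a hypothetical example of the decisions such
a policy records; it is not a standard schema.
"""
DATA_GOVERNANCE_POLICY = {
    "vibration": {
        "processed_at": "edge",        # acted on locally in real time
        "sent_to_cloud": "summaries",  # raw samples stay on site
        "access": ["plant_operations"],
        "backup": {"location": "on-site NAS", "retention_days": 30},
    },
    "temperature": {
        "processed_at": "fog",
        "sent_to_cloud": "hourly aggregates",
        "access": ["plant_operations", "maintenance_analytics"],
        "backup": {"location": "cloud object storage", "retention_days": 365},
    },
}
```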

Data analytics

When it comes to actual data analysis, you may be able to use an off-the-shelf analytics package to analyze your data or you may need a custom solution to fit your needs. We can make the right recommendation for your project constraints and requirements.

Security

The security of your edge devices should be a primary concern. Several companies provide solutions that secure microcontrollers at both the board and network level. Physical security is also important, as many edge devices sit in public locations that can be difficult to secure. We can help with all aspects of security.

Find out what we can do for you

Reach out to us today to see how we can help you with your low-latency, high-performance control system.