Edge computing does not need to support the same number of virtual machines and containers as cloud server environments. In many cases, 8-core or 12-core processors are perfectly suited to OT and IT deployments in edge computing scenarios.
Read Time: 4 minutes
In the rapidly evolving landscape of technology, edge computing stands out as a game-changer by bringing computation and data storage closer to where data is generated. This shift from traditional cloud-based systems to edge environments highlights a key difference: the scale and processing power required. While cloud data centers are designed to handle thousands of virtual machines (VMs) and containers for hundreds of end users and processes, edge computing operates on a different scale.
Cloud data centers are usually overbuilt to provide the scalability needed over years to accommodate fluctuating and growing demands from diverse applications and services. Data centers are designed to house some of the most powerful – and power-hungry – servers, which may have more than 200 processing cores, enabling hundreds of separate VMs and containers. Computing resources can be added and allocated as needed, providing flexibility and availability for many use cases that can scale over time.
Edge computing focuses on localized, real-time data processing carried out close to data sources such as IoT sensors, cameras, and industrial machinery, often for a single enterprise end user and within the boundary of that enterprise’s facilities and network. Edge computing systems can be designed to handle specific, often more predictable tasks, so they do not need the same virtualization capacity, making 8-core or 12-core processors perfectly adequate for the job. These processors offer ample power for the specialized tasks of data collection, processing, and analysis right at the edge of the network.
Where a single cloud data center houses hundreds to thousands of servers to serve the processing needs of hundreds to thousands of end users, a typical edge deployment might house a handful of servers in each facility (or branch or store), but this may be replicated across hundreds or thousands of such locations.
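To make that difference in scale concrete, here is a rough back-of-envelope sketch in Python. Every figure in it (server counts, cores per server, number of sites) is an illustrative assumption rather than data from any particular deployment.

```python
# Back-of-envelope comparison of a centralized cloud cluster versus a
# replicated edge fleet. All figures are illustrative placeholders,
# not measurements or vendor specifications.

CLOUD_SERVERS = 1_000          # assumed servers in one cloud data center
CLOUD_CORES_PER_SERVER = 200   # assumed high-core-count cloud CPUs

EDGE_SITES = 500               # assumed number of stores/branches/facilities
EDGE_SERVERS_PER_SITE = 3      # a "handful" of servers per site
EDGE_CORES_PER_SERVER = 12     # a right-sized edge CPU

cloud_cores = CLOUD_SERVERS * CLOUD_CORES_PER_SERVER
edge_cores = EDGE_SITES * EDGE_SERVERS_PER_SITE * EDGE_CORES_PER_SERVER

print(f"Cloud data center: {cloud_cores:,} cores in one location")
print(f"Edge fleet:        {edge_cores:,} cores spread across {EDGE_SITES} sites")
print(f"Per-site edge capacity: {EDGE_SERVERS_PER_SITE * EDGE_CORES_PER_SERVER} cores")
```

Under these assumed numbers, each individual site needs only a few dozen cores, even though the fleet as a whole is large; the sizing decision is made per site, not per data center.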
Reasons why right-sizing servers works so well at the edge:
1. Tailored Processing Power: Edge workloads are usually optimized for specific applications, such as real-time analytics or control functions, with a narrower focus and less variability than broad, multi-tenant applications running in cloud data centers.
2. Reduced Latency: One of the primary benefits of edge computing is that processing data close to its source reduces latency compared to a cloud data center. The reduced need for virtualization also simplifies the architecture and may allow faster response times and more efficient data handling within the edge server; a simple illustration follows this list.
3. Cost Efficiency: Deploying edge devices with fewer cores can reduce costs. Matching the processing power to the actual processing need leads to lower hardware and energy expenses, making edge computing a cost-effective solution with a smaller environmental footprint.
4. Scalability Needs: While cloud environments must be designed to scale dynamically to handle diverse and unpredictable workloads, edge computing deployments are more predictable and can be sized and scaled more precisely to specific needs.
5. Easier Maintenance: Right-sized hardware, especially when ruggedized, reduces IT truck rolls and the staff time needed to keep things running. Modular, lightweight designs simplify and streamline maintenance and lead to a lower total cost of ownership.
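As a rough illustration of the latency point (item 2 above), the sketch below compares handling a sensor reading locally with handling it across a simulated wide-area round trip to a cloud region. The 40 ms round-trip time and the trivial process_reading function are assumptions made purely for demonstration; real latencies depend on the network and the workload.

```python
# Illustration of why processing at the edge avoids the network round trip
# to a distant cloud region. The round-trip time below is a hypothetical
# WAN latency used only for demonstration; real values vary widely.
import time

SIMULATED_CLOUD_RTT_S = 0.040  # assumed round-trip time to a cloud region

def process_reading(value: float) -> float:
    """Trivial stand-in for local analytics (e.g., scaling a sensor value)."""
    return value * 0.98

def handle_at_edge(value: float) -> float:
    return process_reading(value)          # compute stays on the local server

def handle_in_cloud(value: float) -> float:
    time.sleep(SIMULATED_CLOUD_RTT_S)      # placeholder for the network hop
    return process_reading(value)

for label, handler in (("edge", handle_at_edge), ("cloud", handle_in_cloud)):
    start = time.perf_counter()
    handler(42.0)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{label:>5}: {elapsed_ms:.2f} ms per reading")
```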
Edge computing is particularly beneficial in scenarios where real-time processing is crucial. For instance, in industrial settings, edge devices can monitor machinery performance, detect anomalies, and make immediate adjustments without relying on distant cloud servers. In smart cities, edge computing can manage traffic signals and public safety systems with minimal latency. These applications leverage the localized power of edge computing, making high-core-count processors unnecessary.
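As a minimal sketch of what such a localized task might look like, the snippet below runs a rolling-window anomaly check on a vibration signal and triggers an immediate local adjustment, with no cloud round trip. The read_vibration and slow_down_machine functions are hypothetical stand-ins for whatever sensor and control interfaces a real deployment would expose.

```python
# Minimal sketch of a localized, real-time task an edge server might run:
# watch a vibration sensor, flag readings that deviate sharply from the
# recent average, and trigger an immediate local adjustment.
import random
import statistics
from collections import deque

WINDOW = 50          # number of recent samples to keep
Z_THRESHOLD = 3.0    # how many standard deviations counts as an anomaly

def read_vibration() -> float:
    """Placeholder for a real sensor read; here it just returns noisy data."""
    return random.gauss(1.0, 0.05)

def slow_down_machine() -> None:
    """Placeholder for an immediate local control action."""
    print("Anomaly detected: issuing local slowdown command")

recent = deque(maxlen=WINDOW)
for _ in range(500):                      # stand-in for a continuous loop
    sample = read_vibration()
    if len(recent) == WINDOW:
        mean = statistics.fmean(recent)
        stdev = statistics.stdev(recent)
        if stdev > 0 and abs(sample - mean) / stdev > Z_THRESHOLD:
            slow_down_machine()           # act locally, no cloud round trip
    recent.append(sample)
```

A loop like this runs comfortably on a single core, which is part of why modest core counts go a long way at the edge.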
The number of processing cores needed for edge computing should not be chosen based on the needs of cloud data centers built to support extensive virtualization and diverse workloads. Edge computing thrives on a different approach where 8-core or 12-core processors may offer the right balance of power and efficiency for localized, real-time data processing. This tailored approach not only simplifies the architecture but also enhances performance and reduces costs, making edge computing a practical and effective solution for many modern technological needs.
By Rudi Carolsfeld