By Lior Mishan
As attention shifts to the network edge, there is growing recognition that edge computing can improve performance, scalability, reliability, and regulatory compliance for critical applications.
One key to achieving the edge’s potential lies in accelerators such as SmartNICs. That’s one of the points made in a new, comprehensive report on the current edge computing landscape.
The State of the Edge 2018 report (available for download here), a collaborative effort of Arm, Packet, Vapor IO, and others, points to the “central role” of these accelerators, “due to an emerging distributed architecture, as well as power constraints and workload requirements.”
The report notes that device edge (user side) resources are often constrained by power and connectivity, while at the infrastructure edge (service provider side), the environment is ripe for dynamically scalable resources that can deliver a centralized cloud experience, just on a smaller scale.
So how can a provider best deliver these dynamically scalable resources? Through an edge data center approach: many small but powerful data centers located close to users to ensure minimal latency. This model helps provide the cloud-like experience.
The report anticipates that a typical infrastructure edge deployment will provide enough computing, data storage, and network capacity to support elastic resource allocation within a given regional area. Done correctly, it strikes the right balance between latency and resource density.
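To put the latency argument in perspective, here is a minimal back-of-the-envelope sketch of one-way fiber propagation delay at different distances. The distances are hypothetical illustrations, not figures from the report, and the estimate ignores routing, queuing, and processing delays, which dominate in practice:

```python
# Illustrative estimate of one-way propagation delay in optical fiber.
# Light travels through fiber at roughly 200,000 km/s (about 2/3 of c,
# due to the glass's refractive index).

SPEED_OF_LIGHT_IN_FIBER_KM_S = 200_000

def propagation_delay_ms(distance_km: float) -> float:
    """One-way fiber propagation delay in milliseconds."""
    return distance_km / SPEED_OF_LIGHT_IN_FIBER_KM_S * 1000

# Hypothetical distances for comparison
for label, km in [("nearby edge data center", 50),
                  ("regional cloud site", 500),
                  ("distant cloud region", 2000)]:
    print(f"{label:>24}: {km:>5} km -> {propagation_delay_ms(km):.2f} ms one-way")
```

Even this simplified view shows why proximity matters: a data center 2,000 km away contributes 10 ms of one-way propagation delay before any processing occurs, while a site 50 km away contributes a fraction of a millisecond.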
“Importantly, edge data centers will be capable of supporting the wide array of computing hardware that is quickly defining distributed architectures, including specialized accelerators such as FPGAs… and SmartNICs,” the report says. Our ACE NIC40 and ACE NIC100 are prime examples.
Edge nodes, located in sites such as virtual central offices, will be capable of supporting the same types of hardware and services as a centralized cloud data center. Importantly, the report sees them containing hundreds or even thousands of specialized accelerator devices in order to provide the high-performance and low-latency foundation for applications such as artificial intelligence and machine learning.
Clearly, acceleration will play an increasingly important role in the future of the edge. But we believe the programmability that an FPGA architecture enables is just as important as acceleration itself. In an increasingly virtualized environment, service providers require maximum flexibility in configuring their equipment to meet dynamic, ever-changing customer needs.
The report backs up our contention, noting that “Edge computing will require a very different supply and service chain, one that is facile at deploying and managing a wide variety of unique resources in thousands of locations.” It adds that the edge workload will certainly benefit from custom low power systems-on-chips (SoCs) and programmable SmartNICs.
That is why we at Ethernity Networks emphasize all-programmable platforms, whether the requirements involve mobile telecom, enterprise security, or data centers. It’s why we incorporate hardware, FPGA firmware, and software applications into a complete offering that enables full programmability at the pace of software development.