Bringing Clarity to the Edge
By Tony Gaunt, Sr. Director, Cloud & Colocation, Vertiv Asia
So much has been said about 'edge computing' that the conversation risks being drowned out by the many definitions and principles behind the trend. Ask anyone to define the 'edge' and you'll get varied responses; the definition may differ depending on your industry, your role, and even where in the world you sit. Nevertheless, what matters most is that the edge is growing, and it's growing fast. The critical thing is for businesses to adapt quickly to leverage this growth.
Just how much is the edge expected to grow? Cisco projects that in just three years there will be an estimated 23 billion connected devices across the globe. That's a staggering number. Driven by the Internet of Things and smart devices, a large portion of this will be sensor data that must be transmitted over wireless or mobile networks. With the explosion of connected devices, one thing's for certain: the changes in the compute and storage infrastructure required to support the smart and connected future, particularly at the local level, will be profound.
That's why it is critical to understand the edge ecosystem in order to determine the right infrastructure to support it. To understand the ecosystem better, we've broken it down into four main archetypes:
Data Intensive
This represents situations where it is impractical to transfer data over the network directly to the cloud, or from the cloud to the point-of-use, because of data volume, cost or bandwidth issues.
An example of this is high-definition content delivery. In 2016, video accounted for 73 percent of all IP traffic and that is expected to grow to 82 percent by 2021. Major content providers, such as Facebook, Amazon, and Netflix, are actively partnering with colocation providers to expand their delivery networks, bringing data-intensive video streaming closer to users.
Another example is the use of IoT to create smart homes, buildings, factories, and cities.
Although IoT is still in its early stages, organizations are already struggling to manage the volume of data being generated. In this case, the challenge is the opposite of the one presented by high-definition content delivery. Rather than moving data closer to users, these applications must move the huge amounts of data generated by devices and systems at the source to a central location for processing. This will require the evolution of an edge-to-core network architecture.
Human-Latency Sensitive
This is where services are optimized for human consumption, such as e-commerce, where speed has a direct impact on the user experience. E-commerce sites, for example, require a fast infrastructure because responsiveness translates directly into page views and sales. Amazon found that a 10-millisecond delay in payment processing caused a 1 percent decrease in retained revenue. To remove such delays, local data processing hubs will be essential.
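To see why proximity matters for latency budgets this tight, consider physics alone: light in optical fibre travels at roughly two-thirds the speed of light, about 200 km per millisecond. The sketch below (illustrative figures, not from the article) estimates the best-case round-trip propagation delay at different distances; note that a distant cloud region can consume a 10-millisecond budget on propagation alone, before any processing happens.

```python
# Illustrative sketch: best-case round-trip propagation delay over optical
# fibre. Signals in fibre travel at roughly 2/3 of c, i.e. ~200 km per ms.
# Distances below are hypothetical examples, not figures from the article.
FIBRE_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds
    (ignores routing, queuing, and processing time)."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

for label, km in [("same-city edge hub", 50),
                  ("regional data centre", 1_000),
                  ("intercontinental cloud", 10_000)]:
    print(f"{label:>22}: {round_trip_ms(km):6.1f} ms round trip")
```

Real-world latency is higher still once switching, routing and server processing are added, which only strengthens the case for placing data hubs close to users.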
Machine-to-Machine Latency Sensitive
Because machines can process data much faster than humans, speed is the defining characteristic of this archetype. The consequences of failing to deliver data at the required speeds can be even higher in this case. For example, the systems used in automated financial transactions, such as commodities and stock trading, are latency sensitive. In these cases, prices can change within milliseconds and systems that don’t have the latest data when needed cannot optimize transactions, turning potential gains into losses. Smart grid technology, smart security, and real-time analytics also fall under this archetype.
Life Critical
This encompasses use cases that directly affect human health and safety, such as autonomous vehicles, electronic health records, health monitoring technologies and smart transportation. In these situations, speed and reliability are paramount.
What's Next for Data Centres?
By bringing the edge ecosystem to light through these four archetypes, we can better understand the critical infrastructure required to support various edge applications. Each edge deployment will require a local data hub that provides storage and processing in close proximity to the source. In some cases, the local hub may be a free-standing data centre. More commonly, it will be a rack- or row-based system providing 30-300 kW of capacity in an integrated enclosure that can be installed in any environment. Because security is paramount, data encryption and other security features will be essential as well.
Now that we've taken a closer look at the archetypes that make up the edge ecosystem, data centre operators will be able to make smarter decisions about their infrastructure, resulting in more agile, future-ready and resilient data centres that support edge computing.