Like many IT innovations, edge computing began with engineers as a natural extension of technology to address a growing need. The concept isn’t new; distributed computing has been around for decades. But, at the same time standards began to converge and edge hardware started making the rounds at trade shows, the hype machine saw an opportunity. It amplified edge’s considerable promise in reducing latency, offering software-defined deployment, decreasing cloud networking costs and more. But as is too often the case, the bold feature bullets ignored the production concerns businesses must address, including edge computing’s rough spots and the additional operations complexity it adds.
Of course, edge computing will survive a little overexcited promotion, just like many of the once-improbable technologies before it. People used to say, “What? Abstract all my data center applications away from the hardware as virtual servers? Impossible!” A decade later, we can’t imagine how we’d deliver traditional enterprise services, cloud computing, online retail, media streaming and everything in between without exactly this. Virtualization survived its awkward hype adolescence, and edge computing will, too. The needs edge computing addresses are only growing.
Learn more: HOW DISTRIBUTED CLOUD WILL AFFECT DATA CENTER INFRASTRUCTURES IN 2020 AND BEYOND
“It completely changed the architecture of the data center, the frameworks for security, and end-users’ expectations around data access and manipulation.”
Thanks to engineers and operations teams, the edge distributed model is moving toward practical use. It’s proving itself capable of meeting requirements for new levels of network performance through reduced latency, scalability and, more importantly, manageability. For some businesses, it’s even reducing costs over the long haul. With the proliferation of connected devices and a growing focus on 5G-enabled technology, tech pros should set aside their natural reluctance to wade through the edge hype and consider it a genuine possibility.
“Edge computing is much the same. High latency, poor application performance, low bandwidth – these are simply unacceptable to end-users today. With expectations set, IT will need to deliver for users across the business.”
Its adoption is following the rise of emerging technologies and the applications taking best advantage of it: 5G, augmented reality, autonomous vehicles, IoT and smart manufacturing. These environments require not only low upstream latency, but also high-performance compute and timely result data. Light only travels so fast, which pushes infrastructure closer and closer to consumers for faster, more seamless processing in the form of brand-differentiating user experiences. The rise of cloud computing, built on the efficiency of large, remotely located data centers, now demands a new compute model. To lower latency and raise capacity, edge computing will augment the data center by bringing compute and storage much closer to the user.
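The speed-of-light constraint above is easy to quantify. The sketch below is our own back-of-the-envelope illustration (the distances and fiber speed are assumptions, not figures from this article): signals in optical fiber travel at roughly 200,000 km/s, so distance alone sets a hard floor on round-trip latency that no amount of server horsepower can remove.

```python
# Back-of-the-envelope propagation delay: why physical distance matters.
# Assumption (not from the article): light travels ~200,000 km/s in
# optical fiber, roughly two-thirds of its speed in a vacuum.

FIBER_SPEED_KM_PER_S = 200_000  # approximate signal speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A user 2,000 km from a centralized data center pays about 20 ms of
# round-trip delay before any processing even begins; an edge node
# 50 km away cuts that floor to about 0.5 ms.
print(round_trip_ms(2000))  # 20.0
print(round_trip_ms(50))    # 0.5
```

Real-world latency adds routing, queuing and processing time on top of this floor, which is exactly why moving compute closer to the user is the only way to get below it.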
Edge computing is the epitome of agility. While traditional data centers are strategic, large multi-story facilities that support thousands of applications, edge data centers – which could be half a rack in a cabinet – can go anywhere and meet more specific, if smaller, demands. It is not an either-or, though. We used the word “augment” purposefully. Edge computing provides an add-on capability that will modernize the traditional data center as the digital transformation sweeping the globe makes new demands for performance and experience delivered to end-users. Broadly speaking, edge computing moves some computational needs away from the centralized data center to nodes at the edge of the network, improving application performance and decreasing bandwidth requirements. In fact, a recent report showed the potential for improved latency and reduced data transfer to the cloud.
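The bandwidth-reduction pattern described above can be sketched in a few lines. This is a minimal illustration of our own (the sensor scenario and function names are hypothetical, not from the article): instead of forwarding every raw reading upstream, an edge node aggregates locally and ships only a compact summary to the centralized data center.

```python
# Minimal sketch of edge-side aggregation (illustrative example, not a
# production design): raw sensor readings stay at the edge node, and only
# a small summary record crosses the network to the cloud.

from statistics import mean

def summarize_at_edge(readings: list[float]) -> dict:
    """Aggregate raw readings locally; only this summary leaves the edge."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": mean(readings),
    }

# Hypothetical temperature samples collected at the edge.
raw = [21.3, 21.4, 21.2, 21.5, 21.3, 21.4]
summary = summarize_at_edge(raw)
# One small summary record replaces six raw uploads, decreasing the
# bandwidth consumed between the edge and the centralized data center.
```

The same pattern generalizes to video analytics, anomaly filtering and any workload where the cloud needs conclusions rather than raw telemetry.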