Getting Your IT Infrastructure Ready for Edge Computing

Microsoft | May 18, 2020

  • Like many IT innovations, edge computing began with engineers as a natural extension of technology to address a growing need.

  • The hype machine amplified edge’s promise of reduced latency, software-defined deployment and lower cloud networking costs, while glossing over edge computing’s rough spots and the additional operations complexity it adds.

  • It completely changed the architecture of the data center, the frameworks for security, and end-users’ expectations around data access and manipulation.


Like many IT innovations, edge computing began with engineers as a natural extension of technology to address a growing need. The concept isn’t new; distributed computing has been around for decades. But when standards began to converge and edge hardware started making the rounds at trade shows, the hype machine saw an opportunity. It amplified edge’s considerable promise in reducing latency, offering software-defined deployment, decreasing cloud networking costs and more. But as is too often the case, the bold feature bullets ignored the production concerns businesses must address, including edge computing’s rough spots and the additional operations complexity it adds.


Of course, edge computing will survive a little overexcited promotion, just like many of the once improbable technologies before it. People used to say, “What? Abstract all my data center applications away from the hardware as virtual servers? Impossible!” A decade later, we can’t imagine how we’d deliver traditional enterprise services, cloud computing, online retail, media streaming and everything else in between without exactly this. Virtualization survived its awkward hype adolescence, and edge computing will, too. The needs edge computing addresses are only growing.



Learn more: HOW DISTRIBUTED CLOUD WILL AFFECT DATA CENTER INFRASTRUCTURES IN 2020 AND BEYOND.
 

“It completely changed the architecture of the data center, the frameworks for security, and end-users’ expectations around data access and manipulation.”

~ Microsoft.


Thanks to engineers and operations teams, the edge distributed model is moving toward practical use. It’s proving itself capable of meeting requirements for new levels of network performance through reduced latency, scalability and, more importantly, manageability. For some businesses, it’s even reducing costs over the long haul. With the proliferation of connected devices and a growing focus on 5G-enabled technology, tech pros should set aside their natural reluctance to wade through the edge hype and consider it a genuine possibility.

“Edge computing is much the same. High latency, poor application performance, low bandwidth – these are simply unacceptable to end-users today. With expectations set, IT will need to deliver this to users across the business.”


Its adoption is following the rise of emerging technologies and the applications taking best advantage of it: 5G, augmented reality, autonomous vehicles, IoT and smart manufacturing. These environments require not only low upstream latency, but also high-performance compute and timely result data. Light only travels so fast, which pushes infrastructure closer and closer to consumers for faster, more seamless processing in the form of brand-differentiating user experiences. The rise of cloud computing and the efficiency of large, remotely located data centers require a new compute model. To lower latency and raise capacity, edge computing will augment the data center by bringing compute and storage much closer to the user.
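The speed-of-light constraint is easy to put rough numbers on. The sketch below is illustrative only – the distances and the fiber propagation speed are assumptions for the sake of the arithmetic, not figures from this article – but it shows why moving compute from a distant cloud region to a metro edge node changes the latency floor:

```python
# Back-of-the-envelope estimate of the physical latency floor.
# Assumption: light in optical fiber travels at roughly 2/3 the speed
# of light in vacuum, i.e. about 200 km per millisecond.

SPEED_IN_FIBER_KM_PER_MS = 200.0

def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fiber, ignoring
    routing, queuing and processing delays (real latency is higher)."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

regional_cloud_km = 1500.0  # assumed distance to a regional cloud region
edge_node_km = 50.0         # assumed distance to a metro edge node

print(f"cloud: {min_round_trip_ms(regional_cloud_km):.1f} ms minimum RTT")
print(f"edge:  {min_round_trip_ms(edge_node_km):.2f} ms minimum RTT")
```

Under these assumptions the cloud round trip can never drop below 15 ms, while the edge node’s floor is half a millisecond – before any real-world overhead is added on top.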


Edge computing is the epitome of agility. While traditional data centers are strategic, large multi-story facilities that support thousands of applications, edge data centers – which could be a half rack in a cabinet – can go anywhere and meet more specific, if smaller, demands. It is not an either-or, though. We used the word “augment” purposefully. Edge computing provides an add-on capability that will modernize the traditional data center as the digital transformation sweeping the globe makes new demands for performance and experience delivered to end-users. Broadly speaking, edge computing moves some computational needs away from the centralized data center to nodes at the edge of the network, improving application performance and decreasing bandwidth requirements. In fact, recent reports point to meaningful latency improvements and reductions in data transferred to the cloud.
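The bandwidth-reduction claim can be made concrete with a minimal sketch. Assume a hypothetical edge node that aggregates a window of raw sensor readings locally and forwards only a compact summary upstream (the function name and the sample data below are invented for illustration):

```python
# Sketch of edge-side aggregation: one summary record replaces
# an entire window of raw readings on the upstream link.

from statistics import mean

def summarize_at_edge(readings: list[float]) -> dict:
    """Aggregate a window of raw readings into a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

raw_window = [21.4, 21.6, 21.5, 22.0, 21.8]  # illustrative sensor values
summary = summarize_at_edge(raw_window)
print(summary)  # five raw readings reduced to one upstream record
```

The same pattern generalizes: filtering, batching or pre-inference at the edge means the centralized data center receives results rather than raw streams, which is where the bandwidth and latency savings come from.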


Learn more: LOOKING TO BUILD THE INFRASTRUCTURE TO CONNECT THE WORLD’S GAMING PLATFORMS.
 



Related News

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Involta Announces Flexible and Robust DRaaS+ Solutions

Involta | January 31, 2023

On January 30, 2023, Involta, an industry-leading hybrid IT, data center, and cloud computing services company, announced its flexible and robust DRaaS+ (Disaster Recovery as a Service) offering. The new three-tiered DRaaS+ model is designed to offer the right service level for securing essential business data and systems. The offering builds on Involta's proven track record – the company has provided leading DRaaS solutions since 2015 – and is a direct result of client feedback.

Business continuity is essential for companies today, but research shows that 93% of enterprises that lost their data center for more than ten days because of a disaster went out of business within a year, and many companies without sound data management filed for bankruptcy over the same period. Involta's DRaaS+ provides a next-gen service model that focuses on business outcomes rather than only on infrastructure delivery. Involta's top-notch infrastructure and industry-leading software can deliver this solution quickly and precisely when and where it is most needed, such as in the aftermath of a disastrous natural event or cyberattack. Additional benefits include a 100% infrastructure and platform SLA for all service tiers, no egress fees, expert engineering guidance for DR declaration, and a predictable cost structure across various compute consumption models.

Jim Buie, President and CEO of Involta, said, "Ensuring businesses can bounce back from a disruptive event is no small feat. Staggering numbers paint a gloomy picture where most businesses will simply not survive a catastrophic data loss." He added, "We are delighted to offer a full range of disaster recovery services designed to offer the level of support our enterprise customers need, all housed in Involta's owned, purpose-built, enterprise-grade data centers."
(Source: Cision)

About Involta

Founded in 2007, Involta is an award-winning national IT service provider and consulting company. It helps organizations plan, manage, and execute hybrid IT strategies using a broad range of services, including cloud computing, colocation, managed IT, cybersecurity, fiber and network connectivity. In addition, it enables compliance and IT transformation initiatives and offers industry-specific services for healthcare, manufacturing, finance, and technology. The company maintains partnerships with leading tech vendors and major public cloud providers such as Cisco, Pure Storage and Veeam. Based in Cedar Rapids, Iowa, Involta uses its unique resources and partnerships to provide organizations with advanced hybrid IT solutions that meet their changing needs.

Read More

WINDOWS SYSTEMS AND NETWORK, DATA STORAGE, WINDOWS SERVER OS

QTS Data Centers provides residual heat to 10,000 households, buildings and educational institutions

QTS | January 02, 2023

In an industry first, QTS Data Centers, a leading data center solutions provider, recently announced its commitment to supply residual heat to a large-scale district heating project. QTS, in collaboration with WarmteStad, the sustainable utility company of the municipality of Groningen, will supply heat from its Groningen (Netherlands) data center as part of its sustainability initiative.

WarmteStad recently unveiled an innovative heating plant that will utilize 100% residual heat from the QTS Groningen data center. Heat pumps powered by renewable energy distribute hot water via an existing underground heating network installed in Zernike, Paddepoel, and Selwerd. By 2026, WarmteStad expects to produce sustainable and affordable heat for over 10,000 households, buildings, and knowledge institutions in Groningen's northern districts. Buildings heated with this water can operate without a natural gas connection, significantly reducing CO2 emissions and supporting Groningen's environmental goal of being completely CO2 neutral by 2035.

"QTS has committed to minimizing its data center carbon footprint through innovative solutions including the utilization of renewable energy, reclaimed water, and recycled materials to support both QTS and our customers' sustainability initiatives," said Travis Wright, VP of Energy and Sustainability, QTS.

About QTS Data Centers

QTS Realty Trust, LLC, a Blackstone-backed company, is a leading provider of data center solutions in North America and Europe. With over 9 million square feet of owned mega-scale data center space, QTS delivers secure, compliant infrastructure solutions, robust connectivity, and premium customer service to leading hyperscale technology companies, enterprises, and government entities.

Read More

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Zenoss Launches Real-Time Kubernetes Monitoring

Zenoss | March 20, 2023

Zenoss Inc., the leader in AI-driven full-stack monitoring, today announced it has released streaming data monitoring for Kubernetes, the most widely deployed open-source orchestration platform used to manage containerized applications. This real-time monitoring of Kubernetes streaming data is part of a broader set of initiatives focused on cloud-based monitoring, which enables visibility for ephemeral systems that cannot be effectively monitored by traditional monitoring tools. Kubernetes was designed by Google in 2014 and donated to the Cloud Native Computing Foundation in 2015. It has become the de facto standard for running containers in production at scale, including in cloud environments such as AWS, Azure and Google Cloud. Also known as K8s, Kubernetes is supported by a community of professional programmers and coders from around the world. There have been over 2.8 million contributions to Kubernetes made by companies. Along with containers, Kubernetes has emerged as a primary technology for modern cloud-native workloads. According to the Cloud Native Computing Foundation, organizations using or evaluating Kubernetes increased from 78% in 2019 to 96% in 2022. Zenoss initially released monitoring and analytics for Kubernetes in 2018 and has delivered continuous innovation to become a leading monitoring platform for container-based environments. Zenoss provides full-stack monitoring and AIOps for public and private clouds, as well as for all on-prem IT infrastructure. The platform provides a view of containerized applications in the context of the broader infrastructure. This provides a common view for IT Operations, DevOps, DevSecOps, and business-level users. 
In addition to previously existing capabilities, Zenoss monitoring for Kubernetes now provides:

  • Monitoring insights for Kubernetes clusters in a single pane of glass, alongside the broader infrastructure, for K8s deployments in AWS, Azure and Google Cloud, as well as in private or hybrid clouds and locally hosted environments

  • Secure, cloud-based monitoring with zero install

  • Data collection and analytics in under five minutes

  • Visibility into the health and performance of nodes, services, pods, containers, namespaces and more

  • Intelligent dashboards with out-of-box templates

  • Smart View, actions and notifications

"There is a significant shortage of visibility into health and performance in these highly complex container orchestration environments," said Trent Fitz, chief product officer for Zenoss. "Just as application developers have adapted to be more efficient, scalable and automated, we are doing the same for monitoring these environments."

Zenoss Cloud is the leading AI-driven full-stack monitoring platform that streams all machine data, uniquely enabling the emergence of context for preventing service disruptions in complex, modern IT environments, including those built on Kubernetes. Zenoss Cloud leverages the most powerful machine learning and real-time analytics to give companies the ability to scale and adapt to the changing needs of their businesses.

About Zenoss

Zenoss works with the world's largest organizations to ensure their IT services and applications are always on. Delivering full-stack monitoring combined with AIOps, Zenoss uniquely collects all types of machine data, including metrics, dependency data, events, streaming data and logs, to build real-time IT service models that train machine learning algorithms to deliver robust AIOps analytics capabilities. This enables IT Ops and DevOps teams to optimize application performance, predict and eliminate outages, and reduce IT spend in modern hybrid IT environments.
Zenoss is recognized in The Forrester Wave™: Intelligent AIOps Platforms, Q4 2022, the 2022 Gartner Market Guide for AIOps Platforms, and in the 2022 Gartner Market Guide for IT Infrastructure Monitoring Tools.

Read More