Defying data gravity: How can organisations escape cloud vendor lock-in?

Deriving the maximum business value from the data you hold is not a new challenge, but it is one that many organisations are still learning to address sustainably. The concept of ‘data gravity’, coined by software engineer Dave McCrory in 2010, refers to the tendency of bodies of data to attract applications, services and other data: the larger the body of data, the more applications, services and other data are attracted to it, and the faster they are drawn in. As the amount of data grows it gains ‘mass’ and becomes far more rooted in place; in a business context, it becomes harder and harder to move that data to a different environment.

Crucially, data has far more mass than the compute instances that use it. Moving 1,000 virtual machines to the cloud, for example, is far easier than moving 1,000GB of data to the cloud, and the same holds for migrating back out. Data gravity therefore makes it more important than ever where data resides, and how ‘portable’ it really is, if it is to be used to its full potential. Increasingly, the ‘where’ for many businesses is the cloud.
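To make the weight of that 1,000GB concrete, here is a rough back-of-envelope sketch of how long a bulk data migration takes over a network link. The bandwidth and efficiency figures are illustrative assumptions, not measured values from any provider:

```python
# Illustrative back-of-envelope calculation: wall-clock time to move a
# dataset in or out of a cloud environment over a network link.
# All figures below are assumptions for the sake of the example.

def transfer_time_hours(data_gb: float, link_mbps: float,
                        efficiency: float = 0.7) -> float:
    """Rough hours to move `data_gb` gigabytes over a `link_mbps`
    megabit-per-second link, assuming only `efficiency` of the nominal
    bandwidth is usable (protocol overhead, contention)."""
    data_megabits = data_gb * 8 * 1000        # GB -> megabits (decimal units)
    effective_mbps = link_mbps * efficiency   # usable throughput
    return data_megabits / effective_mbps / 3600

# 1,000 GB over an assumed 1 Gbit/s link at 70% efficiency:
print(f"{transfer_time_hours(1000, 1000):.1f} hours")   # roughly 3.2 hours
```

Even on a fast dedicated link the transfer runs into hours, and at multi-terabyte scale into days, before accounting for any per-gigabyte egress charges a provider may levy; the virtual machines themselves, by contrast, are comparatively small artefacts to redeploy.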

Dom Nicastro | April 03, 2020