Outsourcing of IT resources, limitless computing power, increased use of delegated services (SaaS, IaaS, PaaS, etc.)... Don't get us wrong, the cloud is here to stay. But scratch the surface a little and you'll find that, in the 2020s, many organizations are relocating all or part of their data and applications back to their own datacenters.
According to an IDC report, in 2018, 80% of companies had already repatriated workloads and expected to migrate 50% of their applications back from the cloud to on-premise or private hosting.
What are the reasons behind this exodus? Cost? Efficiency? Security?
Yes. And a little more besides...
Cloud costs are often underestimated by users. Providers charge for specific services, including storage and security. Yet there are often additional costs associated with moving data and applications, personalizing services, maintenance, or even consulting. Services that are initially unused, such as data recovery, soon push the invoice higher. Much higher! In a study conducted by Vanson Bourne in 2015, these hidden costs were estimated to average $720,000 per year! By reducing or cutting operational and recurring cloud expenses, businesses can save on their already-stretched budgets.
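To make the budget argument concrete, here is a minimal sketch of the kind of comparison a finance or IT team might run. All figures except the $720,000 hidden-cost average from the Vanson Bourne study are purely illustrative assumptions, and the function names are hypothetical, not from any real tool.

```python
# Illustrative sketch only: recurring cloud spend, including hidden
# costs, versus a largely fixed on-prem budget. Figures are assumptions.

def annual_cloud_cost(base_services, egress, support, hidden=720_000):
    """Visible line items plus the hidden costs the Vanson Bourne
    study estimated at ~$720,000/year on average."""
    return base_services + egress + support + hidden

def annual_onprem_cost(hardware_amortized, staff, power):
    """On-prem spend tends to be fixed and predictable."""
    return hardware_amortized + staff + power

cloud = annual_cloud_cost(base_services=400_000, egress=150_000, support=80_000)
onprem = annual_onprem_cost(hardware_amortized=300_000, staff=250_000, power=60_000)
print(f"cloud: ${cloud:,}/yr vs on-prem: ${onprem:,}/yr")
```

The point is not the exact numbers but that the hidden-cost line item can dominate the comparison if it is left out of the initial business case.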
The major cloud solution providers have high security standards. Yet many companies seem unaware that they remain responsible for their own data and applications. If disaster strikes (and strike one day, it will), it is the company, not the cloud solution provider, that is held responsible for data loss. It is crucial to have your legal teams read the fine print and check with your DPO (Data Protection Officer) that all will not be lost if the cloud goes down or the provider goes bankrupt.
The rise in cyberattacks and the ongoing health crisis have pushed organizations to assess the consequences of data loss on their activities. Add in natural disasters, human errors, or hardware failures: the cloud suddenly does not seem the best single option. Therefore, more IT departments are rebooting on-prem infrastructures which, by their very isolation, are often less vulnerable.
Business data is typically spread across several datacenters and cloud providers. Data transfers can cause latencies which, when too high, slow down business activities and result in lost connections to some applications. For this reason, applications that need to move large volumes of data between different sites are good candidates for data repatriation. The same applies to mature applications, critical business applications, and workloads requiring an elevated level of performance.
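The selection criteria above can be sketched as a simple screening rule. This is a hypothetical rule of thumb, not a method from the article or any standard tool, and the thresholds are illustrative assumptions a team would tune to its own environment.

```python
# Hypothetical screening sketch: flag a workload as a repatriation
# candidate if it moves heavy data volumes between sites or suffers
# high cross-site latency. Thresholds are illustrative assumptions.

def repatriation_candidate(monthly_transfer_gb, avg_latency_ms,
                           gb_threshold=10_000, latency_threshold_ms=50):
    """Return True when either the transfer volume or the observed
    latency exceeds its (assumed) threshold."""
    return (monthly_transfer_gb > gb_threshold
            or avg_latency_ms > latency_threshold_ms)

print(repatriation_candidate(25_000, 20))   # heavy cross-site transfers
print(repatriation_candidate(500, 120))     # latency-sensitive workload
print(repatriation_candidate(500, 20))      # stays in the cloud
```

In practice such a screen would feed a fuller assessment covering criticality, maturity, and performance requirements, as described above.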
In an on-prem environment where costs are fixed and predictable, it is easier for companies to control their data and most-used applications. For some businesses, IT teams are better equipped to handle challenges when data sits in their own datacenters. Therefore, if the cloud solution provider is not able to ensure a sufficient level of control, it is in a company's interest to bring its data back on-premise.
IS HYBRID CLOUD THE BEST SOLUTION?
Just because companies change their IT environment does not mean they reject the cloud and its promises. While cloud computing is attractive and has many advantages, not all organizations have found in this technology a relevant answer to all their needs. More CIOs want to implement hybrid cloud architectures to boost flexibility and data availability. The hybrid cloud, a mix of public and private clouds, provides service availability and performance while allowing organizations to retain control over critical applications and data.
In conclusion, the cloud offers many benefits but does not meet all needs. Organizations must keep in mind that while the expected ROI is attractive during migration, repatriating data and applications is often expensive and risky. A clear data repatriation strategy, excellent team training, and recognized data management partners are the keys to success. Thinking about repatriating your data from the cloud to on-premise or alternative hosting? Please do contact us.
Learn more in this blog post: