With the benefit of hindsight, one can see that the advent of cloud computing was an inevitable consequence of the twin driving forces of Moore’s Law and Metcalfe’s Law. It was a certainty that computing power would eventually become cheap and ubiquitous enough that farming out your computing problems to someone else’s datacentre would become a far more attractive proposition than having to purchase, implement and manage your own.
This cloud computing proposition is very attractive, and yet it has its flaws, which loom larger as cloud services proliferate. More and more users adopt applications which all of a sudden become business-critical, because so many users depend on them. When this happens within an organisation’s Cloud-based portfolio, Metcalfe’s law goes into reverse: instead of things becoming exponentially better as more and more people use them, they become exponentially more complex, difficult and costly to manage. This happens because the various vendors never had each other in mind when developing their solutions, so it becomes ever more complicated to connect the various components of an increasingly cloud-based mixed environment – data to applications to users. As more and more people use more and more complex network paths to access more and more data, the solution design and, indeed, the laws of physics begin to bear on the speed of response and the consequent business performance.
This is leading to significant frustration among users, who expect to be able to access data easily and securely without delays. For the IT department, user experience in the Cloud is fast becoming as big a challenge as the transition to the Cloud.
Large enterprises today are dispersed around the globe, which, in a cloud-based environment, means the application could be in one location, data in another, the Cloud Service Provider (CSP) in a third and users spread across continents. This leaves the IT function with a major headache: how to ensure data travels across the network efficiently, securely and at speed to meet the needs of the users. Access to data is now recognised as crucial to any organisation’s ability to operate with agility and seize market opportunities rapidly, and so Cloud connectivity is becoming the number one priority for IT teams.
To complicate the issue further, most large corporates are moving to a multi-Cloud model to give them more choice and leverage with their CSPs. Using Infrastructure, Platform and Software as a Service from different vendors massively complicates access to information, particularly since these vendors will each update their offerings independently of the others. Any slow connections between these different platforms lead to user frustration and hinder business performance.
The need is clear. As Gartner puts it, “IT leaders and enterprise architects must prepare an overarching cloud strategy for their organisations.”
The goal of most CSPs is to build their business by increasing their subscriber base rather than to work with other providers. The marketplace has consequently forced many to offer one-to-one connectors so that, for example, Salesforce can talk to Workday or SAP to Taleo. But as the data needs of individual cloud applications overlap more and more, what’s required is a more generalised way to understand which applications are talking to which users and what data they need access to – swiftly, accurately and securely.
Historically, organisations have gone to their telecoms provider to address Cloud performance. Telecoms providers, in their turn, typically seek to address this by selling more bandwidth. As a solution, this is crude, is not necessarily cost-effective and does not reflect how companies want to operate in the Cloud. To take advantage of Cloud flexibility, companies want to be able to scale usage up to meet peak demand, which may only last for a short period before scaling back down again; but telecoms providers want their customers to sign up for long-term contracts at specified bandwidths. Would it not show greater customer service to offer short-term, flexible contracts that reflect the urgent and varying requirements of businesses?
At InterCloud we believe the key is application segregation: identifying the data that is critical to your business and ensuring it receives VIP treatment above all other traffic. This simplifies how traffic flows around your organisation and offers additional security and governance protection for your most valuable data. Getting there, however, means understanding your most important data from end to end, which is as critical as understanding your most important customers from a strategic viewpoint. This has to be a central pillar of your planning process as you move to the Cloud. For any move to the Cloud, we recommend you keep these key points in mind:
Matthew Parker, UK Sales Director & Country Manager, InterCloud