The increasing availability of public cloud has changed the mindset of businesses. They can now use the architecture and capabilities of these services either to migrate or implement business services in the cloud, or to guide their own on-premises implementations, supporting the agility and performance of application deployment.

This is in stark contrast to the previous generation where it was often the IT department guiding the deployment of data centre infrastructure, based around application requirements and growth, and this was generally siloed across business units and applications. Digital transformation now expects delivery of new services without the latency and overheads of infrastructure deployment.

1. Customers are moving to hyperconvergence

Essentially, the traditional data centre is split into storage, compute and networking, which are all independent. You buy the storage from one vendor, the compute power from another, and the networking from a third, so you have three different experts monitoring and maintaining your data centre. This is complex for customers, and if you run into problems, it's difficult to troubleshoot when the silos don't talk to each other. Hyperconverged infrastructure brings everything together, so you can manage and control storage, compute and networking as one system.

Vendors are also moving towards software-defined, cloud-based controllers. Customers are looking for actionable insight and knowledge, and because vendors host the management service in the cloud, customers benefit from new features immediately: as soon as a feature is available in the management plane, every customer has access to it without waiting for an on-site upgrade.

2. Customers want flexibility to move workloads

Customers like cloud because of its pay-as-you-go model. But once they start running applications in the cloud, this model can become very expensive as they grow and more and more services are delivered. Some customers also have internal policies or compliance requirements that don't allow them to move crucial applications or data to the cloud. That's when customers start thinking about the hybrid option.

Customers want their on-premises data centre to have the same flexibility and ease of use as the cloud, with a consistent experience across both. So data centre vendors are providing a cloud-like experience with their on-premises offerings, giving customers the flexibility to model an application and then deploy it wherever they choose. Previously, migrating an application from on-premises to the cloud took weeks. Today, vendors have created tools to model and deploy in a matter of minutes, on-premises or to any cloud, whether AWS, Azure or Google.

3. Digitisation and IoT are driving edge computing

Organisations in industries such as manufacturing, mining, transportation, oil and gas, and healthcare collect and process a lot of data. In the automotive industry, for example, car sensors collect huge amounts of information, and manufacturers require high processing power and fast response times to handle it. They can't afford to move it all to a data centre, process it, and wait for a response; they need computing power where the data is generated. Even SMBs can now afford to deploy a small data centre at their branch offices. So from an investment perspective, edge computing reduces cost, but it is also well suited to businesses with remote offices or co-located, unmanned sites that generate a lot of data.

4. Growth of app-to-app and device-to-device traffic

With analytics and data, there are no manual interactions. In the IoT world, you get data from devices, send it to a data centre, process it, analyse it with an application, model it against an existing database... that's a lot of processing, all managed across many applications and their components. Today, if you look at the application landscape, there is no single monolithic application serving the purpose; instead there are lots of small microservice modules sitting around the world, all talking to each other over APIs to deliver the intended outcome. A phenomenal amount of traffic is being transferred between applications and between devices. So how do you control that traffic? How do you know what is happening with your devices? Organisations need visibility tools to understand what is happening around their applications and devices so that they can define policies to secure their data and applications.
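The idea of app-to-app visibility can be sketched in a few lines. This is a toy illustration, not any particular vendor's product: two hypothetical "services" exchange requests through a gateway that records who talked to whom, the kind of call record a real service mesh or flow-monitoring tool would collect.

```python
# Hypothetical sketch: every app-to-app call passes through a gateway that
# logs it, giving the visibility needed to later define traffic policies.

call_log = []  # in a real deployment this would feed a telemetry backend

def gateway(source, target, handler, payload):
    """Route one app-to-app call and record who talked to whom."""
    call_log.append({"from": source, "to": target})
    return handler(payload)

def inventory_service(payload):
    # stand-in for a real stock-lookup microservice
    return {"sku": payload["sku"], "in_stock": True}

def order_service(sku):
    # the order service never calls inventory directly -- always via the gateway
    stock = gateway("orders", "inventory", inventory_service, {"sku": sku})
    return "accepted" if stock["in_stock"] else "rejected"

result = order_service("A-100")
```

With every call recorded, `call_log` becomes the raw material for answering "what is talking to what?", which is the starting point for any traffic policy.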

5. Securing the data centre is no longer optional

Once you have that visibility, it's important to secure the data centre. In a traditional network, most organisations rely on perimeter security: you have a firewall, and you're done. Today, multiple checkpoints around your applications and your data centre are a necessity to protect and validate all the traffic that is transferred.
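The "multiple checkpoints" idea, sometimes called defence in depth, can be illustrated with a minimal sketch. All rule names and request fields below are made up for illustration; a real deployment would use firewall rules, an identity provider and data-access policies instead of these stand-in functions.

```python
# Hypothetical sketch of defence in depth: a request must pass every layer,
# not just a single perimeter firewall, before it is admitted.

def perimeter_check(req):
    return req["src_ip"].startswith("10.")        # only internal address ranges

def app_auth_check(req):
    return req.get("token") == "valid-token"      # stand-in for real authentication

def data_policy_check(req):
    # sensitive data is restricted to a specific role (illustrative rule)
    return req["resource"] != "payroll" or req.get("role") == "finance"

CHECKPOINTS = [perimeter_check, app_auth_check, data_policy_check]

def admit(req):
    """A request is admitted only if every checkpoint passes."""
    return all(check(req) for check in CHECKPOINTS)

allowed = admit({"src_ip": "10.0.0.5", "token": "valid-token",
                 "resource": "inventory"})
blocked = admit({"src_ip": "10.0.0.5", "token": "valid-token",
                 "resource": "payroll", "role": "engineering"})
```

The second request clears the perimeter and authentication checks but is still rejected at the data-policy layer, which is exactly what a single firewall at the edge cannot do.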