No matter where you look, organisations of every kind are finding their business structures upended by digital transformation. Empowered by DevOps practices, IT teams are helping to drive down costs, enhance agility and create a new era of innovation-fuelled growth.
But what drives DevOps?
Increasingly, the answer is containers: viewed by many as a major evolution in cloud computing, providing scalability and flexibility where developers need it most. Yet while containers may be a “dream” for developers, they can quickly turn into a nightmare for the enterprise architects tasked with maintaining IT infrastructure.
Legacy technologies such as centralised databases often come with interoperability issues and, alongside container sprawl, threaten to undermine the DevOps project – and the digital transformation efforts now so vital to business growth.
Containers are here to stay
It’s no exaggeration to say containers are seen as one of the modern building blocks of cloud computing. Like virtual machines (VMs), they provide a neat, self-contained package in which developers can run their applications, libraries and other dependencies. In so doing, containers offer a consistent, predictable environment isolated from other applications. However, since they’re more lightweight and carry lower overheads than virtual machines, they can be deployed far more quickly and easily at large scale across private, public and hybrid cloud environments.
It’s no wonder, therefore, that containers have continued to attract increasing praise in recent times. The ability to set up test environments quickly and easily, and scale them up to full production if necessary, is a tantalising prospect for developers. It’s claimed that over 80% of IT teams used containers in 2018, up from just over half (58%) the year before – and Google alone deploys over two billion containers each week.
Friend or foe?
For all this momentum, the rapid adoption of containers highlights a growing rift in IT architecture: between stateless application workloads running in container environments, and stateful application workloads running on more traditional infrastructure. As container orchestration tools such as Kubernetes have allowed organisations to take greater control over their container environments, businesses have begun to see the benefits of stateless applications. These range from using APIs to integrate multiple applications into services, to enabling an online-first approach to those services, to easier scalability and redeployment.
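The stateless side of that rift can be sketched as a minimal Kubernetes Deployment (all names and the image here are hypothetical): because the pods hold no local state, the orchestrator is free to scale, replace and reschedule them at will.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-api            # hypothetical stateless service
spec:
  replicas: 3                 # Kubernetes can scale this up or down freely
  selector:
    matchLabels:
      app: orders-api
  template:
    metadata:
      labels:
        app: orders-api
    spec:
      containers:
      - name: orders-api
        image: example.com/orders-api:1.4.2   # hypothetical image
        ports:
        - containerPort: 8080
        # No volumes and no local state: any replica can serve any request,
        # so pods can be killed, rescheduled or redeployed without data loss.
```

A stateful workload, by contrast, cannot be treated this interchangeably – which is exactly the gap described above.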
Yet just as organisations reap the rewards of containers, they face the opposite challenge from their legacy IT. Quite simply, architecture built for stateful applications cannot match the flexibility, agility and rapid evolution that is now possible. For instance, stateful applications often exist in silos, with their own independent network, policies and infrastructure – meaning it is much harder to scale them without directly adding to that infrastructure, or to connect them with other applications using APIs. The result is an all-too-common nightmare for architects: running without actually moving. A huge amount of investment and energy goes into building and improving legacy applications and their databases, while the far greater potential of stateless applications remains untapped.
If one thing’s for sure, it’s that architects must bridge this gap – and the longer they leave it, the wider and harder to cross it will become. The task will have to be handled delicately. The lightweight approach containers enable is at odds with the traditional, monolithic approach of legacy databases and infrastructure. At the same time, simply replacing a legacy database with a more modern alternative is no easy answer: whether organisations like it or not, legacy databases support applications that are absolutely critical to the business, and even when adopting a more modern NoSQL database, there is no guarantee it will support containers out of the box.
The path to architecting success
For architects, the good news is that there is light at the end of the tunnel. Modern databases are being designed to operate seamlessly with new container orchestration tools like Kubernetes — allowing architects to more easily manage how containers connect with centralised databases in the cloud. Armed with such tools, architects can finally ensure each component works together by taking a holistic approach to IT infrastructure.
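As a rough sketch of what that database–orchestrator integration looks like (the names and image are illustrative, not any specific vendor’s product), Kubernetes offers a StatefulSet, which gives each database pod a stable network identity and its own persistent storage:

```yaml
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: db
spec:
  serviceName: db             # stable per-pod identities: db-0, db-1, db-2
  replicas: 3
  selector:
    matchLabels:
      app: db
  template:
    metadata:
      labels:
        app: db
    spec:
      containers:
      - name: db
        image: example.com/modern-db:7.0     # hypothetical database image
        volumeMounts:
        - name: data
          mountPath: /var/lib/db
  volumeClaimTemplates:       # each pod gets and keeps its own volume,
  - metadata:                 # so data survives rescheduling
      name: data
    spec:
      accessModes: ["ReadWriteOnce"]
      resources:
        requests:
          storage: 10Gi
```

This is what allows a stateful database to live alongside stateless containers under one orchestrator, rather than in a separate silo.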
Of course, architects still have the task of understanding which of their applications need to be moved from stateful to stateless quickly, to keep pace with the evolution of containers, and which can be kept in their legacy environment because they are at no risk of becoming obsolete. For instance, finance and payment functions, whose prime concern is performing the exact same action consistently, quickly and transparently, could remain on their legacy database, while anything that affects the customer or end-user experience should be modernised so that it can keep evolving at the same rate as customer demands. Eventually, almost all of the applications in a business will be built on containers. If they can manage this evolution, architects can ensure that containers remain a friend, not a foe.
The architect’s role is becoming increasingly challenging: they are not only tasked with keeping the lights on, but also with providing the right environment to drive innovation-fuelled success. Containers are just the latest advance in technology to test their ability to keep pace with DevOps teams. More advances naturally lie ahead, and to continue adding value to the business, integrating existing and emerging technologies will remain near the top of the architect’s agenda.
Written by Anil Kumar, Director of Product Management at Couchbase