There are points in time when all things combine to ensure only one likely outcome. Those who can see it coming can prepare; those who dismiss the warning signs become its victims. As John F. Kennedy once said: “Change is the law of life. And those who look only to the past or present are certain to miss the future.”
A perfect storm has been circling traditional enterprise computing for many years, constantly building in speed and power with each innovation, and with each new business thrown into the mix. The storm is now hovering intensely, right above the largest businesses in the world.
Enterprise IT has been built up over the years like geological layers. Over time, much of what used to be is no longer visible, but it remains firmly embedded in the sediment. The deeper you dig into a company’s core processes, the more ancient fossils you will find, as these are usually the parts the company is incapable of replacing.
The bedrock of all this is legacy programming languages – including the likes of COBOL and PL/1 – which came to prominence in the 1960s, found their home in mainframe technology and have played an important role in big business ever since. There are over 220 billion lines of COBOL code in operation worldwide, and an estimated 70% of the world’s commercial transactions are processed by a mainframe application at some point. With 71% of Fortune 500 companies relying on legacy systems, this equates to €4 trillion of legacy investment in mainframe systems.
A diminishing pool of qualified, interested professionals
Contemporary programming languages have long since evolved beyond these early forms. In two separate ongoing studies, from TIOBE and PYPL, COBOL ranks outside the top 20 most popular programming languages – despite how deeply it has embedded itself in global enterprise. Over the last 30 years, new languages and patterns have rendered legacy programming languages relatively obsolete as platforms for continued development.
Mainframe programming began in earnest in the seventies and eighties, and the programmers who generated the billions of lines of legacy code still in existence are now largely retired. Those at the younger end – the baby boomers – can see retirement in the very near future. With a dearth of younger programmers trained in these languages, the last mainframe-skilled professional may soon be leaving the building.
To compound the problem, the original source code, designs and documentation – indispensable should any changes to the applications be required – are often unavailable.
Consequently, companies are losing the ability to understand their systems of record. Maintaining computing programs without the institutional knowledge of how they work is incredibly difficult, and enhancing them to support new business requirements is nigh on impossible. For very large organisations that are utterly dependent upon decades-old applications, this lack of maintenance and enhancement could be fatal.
Locked out of Moore’s Law
Mainframe technology has failed to keep pace with the inexorable march of Moore’s Law. The exponential growth in performance of Commercial Off The Shelf (COTS) architectures means even modest x86 computers are materially more powerful than the largest and most expensive mainframes, and the same is happening with I/O performance. Given the staggering cost of today’s mainframes, as Moore’s Law continues over the next few years, the gap between the price/performance curves of mainframes and COTS devices will make mainframes untenable as platforms for enterprise computing.
Their born-on-the-web competition has agility in its DNA
Mainframe application architectures may have seemed effective 30 years ago, but they have stubbornly resisted recent technical advances – characterised as the ‘DevOps toolchain’ – designed to speed development. These architectures were not built with mobile, IoT, cloud, DevOps or digital transformation in mind. Many newer companies are not burdened with mainframe application inertia, and as a result are much more agile. These born-on-the-web companies can respond quickly to changing market dynamics: a bank reliant on mainframes might take 1.5 years to develop and roll out a new system; a P2P vendor could do the same in about six weeks.
Regulation also tips the scales in favour of the born-on-the-web companies. Financial organisations face pressure from the Competition and Markets Authority (CMA) to offer IT services on mobile devices – a long and arduous process when mainframe applications are involved. Meanwhile, the EU’s MiFID II will impose a range of technological requirements on financial systems, including the need to retrieve particular data types within 24 hours of a request. Who knows how long such changes will take for companies dependent on mainframes, but one can be certain it will be materially longer than for companies with modern application architectures.
There’s no doubt that mainframe environments have formed an impressive pillar of industry over the past half-century, and many CIOs would be forgiven for a nostalgic glow in their memories of big iron. But mainframes have not advanced with the times in price/performance or interoperability, and the pool of people who know how to maintain and adapt their code is fast diminishing. With their backs to the wall as regulation hands the advantage to born-on-the-web firms, these companies are crying out for IT modernisation. Any one of these three issues is a compelling reason to migrate off the mainframe; taken together, they represent a perfect storm of existential proportions.
Mark Cresswell is CEO of LzLabs