As the digital economy has gained momentum, a number of questions have been raised about the mainframe’s suitability for today’s business requirements.
Among these criticisms are that the mainframe is old and therefore obsolete; that it is an impediment to progress; and that integrating the mainframe with newer digital systems exposes organisations to service outages. As a result, some are considering ripping out their mainframes in a bid to increase agility and improve their competitive edge.
However, this would be a huge mistake, as firms stand to abandon decades’ worth of irreplaceable intellectual property embedded in the mainframe code that underpins their business operations. Worse still, the arguments against the mainframe are critically flawed, and those who endorse them often fail to offer alternatives that can do a better job of helping enterprises gain a competitive advantage in the digital economy. So what are these mainframe myths, and where does the truth behind them lie?
Myth 1: The mainframe is old, therefore obsolete
Far from being obsolete, the mainframe remains the lynchpin of the modern digital economy – processing over 30 billion transactions every single day, and rising. So why is a 50-year-old technology platform still so critical in today’s digital world? The simple fact is that the mainframe remains unmatched in terms of scalability, performance, reliability and security, which is why it also hosts most of a large enterprise’s databases. However, many fail to acknowledge these virtues in their rush to write off the mainframe.
With the majority of the enterprise’s data residing on the mainframe, any mobile, web, or cloud application that needs to draw on core data, such as customer records or retail inventories, remains strongly reliant upon mainframe code. For instance, every time a customer checks their bank balance through a mobile application, the service needs to interact with the mainframe to collect the account data before serving the information back to the app. This isn’t likely to change any time soon, with research revealing that 88% of CIOs expect the mainframe to remain a key business asset over the next decade. To address the cynics’ concerns, businesses need to futureproof the mainframe and ensure it can keep up with the rapid pace of technological change. As such, it’s vital that the mainframe is included in Agile delivery workflows, so that it is seen as an enabler, rather than a barrier, to progress.
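That balance-check flow can be sketched in a few lines. This is a simplified, hypothetical illustration only: the in-memory record store, function names and account data below are all invented stand-ins for the mainframe’s real system of record and the API layer in front of it.

```python
# Stand-in for the mainframe's system of record (in reality, e.g. DB2 or VSAM
# data accessed through transaction-processing code such as CICS programs).
MAINFRAME_ACCOUNTS = {
    "12345678": {"holder": "A. Customer", "balance_pence": 150042},
}

def fetch_account_record(account_id: str) -> dict:
    """Simulates the call the service layer makes into the mainframe."""
    record = MAINFRAME_ACCOUNTS.get(account_id)
    if record is None:
        raise KeyError(f"No account {account_id}")
    return record

def balance_for_app(account_id: str) -> str:
    """Shapes the mainframe record into the string the mobile app displays."""
    record = fetch_account_record(account_id)
    pounds, pence = divmod(record["balance_pence"], 100)
    return f"£{pounds}.{pence:02d}"

print(balance_for_app("12345678"))  # £1500.42
```

The point of the sketch is the shape of the dependency: however modern the app at the top, the answer it displays still comes from code and data on the mainframe.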
Myth 2: The mainframe can’t be Agile
Many organisations have never tried to integrate the mainframe into Agile development workflows; they mistakenly believe that they can’t. In truth, the main thing preventing the mainframe from “going Agile” is an IT organisation’s unwillingness to embrace it. At the end of the day, code is code, whether it’s COBOL or Java, so there’s no practical reason not to bring the mainframe into Agile development practices. The challenge is that mainframe developers have traditionally worked independently from those working on other platforms. The continuous delivery model favoured in Agile development is often alien to the mainframe workforce, which traditionally follows waterfall development models and works with outdated tools. As such, a cultural shift is needed to modernise the mainframe environment and bring it into the fold of mainstream IT, but that’s far from impossible.
Companies need to break down the barriers that separate their development teams and integrate the mainframe into mainstream DevOps workflows. Developers must therefore be enabled to use modern tools and processes across the entire IT stack, whether they’re creating a brand new mobile application or tweaking the mainframe code that underpins it, and wherever possible they should use the same tools on every platform. A common toolset will enable programmers to switch seamlessly between tasks, regardless of the platform, creating a truly Agile IT team.
Myth 3: Staying on the mainframe increases risk
Moving away from the mainframe is actually more likely to increase the frequency of IT outages, owing to the complexity of rewriting the interdependencies between applications, the difficulty of preserving business logic during migration, and the risks of transitioning mainframe applications to less secure and less stable platforms.
Rather than moving away from the mainframe, companies need to enable millennial developers to understand the complex interdependencies within mainframe code, which are often undocumented, without the need for extensive training. One way to achieve this is through visualisations that provide graphical visibility into the interactions within mainframe applications and data. The ability for developers to instantly recognise these relationships, as well as the impact that any code changes will have on the wider ecosystem, empowers them to find and fix issues quickly, or avoid problems altogether. As a result, even developers with limited mainframe experience will be able to update legacy applications quickly and accurately.
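The analysis behind such visualisations can be sketched simply: build a call graph of the programs, then walk it in reverse to find everything a change could ripple out to. The graph and program names below are invented purely for illustration; real tools derive this data by scanning the actual source and job definitions.

```python
from collections import deque

# Hypothetical call graph: each program mapped to the programs it calls.
CALL_GRAPH = {
    "BILLING": ["CALCTAX", "CUSTLOOKUP"],
    "STATEMENTS": ["CUSTLOOKUP"],
    "CALCTAX": ["RATETABLE"],
    "CUSTLOOKUP": [],
    "RATETABLE": [],
}

def impacted_by(changed: str) -> set:
    """Find every program that directly or indirectly calls `changed`."""
    # Invert the graph: callee -> list of callers.
    reverse = {}
    for caller, callees in CALL_GRAPH.items():
        for callee in callees:
            reverse.setdefault(callee, []).append(caller)
    # Breadth-first walk up the reversed edges.
    impacted, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for caller in reverse.get(node, []):
            if caller not in impacted:
                impacted.add(caller)
                queue.append(caller)
    return impacted

print(sorted(impacted_by("RATETABLE")))  # ['BILLING', 'CALCTAX']
```

Here, changing the tax-rate module flags both the tax calculation and the billing program that depends on it, which is exactly the kind of answer a developer needs before touching unfamiliar legacy code.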
Ultimately, there are many myths surrounding the mainframe, but they all stem from a basic misunderstanding of how the platform works and the value it provides to the business. As with most things in life, fear of the unknown naturally makes us look elsewhere for answers, but to do away with the mainframe would be a major injustice with serious consequences, both to the platform and to the businesses it supports so reliably. Instead, we should be looking at how we can integrate the mainframe into mainstream IT, so it becomes just another IT platform, rather than a mythical enigma.