

The first Maserati was introduced in 1926. The first Ferrari was introduced in 1947. The first Porsche followed in 1948. And my personal favourite, the first Land Rover, also debuted in 1948.
What do all of these legendary cars have in common?
Each predates the mainframe and COBOL, yet nobody calls them outdated. Why? Because they have continually evolved, embracing modern engineering, cutting-edge technology, and innovation while maintaining the efficiency, performance, reliability, and excellence they were built on. The same is true for the mainframe.
Yet, despite decades of continuous transformation, some critics still cling to the myth that mainframes are outdated, inefficient, and unable to integrate with modern IT systems. This couldn't be further from the truth. IBM's z16, introduced in 2022, was built for AI, and the z17, due to launch this year, is poised to handle new workloads with unparalleled security, scalability, and efficiency. COBOL, the backbone of critical applications, is as easy to use as any modern programming language when paired with the right tools.
The problem isn't the mainframe; it's how we've managed and transformed the applications running on it. Instead of walking away from the most reliable, secure, and high-performing computing platform in history, we should focus on how it's evolving to support new workloads, AI-driven insights, and hybrid cloud integration.
A Rapidly Modernizing Space
The mainframe isn't standing still. It's taking on more mission-critical workloads than ever, supporting everything from AI-powered fraud detection to high-speed financial transactions. In fact, a whopping 72 percent of the world's compute runs on mainframes, while the platform makes up just 8 percent of IT costs.
Mainframe transformation involves two things. First, development teams need to harness the mainframe's computing power, scale, and data storage capabilities. Second, they need these mainframe systems to align with the automation capabilities their cousins in the cloud have adopted, making the mainframe software development life cycle more efficient, eliminating manual processes, and increasing the quality and velocity of legacy applications.
DevOps workflows alone won't get us there, but tools are bridging the gap.
When it comes to tools, shops need mainframe code to be managed just like cloud or distributed applications, enabling continuous integration/continuous delivery (CI/CD) pipelines, automated testing, and version control while maintaining compatibility with legacy environments.
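To make this concrete, a CI pipeline for a COBOL code base can look much like one for any distributed application. The sketch below is a hypothetical GitHub Actions-style workflow, assuming the sources sit in a `src/` directory and using the open-source GnuCOBOL compiler (`cobc`) as a stand-in build step; a real mainframe shop would substitute its own compile, test, and deploy stages (for example, building on z/OS via the Zowe CLI).

```yaml
# Hypothetical CI workflow for a COBOL code base (GitHub Actions syntax).
# GnuCOBOL's cobc is assumed as an illustrative compiler; real shops would
# compile and test on z/OS, e.g. through the Zowe CLI.
name: cobol-ci
on: [push, pull_request]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install GnuCOBOL
        run: sudo apt-get update && sudo apt-get install -y gnucobol

      - name: Compile COBOL sources
        run: |
          mkdir -p bin
          for f in src/*.cbl; do
            cobc -x -o "bin/$(basename "$f" .cbl)" "$f"
          done

      - name: Run automated tests
        run: ./run-tests.sh   # hypothetical test harness for the compiled programs
```

The point is not the specific tooling but the shape: every commit triggers a compile and an automated test run under version control, exactly as it would for a cloud application.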
Culture and the developer experience also play an important role in mainframe transformation. If the developer experience for engineers is subpar, a boost to efficiency is unlikely to emerge. Removing manual bottlenecks, reducing or eliminating context switching, streamlining archaic development processes, and adopting an agile culture are all easy ways to improve the developer experience.
Fine-Tuning the Mainframe for Government Efficiency
Customers I talk to often describe three very different, but equally valid, paths for fine-tuning their mainframe strategy. Some government agencies choose a slow-and-steady approach, expanding their mainframe footprint over time as needs evolve. “Our workloads are growing as our population grows,” one CIO told me. “We’re not moving off the mainframe; we’re growing with it.” For these agencies, there’s a natural rhythm of growth that doesn’t require radical change, just thoughtful investment as usage expands.
Others are leaning into modernization by refactoring the code itself. With the help of generative AI-powered code assistants, customers tell me they are finally able to tackle decades-old applications with confidence. These tools explain unfamiliar code in plain language, document it automatically, and suggest best practices for making changes. For government teams with limited access to senior mainframe developers, this new level of code intelligence helps bridge the skills gap and enables faster, safer transformation of core applications.
And then there are the agencies doubling down, reinvesting in the mainframe by upgrading to the latest zSystems and embracing DevOps practices across the board. “If we can do it on the distributed side, we should be able to do it on the mainframe,” one agency leader told me. By staying current, these organizations reduce technical debt, support modern development tools, and ensure seamless integration into their enterprise-wide DevOps workflows.
Future-Proofing the Mainframe
The developers working with mainframes are also excited about their future. A 2024 Forrester report found that “among global infrastructure hardware decision-makers, 61% said that their firm uses a mainframe. Of those that use mainframes, 54% indicated that their organization would increase its use of a mainframe over the next two years.”
There is also a wide ecosystem of vendors building tools to modernize the mainframe.
That is why you see more and more talk about artificial intelligence, graphical scanning, and mapping tools that parse, map, and refactor legacy code bases and monolithic code into more manageable assets. AI also gives organizations the ability to quickly onboard new resources and get them familiar with their code base faster so they become productive sooner. Developers can pinpoint necessary changes faster, reducing planning time and accelerating updates.
These trends are promising, and I don't doubt that they could allow government agencies to harness the mainframe's data storage and processing power while also adopting the agility that has been the hallmark of Silicon Valley.