COBOL is often lumped into the legacy systems category, yet it remains at the core of major systems in state and federal government, the insurance and banking industries, and much more. Very popular in the ’60s, ’70s and ’80s, when the major core systems that run back-office operations were being implemented, it was considered the language of choice because it allowed programmers to logically break down and document business processes.
Prioritizing the transition from legacy systems to something more modern is top of mind for every CIO, regardless of industry or sector. Modernization offers opportunities to reimagine business processes, address graying-workforce concerns, tap larger pools of skilled workers, lower operating costs, and transition away from big iron and its costly replacement cycle (capital costs).
I believed that and worked tirelessly throughout my career to justify transitions away from legacy systems to more modern software platforms, unfortunately with mixed results.
IT leaders were under constant pressure to manage their legacy portfolios. Legislators often refused to allocate new funds for meaningful change, providing only enough to keep the railroad running. Sometimes the easiest and most logical conclusion, wholesale replacement, was the most difficult to take on and the toughest to justify. CIOs waited for major events or crises to justify transitioning — Y2K, 9/11, Obamacare, cloud computing, etc. But even when projects were fully funded, results were mixed, and highly visible failures were too risky for politicians and CIOs alike.
The late Carey Edwards, who once ran for governor of New Jersey, said, “the DMV won’t help you win an election but it will definitely make you lose one.” He was pointing to his efforts in the mid ’80s to modernize the Department of Motor Vehicles’ systems from COBOL to more modern technology, a project that failed miserably. It was actually the state IT department that bailed the DMV out of the crisis — but it took three years, and Edwards lost his bid to become governor.
That visible failure sent a chilling effect throughout state government, and code modernization took a backseat to the less risky proposition of hardware re-platforming. Agencies focused instead on downsizing from mainframes to mini/micro computing to claim reductions in operating costs and data center footprint. The legacy code, in many instances, went untouched.
With wholesale replacement being too costly, too time-consuming, and too high a risk for visible failure, CIOs began exploring other strategies, such as solutions that encapsulate or shield the COBOL code so that programmers could innovate around it without disturbing it.
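The encapsulation strategy described above can be sketched in miniature: a thin modern facade parses the fixed-width output of an untouched legacy transaction into structured data that new applications can consume. This is a hypothetical sketch, not any particular agency's implementation — `legacy_cobol_lookup` stands in for whatever bridge (CICS gateway, screen scraping, message queue) actually reaches the mainframe, and the record layout is invented for illustration.

```python
def legacy_cobol_lookup(account_id: str) -> str:
    """Stand-in for an untouched COBOL transaction (hypothetical).

    Returns a fixed-width, flat record, the way a mainframe program
    typically would: 10-char account id, 10-char status, 7-digit
    balance in cents.
    """
    return f"{account_id:<10}ACTIVE    0001250"


def get_account(account_id: str) -> dict:
    """Modern facade: new applications call this, never the COBOL.

    The legacy record layout is isolated here, so front-end teams can
    innovate against a structured interface while the COBOL side stays
    untouched.
    """
    raw = legacy_cobol_lookup(account_id)
    return {
        "account_id": raw[0:10].strip(),
        "status": raw[10:20].strip(),
        "balance_cents": int(raw[20:27]),
    }


if __name__ == "__main__":
    print(get_account("A123"))
```

The design point is that the fixed-width parsing lives in exactly one place; the cost, as the next paragraph notes, is that someone still has to maintain both sides of the seam.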
There was some success accelerating internet-enabled processes and services, but the practice was later criticized as “perfuming the pig.” You still needed staff to maintain the COBOL code, but now you also needed a different team of coders to maintain the front end, and the two often clashed when working together. Most of the new funding went to the sexy new shiny object, while COBOL coders got just enough to keep things barely running. To boot, CIOs now had an overly complicated operation with one foot in the old world and one in the new.
Fast forward 20-plus years and here we are: quarantined at home to lower the risk of catching and spreading a virus, the economy at a near standstill, and legacy systems being blamed for not keeping up with demand for servicing the public — state unemployment insurance and IRS systems being the most visible.
This is déjà vu all over again (or, in the words of Jean-Luc Picard, we’re trapped in a temporal causality loop). Politicians, software vendors, and integration consultants are lobbying for hundreds of millions of dollars in a new stimulus package for wholesale replacements of unemployment and similar public-facing legacy systems. Many of these unemployment insurance systems were supposed to have been renovated with the infusion of cash they received from the last stimulus package, over a decade ago. Vendors convinced agencies to tackle the public-facing components first (customer-facing eligibility determination websites, upgrades to telephony and call centers) and leave the modernization of the COBOL code to a later phase.
Unfortunately, over time, funds dry up, and incoming administrations have different priorities and are often critical of long-term IT projects. At the end of the day, COBOL modernization is put on the back burner and CIOs are forced to keep the railroad running. From a political perspective, there’s nothing sexy about modernizing COBOL code — it doesn’t get you votes, because the average citizen doesn’t understand or care. So COBOL lives another day, but with less available talent.
Allocating another batch of stimulus funds and expecting different results would be a big mistake. So why not try something different? Stop fighting and putting down COBOL, and let’s figure out how to embrace the fact that while COBOL is old, it’s stable. The cost of wholesale replacement far outstrips the risk of keeping and managing COBOL. Of course, not every COBOL system is worth maintaining, but the large core systems at the heart of government processes deserve a second look.
Constructive solutions can emerge if we stop demonizing COBOL and stop using the “legacy system” label as a convenient scapegoat for everything wrong with IT modernization. Legacy COBOL code is not the evil lurking in our data centers; the real culprit is the legacy business processes that predate COBOL.
If stimulus dollars are going to be carved out to modernize core government systems, I would hope that more attention is paid to updating legacy business practices and policies before tackling code modernization. Otherwise, we’re repeating the same inefficient processes, only now with shinier code. Let’s not repeat the past. We’re not smarter than history.