Editor's note: This article was originally published in American City & County, which has merged with Smart Cities Dive to bring you expanded coverage of city innovation and local government. For the latest in smart city news, explore Smart Cities Dive or sign up for our newsletter.
Most branches of government, from local city and county agencies all the way up to the largest departments at the federal level, still run their data architecture on mainframes. But the very term “mainframe” conjures up images of outdated technology from the earliest days of computing. This misconception, combined with government pushes to eliminate wasteful systems, has intensified efforts to modernize IT infrastructure and architectures.
The notion that the mainframe is dead is a misconception that originated in the late 1990s with an uninformed financial analyst. As a result, members of the mainframe community have spent the last 25-plus years dispelling an image that was never true to begin with. A more accurate characterization of the mainframe’s evolution is that IBM has ensured it remains at the core of our modern world by delivering decades of advancements in performance, capacity and security.
Further muddying the waters is the notion that mainframe modernization involves only the mainframe. The reality is that the mainframe is just one server within a broader IT ecosystem, and modernization can mean different things to different organizations, particularly at the state and local government levels. All these facets combine to create a general misunderstanding about the mainframe and, in some cases, the dismissal of modernization initiatives.
Defining system modernization
Modernization isn't simply about replacing one platform with another; that just shifts data from one system to the next. Rather, true modernization is the culmination of several things: enhancing application performance, reducing data latency, implementing additional security, improving end-user experiences, and integrating these improvements within a well-thought-out, “future-proof” technology stack.
IBM has done an admirable job of keeping mainframe technology relevant. Yet public perception often focuses disproportionately on visible changes, such as user interface (UI) updates, rather than back-end improvements. For many users, a modern UI is far more noticeable and appreciated, even if the underlying architecture isn’t technically sophisticated. Conversely, federal agencies that require immense processing capacity to serve the public tend to prioritize back-end optimization over modern front ends.
For state, county and local governments, technological advancement is constrained by budgets. This means that modernizing the use of the mainframe may take a backseat to UI updates that are more noticeable to the public. It can also mean that federal regulations impose unforeseen changes to long-term modernization plans, disrupting direction, altering project team size or completely changing desired outcomes.
Mainframes: One size doesn’t fit all
Mainframe modernization efforts require careful consideration of what should be replaced and how the IT team should approach it. They also require a knowledgeable project management team that can help the organization stick to its plan. However, as leaders and priorities change and as budgets ebb and flow, it’s easy to get distracted by new concepts and technology. Before long, a modernization project stalls in favor of another.
One question being debated in government is whether emerging trends, such as artificial intelligence (AI), are accelerating or stalling modernization efforts. Many leaders, including government decision-makers, tend to latch on to the latest buzz around emerging tech and, without fully understanding the concept or its applicability within their organization, either insist upon its immediate implementation or delay other important updates while waiting for an enhanced version of what’s to come.
Some mainframe modernization efforts have been described as "rip and replace," implying rapid, wholesale change. But those within the mainframe community know that description is highly inaccurate, as it implies a level of speed that’s simply impossible to achieve. Even the most rapid mainframe migrations require careful planning and diligent execution. Organizations that rely on mainframes most often prefer incremental modernization efforts that can be phased in methodically, with fallback procedures as safeguards.
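To make the idea of a phased cutover with a fallback safeguard concrete, here is a minimal sketch of a strangler-style request router. It is purely illustrative: the workload names, cutover percentages and both backend calls are hypothetical stand-ins, not any agency's actual interfaces, and a real migration would involve far more governance and testing.

```python
# Illustrative sketch of phased migration with a fallback safeguard.
# All names (workloads, backends, percentages) are hypothetical.
import random

# Share of each workload currently routed to the new platform.
# Phases advance by raising these numbers one workload at a time.
CUTOVER_PERCENT = {
    "payments": 10,    # pilot phase: 10% of payment transactions
    "reporting": 100,  # fully migrated
    "licensing": 0,    # still entirely on the mainframe
}

def call_new_platform(workload: str, payload: dict) -> dict:
    # Placeholder for a call to the modernized (e.g., cloud-hosted) service.
    raise NotImplementedError("hypothetical new-platform call")

def call_mainframe(workload: str, payload: dict) -> dict:
    # Placeholder for the existing mainframe transaction path.
    return {"workload": workload, "source": "mainframe", "status": "ok"}

def route(workload: str, payload: dict) -> dict:
    """Send a slice of traffic to the new platform; fall back to the
    mainframe if the new path fails or the workload isn't cut over yet."""
    if random.randint(1, 100) <= CUTOVER_PERCENT.get(workload, 0):
        try:
            return call_new_platform(workload, payload)
        except Exception:
            # Fallback procedure: the legacy path remains authoritative.
            return call_mainframe(workload, payload)
    return call_mainframe(workload, payload)

if __name__ == "__main__":
    print(route("payments", {"amount": 42.00}))
```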
When it comes to mainframe architecture, there is no universal playbook that every company and government agency can follow. Solutions will be uniquely tailored to business needs and risk tolerance. Things become even more complicated in government, where leaders and decision-makers may change every two to six years. It’s common to see an agency commit to getting off the mainframe and begin taking steps in that direction, only for priorities or business needs to change mid-migration. This is why it’s important for leaders to plan to deliver “modernization” in phases and stay focused on each phase’s outcomes.
The mainframe brain drain
It’s no secret that there aren’t enough software engineers in general, given the boom of “non-tech” companies building engineering teams to address bespoke business needs. Many of these companies now have engineering teams that rival those of companies considered “tech” companies (think retailers whose entire engineering organizations build bespoke software to give them “an edge”). This shortage is amplified in the mainframe space because of the uniqueness of the platform and because many young tech professionals assume the mainframe isn’t cutting-edge technology. Luring younger engineers into mainframe careers takes real effort. While the mainframe community is investing in a multi-pronged PR campaign of sorts to attract more interest from younger tech minds, cross-skilling new talent into the mainframe space through hybrid models seems to be an effective approach.
A hybrid model blends mainframe reliability and security with the flexibility of Linux, Unix and Windows (LUW) platforms or cloud platforms such as AWS or Azure. Despite cloud providers' popularity, companies and government agencies often recognize that critical transactional operations and secure processing are still best handled by mainframes.
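One way to picture a hybrid model is as an explicit placement policy: each workload is assigned to the platform whose strengths it needs. The sketch below is a simplified illustration only; the workload names, attributes and placement rules are assumptions, not a recommendation for any particular agency.

```python
# Illustrative placement policy for a hybrid mainframe/cloud/LUW model.
# Workload names and attributes are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    transactional: bool        # needs high-volume, high-integrity transactions
    sensitive_data: bool       # handles regulated or personally identifiable data
    needs_elastic_scale: bool  # benefits from on-demand cloud capacity

def place(w: Workload) -> str:
    """Assign a workload to a platform based on its characteristics."""
    if w.transactional or w.sensitive_data:
        return "mainframe"   # reliability, uptime, secure processing
    if w.needs_elastic_scale:
        return "cloud"       # e.g., AWS or Azure for bursty demand
    return "LUW"             # Linux/Unix/Windows for everything else

if __name__ == "__main__":
    examples = [
        Workload("benefits-payments", True, True, False),
        Workload("open-data-portal", False, False, True),
        Workload("internal-reporting", False, False, False),
    ]
    for w in examples:
        print(f"{w.name} -> {place(w)}")
```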
Hybrid models and data ownership
For state and county agencies that need real-time transaction processing power coupled with reliability and uptime, mainframes will likely continue to be a core part of the IT architecture. For agencies that rely more on virtualizing data stored across disparate servers for end users, a hybrid model is more likely to best serve their needs.
Data ownership has been a hot topic for a long time. The European Union passed its landmark GDPR privacy law back in 2016, forcing businesses to shore up how they protect and manage business-critical data by 2018. But significant questions remain around data privacy in the United States, particularly as many of the biggest cloud-based data solutions become more deeply intertwined with government organizations. It’s yet to be seen whether the closer relationship between cloud services and policy will result in state and local government services being forced off mainframes, but the possibility doesn’t seem far-fetched.
This raises questions about the public’s data privacy. If sensitive data currently stored on mainframe servers migrates to cloud-based solutions such as AWS or to a hybrid cloud/mainframe model, added layers of tools, processes and oversight will be needed to ensure that specific types of data remain secure. Government agencies remain notably cautious about migrating sensitive data to cloud platforms because of the complex security requirements involved. And budgetary constraints, especially at the state and county levels, will further complicate decisions around modernization and cloud adoption.
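As one example of those “added layers,” sensitive fields can be masked or tokenized before records ever leave the mainframe environment for a cloud store. The sketch below is a simplified illustration under that assumption; the field names and the keyed-hash token scheme are stand-ins for whatever encryption, tokenization and key-management tooling an agency actually adopts.

```python
# Illustrative field-level protection before replicating records to a cloud store.
# Field names and the hashing scheme are hypothetical stand-ins for real
# tokenization/encryption and key-management tooling.
import hashlib
import hmac

SENSITIVE_FIELDS = {"ssn", "dob", "account_number"}
SECRET_KEY = b"replace-with-managed-key"  # in practice, from an HSM or key vault

def tokenize(value: str) -> str:
    """Replace a sensitive value with a keyed, irreversible token."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_for_cloud(record: dict) -> dict:
    """Return a copy of the record that is safe to replicate off the mainframe."""
    return {
        field: tokenize(str(value)) if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }

if __name__ == "__main__":
    citizen_record = {"name": "J. Smith", "ssn": "123-45-6789", "county": "Example"}
    print(prepare_for_cloud(citizen_record))
```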
The future of mainframe modernization
The mainframe era is far from dead; it continues to transform to meet evolving demands. Longer-term strategic hybrid approaches, supported by AI-driven analysis, automation and advanced security measures, will continue to influence the future of modernization efforts. However, addressing the skills shortage in mainframe administration is critical to ensuring sustained progress.
Ultimately, the goal is still clear: to maintain robust, secure and responsive infrastructure that can adeptly support the demands of citizens and businesses. Through careful planning, modernization efforts can achieve both operational efficiency and public confidence in government infrastructure.
Commentary is a space for state and local government leaders to share best practices that provide value to their peers. Email Smart Cities Dive to submit a piece for consideration, and view past commentaries here.
About the Author
Jennifer Nelson, CEO of Izzi Software, has spent the bulk of her career in the mainframe space, including 15 years at Rocket Software and five years at BMC. In 2019 she left Rocket to work in senior engineering roles at global technology companies outside of the Z world. In early 2024, Nelson began working on what would become Izzi Software, which is currently acquiring companies in the Z Systems and IBM Power sectors.