Anthropic’s COBOL Bet Shakes Mainframe Economics

Most people who deposited a paycheck this week, withdrew cash from an ATM, or filed a government form probably had no idea their transaction ran through COBOL.
If you haven’t heard of COBOL before now, that’s understandable: it’s a programming language developed when Dwight Eisenhower was in the White House. That invisibility has suited IBM just fine for decades. It has not suited IBM’s shareholders this week.
Anthropic published a blog post on Monday (Feb. 23) describing how its Claude Code tool can automate the analysis, dependency mapping and documentation work that has historically made COBOL modernization so expensive. In response, IBM shares closed down 13.2%, at $223.35, their worst single-day decline since October 2000, according to CNBC. IBM shares have now fallen 27% in February, on track for the company’s biggest one-month slide since at least 1968, according to Bloomberg.
The selloff reflects a deeper concern: if generative AI can meaningfully reduce the cost and time required to understand and rewrite legacy code, it could weaken one of the strongest sources of lock-in in enterprise IT.
An estimated 95% of ATM transactions in the U.S. run on COBOL, and hundreds of billions of lines of the code remain in production across finance, airlines and government. Migrating off those systems requires reverse-engineering decades of undocumented business logic, recreating data structures tightly coupled to mainframe environments, and executing transitions in industries where regulators set the pace. IBM has built a high-margin consulting and mainframe services business around that complexity, charging clients for incremental modernization work that can span years and cost tens of millions of dollars.
The talent dimension compounds the problem. The developers who built these systems have largely retired, taking their institutional knowledge with them. Production code has been modified repeatedly over decades, but documentation has not kept up, and COBOL is now barely taught in universities. That scarcity has made the skills rarer and more expensive and has reinforced IBM’s position as the de facto steward of systems no one else could confidently touch.
The irony is that IBM entered 2026 looking formidable. Q4 2025 revenue reached $19.7 billion, infrastructure sales rose 21%, and IBM Z Systems, its mainframe line, posted 67% year-over-year growth. The company’s generative AI book of business stood at more than $12.5 billion. The sell-off was not triggered by a missed quarter. It was triggered by a question about whether those numbers hold.
IBM’s response was pointed. “New AI tools emerge every week, including our own,” the company said in a post on X. “What they do not change is the fundamental engineering challenge of running mission-critical workloads at scale.”
That pushback is not without foundation. Code translation is one phase in a migration that also requires replacing decades of middleware, data formats, compliance controls and disaster recovery architecture, all of it under the scrutiny of regulators who do not reward speed.
IBM has its own counter in the market: its watsonx Code Assistant for Z uses a 20-billion-parameter model to translate COBOL to Java, and the company claims its internal Project Bob initiative has already improved developer productivity by 45%.
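To make the translation task concrete, here is an illustrative sketch (not watsonx output, and the COBOL paragraph is a hypothetical example) of the kind of statement-level mapping a COBOL-to-Java translator targets: COBOL's fixed-point PIC fields carry an exact decimal scale and rounding behavior that must be preserved, which in Java typically means BigDecimal rather than floating point.

```java
// Hypothetical COBOL source being translated:
//
//   01 WS-BALANCE      PIC 9(9)V99.
//   01 WS-RATE         PIC 9V9999.
//   01 WS-INTEREST     PIC 9(9)V99.
//   ...
//   COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-RATE.
//   ADD WS-INTEREST TO WS-BALANCE.

import java.math.BigDecimal;
import java.math.RoundingMode;

public class InterestPosting {
    // PIC 9(9)V99 implies two fixed decimal places; an explicit scale on
    // BigDecimal reproduces that, and HALF_UP mirrors COBOL's ROUNDED phrase.
    static BigDecimal postInterest(BigDecimal balance, BigDecimal rate) {
        BigDecimal interest = balance.multiply(rate)
                .setScale(2, RoundingMode.HALF_UP); // COMPUTE ... ROUNDED
        return balance.add(interest);               // ADD WS-INTEREST TO WS-BALANCE
    }

    public static void main(String[] args) {
        // 4.25% interest posted to a $1,000.00 balance
        System.out.println(postInterest(new BigDecimal("1000.00"),
                                        new BigDecimal("0.0425"))); // 1042.50
    }
}
```

Even this toy case shows why validation dominates the work: a translator that silently swapped in `double` would compile, run and still drift from the mainframe's penny-exact arithmetic.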
The decisive question is not whether AI can translate code in a lab setting. It is whether banks and regulators will trust a probabilistic model to refactor systems that move trillions of dollars.
Core banking modernization is governed by strict change management protocols. Every modification must be traceable, testable and explainable. A model’s output would need to pass layers of validation before deployment. Banks may use AI to accelerate analysis and documentation while keeping humans firmly in the loop for final decisions.
Even so, the direction of travel matters. If AI can shoulder more of the analytical burden, it could expedite the long-delayed shift toward cloud-native architectures. The cost and complexity of untangling legacy code have been a primary deterrent to migration. Lower those barriers, and the cloud roadmap accelerates.
That shift would ripple across vendors whose business models rely on the persistence of mainframe environments. It would also strengthen hyperscalers and cloud-native core banking providers positioned to capture new workloads.
Originally published by pymnts.com on February 24, 2026.