By Anthony Mangini

Data Integration, API Development, and Legacy System Modernization

We tried to automate a simple HR workflow last year. Pull employee data from our HR system, match it against records in another platform, update the relevant fields, done. The kind of thing that should take an afternoon to set up and then run itself.

It didn't work. Our HR platform, iSolved, offers a limited, read-only API. To access even basic data programmatically, we had to go through a third-party vendor and pay for the privilege. What we got back still wasn't flexible enough for our specific workflows. No ability to write data back. No way to tailor the integration to our needs. So we built a workaround: manual export, automated matching script, output to a spreadsheet that HR uses to batch-enter updates instead of going case by case. It's better than fully manual. But it's duct tape on a problem that shouldn't exist.
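The matching step in a workaround like ours fits in a few lines of Python. A minimal sketch, with hypothetical field names (work_email, department) and the records inlined where a real script would read the manual CSV exports:

```python
# Pair rows from a manual HR export against rows from a second platform,
# keyed on a shared field, and collect the changes HR will batch-enter.
# Field names and records are illustrative, not from any specific product.

def match_records(hr_rows, platform_rows, key="work_email"):
    """Return (updates_needed, unmatched_hr_rows)."""
    platform_by_key = {r[key].lower(): r for r in platform_rows if r.get(key)}
    updates, unmatched = [], []
    for row in hr_rows:
        hit = platform_by_key.get(row[key].lower())
        if hit is None:
            unmatched.append(row)                      # no counterpart found
        elif hit["department"] != row["department"]:   # field drifted
            updates.append({key: row[key],
                            "old_department": hit["department"],
                            "new_department": row["department"]})
    return updates, unmatched

hr = [{"work_email": "a@x.com", "department": "Finance"},
      {"work_email": "b@x.com", "department": "HR"}]
plat = [{"work_email": "A@x.com", "department": "Ops"},
        {"work_email": "c@x.com", "department": "IT"}]

updates, unmatched = match_records(hr, plat)
```

The script's output lands in a spreadsheet for batch entry only because the platform offers no write endpoint; with a writable API, the last step would be an API call instead.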

This isn't a story about one vendor. It's a story about what happens when a system wasn't built to connect to other systems. And it's playing out in organizations of every size, across every sector.

The Integration Tax

The federal government spends more than $100 billion annually on IT. Roughly 80% of that goes to operating and maintaining existing systems (GAO, 2025). Not building new capabilities. Not funding innovation. Keeping the lights on.

GAO's latest review identified 11 critical federal legacy systems that collectively cost $754 million per year just to maintain (GAO, 2025). Some run on COBOL and Assembler code from the Kennedy administration. The average COBOL programmer is approaching 60, and 10% of them retire every year. That institutional knowledge isn't being replaced.

But federal mainframes aren't the only problem, and they're not the most common one. The more typical version of "legacy" is a system like our iSolved example: it does its primary job fine, but connecting it to anything else requires workarounds, vendor negotiations, or both. Data goes in. Getting it back out in a usable format is a project in itself.

Every organization has at least one system like this. The CRM that marketing bought five years ago. The case management system that operations has been customizing since 2014. A financial platform nobody is allowed to touch. And somewhere in between, spreadsheets and manual exports fill the gaps that APIs should be filling.

Why This Matters More Now Than It Did Two Years Ago

Integration debt has always been expensive. In 2026, it's become something worse: a blocker for automation and AI.

Consider the progression. You want to automate a workflow that spans two systems. To automate it, you need data from both systems in a structured, reliable format. If one system won't expose its data through a capable API, the automation stalls before it starts. You're back to manual exports and spreadsheets.

Now scale that to AI. AI tools need connected, clean, accessible data to do anything useful. If your employee data lives in a system with a read-only API that requires a paid third-party intermediary, you're not building AI-powered HR workflows. You're not even building basic automations. You're paying someone to copy data between systems, which is the exact opposite of where organizations need to be heading.

Contrast that with a platform built API-first. We use Atlassian's ecosystem across Tactis, and the difference is night and day. Jira and Confluence expose comprehensive REST APIs with full read and write access across well-documented endpoints. We use those APIs to update documentation across the platform as policies and processes change, pull project and client data for executive reporting without manual assembly, and maintain a single, reliable source of truth across teams. The automation possibilities compound because the platform was designed to let other systems in. Nearly every operation you can perform in the browser, you can also perform through the API.
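To make the contrast concrete, here's a minimal sketch of building an authenticated request for Jira Cloud's long-standing issue search endpoint. The site name, project key, and credentials are placeholders, the request is constructed but not sent, and you should check Atlassian's current docs before relying on a specific endpoint, since their search APIs have evolved:

```python
import base64
from urllib.parse import urlencode

# Build the URL and headers for a Jira Cloud JQL search (GET request).
# Jira Cloud uses Basic auth with an account email and API token.
SITE = "https://example.atlassian.net"       # hypothetical site
JQL = "project = OPS AND updated >= -7d"     # hypothetical project key

def jira_search_request(site, jql, email, api_token, max_results=50):
    """Return (url, headers) for a JQL search against Jira Cloud."""
    query = urlencode({"jql": jql, "maxResults": max_results})
    token = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    return (f"{site}/rest/api/3/search?{query}",
            {"Authorization": f"Basic {token}",
             "Accept": "application/json"})

url, headers = jira_search_request(SITE, JQL, "bot@example.com", "api-token")
```

A reporting job would send this request, page through the JSON results, and assemble the executive summary with no manual export anywhere in the loop.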

That's the real divide in 2026. It's not between organizations with new systems and organizations with old ones. It's between organizations whose systems can talk to each other and organizations whose systems can't. The first group is automating workflows and experimenting with AI. The second is still exporting CSVs.

What API-First Actually Means

An API is a standardized way for two systems to exchange data. When an organization takes an API-first approach, every system (new or modernized) is designed to share its data through well-documented, secure interfaces from the start. You build the translation layer once, and any authorized system can plug into it.
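The "build the translation layer once" idea can be sketched as a canonical record shape plus one field mapping per system. Every name below is illustrative, not taken from any real product:

```python
# One canonical record shape; each source system registers a mapping
# into it. Consumers read the canonical shape and never need to know
# which system a record came from.

MAPPINGS = {
    "hr_system": {"id": "employee_id", "full_name": "name",
                  "email": "work_email", "status": "emp_status"},
    "crm":       {"id": "contact_id", "full_name": "display_name",
                  "email": "email_addr", "status": "lifecycle_stage"},
}

def to_canonical(source, record):
    """Translate one system's native record into the shared shape."""
    mapping = MAPPINGS[source]
    return {field: record.get(src_key) for field, src_key in mapping.items()}

row = to_canonical("crm", {"contact_id": 7, "display_name": "Ada",
                           "email_addr": "ada@x.com",
                           "lifecycle_stage": "active"})
```

Adding a new system means adding one mapping, not one integration per existing system — which is exactly why the translation layer only has to be built once.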

The practical difference is speed. Adding or replacing a system means connecting it to the API layer instead of rebuilding integrations with every other system individually. That's the difference between a six-month integration project and a six-week one.

APIs also enable real-time data flow. When a case status updates in one system, every connected system reflects that change immediately. No more nightly batch exports. No more "the data will be current by tomorrow morning."
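In practice this real-time flow is usually webhook-driven, and the mechanics reduce to publish/subscribe. A toy in-process version, with a downstream "system" modeled as a plain function:

```python
# When a case status changes, every registered subscriber is notified
# immediately -- the event-driven alternative to a nightly batch export.

subscribers = []

def on_case_update(handler):
    """Register a handler to receive case-update events."""
    subscribers.append(handler)
    return handler

def publish(event):
    """Fan one event out to every connected system."""
    for handler in subscribers:
        handler(event)

mirror = {}  # stands in for a downstream system's copy of the data

@on_case_update
def update_mirror(event):
    mirror[event["case_id"]] = event["status"]

publish({"case_id": "C-101", "status": "closed"})
```

In a real deployment the publisher is the source system's webhook and each subscriber is an HTTP endpoint, but the guarantee is the same: connected systems reflect the change as it happens, not tomorrow morning.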

For a federal agency, this might mean building secure API connectors between a NICE contact center platform, a ServiceNow case management system, and a Drupal public-facing website. For an association, it could mean connecting the membership database, the event management platform, and the email marketing system so member data stays synchronized without someone running manual exports every week.

What to Do When You're Stuck with a Bad API

Not every system can be replaced on your timeline. You might be mid-contract, the switching costs might be prohibitive, or the system might do its core job well enough that a full replacement isn't justified. The API is the problem, not the product.

This is where middleware and integration platforms earn their keep. Tools like MuleSoft, Workato, or even lighter-weight options like Make can sit between a limited system and the rest of your stack, normalizing data formats and bridging gaps that the native API can't cover. You're essentially building a wrapper: a modern API layer around a system that doesn't have one.
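A wrapper of this kind can be sketched as a class that passes reads through to the vendor client and queues the writes the native API can't accept. The vendor client is stubbed here, and all names are hypothetical:

```python
# Wrap a read-only vendor API so the rest of the stack sees a normal
# read/write interface. Writes queue up for batch entry until the
# vendor exposes a real write endpoint.

class ReadOnlyVendorStub:
    """Stand-in for a vendor client that only supports reads."""
    def get_employee(self, emp_id):
        return {"id": emp_id, "department": "Ops"}

class WrappedHRApi:
    def __init__(self, vendor):
        self.vendor = vendor
        self.pending_writes = []   # changes awaiting batch entry

    def get_employee(self, emp_id):
        # Reads pass straight through to the native API.
        return self.vendor.get_employee(emp_id)

    def update_employee(self, emp_id, **fields):
        # No upstream write endpoint: record the change instead.
        self.pending_writes.append({"id": emp_id, **fields})

api = WrappedHRApi(ReadOnlyVendorStub())
api.update_employee(42, department="Finance")
```

The payoff is that every other system integrates against the wrapper. If the vendor ever ships a write API, or you replace the vendor entirely, only the wrapper's internals change.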

Sound familiar? It should. It's the same principle as strangler fig modernization, applied at the integration layer instead of the application layer. You don't wait for the vendor to fix their API. You build around the limitation and reduce your dependency on it.

The other lever is market pressure. Vendors whose APIs can't support modern integration patterns are increasingly losing deals to competitors who can — but only if buyers make API quality a real selection criterion. As we've seen across projects, asking whether a system has a fully documented, complete API isn't just a technical detail; it is a long-term operational decision. Too often, that question hasn't carried enough weight during procurement, and the result is what many teams are dealing with now: brittle workarounds, dependence on outdated approaches like EDI (structured file-based document exchange, a concept dating to the 1960s), and limited ability to automate or extend functionality. A system with great features and a locked-down or incomplete API will cost you more in workarounds than you saved on the license.

Modernizing Without Ripping and Replacing

The "big bang" replacement approach (retire the old system, build a new one, flip the switch) fails more often than it succeeds. Martin Fowler called the alternative "strangler fig" modernization: instead of replacing the old system all at once, you build a modern layer around it. New capabilities connect through APIs, old functionality migrates in phases, and eventually the legacy system retires because there's nothing left for it to do.

The risk profile alone makes this worth considering. Each phase delivers value independently. If something goes wrong, the legacy system is still running. Organizations get integration benefits within the first phase instead of waiting 18 to 24 months for a full replacement. Kyndryl's 2025 survey of 500 IT leaders found mainframe modernization projects delivering ROI in the range of 288% to 362%, depending on the approach (Kyndryl, 2025).

There's also the institutional knowledge question. Legacy systems encode decades of business rules and process logic. A phased approach gives teams time to document, validate, and intentionally carry forward the rules that matter. That work gets lost in a big-bang replacement. The business logic disappears, and six months later someone asks why the system used to handle a particular exception and nobody can answer.

Where to Start

If your systems don't talk to each other and the workarounds are piling up, the starting point is simpler than most modernization pitches make it sound.

Map your data flows first. Understand how data actually moves through your organization today. Where are the manual handoffs? Where does data get duplicated or fall out of sync? Where is a system's API limitation blocking an automation you've already identified? That map becomes the blueprint for your API strategy and your automation roadmap simultaneously.
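A data-flow map doesn't require special tooling. Even plain records of system-to-system hops, filtered for the manual ones, produce the candidate list. System names below are illustrative:

```python
# Represent each data movement as an edge; the manual hops are the
# integration backlog, ranked however the organization weighs pain.

flows = [
    {"src": "HR platform", "dst": "benefits portal", "via": "manual export"},
    {"src": "CRM", "dst": "email marketing", "via": "api"},
    {"src": "case system", "dst": "reporting", "via": "manual export"},
]

manual_hops = [f for f in flows if f["via"] == "manual export"]
```

Two manual hops out of three flows is a small example, but the same inventory at organizational scale is the blueprint the section above describes.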

From there, pick the highest-value integration point. Start where the disconnection causes the most pain, or where the automation opportunity is clearest. Build a secure, documented API for that connection. Get it working, prove the value, and use that success to fund the next phase.

That's how modernization programs build momentum. Not with a five-year roadmap and a prayer, but with working integrations that make someone's job measurably better this quarter, and set the foundation for the automation and AI work that's already overdue.

At Tactis, we build integrations between complex system environments for federal agencies and mission-driven organizations, and we're increasingly focused on making sure those integrations support automation and AI readiness from the start. If your systems need to talk to each other better, or if you're staring at a modernization effort and want to avoid the big-bang risk, we'd welcome the conversation.

Contact Us

CITATIONS

All external statistics and claims referenced in this post. Verify before publishing.

[1] Federal government spends $100B+ on IT; ~80% on operations/maintenance of existing systems. Source: GAO-25-107795, "Information Technology: Agencies Need to Plan for Modernizing Critical Decades-Old Legacy Systems," July 2025. https://www.gao.gov/products/gao-25-107795

[2] 11 critical federal legacy systems costing $754M/year to maintain. Source: GAO-25-107795, July 2025. https://www.gao.gov/products/gao-25-107795

[3] IRS relies on 60+ year old COBOL/Assembler applications. Source: GAO High Risk Series, IRS modernization. https://www.gao.gov/highrisk/modernizing-irs-business-systems

[4] Average COBOL programmer approaching 60; 10% retiring annually. Source: Phil Teplitzky / IBM data (2019); Fujitsu (2020). https://corporate-blog.global.fujitsu.com/fgb/2020-05-26/whats-the-future-for-cobol/

[5] Mainframe modernization ROI of 288-362%. Source: Kyndryl 2025 State of Mainframe Modernization Survey (500 IT leaders). https://www.kyndryl.com/us/en/campaign/state-of-mainframe-modernization

[6] iSolved integration capabilities "still developing"; third-party vendors required for API access. Source: OutSail HRIS Review (2025); detamoov integration documentation. https://www.outsail.co/post/isolved-reviews-pricing-pros-cons-user-reviews

[7] Atlassian REST APIs: comprehensive read/write access, well-documented, nearly all UI operations available via API. Source: Atlassian developer documentation (2025). https://developer.atlassian.com/cloud/jira/platform/rest/v3/

[8] "Strangler fig" modernization pattern. Source: Martin Fowler. https://martinfowler.com/bliki/StranglerFigApplication.html