It’s Time to Bring Legacy Data into the Modern Era

From law enforcement and tax administration to healthcare and social services, data is the lifeblood of government operations. In addition to being at the core of traditional services, data is also the foundation for a new generation of digital initiatives.

Across all levels of government, agencies are looking to improve governance by analyzing and sharing data. Meanwhile, initiatives such as the White House’s Fifth Open Government National Action Plan are pushing agencies to make their data more transparent and available to the public.

The challenge is that many agencies can’t make use of all their data, because a lot of it is siloed in legacy systems and formats. Modern data solutions are designed to work with state-of-the-art applications and devices in cloud environments, not with mainframes and other legacy systems.

Accessing the data isn’t the problem; the challenge is integrating it so that it can be put into the right hands in usable formats. That is why agencies need a holistic view of their data, regardless of its original source: consolidated in a single location, in a usable format and accessible via the cloud.

We recently developed a Market Trends report that provides insights into how agencies can overcome the challenges of making use of legacy data by moving it to the cloud, explains best practices for integrating the data and offers examples of how it can help agencies to meet their mission goals.

Legacy IT Systems Are Holding Agencies Back

One of the key findings of this report is that legacy systems are holding agencies back. A 2021 KPMG report found that 60 percent of executives from federal civilian agencies, as well as the Department of Defense and state governments, believe that their current IT systems are extremely, very or moderately hurting their ability to integrate new tools and technologies.

In addition, the same report found that 79 percent of government officials said the age of their IT systems negatively impacts their missions. And 50 percent of U.S. state chief information officers say a majority of their applications are in need of modernization.

A recent GAO report also found that 80 percent of the more than $100 billion the federal government spends each year on IT and cyber-related investments goes to operating and maintaining existing IT, including legacy systems.

Turning Fragmented Data into a Holistic View

Many legacy systems have been around for 20 or 30 years for a reason — they are mission-critical. They have been at the center of what an agency does, performing critical, transaction-based functions while collecting vast amounts of information.

The challenge is putting that data to use, especially as agencies evolve toward distributed cloud-based systems. Agencies often face common difficulties in making use of that data.

One difficulty is that newer systems are managed by different people than those who run the legacy systems, and many of those legacy specialists are approaching retirement. Another is that data in legacy systems is often stored in different formats, with different nomenclature.

The inability to make use of that data can impact government services. For example, the federal Centers for Medicare and Medicaid Services (CMS) recently said that many states were mistakenly cutting children off from Medicaid services while performing a large-scale CMS-requested eligibility review.

In this instance, the data attached to a child, such as a parent’s income, could be in a legacy format that’s “not being married-up” to other pertinent data. This potential data mismatch could prevent states from getting a full picture of a child and why they should be eligible for Medicaid.

The Solution: An Integrated Enterprise

Agencies at all levels of government are in various stages of modernizing their systems. But while this can be a long, challenging process, modernizing their legacy data can be done fairly simply with data integration. And modernizing that data is essential to enabling real-time, data-driven decision-making.

There are three key steps to achieve this: liberating the data, connecting it through data repositories and then building the data pipeline.

The first step, liberating the data, is all about creating a comprehensive data dictionary. This involves understanding the structure and organization of the mainframe data, identifying the different data stores, and mapping each data element to its corresponding application.
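As a minimal sketch of what such a data dictionary can look like in practice, the mapping below renames legacy mainframe fields to modern equivalents. All field names, types, and source systems here are hypothetical examples, not any specific agency's schema:

```python
# A minimal data-dictionary sketch: map legacy mainframe field names to
# modern equivalents, recording the type and owning application for each.
# All field names and systems below are hypothetical examples.
DATA_DICTIONARY = {
    "CUST-NBR": {"modern_name": "customer_id", "type": "string",
                 "source": "COBOL claims system"},
    "DOB-YYMMDD": {"modern_name": "date_of_birth", "type": "date",
                   "source": "COBOL claims system"},
    "HH-INC-AMT": {"modern_name": "household_income", "type": "decimal",
                   "source": "eligibility batch files"},
}

def translate_record(legacy_record: dict) -> dict:
    """Rename a legacy record's fields using the data dictionary."""
    return {
        DATA_DICTIONARY[field]["modern_name"]: value
        for field, value in legacy_record.items()
        if field in DATA_DICTIONARY
    }
```

Even this simple structure captures the three things the step requires: what each field means, what shape it takes, and which application owns it.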

The second step, connecting the data through repositories, converges data from legacy systems and new sources alike, allowing agencies to transform and enrich it into useful information for real-time decision-making. Analysts and data scientists can then easily search that unified data.
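The convergence step can be sketched as merging records from both sources on a shared key, so each constituent has one unified record. This is a simplified illustration with hypothetical field names; a real repository would also handle type conversion and conflict rules:

```python
# Converge legacy and modern records on a shared key so analysts can
# query one unified view. Field names are hypothetical; modern values
# win when the same field appears in both sources.
def converge(legacy_rows: list[dict], modern_rows: list[dict],
             key: str = "customer_id") -> dict[str, dict]:
    """Merge rows from both sources into one record per key value."""
    unified: dict[str, dict] = {}
    for row in legacy_rows + modern_rows:  # legacy first, modern overrides
        unified.setdefault(row[key], {}).update(row)
    return unified
```

This is exactly the "marrying-up" problem in the Medicaid example above: once a parent's income from a legacy batch file and a child's enrollment record from a newer system share one key, the full eligibility picture is visible in a single place.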

The third step, building the data pipeline, uses pre-built connectors and adapters to automate the seamless movement of data to target systems, such as a centralized cloud data warehouse like Snowflake. This removes the need for time-consuming, error-prone custom coding while keeping the data flows scalable and reliable.
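A load step at the end of such a pipeline looks roughly like the sketch below. In production the target would be a cloud warehouse such as Snowflake, reached through its own connector; SQLite stands in here purely so the example is self-contained, and the table and column names are hypothetical:

```python
import sqlite3

# Pipeline load-step sketch: write cleaned records into a target table.
# SQLite is a stand-in for a cloud warehouse such as Snowflake; table
# and column names are hypothetical examples.
def load_to_warehouse(records: list[dict], conn: sqlite3.Connection) -> int:
    """Upsert records into the target table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS constituents "
        "(customer_id TEXT PRIMARY KEY, household_income TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO constituents "
        "VALUES (:customer_id, :household_income)",
        records,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM constituents").fetchone()[0]
```

The point of pre-built connectors is that agencies get this kind of load logic, plus retries, scheduling and scaling, without writing or maintaining it themselves.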

Through these three key steps, integrating legacy data allows agencies to deliver on a wide variety of public services.

In healthcare, for example, integrating electronic health records, hospital systems and disease surveillance systems enables real-time monitoring of health trends or early detection of outbreaks. Meanwhile, cities can optimize urban planning, traffic management and waste management, among many other public services. And emergency response agencies can factor in everything from situational awareness to weather forecasts in planning responses.

Conclusion

Digital transformation projects have brought government agencies forward into a new age of distributed, cloud-based computing that generates massive amounts of data. But much of the data that agencies need to perform their missions is stored in older, siloed – though mission-critical – mainframes or other legacy systems.

Making legacy data accessible in usable formats and combining it with data from new sources is critical to efficient agency operations, data sharing initiatives and meeting compliance requirements. It requires a data integration solution that provides a holistic view of data across the enterprise and the tools to enable advanced analytics and any other steps necessary to put it to use.

A data integration platform, enhanced through industry partnerships, allows agencies to move legacy data to an accessible cloud by integrating it with information from multiple new sources and performing other advanced analytical operations.

It’s the best way that an agency can make use of all of its available data to support accurate and real-time decision-making.

Chris Oskuie is the Vice President, State & Local Government and Education at Software AG Government Solutions.

Read article here: https://governmenttechnologyinsider.com/its-time-to-bring-legacy-data-into-the-modern-era/
