Scanning, Documenting, Validating & Visualising Mainframe Lineage
Solidatus’s legacy software scanning capabilities allow organisations to scan the most exhaustive list (50+) of mainframe languages and technologies and automatically generate a visual metadata model, complete with field-level lineage, processing logic and a business glossary, in a fraction of the time and cost it would normally take. This process can be a one-off exercise or automated to run as required. Examples of supported languages and technologies include Adabas, ADS/OL, C++, CICS, CSD, CICS Tables, COBOL, Delta, DL/I, Easytrieve, Focus, Fortran, Ideal, IDMS, IMS, JCL, Link Decks, Load Modules, Mantis, Model 204, Natural, PL/1, SQL, Syncsort, Telon, etc.
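To give a flavour of what automated field-level lineage extraction involves, the sketch below derives lineage edges from COBOL MOVE statements. It is a deliberately simplified, hypothetical illustration, not Solidatus's scanner: production scanners must also handle COMPUTE, REDEFINES, copybooks, CICS and DB2 calls, and much more.

```python
import re

# Simplified illustration (not Solidatus's scanner): each COBOL MOVE
# statement maps a source field to a target field, yielding one
# field-level lineage edge per statement.
MOVE_PATTERN = re.compile(r"\bMOVE\s+([\w-]+)\s+TO\s+([\w-]+)", re.IGNORECASE)

def extract_lineage(cobol_source: str) -> list:
    """Return (source_field, target_field) pairs found in MOVE statements."""
    return MOVE_PATTERN.findall(cobol_source)

sample = """
    MOVE CUST-NAME TO RPT-NAME.
    MOVE ACCT-BAL  TO RPT-BALANCE.
"""
edges = extract_lineage(sample)
# edges == [('CUST-NAME', 'RPT-NAME'), ('ACCT-BAL', 'RPT-BALANCE')]
```

Repeating this across every program, job and copybook is what turns a mainframe estate into a connected, field-level lineage model.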
In addition to mainframe scanning, Solidatus has a rich suite of connectors and a powerful API that enable organisations to fully automate metadata and data lineage documentation from a wide range of sources, including data governance tools, data dictionaries, databases, big data platforms, cloud platforms, ETL tools, reporting software, BI tools, spreadsheets, programming languages and bespoke systems. All changes are versioned and audited, and can be augmented with manually maintained metadata and lineage.
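The kind of payload such an automation pipeline might assemble before pushing lineage to a tool's API can be sketched as below. The payload shape and field names here are illustrative assumptions only, not Solidatus's actual API contract.

```python
import json

# Hypothetical sketch: building a metadata/lineage payload for submission
# to a lineage tool's REST API. The structure ("system", "objects",
# "lineage") is an assumption for illustration, not a real API schema.
def build_lineage_payload(system, edges):
    model = {
        "system": system,
        "objects": sorted({field for edge in edges for field in edge}),
        "lineage": [{"source": s, "target": t} for s, t in edges],
    }
    return json.dumps(model, indent=2)

payload = build_lineage_payload(
    "billing-mainframe",
    [("CUST-NAME", "RPT-NAME"), ("ACCT-BAL", "RPT-BALANCE")],
)
```

Versioning and auditing then amount to storing each generated payload and diffing it against the previous run.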
Demonstrable Data Lineage
Solidatus mainframe scanning automatically identifies, documents and validates legacy code, creating a data catalogue and visualised data lineage that tracks data use throughout the complex processes locked away within a mainframe.
Through its unique metadata functionality, Solidatus enables organisations to enrich their mainframe data, increasing understanding to help drive business intelligence and turning an organisational cost into an organisational investment.
Sharing & Validating
Solidatus enables an organisation to quickly and easily share and validate its mainframe data lineage in an interactive, dynamic visual format, giving senior management greater understanding and transparency.
Data Quality Resolution
Solidatus allows organisations to visualise their entire data landscape in a single model or drill down to an individual data flow. This transparency helps an organisation quickly identify and resolve data quality issues, demonstrate a full understanding of its processes and evidence regulatory compliance.
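The drill-down described above can be thought of as an upstream trace over lineage edges: starting from a field showing a quality issue, walk backwards through every field that feeds it. The sketch below is an illustrative graph traversal under that assumption, not Solidatus internals.

```python
from collections import deque

# Illustrative sketch (not Solidatus internals): given field-level lineage
# edges, find every upstream field feeding a given target -- the kind of
# drill-down used to root-cause a data quality issue.
def upstream(edges, target):
    parents = {}
    for src, dst in edges:
        parents.setdefault(dst, set()).add(src)
    seen, queue = set(), deque([target])
    while queue:
        node = queue.popleft()
        for parent in parents.get(node, ()):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

edges = [("A", "B"), ("B", "C"), ("X", "C"), ("C", "REPORT")]
# upstream(edges, "REPORT") == {"A", "B", "C", "X"}
```

A bad value in REPORT can therefore be traced to any of A, B, X or C, which is exactly the question a visual lineage model answers at a glance.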