In an ideal world, following the digital boom that took place during the pandemic, organizations would no longer be using legacy systems. However, the reality is that replacing legacy tech with new automated systems is no easy feat, as various challenges arise.
While waiting for automation and digitization, companies have had to integrate a range of disparate tools and workflows to cope with convoluted and sensitive environments set against a backdrop of countless market requirements. Although each tool and workflow provides a competitive advantage in specific markets, collectively they become difficult to manage.
Gaps form between different software systems and processes, data becomes siloed, manual intervention and associated human errors spike, and tasks take longer to complete.
To confront this challenge, companies are turning to automation. However, intense specialization across asset classes, regions, and functions, coupled with a reliance on technical-debt-ridden legacy systems, is needlessly delaying automation transitions. So what can they do to improve?
Aidan Murray explains what he believes can be done:
Solving the challenge of bringing automation into financial services
The financial services sector is complex and laden with niche specialisms. Companies have had to integrate a range of disparate tools and workflows to cope with such a convoluted and sensitive environment, one that's set against a backdrop of countless market requirements.
Even the most carefully planned automation initiatives often result in sub-optimal processes that rely heavily on manual entries or temporary quick-fix solutions. Owing to a lack of resources, internal development constraints, and the absence of a clear return on investment (ROI) for implementing new software systems, true, holistic automation has remained out of reach for many.
The problem of legacy system prevalence
The prevalence of legacy systems has forced companies to build bespoke interfaces and rely on non-industry-specific integrations to knit them together. This is especially the case in those middle and back-office functions that are not directly responsible for generating revenue. It is an approach proven to create as many problems as it solves.
Most legacy systems use custom data models and schemas that cannot be changed. Any integrations are typically file-based, meaning they are neither programmatic nor API-driven. Furthermore, due to the deep proprietary knowledge required to update or amend existing reporting and workflows, reliance on IT teams is intensified, as is the dependence on vendors to create new reports and data flows.
The result of these colliding inflexibilities is that financial services companies remain stuck with an array of outdated systems, dependent on the expertise of technicians.
Data automation platforms hold the key
Data automation platforms have emerged as the most compelling solution to date for the integration challenges posed by legacy systems.
Other options, such as robotic process automation (RPA) or outsourced solutions, tackle legacy system challenges by increasing capacity: either by programming bots to complete tasks in place of humans or by tasking more actual humans to perform integration activities.
However, simply increasing capacity fails to solve the underlying issues. Instead, processes are simply migrated to another solution for them to be performed in the same way as before. Only, it is hoped, more quickly.
Alternatively, an approach rooted in data re-engineers processes to become optimal and operates through a common framework that brings systems together without requiring any additional capacity – human or bot. Moreover, employees can then reallocate their workloads to higher-value endeavors.
Though there are multiple beneficial outcomes financial services companies can leverage by achieving automation, three stand out.
Three core outcomes from achieving data automation include:
System interoperability
System interoperability is a core goal for any financial services company. Where it is accomplished, consistent and critical data can be moved seamlessly between front-, middle-, and back-office processes in a way that legacy systems can never match.
Leading platforms support even deeper interoperability by accommodating legacy systems and allowing financial services companies to create, build, or plug in their own data schemas. This means companies can create schemas for specific processes or data types that map all system extracts and reports to standard schemas.
With data normalized, it can then be transmitted to downstream systems with ease.
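The mapping step described above can be sketched in a few lines. This is a minimal illustration, not any particular platform's API: the standard schema, the legacy field names, and the sample row are all hypothetical.

```python
"""Sketch: mapping a legacy system extract onto a standard schema,
then coercing values so downstream systems receive normalized data.
All field names and the schema itself are hypothetical."""

# Hypothetical standard schema: canonical field name -> expected type
STANDARD_SCHEMA = {
    "trade_id": str,
    "quantity": float,
    "settlement_date": str,  # ISO 8601 date string
}

# Per-system mapping from legacy column names to the standard schema
LEGACY_FIELD_MAP = {
    "TradeRef": "trade_id",
    "Qty": "quantity",
    "SettleDt": "settlement_date",
}

def normalize(legacy_row: dict) -> dict:
    """Rename legacy fields and coerce values to the standard types."""
    out = {}
    for legacy_name, value in legacy_row.items():
        std_name = LEGACY_FIELD_MAP.get(legacy_name)
        if std_name is None:
            continue  # drop fields the standard schema does not define
        out[std_name] = STANDARD_SCHEMA[std_name](value)
    return out

row = normalize({"TradeRef": "T-1001", "Qty": "250", "SettleDt": "2024-06-28"})
print(row)
```

In practice each legacy system gets its own field map, while every downstream consumer reads only the standard schema, which is what makes the normalized data easy to transmit.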
Remediation of EUCs (End-User Computing)
EUCs (end-user computing applications, such as spreadsheets and macros) pose a significant control risk to companies, with their associated errors costing the industry millions of dollars annually.
Yet, EUCs do bring flexibility and quick time to market when tackling manual processes, hence their continued prevalence. Here, leading data automation platforms enable the best of both worlds, with companies remediating existing EUCs by migrating them to the platform where all business logic is documented, audited, and version controlled.
Transformation engines within certain platforms go on to provide such flexibility that all Excel and macro-based functions can be migrated into environments that users quickly adapt to, given the similarity with the spreadsheet-based functions they’re familiar with.
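As a rough illustration of what remediating an EUC can look like, the hypothetical sketch below re-expresses a spreadsheet fee formula as a documented, validated function that can live under version control. The fee calculation, the cell formula, and the input ranges are all assumptions for the example.

```python
"""Sketch: business logic lifted out of a spreadsheet-based EUC into a
documented, testable function. The fee formula and its validation
thresholds are hypothetical."""

def management_fee(aum: float, annual_rate_bps: float, days: int) -> float:
    """Replaces a hypothetical cell formula like =A2*(B2/10000)*(C2/365):
    assets under management accrued at an annual basis-point rate
    over a given number of days."""
    # Input validation the original spreadsheet never enforced
    if aum < 0 or annual_rate_bps < 0 or not 0 <= days <= 366:
        raise ValueError("input outside validated range")
    return aum * (annual_rate_bps / 10_000) * (days / 365)

# Accrued fee on $50m at 75 bps for a 30-day period
print(round(management_fee(50_000_000, 75, 30), 2))
```

Once the logic lives in code rather than a cell, it can be audited, version controlled, and covered by tests, which is the control benefit the platform migration is meant to deliver.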
Conversion of manual processes to STP (Straight-Through-Processing)
The conversion of manual processes to STP is crucial for increasing automation across financial services. Nevertheless, migration away from manual processes is impeded by a range of factors, such as client- and counterparty-specific requests or formats, legacy system inability to receive or create standard formats, and semi-structured or unstructured receipt of data.
In such instances, flexible and intelligent automation tools are required to produce STP. For example, if an accounting system or Portfolio Management System requires data from an external counterparty in a specific format such as a .txt file, but the counterparty data is sent as a PDF, data automation platforms complete the conversion automatically.
Where once an operations user had to key the data into a new .txt file manually, the process is migrated to an STP flow, saving significant time and eliminating human error.
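The .txt-generation half of that flow can be sketched as below. A real platform would first extract the fields from the PDF (via a parser or OCR); here the extracted values are represented as a plain dict, and the pipe-delimited layout is a hypothetical target format, not any specific accounting system's specification.

```python
"""Sketch: the render-to-.txt step of an STP flow. Upstream PDF
extraction is assumed to have already produced the field values;
the pipe-delimited record layout is hypothetical."""

import io

def to_txt_record(extracted: dict) -> str:
    """Render extracted counterparty fields as one pipe-delimited record,
    in the column order the downstream accounting system expects."""
    columns = ["trade_id", "counterparty", "amount", "currency"]
    return "|".join(str(extracted[c]) for c in columns)

def write_txt(records: list, buffer: io.StringIO) -> None:
    """Write one record per line to the output buffer (stands in for the
    .txt file the downstream system picks up)."""
    for rec in records:
        buffer.write(to_txt_record(rec) + "\n")

buf = io.StringIO()
write_txt(
    [{"trade_id": "T-1001", "counterparty": "ACME", "amount": 1000000, "currency": "USD"}],
    buf,
)
print(buf.getvalue())
```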
Choosing the right data automation platform
As with any technology, not all data automation platforms are equal. For a company to have come so far in a transformation journey only to realize it has invested in a solution that does not perform in critical areas is an incredibly frustrating experience.
The data solution you implement should be asset class, system, and data source agnostic for maximum flexibility and to eliminate the need for manual intervention or investment in extra solutions.
It should be able to receive data in any format, from SWIFT messages and end-of-day Excel and CSV batch reporting to alternative documents like capital calls, loan notices in PDF, or even non-digital formats. Moreover, it should be able to create data in any format.
Channels are also vital. Your platform should be able to receive data via all channels and systems of record, such as an IBOR, ABOR, or document repositories. It must be able to push data to any channel, from regulatory trade and document repositories to internal data lakes or warehouses.
Lastly, your platform must accommodate the specificities of legacy systems and processes and handle complex data and system workflows comfortably.
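To make "channel-agnostic" concrete, the sketch below shows one way normalized data could be pushed to any registered destination through a single interface. The registry pattern and the channel names are illustrative assumptions, not a description of any vendor's product.

```python
"""Sketch: a channel-agnostic delivery layer. Normalized records are
pushed to any registered target through one common call. Channel names
and their behavior are hypothetical."""

from typing import Callable, Dict, List

# Registry of delivery channels: name -> function taking normalized records
CHANNELS: Dict[str, Callable] = {}

def channel(name: str):
    """Decorator that registers a delivery channel under a common interface."""
    def register(fn):
        CHANNELS[name] = fn
        return fn
    return register

@channel("data_lake")
def push_to_lake(records: List[dict]) -> None:
    print(f"wrote {len(records)} records to the data lake")

@channel("trade_repository")
def push_to_repository(records: List[dict]) -> None:
    print(f"reported {len(records)} records to the trade repository")

def deliver(records: List[dict], targets: List[str]) -> None:
    for name in targets:
        CHANNELS[name](records)  # same call regardless of destination

deliver([{"trade_id": "T-1001"}], ["data_lake", "trade_repository"])
```

Adding a new destination (a warehouse, a document repository) then means registering one more function rather than rewriting upstream processes.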
Troubles will remain until modernization
Until the legacy systems used within financial services have been modernized, companies will struggle to create real-time and non-file-based integrations.
Where this struggle persists, companies have little option but to rely on stop-gap solutions such as EUCs and manual processes to connect systems and processes. However, legacy systems are often so entrenched in core processes that removing them altogether is not feasible, making it critical that companies opt for data and process solutions that accommodate them.
Not only will the correct data solution improve legacy system performance, but it will also improve intra-system data quality. It will capture data accurately from legacy systems and processes and provide confidence that business rules are maintained and validated, while delivering data in a structure and cadence that downstream systems of record can accept.