How to Effectively Integrate a Third-Party Core in a Complex IT Infrastructure: A Banking Migration

by Roman Saltanov, November 29th, 2023

Too Long; Didn't Read

Revolutionizing bank operations: migrating the core banking system and building a seamless authentication integration for optimal performance.

In the dynamic realm of banking technology, my team and I embarked on an ambitious project to migrate a bank's core operations to a modern platform developed by an external provider. This wasn't just an upgrade but a transformative integration that involved transitioning to a new data storage system and updating a multitude of API interfaces, upon which many internal software complexes relied.

The Main Challenge

The bank faced a unique problem in integrating its own session-based authentication system, tied to LDAP, with the specialised authentication method of the new core banking system. A serious question arose: how could we transition smoothly to the new system without changing the existing software complexes, which were not only vital to our operations but also owned by another company?

The Breakthrough

To address this challenge, I developed an innovative 'authentication converter'. This intermediary layer was designed to authorise requests within the internal system and generate corresponding tokens for the new core system. It could handle a diverse set of API payload formats, including XML and JSON, while maintaining the existing methods of interaction.
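A minimal sketch of how such a converter layer might work, assuming a hypothetical validate_internal_session helper for the LDAP-backed session check and an illustrative token endpoint on the new core (none of these names are the bank's actual API):

```python
import requests

CORE_TOKEN_URL = "https://core.example.internal/auth/token"  # illustrative endpoint


def validate_internal_session(session_id: str) -> str | None:
    """Placeholder for the LDAP-backed session check used by the internal systems."""
    return "ldap-user" if session_id else None


def exchange_for_core_token(session_id: str) -> str:
    """Authorise the internal session, then obtain a token for the new core system."""
    user = validate_internal_session(session_id)
    if user is None:
        raise PermissionError("invalid internal session")
    # The converter, not the legacy software complexes, talks to the core's auth API.
    resp = requests.post(CORE_TOKEN_URL, json={"subject": user}, timeout=5)
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The request body itself, whether XML or JSON, passes through unchanged; only the authentication details are translated.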

Requests from the internal software complexes are redirected to a load balancer and then to the newly developed system. Each requested URL is matched against the URL link storage system, which maps it to the corresponding URL in the new core and also stores various additional settings for each link.
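In a sketch, the URL link storage could be a keyed store of link records, each holding the target URL in the new core plus its per-link settings (the field names below are assumptions for illustration, not the production schema):

```python
from dataclasses import dataclass


@dataclass
class LinkConfig:
    target_url: str        # corresponding endpoint in the new core system
    cache_ttl: int = 0     # seconds; 0 means caching is disabled for this link
    timeout: float = 5.0   # per-request timeout towards the core


# Illustrative link storage: internal path -> settings for the new core.
LINKS: dict[str, LinkConfig] = {
    "/accounts/balance": LinkConfig("https://core.example.internal/v2/balances", cache_ttl=30),
    "/payments/transfer": LinkConfig("https://core.example.internal/v2/transfers"),
}


def resolve(path: str) -> LinkConfig:
    """Match a requested URL to its record in the link storage."""
    try:
        return LINKS[path]
    except KeyError:
        raise LookupError(f"no mapping configured for {path}") from None
```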

Technical Complexity

The integration supports the HTTP methods GET, POST, PUT, and DELETE, and works with HTTP headers whose parameters configure how the application handles each request.
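Continuing the LinkConfig sketch above, forwarding an arbitrary method while carrying the header parameters along could look like this (a simplified illustration, not the production code):

```python
import requests


def forward(method: str, link: LinkConfig, headers: dict[str, str], body: bytes) -> requests.Response:
    """Forward GET/POST/PUT/DELETE from an internal software complex to the new core."""
    if method not in {"GET", "POST", "PUT", "DELETE"}:
        raise ValueError(f"unsupported method: {method}")
    return requests.request(
        method,
        link.target_url,   # resolved from the link storage
        headers=headers,   # header parameters travel with the request
        data=body,
        timeout=link.timeout,
    )
```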

Redis-based caching is applied to requests that opt in via the corresponding HTTP header parameter or via the configuration for the given URL. For APIs that permit caching, this increases operational speed and offloads the database.
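A cache-aside pattern with redis-py matches that description: cache only when the link configuration or a header parameter allows it. The header name and key scheme below are illustrative, and the forward and LinkConfig helpers come from the sketches above:

```python
import hashlib

import redis

r = redis.Redis(host="localhost", port=6379)  # illustrative connection settings


def cached_forward(method: str, link: LinkConfig, headers: dict[str, str], body: bytes) -> bytes:
    """Serve cacheable GET responses from Redis; otherwise call the core directly."""
    cacheable = method == "GET" and (
        link.cache_ttl > 0 or headers.get("X-Converter-Cache", "").lower() == "true"
    )
    key = "conv:" + hashlib.sha256(f"{link.target_url}:{body!r}".encode()).hexdigest()

    if cacheable:
        hit = r.get(key)
        if hit is not None:
            return hit  # fast path: no call to the core or its database

    resp = forward(method, link, headers, body)
    if cacheable and resp.status_code == 200:
        r.setex(key, link.cache_ttl or 60, resp.content)
    return resp.content
```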

A monitoring system has also been implemented to identify performance issues, enabling prompt responses to emerging difficulties.
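For example, with prometheus_client the converter can export request counts and latencies for a scraper to pick up (metric names and port are illustrative; the helpers come from the sketches above):

```python
from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("converter_requests_total", "Requests handled", ["method"])
LATENCY = Histogram("converter_request_seconds", "End-to-end request latency")


def instrumented_forward(method: str, link: LinkConfig, headers: dict[str, str], body: bytes) -> bytes:
    """Wrap cached_forward with basic metrics for the monitoring system."""
    with LATENCY.time():  # records the duration into the histogram
        payload = cached_forward(method, link, headers, body)
    REQUESTS.labels(method=method).inc()
    return payload


start_http_server(9100)  # expose /metrics for the monitoring system to scrape
```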

Performance and Cost Savings

The new system now processes about 30,000 requests per second, demonstrating its efficiency and scalability. By developing this solution in-house instead of using the main system developer's integration services, the bank saved hundreds of thousands of British pounds.

The Future is Here

The entire infrastructure of the new program is deployed on AWS in a Kubernetes (K8s) cluster, which allows it to process requests from hundreds of internal systems and to scale as needed.


Implementing this system resolved the task in a short time, and the system itself was designed with maximum simplicity and high-speed performance in mind. Currently, almost all services pass through it, and it handles the load effectively and efficiently, testifying to the successful resolution of the integration challenge.