Why Baton CORE is architected using DLT
How we used DLT to build a platform able to interoperate with the core ledgers, payment systems and post-trade systems of banks, execute distributed workflows, and exchange assets with legal settlement finality
This blog forms the second part of our distributed ledger technology (DLT) mini-series. It looks at the architecture of Baton CORE™ and specifically at why it was architected on distributed ledger technology.
Before I founded Baton Systems, I spent about 18 months meeting with around 30 different banks to learn about the problems of the post-trade world – specifically in the clearing and settlement of large value assets. The issues were always the same: the movement of assets was slow and expensive, and involved a lot of risk, with delays and inefficient operational processes the norm across banks.
I carefully examined both public and private/permissioned blockchains and found some interesting constructs in each, but neither of them solved the core problems. The major roadblock we faced was that a bank’s middle and back-office systems are complex, with many business processes and, more importantly, technology components all playing an important role. Not all of this could be disintermediated, and in many cases it would be undesirable to do so.
Interoperability is key
“If you want to make improvements in the post-trade world, you need to interoperate with existing business processes and systems, as well as with existing legal frameworks within banks.”
So we soon realised that if we were to really address these problems, with a workable solution, we needed to build out a distributed ledger technology from scratch.
For this to work, we knew that our technology would need to interoperate with a number of internal systems within each bank. For example, the system used to manage entitlements is complex when you’re looking at multiple legal entities within a bank, and even more complex when you’re looking across banks, because one bank’s set of entitlements will not work for another. So for interoperability you need something that’s able to fit and interact with a bank’s complex systems, business processes and security models.
The other area where interoperability is important is a bank’s core payment systems. Every payment request that originates or terminates in a bank, for example, is required to go through compliance checks such as AML and OFAC screening before it hits the street. If you bypass these, then essentially the chain is doing something that doesn’t fit within the legal frameworks that banks need to comply with.
Seamless integration and venue connectivity
To address this, Baton built what I like to think of as core gateway patterns. In software, these are known as enterprise integration design patterns. Through these well-architected, secure and tested modules, we were able to integrate seamlessly with a bank’s existing systems, be they security systems, payment gateways or post-trade systems.
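To make the gateway idea concrete, here is a minimal Python sketch of an enterprise integration adapter. Everything in it – the `PaymentInstruction` model, the `SwiftLikeGateway` class and the field names – is hypothetical, invented for this post to illustrate how an adapter normalises a bank system’s native messages into one shared internal model; it is not Baton’s actual implementation.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class PaymentInstruction:
    """Normalised internal representation of a payment request."""
    reference: str
    amount_minor_units: int  # e.g. cents, avoiding floating point for money
    currency: str


class PaymentGateway(ABC):
    """Gateway pattern: each bank system gets its own adapter that
    translates native messages into the shared internal model."""

    @abstractmethod
    def to_internal(self, native_message: dict) -> PaymentInstruction: ...


class SwiftLikeGateway(PaymentGateway):
    """Adapter for a hypothetical SWIFT-style message format."""

    def to_internal(self, native_message: dict) -> PaymentInstruction:
        return PaymentInstruction(
            reference=native_message["txn_ref"],
            amount_minor_units=int(native_message["amount"] * 100),
            currency=native_message["ccy"],
        )
```

The point of the pattern is that the rest of the platform only ever sees `PaymentInstruction`; adding a new venue or bank system means adding one adapter, not touching the core.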
Our next task was to build infrastructures and connectivity into the core settlement venues, such as exchanges, custody banks and nostro accounts. So, brick by brick, we built native connectivity into the largest exchanges globally.
We built the same native connectivity to custody banks and to nostros and SWIFT gateways. What we essentially achieved was out-of-the-box connectivity to these core venues. We also provided gateways to integrate with the existing infrastructures of the banks. So we now have connectivity into the banks and connectivity into the settlement venues.
Middleware: where distributed ledger technology comes into its own
“We wanted to make sure we brought in the three core constructs of distributed ledger technology: transparency, workflow and auditability.”
The most important thing was to bring in constructs of real-time processing and collaboration between the parties interacting on the ledger.
People have become used to trading in real-time, in nanoseconds, while the post-trade processes take two days, or sometimes even longer. With our use of DLT we were able to provide the ability for users to view the lifecycle of a trade, to watch the exposures and obligations build in real-time and to start collaborating with their counterparty on the Baton platform. “This is an industry game changer.”
There were blockchain projects also working to solve these issues but, to our knowledge, these projects hadn’t previously been able to move beyond a proof of concept. This may have been because they did not provide the key constructs of the gateways needed to integrate with the core systems of a bank. By connecting into these core systems, and into the venues – that is, the exchanges, the custody banks and the nostro accounts – we made collaboration possible. “By using distributed ledger technology we have made smart contracts really smart, and provided the ability to execute them across institutions, allowing for real time visibility and collaboration.”
Customisable distributed workflows
Of course, every bank is different, so how do you customise the workflow of an individual bank whilst still allowing interoperability with others? We found the answer by using the constructs of public and private steps, which I will cover in a future blog post. Our technology allows a bank to customise a smart contract to fit in with its existing business processes whilst still being able to collaborate actively with a counterparty. By enabling this we have been able to bring significant benefits to the market.
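As a rough illustration of the public/private step idea, the toy Python sketch below (all names are hypothetical, and this is not Baton’s implementation) shows a workflow in which only the steps flagged as public are exposed in the counterparty’s view, while private steps stay inside the bank:

```python
from dataclasses import dataclass, field


@dataclass
class Step:
    name: str
    public: bool  # public steps are shared with the counterparty


@dataclass
class Workflow:
    steps: list = field(default_factory=list)

    def counterparty_view(self) -> list:
        """Only public steps cross the institutional boundary;
        private steps remain part of the bank's internal process."""
        return [s.name for s in self.steps if s.public]


wf = Workflow([
    Step("internal_credit_check", public=False),  # bank-private step
    Step("confirm_obligation", public=True),      # shared step
    Step("settle", public=True),
])
```

Here each bank could insert its own private steps without changing the shared, public sequence the two counterparties collaborate on.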
As a result, through our use of distributed ledger technology we were able to build a platform with the ability to interoperate with the core ledgers, payment systems, and post-trade systems of the banks and execute distributed workflows, and exchange assets with legal settlement finality.
Baton’s technology improves market efficiency
Earlier I mentioned the problems of high costs, delays and lack of auditability. With our use of distributed ledger technology we have been able to significantly cut costs and delays for our clients.
This achieves real efficiencies for market participants.
For auditability, we have benefited from the core constructs of the blockchain – that is, tamper resistance and non-repudiation. We are very enthused about the possibilities that this opens up.
“We are excited about what we have built and I think the markets are too. We are very humbled by the efficiencies we are able to bring to our early adopters.”
I hope that I have explained the unique benefits of Baton CORE’s distributed technology architecture. Please reach out if you have any questions or would like further information by emailing [email protected].
DLT, blockchain and financial market infrastructure
Welcome to the most recent edition of CORE Thoughts, discussing the benefits distributed ledger technology (DLT) presents to capital market infrastructure, the differences between a blockchain and DLT and how we have used DLT to architect the Baton CORE™ platform.
The benefits distributed ledger technology offers to capital markets
Significant investment in front office trading technology means trades can now be executed in a nanosecond. This level of efficiency has generated exponential growth in trading volumes. Yet the processing of trades meets a bottleneck at the banks’ middle and back office systems, which are markedly slower and often involve manual operations. This leads to the question: how can post-trade systems be re-engineered to attain efficiencies closer to those of front office systems?
Let’s first consider the functions these middle and back office systems actually perform in practice. They consist of complex business processes, often governed by contracts such as an ISDA Master Agreement, involving operations teams across financial institutions, each using a series of hardwired, siloed and inflexible technologies. Most of these processes and technologies have not changed much in the last 20-30 years. So the large volume of front office trades being fed into the post-trade plumbing is required to move through processes that haven’t kept pace with changes to the business.
This has resulted in significant inefficiencies across the entire post-trade process. These inefficiencies lead to increased costs, significant delays and substantial risk to all participants. It is this inherent inefficiency in the base technologies powering middle and back office systems where applying DLT could prove transformative.
The winning features of a DLT platform we like to summarise as TWA: Transparency, Workflow and Auditability. Let’s explore why these are important.
Transparency
When constructing systems to execute and process huge volumes of trades across multiple institutions, transparency is critical. Having 360-degree, real-time visibility of these trades across institutional boundaries creates opportunities to integrate and streamline workflows, and makes every step of the process completely auditable. Transparency enables banks to create digital workflows with powerful new features that were previously unfeasible.
Workflow
Domain models: Post-trade processes are executed within the context of a legal framework and/or a rulebook. Domain models such as the Common Domain Model enable the digitisation of post-trade events, actions and workflows within the context of a “digital contract” that mirrors (or mimics) the contracts that exist and are agreed between institutions today. This, along with the relevant rulebook, provides the framework for enforceability of the workflow processes between members.
Smart contracts: A smart contract is a piece of software that is processed by a distributed ledger or a blockchain. It’s essentially a software module that formalises and executes agreements between participants and comes with inbuilt compliance and controls. Our view is that the software needs to be combined with a strong rulebook that defines the roles, responsibilities, liabilities and legal agreements, and ensures compliance with existing business process rules as well as rules relating to the exchange of assets and liabilities. Further, a rulebook should also define the roles of the various stakeholders – participants, technologies, operators and intermediaries (which may or may not be financial institutions) – as well as clearly articulate the steps in automation. We believe that this is what makes the software “Smart” and a “Contract”. At Baton, we refer to our use of Smart Contracts as distributed workflows.
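A minimal sketch of that idea in Python, assuming hypothetical check functions and field names (this illustrates a compliance-gated workflow step in general, not Baton’s actual smart-contract engine):

```python
def execute_settlement(payment: dict, checks: list) -> str:
    """Run every compliance check (e.g. screening, credit limits)
    before the transfer step is allowed to execute."""
    for check in checks:
        if not check(payment):
            return "rejected"  # the workflow halts; nothing settles
    return "settled"


# Hypothetical checks: a screening flag and a credit limit.
screened = lambda p: p.get("screened", False)
within_limit = lambda p: p["amount"] <= 1_000_000
```

The inbuilt-controls point is that the checks run inside the workflow itself, so a transfer simply cannot execute without them.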
Auditability
When dealing with high value transactions it’s essential to have end-to-end auditability of the lifecycle of the trade or business process. To achieve this, you need to be able to view the complete lineage of a trade from the moment it enters the process to its termination, a lifecycle that often spans 12 months or longer. Well designed DLT systems with a domain model provide that lineage in a tamper-resistant, fully automated and auditable manner.
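One common way such systems achieve tamper resistance is a hash chain, where each log entry commits to the hash of the previous one. The Python sketch below is illustrative only (the `AuditLog` class and its fields are invented for this post), but it shows why any retroactive edit to the lineage is detectable:

```python
import hashlib
import json


class AuditLog:
    """Append-only log where each entry commits to the previous one,
    so any retroactive edit breaks the chain."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True) + prev_hash
        h = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"event": event, "hash": h})
        return h

    def verify(self) -> bool:
        """Recompute every hash; any edited entry fails the check."""
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["event"], sort_keys=True) + prev
            if hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Because each hash depends on everything before it, altering an early event invalidates every subsequent entry, which is the essence of the tamper resistance described above.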
It’s this unique combination of transparency, workflow and auditability, made possible through the use of DLT, that really enables financial institutions to address, at their very core, the issues of cost, delay and risk they face with their existing post-trade infrastructure.
Differences between distributed ledger technology and blockchain in the context of financial markets
There’s been a great deal of discussion on this topic recently and I think it’s important to bring it back to first principles to understand the key differences between DLT and blockchain.
First, let’s consider the differences between public and private/permissioned blockchains.
Trust works very differently on a public blockchain, which operates a peer-to-peer model in which no individual entity is seen as trusted. Rather, it is the community in its entirety that creates trust via the use of a shared, unalterable digital record.
This is quite different from the permissioned blockchain which actually operates in an intermediary format. Here we have a trusted intermediary – a third party – which is helping govern the interactions between peers.
I am not saying that one is better than the other, but it’s clear that the world we’re working towards, which is likely to have a very different set of requirements in 2030 than it does today, will include both public blockchains and private/permissioned blockchains.
Turning to the spectrum between public and private/permissioned blockchains, at Baton we view the latter as much closer to distributed ledgers, and in my experience there are various instances where they’re more appropriate for the structure of today’s financial markets.
Why we architected the Baton CORE platform using distributed ledger technology
Our roadmap aimed to solve a very specific problem: securely speeding up clearing and settlement across institutions and radically reducing the significant level of risk exposure that existing practices introduce today. We felt that the most effective way to do that was by bringing a spirit of transparency, speed and auditability to these processes.
We also had to make sure our solutions would interoperate with the existing business processes, systems and operating rules of financial institutions. We realised that there were certain gaps in the functionality offered by both private/permissioned and public blockchains. Public blockchains were not an option for private high value contracts, so we went with the private/permissioned route. But even that had pretty significant challenges.
The challenges primarily stemmed from the need for constant interaction with core systems. Let’s take as an example an intermediary that is extending credit to a counterparty. It has to ask: does this entity have sufficient credit to execute this transaction? The process must accommodate constant inputs to, and outputs from, the chain. This is where both public and private/permissioned blockchain technologies have brittle characteristics, which inevitably break in operation.
As a result, at Baton Systems we used DLT to build seamless gateways through which we are able to interoperate with a bank’s core systems and business processes – core ledgers, payment gateways and messaging systems – using secure access protocols, adapters and APIs. In doing so, we also brought in the transparency, auditability and workflow constructs of the blockchain. This is the foundation on which we built the game-changing, fully customisable smart workflows that sit at the heart of the Baton CORE platform, which powers our Core-FX™, Core-Liquidity™, Core-Payments™ and Core-Collateral™ solutions.
I hope I’ve helped clarify the key differences between blockchains and DLT in capital markets and how Baton Systems is using DLT to deliver benefits to our clients in the post-trade environment. Please don’t hesitate to reach out with any questions on this or other aspects to me or our team by emailing [email protected].
How to build a resilient digital market infrastructure
Why true resiliency requires a fully interconnected strategy incorporating the Technology Stack, Operational Process & Governance Structure
When you’re building critical market infrastructure used by the world’s largest financial institutions to process and settle trillions of dollars of assets every day, as we do at Baton Systems, resiliency needs to be a fundamental cornerstone of the entire architecture.
As an engineer, I used to think that resiliency was all about technology. However, I’ve come to understand, and really appreciate, how ensuring true resiliency requires a fully interconnected, three-pronged strategy:
- Firstly, you need to build a resilient technology stack.
- Secondly, the technology stack needs to be supported by an operational process, which itself needs to have resiliency built into it.
- Finally, there needs to be a strong governance process in place that stands behind both the technology and the operational process.
Let me explain further.
The Technology Stack: Building in Resiliency & Redundancy
The type of critical market infrastructure Baton deploys needs to be able to support hundreds of millions, if not more than a billion, events a day. These events don’t arrive at a steady rate: we’re dealing with very bursty traffic, with very high spikes often at the beginning and end of the trading day. These spikes have to be handled in a very efficient manner, so the software needs to be able to run on commodity hardware and it needs to be able to scale (1). These are all factors that need to be considered in the way technology providers design multiple aspects of the solution, including the data pipes, storage, compute infrastructures, monitoring and alert processes.
“Baton’s technology includes real time stream processing features such as MQ Series and Kafka based data pipes that offer guaranteed delivery and large queue depths.”
At Baton, we’ve focused on this and built resiliency and redundancy into the technology itself, supporting our clients with technology that’s able to recover quickly. We believe being able to deliver this is incredibly important because there will always be elements outside of a technology provider’s control, and accommodating this needs to be built into the infrastructure. So we designed Baton’s technology to include real-time stream processing features such as MQ Series and Kafka based data pipes that offer guaranteed delivery and large queue depths. The queues are monitored for latency, throughput and queue depth, for example. We also architected stream processing using serverless architectures and asynchronous processing to reduce latency and increase parallelism. Additionally, the monitoring systems feed into notification systems and case management tools for alerting and calling in different tiers of support as needed.
“We also architected stream processing using serverless architectures and asynchronous processing to reduce latency and increase parallelism”
Our software also autoscales using stateless microservices deployed on a Kubernetes cluster. Our services use an asynchronous, event-driven design pattern – they’re designed to be idempotent, so replaying data in any order will result in the same terminal state.
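To illustrate what order-independent, idempotent event handling can look like, here is a small Python sketch. It assumes a hypothetical scheme in which every event carries a per-trade sequence number; this is one standard approach to the property described above, not necessarily the exact mechanism Baton uses:

```python
class TradeStateService:
    """Idempotent, order-independent event handler: the state for a
    trade is whichever event carries the highest sequence number, so
    replaying events, in any order, converges on the same terminal state."""

    def __init__(self):
        self.state = {}  # trade_id -> (seq, status)

    def apply(self, event: dict) -> None:
        trade_id, seq = event["trade_id"], event["seq"]
        current = self.state.get(trade_id)
        if current is None or seq > current[0]:
            self.state[trade_id] = (seq, event["status"])
        # duplicate and stale events are silently ignored

    def status(self, trade_id: str) -> str:
        return self.state[trade_id][1]
```

Because `apply` only ever moves state forward, a message queue can redeliver or reorder events during recovery without corrupting the result.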
The Operational Process: SLAs, Business Continuity & Disaster Recovery Processes & Client Support
Let’s talk about SLAs. As you’re most probably aware, for this type of software the SLAs will include detailed throughput and latency numbers. As a technology provider, we need to measure our system’s performance against these SLAs in real time, at all levels of our platform and application stack: compute, network, transport, data access and storage. We need to know if the SLAs are likely to be breached and take corrective action. We tie this into a case management tool, so our operations teams are informed and a chain of command kicks in for the business continuity and disaster recovery process. These plans need to be not only documented, but actually tested on a frequent basis.
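As a toy example of that kind of real-time SLA check, the Python sketch below flags a breach when the observed 99th-percentile latency exceeds an agreed threshold (the function names and thresholds are invented for illustration; a production monitor would of course run continuously against streaming metrics):

```python
def quantile(samples: list, q: float) -> float:
    """Nearest-rank quantile of a list of latency samples."""
    s = sorted(samples)
    idx = min(len(s) - 1, int(q * len(s)))
    return s[idx]


def check_sla(latencies_ms: list, sla_p99_ms: float) -> bool:
    """Return True (breach) when p99 latency exceeds the agreed SLA,
    at which point alerting and case management would kick in."""
    return quantile(latencies_ms, 0.99) > sla_p99_ms
```

The same shape of check can be applied per layer – compute, network, transport, data access and storage – each with its own threshold.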
“We provide support 24hrs a day, 6 days a week, but in anticipation of expanding client needs we’re already preparing to extend our support to 24/7”
Additionally, the level of operational support offered needs to align with client needs. To effectively support our clients, we provide support 24hrs a day, 6 days a week. This allows us to use the 7th day to deploy any patches or updates to the software, but in anticipation of expanding client needs we’re already preparing to extend our support to 24/7 – so once required we’ll be ready to deliver.
The Governance Process: Strong & Effective Controls, Accountability & Reporting
The governance structure ensures that the software and operational processes in place are backed by strong and effective controls. This means, for instance, ensuring there is no single point of failure – which, from a personnel perspective, could include ensuring that if someone were to leave the business, neither the company, the client nor their data would be at risk. This is a cultural shift technology providers have to make as an organisation.
Accountability plays a huge role in any governance process, and as part of Baton’s we offer clients frequent governance reports and meetings where we review both our performance against SLAs and the system KPIs. We can also provide clients with automated reporting, and offer a service portal so clients can log tickets and be updated on an issue’s progress.
“Accountability plays a huge role in any governance process – as part of Baton’s we offer clients frequent governance reports & meetings to review both our performance against SLAs & the system KPIs.”
We believe true resiliency needs to be factored in at so many levels when you’re deploying and supporting a bank’s critical market infrastructures and we’re grateful to have been able to work so collaboratively with our clients as we’ve further developed and enhanced our approach to resiliency.
I hope this blog has provided you with a better understanding of how we work with our clients to manage resiliency. If you have any further questions please do not hesitate to reach out and email [email protected].
1. The ability to scale linearly on commodity hardware is important to keep costs low. For example, if hardware that is three times as fast as commodity hardware costs more than six times as much, then its cost per transaction is more than twice as high.