Foreword
We are living through a period of profound technological transformation. Applying the technologies currently employed in crypto-asset markets to real-world retail and wholesale payments offers the potential for payments to be embedded more efficiently and deeply into our increasingly digital economy, with benefits for households, businesses and financial markets.
In financial markets specifically, DLT could facilitate faster, cheaper processes – with fewer intermediaries, shorter settlement windows and smart contracts automating routine processes. It could increase the liquidity of a wider range of financial assets, enabling them to be held, traded or used as collateral by a broader set of market participants.
There are also challenges to overcome. For example, we need to ensure that, when used in regulated infrastructure, DLT delivers the same levels of operational resilience, clearly accountable governance and settlement finality that we expect for regulated infrastructure using traditional technology, while still maintaining scalability for high transaction volumes.
Furthermore, in financial markets, it has long been recognised by market participants and international standards that there are benefits for market liquidity and financial stability from those transactions settling in central bank money.
At the Bank of England, we have therefore been doing a lot of work to support the responsible adoption of DLT both by the private sector in its tokenisation of money and assets, and by ourselves to enable tokenised wholesale transactions to settle in central bank money.
The DLT Innovation Challenge has provided valuable practical insights as we take forward this work. DLT is evolving fast as a technology, and working across authorities, market participants and technology providers helps us all understand how design choices in DLT platforms can best support markets, and authorities’ policy objectives, including the trade-offs involved. There is clearly more work for us all to do as we progress this critical work to digitalise financial markets.
The Bank is grateful to those across industry and the central banking community who took part in the Challenge and contributed to its success. We look forward to continuing this collaboration in future, in pursuit of monetary and financial stability and a financial system that supports sustainable growth.
Sarah Breeden
Deputy Governor for Financial Stability
Executive summary
The DLT Innovation Challenge was a joint initiative between the Bank of England (the Bank) and the BIS Innovation Hub London Centre. Its objective was to explore how distributed ledger technology could be applied to wholesale payments and settlement. The Challenge focused on building understanding; nothing in this report should be interpreted as indicating a policy position or commitment by the Bank or the BIS.
The Challenge provided practical insights on the design of DLT platforms and how different design choices affect their usefulness for financial market participants as well as important public policy priorities. This contributes to policy work around use of DLT in the private sector to tokenise money and assets, as well as to the Bank’s work on how to support settlement of tokenised wholesale transactions in central bank money. To examine this, the Challenge brought together a diverse set of participants, including financial institutions, technology firms, and academic experts.footnote [1]
Participants explored DLT solutions across four themes: (i) settlement finality and security; (ii) scalability; (iii) network and asset control; and (iv) interoperability with other DLT platforms and with non‑DLT systems, including real‑time gross settlement (RTGS). Across these themes, participants proposed a wide range of technical design choices, including permissioned and permissionless architectures, different consensus mechanisms, layered execution models, and alternative interoperability arrangements.
Settlement finality. Settlement finality underpins confidence that completed transactions are irrevocable and is a core expectation for systemically important payment systems and other types of financial market infrastructure. The Challenge demonstrated that a range of technical approaches can improve the speed of settlement finality on DLTs, including constrained validator sets, alternative consensus designs, and layered execution models. However, each introduces trade‑offs between determinism, resilience and decentralisation. No single model delivers fast, deterministic finality without shifting risk or trust assumptions, highlighting the importance of carefully assessing how different designs align with the standards expected of wholesale payment and settlement systems.
Scalability. Scalability of payment systems can have implications for operational resilience and financial stability. Participants demonstrated multiple on‑chain and off‑chain approaches to scaling DLT‑based systems, including Layer 1 optimisation, horizontal scaling, and Layer 2 execution. These approaches can increase throughput and reduce latency but often introduce additional complexity and have implications for settlement finality, governance and operational resilience. Therefore, scalability choices cannot be considered in isolation from control and resilience requirements.
Network and asset control. Effective network and asset control enables security, resilience, and compliance objectives, depending on system design and governance. The Challenge highlighted that effective network and asset control can be implemented at different layers of the DLT stack, through combinations of on‑chain permissions, off‑chain governance arrangements, and middleware solutions. Stronger control and compliance capabilities were typically associated with more permissioned or layered architectures. More open network designs relied on additional governance and trust arrangements to achieve asset controls, either through technical means (such as smart contracts) or governance through foundations or decentralised organisations.
Interoperability between DLT and non-DLT networks. Interoperability is important to enable innovation, avoid fragmentation of liquidity and ensure robust functioning of the financial system. Participants presented a range of interoperability models enabling interaction between DLT platforms, and between DLT and non‑DLT systems, including RTGS. These included native protocol features, bridges, or orchestration layers. Interoperability solutions involve trade‑offs between atomicity, flexibility and security, with interoperability often shifting, rather than removing, trust and operational dependencies across systems.
Overall, the Challenge highlighted that, while technical progress has been made, the use of DLT for wholesale settlement involves significant design trade‑offs. Choices that improve speed or scalability may introduce new dependencies or resilience considerations; approaches that enhance decentralisation may complicate governance or control; and interoperability solutions can shift, rather than eliminate, trust assumptions. This raises further questions about how network governance should be implemented – including the off-chain components needed to achieve this on permissionless ledgers – and about whether interoperability is best achieved through native integration or through the use of third parties.
The Challenge provided a structured way to examine how emerging DLT‑based solutions align with the expectations and requirements of wholesale payment and settlement. The insights in this report will inform future analysis, experimentation and policy work by helping to clarify where DLT may offer benefits, where constraints remain, and what questions require further consideration as innovation in money and financial market infrastructure continues.
Deep dives and key findings
The DLT Innovation Challenge provided a consistent and comparable basis on which participants could test DLT solutions and deliver practical insights, in order to inform the Bank’s policy work around use of DLT in the private sector to tokenise money and assets, as well as the Bank’s work on how to support settlement of tokenised wholesale transactions in central bank money.
The Challenge included a series of deep dives focused on the four themes most relevant for assessing the use of distributed ledger technology in a wholesale settlement context against the expectations of systemically important payment and settlement infrastructures.
These themes – settlement finality and security; scalability; network and asset control; and interoperability – reflect core design questions and capture the main areas where trade‑offs arise between performance, resilience, governance and control. The discussion below summarises the key approaches observed during the Challenge and the insights they offer, without making policy judgements or assessing the suitability of any specific solution for future implementation.
Settlement finality
Why finality matters
Settlement finality refers to the point at which a transaction becomes irrevocable and legally binding. It underpins confidence that completed transactions are irrevocable and is a core expectation for systemically important payment systems and FMIs, as set out in Principle 8 of the CPMI-IOSCO PFMIs.footnote [2] Achieving settlement finality in distributed environments requires balancing trade-offs between speed, determinism of finality and the degree of distribution of the network. The Challenge focused exclusively on technical settlement finality, rather than legal finality, and this report does not assess whether any arrangement discussed would meet statutory or regulatory requirements for legal finality.
Approaches to achieving technical finality
Participants presented a range of approaches intended to improve the speed and certainty with which technical settlement finality is achieved on DLTs. These approaches can be grouped into three broad categories: 1) Reducing the sets of validators confirming transactions; 2) Improving the speed of transaction propagation; and 3) Consensus mechanism design.
1. Reducing the sets of validators confirming transactions
Reducing the sets of validators required to confirm a transaction can increase the speed of settlement finality as fewer actors are required to agree (reach consensus) on transaction validity.footnote [3] Participants demonstrated three ways to reduce validator sets:
- Access requirements for validating on the network. This is often implemented through the use of selected validator consortia or a proof-of-authority consensus model, in which only approved validators can process a transaction.
- Sampling consensus mechanisms. To confirm the transaction validity, consensus is achieved by sampling a subset of validators, rather than requiring agreement across the full validator set.
- Independent transaction verification and global state confirmation. Some solutions decouple local transaction validation from ledger-wide finality. An independent local verification is performed bilaterally between parties, eg a balance check prior to transaction submission. Ledger-wide confirmation is then provided by a trusted third party (eg a notary function), which verifies the transaction’s validity and its consistency with the wider global state of the DLT.
Figure 1 – Reducing Validator sets
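As an illustration of the sampling approach described above, the following sketch polls a random subset of validators and approves a transaction once a quorum within the sample agrees. This is a generic, hypothetical sketch – the function name, vote representation and parameters are assumptions, not any participant’s design:

```python
import random

def sample_consensus(validators, votes, sample_size, quorum, rng=random):
    """Approve a transaction by polling a random subset of validators,
    rather than requiring agreement across the full validator set."""
    sample = rng.sample(validators, sample_size)
    approvals = sum(1 for validator in sample if votes[validator])
    return approvals >= quorum
```

Because only `sample_size` validators are consulted, fewer messages are exchanged and finality can be reached faster, at the cost of a probabilistic (rather than exhaustive) check of validator opinion.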
2. Improving transaction propagation
Improving how transactions propagate within the network can increase the speed of settlement finality, as transactions can be validated more quickly across the network. Participants demonstrated two main approaches to improving transaction propagation:
- Inferred transaction messaging. Some participants used approaches in which transactions were shared and validated by inferring the current ledger state from previously confirmed valid transactions. This method reduced the need to linearly verify earlier transactions, improving the speed of finality. These approaches also accelerated the sharing of transaction information by enabling validators to distribute both their own transactions and those known to them, using directed acyclic graph (DAG) architectures or other novel messaging protocols.
- Ordering. Some participants improved efficiency by introducing a central component responsible for ordering transactions. By determining transaction order upfront, this approach allows the network to reach consensus and finality without requiring participants to debate transaction ordering, while introducing an element of centralisation in how finality is achieved.
Figure 2 – Improving transaction propagation
3. Consensus mechanism design
This refers to the mathematical approaches by which DLTs reach agreement over the state of the ledger. These range from probabilistic to deterministic approaches. These concepts overlap with validator set constraints and transaction propagation, as the choice of consensus mechanism is not decoupled from overall ledger design.
Throughout the Challenge, participants implemented a range of consensus mechanisms. Some firms chose to reduce the potential for transaction reversibility and ledger forking by implementing deterministic finality through fixed voting rounds to confirm transactions. In contrast, other participants relied on probabilistic mechanisms which use economic incentives to encourage validators to confirm transactions and discourage reversal or forking. A further approach involved sampling the validity of transactions to reach finality. These approaches involve trade-offs in terms of the resilience of the consensus mechanism, as faster determinism is often achieved by reducing the level of agreement required within the consensus mechanism.
All the approaches also need to consider ways to ensure fairness in how transactions are proposed, validated and committed to the network, so that a validator cannot prioritise certain transactions ahead of others and front-run them.
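The deterministic voting-round approach can be illustrated with the standard Byzantine fault tolerant quorum calculation. This is a generic sketch of the widely used 2f + 1 commit threshold, not a description of any participant’s specific protocol:

```python
def is_final(total_validators, commit_votes):
    """Deterministic finality check for a BFT-style voting round.

    With n validators the protocol tolerates f = (n - 1) // 3 faulty
    nodes; a transaction is final once 2f + 1 commit votes are
    collected, after which it cannot be reversed or forked.
    """
    f = (total_validators - 1) // 3
    return commit_votes >= 2 * f + 1
```

The trade-off noted above is visible in the formula: lowering the threshold below 2f + 1 would speed up finality but reduce the number of faults the network can tolerate.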
Alternative views and combination of approaches
Some participants acknowledged limitations in settlement finality performance at the Layer 1 level, the core ledger for settlement. These solutions opted to use a Layer 2 – a separate ledger that batches and provisionally settles transactions off the Layer 1 before committing them back to it – in order to increase throughput. In addition, some interoperability-focused solutions prioritised enabling settlement across ledgers, rather than improving finality on a single ledger. These approaches did not improve settlement finality on the Layer 1 but improved scalability and interoperability between networks.
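The Layer 2 batching pattern can be sketched as follows. The class and method names are hypothetical, and a Python list stands in for the Layer 1 ledger; a real arrangement would anchor the batch to the Layer 1’s consensus mechanism rather than simply appending to a list:

```python
class Layer2Batcher:
    """Illustrative sketch: record transactions provisionally off-chain,
    then commit them to the Layer 1 ledger as a single batch."""

    def __init__(self, layer1):
        self.layer1 = layer1   # the settlement ledger (a plain list here)
        self.pending = []      # provisionally settled, not yet final

    def submit(self, tx):
        # Provisional settlement: fast, but finality is not yet achieved.
        self.pending.append(tx)

    def settle_batch(self):
        # Finality is only inherited once the batch reaches the Layer 1.
        batch = tuple(self.pending)
        self.layer1.append(batch)
        self.pending = []
        return batch
```

The sketch makes the dependency explicit: however quickly transactions accumulate in `pending`, their finality ultimately rests on the Layer 1 commit.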
The approaches to improving settlement finality are not mutually exclusive; and it was common for solutions to combine multiple methods. For example, a given solution may implement access requirements for validators alongside a sampling based consensus mechanism and/or deterministic consensus mechanisms. In addition, Layer 2 solutions may leverage these settlement finality mechanisms to increase the speed of on-chain processes before ultimately relying on settlement back to Layer 1. The interoperability approaches observed did not directly address settlement finality, but instead focused on co-ordinating settlement across different ledgers or systems.
Key learnings and insights
- Improvements in probabilistic settlement. These solutions leveraged Layer 2 arrangements and sampling-based consensus methods to improve settlement speed. However, this raised further questions about whether probabilistic finality could be supported in settlement models in financial market infrastructures and payment systems, given that these approaches cannot, in their current form, provide fast and deterministic finality.
- There is no single solution to enable fast and deterministic technical finality. Participants improved deterministic settlement finality by constraining validator sets and implementing consensus models with deterministic finality. Hypothetical performance figures of 10,000 transactions per second (TPS) and sub-second finalityfootnote [4] were presented where these approaches were combined. These outcomes involve trade-offs between speed, network distribution and fault tolerance within the consensus mechanism, and were not independently validated.
- Different settlement finality solutions raise resilience considerations. We identified resilience and operational risk issues relating to consensus mechanisms and validator nodes responsible for ensuring settlement finality. Design choices affecting the distribution of validators and the number required to agree on a transaction can impact network resilience, including susceptibility to malicious actors, concentration of control, or the impact of validator downtime or infrastructure failures.
- Layer 2 solutions improve transaction throughput but rely on Layer 1 finality. This approach addresses scalability but remains bound by the conditions of the Layer 1, as core settlement continues to rely on the finality of the underlying ledger, which may be non-deterministic.
Interoperability with DLT and non-DLT networks
Why interoperability matters
Wholesale market settlement takes place across multiple networks nationally and across borders. Interoperability between DLT platforms, RTGS systems, and other financial infrastructures is important to enable innovation, avoid fragmentation of liquidity and ensure robust functioning of the financial system.
Interoperability also plays a critical role in preserving trust in money. The ability to freely exchange different forms of money at par value is essential for maintaining monetary and financial stability. In a system with multiple ledgers and settlement assets, fragmented and isolated systems – ‘walled gardens’ – would increase operational complexity and financial stability risks. They could lead to trapped liquidity, create single points of failure and undermine confidence in the financial system.
Approaches to interoperability
Across the DLT Innovation Challenge, participants demonstrated a spectrum of interoperability models. These can be grouped into two broad categories, each with different approaches to solving key issues:
- DLT interoperability. This might be achieved through:
- Native DLT interoperability, where interoperability is built directly into a Layer 1 ledger; and
- Cross-chain protocols and bridges, where interoperability is provided through additional systems that connect separate ledgers.
- Non-DLT interoperability. This refers to interoperability between a DLT system and non-DLT systems. Approaches observed during the Challenge included:
- Oracles, which allow external data to be used within DLT smart contracts and applications;
- Middleware, which uses application programming interfaces (APIs) to connect DLT platforms with external systems; and
- RTGS connectivity, where oracles, middleware or synchronisation operators are used to co-ordinate settlement between an RTGS and DLT system.
1. DLT interoperability
Native DLT interoperability
Native DLT interoperability refers to the capability of distributed ledger platforms to enable secure, real-time communication and asset transfers directly between blockchains that operate under a shared protocol or governance framework.
In such architectures, interoperability is achieved at the protocol level, allowing for authenticated message passing, atomic transactions, and seamless coordination of activities across multiple subnets or shards within the same ecosystem. This can enable faster settlement across networks without the use of bridges or intermediaries, supporting atomic settlement with strong security guarantees. However, this approach is typically less flexible in terms of the range of external systems or networks with which it can interoperate.
Cross-chain protocols and bridges
Cross-chain protocols (CCP) are general-purpose frameworks designed to enable the transfer of assets and data across multiple distributed ledger networks. These protocols often use oracle-based or messaging architectures to facilitate interoperability, allowing for a wide range of asset types and transaction flows between otherwise independent blockchains.
Their extensible and asset-agnostic design can make them relatively straightforward to deploy and scale, supporting integration with new networks and asset classes with limited bespoke development. However, this approach introduces additional trust assumptions in external systems such as oracles, which is discussed further below in the context of non-DLT interoperability.
Bridges, by contrast, are typically chain-specific and implemented through mechanisms such as smart contracts and hashed time-locked contracts (HTLCs) to enable atomic asset swaps. This can provide an effective solution where asset-specific connectivity is required, but tends to reduce flexibility across chains, assets or message types, as each connection requires a dedicated bridge. Bridges can also be susceptible to security risks if the underlying smart contracts are not designed, implemented or governed appropriately.
Figure 3 – DLT to DLT interoperability methods
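The HTLC mechanism referred to above can be sketched in simplified form. The class, the party names and the integer timestamps are illustrative assumptions; a real bridge would also require on-chain escrow, signature verification and a matching contract on the counterparty chain:

```python
import hashlib

class HTLC:
    """Minimal hashed time-locked contract sketch: funds can be claimed
    with the hash preimage before the deadline, or refunded to the
    sender once the deadline has passed."""

    def __init__(self, sender, recipient, amount, hashlock, deadline):
        self.sender, self.recipient, self.amount = sender, recipient, amount
        self.hashlock, self.deadline = hashlock, deadline
        self.settled = False

    def claim(self, preimage, now):
        # The recipient reveals the secret; revealing it on one chain
        # lets the counterparty claim on the other, making the swap atomic.
        if self.settled or now >= self.deadline:
            return None
        if hashlib.sha256(preimage).hexdigest() != self.hashlock:
            return None
        self.settled = True
        return (self.recipient, self.amount)

    def refund(self, now):
        # If the secret is never revealed, the sender recovers the funds.
        if self.settled or now < self.deadline:
            return None
        self.settled = True
        return (self.sender, self.amount)
```

The timelock is what prevents partial settlement: if either leg of the swap fails to complete, both parties are refunded after their respective deadlines.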
2. Non-DLT interoperability
Participants demonstrated approaches for connecting DLT platforms to external systems, including RTGS. Common approaches relied on oracles to bring off-chain data into DLT environments and middleware to connect DLT platforms with external systems. In addition, direct integration with RTGS and current financial infrastructure represents a third type of non-DLT interoperability.
Oracles
Oracles are mechanisms that enable distributed ledger platforms to use data from external, off-chain sources by providing a trusted bridge between DLT and non-DLT infrastructure. Oracles attest to data cryptographically allowing it to be used in DLT smart contracts and applications. This reliance on oracles introduces additional trust assumptions, as the integrity of the DLT application becomes dependent on the accuracy and security of the oracle service, which can be subject to compromise, manipulation or governance risk.
Middleware
Middleware solutions act as integration layers that facilitate communication between DLT platforms and external systems, doing so through APIs and shared standards. Middleware can provide flexibility and the ability to abstract the complexity of underlying systems, supporting integration across different infrastructures.
The key distinction between the two is that oracles enable off-chain data to be cryptographically verified in a DLT system, while middleware solutions provide more extensibility around communication between DLT and non-DLT systems. Typically, both are used in conjunction to build interoperability solutions.
RTGS and legacy system integration
Direct integration with RTGS and conventional financial systems represents another approach to non-DLT interoperability. This involves connecting DLT platforms to established payment and settlement infrastructures through the techniques described above or using a synchronisation operator, which can leverage oracles and middleware as well as specific approaches set by the RTGS system to connect.footnote [5] This approach enables DLT-based transactions to be settled in central bank money and reconciled with existing account-based systems, supporting atomic settlement models.
Figure 4 – DLT to non-DLT interoperability methods
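The synchronisation pattern described above can be sketched as an earmark-then-release (two-phase) flow. The `Ledger` class, method names and leg format are illustrative assumptions for this sketch, not the interface of any actual RTGS synchronisation service:

```python
class Ledger:
    """Toy ledger supporting earmark / release / cancel, standing in for
    either an RTGS account ledger or a DLT asset ledger."""

    def __init__(self, balances):
        self.balances = dict(balances)
        self.earmarked = {}

    def earmark(self, leg):
        account, amount = leg
        if self.balances.get(account, 0) < amount:
            return False
        self.balances[account] -= amount   # reserve funds pending settlement
        self.earmarked[leg] = amount
        return True

    def release(self, leg):
        del self.earmarked[leg]            # earmarked funds settle

    def cancel(self, leg):
        account, amount = leg
        self.balances[account] += self.earmarked.pop(leg)  # undo the reservation

def synchronised_settlement(rtgs, dlt, cash_leg, asset_leg):
    """Synchronisation-operator sketch: earmark both legs, then release
    both only if both earmarks succeeded, so neither leg settles alone."""
    cash_ok = rtgs.earmark(cash_leg)
    asset_ok = dlt.earmark(asset_leg)
    if cash_ok and asset_ok:
        rtgs.release(cash_leg)
        dlt.release(asset_leg)
        return True
    if cash_ok:
        rtgs.cancel(cash_leg)
    if asset_ok:
        dlt.cancel(asset_leg)
    return False
```

The all-or-nothing structure is what allows a synchronisation operator to approximate atomicity across systems; the trust assumption is that the operator itself correctly executes the release or cancel step on both ledgers.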
Trade-offs and challenges
A critical insight from the Challenge is that no single interoperability model is universally optimal. We evaluated the approaches to interoperability through the lens of atomicity, flexibility and security.
Atomicity – the ability for a transaction across multiple systems to either fully succeed or fully fail – can be viewed along a spectrum. Native interoperability solutions embed atomicity at the ledger layer, as they are designed to interoperate directly within the protocol. By contrast, solutions that rely on smart contracts, oracle services or middleware shift the responsibility for achieving atomicity to the system providing interoperability.
Flexibility – the ability for an interoperable platform to connect across multiple types of assets and ledgers also varies across approaches. Some solutions optimise for asset interoperability and enable compatibility across ledgers, while others focus on messaging or orchestrating across ledgers, making them more generalisable across asset types and data.
Security – in this context – refers to the additional vulnerabilities introduced by interoperability solutions relative to the native Layer 1 security model. Native interoperability can inherit the security properties of the underlying ledger. In contrast, solutions that rely on smart contracts, such as bridges, or third-party components, such as oracles used in cross-chain protocols, introduce new attack vectors. Middleware-based approaches may also introduce security risks by concentrating coordination and control within a central entity.
Key insights and learnings
- There are limits to cross-chain composability across different design choices. Protocol-level interoperability within a single ledger environment can offer strong guarantees, composability and atomicity when transacting across chains. By contrast, cross-chain interoperability implemented through APIs or messaging may not always achieve the same level of composability across infrastructures, due to the absence of cryptographic assurance within the settlement ledgers.
- Interoperability approaches involve trade-offs between atomicity and the risks of reversibility or partial settlement. Mechanisms that guarantee atomic settlement are likely to be an important consideration for central banks, as partial settlement or reconciliation failures are unacceptable in critical payment systems. Solutions ensuring native interoperability with DLTs are generally more likely to support atomicity than synchronisation solutions, where trust is placed in a third-party component. However, synchronisation represents a useful approach for connectivity with DLTs that aligns with existing RTGS solutions, and with appropriate design to safeguard transaction completion between ledgers, it can mitigate atomicity risks.
- Further analysis is needed on the trade-offs between atomicity and fragmentation when enabling interoperability between DLT and non-DLT systems, including RTGS. The ability to integrate with existing FMIs, RTGS systems and messaging standards is likely to be important if such solutions are to function as financial market infrastructure. Middleware and orchestration layers might provide flexibility and resilience, while native integration may offer an alternative route to achieving greater atomicity and leveraging DLT functionality. Further consideration is therefore needed to balance interoperability approaches that prioritise atomicity and composability against those that seek to reduce fragmentation and liquidity segmentation across systems.
Network and asset control
Why network and asset control matters
Effective network and asset control underpins the security, resilience, and governance of applications built using DLT. Confidence is required that networks cannot be manipulated by malicious actors and that appropriate governance and resilience arrangements are in place. Asset control mechanisms can enable clear ownership, appropriate access and robust regulatory compliance.
The governance and operational design of DLT networks and assets are key in enabling systems to meet regulatory standards, apply policies and allocate roles and responsibilities. For central banks, these design choices are therefore an important consideration in balancing innovation with oversight and maintaining trust in money.
Background on network and asset control
Figure 5 – Network and asset control layers
Network and asset control solutions can be categorised according to the four architectural layers of a DLT stack:
- Application Layer – consists of smart contracts and transaction signing mechanisms, used to encode rules such as regulatory checks and compliance requirements. This layer embeds asset behaviour directly into contracts, leveraging token standards and permissioning.
- Consensus Layer – provides agreement on the ordering of transactions recorded on the ledger. Validators operate in accordance with the consensus protocol to arrive at a shared view of the ledger state. Some DLT networks restrict who can act as a validator, and these authorised validator nodes can enforce permissions during transaction execution.
- Network Layer – ensures secure connectivity and operational resilience across the distributed infrastructure. This includes controls over how nodes communicate with one another, how data is propagated across the network, and how the underlying infrastructure is governed.
- Data Layer – governs how data is shared on the ledger. DLTs are often designed as open networks, where ledger data is visible to participants. A range of data segregation techniques can be applied to support privacy, including sharding, private data lakes and data encryption mechanisms.
Additional execution layers (both on and off chain), such as Layer 2 arrangements, may be used to off-load transaction processing from the Layer 1 ledger. In this context, the Layer 2 can enforce permissioning and apply controls over the transactions submitted for settlement.
Alongside the above, off-chain controls may also be used, such as governance councils responsible for decisions relating to network operations and maintenance. Off-chain governance refers to processes that are not recorded on the ledger or automated through smart contracts. Decisions taken through these arrangements can influence multiple aspects of DLT operation and may span all four layers of the DLT stack, as shown in the diagram. Because off-chain arrangements are not bound by the infrastructure, they can be subject to change or manipulation over time.
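By way of illustration, application-layer asset control of the kind described in this section can be sketched as a token whose transfer logic itself enforces an allow-list. The class and method names are hypothetical and the sketch omits the signature checks and token standards a deployed contract would use:

```python
class PermissionedToken:
    """Sketch of application-layer asset control: compliance rules
    (here, a KYC allow-list) embedded directly in the token contract."""

    def __init__(self, allow_list):
        self.allow_list = set(allow_list)
        self.balances = {}

    def mint(self, account, amount):
        if account not in self.allow_list:
            raise PermissionError("account not KYC-approved")
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender, recipient, amount):
        # The compliance check runs on every transfer, so controls travel
        # with the asset rather than depending on the underlying network.
        if recipient not in self.allow_list:
            raise PermissionError("recipient not KYC-approved")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
```

Because the rules live in the contract itself, this pattern works even on an open network where the consensus and network layers cannot be controlled directly.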
Firms’ approaches to network and asset control
Across the DLT Innovation Challenge, three broad approaches to network and asset control emerged in the solutions presented:
- Public permissionless networks with centralised components.
- Permissioned networks.
- Middleware and chain-agnostic solutions.
Public permissionless networks with centralised components
This approach leveraged open, permissionless blockchains in order to benefit from strong network effects and broad participation, while enabling controls needed to meet compliance and operational requirements. Because the underlying ledger is decentralised, direct control over the consensus, network, and data layers is constrained.
Most firms implementing this model focused on the application layer, embedding Know-Your-Customer (KYC) rules, allow/block lists, token standards and compliance logic into smart contracts to support scalability and compliance objectives, subject to governance arrangements provided by entities such as foundations or, for permissionless networks, decentralised governance structures. At the consensus layer, some solutions also used transaction ordering mechanisms or restricted validator lists to exert a degree of control over network operation. Some solutions introduced additional execution environments, such as Layer 2 arrangements, which process transactions off the Layer 1 while anchoring security to the Layer 1’s consensus mechanism. This approach enables broader participation while retaining safeguards over asset issuance, redemption and compliance.
Under this approach, participants also proposed the use of off-chain overlays or governance frameworks to manage infrastructure participation, reflecting the limitations of on-chain permissioning in public networks.
Overall, this approach offers flexibility and enables the use of public networks. However, it also presents challenges, as embedding regulatory compliance depended on additional governance and trust arrangements.
Permissioned networks
Most participants favoured permissioned architectures, where access and validation rights are tightly embedded across all layers of the DLT stack. These networks typically feature permissioned validator sets, managed through governance councils or multi-signature arrangements (technical or governance mechanisms that enable certified parties to ratify governance decisions), ensuring that only authorised nodes participate at the consensus layer. Strict participation criteria relating to node infrastructure and uptime are enforced. Validator nodes are incentivised through a pre-determined share of transaction fees, while being required to meet defined infrastructure and availability standards. Validators enforce transaction-level rules and execution-level permissioning in accordance with the chosen consensus protocol.
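The multi-signature arrangements mentioned above can be thought of as a threshold-approval scheme: a governance action takes effect only once a quorum of certified signers has ratified it. The following Python sketch is illustrative only — the class name and the 2-of-3 threshold in the usage below are assumptions, not details drawn from any participant’s design.

```python
class MultiSigCouncil:
    """Toy model of multi-signature governance: a proposal is ratified
    only once a threshold of certified signers has approved it."""

    def __init__(self, signers, threshold):
        self.signers = set(signers)   # certified council members
        self.threshold = threshold    # approvals required to ratify
        self.approvals = {}           # proposal -> set of approvers

    def approve(self, signer, proposal):
        if signer not in self.signers:
            raise PermissionError("not a certified signer")
        self.approvals.setdefault(proposal, set()).add(signer)

    def is_ratified(self, proposal):
        return len(self.approvals.get(proposal, set())) >= self.threshold
```

In practice the same pattern appears both on-chain (multi-signature smart contracts gating validator-set changes) and off-chain (council votes), with the threshold chosen to balance agility against concentration of control.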
At the application layer, participants implemented role-based access controls and embedded compliance checks directly into smart contracts, leveraging token standards to support interoperability and control over asset behaviour. At the network layer, strict permissioning was applied through address allow-lists and certificates, complemented by governance standards relating to uptime, security, and geographic diversity.
For the data layer, where the networks were public, privacy controls included encryption, selective disclosure, and audit capabilities. Some solutions segregated sensitive data using partitioned channels or off-chain storage. Off-chain governance was a critical component of these models, with councils providing decision-making authority over network operations and maintenance. Some participants proposed hybrid models, allowing private networks to interoperate with public chains using a common technology stack, while others developed bespoke DLT platforms optimised for speed, compliance, and fee predictability.
Overall, this approach enabled regulatory compliance and operational resilience but may also introduce governance concentration risks.
Middleware and chain-agnostic solutions
Middleware providers and oracle networks approached network and asset control by abstracting governance and compliance mechanisms away from the underlying ledger. These solutions offer modular, blockchain-agnostic services that can operate across multiple platforms and can be integrated at different layers of the stack. This can help reduce vendor lock-in and support interoperability across systems.
At the application layer, controls were enforced through orchestration layers that manage identity, compliance and asset lifecycle policies without embedding these directly into smart contracts. For the consensus layer, validator governance and execution permissioning were implemented indirectly through middleware policies, allowing participants to manage validator sets across different chains.
At the network layer, interoperability frameworks were used to govern connectivity between networks, supporting secure message and asset exchange and promoting infrastructure diversity across cloud environments and nodes operations. At the data layer, privacy and auditability were supported through external storage solutions, zero-knowledge proofs, and integration with legacy systems. Off-chain components were central to this approach, with middleware acting as the control plane for compliance, key management, and cross-chain asset control.
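The orchestration-layer pattern described above can be sketched as a policy engine sitting in front of interchangeable ledger adapters: compliance is evaluated once, off-chain, and only approved instructions are dispatched to the target chain. The Python sketch below is a hypothetical illustration of the control flow; all names (`PolicyEngine`, `register_adapter`) are assumptions.

```python
class PolicyEngine:
    """Toy middleware control plane: evaluates compliance policies
    off-chain, then dispatches approved instructions to a
    chain-specific adapter."""

    def __init__(self, policies):
        self.policies = policies   # list of callables: request -> bool
        self.adapters = {}         # chain name -> submission function

    def register_adapter(self, chain, submit_fn):
        self.adapters[chain] = submit_fn

    def submit(self, chain, request):
        # The same policy set applies regardless of the target ledger,
        # which is what makes the approach chain-agnostic.
        if not all(policy(request) for policy in self.policies):
            raise PermissionError("request rejected by compliance policy")
        return self.adapters[chain](request)
```

Because the policies live in the middleware rather than in any one chain’s smart contracts, adding support for a new ledger means writing a new adapter, not re-implementing the compliance logic — at the cost of the additional trust assumptions noted below.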
While this model enhances flexibility and scalability, it also introduces additional trust assumptions and integration complexity, requiring robust design and governance arrangements to mitigate associated risks.
Key insights and lessons
- There is no single model for controlling DLT networks and assets. The optimal configuration of network and asset control depends on the underlying DLT implementation (ie public or permissioned), which determines where in the DLT stack control mechanisms can be applied and how control can be exercised.
- Governance structures are critical to effective control of DLT networks and assets. Whether implemented through governance councils or multi-signature arrangements, robust governance is essential to prevent excessive concentration of power and to ensure accountability. In contrast, decentralised governance structures, especially in open permissionless systems, may raise challenges due to the absence of clear accountability, and if they are off-chain or opaque they can change over time.
- Layered control arrangements can enhance resilience. Combining technical, operational and organisational controls, such as validator selection, key management, and compliance checks, can help achieve the desired level of resilience, security and privacy while meeting relevant design requirements.
Scalability
Why scalability matters
Scalability, or the ability to process large transaction volumes with low latency, is essential for real-world adoption of electronic payment systems. It can also have implications for operational resilience and financial stability, particularly where payment systems are systemically important.
Scalability emerged as a central theme throughout the Challenge, reflecting a common constraint for DLT-based solutions.footnote [6] This challenge is often described through the ‘blockchain trilemma’, which highlights the inherent trade-offs between scalability, decentralisation, and security. Participants demonstrated a range of design choices aimed at addressing scalability within these constraints.
The blockchain trilemma
The ‘blockchain trilemma’ posits that it is difficult to optimise scalability, security, and decentralisation simultaneously. For example, the original Ethereum design, based on Proof-of-Work (PoW), focused on decentralisation and security at the expense of scalability. This approach allows broad participation and robust security, yet limits the network’s capacity to process economic activity at scale.
The transition to Proof-of-Stake (PoS) increased scalability, but reduced validator decentralisation by requiring prospective validators to stake Ethereum’s native token (Ether) in order to participate in the network.
Approaches to scalability
During the Challenge, solutions fell into two broad categories, each with distinct approaches.
A. On-chain scaling
On-chain scaling enhances the performance of the Layer 1 through innovative consensus mechanisms, validator set design, and transaction propagation. These approaches address the blockchain trilemma by making design trade-offs between scalability, decentralisation and security.
- Native scaling: Optimising Layer 1 performance through purpose-built consensus mechanisms and validator architecture.
- Horizontal scaling: Distributing transaction processing across multiple subnets or network-of-networks, often combined with advanced consensus and validator architecture to improve performance.
B. Off-chain scaling
Off-chain scaling shifts complexity away from the main ledger through the use of Layer 2 solutions, orchestration layers, oracle networks and synchronisation mechanisms.
- Vertical scaling (Layer 2): Layer 2 arrangements execute and batch transactions off-chain, while anchoring settlement finality back to the Layer 1 ledger.
- Pure off-chain orchestration: Transaction coordination and compliance enforcement are managed by external systems, such as synchronisation layers or oracle networks.
Many solutions employ hybrid approaches, combining on-chain scaling techniques for consensus and network design with off-chain elements such as vertical scaling and sharding.
Figure 6 – Scaling approaches for DLTs
Scaling models and firm implementations
The Challenge surfaced four main scaling models, each illustrated by solutions presented by participants:
1. Native scaling
This approach involves improving the Layer 1 ledger, typically through innovation in consensus mechanisms and validator optimisation, to deliver higher throughput and deterministic finality. As discussed in the settlement finality section, participants demonstrated that this can be achieved by reducing the size of the validator set, increasing the efficiency of transaction propagation across the network, and/or implementing deterministic consensus mechanisms. These techniques aim to increase transaction capacity or reduce the time required to reach deterministic consensus.
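The trade-off between validator-set size and security can be made concrete with standard Byzantine fault tolerance arithmetic: under classical BFT assumptions, a network of n validators tolerates at most f = (n − 1) // 3 faulty validators and needs n − f votes (at least 2f + 1) to commit. The sketch below is a worked illustration of that arithmetic, not a description of any particular protocol presented in the Challenge.

```python
def bft_limits(n):
    """For a classical BFT protocol with n validators, return the
    maximum tolerable number of faulty validators and the quorum
    (vote count) needed to commit: f = (n - 1) // 3, quorum = n - f."""
    f = (n - 1) // 3
    return f, n - f

# Shrinking the validator set speeds up consensus rounds (fewer
# messages, faster quorums) but lowers the absolute number of
# faulty or malicious validators the network can survive.
for n in (100, 21, 4):
    f, quorum = bft_limits(n)
    print(f"n={n}: tolerates {f} faulty, quorum of {quorum}")
```

For example, moving from 100 validators to 4 cuts the tolerable fault count from 33 to 1 — the security cost of the speed gain.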
2. Horizontal scaling
Under this approach, transaction processing is distributed across multiple subnets or interconnected networks, reducing congestion and improving throughput. These solutions use multiple networks to scale processing, with specific subnetworks configured and optimised for particular applications or throughput requirements, while remaining interoperable with other subnetworks. This allows for greater customisation in deployment and use-case design, but also introduces additional complexity in how networks interact and co-ordinate.
3. Vertical scaling
In vertical scaling models, transaction execution and batching occur off chain, with settlement finality anchored to the Layer 1 ledger. This preserves Layer 1 as the ultimate source of settlement while improving throughput. Participants demonstrated solutions using zero-knowledge roll-ups with centralised components to support compliance and security requirements. These roll-ups enable faster transaction processing, as the proof submitted to the Layer 1 for each batch is deterministic. While this approach helps scale Layer 1 capacity, the speed at which finality is achieved remains constrained by the performance of the underlying ledger.
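The batching pattern can be illustrated schematically: transactions are executed off-chain, bundled, and only one compact commitment per batch is posted to the Layer 1. In the toy Python model below, a hash commitment stands in for the validity proof — a real ZK roll-up would post a succinct zero-knowledge proof, which this sketch does not construct — and the class name `RollupSequencer` is an assumption for illustration.

```python
import hashlib
import json

class RollupSequencer:
    """Toy model of Layer 2 batching: execute transfers off-chain,
    then post one compact commitment per batch to the Layer 1."""

    def __init__(self):
        self.pending = []     # off-chain transaction queue
        self.l1_commits = []  # what actually lands on the Layer 1

    def execute(self, tx):
        # Processed immediately off-chain, but not yet final.
        self.pending.append(tx)

    def post_batch(self):
        # A real ZK roll-up would post a succinct validity proof here;
        # a hash commitment over the batch stands in for it.
        batch = json.dumps(self.pending, sort_keys=True).encode()
        commitment = hashlib.sha256(batch).hexdigest()
        self.l1_commits.append(commitment)
        count, self.pending = len(self.pending), []
        return commitment, count
```

Note that however many transactions a batch contains, finality for all of them arrives only once the Layer 1 confirms the single commitment — which is why throughput scales while time-to-finality remains bounded by the underlying ledger.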
4. Pure off-chain scaling (middleware and orchestration)
In this model, transaction orchestration and compliance enforcement are managed off-chain, prioritising flexibility and interoperability. Participants demonstrated approaches that act as messaging layers between different ledgers, using oracle networks to validate information across chains. Other solutions employed ‘synchronisation’ components between ledgers, operated by a central entity. These approaches can support scalability by allowing a third party to manage transaction validity across ledgers, providing flexibility in deployment. However, they introduce additional trust assumptions and governance layers in the entity co-ordinating settlement, and remain dependent on the performance of the ledger on which assets ultimately settle.
Key insights and lessons
- Trade-offs in scaling are unavoidable and have a direct impact on settlement finality. Native Layer 1 scaling, using techniques to increase speed and transaction capacity; and horizontal scaling, which distributes transaction processing across networks, can improve performance and security but can also add operational complexity to Layer 1 architecture. Vertical scaling preserves Layer 1 as the settlement ledger whilst improving throughput, but introduces operational overhead and remains constrained by Layer 1 performance, which can delay finality. Pure off-chain solutions can support scalability by moving transactions away from the settlement ledger; however, they introduce trust assumptions and coordination challenges across ledgers, while remaining dependent on the settlement performance of the underlying Layer 1.
- On-chain and off-chain scaling are not binary choices. Hybrid approaches are often required to address the trade-offs inherent in scaling. Solutions may combine Layer 1 optimisation with Layer 2 roll-ups and orchestration models to support specific use-cases, depending on the design constraints and objectives being prioritised.
Cross-cutting Issues
Unlike the technical findings associated with specific solutions, these issues cut across all approaches and will influence how distributed ledger technology might be adopted for wholesale settlement.
- Governance and oversight: Decisions about who operates and oversees DLT networks, and how responsibilities are allocated, remain central considerations. Effective governance is required to balance innovation with accountability, as is ensuring that such governance remains robust over time.
- Operational resilience: Ensuring that systems are robust to technical and operational failures is a common concern across all approaches, particularly as DLT networks become more interconnected.
- Privacy and compliance: Achieving an appropriate balance between protecting sensitive information and meeting regulatory requirements presents a challenge for all solutions.
- Integration with existing systems: Connecting new DLT platforms with legacy payment and settlement infrastructure presents both technical and operational challenges.
- Efficiency and innovation: The novel features of DLT – such as shared consensus, programmability and composability – may enable improved efficiency, coordination and innovation in wholesale settlement. The Bank of England’s work spans a structured programme of initiatives across experimentation and regulation – including the Digital Securities Sandbox and the Synchronisation Lab – which will continue to improve our understanding of the benefits and limitations of these technologies.
These cross-cutting issues are not unique to any single technology or participant. They represent shared considerations that will shape future work, policy development, and collaboration in the area of DLT and blockchain innovation.
Conclusions
The DLT Innovation Challenge has provided a practical and intentionally broad look at how distributed ledger technology could be applied to wholesale money issuance and settlement. The range of solutions and approaches presented by participants indicates that technical progress is being made, with different approaches demonstrating secure, scalable, and interoperable settlement on DLT-based systems.
At the same time, the Challenge has highlighted that broader considerations – including governance and operational resilience – remain central to future adoption. While there is no single path forward, Box A describes how the insights set out in this report contribute to the Bank’s work both on how to support settlement of tokenised wholesale transactions in central bank money (through synchronisation, and potentially through tokenisation of central bank money itself) and on our policy work around use of public permissionless ledgers by trading venues and settlement systems for tokenised assets (which are soon to launch in the Bank and FCA’s Digital Securities Sandbox) and systemic stablecoins (for which the Bank is designing a regulatory regime), as well as how banks should manage their exposures to tokenised assets using public permissionless ledgers.footnote [7]
As the sector continues to evolve, ongoing collaboration and open-minded exploration will be important in shaping how these technologies might be applied in practice.
Box A: The impact of the DLT Innovation Challenge on our approach to innovation in money and payments
The DLT Innovation Challenge was not intended to reach policy conclusions. However, the technical design choices and trade‑offs observed across settlement finality, interoperability, scalability and network controls offer useful signals for how different approaches could shape the Bank’s wider programme of work on synchronisation, wholesale central bank money and public permissionless ledgers. This box highlights where the Challenge provides insights that inform future analysis and experimentation.
Synchronisationfootnote [8]
The interoperability findings highlight that native integration with DLT ledgers supports atomicity, security and settlement assurance by leveraging the features of a DLT. Where interoperability is embedded at the ledger or protocol layer, cryptographic or co-ordination guarantees can be used to enable effective settlement directly across systems.
Synchronisation approaches could be particularly useful for enabling reliable interactions between DLT platforms and non-DLT ledgers, including RTGS services. Synchronisation approaches can flexibly serve different types of assets and ledgers, by leveraging a central authority with technical controls to enable atomic settlement across a range of solutions. Our Challenge findings support this and suggest there may also be different native interoperability approaches that can leverage the advantages of a DLT’s architecture.
Wholesale central bank money
The Challenge has demonstrated best-practice approaches to using DLT to tokenise money and assets to meet the requirements of financial market participants and public policy objectives.
It has demonstrated how native issuance of assets on a DLT can effectively enable atomic settlement, leverage DLT security and resilience benefits, achieve asset control and avoid reliance on centralised intermediaries. At the same time, this approach could reduce flexibility across systems and lead to liquidity fragmentation across ledgers.
These trade-offs highlight the importance of further work on issuance models, and downstream impacts on how these assets interact and interoperate with a broader ecosystem.
Public Permissionless Ledgers
One clear insight from the Challenge is that faster and more deterministic settlement finality is often achieved by reducing decentralisation and concentrating control within networks. This raises important questions for the use of more decentralised public permissionless ledgers in wholesale contexts, where probabilistic settlement finality remains common.
The Challenge also found that asset‑level controls can be effectively implemented on public permissionless ledgers, particularly at the application and execution layers. However, governance and resilience arrangements in these environments often rely on off‑chain processes, limiting the ability to enforce accountability and operational assurance through the ledger itself. These findings suggest that future work on public permissionless ledgers will need to explore how probabilistic finality models, governance structures and control mechanisms could operate in practice if such networks were to support systemic settlement activity.
Acknowledgements
Thanks go to the participants in the DLT Innovation Challenge for the time they took to present and discuss their solutions; to the authors, Akanksha Dixit, Amit Sagar, Cindy Ramdoyal, David Parrott, Marima Cope and Prem Munday; and to all the Bank of England and Bank for International Settlements staff who participated in the Challenge and reviewed the report.
Annex
Challenge structure
Deep dive sessions
The core of the Challenge comprised a series of four virtual ‘deep dive’ sessions for each participant. Each session focused on a key theme central to the problem statement:
- Settlement finality and security: This session explored how distributed ledger platforms can achieve secure, irreversible settlement of central bank money, focusing on the mechanisms that underpin transaction finality, prevent unauthorised asset creation, and ensure the integrity and auditability of completed transactions.
- Scalability: The scalability session examined the capacity of DLT solutions to process high transaction volumes with low latency, highlighting the technical and architectural strategies used to support efficient consensus, parallel processing, and economic viability for wholesale payments.
- Network and asset control: This session addressed the governance and operational controls available within DLT environments, considering how digital assets are managed, access is programmed, and compliance is enforced, while balancing decentralisation with regulatory and reconciliation requirements.
- Interoperability and RT standards: The interoperability and RT standards session focused on how DLT platforms can communicate and transact seamlessly with other financial systems, including legacy RTGS infrastructure, by adopting protocol compatibility, cross-network functionality, and alignment with industry messaging standards.
During each session, participants presented their solutions and engaged in detailed discussions with Bank and BISIH staff. Presentations were followed by Q&A, allowing for critical examination of technical approaches, operational models, and governance considerations. This format encouraged open exchange and practical demonstration of capabilities.
Showcase event
Following the deep dives, selected participants were invited to present at an in-person showcase event in London. This event provided an opportunity to share insights and innovations with a broader audience, including central bankers, payments professionals, and the wider financial sector.
Timeline
- Applications closed in late July 2025.
- Participants were notified by late August.
- Deep dive sessions were held throughout September and October.
- The showcase event took place in October 2025.
Focus on learning and experimentation
The Challenge was explicitly experimental and did not represent a policy trial or commitment to future implementation. Its structure was intended to isolate technological possibilities from policy constraints, enabling the Bank to build an evidence base and inform future work on wholesale settlement and digital money.
Glossary
Allow List
A documented list of specific elements that are allowed, per policy decision. This is commonly used in identity and security contexts to specify which entities (eg, relying parties, applications, IPs) are permitted to interact with a system. This concept has historically been known as a whitelist.
Application Programming Interface (API)
A system access point or library function that has a well-defined syntax and is accessible from application programs or user code to provide well-defined functionality.
Attestation Service
An attestation service is a process where a system (the attester) produces believable evidence about itself to allow a remote party to decide whether to trust it.
Atomic Settlement / Atomicity
The all‑or‑nothing completion of linked transfers across systems or ledgers. Either all legs settle or none do, eliminating partial‑settlement or reconciliation risk.
Authentication
Verifying the identity of a user, process, or device, often as a prerequisite to allowing access to resources in an information system.
Deny List
A documented list of specific elements that are blocked, per policy decision.
Composability
The ability of applications or assets to interoperate safely. Protocol‑level composability (within a single ledger system) provides strong guarantees; cross‑chain solutions lack these cryptographic assurances.
Consensus
A process to achieve agreement within a distributed system on the valid state.
Consensus Mechanism
The protocol by which validators reach agreement on the ledger state.
Deterministic Finality
A state in which a committed transaction cannot be reversed under the protocol’s assumptions, providing mathematical certainty. Distinguished from probabilistic finality (eg in some public blockchains) where a transaction only has a probability of being final at a given time.
Directed Acyclic Graph (DAG)
A directed graph that contains no directed cycles. It is composed of vertices connected by directed edges or arcs. Following the direction of these edges cannot lead back to the starting vertex; therefore, no closed directed loops exist. This enables transactions to be shared more effectively in the context of a DLT network.
Distributed Ledger Technology (DLT)
A consensus‑based data structure in which multiple participants maintain a synchronised, shared ledger. Used in the Challenge to explore settlement of central bank money on external programmable ledgers.
Fault tolerance
Fault tolerance in a DLT context refers to the ability of a consensus mechanism to operate and maintain consensus even when some validators or nodes are acting maliciously. The higher the fault tolerance, the greater the number of malicious nodes a DLT can handle.
Global State
The global state of a distributed system is the collective state of all the individual processes (eg nodes, channels, processes) and all the communication channels at a specific point in time.
Governing Council
A decision-making body responsible for overseeing, managing and setting strategic direction for an entity. It typically formulates policy and holds governing authority for effective administration.
Hashed Time‑Locked Contract (HTLC)
A conditional payment mechanism combining hash‑locks and time‑locks to ensure atomic cross‑chain swaps: the recipient must reveal a preimage to claim funds, or funds return to the sender after a timeout.
Layer 1
Primary or base layer network that independently validates, records, and secures transactions. Layer 1s define consensus and provide the native environment for smart contracts and decentralised applications.
Layer 2
A Layer 2 blockchain is a secondary framework or protocol built on top of a base Layer 1 to increase transaction speed, improve scalability, and reduce fees. Layer 2s process transactions off-chain, bundling them into batches before submitting them to the Layer 1, which retains ultimate security and finality.
Node
A blockchain node is any device (computer, server) connected to a blockchain network that runs the protocol software, validates transactions, stores a copy of the ledger, and communicates with other nodes to maintain the network's integrity, security, and decentralisation.
Off-Chain
Functions that are processed off a DLT, typically using a centralised system.
On-Chain
Functions that are directly processed through the cryptographic standards and recorded on a DLT.
Oracle
A mechanism enabling DLT systems to ingest off‑chain data securely and enable it to interact with DLT smart contracts.
Real‑Time Gross Settlement (RTGS)
The Bank of England’s settlement infrastructure that holds settlement accounts for participating institutions and enables final, risk‑free, real‑time settlement of sterling obligations.
Settlement Finality (Legal)
Defined in the UK’s Financial Markets and Insolvency (Settlement Finality) Regulations 1999. Ensures transfer orders in designated payment systems are final, irrevocable, and insulated from insolvency‑related reversal.
Settlement Finality (Technical)
The irreversible commitment of a transaction at the system/consensus layer of a distributed ledger, without reference to legal constructs. The Challenge focuses on this technical domain.
Smart Contract
A smart contract is a self-executing, automated digital agreement with terms directly written into code, stored on a blockchain.
Synchronisation
Synchronisation is defined as the orchestration of movements of central bank money between accounts held in the renewed RTGS service with the simultaneous transfer of assets or funds on one or more external, non-RTGS ledgers.
Token Standard
Token standards are a set of commonly agreed-upon technical rules, functions, and parameters within smart contracts that define how crypto tokens are created, issued, and managed on a blockchain, ensuring interoperability, security, and consistent behaviour.
Validator node
A validator node is a device (computer, server) in a blockchain network that actively verifies transactions, creates new blocks, and maintains consensus by voting, while keeping a copy of the ledger to help ensure network security.
Wholesale central bank money
A potential digital liability of a central bank used exclusively by eligible financial institutions for high‑value payments and securities settlement.
Zero-Knowledge Proof
A Zero-Knowledge Proof (ZKP) is a cryptographic method where one party (the prover) can convince another party (the verifier) that a statement is true, without revealing any information beyond the truth of the statement itself.
Zero-Knowledge Roll-up
A Zero-Knowledge Roll-up (ZK-rollup) is a Layer 2 scaling solution that increases transaction throughput by executing transactions off-chain and posting only cryptographic validity proofs back to the Layer 1. It bundles hundreds of transactions into a single batch and uses mathematical proofs to verify their validity.
Ava Labs, Chainlink and Aave Labs, Circle, Digital Asset and KPMG, Hedera, HSBC, Kaleido, Rayls, The Scottish Centre of Excellence in Digital Trust and DLT, with partners Siccar, Nethermind, and TrackGenesis.
PFMI Principle 8 states that FMIs should provide clear and certain final settlement, at a minimum by the end of the value day and, where necessary, intraday or in real time.
Reducing the number of validators can also reduce the fault tolerance of a network, impacting security.
In comparison, Visa can process 83,000 transaction messages per second. Additionally, the Bank has conducted in-house experimentation of centralised ledgers for payments that can sustain 100,000 transactions per second.
For example, the original Ethereum design with Proof-of-Work (PoW) had a transactions per second (TPS) of 15–30. This has increased to ~45 with the Proof-of-Stake (PoS) upgrade.
The Bank of England’s approach is that Synchronisation is the orchestration of movements of central bank money between the accounts held in RT2, with the transfer of assets or funds on one or more external asset ledgers. Synchronisation involves a two-stage ‘earmark’ and ‘release’ process. In the first stage, placing an earmark reserves assets on an asset ledger, or funds in the RT2 account. Once earmarked, funds and assets are ‘locked’ and cannot be used in other transactions. In the second stage, once conditions are met, releasing an earmark initiates the transfer of locked funds and assets to settle the transaction atomically.