The Unified Ledger
Since 2019, the Bank for International Settlements — the world’s central bank of central banks — has been running dozens of technology projects through its Innovation Hub. Each one tackles a different problem: cross-border payments, climate risk, fraud detection, digital currencies, regulatory compliance.
On the surface, they look like separate experiments being tested by different central banks in different countries. But they are all parts of the same system.
And they all connect to a single foundation — the unified ledger.
In 2023, the BIS published a blueprint for a single programmable platform on which all financial assets could be tokenised, all transactions recorded, and all compliance conditions written directly into the code. Agustín Carstens, the BIS General Manager, was candid about the purpose: the individual Innovation Hub projects needed ‘some way to bring them all together’1.
The unified ledger was that way.
A unified ledger is not a single database. It is a specification for a type of ledger that can hold anything — currencies, bonds, property titles, carbon credits, identity records — and connect to any other ledger built to the same specification. It is a standard upon which every central bank builds compatible infrastructure.
Today, every country runs its own payment systems, securities registries, and banking infrastructure. Moving money or assets between them requires intermediaries at every step — correspondent banks, clearinghouses, messaging systems, regulators. The unified ledger replaces this patchwork with a single layer where everything is visible, programmable, and subject to the same rules.
The BIS insists this is not ‘one ledger to rule them all’. They describe a collection of separate ledgers — each one handling specific assets or use cases — linked together through shared technical standards, ‘much like the internet’. In practice, the difference is technical, not functional. A bunch of databases forced to behave like one database for everything that matters.
Saying ‘multiple interoperable ledgers’ instead of ‘one ledger’ is the same move as saying ‘multipolar’ instead of ‘unipolar’. More nodes yet the same outcome, because they all align with the same set of rules. The architecture does not need to be in one place. It needs every place to clear against the same standards. One ledger or fifty — that is an engineering question. The question that mattered was answered the moment someone decided what the compatible standards would be. Everything else is implementation, and each one follows those very same standards.
Every Innovation Hub project is a branch growing from that root. The branches follow a logic: first the system defines its standards, then it checks every transaction against them, then it settles or refuses. Ethic in, financial outcome out.
The standard
Classification
The first step is turning a policy objective into something a machine can measure. Project Genesis2 built a prototype for green bonds on blockchain — debt instruments whose compliance with environmental criteria is verified automatically throughout their lifecycle. Project Viridis3 monitors climate-related financial risks using satellite data and AI. The NGFS climate scenarios4, developed by the Network for Greening the Financial System (housed at the BIS), translate climate targets into capital requirements that determine which assets banks can hold and at what cost.
Climate risk is the first standard loaded into the classification layer. The EU Taxonomy Regulation of 20205 stated explicitly in Article 26 that guidance on ‘other sustainability objectives, including social objectives, might be developed at a later stage’. The classification architecture was designed from the outset to accept any standard — environmental, social, governance, health, biodiversity — and apply it to the cost and availability of capital.
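The design claim, a classification layer that accepts any standard without being rebuilt, can be sketched as a pluggable rule registry. Everything below is a hypothetical illustration, not any actual taxonomy or BIS code:

```python
from typing import Callable

# A classification layer designed to accept any standard.
# All standards and asset fields below are invented for illustration.
Asset = dict
Standard = Callable[[Asset], bool]

REGISTRY: dict[str, Standard] = {}

def load_standard(name: str, rule: Standard) -> None:
    """Load a new standard; the layer itself never changes."""
    REGISTRY[name] = rule

def classify(asset: Asset) -> dict[str, bool]:
    """Score an asset against every loaded standard."""
    return {name: rule(asset) for name, rule in REGISTRY.items()}

# Climate is the first standard loaded...
load_standard("climate", lambda a: a.get("co2_intensity", 0) < 100)
# ...but social, governance, or health criteria slot in the same way.
load_standard("social", lambda a: a.get("labour_score", 0) >= 7)

asset = {"co2_intensity": 40, "labour_score": 5}
print(classify(asset))   # {'climate': True, 'social': False}
```

The point of the sketch is the shape: once the registry exists, extending it to a new objective is a one-line change, which is what Article 26 anticipates.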
The EU’s Carbon Border Adjustment Mechanism6 (CBAM), which began imposing levies on imports in January 2026, conditions cross-border trade on carbon compliance — the classification branch enforced at the national border before the digital ledger is even complete. And the moral architecture justifying total financial visibility has been developed over two decades by Thomas Piketty, whose body of work — a global asset registry, wealth and inheritance taxes coordinated across borders, central bank accounts for every citizen, individual carbon cards — maps precisely onto the layers described above.
Identity
The second step binds every participant to the system so they can be measured against the standard. The BIS’s Finternet paper7, published in 2024, proposed extending the unified ledger to retail transactions by anchoring it to digital identity. The model is India’s ‘India Stack’ — Aadhaar for digital ID8, the Unified Payments Interface for real-time payments9, and a data layer for verified document exchange. India brought 80 per cent of its population into the banking system in a decade using this model. It is now being exported to Nigeria, Kenya, Bangladesh, Ghana, Nepal, and Trinidad and Tobago10.
Without the identity layer, the system only reaches institutions — banks, corporations, governments. Include it, and the system reaches every individual. The Finternet paper proposed privacy tools to address the surveillance concern, but these only protect the data while it moves through the system. They do not address who decides what counts as compliant, and who writes the rules the system enforces.
Evaluating the transaction
Surveillance
Once the standards are set and every participant is bound to the system, the next step is seeing everything. Project Aurora11 uses machine learning to detect money laundering patterns across institutions and borders by mapping networks of behaviour rather than examining individual payments. Project Ellipse12 aggregates data from multiple sources to give supervisors a real-time view of financial risk. Project Rio13 integrates regulatory reporting so that central banks can monitor the system as a whole rather than collecting reports from individual banks.
These projects are presented as tools for fighting crime and managing risk. In practice, they build a continuous surveillance layer capable of seeing every transaction, tracing every flow, and flagging every anomaly across the entire ledger.
Compliance
Surveillance sees the transactions. Compliance makes them readable. Project Keystone14 builds analytics for ISO 2002215 — the dominant messaging standard. ISO 20022 replaces payment messages with structured data: legal entity identifiers, purpose codes, tax identification numbers, and links to specific invoices or contracts. Every transaction becomes machine-readable, classifiable, and auditable.
Without standardised data, the system cannot automatically check transactions against the taxonomy. With ISO 20022 running across the ledger, every payment carries enough information for the compliance layer to classify it, check it against whatever standard has been loaded, and approve or reject it in real time without human intervention. Project Mandala16 takes this further by encoding each legal jurisdiction’s regulatory requirements into a common framework.
Keystone makes the transaction readable. Mandala checks it against the rules.
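The flow that Keystone and Mandala imply can be sketched as a toy compliance check. The field names and rules below are simplified illustrations, not the real ISO 20022 schema or any actual BIS code:

```python
from dataclasses import dataclass

@dataclass
class Payment:
    # A few of the structured fields an ISO 20022 message can carry.
    # Names here are simplified stand-ins, not the real schema.
    lei: str            # legal entity identifier of the payer
    purpose_code: str   # what the payment is for
    amount: float
    jurisdiction: str

# A "loaded standard": whatever taxonomy the compliance layer is
# currently configured with. Entirely hypothetical rules.
BLOCKED_PURPOSES = {"COAL"}          # e.g. a climate taxonomy
SANCTIONED_LEIS = {"LEI-BAD-ACTOR"}  # e.g. a sanctions list

def clear(payment: Payment) -> bool:
    """Approve or refuse a payment against the loaded standard.

    No human intervention: the message is machine-readable, so the
    decision is a pure function of its fields and the current rules.
    """
    if payment.lei in SANCTIONED_LEIS:
        return False
    if payment.purpose_code in BLOCKED_PURPOSES:
        return False
    return True

print(clear(Payment("LEI-ACME", "SOLR", 1_000.0, "EU")))   # True
print(clear(Payment("LEI-ACME", "COAL", 1_000.0, "EU")))   # False
```

Swapping the rule sets is all it takes to load a different standard; the clearing function itself never changes.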
Settling or refusing
Settlement
Project mBridge17, developed with the central banks of China, Thailand, the UAE, and Hong Kong, settles cross-border payments in central bank digital currencies directly between participants. Project Nexus18 links domestic fast-payment systems across countries into a single network. Project Mariana19 tests automated foreign exchange settlement using decentralised finance protocols. Project Dunbar20 builds a shared platform where multiple central banks can issue and exchange digital currencies.
Together, these projects build the pipes that connect every national currency to the unified ledger. The underlying currency is irrelevant. Dollar, yuan, rouble, digital euro — all settle through the same infrastructure. Countries settling in yuan clear through the People’s Bank of China. Countries settling in roubles clear through the Russian central bank. Both are BIS members.
But currencies are only the beginning. Project Helvetia21 tested using digital central bank money to settle securities — stocks and bonds turned into digital tokens. Project Jura22 moved tokenised assets across the border between France and Switzerland. Project Agorá23 puts commercial bank deposits and central bank money on the same platform. Currencies, bonds, securities, bank deposits — all turned into tokens, all on one system, all subject to the same rules.
In October 2024 — the same week Russia proposed a ‘BRICS Bridge’ payment system at the Kazan summit — the BIS released mBridge to its partner central banks24. The BIS then turned its attention to Project Agorá — a new settlement platform built with seven Western central banks, including the Federal Reserve, the Bank of England, and the Bank of Japan.
The BIS built the Eastern system first, then started on the Western one. Agorá is years behind mBridge, which was already working by June 2024. One is operational, the other is still being tested — but both are built on the same design and check against the same rules. The architecture does not need either side to win — but the side that was supposed to be sanctioned got there first.
Actuation
Project Rosalind25 shows how all the layers converge at the moment of payment. Its three-party lock mechanism requires the central bank, the commercial bank, and a compliance condition to agree before a transaction settles. All three keys must turn. If the compliance condition is not met, the payment does not clear.
The ethic has been translated into a standard, the standard has been evaluated, and the payment refuses to clear if the evaluation fails.
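The three-party lock can be reduced to a sketch in which each party is a boolean check and settlement requires all three keys to turn. Nothing below reflects Rosalind’s actual implementation; it only shows the shape of the mechanism:

```python
from typing import Callable

# Each "key" is a predicate over the transaction. These stand-ins
# are hypothetical simplifications of the three-party lock.
def central_bank_key(tx: dict) -> bool:
    # The money itself is valid and within issuance limits.
    return tx["amount"] <= tx["issuer_limit"]

def commercial_bank_key(tx: dict) -> bool:
    # The payer actually has the funds.
    return tx["amount"] <= tx["account_balance"]

def compliance_key(tx: dict) -> bool:
    # The loaded standard is met.
    return tx["purpose"] not in {"restricted"}

def settle(tx: dict) -> bool:
    """All three keys must turn; any single refusal blocks settlement."""
    keys: list[Callable[[dict], bool]] = [
        central_bank_key, commercial_bank_key, compliance_key,
    ]
    return all(key(tx) for key in keys)

tx = {"amount": 50, "issuer_limit": 100,
      "account_balance": 80, "purpose": "groceries"}
print(settle(tx))                               # True: all keys turn
print(settle({**tx, "purpose": "restricted"}))  # False: compliance refuses
```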
Connecting the countries
National onboarding
The final branch is not a technology project. It is how countries get connected to the system. IMF loans increasingly come with requirements for digital infrastructure. World Bank development funding pays for digital identity systems. Trade agreements require alignment with international standards. CBDC pilots, sold as modernisation, connect national currencies to the settlement layer. Crisis lending — the same ‘common fund for countries in transition’ that Van Zeeland proposed in 1938 — attaches conditions that persist long after the crisis ends.
Each country enters the system through a different door — crisis, development, trade, or voluntary adoption. But every door leads to the same system.
The unified ledger is the platform all of these projects are building towards. Settlement connects the pipes. Surveillance provides visibility. Classification translates policy objectives into machine-readable standards. Identity binds the individual. Compliance provides the grammar. Actuation gates each transaction. National onboarding connects the countries.
The rules the system checks against are set by whoever decides what the next standard will be — outside democratic legislation, outside national parliaments, and soon, outside human oversight entirely.
The mechanism operates in five steps.
The ethic is the input — climate, social justice, financial stability, public health.
The standard is the translation — the specific metric, threshold, or classification that turns the ethic into something a machine can check.
The clearing function evaluates the transaction against that standard.
Settlement is where behaviour is forced — the transaction either executes or it doesn’t.
The outcome is compliance, not because the participant agrees with the ethic, but because the payment will not go through without it.
The clearinghouse merely applies the given standard. The power is vested in whoever translates the abstract ethic into a standard.
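The five steps compose into a single pipeline. A minimal sketch, with every function, field, and threshold a hypothetical stand-in:

```python
# The five-step mechanism as a pipeline. Every name below is an
# invented stand-in; the point is the shape, not the content.

def translate(ethic: str) -> float:
    """Step 2: the translation. Whoever writes this function holds
    the real power: the ethic becomes a machine-checkable threshold."""
    thresholds = {"climate": 100.0}   # e.g. max grams of CO2 per unit
    return thresholds[ethic]

def evaluate(tx: dict, threshold: float) -> bool:
    """Step 3: the clearing function merely applies the standard."""
    return tx["co2_grams"] <= threshold

def settle(tx: dict, approved: bool) -> str:
    """Step 4: behaviour is forced; the transaction executes or it doesn't."""
    return "settled" if approved else "refused"

ethic = "climate"                             # step 1: the input
standard = translate(ethic)                   # step 2: the translation
tx = {"co2_grams": 250.0}
outcome = settle(tx, evaluate(tx, standard))  # steps 3 and 4
print(outcome)   # "refused" -- step 5: compliance is forced, not agreed to
```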
In April 2016 — three years before the BIS Innovation Hub came to be — Jeffrey Epstein sent Joi Ito, head of MIT’s Media Lab, a draft paper titled ‘Reinventing Bookkeeping and Accounting’26. The paper proposed replacing the 700-year-old system of double-entry bookkeeping with algorithmically computable accounts:
There is no reason that every entry in our books needs to be a number. The cells could be an algorithmic representation of the obligations and dependencies that it represents.
It described making every contract, every obligation, and every financial dependency visible and computable in real time — not for one bank, but for ‘the whole system of banks, investors, and everything that is interacting financially’. It named the Financial Stability Board as the body that would run stress tests on the system, and connected the concept to blockchain and cryptography. Epstein’s key addition was the locality of money: currency that can only be spent in designated locations.
That is the unified ledger — described in a private email, by a man whose network inherited the intelligence function Robert Maxwell had operated through Pergamon Press, sent to the head of the institution whose Digital Currency Initiative went on to partner with the Federal Reserve Bank of Boston on Project Hamilton27 — the US central bank digital currency prototype.
The pattern has been running for over a century. Gold was replaced by a claim on gold held at a central node. Currencies were replaced by claims clearing through the dollar. Physical stock certificates were replaced by book entries at central depositories. In each case, the tangible asset migrated to the intermediary and the holder was left with a receipt whose validity depends entirely on the node honouring it.
The unified ledger completes the migration: every asset — currency, securities, property, carbon credits — tokenised on a single platform, with ownership reduced to a conditional book entry that clears against whatever standard is loaded.
In practice, the unified ledger operates as a continuous chain of instructions, each one conditional on compliance. If a transaction clears, the next instruction treats that outcome as settled fact and builds on it.
If, however, a transaction is settled incorrectly — approved when it should have been refused, or refused when it should have been approved — every instruction that followed, and used that outcome as its starting point, is already wrong. But those downstream transactions have already been used as inputs for others. Within seconds, thousands of subsequent transactions can build on an incorrect settlement.
Unwinding one thus means unwinding everything that came after it, across every participant who relied on it. The system cannot do this, because those other participants cleared legitimately, and it would stall the economy.
The result is that upon execution, everything becomes permanent in practice — not because it cannot theoretically be reversed, but because the cost of reversing it falls on everyone except the person who imposed it. This, perhaps, is not a major problem when it’s a matter of something fungible, like money. But when the tokenised asset is your home, it’s a somewhat different matter. The settlement is final, and your home now belongs to someone else.
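The cascade can be illustrated with a toy dependency graph: reversing one settlement means walking every transaction that transitively used it as input. The IDs and edges below are invented:

```python
from collections import defaultdict, deque

# Toy settlement graph: each transaction lists the prior settlements
# it treated as settled fact. Entirely invented for illustration.
depends_on = {
    "tx2": ["tx1"],
    "tx3": ["tx1"],
    "tx4": ["tx2", "tx3"],
    "tx5": ["tx4"],
}

def downstream(bad_tx: str) -> set[str]:
    """Everything that must be unwound if `bad_tx` settled incorrectly."""
    # Invert the dependency edges, then walk forward from the bad settlement.
    used_by = defaultdict(list)
    for tx, inputs in depends_on.items():
        for inp in inputs:
            used_by[inp].append(tx)
    tainted, queue = set(), deque([bad_tx])
    while queue:
        tx = queue.popleft()
        for dependent in used_by[tx]:
            if dependent not in tainted:
                tainted.add(dependent)
                queue.append(dependent)
    return tainted

print(sorted(downstream("tx1")))   # ['tx2', 'tx3', 'tx4', 'tx5']
```

Five transactions make the walk trivial; at the scale of a live ledger, the tainted set grows faster than any institution could unwind it, which is why finality wins in practice.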
‘Own nothing, control everything’ is an architectural description. And once ownership is a conditional entry on a programmable ledger, it can be revoked in microseconds, with no realistic possibility of appeal.
The complexity of this, at scale, remains mind-boggling. The original development and implementation of the internet pales by comparison, and anyone who remembers will recall years of debugging and performance issues, especially during the 1990s. A key difference here is the control aspect of a system that seeks to “clear” everything against a complex and evolving set of criteria. The internet never attempted this. It was, and is, an information highway, concerned with connectivity, speed, and throughput, not with algorithmic restrictions and unlimited reach.
The question is not what the technology can do. The question is the development horizon for something so massive. The Achilles heel of every large IT project ever attempted has been twofold: 1) incomplete documentation (and understanding!) of system requirements at the outset on the part of the coders, and 2) the inability to maintain project schedules due to the disruptive nature of changing and evolving requirements and project scope (obviously related to problem #1). These issues have caused very many IT projects to either fail or be significantly descoped. I was very close to many such projects in a 32-year career in aerospace; these problems were endemic to all such endeavors, regardless of industry or technology. The time factor cannot be overstated, and is the reason for the old maxim of complex program and project management: “Everything is overcome by events”. Any project that resembles laying the track in front of the train ends up, in the end, bearing little or no resemblance to the original vision. The destination becomes unknown and unknowable.
I believe the visionary billionaires and power apparatchiks who have envisioned this evil scheme have little to no understanding of the above. They are accustomed to a world where money seems to move mountains. This is not like building a data center. Replacing the entire global financial system with something that actually works by 2030 is simply unachievable on that time scale. What may be achievable is destroying the existing system.
Some imagine this trajectory continuing indefinitely — a world of total platform control over human economic behaviour, every transaction monitored, every contribution priced, every exchange mediated by a handful of centralised systems. It is a vision that should be taken seriously as an intention. It should not be taken seriously as a possibility.
It fails on three independent grounds:
The Social Logic
Distributed systems generate workarounds. People route around control when control becomes intolerable. The history of the internet — and of every attempt to enclose a commons — demonstrates this consistently. You cannot enclose what is already everywhere.
The Technical Reality
Large-scale centralised IT projects fail. Not occasionally, not exceptionally — structurally, repeatedly, at cost. The complexity required for total coordination exceeds what centralised systems can reliably manage. The brittleness is intrinsic, not incidental.
The Physics
The energy required to run centralised data processing at the scale envisaged for total economic surveillance and control does not exist — and will not exist. The projections for AI infrastructure alone are already straining power grids. Distributed, localised processing is not merely ideologically preferable. It is thermodynamically inevitable. The physics points in the same direction as the social logic.
https://www.outersite.org/the-exit-is-already-here/