At Sibos 2025 in Frankfurt, tokenisation wasn’t just a talking point — it was the spine of the agenda. From central bank announcements to infrastructure upgrades, the week marked a decisive shift in how digital assets are being treated: not as speculative instruments, but as the future plumbing of global finance. Yet as the sector rallies around blockchain-based settlement, programmable money, and tokenised deposits, a more sobering question looms.
Can tokenisation and digital assets truly scale? And not just technically, but operationally, institutionally and globally?
Swift, the organiser of the event, set the tone early, announcing the integration of a native blockchain ledger into its core infrastructure and placing digital asset interoperability at the heart of its Sibos programming.
In the opening plenary, Bundesbank President Joachim Nagel followed with a bold declaration that tokenisation and the digital euro would “redraw the financial map,” signalling Germany’s intent to lead on wholesale digital transformation. The European Central Bank then confirmed a 2026 rollout for blockchain-based settlement, marking a decisive shift in how central bank money will interact with tokenised assets.
Yet beneath these headline announcements, a deeper conversation was taking shape — one focused on the operational realities of tokenisation. Moderating a panel on digital asset standards, Thomas Dugauquier, tokenised assets product lead at Swift, distilled the challenge into a single line: “An asset is only useful if it’s liquid.” That liquidity, he argued, depends on interoperability. And interoperability, in turn, depends on shared definitions and standardised behaviour.
Dugauquier pointed out that while on-chain protocols like ERC-20 (the technical standard for fungible tokens) offer basic interfaces, they often lack a clear taxonomy of the asset itself. Without that taxonomy — without a common language for what an asset is, how it behaves, and how it should be treated across systems — digital instruments risk becoming isolated experiments.
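To make that gap concrete, consider a rough sketch (in TypeScript, with hypothetical names) of what an ERC-20-style interface does and does not say about an asset: it defines how tokens are counted and moved, but nothing about what the instrument actually is or how other systems should treat it.

```typescript
// Simplified view of the functions an ERC-20 token exposes on-chain:
// enough to move and count units, nothing more.
interface ERC20 {
  totalSupply(): bigint;
  balanceOf(owner: string): bigint;
  transfer(to: string, amount: bigint): boolean;
  approve(spender: string, amount: bigint): boolean;
  transferFrom(from: string, to: string, amount: bigint): boolean;
}

// A hypothetical taxonomy record — the “common language” the interface lacks:
// what the asset is, where it is governed, and how it behaves.
interface AssetTaxonomy {
  assetClass: "bond" | "equity" | "deposit" | "fund" | "other";
  isin?: string;                    // off-chain identifier, if one exists
  jurisdiction: string;             // governing law / issuing jurisdiction
  transferRestrictions: string[];   // e.g. whitelisting, lock-up periods
  incomeProfile?: { type: "coupon" | "dividend"; frequencyMonths: number };
}
```

Two systems can both “speak ERC-20” and still disagree about whether they are handling a bond, a deposit or a fund unit; that is the taxonomy gap Swift wants to close.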
Swift’s response is to define assets in a technology-agnostic way, combining ISO 20022 syntax with modular methodologies that codify functions like coupon calculations, fractional ownership, and cost management. “It’s not about slowing down innovation,” Dugauquier said. “It’s about enabling it.”
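What that codification might look like is easier to see in a small example. The sketch below is purely illustrative, not Swift’s specification: it pins down one behaviour, the coupon a tokenised bond pays per unit, in a form that does not depend on which ledger the token lives on.

```typescript
// Illustrative only: a technology-agnostic asset definition with one
// codified behaviour (coupon calculation) and fractional ownership.
interface TokenisedBond {
  faceValue: number;        // per bond, in the bond's currency
  currency: string;         // ISO 4217 code, e.g. "EUR"
  couponRate: number;       // annual rate, e.g. 0.03 for 3%
  couponsPerYear: number;   // e.g. 2 for semi-annual payments
}

// Coupon due per bond on each payment date. A real standard would also
// fix day-count conventions, rounding rules and business-day adjustments.
function couponPerPeriod(bond: TokenisedBond): number {
  return (bond.faceValue * bond.couponRate) / bond.couponsPerYear;
}

// Fractional ownership: a holder's share of a coupon payment.
function holderCoupon(bond: TokenisedBond, unitsHeld: number, unitsPerBond: number): number {
  return couponPerPeriod(bond) * (unitsHeld / unitsPerBond);
}
```

The point is that once a behaviour is defined once, in a shared and technology-agnostic form, every platform holding the token can arrive at the same answer.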
This emphasis on structure and clarity was echoed across the payments and operations landscape. Speaking to Capital Pioneer, Philip Bruno, chief strategy and growth officer at ACI Worldwide, described tokenisation as a velocity play. “If you increase the velocity of money, you boost economic growth,” he said, referencing research that links real-time payments to measurable economic uplift.
But Bruno was quick to point out that velocity depends on orchestration. Tokenised deposits and programmable money may be the future, but they come with complex requirements — from compliance and sanctions screening to conditional logic and cross-border reconciliation. “Stablecoin is just another payment type,” he said, “but it has special needs. That takes work.”
Bruno’s remarks underscored a critical point: tokenisation doesn’t replace infrastructure — it demands more of it. The ability to route, settle, and reconcile digital assets across jurisdictions and asset classes requires a robust orchestration layer. And that layer must be flexible enough to support fiat, stablecoins, and tokenised deposits interchangeably, while remaining compliant with local regulations and global standards.
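A simplified, hypothetical example of what such an orchestration layer has to do, with placeholder checks standing in for real sanctions screening and local-rule engines, might look like this:

```typescript
// Hypothetical orchestration step, not any vendor's API: the same payment
// instruction is screened and routed whether it settles over a fiat
// real-time rail, a stablecoin or a tokenised deposit.
type SettlementRail = "fiat_rtp" | "stablecoin" | "tokenised_deposit";

interface PaymentInstruction {
  amount: number;
  currency: string;
  originJurisdiction: string;
  destinationJurisdiction: string;
  counterparty: string;
}

// Stubs standing in for sanctions screening and jurisdiction rules.
function passesSanctionsScreening(p: PaymentInstruction): boolean { return true; }
function railPermitted(rail: SettlementRail, p: PaymentInstruction): boolean { return true; }

function route(p: PaymentInstruction, preferred: SettlementRail[]): SettlementRail {
  if (!passesSanctionsScreening(p)) {
    throw new Error("Instruction held for compliance review");
  }
  // Fall through the preferred rails until one is permitted in both jurisdictions.
  for (const rail of preferred) {
    if (railPermitted(rail, p)) return rail;
  }
  throw new Error("No compliant settlement rail available");
}
```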
Yet even the best infrastructure can’t eliminate complexity. Steve Morgan, global banking lead at Pega, brought the conversation back to operational intelligence. “Tokenised assets still need to move through structured workflows,” he explained. “And when something breaks — a data link, a calculation, a compliance check — you need a human in the loop.”
Morgan’s experience spans credit operations, lending and legal workflows, and he’s seen firsthand how automation can transform back-office processes.
He cited Santander Brazil’s legal operations as a case in point: by applying generative AI and automation, the bank reduced its workload by 77%, improved accuracy to 99% and redeployed staff to higher-value roles. But the human layer wasn’t eliminated — it was elevated. “There’s always an exception,” Morgan said. “And that’s where judgment, experience, and context still matter.”
This tension between automation and oversight runs through the entire tokenisation debate. Programmability offers enormous potential — from conditional settlement to real-time collateral optimisation — but it also introduces new failure modes. A corrupted data feed, a misaligned calculation, or a jurisdictional mismatch can derail even the most elegant smart contract. As Morgan put it, “You can’t rely on floating agents with no predictability. You need structure, governance, and escalation paths.”
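In workflow terms, that structure might look something like the hypothetical sketch below: a conditional settlement step that, rather than guessing when a data feed breaks or a condition fails, halts and routes the case to a human queue.

```typescript
// Hypothetical names throughout: an escalation path of the kind Morgan
// describes, where broken inputs go to a person instead of failing silently.
interface SettlementCondition {
  description: string;
  check: () => boolean | "unavailable";   // "unavailable" = broken data link
}

type Outcome =
  | { status: "settled" }
  | { status: "escalated"; reason: string };

function settleWithEscalation(conditions: SettlementCondition[]): Outcome {
  for (const c of conditions) {
    const result = c.check();
    if (result === "unavailable") {
      // Corrupted feed or failed calculation: escalate, don't improvise.
      return { status: "escalated", reason: `Cannot evaluate: ${c.description}` };
    }
    if (!result) {
      return { status: "escalated", reason: `Condition failed: ${c.description}` };
    }
  }
  return { status: "settled" };
}
```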
That need for structure is precisely why standardisation matters. Dugauquier noted that standards in traditional finance took decades to mature, and that the digital asset space must now accelerate that journey without sacrificing rigour. Swift’s work on asset taxonomy, behaviour codification, and modular interfaces is designed to support that acceleration — not just for bonds and equities, but for more complex instruments like tokenised royalties, trade finance contracts and cross-border liquidity flows.
Bruno, too, sees promise in programmable instruments, but warns that use cases must be grounded in reality. “Programmability lets you do anything,” he said. “The challenge is deciding what actually gets done — and making sure it works when something breaks.” He pointed to mobile workers in unstable currency environments and trade finance settlement as areas where tokenisation could deliver real value — but only if the underlying infrastructure is ready.
Morgan agreed, adding that the best implementations combine automation with intelligent off-ramps. Citi’s Services division, for example, uses AI and automation for smaller clients, but ensures that mid-sized and large clients can easily escalate to a human when conditions change. “Even the best AI can’t predict a sudden change in a tariff,” he said. “You need someone who understands the patterns and can interpret what’s happening.”
Taken together, these perspectives form a blueprint for tokenisation at scale. It begins with standards — clear definitions, codified behaviour, and aligned taxonomies. It requires infrastructure — orchestration layers that support programmability, compliance, and velocity. And it demands operational intelligence — workflows that manage exceptions, support decision-making, and preserve trust.
The message for financial institutions is clear. Tokenisation isn’t just about putting assets on-chain. It’s about making them usable — across fragmented systems, regulatory regimes, and client expectations. And the time to engage is now. “We’re creating the system,” Bruno warned. “If you’re not part of it, you may not get a say.”
Sibos 2025 may be remembered not for a single announcement, but for a collective shift in tone. Tokenisation is no longer a question of if — but how. And the answer lies not just in technology, but in standards, workflows, and the people who keep the system running when things don’t go to plan.