After years of pilots, white papers, and conference panels, tokenization has abruptly moved from the margins of U.S. finance into the core of market-structure discussions. A cluster of concrete moves by major banks, exchanges, and policymakers in March 2026 signals that Wall Street is no longer treating tokenization as a crypto-adjacent experiment, but as the next layer of infrastructure for cash, collateral, and securities — provided it remains firmly under traditional control.
For institutional investors and market participants, this shift reframes tokenization from an innovation topic to a live question about how liquidity, settlement, and risk will be organized over the next decade.
The March 2026 Turning Point
Four developments in quick succession make clear how far tokenization has advanced into mainstream finance.
First, Bank of Montreal (BMO) announced plans to launch tokenized cash capabilities with CME Group and Google Cloud, targeting real-time payments and 24/7 margin activity for institutional clients. This is not a lab experiment: it is aimed directly at margined products and derivatives trading at one of the largest global derivatives venues.
Second, Nasdaq secured SEC approval to support trading and settlement of certain stocks and ETFs in tokenized form. That green light explicitly recognizes tokenized versions of traditional instruments within existing regulatory perimeters and moves tokenization from a post-trade concept into the matching and settlement stack of a major exchange operator.
Third, earlier in the month, U.S. bank regulators clarified that tokenized securities will not face extra capital charges merely because a blockchain is involved. For bank balance sheets, that distinction is critical: capital treatment often determines whether new infrastructure is viable at scale.
Finally, on March 25, the House Financial Services Committee held a dedicated hearing on tokenization and flagged that draft legislation is in the works to adapt securities rules to tokenized instruments. Committee materials indicated lawmakers are examining whether current securities law adequately governs tokenized activity and where duplicative requirements might be inhibiting useful deployments.
Taken together, these moves show that tokenization has crossed a threshold. It is being framed not as a separate “crypto” category, but as a new representation layer for familiar instruments — subject to securities law, overseen by existing regulators, and implemented by incumbent intermediaries.
From Crypto Slogan to Market Infrastructure

At its core, tokenization means taking an existing asset — such as cash, securities, or other claims — and representing it on a blockchain-based ledger. The goal: allow those assets to move with more automation, fewer time constraints, and closer alignment to how global markets actually operate.
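As a purely conceptual sketch (not any real platform's API — all names here are hypothetical), a tokenized ledger can be pictured as a shared balance record where transfers are atomic state updates available at any hour, rather than batch jobs tied to banking days:

```python
# Toy model of a tokenized ledger: one shared record of balances where
# transfers settle atomically and immediately, with no batch windows.
# Illustrative only; class and method names are invented for this sketch.

class TokenLedger:
    def __init__(self, asset: str):
        self.asset = asset
        self.balances: dict[str, int] = {}

    def mint(self, holder: str, amount: int) -> None:
        """Issue tokens representing an off-ledger claim (e.g. cash in custody)."""
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        """Move tokens between holders; settlement is immediate and final."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

ledger = TokenLedger("tokenized-USD")
ledger.mint("Bank A", 1_000_000)
ledger.transfer("Bank A", "Clearinghouse", 250_000)  # e.g. a margin call, any hour
```

The point of the sketch is the absence of any clock: nothing in the transfer path depends on a settlement window, which is precisely the property the institutional initiatives described above are chasing.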
In public communications, large institutions have converged on a similar framing. In his 2026 chairman’s letter, BlackRock’s Larry Fink described tokenization as a way to make investments easier to issue, trade, and access, emphasizing efficiency and access rather than radical reinvention. JPMorgan’s Kinexys platform pitches a comparable future in institutional terms: transactions that operate 24/7, in near real time, and across borders.
This messaging underscores a key distinction. The current wave of tokenization efforts is not about creating new speculative asset classes. It is about re-platforming existing instruments onto digital rails that can support continuous operations, automated workflows, and more granular control over collateral.
That shift matters for risk, liquidity, and market structure. A tokenized architecture potentially changes not only when trades settle, but who controls the software, data, and governance layers beneath those trades.
“Internet Hours” and the End of Market Business Days

The simplest way to understand why Wall Street is leaning into tokenization now is to stop treating it primarily as a blockchain story. For most legacy firms, the core demand is not for crypto-native infrastructure; it is for trading continuity — the ability to operate on “internet hours” rather than business hours.
Global markets already react around the clock. Oil prices move when New York is asleep. Futures reprice on geopolitical headlines from Asia or the Middle East. Margin calls in London can be triggered regardless of the time in Chicago. Yet the machinery behind cash movements, collateral transfers, and final settlement still largely runs on constrained windows and slow, batch-based back-office processes.
Tokenization offers a way to close that gap between market reality and operational capability. BMO’s planned tokenized cash platform with CME is explicitly designed to let institutional clients manage trading, settlement, and margin calls at any time. JPMorgan’s Kinexys is pursuing always-on payments and faster cross-border transfers. Citi’s work on tokenized payments is framed around real-time liquidity, automation, and more efficient use of collateral.
These initiatives are now focused on practical treasury management problems: how to get cash and collateral where they need to be, when they need to be there, without waiting for the next settlement batch or banking day. The language has shifted from innovation narratives to operational detail — a sign that tokenization is entering live production planning, not just concept decks.
Collateral Mobility: The Real Prize
Publicly, the leading talking point for tokenization is faster settlement. Privately, the more strategic prize is mobile collateral.
When markets come under stress, the main constraint is rarely price alone. Volatility can strand capital in the wrong venue or jurisdiction. Transfers between entities and infrastructures take too long. The lag between a trade, a margin call, and deployable cash can exacerbate liquidity squeezes.
Tokenized cash and tokenized securities promise a framework in which valuable assets can be moved, pledged, and reused quickly and with less friction. Citi’s tokenized-payments work is explicitly oriented toward a trading environment with real-time liquidity and fully automated processes. BMO’s initiative with CME is premised on similar goals: ensuring that collateral and funding can keep up with the tempo of modern derivatives markets.
For large financial institutions, this mobility is likely more economically significant than simple speed gains in trade settlement. A world where collateral can be reallocated swiftly across books, entities, and venues — while preserving regulatory constraints and risk controls — alters how leverage, funding, and balance-sheet optimization are managed.
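To make the mobility point concrete, here is a minimal sketch (hypothetical names, not any venue's actual interface) of reallocating a pledged collateral token between venues as a single atomic update — the step that, in legacy batch-based plumbing, can take a business day or more per leg:

```python
# Illustrative sketch of collateral mobility: a pledged token can be
# reallocated to a new venue in one atomic operation.
# All identifiers are invented for this example.

class CollateralBook:
    def __init__(self):
        # token ID -> venue where that token is currently pledged
        self.pledges: dict[str, str] = {}

    def pledge(self, token_id: str, venue: str) -> None:
        """Pledge a collateral token to a venue."""
        self.pledges[token_id] = venue

    def repledge(self, token_id: str, new_venue: str) -> None:
        """Reallocate pledged collateral instantly; legacy transfers
        between infrastructures would take far longer."""
        if token_id not in self.pledges:
            raise KeyError("token not pledged")
        self.pledges[token_id] = new_venue

book = CollateralBook()
book.pledge("UST-2030-001", "Venue A")   # a tokenized Treasury, pledged as margin
book.repledge("UST-2030-001", "Venue B") # moved in one step when exposure shifts
```

The economics follow from the data structure: when reallocation is a constant-time update rather than a multi-day workflow, the same pool of collateral can cover more positions across more venues.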
This is also where tokenization collides with questions of power: whoever owns and operates the rails for tokenized collateral stands to control critical chokepoints in the next version of market infrastructure.
Who Builds and Governs the New Rails?

The current wave of initiatives shows that incumbent exchanges, banks, and clearinghouses are all vying to be the core providers of tokenization infrastructure.
Nasdaq’s SEC approval demonstrates that exchanges have been early movers in turning tokenization from theory into regulated practice. The New York Stock Exchange is not standing still either; its partnership with Securitize to develop a tokenized securities platform signals competitive pressure to ensure that tokenized trading happens on traditional venues, not outside them.
On the post-trade side, DTCC’s tokenization work clearly indicates that clearing and settlement incumbents intend to adapt rather than yield ground. As the central plumbing for U.S. securities, DTCC is positioning tokenization squarely within a regulated environment that preserves ownership rights and investor protections.
Banks are pushing to ensure that tokenized cash and collateral remain integrated with their existing balance-sheet, custody, and payments franchises. For them, tokenization is a way to modernize infrastructure without ceding core functions to open networks or new entrants.
Viewed together, these moves resemble a coordinated shift in market structure rather than scattered private-sector experimentation. Banks want markets that function on internet hours. Exchanges want tokenized trading flows to remain on their platforms. Clearinghouses want digital assets tied tightly to existing technical and regulatory frameworks. The common thread: tokenization should upgrade the system, not displace it.
Congress, Regulators, and the Limits of the Narrative
Washington is increasingly treating tokenization as a capital-markets question, not a fringe crypto issue. The March 25 House Financial Services Committee hearing and accompanying memorandum highlight two main policy concerns: how current law applies to tokenized instruments, and where rule adjustments might be needed to avoid either gaps or duplication.
One discussion draft presented to the committee would require the SEC and CFTC to conduct a joint study on whether additional rules are necessary for tokenized securities and derivatives. Another would direct the SEC to craft rules allowing key intermediaries to rely on blockchain records under specified conditions, effectively recognizing distributed ledgers as authoritative books and records in certain contexts.
Witness testimony showed a broad consensus on direction, if not on every detail. Nasdaq’s John Zecca argued that tokenization should be integrated into the existing market system and noted that capital markets are moving toward a more continuous, automated, and interconnected structure. SIFMA’s Kenneth Bentsen voiced support for innovation while stressing that investor safeguards and market coherence must move in parallel.
DTCC took its characteristic incumbent stance, endorsing tokenization within a tightly regulated framework that preserves investor protections. Even the North American Securities Administrators Association (NASAA), in a letter to the committee written from a more skeptical standpoint, accepted the premise that tokenized securities are real securities and should remain fully subject to securities law.
For institutional participants, this broad agreement has important implications. Tokenization is being normalized as a technical evolution within the existing legal regime, not an exemption from it. That reduces regulatory tail risk, but it also confirms that open, permissionless models will not define mainstream tokenized finance in the U.S. in the near term.
Still, policymakers and market actors acknowledge that tokenization may not deliver all that is currently promised. Fragmentation across chains and platforms is a real risk. Interoperability between different tokenization systems remains incomplete. Legal enforceability — particularly in cross-border and default scenarios — still requires clearer answers. There is a meaningful possibility that institutions could spend years digitizing assets only to achieve incremental improvements rather than the transformative gains often cited.
Yet the direction of travel is hard to ignore. When firms such as BlackRock, BMO, Nasdaq, DTCC, JPMorgan, and NYSE, along with Congress and key trade groups, begin speaking in variations of the same language, tokenization has clearly ceased to be a peripheral crypto slogan. Crypto markets helped demonstrate that money and assets can operate on continuous digital rails. Wall Street now wants a version of that future it can regulate, monetize, and contain within the existing financial order.
The debate on Capitol Hill underscored the new reality: tokenization is no longer waiting for permission to enter the mainstream. The open question for institutional investors is not whether tokenization will arrive, but who will define its standards — and how that will reshape the economics of liquidity, collateral, and control.

Hi, I’m Cary Huang — a tech enthusiast based in Canada. I’ve spent years working with complex production systems and open-source software. Through TechBuddies.io, my team and I share practical engineering insights, curate relevant tech news, and recommend useful tools and products to help developers learn and work more effectively.