Tokenization has caught the industry’s attention this year, but will it finally have its moment in 2024? Myles Milston, Co-Founder and CEO of Globacap, takes a look at whether tokenization is ready for mass adoption in private markets.
The successive crises that marred the digital assets market in the last year were a blow to trust in the industry. However, progress has been made in converging the worlds of digital and traditional finance.
The new Technology Working Group of the UK government’s Asset Management Taskforce recently gave UK funds the go-ahead to develop tokenization, publishing a roadmap for adoption of the technology.
Meanwhile, Goldman Sachs’ Global Head of Digital Assets Matthew McDermott has said he expects a “significant uptick” in trading volumes of blockchain-based assets within a few years.
The real revolution will be in private markets, where tokenizing real-world assets such as real estate, private equity and private debt could improve access to these markets and make trading much faster and simpler.
But tokenization only comes at the end of a private markets transaction. To make it a reality, we first need to prioritize digitizing and automating private markets’ infrastructure.
Registered vs bearer instruments
Before we can talk about the potential of tokenization in private markets, it’s important to differentiate between bearer instruments and registered instruments.
With bearer instruments, whoever holds the asset is the owner. For example, if you hand someone a coin, they become its owner, with no record of the transfer.
Registered instruments, by contrast, have a centralized list of asset owners. If an individual loses the physical instrument, the register can still prove they are the owner.
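To make the distinction concrete, here is a minimal sketch in Python of how a registered instrument behaves. The ShareRegister class and its methods are hypothetical, invented purely for illustration rather than taken from any real registry system.

```python
class ShareRegister:
    """A minimal, centralized register of asset owners (hypothetical)."""

    def __init__(self):
        self._holdings: dict[str, int] = {}  # owner name -> shares held

    def issue(self, owner: str, quantity: int) -> None:
        self._holdings[owner] = self._holdings.get(owner, 0) + quantity

    def transfer(self, seller: str, buyer: str, quantity: int) -> None:
        # Ownership changes only when the register is updated; possession
        # of a paper certificate alone proves nothing.
        if self._holdings.get(seller, 0) < quantity:
            raise ValueError("seller does not hold enough shares")
        self._holdings[seller] -= quantity
        self._holdings[buyer] = self._holdings.get(buyer, 0) + quantity

    def holding_of(self, owner: str) -> int:
        # The register, not the certificate, is the proof of ownership.
        return self._holdings.get(owner, 0)


register = ShareRegister()
register.issue("Alice", 100)
register.transfer("Alice", "Bob", 40)
assert register.holding_of("Alice") == 60  # provable even if her certificate is lost
```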
Bearer instruments have all but become extinct over the last 300 years due to the obvious impracticalities and security risks of ownership being solely established by possession.
Despite this, tokenization to date has focused largely on bearer instruments – notably cryptocurrencies – which aren’t compliant with most securities regulations or investor compliance requirements, preventing institutional investors from engaging with this market.
Three use cases for tokenization in private markets
In recent years there has been much speculation that tokenization can transform capital markets, delivering greater interoperability and control to investors. However, for this to happen, we need to change our approach.
Tokens should be a representation of the security, rather than the security itself. The asset then ceases to be a bearer instrument and becomes a registered instrument, which opens up a world of opportunities for the technology.
Public markets are already very efficient. Improvements are always possible, of course, but the real opportunity lies in private markets, which are highly inefficient. Blockchain technology, or tokenization, is one tool that can help address part of that inefficiency.
In my view, there are three use cases for tokenization in private markets:
- One-to-one exchange of assets – When transacting in private markets, an intermediary such as a broker or law firm usually sits in the middle to ensure both sides meet their end of the deal before the transaction settles. With blockchain technology, a digital smart contract can perform that intermediary function instead, automatically receiving and atomically swapping two tokens, one representing the asset and the other the cash (see the sketch after this list). This cuts transaction time from weeks to a few seconds, while eradicating intermediary fees.
- Private asset trading – If a firm wants to allow unrestricted trading in its private shares, it could tokenize those shares and list them on a digital exchange. This puts investors in full control of where they seek liquidity, rather than relying on an intermediary acting in an opaque manner. However, this is only relevant for the largest assets – most don’t have sufficient liquidity demand to warrant continuous trading.
- Transactions between institutions – Many transactions between institutions currently go over SWIFT rails and settle through custodians. With a tokenized asset, similar to the one-to-one exchange of assets above, a smart contract can provide a more efficient settlement and reconciliation mechanism.
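Here is the sketch referred to above: a minimal, illustrative Python model of that delivery-versus-payment swap. The Token class, the token symbols and the atomic_swap function are all hypothetical; a real implementation would express the same logic in an on-chain smart contract.

```python
class Token:
    """A minimal fungible token ledger mapping owners to balances (hypothetical)."""

    def __init__(self, symbol: str):
        self.symbol = symbol
        self.balances: dict[str, int] = {}

    def mint(self, owner: str, amount: int) -> None:
        self.balances[owner] = self.balances.get(owner, 0) + amount

    def move(self, src: str, dst: str, amount: int) -> None:
        if self.balances.get(src, 0) < amount:
            raise ValueError(f"insufficient {self.symbol} balance for {src}")
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount


def atomic_swap(asset: Token, cash: Token, seller: str, buyer: str,
                asset_qty: int, cash_amount: int) -> None:
    """Delivery-versus-payment: both legs settle, or neither does."""
    # Validate both legs up front so the swap can never half-complete.
    if asset.balances.get(seller, 0) < asset_qty:
        raise ValueError("seller cannot deliver the asset leg")
    if cash.balances.get(buyer, 0) < cash_amount:
        raise ValueError("buyer cannot deliver the cash leg")
    asset.move(seller, buyer, asset_qty)
    cash.move(buyer, seller, cash_amount)


shares = Token("PRIV-SHARE")  # token representing the registered security
gbp = Token("GBP-TOKEN")      # token representing the cash leg
shares.mint("Seller", 1_000)
gbp.mint("Buyer", 50_000)
atomic_swap(shares, gbp, "Seller", "Buyer", asset_qty=1_000, cash_amount=50_000)
# Both legs settled in one step: no broker escrow, no weeks-long settlement.
```

The key design point is that both legs are checked before either moves, so the trade settles completely or not at all, which is precisely the guarantee the intermediary was there to provide.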
Digitization comes first
Tokenization is a useful tool at the point of transaction itself. However, for blockchain to function, the underlying asset first has to be digitized. This is already true in public markets, where the register is held at a Central Securities Depository (such assets are generally referred to as “dematerialized”), which is why public markets are already quite efficient.
Private markets, on the other hand, despite being a huge driver of economic activity with an estimated $22.6 trillion in AUM, typically involve paper certificates and a securities register held in an Excel file or another medium that cannot be updated automatically, so changing the owner in a secondary transaction requires manual human intervention.
Technology now exists that digitizes private securities end-to-end and fully automates the processes involved in settling private markets transactions, including high-friction points such as stamp duty and stock transfer forms.
Once a private markets asset has been digitized and connected to automated workflows, tokenization can be overlaid on top to enable highly efficient transfers.
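As a rough illustration of what such an automated workflow might look like, the sketch below runs a secondary transfer end-to-end against a digitized register, computing a simplified UK stamp duty charge and producing a stand-in for a stock transfer form. All names here, and the simplified duty calculation, are assumptions for illustration, not a description of any particular platform.

```python
import math
from dataclasses import dataclass


@dataclass
class Trade:
    seller: str
    buyer: str
    quantity: int
    consideration_gbp: float


def stamp_duty(trade: Trade) -> int:
    # UK stamp duty on share transfers is broadly 0.5% of the consideration,
    # rounded up to the nearest £5; simplified here for illustration.
    return math.ceil(trade.consideration_gbp * 0.005 / 5) * 5


def settle(register: dict[str, int], trade: Trade) -> dict:
    """Run the settlement steps automatically against a digital register."""
    if register.get(trade.seller, 0) < trade.quantity:
        raise ValueError("seller does not hold enough shares")
    form = {  # stands in for a generated stock transfer form
        "seller": trade.seller,
        "buyer": trade.buyer,
        "quantity": trade.quantity,
        "stamp_duty_gbp": stamp_duty(trade),
    }
    register[trade.seller] -= trade.quantity  # register updates instantly,
    register[trade.buyer] = register.get(trade.buyer, 0) + trade.quantity
    return form


register = {"Seller": 5_000}  # the digitized register: owner -> shares held
form = settle(register, Trade("Seller", "Buyer", 1_000, 250_000.0))
print(form["stamp_duty_gbp"])  # 1250, computed and recorded automatically
```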
The groundwork has been laid
Amidst the crises that tested trust in the digital assets market, notable progress has been made in bridging digital and traditional finance.
Regulations in most jurisdictions are coming, albeit slowly. It’s a complex area, because tokenization itself is only the technology medium. In the UK, for example, the Companies Act specifies what a share register must contain, but it does not specify – and it shouldn’t – the medium on which that register is stored.
The prerequisite for successful implementation is the digitization of private markets, and the technologies capable of delivering this are starting to make inroads, setting up 2024 as the most important year for tokenization yet.