

Liquidity is about market depth, not magic


Stephen McKeon (@sbmckeon)

As I was scrolling through my Twitter feed yesterday, I saw that Emin Gün Sirer, a thoughtful scholar whom I respect, had retweeted a piece on token liquidity by Preston Byrne. I read it immediately. As it turns out, the post references some of my statements from Traditional Asset Tokenization in a context that might be considered critical. Preston’s blog contains some valuable insights, so this post is a brief response to reconcile the two articles.

As an academic, I welcome critical feedback. Working through divergent views is how we learn. It’s a sign of a healthy ecosystem and exactly what we should be doing. As I detail below, Preston and I are in more agreement than may be obvious. For example, take the last sentence of his post (bold italics are quotes from his article):

…if you have the right business case and find the right lawyers to advise you, tokens and securities can mix quite nicely.


I was assuming legal securitization as given. I’m a finance professor, so it should come as no surprise that I believe people need to comply with securities laws. Preston’s post addresses what happens under conditions of willful non-compliance.

There is nothing magic about liquidity. It is a function of market depth, as measured by bid-ask spreads and the price impact of trades. As I mentioned in my post, the act of tokenization does not affect liquidity unless it affects market depth. The interesting thought experiment is to consider how and why tokenization might increase market depth.
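To make the depth measures concrete, here is a minimal sketch with two invented order books. The quoted spread and the price impact of a market buy are computed directly; the deeper book absorbs the same order with no impact. All prices and sizes are hypothetical.

```python
# Illustrative sketch: why depth, not tokenization itself, determines
# liquidity. Order books are lists of (price, size); numbers are invented.

def quoted_spread(bids, asks):
    """Best-ask minus best-bid, as a fraction of the midpoint."""
    best_bid, best_ask = bids[0][0], asks[0][0]
    mid = (best_bid + best_ask) / 2
    return (best_ask - best_bid) / mid

def price_impact(asks, qty):
    """Average fill price of a market buy of `qty`, relative to the best ask."""
    remaining, cost = qty, 0.0
    for price, size in asks:
        take = min(remaining, size)
        cost += take * price
        remaining -= take
        if remaining == 0:
            break
    avg_fill = cost / qty
    return (avg_fill - asks[0][0]) / asks[0][0]

# A shallow book: a 100-unit buy must walk up two price levels (~1% impact).
shallow_asks = [(101.0, 50), (103.0, 50), (106.0, 100)]
# A deep book with the same best quote: the whole order fills at 101.
deep_asks = [(101.0, 500), (101.5, 500), (102.0, 1000)]

bids = [(99.0, 50)]
spread = quoted_spread(bids, shallow_asks)        # 2 / 100 = 0.02
impact_shallow = price_impact(shallow_asks, 100)  # positive
impact_deep = price_impact(deep_asks, 100)        # zero
```

The point of the sketch is that both books show the same token; only the resting size differs, and that difference is the entire liquidity story.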

Representing a fractional ownership claim on an asset is securitization, a concept that has been around for hundreds of years. The difference with blockchain tokens is divisibility, low-cost global transfers of ownership, and an immutable record of the ownership claims. My argument is that these differences will lead to increased depth in the secondary market for the securities because they reduce frictions to trade; no magic required.

A careful reading of my article reveals that I intentionally stayed away from startup liquidity problems. Tokens representing early-stage corporate equity have their own set of complications that I will address in a future post.

“Liquidity becomes available by taking the illiquid asset (bricks and mortar) and pooling it with many others like it so you can get a more easily tradable asset (e.g. a AAA-rated note) which allows a bank in, say, Japan to get exposure to some mortgages in southern Florida. Which makes mortgage loans, as a class, more liquid than they were before, because you can get them off your balance sheet before they mature.”

When one pools a group of assets, say mortgages, and then splits up the claims on the pool into a hierarchy, new securities with different levels of risk and return can be created. This is a totally different animal compared to the idea of securitizing/tokenizing a single asset. In the pooling case, additional liquidity is achieved by tweaking the risk and return profile of the securities. It is the alteration of the risk profile that generates demand and increases the depth of the secondary market.
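The hierarchy of claims described above can be sketched as a simple loss waterfall. The tranche sizes and loss figures below are invented for illustration; the point is that the junior tranche absorbs losses first, which is what makes the senior claim a safer, more marketable security.

```python
# Hypothetical sketch of tranching: claims on a loan pool are split into a
# senior and a junior tranche, and pool losses hit the junior tranche first.

def waterfall(senior_size, junior_size, losses):
    """Return (senior_value, junior_value) after applying `losses` to the pool."""
    junior = max(junior_size - losses, 0.0)
    excess = max(losses - junior_size, 0.0)   # losses beyond the junior tranche
    senior = max(senior_size - excess, 0.0)
    return senior, junior

# A $100 pool split 80/20. A $10 loss halves the junior tranche but leaves
# the senior tranche untouched.
senior, junior = waterfall(senior_size=80.0, junior_size=20.0, losses=10.0)

# A $30 loss wipes out the junior tranche and dents the senior one.
senior2, junior2 = waterfall(senior_size=80.0, junior_size=20.0, losses=30.0)
```

This is the sense in which pooling creates liquidity: the senior security has a different risk profile than any individual loan, which broadens its buyer base.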

Pooling will continue to occur as more assets are securitized with tokens, but focusing on pooling misses the point. The big picture in traditional asset tokenization is about relaxing frictions to trade, not pooling and tranching. The liquidity gains I describe in my article are not conditional on changing the risk profile of the assets.

“Where I’m left is that everyone is repeating the “tokens make X easier” mantra ad nauseam but nobody can actually explain why.”

I stated that IRS 1031 exchanges would be easier if the underlying assets were tokenized. Here’s why: 1031 exchanges allow the seller of an investment real estate asset to defer paying capital gains taxes if the proceeds from the sale are invested into a like-kind asset (another real estate asset). If real estate assets were tokenized, the degree of divisibility allows a buyer to tailor the amount of investment to precisely match the proceeds from the sale. A larger and deeper market of real estate asset tokens would reduce search cost for the buyer to find a suitable replacement asset. Further, if real estate tokens can be directly traded for each other, this is even closer to the spirit of a simultaneous exchange of like-kind assets. The notion of direct exchange of assets without translation through a currency is really interesting to me and I’ll surely be writing more on this in the future. However, before we get too carried away, let me state that I’m not a tax attorney and cannot predict how the IRS will treat tokens representing real estate assets. Perhaps they’ll rule that they are not eligible for 1031 exchanges. This is a regulatory issue, which is one of the challenges I mentioned in my original post.
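The matching argument is easy to illustrate with exact arithmetic. The sketch below uses Python's `fractions` module so the reinvested amount equals the proceeds to the penny; all dollar figures and token prices are hypothetical, and none of this is tax advice.

```python
from fractions import Fraction

# Illustrative only: divisible tokens let a 1031-exchange buyer tailor the
# reinvestment to match sale proceeds exactly. Figures are hypothetical.

def tokens_to_match(proceeds, token_price):
    """Fractional token quantity whose total cost equals `proceeds` exactly."""
    return Fraction(proceeds) / Fraction(token_price)

# Proceeds of $487,500 against a token priced at $1,300 divide evenly:
qty = tokens_to_match(487_500, 1_300)      # 375 whole tokens
assert qty * 1_300 == 487_500              # full proceeds redeployed

# With awkward proceeds, divisibility still permits an exact match --
# something indivisible whole-property purchases cannot do.
qty2 = tokens_to_match(500_000, 1_300)     # a non-integer fraction of tokens
assert qty2 * 1_300 == 500_000
```

With whole properties, the buyer must over- or under-shoot the proceeds; with divisible tokens, the residual is zero by construction.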

“When we hear people say they want to “tokenize an asset,” most people are talking about avoiding the expense and bother of securitization while achieving the same effect.”

I cannot speak for others, but this is absolutely not what I’m talking about. I wholeheartedly agree that tokenization needs to be done within the context of regulation. That means complying with securities laws like Blockchain Capital, not ignoring them like The DAO.

To be clear, traditional asset tokenization is not about cutting corners. Registration is a slow, expensive process, and cutting costs on registration is not where the gains in liquidity reside. The gains flow from increasing the depth of the secondary market, where liquidity is enhanced by compliance, not hindered by it. There are many investors (e.g., institutions) with fiduciary responsibilities that are rightfully wary of unregulated securities. These investors represent a lot of capital, and they will add depth to markets. Adhering to regulation is a necessary condition for expanded institutional participation.

In sum, the main takeaway is that Preston and I are talking about different aspects of tokenizing traditional assets. His post is focused on the cost of securitizing at the time of issuance and the importance of conforming to legal regulations. My post is about secondary market liquidity effects when an asset can be traded with reduced frictions. These are not mutually exclusive and both are important.

I commend Preston for drawing a bright red line around the topic of securities regulation and tokens. It is a topic that we’ll be discussing a lot more as time goes on.
