“Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.” ~ Samuel Beckett
Please make sure to read the first post explaining the “progress” vs. “process” problem with Appended assets, and a future option for conversion in the form of a security. This post will cover the spectrum of utility asset classification and what projects can do both to slow token velocity and to tokenize properly, even if time has already passed since an ICO.
To step back for a second, Appended tokens can enter a death spiral either through “process,” as defined in the prior post, or through usage, unless actions are taken either to make them integral to a system or to introduce incentives and a tinge of game theory to slow their velocity (as explained later).
Payment tokens are great examples of bad design, as the parent chain asset could handle the entire service minus the friction and headache that converting requires. However, in my opinion, all assets exist along a dynamic spectrum where design covers one axis and functionality covers the other.
As mentioned in the prior blog, assets exist on a spectrum between Appended and Intelligent. Purely Appended assets function either solely as an MoE (medium of exchange) on a platform, or as a coupon that lets you pay fewer tokens than you would at any normal valuation. The business could therefore exist without these assets and completely replace them at any time, rendering them useless.
The token doesn’t need to exist for the platform to succeed, and no, the special token you minted for the sake of paying for an application doesn’t make your platform any better; it simply introduces more friction.
On the other side sit the Intelligent assets, those that don’t suffer from the problem of process, as they’re key to functionality and provide mechanisms for capturing real value. Examples of value capture from Intelligent assets include staking mechanisms, node locking, and governance in various forms (whether token curated registries, pure protocol governance, etc.). These tokens are here for a reason: they can’t simply be replaced with the parent-chain asset, nor can they be removed from a service without a complete loss of quality.
However, there is another axis of token value that sits perpendicular to the Appended/Intelligent spectrum: whether the token in the system is Functional or Non-Functional. As ICOs continue to raise unrealistic amounts of capital, their assets aren’t always functional from issuance. Many projects with great design haven’t even deployed the use of their asset yet, rendering them Intelligent yet Non-Functional. This makes it even harder to develop a valuation profile for an asset, since its network effects remain unseen while the asset is used for nothing but pure speculation.
For example, Augur and 0x, although smartly designed, suffer from the fact that they’re not Functional at the moment and thus aren’t being used, while Bancor and Binance Coin are both actively used. On the Functional but Appended side sit Golem, whose token is purely a payment token for use of the Golem network, and Salt, whose asset grants different tiers of membership to the Salt lending platform. On the completely Appended and Non-Functional end of the chart we have assets such as the Sirin Labs token, a Non-Functional coupon, while Civic is slowly making its way toward Intelligent and Functional (see later section).
There are also dual-token models in which one asset exists purely to capture value while the other continuously spins, typically a governance or staking asset paired with either a stablecoin or an MoE governed by the original asset. For example, in the case of MakerDAO, Maker is used to govern the collateralization rate of Dai and collect fees, while Dai is a stablecoin. Spankchain’s Spank token is staked in order to produce Booty, its own form of payment token that’s proportionally minted. As time goes on, it wouldn’t be surprising to see more experiments in token design aimed at sustainably capturing value and stretching the limits of functionality.
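A dual-token mint of this kind can be sketched in a few lines. The pro-rata rule, rates, and amounts below are illustrative assumptions for the Spank/Booty style of pairing, not the actual protocol parameters:

```python
# Hypothetical pro-rata mint: staking the value-capture token yields
# the payment token in proportion to one's share of total stake.
def mint_payment_token(staked, total_staked, period_issuance):
    """Return the payment tokens minted to one staker for a period."""
    return period_issuance * (staked / total_staked)

# Staking 100 of 1,000 total staked tokens during a period that issues 50
# payment tokens yields a 10% share of the issuance:
minted = mint_payment_token(100, 1_000, 50)  # 5.0
```

The staked asset captures value (locked supply, claim on future issuance) while the minted asset is free to spin as a pure medium of exchange.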
Recently, a case of ugly journalism reared its head on the popular news outlet CoinDesk, with an article explaining that Quantstamp’s community is unhappy that the company has been accepting Ethereum and USD for its auditing services rather than QSP (its native token). What a shocking situation: why wouldn’t Quantstamp be receiving more QSP for its services than expected?
“Some customers can’t buy QSP to pay us, which means that we miss out on the opportunity to be helpful and stay true to our mission of driving smart contract adoption for the ecosystem”
Case in point: not only would the token price suffer in the future from spinning hard (high economic velocity, explained later), but paying in QSP adds friction to using the service in the first place. This isn’t to say that Quantstamp isn’t a great project; they have a lot of smart people on the team, and what they’re doing is important for the ecosystem. However, they’re a great case of bad token design: the token served its purpose in helping bootstrap the project, but isn’t required for Quantstamp’s future.
To be fair to the projects, Appended sometimes isn’t unintelligent, just an early way of sticking a token onto something to raise capital. The teams could be delivering a monumental service but weren’t thinking in terms of how we approach token design today. Another great example of this is Golem: great undertaking, bad token design. It would be much easier to pay for the service in Ether. Appended doesn’t mean the team is unintelligent, only that the way the token functions is.
— Notes on Modeling —
To quickly cover what is meant by economic velocity and “the spinning,” we must first review the equation of exchange, defined as MV = PT. The equation is typically used to gauge both the health of an economy and the effective inflation rate; the more stable the velocity, the healthier the economy. In the equation, M is the money supply, V is velocity (the rate at which money changes hands), P is the average price of goods, and T is the total volume of transactions.
Chris Burniske of Placeholder Capital reworked the equation of exchange back in September to better represent the crypto economy. In his model, MV = PQ, where M = the asset base, V = its velocity, P = the price of the asset’s network resources, and Q = the quantity of the resource being provisioned. The GDP of the asset’s network effectively becomes “PQ.” When velocity is too low and transaction volume collapses, so does the price of the asset. When velocity is too high, with the asset changing hands too frequently while being converted back into fiat or a parent asset (MoE), the asset isn’t able to capture any value.
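Rearranging the identity as M = PQ / V makes the velocity trade-off concrete: for a fixed network GDP, a faster-spinning token supports a smaller monetary base. A minimal sketch with illustrative numbers (no real project implied):

```python
# Back out a per-token value from MV = PQ (illustrative numbers only).
def implied_token_price(pq_usd, velocity, circulating_supply):
    """M = PQ / V is the monetary base needed to support the network's
    GDP; dividing by circulating supply gives a per-token value."""
    monetary_base = pq_usd / velocity
    return monetary_base / circulating_supply

# A network provisioning $100M of resources a year, 100M tokens in float:
slow = implied_token_price(100_000_000, velocity=5, circulating_supply=100_000_000)
fast = implied_token_price(100_000_000, velocity=50, circulating_supply=100_000_000)
# slow = 0.2, fast = 0.02: spinning 10x faster leaves 10x less value per token.
```

The same PQ supports a $20M monetary base at V = 5 but only $2M at V = 50, which is the sense in which a high-velocity MoE token “can’t capture value.”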
The greatest check on economic velocity is Intelligent token design: how can you stop the spinning just enough to allow the token to effectively capture value? Set the RPMs of a record player too high or too low and what do you get? A mess of sound.
Another way Burniske proposed valuing assets was through his idea of a crypto J-curve, a model that decomposes an asset’s price into its “CUV,” or current utility value, and its “DEUV,” or discounted expected utility value (pure speculation). The curve traces a series of events: early on, the price is almost entirely hype-driven DEUV; as speculation fades, the price dips; and as the network matures, realized CUV grows to carry more of the value.
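As a toy version of that decomposition (hypothetical numbers and discount rate, not Burniske’s actual methodology), the price can be modeled as CUV plus an expected future utility value discounted back to today:

```python
# Toy CUV + DEUV decomposition of a token's price (illustrative only).
def token_value(cuv, future_utility, discount_rate, years):
    """DEUV discounts an expected future utility value back to the present."""
    deuv = future_utility / (1 + discount_rate) ** years
    return cuv + deuv

# Early stage: almost no current utility, the price is nearly all DEUV.
early = token_value(cuv=0.05, future_utility=10.0, discount_rate=0.4, years=5)
# Mature stage: utility realized, speculation largely priced out.
mature = token_value(cuv=4.0, future_utility=5.0, discount_rate=0.4, years=5)
```

The J shape falls out of the interplay: DEUV collapsing faster than CUV grows produces the dip, and CUV eventually dominating produces the recovery.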
On the spectrum, the J-curve’s greatest ally is functionality: the top of the curve, where utility value is fully realized alongside speculative value. However, the curve might be skewed in the future, as CUV continues to be pushed toward the front of the line in ICOs so projects can demonstrate that the asset they sold actually has utility.
A utility token having utility — who would’ve thought?
To speak to the original spectrum: no asset is stuck. We are in an ever-evolving environment in which assets can append new features to achieve slower velocity and a greater chance of capturing more value.
One great example of an asset recently converting from an Appended to an Intelligent model is Civic. Civic’s original token design was that of a payment token: verifiers and individuals could earn tokens by onboarding network participants, and the token would be used to pay for services. With enough time and adoption, though, value would eventually be driven down as Civic’s token spun faster and faster. After understanding the downside of that model, the team applied a tinge of game theory to create a staking model in which value is captured better than in the original design.
With a bit of game theory in mind, Civic redesigned their token to create a “mandatory goal alignment” between the participants of their network. In this case, validators and requesters face a staking mechanism that penalizes incorrect behavior when transactions occur in Civic’s marketplace. This in turn lets Civic expand beyond a payment token and “stop the spinning,” so to speak, as gradual amounts of the supply are locked up in staking for validation and rewards.
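The goal-alignment mechanic can be sketched as a simple slashing stake. The names, reward, and slash fraction below are hypothetical illustrations, not Civic’s actual contract parameters:

```python
# Minimal slashing-stake sketch: correct work earns a reward,
# incorrect work burns part of the deposit (hypothetical parameters).
class Stake:
    def __init__(self, amount):
        self.amount = amount

    def settle(self, behaved_correctly, reward=1.0, slash_fraction=0.5):
        """Settle one marketplace transaction against this participant's stake."""
        if behaved_correctly:
            self.amount += reward
        else:
            self.amount *= (1 - slash_fraction)
        return self.amount

# An honest validator grows their stake; a dishonest one loses half per offense.
honest = Stake(100.0)
honest.settle(True)      # 101.0
cheater = Stake(100.0)
cheater.settle(False)    # 50.0
```

Because misbehavior costs more than honest participation earns, rational actors keep tokens locked, and the locked supply is exactly the velocity sink described above.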
However, a token might not need the complete redesign that Civic deployed with its new whitepaper. Some assets could add smaller elements of extra functionality over time that continue to capture value, which also incentivizes users to speculate on the next feature release. Projects are never locked in a box with their token; they simply have to be creative in the ways they allow it to capture value. In my personal opinion, Binance is a great example of a company always thinking on its feet and continually adding unique dimensions to its token over time: an evolving asset.
A discrete accumulation-and-holding mechanism Binance recently put in place came with the announcement of its latest community vote. Binance had already added a governance layer, with the token serving as the voting mechanism by which new assets are added to the exchange, but this recent vote adds a twist: a balance snapshot and a multiplier dictated by the amount of BNB each voter holds, capped at 500 for any balance above that number. Now, instead of simply falsifying votes for their asset to be added to Binance at the mere price of 0.1 BNB, companies are incentivized to take an extra leap and arm accounts and users with 500+ BNB ($7,000+ of value locked up) in order to carry the greatest voting weight.
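The incentive shift can be sketched as follows; the snapshot rules here are a simplified, hypothetical reading of the scheme, not Binance’s exact formula:

```python
# Hypothetical balance-weighted vote with a capped multiplier,
# loosely modeled on the community-vote rules described above.
def vote_weight(bnb_balance, min_balance=0.1, cap=500):
    """One account, one vote, scaled by BNB held at the snapshot."""
    if bnb_balance < min_balance:
        return 0.0
    return float(min(bnb_balance, cap))

# Old flat scheme: 5,000 dust accounts at 0.1 BNB each cast 5,000 equal votes.
flat_sybil_votes = 5_000 * 1
# Weighted scheme: the same dust army carries no more total weight than
# a single account holding 500 BNB.
weighted_sybil = 5_000 * vote_weight(0.1)   # ≈ 500.0
weighted_whale = vote_weight(500)           # 500.0
```

Weighting by held balance converts cheap vote spam into a real capital lockup, which is precisely the “stop the spinning” effect the rest of this post argues for.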
This, along with the perpetual burning of the token and the announcement of Binance’s own blockchain, which will run on BNB, makes Binance Coin a great example of an evolving asset.
There is no concrete way to evaluate this strange new asset class, as it defies the models into which we’ve placed traditional assets. We can take existing frameworks and come close to newer models, but curveballs will continue to be thrown. I’m no economist by trade, nor am I an expert, but I feel there’s a contribution to be made to the ways in which we attribute value to these little speculative instruments.
Nobody has the perfect model — nor do most of these assets actually function as planned. But do any of us really?
Nothing in this article should be taken as legal or investment advice.