https://upload.wikimedia.org/wikipedia/commons/2/2e/Hendrick_van_Cleve_III_-_The_tower_of_Babel.jpg
What must exist on the blockchain to represent fungibility, non-fungibility, composability and ownership trustlessly in decentralized networks? If proving that an account owns a balance or a few unique integers is all we need, why do we care whether these “tokens” conform to any standard? Additionally, if tokens have proxies mapping to standard interfaces, it matters less what’s happening behind the scenes on the blockchain.
What are the requirements for a token to be “standard”; what measuring sticks do we use; and where must standard compliance fall between blockchain and application? Applications need a robust set of libraries and documentation for standard or custom tokens regardless. This application layer, and end-user adoption, should be the primary focus of every blockchain project.
In this post, I argue against on-chain token standards: standards that define interfaces at the smart contract, “on-chain” layer. By exploring this contrarian point of view, I hope to force an examination of how we handle tokens moving forward in decentralized applications.
https://www.smithsonianmag.com/history/making-cents-currencys-ancient-rise-180963776/
Ancient money and physical scarcity took many forms. Were standardized production processes and printed symbols necessary, when weight and material were king in most cases? Today, when the prices of metals jump, nations around the world remove them from their currencies, leaving only the symbols and iconography of the state. Quite a shift. What can I tell you… If it all functions the same way at the application layer and I can still buy a cup of coffee, who cares?
I’ve been participating in the Ethereum standards community for over three months, since the introduction of Crypto Composables and the proposal of ERC-998: Composable Non-Fungible Tokens (CNFTs). The loosely formed community around token standards lacks formalization and a clear process. Some projects are tending toward working in stealth silos and pushing whatever token they need to the blockchain. Teams then resurface, promoting their standard solution, which they are often married to and which has likely become too domain specific. There is also evidence of the ERC process being used for marketing stunts, to win publications and syndication. This is a direct result of a lack of education on the standards process for wider audiences in the Ethereum community: press, enthusiasts, investors.
To be tackled in a future post.
We’ll need these for tokens… trust me 😉
In a field as new, passionate and, dare I say, religious as crypto and blockchain, we are seeing the best and most opinionated sides of individuals and tribes exposed online. The tweet storms are nothing short of biblical. When it comes to creating standard token building blocks that projects can use and re-use, what I personally feared looks to be happening: the Ethereum blockchain is about to be rocked by hundreds of implementations of the same underlying axioms, like fungibility and non-fungibility. This will create a great deal of friction at the application layer and for users. Well, you know what they say…
I’m here to tell projects that it doesn’t matter anymore. There’s no need to deploy standard tokens. Go out there and break a leg or your smart contract. Here’s why:
Adoption is King 👑
If you come up with a brilliant smart contract that’s customized for your use case and mega efficient, you still have to get people to use it. Wallets, exchanges and other decentralized apps (dapps) need to know how your token can be used, why their users should care and how to integrate it technically into their front-end interfaces. Saying your token is standard goes a long way in some cases. In the world of decentralized exchanges and wallets that I hope is coming soon, frictionless and permissionless will be the name of the game for tokens. But what if…
You still have a few options to get that sweet frictionless adoption. Here they are:
Clearly the fourth option isn’t going to work… so let’s explore the first three.
Bob (custom token) and the proxy (token) are both on-chain
A proxy contract would implement a standard interface and delegate actual bookkeeping of digital scarcity (fungible or non-fungible) to the custom token contract. The benefit of this approach is that nothing new needs to be created at the application layer. For a user to add this new token to their wallet, they simply add the address of the proxy contract into the wallet as they’ve done before. The drawbacks of this approach are several.
It essentially doubles your on-chain code, leading to bloat if many projects take this approach. Reading information from the chain remains free, even if those calls take an extra hop via a function call to get there. However, for transactions that modify on-chain storage (state), the gas costs of additional function calls can add up. Your users may not be too pleased, but one function call is only 2,000 gas units, so this may not be a showstopper.
Historically, the community hasn’t been enthralled with proxy contract approaches, mainly because it increases attack vectors. Anytime you add more complexity to your smart contracts, especially architecturally, be prepared to pay the price.
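To make the delegation concrete, here is a minimal sketch of the pattern in plain JavaScript rather than Solidity; all names are hypothetical. A proxy object exposes a standard ERC-20-style surface and forwards every call to the custom token’s own bookkeeping, which is exactly the extra hop described above.

```javascript
// Hypothetical custom token with non-standard method names and bookkeeping.
const customToken = {
  balances: { alice: 100, bob: 0 },
  moveUnits(from, to, amount) {
    if ((this.balances[from] || 0) < amount) return false;
    this.balances[from] -= amount;
    this.balances[to] = (this.balances[to] || 0) + amount;
    return true;
  },
};

// Proxy exposing a standard ERC-20-like interface, delegating to customToken.
// On-chain this would be a second contract, and each delegated call costs gas.
const standardProxy = {
  transfer(from, to, amount) {
    return customToken.moveUnits(from, to, amount);
  },
  balanceOf(owner) {
    return customToken.balances[owner] || 0;
  },
};
```

The wallet only ever talks to `standardProxy`; the custom logic stays hidden behind it, at the price of a second deployed contract.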
At the dapp level, every application needs to generate a transaction for a user to sign. A project with a non-standard token could be viewed and interacted with as if it were a standard token type at the application layer. This would only be possible with some adapter module, library, or even a service or API, though the last option is less likely to be trusted. Each application, wallet and exchange would need to upgrade its code base.
The benefit of this approach is that token customization can be performed on-chain in a smart contract, with complexities that cannot be mapped 1–1 using the proxy contract (1) architecture.
For example: a single contract responsible for deploying several unique fungible tokens. Why? Deploying contracts is expensive and there are several use cases that may require thousands of different varieties of fungible tokens.
In order to make this work using the proxy contract approach, one would still need to deploy a new proxy contract for every unique token; no bueno. But if tokens were selectable by users through the application interface, and you provided the application with code to specify the target token, everything goes smoothly and the results are stored on-chain.
Do I smell a token standard for Unique Fungible Tokens (UFTs)? I am aware the name is a bit of an oxymoron. 😂
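As a sketch of what that application-layer code might look like, here is a hypothetical adapter in plain JavaScript. A single multi-token contract keeps balances per integer index, and the adapter closes over the user-selected index so the rest of the wallet code sees an ordinary ERC-20-style interface. All names here are illustrative, not from any real library.

```javascript
// Hypothetical multi-token contract: one contract, many fungible tokens,
// each addressed by an integer index ("which").
const multiToken = {
  balances: { 1: { alice: 50, bob: 0 }, 2: { alice: 10 } },
  transfer(which, from, to, amount) {
    const book = this.balances[which];
    if (!book || (book[from] || 0) < amount) return false;
    book[from] -= amount;
    book[to] = (book[to] || 0) + amount;
    return true;
  },
};

// App-layer adapter: binds the user-selected token index so downstream
// wallet code can make standard two-argument ERC-20-style calls.
function erc20Adapter(contract, which) {
  return {
    transfer: (from, to, amount) => contract.transfer(which, from, to, amount),
    balanceOf: (owner) => (contract.balances[which] || {})[owner] || 0,
  };
}
```

The wallet builds one adapter per token the user selects; the contract itself is deployed only once.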
The drawbacks to this approach are obvious on two major fronts:
Not an actual smart contract… Yellow fields stored once, relatively low cost to the token creator. Green fields are updated with each transfer, by each user with gas costs comparable to ERC-20 transfers.
This is a specific application of (2) to minimize the requirement of a whole new set of functionality and code at the application layer. In fact, you may be able to use only one small extra function in the application layer code for Unique Fungible Tokens (UFTs) to be treated just like ERC-20s. I like that 😉
The master contract would act like a Singleton and maintain all the bookkeeping for all UFTs. You would need an interface in this contract that mirrored the standard ERC-20 fungible token interface except for one small addition: include an index into the unique token you wish to modify balances for. For example, the transfer function of an ERC-20:
function transfer(address to, uint tokens) public returns (bool success);
Becomes:
function transfer(uint256 which, address to, uint tokens) public returns (bool success);
But how do we provide the index if we’re making a standard ERC-20 transaction from the application layer? First we would need to know which token we’re interested in, and as mentioned before that can be provided by the user through the application interface. Perhaps this application only deals with one type of UFT. Now the application layer needs to provide that token index as an argument, but it is only aware of creating ERC-20 transactions for all the tokens it handles. So we use a technique called decorators. Explaining how they work in detail is beyond the scope of this post, but it would go something like this:
Normal ERC-20 methods can be called like so:
const myToken = new web3.eth.Contract(json, address);
myToken.methods.transfer(receiver, amount).send(options);
A decorator “like” function can be used to wrap these methods:
const uniqueToken = new web3.eth.Contract(json, address);
const which = application.state.specificTokenIndex;
UniqueToken.wrap(which, uniqueToken); // custom function
// business as usual for token transfers
uniqueToken.methods.transfer(receiver, amount).send(options);
What’s up with that wrap function?
UniqueToken.wrap = (which, instance) => {
  // Rewrite every method so the token index is prepended to each call
  Object.keys(instance.methods).forEach((name) => {
    const method = instance.methods[name];
    instance.methods[name] = (...args) => method(which, ...args);
  });
};
This assumes every method in the UniqueToken contract used by the application has a prefixed parameter of the integer index for the specific unique token you’re targeting. Likely not the case, but this small bit of code illustrates that it may be possible to create some nice, non-intrusive patterns and code for existing wallets, exchanges and dapps. Allowing the application layer to continue to treat your custom on-chain implementation of a fungible token as though it were a standard ERC-20 token would significantly ease the burden on their code, tests and team. This frees up application layer teams to focus on more important things like user experience.
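To see the decorator pattern in isolation, without web3 or a deployed contract, here is a self-contained sketch against a mock contract instance; every name and return string is illustrative only.

```javascript
// Mock contract instance: every method takes the token index as its first argument.
const instance = {
  methods: {
    transfer: (which, to, amount) => `transfer token ${which}: ${amount} -> ${to}`,
    balanceOf: (which, owner) => `balance of ${owner} in token ${which}`,
  },
};

// Decorator: rewrite each method so callers no longer pass the index themselves.
const wrap = (which, inst) => {
  Object.keys(inst.methods).forEach((name) => {
    const original = inst.methods[name];
    inst.methods[name] = (...args) => original(which, ...args);
  });
};

wrap(7, instance);
// Application code now makes plain ERC-20-style two-argument calls.
const receipt = instance.methods.transfer('0xabc', 100);
```

After wrapping, the application treats the instance exactly like a standard ERC-20 contract object, which is the whole point of the technique.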
The team at Enjin has created the ERC-1155 Crypto Items Token Standard. In this standard, one can create multiple non-fungible and fungible tokens from a single contract, transfer batches of tokens and much more. Amazing applications for gaming. The main motivation to depart completely from ERC-20 and ERC-721 is that legacy code and the storage and gas overhead of the current standards don’t make it easy to spawn tens of thousands of tokens; and I have to agree with them on those points. This has launched a long discussion thread touching on several points I bring up in this post, and others such as nomenclature.
ERC: Crypto Items Token Standard · Issue #1155 · ethereum/EIPs (github.com)
Maciej Górski has also proposed an ERC-1180 Non-Fungible Token State Verification standard for on-chain verification of NFTs that fits nicely with this post as well.
Non-Fungible Token State Verification (for exchanges) · Issue #1180 · ethereum/EIPs (github.com)
ICO marketing “professionals” RE: token standards 😂
I’m not opposed to token standards. In fact, I think that token standards help move the space forward by leaps and bounds. A standard spec with a solid reference and/or working implementation creates an explosion of new projects. We saw this with ERC-20 and we’re seeing it again with ERC-721. A standard also makes it easy for single applications with small teams to represent hundreds of digital assets. However, the process for creating token standards has proven to be slow, unclear and, in some cases, abused for promotion and marketing.
A programmable blockchain like Ethereum provides us with the ability to prove any type of digital scarcity in any way we choose. Crypto, cats, whatever. If avoiding custom implementations of identical axiomatic concepts is the goal, are on-chain standards the best way to achieve this? Perhaps on-chain token standards could be more abstract. For example: defining requirements of fungibility versus defining a Solidity interface like ERC-20.
I do agree and will fight for standard application layer interfaces. I feel these will be extremely useful for dapps to avoid excess burden of disparate implementations, although I don’t see why standard application interfaces must map 1–1 with any particular smart contract interface deployed on the blockchain.
There is another possibility emerging: that token standards will end up looking like OAuth, love it or leave it. Do we know what happens behind the scenes when we log in to another website using Google or Facebook? I can hear the decentralists cringing, but follow for a second. We don’t know the login tech of those companies, or even the details of OAuth. We don’t care, nor does the application wanting to log you in. Websites can leverage a standard functionality, through a common spec and some great libraries, to authenticate you across multiple services, each with its own backend authentication functionality.
Truth be told, no one knows where this is headed. Sooner or later, we’re all going to have to talk about it though 🤗
I started Crypto Composables (ERC-998) as a standard because I saw the wave of projects coming that would need to support this axiomatic functionality. I’m only the facilitator. I’m here to work for the community and synthesize ideas, design and knowledge from my brilliant colleagues. We need to work together to define a world of cryptoassets that can achieve mass adoption in end-user applications, and that requires deciding together, at some level, what that world looks like.
I’ve applied to the Ethereum Foundation (EF) for a small grant to research, interview stakeholders, and codify the token standards process for the entire community. We are still in the early days. I feel that a comprehensive overview and the establishment of a clearer process would do wonders to increase understanding and inclusivity in the exuberant Ethereum token community.
As always, many other brilliant minds can be found in NFTy Magicians chatting away about token standards. Please feel free to join, developers and non-developers alike. We would like to hear from everyone what they think the future of token design and engineering will be.
🧙♂️ https://discord.gg/3TtqP2C 🧙♀️
medium.com/@mattdlockyer
twitter.com/mattdlockyer
linkedin.com/in/mattlockyer