Recently I was speaking about the future of security tokens at a blockchain conference in Europe. During one of the satellite receptions to the event, I was approached by a prominent figure in the crypto world who apparently had been reading some of my articles about security tokens and had developed some very interesting theses about the evolution of the space. A technologist by background, this person was struggling to reconcile the computer science-centric methods of the blockchain space with the semi-centralized, red-tape-first approaches he was seeing in the security token market (I have the same problem, by the way). At some point during our conversation he bluntly asked me: “There is one thing that I still can’t understand about the security token community: What’s the obsession with standards? Isn’t that a bunch of [bleep bleep] at this point [bleep bleep bleep…]?”
The subject of standards in the security token space might be a sensitive one. Obviously, there is a segment of the community that believes in the need for standardization. I tend to subscribe to a different thesis. The security token space is too nascent: it is still missing 99% of the infrastructure required to be a relevant vehicle for securities, and there are simply not enough security tokens issued to make a statistically significant sample. At this stage, standards act more as a constraining force than as a vehicle for innovation. At this point, we simply don’t know enough about how security tokens are going to evolve, and we certainly haven’t encountered any challenges that require standardization. In my opinion, “standards without applications are figments of hallucination.”
Technological history shows us that the best standards evolve from innovation and competition between market participants. The goal of standards is to solve interoperability and portability challenges in ways that streamline the adoption of a technology segment. Unfortunately, standards often become a vehicle for bureaucracy, and a way for technology vendors to project fake thought-leadership positions by creating the “rules of the game”.
To illustrate this point, let’s take two examples of different approaches to standardization in recent technology movements.
A few years ago, service-oriented architecture (SOA) was positioned as an architecture style that could finally solve interoperability between systems in the enterprise by using an artifact known as web services. From the get-go, SOA triggered an intense rivalry between technology incumbents like Microsoft, IBM, Oracle, Tibco, SAP and many others. Even before SOA had achieved any meaningful traction, the vendors established two standards known as the Simple Object Access Protocol (which was neither simple nor a protocol) and the Web Service Description Language (WSDL). That was just the beginning: committees in organizations like W3C and OASIS started pushing new web services standards for every single aspect, from basic communication to sophisticated security. The WS-* protocols introduced incredible levels of complexity, to the point that it was impossible for their own creators to implement them. The end result was that the entire industry shifted to simpler approaches like representational state transfer (REST), which rely on universal internet protocols such as HTTP instead of committee-designed standards.
A great example of how standards should evolve organically from competition is the browser you are using to read this blog. For decades, browsers have been at the center of intense battles between companies such as Microsoft, Netscape, Opera, Google, Apple and many others. That intense competition led consumers to use different browsers, which forced the need for interoperability. As a result, the best technologies in the space, such as HTML5 or Google’s V8, became widely adopted across the entire ecosystem.
Bringing some of the lessons from key technology movements into security tokens, we can identify some of the core principles behind good and bad standardization.
In the context of security tokens, standards should focus less on the structure of the tokens and more on the areas of the market that require interactions between different participants. Here are five examples of areas that I think are better suited for standardization than others:
· Integration with Exchanges: Aspects such as listing, transferring and notifications are good candidates for standardization, as they require interoperability between different players. Additionally, security token exchanges are building on years of research in crypto exchanges, which makes this a somewhat more mature scenario in which to adopt standards.
· Disclosures: Protocols for publishing and disclosing material public information about security tokens are another area in which standards can be relevant. Universal protocols for disclosures that can be integrated into exchanges or token issuance platforms are desperately needed in the security token space.
· On-Chain Compliance: I believe many regulatory aspects of security tokens will be expressed in some form of on-chain protocol and smart contracts. Given the extensive regulatory knowledge in financial markets, this can be another solid area for standardization.
· Liquidity: Arguably the biggest challenge in the security token market, liquidity mechanics must be somewhat present at the protocol level and not rely solely on market interactions. Some of the ideas from protocols like Bancor could be adapted to security tokens in the form of standards that can be incorporated at the token and exchange level.
· Ownership: Security tokens are ultimately about expressing ownership claims. While the blockchain provides all the necessary building blocks for expressing legally-viable ownership constructs, standards in this area might help to streamline the adoption of security tokens.
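To make the on-chain compliance idea above a bit more concrete, here is a minimal sketch of what a standardized transfer-restriction check might look like. This is purely illustrative: the `ComplianceRegistry` class, its whitelist/lockup rules and all names are my own assumptions, not any existing security token standard, and a real implementation would live in a smart contract rather than in Python.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an on-chain compliance rule set. All names and
# rules here are illustrative assumptions, not a real standard.

@dataclass
class ComplianceRegistry:
    # investor address -> True once the investor has passed KYC/accreditation
    whitelist: dict = field(default_factory=dict)
    # investor address -> timestamp until which their tokens are locked up
    lockup_until: dict = field(default_factory=dict)

    def can_transfer(self, sender: str, receiver: str, now: int) -> bool:
        """Return True if a transfer would comply with the encoded rules."""
        if not self.whitelist.get(receiver, False):
            return False  # receiver has not passed KYC/accreditation
        if now < self.lockup_until.get(sender, 0):
            return False  # sender's tokens are still inside the lockup period
        return True

registry = ComplianceRegistry()
registry.whitelist["0xBob"] = True
registry.lockup_until["0xAlice"] = 1_700_000_000

print(registry.can_transfer("0xAlice", "0xBob", now=1_600_000_000))  # False: lockup active
print(registry.can_transfer("0xAlice", "0xBob", now=1_800_000_000))  # True
```

The point of standardizing something like `can_transfer` is that exchanges and issuance platforms could query any token's compliance rules through one uniform interface, without knowing each issuer's specific regulatory logic.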
Other areas, such as privacy, dividend distribution or governance, are likely to become relevant targets for security token standards as the space evolves.
The subject of standards is likely to continue being a passionate area of debate in the security token space. If the security token market is successful, standards will evolve organically and the areas of standardization will become painfully visible. At this moment, we don’t need standards, we need more and better security tokens. After all, a security token standard that hasn’t been implemented in any security tokens is the definition of an oxymoron.