Cryptocurrency has come a long way since Satoshi’s revolutionary Bitcoin whitepaper. The second generation of blockchains has extended the possibilities of the technology far beyond the trustless exchange of digital cash, allowing for the creation of tokens, smart contracts, and the execution of complex computations on distributed ledgers.
Despite the excitement around blockchains, the technology in its current state is far from perfect. The sheer complexity of these networks means that securing transaction records under the Proof-of-Work model demands a titanic amount of energy. On top of this, arguments over how to scale the number of transactions per second have polarised the blockchain space for years.
At present, on-chain throughput caps at around 7 transactions per second for Bitcoin and 10 for Ethereum. This poses a huge problem for the processing requirements of real-world applications. One need only look at the 2017 example of CryptoKitties, a (relatively simple) decentralised application whose huge popularity caused severe congestion on the Ethereum network.
Every day, it seems that a new whitepaper characterised by fancy graphics and a vague vapourware project hits the internet.
By now, seasoned crypto veterans have grown tired of ICO projects claiming to have solved the scaling debate, without anything to show for it.
Enter Zilliqa: a blockchain project whose team makes the same claims, but actually backs them up.
Zilliqa made the news in late 2017, when its testnet managed a staggering 2488 transactions per second. The National University of Singapore researchers who make up its team rely on sharding to make this possible. As the name might indicate, sharding involves breaking down tasks into smaller 'shards', before later reassembling them. In this case, the network splits participating machines into sub-committees, which work in parallel to process micro-blocks that are then aggregated into a full block, making for much higher throughput.
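The sharding idea described above can be sketched in a few lines of Python. This is purely illustrative: the shard-assignment rule, the micro-block structure and the shard count below are toy assumptions, not Zilliqa's actual protocol.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

NUM_SHARDS = 4  # illustrative; real networks negotiate committee sizes dynamically

def assign_shard(tx: str) -> int:
    """Deterministically map a transaction to a shard by hashing it."""
    return hashlib.sha256(tx.encode()).digest()[0] % NUM_SHARDS

def process_shard(txs: list) -> dict:
    """A sub-committee validates its transactions and emits a micro-block."""
    return {"tx_count": len(txs), "txs": sorted(txs)}

def form_full_block(transactions: list) -> dict:
    # Partition the incoming transactions across the shards.
    shards = {i: [] for i in range(NUM_SHARDS)}
    for tx in transactions:
        shards[assign_shard(tx)].append(tx)
    # Each sub-committee works in parallel on its own micro-block.
    with ThreadPoolExecutor(max_workers=NUM_SHARDS) as pool:
        micro_blocks = list(pool.map(process_shard, shards.values()))
    # Aggregate the micro-blocks into the final full block.
    return {
        "micro_blocks": micro_blocks,
        "total_txs": sum(mb["tx_count"] for mb in micro_blocks),
    }

block = form_full_block([f"tx-{i}" for i in range(20)])
print(block["total_txs"])  # all 20 transactions end up in the full block
```

Because the shards validate their partitions independently, throughput scales roughly with the number of sub-committees, which is the core of Zilliqa's scaling claim.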
What’s even more promising is the possibility of mining on the platform, which will open up when the mainnet launches in Q3 of 2018. Zilliqa’s revised mining model promises to appeal to those running GPU rigs. The team is conscious of the massive energy consumption of other blockchain protocols, and instead uses a Proof-of-Work scheme exclusively to establish new mining identities on the network (so as to deter Sybil attacks). This identity-establishment round employs the Ethash algorithm.
In a blog post, the team estimates that GPU cards will need to run at full capacity for only 12 hours a month, idling for the remainder of the time. Practical Byzantine Fault Tolerance, or pBFT, is used in conjunction with this approach to reach consensus. This hybrid approach is expected to cut electricity costs to roughly a tenth of those associated with Ethereum mining.
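A minimal sketch of this hybrid scheme: a one-off Proof-of-Work puzzle establishes a node's identity (deterring Sybil attacks), after which agreement proceeds by pBFT-style voting with no further mining. The toy SHA-256 puzzle below stands in for Ethash, and the difficulty and vote counts are illustrative assumptions.

```python
import hashlib

DIFFICULTY_PREFIX = "00"  # toy difficulty: hash must start with two zero hex digits

def solve_identity_pow(node_id: str) -> int:
    """Find a nonce meeting the difficulty target: a one-off cost to join the network."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{node_id}:{nonce}".encode()).hexdigest()
        if h.startswith(DIFFICULTY_PREFIX):
            return nonce
        nonce += 1

def pbft_commit(votes_for: int, total_nodes: int) -> bool:
    """pBFT tolerates f faulty nodes out of 3f + 1, so a commit needs > 2/3 agreement."""
    return 3 * votes_for > 2 * total_nodes

# A node mines once to register its identity...
nonce = solve_identity_pow("node-42")
digest = hashlib.sha256(f"node-42:{nonce}".encode()).hexdigest()
assert digest.startswith(DIFFICULTY_PREFIX)

# ...then consensus is reached by voting, at negligible energy cost.
print(pbft_commit(votes_for=7, total_nodes=10))  # True: more than 2/3 agree
print(pbft_commit(votes_for=6, total_nodes=10))  # False: below the threshold
```

Since the expensive hashing happens only when identities are established, the energy bill is dominated by those brief mining windows, which is where the claimed 12-hours-a-month figure comes from.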
While scalability of the Zilliqa protocol could lend itself to applications in heavily-used utility tokens or currency transfer, the team hopes to carve out a niche in scientific computing and other applications requiring high throughput. In fact, the name Zilliqa (a play on ‘silica’) lends itself to the idea that, just as silicon provides the architecture underlying computers, Zilliqa aims to provide the architecture for the next generation of consensus-based computing.
Applications suited to execution on the Zilliqa blockchain are those requiring high throughput: high-volume tasks such as training neural networks, scientific computing and parallel auctions are believed to be ideal fits for the protocol.
The technical FAQ on the website also references the DAO and Parity hacks, catastrophes the team hopes to avoid by creating a new smart-contract language whose security properties can be formally proven.
The Zilliqa team is committed to ongoing research and development, and has a busy roadmap outlined for the future: investigating a secure Proof-of-Stake mechanism (a technology still in its infancy), pruning the blockchain so as to lessen the storage burden, and integrating functionality that better protects the privacy of data processed by contracts (e.g., sensitive business information or protected scientific research). In line with its open-source ethos, the team does not view similar projects as ‘competition’, and instead hopes to complement other blockchains wherever possible; one proposed method of achieving this is cross-chain support for maximum interoperability.
It’s refreshing to see an approach such as Zilliqa’s which, instead of competing with other blockchains, is committed to improving the ecosystem. Its colossal transactions-per-second rate and successful use of sharding already place it years ahead of similar projects in the space, and as more nodes join the mainnet, it may only be a matter of time before Zilliqa eclipses slower blockchains altogether.