Blockchain Scalability: Do Layer I Solutions Hold the Key?

Written by 1nst1tute | Published 2018/11/07
Tech Story Tags: blockchain | cryptocurrency | cryptography | scaling | ethereum

Cryptocurrencies currently do not scale in any meaningful sense. The scalability of blockchain technology — in both total transaction volume and the number of independent participants involved in processing them — is a crucial hurdle to mainstream adoption. This is especially true when weighed against security and decentralisation challenges.

Visa handles around 24,000 transactions per second (TPS) at capacity, and needless to say, leaders like Bitcoin and Ethereum fall well short of that figure. Many other projects — EOS and Zilliqa, for example — claim global-scale TPS, but those claims remain largely untested.

Many methods for scaling have been proposed, whether layer I solutions that operate ‘on-chain’ or layer II solutions built ‘off-chain’ on top of the blockchain. There are many differing viewpoints on how best to deploy scalability measures, and these conversations often end up clouded by technical complexity or tribalism between projects.

Layer I vs Layer II

As layer II solutions like Lightning, Plasma, Raiden and Sprites are being feverishly developed, layer I solutions continue to play an essential role in the evolution of blockchains.

As Patrick McCorry, Assistant Professor at King’s College London and the UK’s first cryptocurrency PhD, puts it, “the blockchain (alongside a central operator) provides a beautiful means for parties to verify the central operator is not cheating. I envision lots of blockchains which are hopefully bootstrapped from popular blockchains. I’m a believer in layer II, but it won’t work without layer I.”

The future is likely not a winner-takes-all scenario; rather, blockchains will employ a variety of complementary fixes to perform at scale. But there’s so much activity in the space right now that it’s easy to get ahead of ourselves and look past layer I — regardless of how fundamental it may be.

What role will layer I solutions play in the future? As McCorry sees it, “layer II protocols like Lightning, Raiden, Sprites, Pisa, Perun, Counterfactual, Plasma, etc. should be seen as ‘optimistic protocols’. If everyone co-operates (or in the case of Plasma; if the central operator co-operates), then everything can be executed locally between the interested parties and not every transaction needs to be sent to the network.”

The optimism implied in that statement hinges on whether or not everyone co-operates, which, McCorry acknowledges, may not be the case. “However, if one party does not cooperate,” he says, “then the application being executed in the Layer II solution (i.e. payments, gaming, etc.) must resort back to the blockchain. Thus Layer II is ultimately restricted by the scalability of Layer I — so it is crucial both are solved in parallel.”
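
To make the ‘optimistic’ pattern concrete, here is a minimal payment-channel sketch in Python. Everything below is illustrative — the class names, the dispute rule — rather than any real protocol’s API: the two parties update a shared balance off-chain, and layer I is only consulted to open the channel or to adjudicate the highest-numbered state in a dispute.

```python
# Sketch of an off-chain payment channel: both parties co-sign each state
# update locally (signature checks omitted for brevity), and only the final
# or disputed state ever touches layer I.

from dataclasses import dataclass

@dataclass
class ChannelState:
    nonce: int        # version number; a higher nonce supersedes a lower one
    balance_a: int    # Alice's share of the locked deposit
    balance_b: int    # Bob's share of the locked deposit

class PaymentChannel:
    def __init__(self, deposit_a: int, deposit_b: int):
        # Opening the channel is the first (ideally only) layer I transaction.
        self.state = ChannelState(0, deposit_a, deposit_b)

    def pay_a_to_b(self, amount: int) -> ChannelState:
        # Off-chain update: no blockchain interaction at all.
        assert 0 < amount <= self.state.balance_a, "insufficient balance"
        self.state = ChannelState(
            self.state.nonce + 1,
            self.state.balance_a - amount,
            self.state.balance_b + amount,
        )
        return self.state

    def settle_on_chain(self, claimed: ChannelState) -> ChannelState:
        # Dispute rule: the chain honours the highest-nonce state it sees,
        # so replaying an old, more favourable balance fails.
        return claimed if claimed.nonce >= self.state.nonce else self.state

channel = PaymentChannel(deposit_a=100, deposit_b=100)
channel.pay_a_to_b(30)                         # off-chain
channel.pay_a_to_b(20)                         # off-chain
print(channel.settle_on_chain(channel.state))  # one layer I settlement
```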

The Quest for Throughput

Bitcoin’s throughput fluctuates, but in its current form the network doesn’t support more than about 7 TPS, and Ethereum is not much better at around 14 TPS. If the endgame is to create a decentralised global financial system, figuring out how to significantly increase those numbers is clearly of utmost importance.
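
Those figures fall out of simple arithmetic. Here is a back-of-envelope estimate using rough, public ballpark numbers rather than exact protocol constants:

```python
# Back-of-envelope throughput for a Bitcoin-like chain, using rough
# ballpark figures rather than exact protocol constants.

block_size_bytes = 1_000_000   # ~1 MB block size limit
avg_tx_bytes = 250             # typical transaction size
block_interval_s = 600         # one block every ten minutes on average

tx_per_block = block_size_bytes // avg_tx_bytes
tps = tx_per_block / block_interval_s
print(f"{tx_per_block} tx/block -> {tps:.1f} TPS")  # ~6.7, i.e. the familiar ~7
```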

“One approach is to increase the block size of Bitcoin and Ethereum, but this isn’t good enough,” McCorry says. “At scale, this will simply break down once propagating a block across the network takes longer than creating a block.

“We need new blockchain and consensus protocols that can speed up the time it takes to ‘confirm’ a block and we need to evaluate simple questions like is it more efficient to confirm a single block at a time or a branch of blocks?”

There are other fundamental questions to ask, too: should each peer validate every transaction on the network, or can the work of validation be distributed, for example?
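
McCorry’s propagation argument can also be sketched numerically. The effective gossip bandwidth below is an assumption chosen purely for illustration:

```python
# Why simply raising the block size breaks down: once a block takes longer
# to propagate than to create, nodes fall out of sync and forks multiply.
# The 1 MB/s effective gossip bandwidth is an illustrative assumption.

block_interval_s = 600              # ten-minute block interval
bandwidth_bytes_per_s = 1_000_000   # assumed effective propagation rate

for block_size_mb in (1, 100, 1_000):
    propagation_s = block_size_mb * 1_000_000 / bandwidth_bytes_per_s
    verdict = "fine" if propagation_s < block_interval_s else "breakdown"
    print(f"{block_size_mb:>5} MB block: ~{propagation_s:>6.0f}s to propagate ({verdict})")
```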

A Multitude of Solutions

The good news is that some of the brightest minds in tech are working on the blockchain scalability trilemma — how to increase throughput without sacrificing security or decentralisation.

If you look at the literature, one of the leading ideas for blockchain scaling is the notion of sharding: different subsets of nodes handle different parts of the blockchain, thereby reducing the work of each node.

Sharding itself is not a new concept, but partitioning data to reduce the amount a blockchain node must store and process is a new application of it.
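
A minimal sketch of that partitioning idea, assuming a generic hash-based assignment rule (no specific project’s design): each account maps deterministically to a shard, and a node responsible for one shard can ignore the rest.

```python
# Hash-based shard assignment: every node computes the same mapping, and a
# node serving one shard skips transactions that belong to the others.

import hashlib

NUM_SHARDS = 4

def shard_of(account: str) -> int:
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

transactions = [("alice", "bob", 5), ("carol", "dave", 9), ("erin", "frank", 2)]
my_shard = 1  # this node is only responsible for shard 1

for sender, recipient, amount in transactions:
    if shard_of(sender) == my_shard:
        print(f"processing {sender} -> {recipient}: {amount}")
    else:
        print(f"skipping {sender}'s tx (belongs to shard {shard_of(sender)})")
# Cross-shard transactions are the hard part; OmniLedger and Chainspace,
# discussed below, each propose a protocol for them.
```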

Sharding

Ethereum has shown the potential to be a functional, open-source, truly decentralised system, although the scalability bottleneck appears to be a serious existential threat. Sharding is currently being explored in a variety of novel forms, perhaps most notably by Vlad Zamfir, an Ethereum Foundation developer and one of the minds behind the Casper protocol upgrade, who contends that sharding is the only true blockchain scaling solution.

Ethereum is, of course, tackling the scalability problem from multiple angles — a long-discussed move to proof-of-stake (Beacon Chain, Casper FFG), Plasma, Plasma Cash, state channels and eWASM all come to mind. PoS consensus, in particular, could speed up the chain by replacing the energy-intensive mining race with validators who are chosen in proportion to their stake.
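
As a rough illustration of stake-weighted selection (a generic toy, not Casper’s actual validator or fork-choice rules), the next proposer can be drawn with probability proportional to stake rather than hashing power:

```python
# Toy stake-weighted proposer selection: the chance of proposing the next
# block is proportional to stake, not to hashing power. Seeded so that every
# honest node derives the same proposer for a given slot.

import random

stakes = {"validator_a": 50, "validator_b": 30, "validator_c": 20}

def pick_proposer(stakes: dict, slot: int) -> str:
    rng = random.Random(slot)  # a real protocol uses unbiasable randomness
    validators = list(stakes)
    return rng.choices(validators, weights=[stakes[v] for v in validators])[0]

for slot in range(5):
    print(f"slot {slot}: proposer = {pick_proposer(stakes, slot)}")
```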

OmniLedger

Another recently introduced solution is OmniLedger, a secure, scale-out decentralised ledger. OmniLedger claims to be one of the first to achieve ‘Visa-level’ throughput with latency of seconds, all while preserving full decentralisation and protecting against a Byzantine adversary.

OmniLedger achieves this by, to use their words, “splitting the state into multiple shards and using distributed randomness to assign validators securely.” To maintain consistency across shards, OmniLedger proposes that validators use both a novel parallel consensus algorithm and an atomic commit protocol for cross-shard transactions.
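
The reshuffling idea can be sketched in a few lines. Note that this stand-in derives its seed from a single hash, whereas OmniLedger obtains unbiasable randomness from a distributed protocol (RandHound), which is considerably more involved:

```python
# Epoch reshuffling from a shared seed: every node computes the same shard
# assignment, and an adversary cannot steer its validators into one shard.
# The single-hash seed below is a placeholder for OmniLedger's distributed
# randomness protocol (RandHound).

import hashlib
import random

validators = [f"validator_{i}" for i in range(12)]
NUM_SHARDS = 3

def assign_shards(validators: list, num_shards: int, epoch_seed: bytes) -> dict:
    seed = int.from_bytes(hashlib.sha256(epoch_seed).digest(), "big")
    shuffled = validators[:]
    random.Random(seed).shuffle(shuffled)  # deterministic, seed-keyed shuffle
    return {s: shuffled[s::num_shards] for s in range(num_shards)}

for shard, members in assign_shards(validators, NUM_SHARDS, b"epoch-42").items():
    print(f"shard {shard}: {members}")
```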

Chainspace

Running under the tagline ‘decentralised infrastructure on a planetary scale’, Chainspace was developed by Mustafa Al-Bassam and a team of researchers at University College London. Their creation is a decentralised platform that supports smart contracts and executes user-supplied transactions on their objects.

Chainspace approaches scalability in a new way — it offers “high throughput and low latency via a fast, two-phase Sharded Byzantine Atomic Commit protocol (S-BAC), a distributed commit protocol, to guarantee consistency.” It also features a leaderless consensus protocol which provides fast finality for operations within each shard.
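
The general two-phase, cross-shard atomic commit pattern that S-BAC instantiates looks roughly like the skeleton below; Chainspace’s real protocol additionally runs Byzantine fault-tolerant consensus inside each shard, which is elided here:

```python
# Skeleton of a two-phase cross-shard atomic commit: every shard votes on its
# own inputs (prepare), and the transaction commits only on unanimity.

class Shard:
    def __init__(self, name: str, unspent: set):
        self.name = name
        self.unspent = set(unspent)

    def prepare(self, tx: dict) -> bool:
        # Phase 1 vote: accept iff every input this shard owns is unspent.
        mine = [o for o, shard in tx["inputs"].items() if shard == self.name]
        return all(o in self.unspent for o in mine)

    def finalize(self, tx: dict, decision: str) -> None:
        # Phase 2: consume inputs on commit; an abort leaves state untouched.
        if decision == "commit":
            for o, shard in tx["inputs"].items():
                if shard == self.name:
                    self.unspent.discard(o)

def cross_shard_commit(tx: dict, shards: list) -> str:
    votes = [s.prepare(tx) for s in shards]
    decision = "commit" if all(votes) else "abort"
    for s in shards:
        s.finalize(tx, decision)
    return decision

s1, s2 = Shard("shard-1", {"coin-A"}), Shard("shard-2", {"coin-B"})
tx = {"inputs": {"coin-A": "shard-1", "coin-B": "shard-2"}}
print(cross_shard_commit(tx, [s1, s2]))  # commit: both shards voted yes
print(cross_shard_commit(tx, [s1, s2]))  # abort: coin-A is already spent
```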

Polyshard

Created by Mingchao Yu and a team of researchers at the University of Southern California, Polyshard is a new protocol for coded storage and computation in blockchains. In their words, “Polyshard is a polynomially coded sharding scheme that achieves information-theoretic upper bounds on the efficiency of the storage, system throughput, as well as on trust, thus enabling a truly scalable system.”

In a recently released paper, Yu et al. argue that many sharding proposals fail to scale efficiently because they compromise on trust, and explain how Polyshard avoids that trade-off. The paper also includes some fascinating simulations that numerically demonstrate the protocol’s performance improvements.
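
The storage half of that intuition can be illustrated with Lagrange interpolation over a prime field: nodes hold coded evaluations of a polynomial through the shard data instead of raw shards, and any k of n coded copies reconstruct everything. This is a toy for intuition only; Polyshard itself codes computation as well as storage:

```python
# Toy coded storage via Lagrange interpolation over a prime field: n nodes
# hold polynomial evaluations instead of raw shards, and any k of them
# suffice to reconstruct all k original shard values.

P = 2_147_483_647  # prime modulus, so interpolation is exact

def lagrange_eval(points, x):
    # Evaluate the unique degree-(k-1) polynomial through `points` at x, mod P.
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

k = 3
data = [11, 5, 42]                              # k raw shard values at x = 1..k
data_points = list(zip(range(1, k + 1), data))

# Each of six nodes stores one coded evaluation rather than a raw shard.
coded = [(x, lagrange_eval(data_points, x)) for x in range(k + 1, k + 7)]

# Any k coded points reconstruct every original shard value.
recovered = [lagrange_eval(coded[:k], x) for x in range(1, k + 1)]
print(recovered)  # [11, 5, 42]
```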

Final Thoughts

The reason many blockchains cannot scale is that their underlying storage and computation model relies on full replication: every network node must store the entire blockchain and re-execute every computation. While this makes the likes of Bitcoin highly secure, it’s also responsible for the current scalability predicament.

Scaling is arguably the biggest challenge for the current generation of blockchains, and despite the hype of the last few years, more time is needed to iron out the wrinkles and let some of these solutions mature.

A View from the Inside

This November, Binary District and the Research Institute are co-hosting a layer I scaling-focused Master Workshop in Amsterdam. Presenting at the workshop will be those pushing layer I solutions forward, with experts from the likes of Chainspace and OmniLedger on hand to share their work and their vision for the future of blockchain.

This is a chance to dig into the core issues of blockchain scalability and an opportunity to learn from and collaborate with those pushing the boundaries in this vital field. Blockchain technology is developing at a dizzying rate, and being an active part of the community is the best way to keep up.

