Why blockchains don’t scale
While blockchain has only sprung to mainstream attention in recent months, actual activity across the various blockchain protocols has remained relatively low. However, they are already creaking under the pressure, with networks such as Bitcoin and Ethereum regularly suffering from slow transactions and high fees.
While different groups have proposed a variety of solutions to scale the networks, there is one underlying flaw: blockchain simply can’t scale to the levels required if it is to act as the backbone of the financial system, let alone the multitude of wider use cases it is currently attempting to tackle. If it can’t scale to meet this anticipated demand, then the unspoken truth is that the technology is unfit for its purpose.
Blockchain’s overriding achievement is that it solves the issue of how to guarantee trust between two unrelated parties without the need for a centralized third party. This is what it was designed to do. Blockchain does not, however, have a magic bullet when it comes to processing transactions.
PayPal can handle around 200 transactions per second (tps). VISA considers itself able to process 56,000 tps, although its average daily load is closer to 2,000. Bitcoin is capable of ~7 tps, Ethereum 15–20 (lower if tokens rather than ETH are being transferred). This puts the networks well below the mere replication of one payment processor’s output, leaving the dream of mass adoption (which will require not just tens but hundreds of thousands of tps) well out of reach.
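To put those figures in perspective, a quick back-of-envelope comparison (using only the approximate numbers quoted above) shows the size of the gap:

```python
# Throughput figures quoted above (approximate, illustrative only).
THROUGHPUT_TPS = {
    "PayPal": 200,
    "VISA (peak)": 56_000,
    "VISA (typical)": 2_000,
    "Bitcoin": 7,
    "Ethereum": 15,
}

# How many times over each blockchain would need to scale just to
# match VISA's typical daily load.
target = THROUGHPUT_TPS["VISA (typical)"]
for name in ("Bitcoin", "Ethereum"):
    factor = target / THROUGHPUT_TPS[name]
    print(f"{name} must scale ~{factor:.0f}x to match VISA's typical load")
```

Bitcoin would need to scale by a factor of roughly 286, and Ethereum by roughly 133, merely to keep pace with VISA’s typical load – before mass adoption is even considered.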
There are some blockchains which have claimed to reach into the tens of thousands of tps – but these are without exception private tests on optimized and ideally located hardware, a setup unrepresentative of real-world scenarios where you do not have control over the processing power of nodes or their location.
It is important to understand why the protocols are so slow. It is not because of an inherent technical ceiling (Bitcoin’s theoretical limit and its performance on private chains, although still well below VISA’s peak capabilities, are far higher); the constraint is the result of a deliberate design choice – to prioritize a decentralized network. Every node processes each and every transaction, keeping a record of the entire blockchain. These transactions form blocks, the blocks are processed to become part of the new chain, and nodes then use this as their local copy of the global state to stay in sync with the overall network.
As transaction volumes increase, the size of each block increases too, meaning that it takes longer for nodes to receive the information. Over time, this causes consistency issues, as some nodes are more up to date than others. This can then cause invalid blocks to be produced, an issue that becomes crucial during times of high loads, as nodes download and discard successive blocks.
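The propagation problem can be sketched with a toy model. The hop count and bandwidth below are illustrative assumptions, not measured figures; the point is only that the delay grows linearly with block size, while the window for producing a valid next block does not:

```python
# Toy model of block propagation (illustrative assumptions): a block
# must travel several gossip "hops" to reach the network edge, and each
# hop forwards the full block over a link of fixed bandwidth.

def propagation_seconds(block_bytes: int, hops: int = 6,
                        bandwidth_bps: int = 10_000_000) -> float:
    """Seconds for a block to reach the network edge, ignoring latency."""
    return hops * (block_bytes * 8) / bandwidth_bps

for size_mb in (1, 8, 100):
    t = propagation_seconds(size_mb * 1_000_000)
    print(f"{size_mb:>3} MB block: ~{t:.1f}s to propagate")
```

Nodes that receive a block late are, for that interval, working from a stale view of the chain – which is exactly how invalid and discarded blocks arise under load.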
This is the trade-off that comes as a result of blockchain and decentralization. It brings about a trilemma between decentralization, security, and scalability. You can have two – but not all three.
All blockchain scaling solutions – including block size increases, sharding, off-chain channels, and protocol-specific solutions like the Lightning Network – must be evaluated against three criteria:
- Does the network remain decentralized?
- Does the network remain secure?
- Will it allow, or at least prepare, the network to scale to levels required for mass adoption?
Increasing the block size
One of the more publicised attempts at scaling is the block size debate, which ultimately led to the Bitcoin hard fork that created Bitcoin Cash in August 2017. Simply put, by increasing the block size you are able to process more transactions in the same timeframe.
Bitcoin has been at the heart of a battle over block sizes, but if we are looking to prepare for a future in which blockchain is at the heart of ‘Web 3.0,’ then these debates are irrelevant. An average transaction size of 500 bytes at 2,000 tps works out to 1MB of transaction data per second – blocks approaching a gigabyte on a ten-minute interval. There is no feasible block size that can process the amount of information that will be needed without running into the same problem: at a certain point block sizes become too large to maintain a decentralized network.
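The arithmetic above can be checked directly, assuming a Bitcoin-style ten-minute block interval:

```python
# Back-of-envelope block sizes, assuming 500-byte transactions at
# 2,000 tps and a ten-minute (600s) Bitcoin-style block interval.
tx_bytes = 500
tps = 2_000
block_interval_s = 600

block_bytes = tx_bytes * tps * block_interval_s
print(f"Block size at 2,000 tps: {block_bytes / 1e9:.1f} GB per block")

# The same sum at a mass-adoption load of 100,000 tps.
mass_adoption_bytes = tx_bytes * 100_000 * block_interval_s
print(f"Block size at 100,000 tps: {mass_adoption_bytes / 1e9:.0f} GB per block")
```

At mass-adoption throughput, every node would need to download, verify, and store tens of gigabytes every ten minutes – far beyond what an ordinary participant can sustain.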
As block sizes increase, so do the requirements (storage, processing power, bandwidth) for participating. This leads to the risk of centralisation as fewer and fewer nodes are able to take part. Furthermore, the network can only move as fast as its slowest constituent nodes – when blocks get past a certain size, it takes too long for them to propagate through all nodes on the network. This then leads to a backlog of blocks, and the system grinds to a halt.
Increasing block sizes will increase the feasible tps, but only to a limit – a limit that will be met sooner rather than later. It is therefore imperative that other solutions are found.
Sharding
Sharding separates the overall blockchain into different shards, with these respective parts spread across different nodes. The easiest way to visualize it is splitting one island into a hundred different islands. Each island must still conform to the same rules as before – but each is only responsible for governing its own small island, rather than the previous larger unified one.
Sharding brings benefits when it comes to scaling, namely that it is far less intensive to process 1/100th of the blockchain than it is to process the entire chain. It also introduces a new set of problems. One is that the double-spend problem changes, as there are now a number of chains operating simultaneously that must be reconciled in order to avoid a bad actor spending the same asset twice.
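A minimal sketch of hash-based shard assignment (the account names and shard count are illustrative, not drawn from any particular protocol) shows where the reconciliation problem comes from:

```python
import hashlib

NUM_SHARDS = 100

def shard_for(account: str) -> int:
    """Deterministically assign an account to a shard by hashing its id."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Each shard only sees transactions for its own accounts...
print("alice lives on shard", shard_for("alice"))
print("bob lives on shard", shard_for("bob"))

# ...so a payment from alice to bob may span two shards, and those
# shards must be reconciled before alice's coin can be considered
# spent exactly once across the whole network.
cross_shard = shard_for("alice") != shard_for("bob")
print("cross-shard coordination needed:", cross_shard)
```

No single shard can rule out a double spend on its own; any cross-shard transfer forces shards to coordinate, which is precisely the security burden the next paragraphs describe.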
Therefore, sharding requires the creation of a network with the same level of security as before, despite nodes now processing a much smaller set of transactions. All child chains must still have complete transparency over the wider network. Nodes still have to agree on all the transactions being processed but they now have to trust other nodes given the network is split up.
As the network is sharded, the computational power required to attack any single child chain drops significantly, increasing the chances of a 51% attack. The switch from Proof of Work to Proof of Stake is designed to aid with sharding and thus scaling, but this will likely lead to pooled groups and the same issue as miner centralisation.
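The security dilution is simple arithmetic. The unit figures below are arbitrary; the point is the division:

```python
# Illustrative arithmetic: splitting one chain into k shards also
# splits the honest hash power (or stake) securing each shard.
total_hashpower = 100.0   # arbitrary units securing the whole network
num_shards = 100

per_shard_power = total_hashpower / num_shards
attack_threshold = per_shard_power / 2  # need >50% of one shard's power

print(f"Power securing each shard: {per_shard_power} units")
print(f"Power needed to attack one shard: just over {attack_threshold} units")
# Versus the unsharded chain, where an attacker needs just over 50 units.
```

In this toy split, attacking a single shard takes roughly 1/100th of the power needed to attack the unsharded chain – unless the protocol adds machinery (random validator assignment, cross-shard checks) to restore the missing security.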
Crucially, this also has to be achieved without introducing any form of preferential or dominant ‘master’ nodes, which would increase centralisation on the network. There are a number of different proposals for sharding, but this trade-off has thus far proved impossible to reconcile with scalability.
Offchain state channels
Offchain state channels are a means through which certain blockchain interactions are no longer conducted on the blockchain but rather are conducted ‘offline’. This works by essentially allowing two (or more) participants to lock a part of the blockchain state as a ‘state channel’. These participants can still make changes/transactions between themselves to the state channel in the same manner as usual and then, once they have concluded business, the participants submit the state channel back to the blockchain. The blockchain then updates with the new data and the state is unlocked.
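A toy sketch of a two-party channel (simplified assumptions throughout: no signatures, no dispute resolution, and a plain Python object standing in for the chain) shows why throughput rises and fees fall – only the opening lock and the final settlement ever touch the chain:

```python
# Toy two-party state channel (illustrative only): participants lock
# funds on-chain, transact freely off-chain, then settle once.

class StateChannel:
    def __init__(self, alice_deposit: int, bob_deposit: int):
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.onchain_txs = 1  # one on-chain transaction locks the funds

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        """Off-chain update: both parties simply agree on new balances."""
        assert self.balances[sender] >= amount, "insufficient funds"
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def settle(self) -> dict:
        """Submit the final state back to the chain and unlock it."""
        self.onchain_txs += 1
        return dict(self.balances)

channel = StateChannel(alice_deposit=100, bob_deposit=100)
for _ in range(1_000):           # two thousand payments back and forth...
    channel.pay("alice", "bob", 1)
    channel.pay("bob", "alice", 1)
final = channel.settle()
print(final, "- settled with", channel.onchain_txs, "on-chain transactions")
```

Two thousand off-chain payments here cost only two on-chain transactions – which is the entire appeal of the approach, before its routing problems are considered.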
This allows for faster transactions (off the blockchain they do not require the same processing power for verification) and for lower fees (as all transactions bar the first and last happen off the blockchain). The Lightning Network is an example of this. Unfortunately this solution again falls down in maintaining a decentralised network, as it is vulnerable to economic censorship. Whereas with a normal blockchain transaction any miner can process your transaction in a block (and thus there can be no censorship), the Lightning Network relies on being able to route payments through hubs. If no hub has a channel with the party you wish to transact with, then you are not able to make the payment. Through this, the hubs are able to restrict the routing of your transactions, running counter to the decentralised ethos of blockchain.
Scaling without centralisation
For blockchains to become widely adopted public networks they need to be able to scale without sacrificing the guiding principle of decentralisation.
Distributed systems can be scaled in two ways:
- Horizontally (add more nodes)
- Vertically (increase the resources to each node)
The problem blockchain has is that adding more nodes will, as explained, end up slowing down the network. Scaling vertically is also a nonstarter, as it will reduce the number of nodes processing and thus lead to centralisation. There is no blockchain project that has solved this issue.
All of the current workarounds are capable of working – but they cannot deliver the levels of scaling needed, without centralisation, for blockchain to be the backbone of a future economy. They are temporary fixes to patch up a system creaking under the load before anyone is even using it, bandages to patch up a soldier that has not yet left for war.
Blockchain remains an amazing technological development. But the simple – albeit hard to hear – answer to the scaling issue is to move beyond the blockchain ledger structure: start anew and design an architecture and consensus algorithm that support partitioning/sharding, are centralisation-resistant, and are capable of processing transactions in parallel without falling victim to double spending.
The alternative is to continue down a path along which it will be impossible to reach the desired destination.