The Tie Research

Solving the Blockchain Trilemma: On Scaling Challenges and their Solutions

By Vaish Puri
April 18, 2023

Build Bigger. Build Better. Build Faster. These three tenets have challenged humanity to push forward, and have dictated the pace of innovation ever since the dawn of the Industrial Revolution. With the popularization of blockchain technology, we are once again faced with the challenge of addressing all three. The Ethereum Virtual Machine allows developers not only to transact, but also to create applications that provide utility and services beyond what Ethereum itself can do. As a result, developer activity on Ethereum grew by 42% over 2021. But all that glitters is not gold, and as Biggie once said: mo money, mo problems. As more developers build, more users participate in the Ethereum ecosystem, driving a monumental increase in gas fees as well as slower transaction times.

This makes for a highly inefficient system that discourages usage and drives new users back to traditional finance or to other ecosystems. Currently, Ethereum handles roughly 15 transactions per second (TPS), a far cry from traditional finance players like Visa (50,000+ TPS). At the heart of it, all Layer 1 networks suffer from the blockchain trilemma: of decentralization, security, and scalability, a network can fully optimize for only two at a time.

This article is meant as a primer, aiming to give an overview of the solutions and challenges of scaling, as well as to compare a few projects that are currently making use of these scaling solutions. I intend to follow it up with an in-depth series on infrastructure, taking a deep dive into case studies of the projects and solutions discussed.

On-Chain: Sharding

Ok, this may get a bit technical, but stick with me and the fruits will show. In blockchain environments, data can be stored either on-chain or off-chain. In order to scale, the network must increase total throughput as load increases; in other words, the more data it handles, the faster that data should move.

One way of doing this on-chain is through data sharding. Sharding is the process of splitting a database horizontally to spread the load. By distributing the data among many machines, the system can handle larger amounts of it. Sharding is not a novel idea; it has been used in database systems for years.
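
To make this concrete, here is a minimal sketch of hash-based sharding in Python. Every account address is deterministically mapped to one of a fixed number of shards, so data and load spread across machines. The shard count and address format are illustrative assumptions, not Ethereum's actual design.

```python
# Hypothetical hash-based sharding: accounts are assigned to shards by
# hashing their address, spreading load evenly across machines.
import hashlib

NUM_SHARDS = 64  # illustrative shard count, not Ethereum's

def shard_for(address: str) -> int:
    """Deterministically map an account address to a shard."""
    digest = hashlib.sha256(address.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Each address always lands on the same shard, so a validator only needs
# the slice of state belonging to the shard it serves.
print(shard_for("0xdeadbeef"))
```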

In terms of Ethereum, sharding will allow for increased TPS and reduced overall network congestion by splitting the network into new chains, or shards. Each shard will be able to seamlessly communicate with the others for transactions. Sharding not only improves scalability, but also increases decentralization by allowing anyone to run a node, rather than strictly those with access to powerful and expensive computing equipment: network validators would only need to store data for the specific shard they are validating, as opposed to storing data for the entire network (the status quo). Increased decentralization in turn bolsters network security, since no single machine holds the whole network's state and the value of any one target is dramatically reduced. Full deployment of sharding on the Ethereum network is set to take place in 2023, as there are many challenges that still need addressing.

Challenges of Sharding

Although sharding has been in use for database optimization for quite some time now, applying it to a blockchain network presents unique challenges. There are two main issues that need to be addressed: data validity and data availability.

Data Validity

I'm going to use an example to illustrate the knot that is data validity. Consider a situation where a shard is corrupted due to a malicious validator invalidly producing a block, B, and minting 100 tokens out of thin air. The malicious validator then produces a new block, C, built on the results of block B, effectively hiding the false block. The bad actor can then initiate a cross-shard transaction of those 100 falsely minted tokens to a different shard. As a result, the improperly created tokens now look completely valid from the second shard's perspective.
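
A toy simulation makes the failure mode plain. This is illustrative code, not any real client's logic: shard B sees only the incoming receipt, never shard A's block history, so it cannot tell that the tokens were minted out of thin air.

```python
# Toy cross-shard transfer: the receiving shard validates the receipt's
# form, but has no view into how the sender's balance arose.
class Shard:
    def __init__(self, name: str):
        self.name = name
        self.balances: dict[str, int] = {}

    def apply_receipt(self, receipt: dict) -> None:
        # B checks only the receipt itself, not shard A's history.
        to, amount = receipt["to"], receipt["amount"]
        self.balances[to] = self.balances.get(to, 0) + amount

shard_a, shard_b = Shard("A"), Shard("B")
shard_a.balances["mallory"] = 100           # invalidly minted in block B
receipt = {"to": "mallory", "amount": 100}  # cross-shard transfer out of A
shard_b.apply_receipt(receipt)              # B happily credits fake tokens
print(shard_b.balances)                     # {'mallory': 100}
```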

A workaround is to arrange the shards in an undirected graph, in which each shard is connected to several others and is only allowed to transact with its neighbors. In the event a shard needs to transact with a non-neighbor, the transaction is routed through intermediate shards, and nodes in each shard validate all the blocks in their own shard as well as their neighbors' blocks.
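
As a sketch of that routing idea, consider a made-up neighbor graph and a breadth-first search for the shortest shard-to-shard path; the topology below is purely hypothetical.

```python
# Routing a transaction between non-neighbor shards through intermediate
# hops, each of which is validated by the shards along the way.
from collections import deque

NEIGHBORS = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def route(src: int, dst: int) -> list[int]:
    """Breadth-first search for the shortest path in the shard graph."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in NEIGHBORS[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    raise ValueError("no route between shards")

print(route(0, 4))  # e.g. [0, 1, 3, 4]
```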

However, this does not fully solve the problem of malicious validators corrupting multiple shards. In order to fully solve the issue of data validity, two approaches are necessary: the fisherman method and cryptographic proofs of computation.

Long story short, the fisherman method says that whenever a block header is communicated between chains, there must be a period of time during which any validator who suspects foul play can challenge it by providing proof that the block is invalid. So long as there is at least one honest validator in the shard, the system remains secure.
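
A rough sketch of that challenge window follows; the window length and the fraud-proof check are placeholder assumptions.

```python
# A cross-shard header finalizes only if nobody proves fraud within the
# challenge window. One honest validator is enough to stop a bad block.
import time

CHALLENGE_WINDOW = 60.0  # seconds; illustrative only

class PendingHeader:
    def __init__(self, header: bytes):
        self.header = header
        self.posted_at = time.time()
        self.challenged = False

    def challenge(self, fraud_proof_is_valid: bool) -> None:
        # Any validator may submit a proof that the block is invalid.
        if fraud_proof_is_valid:
            self.challenged = True

    def finalized(self) -> bool:
        window_over = time.time() - self.posted_at > CHALLENGE_WINDOW
        return window_over and not self.challenged
```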

Cryptographic proofs of computation would allow one to prove that a particular computation was done correctly. Some really roll off the tongue, like 'zero-knowledge succinct non-interactive arguments of knowledge', which is a mouthful, so we call them zk-SNARKs. More on SNARKs later.

Data Availability

The second challenge of scaling via sharding is data availability. Nodes in a typical L1 blockchain are separated into two groups: full nodes and light nodes. Full nodes download blocks in their entirety and validate every transaction, while light nodes download only block headers and rely on Merkle proofs to verify individual transactions.
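
Here is a simplified sketch of that Merkle-proof check. Real trees also encode whether each sibling hash sits on the left or the right, which this toy ignores.

```python
# A light node verifies one transaction against a block header by hashing
# up the Merkle tree and comparing against the header's root.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_merkle_proof(tx: bytes, proof: list[bytes], root: bytes) -> bool:
    """proof holds the sibling hashes on the path from tx to the root
    (pairing order simplified to always left-to-right)."""
    node = h(tx)
    for sibling in proof:
        node = h(node + sibling)
    return node == root

# The light node never downloads the block body: it needs only the root
# from the header plus a logarithmic number of sibling hashes.
```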

If a majority of full nodes decide to collude and produce invalid blocks, all they need to do is send the false block's hash to the light nodes. Light nodes download only the block's metadata (the block header) along with the claimed transaction state, so if the full nodes wanted to, they could produce a valid-looking header for an invalid block and fool the light nodes. In terms of sharding, the validators within a shard act as full nodes, while the other participants, including the beacon chain, act as light nodes. There are two methods to solve this problem: proof of custody and erasure codes.

Proof of custody introduces the concept of notaries that rotate between shards more often than the validators do. These notaries download each block and verify that its data is indeed available, and they must stake tokens to back that attestation, an incentive not to falsely claim a block is downloadable. In conjunction with proof of custody, erasure codes create a trustless system by allowing light nodes to recover an entire block from only parts of it.
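
A minimal erasure-coding sketch using XOR parity appears below; it can recover any single missing chunk, whereas production schemes like Reed-Solomon tolerate many more missing pieces.

```python
# XOR-parity erasure code: one extra chunk lets light nodes rebuild any
# single chunk that full nodes withhold.
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks: list[bytes]) -> list[bytes]:
    return chunks + [reduce(xor, chunks)]  # append the parity chunk

def recover(coded: list[bytes], missing: int) -> bytes:
    rest = [c for i, c in enumerate(coded) if i != missing]
    return reduce(xor, rest)

coded = encode([b"aaaa", b"bbbb", b"cccc"])
assert recover(coded, 1) == b"bbbb"  # lost chunk rebuilt from the rest
```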

Off-Chain: Rollups

All the previously discussed on-chain scaling methods require critical changes to Ethereum's protocol, so timetables for implementation are typically unreliable and prone to delay. Off-chain solutions are implemented separately from the Layer 1 Mainnet. Some, like Layer 2 networks, derive their security from Layer 1 consensus, while others call for the creation of new chains that derive their security separately from the Mainnet. Both kinds communicate with the Layer 1 Mainnet, but differ in how they obtain security.

Layer 2 (L2) scaling solutions are off-chain methods that depend on the Mainnet for security. Layer 2 isn't a single solution but rather a basket of approaches that scale applications by taking computation off Layer 1 (speeding up transactions) while maintaining robust, decentralized security. These solutions consist of rollups and state channels.

Optimistic

Rollups create an efficient environment by executing transactions off-chain, then posting the pertinent data to Layer 1, where consensus must be reached. Because all transaction data is included in Layer 1 blocks, rollups inherit the Mainnet's native security. Rollups come with two different security models: optimistic and zero-knowledge. Under the optimistic model, the Mainnet does not re-execute the computation; it merely notarizes results passed up from the L2. A sequencer bundles multiple transactions into a batch and submits that batch to the main chain as a single transaction, written to the Mainnet as calldata, which further reduces gas fees. In optimistic rollups (ORUs), posted transactions are assumed valid by default, hence the name. If a transaction is disputed, the rollup relies on a fraud proof and re-runs the transaction's computation; consequently, heavy disputing drags down completed transactions per second. ORUs are expected to offer roughly 100x more throughput than the L1 ETH chain.
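
To see the flow end to end, here is a toy optimistic-rollup round trip. The state model, hashing, and dispute check are simplifying assumptions, not any production rollup's design.

```python
# A sequencer posts a batch plus a claimed state root; the batch is assumed
# valid unless a challenger re-executes it and shows a mismatch.
import hashlib
import json

def execute(state: dict, txs: list[dict]) -> dict:
    new = dict(state)
    for tx in txs:
        new[tx["from"]] = new.get(tx["from"], 0) - tx["amount"]
        new[tx["to"]] = new.get(tx["to"], 0) + tx["amount"]
    return new

def state_root(state: dict) -> str:
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

state = {"alice": 10, "bob": 0}
txs = [{"from": "alice", "to": "bob", "amount": 3}]

# Sequencer posts (txs, claimed_root) as calldata; L1 does no computation.
claimed_root = state_root(execute(state, txs))

# During the dispute window, a challenger may re-execute the batch:
fraud = claimed_root != state_root(execute(state, txs))
print(fraud)  # False: no fraud proof, so the batch finalizes optimistically
```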

Zero-Knowledge

Zero-knowledge (ZK) rollups likewise "roll up" hundreds of transactions and run the computation off-chain. The key difference is that zk-rollups submit a validity proof to the Mainnet. These proofs can be SNARKs or STARKs; each has its pros and cons, though that discussion is beyond the scope of this paper. Because zk-rollup smart contracts maintain the state of all transfers, they need only the validity proof rather than the entire transaction data, making validation of blocks quicker and cheaper.
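
The Layer 1 side of that flow is worth sketching, with the caveat that the verifier below is a stand-in callable: a real SNARK or STARK verifier involves polynomial commitments and pairing checks far beyond a toy.

```python
# The L1 contract stores only a state root and accepts an update iff the
# succinct validity proof verifies; it never re-executes the transactions.
class RollupContract:
    def __init__(self, genesis_root: str, verifier):
        self.root = genesis_root
        self.verifier = verifier  # placeholder for an on-chain proof verifier

    def update(self, new_root: str, proof: str) -> None:
        if not self.verifier(proof, self.root, new_root):
            raise ValueError("invalid validity proof")
        self.root = new_root  # tiny proof checked, state advances

# Demo wiring with a trivially permissive verifier (for shape only):
contract = RollupContract("root0", lambda proof, old, new: proof == "valid")
contract.update("root1", "valid")
print(contract.root)  # root1
```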

Evaluating L2 Solutions

Below is a chart I put together that differentiates and evaluates each scaling framework.

As we can see, the heat map displays each solution's strengths and weaknesses. For the final grading, I assigned each cell a value of 1, 2, or 3 for red, light green, and dark green respectively. The best score an L2 solution could therefore achieve is 57 (19 criteria at 3 points each), and zk-rollups performed the best, meeting most of the needs and challenges.

Highlighted Projects

Now that we have defined the roadmap to scalability and surveyed multiple scaling methods, we can start to highlight key projects taking advantage of the new scalability frameworks. Using TheTie's proprietary SigDev Terminal, let's look at three different scaling protocols and how they compare to one another.

Near

The NEAR Protocol is a development platform built on a sharded, proof-of-stake L1 blockchain. Using the Rainbow Bridge, ETH developers can easily deploy their new or existing dapps in a fast, scalable, cost-effective manner, and can freely use ERC-20 tokens on NEAR. The SigDev Terminal offers a peek behind the curtain. This chart displays SigDev's exclusive Hype-to-Activity Ratio, which measures the number of tweets a particular coin has per each $1M in reported trading volume. An unusually high or low Hype-to-Activity Ratio may suggest that a cryptocurrency is over- or underhyped in social conversations relative to its trading activity, making the metric useful for identifying outliers or for tracking a coin's social conversation relative to its trading volume over time.

Among the many metrics SigDev provides, I've chosen to examine the various protocols that have adopted, or are planning to adopt, sharding as a scaling solution. Here we can see the disparity between the Network Value to Transactions (NVT) ratio and long-term sentiment versus the market caps. Evidently Ethereum leads in market cap, followed by Polkadot, then NEAR, which tracks with the relative prices of each token. We can standardize by looking at the NVT ratio, which divides market capitalization by transacted volume. This shows that NEAR's NVT ratio is significantly lower than its peers' while it carries a higher long-term sentiment, which would seem to indicate that NEAR is undervalued at its current price.
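
Both metrics are simple ratios, so a quick sketch with invented numbers (not real figures for NEAR or its peers) shows how they are computed.

```python
# TheTie metric sketches; inputs are made up for illustration.
def hype_to_activity(tweets: int, trading_volume_usd: float) -> float:
    """Tweets per $1M of reported trading volume."""
    return tweets / (trading_volume_usd / 1_000_000)

def nvt(market_cap_usd: float, transacted_volume_usd: float) -> float:
    """Network Value to Transactions: market cap over transacted volume."""
    return market_cap_usd / transacted_volume_usd

print(hype_to_activity(900, 200_000_000))  # 4.5 tweets per $1M of volume
print(nvt(2_000_000_000, 400_000_000))     # 5.0
```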

We see that most of the growth and volatility occurs during periods of great activity and muddled hype. Early on, while volume and price are relatively low, hype-to-activity peaks at roughly a 4.5:1 ratio, indicating lots of discussion but no real development in price action. The pattern repeats during the summer months, but the trend quickly inverts around December, with volume and price rising while hype-to-activity falls below a 1:1 ratio. With this, we come to a counterintuitive diagnosis for NEAR: while twitter fingers may be hot and bothered, hype may actually have an inverse correlation with price.

dYdX

dYdX is a hybrid-decentralized exchange that allows users to trade on margin, borrow and lend, and access perpetual contracts. dYdX recently collaborated with StarkWare to build on top of StarkEx, a Layer 2 zk-rollup engine. SigDev gives insight into how ownership of the dYdX token is distributed, and links us to holders' wallets via Etherscan. We can use this information to track the behavior of so-called "whales" and get an idea of how ownership moves price.

Complementing this, we return to the charts to analyze TheTie's formulated NVTweet Ratio (not to be confused with the previously discussed NVT). The NVTweet Ratio divides a coin's market cap, in $1M units, by its tweet volume, so the lower a coin's NVTweet Ratio, the more tweets it has per $1M in market cap. An increasing NVTweet Ratio could suggest that a coin's market is becoming increasingly driven by institutional trading: when market cap grows faster than social volume, it may signal less retail involvement in that coin's market.
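
Under the same caveat of invented numbers, the NVTweet computation looks like this:

```python
# NVTweet Ratio sketch: market cap in $1M units divided by tweet volume,
# so a lower value means more tweets per $1M of market cap.
def nvtweet(market_cap_usd: float, tweets: int) -> float:
    return (market_cap_usd / 1_000_000) / tweets

print(nvtweet(250_000_000, 500))  # 0.5: heavy social volume for its size
```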

Notice how the top five owners are Genesis holders. Taking this in combination with the NVTweet Ratio, we can conclude that price activity in dYdX is driven predominantly by institutions and primordial adopters, while retail influence is negligible. This may be a telltale sign that, although dYdX has been around for some time, prices have yet to reflect overall market sentiment; in other words, getting involved now would still be considered early!

Arbitrum

Arbitrum markets itself as the ideal scaling solution for DeFi apps built on Ethereum, and uses the Optimistic rollup framework.

We can visualize the TVL growth of all the projects built on Arbitrum's ecosystem since its inception. Above, we notice the initial jump in bridge inflows starting September 2021, when the mainnet was launched to the public. Although TVL took a bit of a dive, it has maintained relatively steady growth ever since. This is just the beginning for Arbitrum: as more projects choose to integrate L2 scaling options, Arbitrum should continue to see growth.

When comparing the TVLs of Layer 2s in general, we can see that Arbitrum leads the pack. Arbitrum has established itself as the premier L2 scaling solution, with 54% of the market share and nearly $3.04B in TVL. It's likely that developers will be more attracted to it versus, say, Optimism, which also uses the Optimistic rollup framework. When evaluating long-term growth potential, it's best to look toward where building is actually happening; for Arbitrum, that's Treasure DAO, Jones DAO, and Dopex.

Conclusion

As usage and popularity of development blockchains like Ethereum grow, the challenges and concerns of the Blockchain Trilemma will need to be addressed. On-chain solutions like sharding, as well as off-chain solutions like Layer 2 scaling, are all likely to gain acceptance and traction as network congestion continues to rise. Taking these solutions into consideration, investors and developers alike must begin to evaluate the pros and cons of each and how best to navigate them. For now, the best bet seems to be Arbitrum, though it remains to be seen which methodology will win outright, if at all.


This report is for informational purposes only and is not investment or trading advice. The views and opinions expressed in this report are exclusively those of the author, and do not necessarily reflect the views or positions of The TIE Inc. The author may be holding the cryptocurrencies or using the strategies mentioned in this report. You are fully responsible for any decisions you make; The TIE Inc. is not liable for any loss or damage caused by reliance on information provided. For investment advice, please consult a registered investment advisor.
