How Lens Protocol will scale to 50K users on Polygon.

Lens Protocol is an open, programmable social graph protocol from the Aave team. It’s superbly architected, and I have been building on it for a couple of weeks now.

In this article, I dive deep into answering the question - how many users can this young protocol support?

Measuring scalability.

We want to understand how many users the Lens protocol can scale to. Lens is deployed on the Polygon blockchain, which is a Proof-of-Stake network. In terms of consensus, Polygon’s node software is a combination of a geth fork and their custom fork of Tendermint, called Peppermint.

This industry lacks engineering competency - it was very hard to find information on Polygon’s actual properties as a blockchain. When considering a platform to build on, you should know its constraints.

The main property we’re searching for is the “transactional throughput”. Some projects state this as transactions per second (TPS), which is as useless as measuring “packets per second” for your internet speed. Just like packets differ in size (e.g. an ICMP ping is 56 bytes, whereas a packet setting up a web connection is probably at the MTU of 1500 bytes), so do transactions differ in size. The throughput we are measuring is the computational throughput - gas.

Measuring the theoretical throughput of a blockchain is pretty simple, conceptually:

throughput (gas/second) = block gas limit / block time

Let’s do it for Ethereum. The block gas limit as of writing this is 30,122,088. And Ethereum’s block time is targeted at 13s. So the theoretical throughput of Ethereum L1 is 30,122,088 / 13 = 2,317,083 gas/second.
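To make that arithmetic explicit, here’s a minimal Python sketch of the formula, with the gas limit and block time hard-coded to the figures quoted above:

```python
def theoretical_throughput(block_gas_limit: int, block_time_s: float) -> float:
    """Theoretical upper bound on computational throughput, in gas per second."""
    return block_gas_limit / block_time_s

# Ethereum L1, using the figures quoted above (as of writing).
eth_gas_per_second = theoretical_throughput(block_gas_limit=30_122_088, block_time_s=13)
print(f"{eth_gas_per_second:,.0f} gas/s")  # ~2,317,083 gas/s
```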

Note: Ethereum’s gas limit is dynamic. As per EIP-1559, it’s actually made up of a long-term target and a hard per-block cap. Fun fact I didn’t know: the hard per-block cap can vary as miners adjust it (thanks @_prestwich, you protocol trivia god).

Can Ethereum’s throughput vary in practice?

Yes! The block time is related to the block production and consensus mechanics of the blockchain, which have underlying probabilistic elements. For Ethereum v1, consensus is built atop proof-of-work (Ethash) and a modified GHOST protocol. PoW is tightly coupled to difficulty retargeting: when new miners join the network, blocks may temporarily be produced faster, because more total hashpower means the PoW puzzle is solved sooner (distributed computing ftw) - and so you get more blocks until the difficulty adjusts.

What is Polygon’s throughput?

Short answer: it’s not stated anywhere. Polygon claims to have achieved 7,200 TPS on a development network, but I have not found any public information that backs this up.

So I went digging:

  • Polygon’s node software is separated into Bor (the block producer layer) and Heimdall (the proof-of-stake validator layer).

  • Polygon’s consensus engine is based on Tendermint

    • Quoting their docs:

Peppermint is modified Tendermint. It is changed to use to make it compatible with Ethereum addresses and verifiable on Ethereum chain

The modifications that Polygon has made are principally to support ERC20 token stakes (i.e. so staking can be secured via smart contracts on Ethereum), and they don’t appear to have changed the core Tendermint algorithm. So I decided to research Tendermint.

My Tendermint research turned up a paper that was very promising, but it used a different max block size (20 MB) from Polygon’s (1 MB), so I wasn’t able to translate its findings directly (though have a review of it if anyone’s curious).

Instead, I found that Polygonscan (Etherscan’s Polygon explorer) has some interesting charts on network utilization and daily gas usage, which are quite useful here! For the whole of MATIC/Polygon’s existence (as of March 14, 2022):

(Chart: Utilization - gas used / gas limit)
(Chart: Total gas used per day)

We can use this to calculate throughput as such:

average utilization = 35%
average gas used per day = 276,889,898,973
throughput = 276,889,898,973 / (24 * 60 * 60) = 3,204,744 gas/s

So Polygon has been sustaining an average 3.2M gas / second of throughput throughout its existence. And that’s at 35% utilization, meaning the blocks are partially empty. Let’s adjust for 100% utilization:

assuming average utilization = 100%
throughput = (1 / 0.35) * 276,889,898,973 / (24 * 60 * 60) = 9,156,412 gas/s
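Here’s the same back-of-envelope calculation as a small Python sketch. The daily gas and utilization figures are hard-coded from the Polygonscan charts above, not fetched live, so treat them as a snapshot:

```python
SECONDS_PER_DAY = 24 * 60 * 60

# Figures read off the Polygonscan charts above (snapshot as of March 14, 2022).
avg_gas_used_per_day = 276_889_898_973
avg_utilization = 0.35

observed_throughput = avg_gas_used_per_day / SECONDS_PER_DAY   # what's actually being used
max_throughput = observed_throughput / avg_utilization         # extrapolated to full blocks

print(f"observed:            {observed_throughput:,.0f} gas/s")  # ~3,204,744 gas/s
print(f"at 100% utilization: {max_throughput:,.0f} gas/s")       # ~9,156,412 gas/s
```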

9.1M gas/second, with a block time of 2s and significantly lower tx costs than Ethereum mainnet. Interestingly though, this is only about 3.95x more throughput than Ethereum:

eth_throughput = 2,317,083
polygon_throughput = 9,156,412
9,156,412 / 2,317,083 = 3.95

For the fun of it, let’s convert to TPS in terms of a simple ERC20 transfer.

erc20_transfer_gas = 42,000
eth_tps = 2,317,083 / 42,000 = 55
polygon_tps = 9,156,412 / 42,000 = 218

Ethereum - 55tps, Polygon - 218tps.
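The conversion is just gas throughput divided by the gas cost of whichever transaction you care about - here using the 42,000 gas per ERC20 transfer assumed above (real transfers vary by token contract):

```python
def tps(gas_per_second: float, gas_per_tx: int) -> float:
    """Convert gas throughput into transactions per second for a given tx type."""
    return gas_per_second / gas_per_tx

ERC20_TRANSFER_GAS = 42_000  # rough figure used above; varies by token contract

print(f"Ethereum: {tps(2_317_083, ERC20_TRANSFER_GAS):.0f} tps")  # ~55
print(f"Polygon:  {tps(9_156_412, ERC20_TRANSFER_GAS):.0f} tps")  # ~218
```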

How well can the open social graph scale on-chain?

We will consider two cases - (1) the base case, where we only use the protocol’s base functionality, and (2) the innovation case, where people use composability to build more functionality on top.

Base case.

Now that we know the throughput of Polygon, we can answer how Lens will scale. To do this:

  1. I measured the gas costs of a couple of common interactions (createProfile, createPost, follow)

  2. I wrote a very simple model of user behaviour - number of daily posts, etc. I wanted to construct a more accurate model using an official Twitter data set, but that would have needed better modelling tools than Google Sheets (something that could sample from power-law distributions lol).

The model is contained within this spreadsheet (there’s also a rough code sketch of the same arithmetic after the numbers below). Here’s a simple overview:

Considering a scenario with 50,000 users, each posting 5x a day, the Lens protocol would use up:

  • ~10% of Polygon’s current throughput (the ~3.2M gas/s it actually sustains today) if that were the limit (e.g. if network performance were to degrade), and

  • 3.4% if Polygon scaled effectively to its maximum gas limit.

At the current price of $1.42 per MATIC and a gas price of 10 gwei, this amounts to:

  • $0.007/day for a user posting 5x.

  • $391/day of Lens-originated tx fees.
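For transparency, here’s a minimal Python sketch of the same model. The per-post gas figure is a placeholder I’ve back-solved to roughly match the spreadsheet (~110k gas per post), not an official Lens measurement, so treat the whole thing as illustrative:

```python
# Scenario inputs. GAS_PER_POST is a rough placeholder, not an official Lens
# figure - the real measurements live in the spreadsheet.
USERS = 50_000
POSTS_PER_USER_PER_DAY = 5
GAS_PER_POST = 110_000

POLYGON_GAS_PER_DAY = 276_889_898_973                 # observed, at ~35% utilization
POLYGON_MAX_GAS_PER_DAY = POLYGON_GAS_PER_DAY / 0.35  # extrapolated ceiling

MATIC_USD = 1.42
GAS_PRICE_GWEI = 10

lens_gas_per_day = USERS * POSTS_PER_USER_PER_DAY * GAS_PER_POST

share_of_current = lens_gas_per_day / POLYGON_GAS_PER_DAY   # roughly the ~10% above
share_of_max = lens_gas_per_day / POLYGON_MAX_GAS_PER_DAY   # roughly the ~3.4% above

fees_matic = lens_gas_per_day * GAS_PRICE_GWEI / 1e9        # gwei -> MATIC
fees_usd = fees_matic * MATIC_USD                           # ~$391/day
cost_per_user_usd = fees_usd / USERS                        # ~$0.007-0.008/user/day

print(f"{share_of_current:.1%} of current throughput, {share_of_max:.1%} of max")
print(f"${fees_usd:,.0f}/day total fees, ${cost_per_user_usd:.4f}/user/day")
```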

The innovation case.

But the open programmable social graph isn’t programmable for nuthin’. People will be using the follow and reference modules to build a whole new dimension of digital interactions - tokenised attention, tokenised influence, fame derivatives, … the list goes on. And with this functionality comes additional gas expenditure.

[todo modelling]

Thoughts.

Lens Protocol is great. It will scale effectively to 50K users using its most basic interactions. The most exciting part of Lens, though, is the programmability. I’m actively building Gliss, where we tokenise influence and help artists fund themselves through their future fame (like Alchemix for followers!) - it’ll cost gas for sure, though I’m confident we can make it work. If you’re interested, shoot me a DM @liamzebedee.

The future is bright for this young protocol. Even leaving ZK rollups aside, we can already scale to a sufficient degree to make an impact on society. That excites me heaps.
