Scaling Summit 2021 Panel Recap | Different Flavours of Rollups: Optimistic and ZK (Validity)

On December 18, 2021, at the 8th Old Friends Reunion Scaling Summit, Benjamin Jones (MC), Co-founder & Chief Scientist @Optimism; Avihu Levy, Head of Product @StarkWare; Harry Kalodner, Co-founder & CTO @Arbitrum; Alex Gluchowski, Co-founder & CEO @Matter Labs; and Zachary Williamson, Co-founder & CTO @Aztec gave the audience a thought-provoking discussion of the two popular L2 approaches, ZK (Validity) Rollups and Optimistic Rollups.

Overview:

What began as a hopeful vision for scaling infrastructure has now taken the crypto ecosystem by storm, as hundreds of dApps have rushed to deploy on L2s in order to benefit from their increased throughput and decreased costs. We’re pleased to have invited these amazing guests, who contribute so much to the scaling space, to share their thoughts on different flavours of rollups.

🎤 Some inspiring & juicy ideas to go through:

Big Blocker L1s: projects like Solana and Avalanche are gaining momentum — promising high throughput and low fees, arguably at the cost of decentralization.

  • How would you summarize the approaches taken by these projects? What’s your TLDR of what is going on, and how do they compare to your work on Ethereum L2s?
  • Do you think that these projects represent a “threat” to Ethereum, either as a network or its core values? Why or why not?
  • Is there anything you like or admire about these projects?

Bridges: There are many different L2 projects represented on this panel, and the number is only going up from here. As such, liquidity is fragmenting, and how it moves between domains becomes increasingly relevant. Let’s discuss bridges.

  • For the ZKP teams: how do you think about bridges? Is there any difference that comes about from being based on S[N/T]ARKS?
  • For everyone: how would you categorize the different bridges? There are a lot of different protocols — are they different skins on the same body, or are there fundamental differences in approach? How do you think users should approach the security of different bridges? What are the tradeoffs?

UX for mass adoption: scalability promises to bring web3 into the hands of the next generation of users.

  • What do we need to get there?
  • What do you think are the biggest challenges to using your own scaling solution today, and what are you doing to remove those challenges?
  • At what point are we “ready?”

Open Floor — what are you most excited about for L2 scaling in the year 2022? What do you think are the biggest challenges we’ll face?

We’ve captured the full learnings and takeaways from their words below:

🎤 Moderator-Benjamin Jones(Optimism)

Hello, everyone. Thank you IOSG for having us. We have a very exciting panel of guests here today to talk about L2 scaling solutions. I’m Ben, I work at Optimism. We have some lovely other teams here, and I’m gonna ask them to introduce themselves. I put the order together alphabetically, so we’ll start with Arbitrum.

Harry Kalodner(Arbitrum)

Awesome, thank you. Thanks for having us. I’m Harry, I’m one of the Co-founders and the CTO of Arbitrum. We are building an optimistic rollup. Um, we’ve been on mainnet for a number of months now. We’ll get into a lot of the details of how it works over the panel, but try it today.

Zachary Williamson(Aztec)

Awesome. Hi there, I’m Zach, I’m the CEO and Co-founder of Aztec. We are a privacy-focused L2 for Ethereum. We focus on both scaling transactions and making them private from observers. We use zero-knowledge proofs — zk-SNARKs — to do this, including a proving system called PLONK, which we put together explicitly for this purpose.

Alex Gluchowski(Matter Labs)

Hi, everyone. My name is Alex. I’m CEO and Co-founder of Matter Labs, the company behind the zkSync protocol, which is a zk-Rollup. We’ve been live on Ethereum with transfers, swaps, and NFTs for over a year now. And currently, we are working on releasing the testnet for the first EVM-compatible zk-Rollup.

Avihu Levy(StarkWare)

Hi everyone, I’m Head of Product at StarkWare. I’ll be talking today mainly about StarkNet, our zk-Rollup, which has been live in its alpha version on mainnet for, I think, about a week now.

🎤 Moderator-Benjamin Jones(Optimism)

Right on. Okay, I guess I cheated and already went a little bit, but I’m Ben from Optimism. We’re also building an Optimistic Rollup, called Optimistic Ethereum. We’re super excited about it. We’ve been on mainnet for coming up on a year now, which is pretty crazy, and we just had our biggest upgrade today, which we’re very, very excited about — EVM equivalence, one-click deploy, baby. So I thought we could kick off this panel by asking each of you for one thing that you think is very awesome about your project, but that you don’t think enough people are aware of. It’s a spicy prompt, but I want to get this panel spicy, y’all.

Harry Kalodner(Arbitrum)

The “people aren’t aware of” part is definitely by far the…

🎤 Moderator-Benjamin Jones(Optimism)

How about this question? What do you think is undervalued in the public eye in comparison to how you view it?

Avihu Levy(StarkWare)

I’ll go with the first version of the question — I think I like it better. So I think something that is quite surprising to everyone is the involvement of the community, even in this early version of StarkNet, and I see it in two aspects. One is how many people are interested in joining the Discord and doing stuff for the project, creating their own content that everyone else then uses — and it’s happening really quickly, faster than you’d expect. That’s something that is very undervalued, or unknown: how large this part of StarkNet already is. The other aspect is that we have a very new forum, StarkNet Shamans, which is like the platforms you know and love from Ethereum Research and Eth Magicians. We just ran this experiment of bringing open research questions to the community to see if there was any interest to jump in — and there is, and it’s pretty crazy to me. I don’t think many people realize: hey, I can just jump in, join, and take part in the most critical decisions of building a network.

Harry Kalodner(Arbitrum)

Cool. I have a decent one. The interesting thing about open contract deployment is that a lot of the focus and a lot of the thought goes into where the already well-known applications are going, which is obviously hugely important, since they have attention and traction. But the coolest thing to me is the long tail of weirdness that comes from having an open system, where I get surprised — like, oh wow, I had no idea it was being used for this. That’s a longer-term, more organic thing than the big names, but it’s very valuable for developing an ecosystem.

Zachary Williamson(Aztec)

Yeah, I can jump in, particularly on some of that long-tail stuff. I guess one of the things we’re really happy about with our tech, that’s maybe not as widely known yet, is the level of interactivity in our platform — particularly the work we’re planning on launching in the new year, something we’re calling Aztec Connect. Right now with Aztec, you can shield, unshield, and privately transfer ETH and other crypto assets, but the interface, the way of interacting, is very much fixed. With Aztec Connect, we’re enabling private interactions with existing public L1 smart contracts and DeFi protocols without having to migrate or port them over to our network. Anybody who is literate in Solidity programming can write a bridge contract, which acts as the translation service between the Aztec network and L1 contracts and enables users to interact privately with L1, with their identities hidden. It allows users to enjoy the benefits of L2 whilst not having to suffer from fragmented liquidity. But also, on that long-tail stuff, it allows for some new and innovative things that can’t yet really happen on Ethereum — for example, blind auctions, or anonymous voting. We’re pretty excited about some of these prospects, and we already have some people from the community tinkering around with these applications.

🎤 Moderator-Benjamin Jones(Optimism)

Okay, I want to follow up. So is this the kind of thing where I can be sitting in an Aztec pool, basically batched along with a bunch of other private users, and then we all do a Uniswap trade on L1?

Zachary Williamson(Aztec)

Yes, exactly. Yeah. I think it’s sometimes called DeFi pooling or DeFi aggregation. And yeah, it works for fungible assets — at the moment, only fungible assets. But it’s what you said: if you and a bunch of other users want to do the same kind of interaction, like doing a Uniswap trade or depositing into a liquidity pool, then we use zk-SNARKs and our L2 to aggregate all these transactions together, and what goes onto L1 is just one meta-transaction that represents the batch. That means the cost of the transaction is shared across all the users. And because of the awesome powers of zk-proofs, everyone’s identities are hidden — so privacy on top.
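To make the cost-sharing idea behind this kind of “DeFi pooling” concrete, here is a rough, purely illustrative sketch. All numbers and names are assumptions for illustration, not Aztec’s actual figures or interfaces.

```typescript
// Illustrative sketch of DeFi pooling: many L2 users share one L1 meta-transaction.
// All numbers below are hypothetical assumptions, not real Aztec costs.

interface PoolBatch {
  users: number;        // users joining the same interaction (e.g. one Uniswap trade)
  l1BatchGas: number;   // total L1 gas for the single meta-transaction
  gasPriceGwei: number; // current L1 gas price
}

// Per-user cost when the batch is shared, in ETH.
function perUserCostEth(batch: PoolBatch): number {
  const totalEth = (batch.l1BatchGas * batch.gasPriceGwei) / 1e9;
  return totalEth / batch.users;
}

// Example: a 400k-gas L1 interaction shared by 100 users at 50 gwei
// costs each user ~0.0002 ETH instead of ~0.02 ETH if done alone.
const example: PoolBatch = { users: 100, l1BatchGas: 400_000, gasPriceGwei: 50 };
console.log(perUserCostEth(example)); // ≈ 0.0002
```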

🎤 Moderator-Benjamin Jones(Optimism)

That’s pretty sweet. I’ve heard of that design before. I’ve never heard it done anonymously. That’s very cool. Thanks.

Alex Gluchowski(Matter Labs)

I am the only one left, so I was thinking about what would be the most underappreciated aspect, which is hard because you have to gauge the audience’s awareness. A year ago, I would have said that we are building an EVM-compatible zk-Rollup, but by now that’s probably very broadly known. Then I thought about zk-Porter, this hybrid solution with on-chain and off-chain data availability, where you can actually have seamless interaction between them within the same state — some users can enjoy the security of the zk-Rollup part, others can enjoy the cheapness of zk-Porter, and they’re still interacting in the same system. But I think this is also broadly known.

So I think the most underappreciated aspect of zkSync is our consistent focus on the mission — our commitment to the principles by which we build decentralized systems and try to make them resilient and unstoppable from the beginning. When we launched zkSync 1.5 years ago, it was an experimental project, so it would have been totally acceptable for us to just make it fully upgradable and play around. But we decided not to go that path, and we made it upgradable only with a notice period, from day one. There’s a risk that there might be some bugs and they might be exploited, and we wouldn’t be able to do anything about it quickly because of this two weeks of delay. But we wanted to signal to the community that we take decentralization seriously, even at the risk of those potential attacks — and that risk is actually higher than the risk that we get compromised, or become malicious and try to pull the rug. We did it nonetheless, and we’re going to stick to this principle. We’re only going to launch version 2 once we have a fully trustless escape hatch on mainnet, where you can be sure you can access your funds no matter what happens — even if Matter Labs disappears, or if we become malicious, or if we get forced by some state authority to manipulate the data, you will still be able to access them.

We’ve also introduced the idea of a Security Council. We as a company cannot do an upgrade instantly: we can only initiate an upgrade, and users always get a few weeks’ notice. We adjusted this approach now because there are a lot of funds at stake, so we introduced a Security Council with a lot of prominent community members who act as a multi-sig — an immediate upgrade can only happen if all of them agree. But that’s the ultimate difference: you need all of those people to agree, we as a company cannot proceed alone, and you will always have this escape mechanism.
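As a rough way to picture the “notice period plus council override” pattern Alex describes, here is a minimal, hypothetical sketch. The names, durations, and rules are illustrative assumptions, not zkSync’s actual contracts or governance code.

```typescript
// Hypothetical sketch of an upgrade policy with a notice period and a council override.
// Not zkSync's real implementation; all values and rules are assumptions for illustration.

const NOTICE_PERIOD_MS = 14 * 24 * 60 * 60 * 1000; // two-week delay announced to users

interface PendingUpgrade {
  proposedAt: number;            // timestamp when the team initiated the upgrade
  councilApprovals: Set<string>; // council members who have signed off
}

const COUNCIL = new Set(["member1", "member2", "member3"]); // placeholder members

// An upgrade may execute early only if *every* council member approves;
// otherwise it must wait out the full notice period, giving users time to exit.
function canExecute(upgrade: PendingUpgrade, now: number): boolean {
  const allCouncilApproved = [...COUNCIL].every((m) => upgrade.councilApprovals.has(m));
  const noticeElapsed = now - upgrade.proposedAt >= NOTICE_PERIOD_MS;
  return allCouncilApproved || noticeElapsed;
}
```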

🎤 Moderator-Benjamin Jones(Optimism)

Nice. That’s pretty cool. So I’ve been put in an interesting position here, where I’m simultaneously hosting this panel and representing Optimism, so I’ll take a moment to give my answer. Something that I’m extremely, extremely excited about, and that I’m not sure people are so aware of, is that there are now three Optimism forks live out there on L2. This is something we’re very proud of: everything we do is open source, MIT licensed, and when we see other people deploying the stuff that we build, that’s pretty damn awesome. So that was going to be my answer. Okay, cool.

So the next juicy topic I have for us all, I titled “big blocker L1s.” I’m really honored to be sitting among a bunch of people building Ethereum scaling solutions, but believe it or not, there are scaling solutions out there that are not Ethereum and are not using Ethereum. So I’m really curious what your perspectives on these are, because projects like Solana and Avalanche have definitely been picking up traction and steam as this year has gone on. I’d be really interested to hear what your perspective is on these different L1s. Do you think they have a role? If so, what is it? How would you summarize what they’re doing?

Zachary Williamson(Aztec)

I can kick off. Yeah, I mean, it’s cool to see there’s been an explosion of L1s lately that are attempting to solve the scaling issue via other means — via different consensus mechanisms, different protocol architectures. And I think this is all uniformly positive for the entire blockchain space, right? Because Ethereum doesn’t have a natural-born right to be the canonical blockchain of choice. Having more L1s out there trying and experimenting with different architectures — it’s a melting pot of ideas, and in the long run the whole ecosystem is going to benefit from it. So yeah, I think it’s pretty positive. I don’t really see all the L1s or L2s as being in competition with each other. I feel like all the L1s are eventually going to use L2-based scaling solutions to scale, because the asymptotics just aren’t comparable: with zero-knowledge proofs or Optimistic Rollups you can, in theory, scale a network 10,000x or 100,000x, which you can’t do by tweaking consensus mechanisms whilst maintaining a high degree of security. So I see them as largely complementary, and I imagine that later on down the line things will pretty much meld together, such that the big L1s end up using L2s as well.
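As a rough, back-of-the-envelope illustration of why rollup asymptotics differ from consensus tweaks, consider the L1 data cost of a single rollup transfer versus a plain L1 transfer. The per-transfer byte count below is an assumption; real figures vary by rollup and compression scheme.

```typescript
// Back-of-the-envelope comparison: a plain L1 transfer vs. the L1 data cost of a rollup transfer.
// Assumes pre-EIP-4844 calldata pricing (16 gas per non-zero byte) and ~12 bytes of
// compressed data per rollup transfer — both rough assumptions for illustration.

const L1_TRANSFER_GAS = 21_000;   // fixed cost of a plain ETH transfer on L1
const CALLDATA_GAS_PER_BYTE = 16; // cost of a non-zero calldata byte after EIP-2028
const ROLLUP_TRANSFER_BYTES = 12; // illustrative compressed size of one L2 transfer

const rollupDataGas = ROLLUP_TRANSFER_BYTES * CALLDATA_GAS_PER_BYTE; // ≈ 192 gas
const improvement = L1_TRANSFER_GAS / rollupDataGas;                 // ≈ 100x on data alone

console.log({ rollupDataGas, improvement });
// Proof verification (ZK) or fraud-proof overhead is amortized over the whole batch,
// and moving data availability off-chain (validium-style) pushes the factor far higher.
```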

Avihu Levy(StarkWare)

I think there is some difference. I’ll speak from Solana’s point of view — I don’t know enough to say much about Avalanche. From Solana’s perspective, the good thing about what they did, and I appreciate it very much, is that they managed to get fairly nice parallel execution for transactions, which is useful in many other cases, including in sequencing L2 transactions. So it’s something very useful, and I guess we’ll see all kinds of variations of it in our systems as well. But I think the main point where things differ is how important each project considers the ability of many users in the ecosystem to verify the state and the execution by themselves, rather than a small set of validators doing all of that for you. The question is: is it important for everyone who wishes to verify everything to be able to do so, in an inexpensive manner? The answer to that question is different between, let’s say, Solana and, I assume, the L2s.

Harry Kalodner(Arbitrum)

Just to double down there, because the innovation going on at the execution layer is very cool to see. It’s an area that… the EVM is great, but it’s also horrible. I think anybody who’s working on EVM-related stuff can relate to that. It’s massively important, but it also has a lot of technical debt and a lot of inefficiencies. So seeing other projects — Solana is a good example — really pushing on the execution space is pretty cool. When you talk about other blockchains, there are also projects like Avalanche, which, as far as I recall, has primarily done a huge amount of work on the consensus layer, with a fundamentally pretty different consensus mechanism, while its main contract chain is an EVM chain — as opposed to Solana, which is doing less on the consensus layer but a lot on the execution layer. We start drifting towards one of my favorite recent buzzwords, modular blockchains: you can start talking about the different parts of L1s and what each of them is doing, which is interesting, since I think there are a lot of opportunities for mixing and matching cool ideas.

Alex Gluchowski(Matter Labs)

Yeah, I totally agree with that. It’s indeed really interesting on the research side — I’m not going to repeat the point about parallel execution. It’s just a question of what the purpose of a blockchain is. In this regard, zkSync is completely in the camp of the Ethereum philosophy maximalists: Ethereum places decentralization above everything else — decentralization, security, resilience against very powerful attackers. It’s clear that if you don’t prioritize this aspect, you’re going to lose it, because it’s inevitable that you’re going to make trade-offs: it bears significant costs to place a lot of emphasis on decentralization. So in my opinion, only L1s that follow this philosophy will stay broadly embraced by the community long term; the rest will be just experiments. If the big chains that managed to innovate on the execution layer and other aspects eventually embrace this philosophy and decentralize, we will be really happy to support them. Otherwise, I just don’t see the point — it defeats the purpose of a blockchain.

🎤 Moderator-Benjamin Jones(Optimism)

Is there anything to be said about decentralization in terms of user bases? I feel like we’re all kind of sitting here working on different projects that solve this problem of gas fees being too high, and those fees drive users away. Do you think there’s anything to be said for the fact that these more centralized projects at least give some class of users access they couldn’t otherwise have? How do you guys think about that?

Harry Kalodner(Arbitrum)

I think it’s an excellent point.

🎤 Moderator-Benjamin Jones(Optimism)

That’s definitely what I struggle with as well. These projects are clearly doing things that violate some of the decentralization principles that, to Alex’s point, are Ethereum to the core. On the other hand, you see people getting driven out of the market. So I don’t know — I think it’s a very interesting catch-22, and a challenging space for us to think about.

Harry Kalodner(Arbitrum)

Yeah, it’s super interesting. There’s a separation between things as simple as market positioning: does someone think of themselves as an Ethereum killer, or do they think of themselves as a sidechain, essentially as a service that is mainly targeted at Ethereum users? How, generally, do they position themselves? Certainly, we’re heading into a nice world where rollups — where L2 scaling solutions — can provide a range of security models with a range of costs. Alex was talking about this with zk-Porter; obviously that provides a similarly nice trade-off space of security versus cost, and continuing to have options on that spectrum is pretty nice. I think the criticism that smart contract blockchains are not accessible to most people is one that I definitely forget about sometimes when I’m just interacting with Ethereum — how absurd it is, how expensive it can be — but it’s very real.

🎤 Moderator-Benjamin Jones(Optimism)

I agree with that. Go ahead, Zach.

Zachary Williamson(Aztec)

I was just gonna add something small. I agree completely. I think the infancy of the technology is clouding things a little bit. What’s obviously going to happen is that people are going to get the security they’re willing to pay for. When it comes to decentralized security, Ethereum is pretty much as good as you can get — you need to be a state-level actor to compromise the security of the chain. But if you’re just doing payments worth a cent, or trading around tokens for a video game that are worth next to nothing other than sentimental value, you don’t really need the security of Ethereum; it’s not worth paying for at current gas fees. The difficulty right now is that if you want to pay less and get a different security model, that means moving to a different chain and possibly changing the execution language, which adds friction and makes this whole mess. But longer term, I think it’s basically about things like being able to choose the source of your data availability, which has a direct effect on the security of the system — users will get the security they’re willing to pay for. So generally, I think the cheap chains are good for community engagement: they’re good for getting people into this ecosystem and for getting people to experiment with this tech. And eventually, yeah, I think it’ll be a more market-based approach to pricing.

Alex Gluchowski(Matter Labs)

I want to take a contrarian take on this particular point — I like contrarian takes. This sounds absolutely correct, it makes perfect sense, but I see a tail risk in this development. You lure users with low fees — look, you don’t have to care about security — then you have more and more users, you get some social proof, then you have a lot of capital on the chain, and people start putting more and more money into it. And you can eventually end up with a black swan, like what happened to Mt. Gox. Or worse, like what happened to the internet, or in the operating system race, where Microsoft took all of the users by luring them with some cheap properties — or like Google and Apple getting all the users into their app ecosystems, becoming the monopolies they are now, able to set arbitrary, exorbitant prices for developers or exclude certain apps from the community, and so on.

We want to avoid those situations. In Web3, we have to learn from what happened to the internet. The internet was conceived as a decentralized network of networks; it was supposed to be peer-to-peer, and it ended up owned by a few very centralized corporations operating in a cartel-like structure where they can impose arbitrary rules. Just like getting users with a free product — because, you know, if the product is free, you know who the product is. So I’m really, really skeptical about this. I would rather have people join Polygon, or other solutions that explicitly align themselves with the goals of Ethereum. It’s not about Ethereum; it’s not about tribalism in the name of Vitalik, or, you know, maximizing your bags. It’s about the philosophy. It just so happens that Ethereum has the superior philosophy today across the chains I see, because it values these principles. Maybe Bitcoin’s is even more superior, but if you don’t innovate at all, that’s a different story, right? On Ethereum you can innovate: you have rollups, you have sidechains, and you can perfectly well use them if you want to give up some security for a short time.

🎤 Moderator-Benjamin Jones(Optimism)

Yeah, I totally agree. I think this is the dynamic that keeps me up at night and keeps me building. Basically, yes, in the long run there should be a large trade-off space, and you should be able to pay for your security. But over what time period do users come to understand the importance of that? And how does that play out in practice? Because, to your point, there’s definitely a world where you get hooked on the cheap transactions for your games, and then more and more transactions start routing through that chain, and it’s much more convenient because you don’t have to bridge anywhere else. Right. So yeah, this is something that I struggle with too. I think we’ve got to keep at it, y’all, and keep those values alive.

Zachary Williamson(Aztec)

Yeah, that’s a really valid point. The ideal world can get corrupted to the point where those network effects take over, and the decentralized networks exist but aren’t really being utilized. I think one of the differences between the internet era and today is that there are going to be viable alternatives. One of the problems with the internet was the question of how websites monetize — they make you, the person, the product rather than the customer — and there wasn’t really a great alternative revenue model for the internet back in day one. That’s one thing blockchains are hopefully going to change, through extremely cheap payments and microtransactions, which means users can actually take a bit more control over their data. But really, it’s up to us to make sure that a fully decentralized solution is compelling enough that users don’t just flock to cheap chains with terrible security, some of which actively mislead users about the security of their networks. That’s what will happen if Ethereum and other maximally decentralized networks can’t properly scale and address the needs of their users — which also keeps me up at night.

Harry Kalodner(Arbitrum)

One of the areas I’ve definitely become less optimistic about over time in this space is the ability to educate users about complex trade-off spaces. There’s a lot of complex information and people are very busy — not the people who are taking the time to listen to this panel, but all of the people who are not attending blockchain events and listening to rollup projects speak on panels.

🎤 Moderator-Benjamin Jones(Optimism)

I’m curious what has caused that shift in your mind. Has it been these ETH killers, or have there been other things?

Harry Kalodner(Arbitrum)

I mean, it’s a combination. It’s definitely related — the ETH killers are some of it — but I’ve had a few too many conversations where people’s eyes kind of glaze over as I try to describe the trade-offs between different solutions. It’s tough because, you know, all of us are very much in it for the tech — not just the tech, but the tech is certainly a big part of it. But there are also all sorts of other reasons why people come to the blockchain space, and some of those reasons don’t create a huge interest in understanding the fine-grained details of how things work. And this stuff is complicated. It’s easy to forget how far down the rabbit hole we are.

🎤 Moderator-Benjamin Jones(Optimism)

That’s true. Cool. Okay, speaking of complicated, I feel like we would be remiss if we spent the entire time talking just about these ETH killers and all that. Let’s talk about some…

ALL:

But it’s funny.

🎤 Moderator-Benjamin Jones(Optimism)

Oh, it is. And I wanted to make this panel as interesting as possible, and I think this is one of the things that should be on all of our minds right now. I do really view it as the responsibility of the group here, and of others, to keep those principles alive, because to Harry’s point, it’s tough to do. Okay, but let’s talk…

Avihu Levy(StarkWare)

We also need to make sure that Harry doesn’t get too tired of running through this explanation for everyone :)

🎤 Moderator-Benjamin Jones(Optimism)

Right. Okay, so I want to talk about bridges. Because, speaking of complexity, everyone here on this panel is creating one — and in many cases multiple — chains that are going to be, or already are, out there in the world. And one of the things you have to do to use different chains is bridge assets between them. Speaking from Optimism’s perspective, I know we’ve seen a lot more bridges getting deployed onto our system over the past few months. In some ways this is something I wouldn’t have been thinking about a year ago, because in some sense the whole point of a rollup is to create a secure bridge, right? That’s sort of the point: the whole point of the challenge and fraud-proof mechanism is effectively to enable a light client, which is a bridge. So I’m really curious — I have some thoughts on this, but I’m really curious to hear from you all. Are there lots of bridges coming to your projects? I think this is an interesting question for the ZK-proof folks as well, because the answer might be a little bit different for those rollups. So are there more bridges coming? What does the bridge ecosystem look like in a year? And how are these different bridges working?

Alex Gluchowski(Matter Labs)

So there is a lot of interest.

🎤 Moderator-Benjamin Jones(Optimism)

Right? Yes, I was gonna pass this to ZK folks first, because I know I understand less about the ZK world.

Alex Gluchowski(Matter Labs)

Sure. So the bridge ecosystem is flourishing: we see a lot of bridges being built, and we see some bridges already getting into production. Our focus is on security and trustlessness, and we want to leverage that to the maximum. With ZK, it’s really easy to pass the final state from one zk-Rollup to another within a very short period of time. I don’t want to make exact predictions on what the world will look like, because we will see a lot of innovation coming to this space that goes beyond my imagination. But I think we will have lightweight solutions where you can quickly send some funds in a trusted way, and we will have more sophisticated machinery for bridging assets of very high value between different rollups in a very robust way.

Zachary Williamson(Aztec)

I guess to follow on from that — Alex pretty much hit the nail on the head with what I was going to say, so this might be a bit of repetition. With zero-knowledge proofs you can, in theory, succinctly and quickly verify the correctness of an entirely different chain, which opens up the possibility of completely trustless swaps between two chains, as long as you trust the consensus mechanisms of the respective chains. I think we’re a little ways off from that actually happening, just because all these chains have different architectures and different interfaces, and connecting them involves a lot of work. That’s why I think the long-term goal is for all of these systems to be completely decentralized and composable. For us that’s going to take a little bit longer, just because we add privacy into the mix — programmable private smart contracts are something we’re targeting about 12 months from now. Once you have composability, the community can build their own bridges and you get this innovation; we’re seeing that happen on other L2s. I feel like bridging is going to be solved by the community, because these networks are maximally decentralized and based around composability. Bridges between L2s are going to be a big deal, just because people are going to want to use different L2s — they’ll have different projects and different trade-offs of features, like privacy, for example. And you don’t want to go through an L1 like Ethereum to transfer between two L2s, because then you have an extremely expensive transaction on your hands. In the early days, I think most bridges are going to be fairly trusted, but I’m very excited about the longer-term future where you have fully trustless swaps built around zero-knowledge proofs.

Harry Kalodner(Arbitrum)

There are two relatively interesting things I could say about bridging. I certainly agree with all of this — obviously bridging is quite important to optimistic rollups because of the withdrawal delay. One is that one of the nice things about bridges from a security perspective is that most of the risk is on liquidity providers rather than users. Not all of it, but essentially, for users, if you’re swapping, you put your funds in and you get funds out quite quickly. You’re not leaving funds in somebody’s control for long — the worst-case scenario, your funds disappearing, is still not good, but essentially the risk is much higher for the liquidity providers in those bridges, and those are the people who can actually do the technical due diligence to make sure they’re comfortable with it. So that’s one thing that’s nice: not that bridges are inherently safe, but at least there’s a strict limit to how much you need to trust them. The other is that the liquidity-based bridges try to keep funds in balance across the chains they are bridging, which means they can actually make cheap things even cheaper. For example, if I’m depositing into an optimistic rollup, where there’s no delay, I may still want to go across one of these bridges, because it might be valuable to that bridge to rebalance assets from Ethereum onto the rollup. So even if you’re not saving time, you could save money by using one of these protocols.

🎤 Moderator-Benjamin Jones(Optimism)

Yeah, I agree. One of the visions we had early on, in the Plasma days, was… there was all this talk about fast exits. The OG idea of a Plasma fast exit was that you do this exit, it’s a pending withdrawal, and you get an NFT; you can sell that NFT and get your funds out, right? It’s been really interesting to see that massively evolve as things have kept progressing: okay, maybe it’s not an NFT that you’re selling, but it is sort of an order that somebody else can fill by sending you money on L1. That was how we thought about it. And then we thought about it more and realized: wait, this basically means that in the steady state of a rollup, maybe nobody is ever depositing or withdrawing, because the demand for moving funds into the rollup and the demand for moving funds out are the same, and people should just be trading those assets all the time. Right?

So we had this realization: oh, maybe we’re not going to see any use of the native bridge at all — it’s all just going to be swaps. To your point, the market-based mechanics for liquidity are now basically showing that. Okay, we don’t get that perfect equilibrium, we’re not there, but there is a market that is basically incentivizing arbitrage to produce that equilibrium, or at least approximate it. I also totally agree with you that the taxonomy where users are only at risk while funds are in transit is definitely, definitely good. It’s interesting, though — honestly, there’s a funny parallel here with the earlier discussion about whether people understand decentralization, and it worries me. There are classes of bridges where users are fully collateralized, so it’s trustless: in the worst case the fast path might not happen and they have to wait for the slow-path withdrawal, but their money is guaranteed to be safe. Then there are other classes, which are less collateralized, where, to your point, the funds are at risk. But if the bridge is hacked once a month, while it takes me five minutes to get my money across, those might look like good odds. So I think this is honestly a similar concern: oh shoot, people are going to start using these cheaper bridges, because lower collateralization is probably more economically efficient. But then what happens when the bridge breaks?
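A minimal, hypothetical sketch of the fast-exit / liquidity-provider pattern discussed above: the user effectively sells a pending slow withdrawal to an LP, who fronts funds immediately and later collects the slow-path withdrawal. Names and the fee are illustrative; no particular bridge protocol is implied.

```typescript
// Hypothetical fast-exit flow: the user trades a pending slow withdrawal to an LP,
// who fronts funds on L1 right away and later collects the slow-path withdrawal.
// Purely illustrative; not the API of any specific bridge.

interface WithdrawalOrder {
  user: string;      // L1 address that should receive funds now
  amountEth: number; // amount locked/burned on the L2
  lpFeeEth: number;  // fee the user pays for instant liquidity
}

interface Settlement {
  paidNowByLp: number;  // what the LP sends the user on L1 immediately
  lpClaimLater: number; // what the LP collects when the slow withdrawal finalizes
}

function fillOrder(order: WithdrawalOrder): Settlement {
  // User risk is limited to the in-flight amount; the LP bears the waiting risk
  // (and, in under-collateralized designs, the bridge-failure risk).
  return {
    paidNowByLp: order.amountEth - order.lpFeeEth,
    lpClaimLater: order.amountEth,
  };
}

console.log(fillOrder({ user: "0xUser...", amountEth: 1, lpFeeEth: 0.002 }));
```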

Harry Kalodner(Arbitrum)

If we’re talking about bridging, I want to hop onto one of my favorite / least favorite subjects, which is related: fractionalized asset representation — aka, there are five different DAIs or USDCs on some chain, and they’re non-fungible with each other. That’s one of the biggest dividers to me in terms of bridging: what is actually being bridged? I’ve seen a mix of both — some bridges operate on top of the native bridge, and some define their own representation of the asset on the L2. In some cases you even have a DApp that decides its primary representation of a token is not going to be the native-bridge one, but the one bridged using an alternate bridge with a different representation. This is one of the issues that I think is going to keep emerging over time, and it’s going to be really tricky to deal with. I think the primary reason native, built-in bridges exist is to try to keep this from occurring — we could have just not built bridges and had other people do it, but then the problem would be even worse. So figuring out how to minimize that as much as possible is going to be interesting.
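To illustrate the fragmentation Harry describes: the same L1 token bridged through different bridges ends up as distinct, non-fungible L2 contracts. A toy sketch follows, with fabricated addresses and names.

```typescript
// Toy illustration of fractionalized asset representation: the L2 token you receive
// depends on which bridge minted it, so "DAI" from two bridges is not interchangeable.
// All addresses here are fabricated placeholders.

type L2Token = { l1Token: string; bridge: string; l2Address: string };

const representations: L2Token[] = [
  { l1Token: "DAI", bridge: "native-bridge", l2Address: "0xAAA..." },
  { l1Token: "DAI", bridge: "third-party-bridge", l2Address: "0xBBB..." },
];

// Fungibility only holds when the underlying L1 token AND the L2 contract match.
function isFungible(a: L2Token, b: L2Token): boolean {
  return a.l1Token === b.l1Token && a.l2Address === b.l2Address;
}

console.log(isFungible(representations[0], representations[1])); // false: two "DAIs"
```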

🎤 Moderator-Benjamin Jones(Optimism)

Yeah, that’s an interesting point, because for those projects that choose to use a non-native representation, the statement we made earlier about users only being at risk while funds are in transit is no longer true, right? The assumption there is that the risk ends because you trade back out into the native asset. So if an app chooses to keep the non-native asset, you’re actually exposed to that risk for a much longer term. That’s an interesting note.

Harry Kalodner(Arbitrum)

I know this happened on xDai as well. It’s just impossible to completely avoid it.

🎤 Moderator-Benjamin Jones(Optimism)

Yeah, for sure. I remember seeing some wallet UX horror-story tweets, like a year ago or something, that were all about this. It’s interesting — in the ZK space, it seems to me like you guys probably have a little bit of an easier time with this, but that’s just an intuition, right? I’m not as familiar with your protocols as with the optimistic ones, but it seems like the fact that you have validity proofs that resolve in far less than seven days gives you an opportunity to keep the fungibility alive, so to speak. Do you guys think that’s true?

Avihu Levy(StarkWare)

I think that in any case you will still have this problem if you don’t maintain, at least in the beginning, some canonical bridge to L1 — even if you submit proofs very frequently and get finality really, really quickly. If you get implementations of different bridges that do different things on L1, you can still get fragmentation of liquidity — fragmentation of the L2 token. So in the beginning, you have to make sure it’s all coming from the same source. After some time, you will anyway get the effect where everybody trusts whatever — someone makes a bridge for DAI and that becomes the canonical bridge for Maker, and so on. I also think that short finality is an advantage: it makes life easier for liquidity providers, and therefore you’ll probably see lower fees on faster-finality rollups. But on the other hand, we also want the better user experience of almost instant transfers, so you’ll see that emerging — you’ll see the bridge ecosystem emerging on zk-Rollups too. That’s what I think.

Zachary Williamson(Aztec)

Yeah, I think long term there’s always going to be some kind of fragmentation, because these are permissionless networks: you can’t control people, you can only incentivize them to act in certain ways. The way it currently works at Aztec is that we have L1 smart contracts that act as custodians of your tokens while they’re on the L2, and for a given token type it’s very important that everyone uses the same custodian contract — because if you don’t, then you get divided liquidity. If you have two smart contracts that each hold DAI for the Aztec network, then you have two different representations of DAI on the L2. I do think users’ incentives are relatively well aligned to use the same custodian contract, so I’m quite hopeful that the levels of fragmentation will be acceptable and won’t be severe. But we’ll see how that plays out.

🎤 Moderator-Benjamin Jones(Optimism)

Wait until your users start depositing multiple versions of wrapped ETH as the next WETH comes up, then you’ll be in trouble.

Zachary Williamson(Aztec)

Yeah, well, we can’t help with L1 fragmentation. If that’s happening on L1, then we’ll just need…

🎤 Moderator-Benjamin Jones(Optimism)

You’re just inheriting that from L1, that’s true.

Alright, guys, we’re looking at about 9 minutes left. I have a generic question here about UX for mass adoption, but I feel like we’ve already talked about UX a little bit, so I’m wondering whether, for the last 9 minutes, we shouldn’t just open up the floor. I’m curious what y’all might want to talk about or shill — what are you excited about in 2022? I feel like this is gonna be a big year for L2, so I’m sure there are some opinions there. So yeah, let’s open up the floor.

Alex Gluchowski(Matter Labs)

I have a question. We have a couple of problems with UX and security on L1 which we have a chance to address on L2 from the get-go, and I’m curious what your take on them is. One particular aspect — or maybe they’re interconnected — is the different approaches to dealing with native ETH versus ERC-20 tokens, and with approvals for ERC-20 tokens, because that’s a massive problem. Just last week we saw a few people get hacked because the front ends of certain protocols were compromised and they had previously approved unlimited amounts of tokens to those protocols. Weird things happen, and it would be much easier if you had a very clear policy: whenever you interact with a contract, you authorize exactly the number of tokens that you want to spend with that contract. That requires changes in the products — all the contracts that rely on ERC-20 just do not work this way. So how do you think about this issue?

Harry Kalodner(Arbitrum)

There’s one nice thing we had the opportunity to do, which is a relatively small thing but one we were really excited about, which was to set the standard for what the default token looks like on our L2. One big thing we were able to do there is to include permit — I don’t think we need to go into the exact details, people on the call are probably generally familiar with it — but basically it’s a better, extended ERC-20 standard that DApps can just assume tokens implement. It doesn’t solve everything, but it certainly at least improves the situation.
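For readers less familiar with the approval problem Alex raised and the permit improvement Harry mentions, here is a small ethers.js sketch contrasting an unlimited approval with an exact-amount one. The addresses are placeholders, and this is illustrative only — it is not any specific L2’s token standard.

```typescript
import { ethers } from "ethers";

// Minimal ERC-20 ABI fragment for approvals (human-readable ABI).
const erc20Abi = ["function approve(address spender, uint256 amount) returns (bool)"];

async function approveExact(signer: ethers.Signer) {
  const token = new ethers.Contract("0xTokenAddress...", erc20Abi, signer); // placeholder
  const spender = "0xProtocolAddress...";                                   // placeholder

  // Risky but common pattern: unlimited approval. If the protocol (or its front end)
  // is later compromised, the approved contract can drain the full balance.
  // await token.approve(spender, ethers.constants.MaxUint256);

  // The policy Alex describes: approve exactly what this interaction will spend.
  await token.approve(spender, ethers.utils.parseUnits("100", 18));

  // An EIP-2612-style `permit` (the extension Harry mentions) goes further: the user
  // signs an off-chain message and the DApp submits approval + action in one
  // transaction, avoiding a standing on-chain allowance.
}
```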

🎤 Moderator-Benjamin Jones(Optimism)

Yeah, I think the challenge in that question, Alex, comes down to: what, to some extent, is our role as people building infrastructure that others are going to build applications on top of? One of the biggest realizations for us this year has been that there’s incredibly strong power in following the EVM to a tee. This is what we called our EVM equivalence upgrade: following the EVM not just in a way that looks familiar to devs, but literally using the exact same gas schedule for everything and having the exact same geth code running behind it. So to some extent, I feel like that philosophy counters our ability, at least at the protocol level, to make any decisions that would change this behavior, because ultimately the approval behaviors that have emerged are just a result of how the EVM gets used. So to the extent that it’s our responsibility to just build an EVM-equivalent platform, I somehow feel like my hands are tied. What we can do is try to provide leadership and promote the right kinds of practices, and I think maybe that just comes down to community. Ultimately, it can’t just be us building the infrastructure; but if we promote a culture of security and try to develop around these new standards, there’s a chance, as these nascent networks start growing, to get some of it right from the get-go. But I think that’s as much about community as it is about our particular responsibilities as protocol developers.

Harry Kalodner(Arbitrum)

Totally agreed.

Zachary Williamson(Aztec)

I can have a stab at this as well. I think at Aztec we’re in a bit of a semi-unique position, where we have something which is both a problem and an opportunity combined. That’s because everything is private by default in Aztec, which means we can’t conform to the interfaces or the model of the EVM. One of the benefits of that, though, is that we have a chance to re-architect things from the ground up, because we don’t have this pressure to conform to the EVM to ease adoption — because, well, it’s private. So, for example, in our network ETH and tokens are the same: they’re not treated as different classes, and you can pay for transactions in tokens if you want. For the composable, programmable part of our network, we will be representing ETH just like another kind of token, so you don’t have these alternative transaction flows. And similarly, in the canonical asset contracts we will almost certainly be adding a permit function, and also shield and unshield functions. But I feel like for every problem we solve, we’re going to create another one, because composable privacy hasn’t really been done in a big way before — for every issue solved, we’ll probably create another from the decisions we make. We’ll see, we’ll find out soon.

Avihu Levy(StarkWare)

It’s hard to predict the future.

Zachary Williamson(Aztec)

But yeah, pretty much.

🎤 Moderator-Benjamin Jones(Optimism)

I guarantee that will always be true. Okay, I’ll jump in here again. Actually, you just reminded me, Zach, of one other thing, which is native ETH vs. ERC-20 ETH — this is something you mentioned, Alex, and I totally neglected to address it, but I want to take a minute to shill people joining the discussion about how Optimistic Ethereum should handle this.

In OVM 1.0 we actually implemented it both ways. So if you deposited ETH, the way that would work is that you would receive an ETH balance that you could query with all of the native ETH opcodes, but it would also show up as the result of a balanceOf call to a standard ERC-20-like contract.

And so you actually had a WETH-like contract where the result of the balanceOf call was the same as your native balance, and you could pay ETH from that ERC-20. We actually disabled that, because when we went the EVM equivalence route we realized this is not equivalent to the EVM. It may be something that people are a fan of, but we just disabled it to keep things equivalent for now. So if you are a dev listening to this and you have strong opinions — because we have opinions both ways on this — go to our GitHub at github.com/Ethereum-optimism/optimism and contribute to the discussion, because we want to know what you think we should do.
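A small, hypothetical sketch of what that dual representation looked like from a developer’s perspective: the same ETH balance visible both natively and through an ERC-20 interface. The contract address below is a placeholder, not an actual predeploy, and as Ben notes this behavior is disabled in the current EVM-equivalent system.

```typescript
import { ethers } from "ethers";

// Sketch of querying one ETH balance two ways under the old OVM 1.0-style design.
// The contract address below is a placeholder, not the real predeploy.
const erc20Abi = ["function balanceOf(address owner) view returns (uint256)"];

async function showBothViews(provider: ethers.providers.Provider, user: string) {
  // View 1: native balance, as the BALANCE opcode / eth_getBalance would report it.
  const nativeBalance = await provider.getBalance(user);

  // View 2: the same ETH exposed through a WETH-like ERC-20 contract.
  const ovmEth = new ethers.Contract("0xPlaceholderPredeploy...", erc20Abi, provider);
  const erc20Balance = await ovmEth.balanceOf(user);

  // In the dual-representation design these were intended to match;
  // the feature was disabled to stay strictly EVM-equivalent.
  console.log(nativeBalance.toString(), erc20Balance.toString());
}
```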

Harry Kalodner(Arbitrum)

I have a good shill: Ethereum 2.0. I know this is an L2 panel, but I feel like it would be incomplete if we didn’t shill the excitement of the upcoming merge, and all of the many funny new names being coined for the various roadmap components.

🎤 Moderator-Benjamin Jones(Optimism)

Oh yeah, the merge — I agree. ETH2 is going to be key for scaling, and I guess even beyond that there’s 4488.

Avihu Levy(StarkWare)

Yeah someone should mention this. Thank you.

Harry Kalodner(Arbitrum)

Is someone gonna appreciate the effect on the ability of rollups to sync, and how data availability of past blocks will work for a freshly syncing rollup node under these new models?

🎤 Moderator-Benjamin Jones(Optimism)

Oh, that’s interesting. Wow. Okay, that’s a juicy one that we’ll have to save for another panel, y’all, but that is making me think. Yeah, 4488 though — tell your friends. I think it will help everyone on this call to have larger block sizes without that meaning more gas.

Zachary Williamson(Aztec)

Yeah awesome stuff.

🎤 Moderator-Benjamin Jones(Optimism)

Alright guys, well, I think that’s as good a time to stop as any, so I want to give you all a sincere thanks for taking the time to sit down and chat and get into some real meat. I thought that was really fun — thanks to everyone.

ALL:

Thank you Ben. Thanks a lot guys. A lot of fun.

About Co-hosts

❄️ IOSG Ventures

IOSG Ventures, founded in 2017, is a community-friendly and research-driven early-stage venture firm. We focus on open finance, Web 3.0, and infrastructure for a decentralized economy. As a developer-friendly fund with long-term values, we launched the Kickstarter Program, which offers innovative and courageous developers capital and resources. We consistently cooperate with our partners and connect with communities, and we work closely with our portfolio projects throughout their journey of entrepreneurship.

❄️ StarkWare

StarkWare invented, and continually develops, STARK-based Layer-2 Validity Proof scaling solutions over Ethereum. StarkWare’s solutions, which rely on Ethereum’s security, have settled over $250B, and over 60M transactions, serving hundreds of thousands of users. StarkNet, StarkWare’s permissionless general-purpose scaling solution, is live (Alpha) on Ethereum Mainnet. StarkEx, a custom standalone scaling service, has been powering applications since June 2020, including dYdX, Immutable X, Sorare, and DeversiFi.

❄️ imToken

imToken is a decentralized digital wallet used to manage and safeguard a wide range of blockchain- and token-based assets, identities, and data. Since its founding in 2016, it has helped its users transact and exchange billions of dollars in value across more than 150 countries around the world. imToken allows its users to manage assets on 12 mainstream blockchains and all EVM chains; it also supports decentralized token exchange and an open DApp browser.

❄️ Arbitrum

Arbitrum is a leading Ethereum Layer-2 scaling solution developed by Offchain Labs. Based on the Optimistic Rollup scheme, Arbitrum enables ultrafast, low-cost transactions without sacrificing the security of the Ethereum ecosystem. Launched on August 31st, 2021, Arbitrum has attracted 100+ ecosystem projects. Arbitrum is currently EVM-compatible at the bytecode level. In the next upgrade, Arbitrum Nitro, Arbitrum will further improve the developer experience by incorporating WASM support.
