0x1D40
May 26th, 2022

The Ropsten merge is a significant milestone in Ethereum’s progress towards proof-of-stake. It is the first pre-existing public testnet to be merged, making it an important test-case for merging Mainnet. It has already been entertaining because a naughty miner deployed a lot of hashpower to the network and suddenly brought the merge date much closer - so close that the merge date passed before a Ropsten Beacon Chain even existed. The client teams quickly posted fixes that pushed the trigger for the merge (TTD - terminal total difficulty) into the far future. Nodes that were sync’d to Ropsten halted when the original TTD was hit, because Geth switched off its consensus and block gossip functions at that point and there was not yet a Beacon Chain to take over those responsibilities.

I am writing this article on the day that the Ropsten Beacon Chain is supposed to go live and the client fixes that bump the TTD have already been released. The Ropsten merge itself will not happen for a few days so there is still time to participate by syncing a node and spinning up a validator.
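For anyone syncing a node, a quick way to watch the countdown is to compare the chain’s accumulated total difficulty against the TTD. Here is a minimal sketch (my own, not from the client teams) using web3.py against a local Geth node; the TTD constant is the bumped Ropsten value announced at the time of writing, but treat it as a placeholder and confirm it against the official release notes.

```python
# Minimal sketch: compare a Ropsten node's total difficulty to the merge trigger (TTD).
# Assumes web3.py is installed and Geth is serving HTTP-RPC locally on the default port.
from web3 import Web3

# Placeholder - confirm the bumped Ropsten TTD against the official client release notes.
TERMINAL_TOTAL_DIFFICULTY = 100_000_000_000_000_000_000_000

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))
block = w3.eth.get_block("latest")

remaining = TERMINAL_TOTAL_DIFFICULTY - block["totalDifficulty"]
print(f"Block {block['number']}: total difficulty {block['totalDifficulty']}")
if remaining > 0:
    print(f"Still {remaining} difficulty to accumulate before the merge is triggered")
else:
    print("TTD reached - consensus is now the Beacon Chain's job")
```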

Update: Ethereum Foundation notes on the Ropsten merge are now available here.

It is important to make responsible choices for both the execution and consensus client. The correct choice will differ person by person, but using minority clients is strongly encouraged. I chose to use Lodestar for my consensus client in this tutorial as it is currently one of the lesser-used clients, being newer than some of the others. Increasing the adoption of minority clients is important for evening out the client diversity, which has security benefits for the network as a whole. I am, however, still using Geth as my execution client despite it being the majority client on the execution layer simply because I know Geth quite well and already have a Geth node sync’d to Ropsten. Readers are encouraged to experiment with minority execution clients.

0x1D40
May 18th, 2022

Sybils

Quadratic funding - the mechanism that currently determines the value of Gitcoin grant funding - is inherently vulnerable to Sybil attacks: individual humans dividing themselves into multiple “virtual humans” in order to gain additional voting weight. In traditional banking and voting systems, Sybil resistance comes from “KYC” (know-your-customer), which links personal identifying information to some action. In Web3, “KYC” is generally minimized because it undermines the core ethos of censorship resistance and permissionlessness. This means other methods are required to identify which participants in a grant round are real individual humans, and which are not.
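To see why quadratic funding invites this behaviour, here is a toy sketch (my own illustration, not Gitcoin’s actual matching code) of the core QF idea: a project’s matching weight grows with the square of the sum of the square roots of its contributions, so one donation split across many fake accounts earns far more matching than the same donation from a single account.

```python
# Toy quadratic funding calculation - an illustration only, not Gitcoin's implementation.
from math import sqrt

def qf_match(contributions):
    """Un-normalized QF weight: (sum of square roots of contributions)^2."""
    return sum(sqrt(c) for c in contributions) ** 2

# One honest donor contributing 100 units.
honest = qf_match([100])

# The same 100 units split across 100 Sybil accounts, 1 unit each.
sybil = qf_match([1] * 100)

print(honest)  # 100.0
print(sybil)   # 10000.0 - the same money now claims 100x the matching weight
```

The 100x inflation in this toy example is exactly the leverage a Sybil attacker is chasing.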

Sybil Strategies

The goal of Sybil defense is to increase the investment of time and money required for an attacker to convince a grant review system that they are > 1 person, to the extent that a rational attacker would not do it. Defenders constantly attempt to push this cost up while minimizing their own expenses, while attackers constantly try to pull the attack cost down. The greater the size of the exploitable pool of funds, the higher the cost an attacker will be willing to pay. At the same time, extremely low-cost Sybil attacks are often worthwhile for attackers because even a low success rate can still be profitable if the attack cost is sufficiently low. This means that a robust Sybil defense structure requires systems that identify cheap, simple attacks very effectively and efficiently, as well as more complex defenses against sophisticated attacks.
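A back-of-the-envelope model of this arms race (my own simplification, with entirely made-up numbers) treats an attack as rational whenever the expected winnings exceed the cost of creating and running the Sybil accounts:

```python
# Illustrative attacker economics - all numbers are hypothetical.
def attack_is_rational(n_sybils, cost_per_sybil, expected_payout, detection_rate):
    """An attack pays off when expected winnings exceed the cost of the Sybil accounts."""
    expected_winnings = expected_payout * (1 - detection_rate)
    total_cost = n_sybils * cost_per_sybil
    return expected_winnings > total_cost

# Cheap, simple attacks can be worthwhile even with a high detection rate...
print(attack_is_rational(100, cost_per_sybil=0.10, expected_payout=500, detection_rate=0.9))  # True
# ...which is why defenders work to push the per-account cost up.
print(attack_is_rational(100, cost_per_sybil=5.00, expected_payout=500, detection_rate=0.9))  # False
```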

0x1D40
May 17th, 2022

Across 13 rounds, Gitcoin has given almost $60 million to public goods. Projects that demonstrably create positive externalities bid for portions of the total funding pool. Like any substantial pool of money, Gitcoin grants attract diverse attacks from bad actors aiming to divert a portion of that money away from public goods and into their own wallets. The role of Gitcoin's Fraud Detection and Defense (FDD) squad is to protect the Gitcoin community - a diverse group that includes users, $GTC holders, grant recipients, donors and stakeholders in funded projects - from these attacks. From FDD emerges a protective layer that filters out attackers, enables partnerships with people and projects that have genuinely good intentions and delivers a trustworthy set of grant decisions. In doing so, FDD minimizes the financial spillage to dishonesty and incompetence and thereby maximizes the public goods that can be supported by a given pool of funds. This article explores the various components of FDD and explains how they operate together to form a community "trust function" that protects public goods.

Trust in FDD

FDD aims to deliver grant decisions that can be trusted by the community.

In this context, trust can be defined as the belief that the grant evaluation system is effective at eliminating dishonesty and wastefulness. To foster this belief, the community must perceive the process to be transparent and well-aligned with its values. This requires an open system that ensures grant applicants, grant reviewers and voters act honestly. Trust can be thought of as the synthesis of five core concepts:

0x1D40
April 1st, 2022

People make or break DAOs, so onboarding good people is critical. However, it is also notoriously difficult to get onboarding right, and not only for DAOs - onboarding challenges cost businesses millions every year, and there is a growing recognition that organizations with strong onboarding protocols outperform those with a "sink or swim" approach. Onboarding is where organizations and contributors make their first impressions on each other, set expectations and establish the tone of their new relationship. Getting this wrong has substantial costs in terms of reputation, time, morale and money, as well as the opportunity cost when potentially great contributors decide to go elsewhere.

In this article I will reflect on my own onboarding experiences into the Fraud Detection and Defense (FDD) squad in Gitcoin DAO. These reflections might help the ongoing efforts to refine the onboarding process and also provide some pointers for prospective contributors.

DAO-level onboarding

In December 2021 I decided to start contributing to a DAO. Gitcoin was the obvious choice for me. Public goods funding was my gateway into the web3 space a few years earlier and I was arriving thoroughly green-pilled. I'd also been a winner of a Gitcoin RFP a few months before and had met a few people from the DAO as a result. I was happy to chat with some of the core DAO members about potential routes to work. These initial conversations led me to the FDD stream, where I felt my background in data science could be put to good use. The first steps were quite organic and very informal, mostly consisting of Discord DMs.

0x1D40
March 2nd, 2022

Thanks to Tim Beiko and Caspar Schwarz-Schilling for helpful comments on earlier drafts!

Ethereum is a notoriously adversarial environment. It has even been compared to a “dark forest” - acknowledging the terrifying game-theoretic concept from The Three-Body Problem that being visible to other entities in the universe is an unavoidable precursor to being destroyed by them. This reputation mostly comes from weaknesses in the application layer (insecure smart contracts) or the social layer (users being manipulated to give up their private keys or unwittingly sign transactions), and from the existence of bots extracting value from the transaction mempool. However, sophisticated hackers acting either as thieves or saboteurs are also constantly seeking out opportunities to attack Ethereum’s client software. The client software is what turns a computer into an Ethereum node - it is code that defines all the rules for connecting to other nodes, swapping information and agreeing on the state of the Ethereum blockchain. Attacks on the protocol layer are attacks on Ethereum itself.

Soon, Ethereum clients will undergo a major upgrade (“the merge”) that will switch off their protective proof-of-work algorithm and replace it with a proof-of-stake mechanism instead. There are many reasons for this, which have been discussed at length elsewhere. This will be a philosophical change as well as a technical one. Securing the network will no longer depend on honest participants always spending more than an attacker plausibly could; instead, the model becomes cheap for honest network participants and expensive only for an attacker. The merge to proof-of-stake brings security, sustainability and scalability benefits, but on the other hand the complexity of the client software will grow and so will the protocol’s potential attack surface. Participating in securing the Ethereum blockchain currently requires running a single piece of software; after “the merge” it will require three (execution client, consensus client, validator).

This article gives an overview of known attack vectors on the Ethereum’s consensus layer and outlines how those attacks can be defended. Some basic knowledge of the Beacon Chain is probably required to get the most value from this article. Good introductory material is available here, here and here. Also, it will be helpful to have a basic understanding of the Beacon Chain’s incentive layer and fork-choice algorithm, LMD-GHOST. These are big topics, but I’ve included a very high level primer in the preamble below.
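To give a flavour of what LMD-GHOST actually does, here is a heavily simplified sketch (my own, ignoring validator balances, slots and many other protocol details): starting from the last agreed-upon block, it repeatedly descends into the child subtree that carries the most weight from each validator’s latest attestation.

```python
# Heavily simplified LMD-GHOST fork choice - illustrative only, not the consensus spec.
from collections import defaultdict

# Block tree expressed as child -> parent, plus each validator's latest attested block.
parent = {"B": "A", "C": "A", "D": "B", "E": "C"}
latest_attestation = {"val0": "D", "val1": "E", "val2": "E"}  # validator -> block

children = defaultdict(list)
for child, par in parent.items():
    children[par].append(child)

def subtree_weight(block):
    """Count the latest attestations landing in the subtree rooted at `block`."""
    weight = sum(1 for b in latest_attestation.values() if b == block)
    return weight + sum(subtree_weight(c) for c in children[block])

def lmd_ghost(root):
    """Walk from the root, always following the heaviest child, to find the chain head."""
    head = root
    while children[head]:
        head = max(children[head], key=subtree_weight)
    return head

print(lmd_ghost("A"))  # "E" - the A->C->E branch carries 2 of the 3 latest attestations
```

The real algorithm weights each attestation by the validator’s effective balance and layers on further rules, but the heaviest-subtree walk is the core idea.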

0x1D40
February 8th, 2022

This article was one of 7 winners of the Gitcoin Public Goods RFP October 21: https://gitcoin.co/blog/seeking-a-new-kind-of-public-good-closing-the-loop/

Science is the process of discovery. It powers technological advancement and enables us to navigate and manipulate our environment. Strong economic, ethical and pragmatic arguments can be made in favour of scientific knowledge being a public good. This is especially true now, when damaging disinformation is rife and novel discoveries will be needed to fight existential-level threats like pandemics and climate change. However, this knowledge comes from scientific research, and our current infrastructure for doing that research is broken, from the initial allocation of funds right through to the eventual dissemination of results. The reasons for this are diverse but ultimately share a common thread: they are emergent phenomena of centralized control. In future, a stack of DeSci dapps could offer a more attractive model for altruistically driving, doing and disseminating scientific research. This essay will make the case for DeSci and suggest a potential roadmap for building out a DeSci stack.

The broken TradSci model

Science is enabled by the distribution of funds to individuals or groups who propose to complete some specific project. In almost all cases, written applications are scored by a small panel of individuals who might then interview shortlisted candidates prior to awarding funds to a successful few. This general model has a long history, but it is also well known to be vulnerable to the biases, politics and self-interest of the review panel. Grant application scores have been shown to have no correlation with the eventual outcomes of the funded projects, indicating that review panels do a poor job of selecting high-quality projects. The same proposals given to different panels have wildly different outcomes, without even agreement on the relative merits of the proposals. These issues have been amplified as research funding has become more scarce over time, entrenching a “funding crisis”. Funders have increasingly favored “safe hands”, hindering the progression of new researchers and stifling intellectually ambitious projects. The effect has been to circulate money around an established pool of academics, while also creating a hyper-competitive funding landscape that incentivizes applicants to over-promise and under-deliver. There have been calls to replace the current system with, for example, grant lotteries. Overall, the current centralized funding model is deeply inefficient, entrenches perverse incentives and undermines the scientific progress it is supposed to promote.
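The grant lottery idea mentioned above is simple enough to sketch (this is my own toy illustration, not any funder’s actual scheme): proposals that clear a basic quality bar go into a random draw until the budget is exhausted, removing the panel’s fine-grained ranking and the perverse incentives that come with it.

```python
# Toy grant lottery - illustrative only; the thresholds, budgets and proposals are made up.
import random

proposals = [
    {"id": "P1", "quality": 0.8, "requested": 50_000},
    {"id": "P2", "quality": 0.4, "requested": 30_000},
    {"id": "P3", "quality": 0.9, "requested": 70_000},
    {"id": "P4", "quality": 0.7, "requested": 40_000},
]

QUALITY_BAR = 0.6   # reviewers only decide pass/fail, not a fine-grained ranking
BUDGET = 100_000

eligible = [p for p in proposals if p["quality"] >= QUALITY_BAR]
random.shuffle(eligible)

funded, remaining = [], BUDGET
for p in eligible:
    if p["requested"] <= remaining:
        funded.append(p["id"])
        remaining -= p["requested"]

print(funded)  # a random subset of the eligible proposals that fits the budget
```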

0x1D40
February 8th, 2022

This is a more detailed version of an introductory article I wrote at ethereum.org

Ethereum's current energy expenditure with proof-of-work (PoW) is too high. Resolving energy expenditure concerns without sacrificing security and decentralization is a significant technical challenge and has been a focus of research and development for years. This article will explore why building Ethereum has had a substantial environmental impact, how it compares to other chains and how the upcoming network upgrade to proof-of-stake (PoS) will dramatically change this. This is undeniably a very controversial issue that attracts a lot of vitriol on all sides of the debate. This article is an attempt to pick some signal out of the noise.

Energy secures the network

Transactions on the PoW Ethereum blockchain are validated by miners. Miners bundle together transactions into ordered blocks and add them to the blockchain. The new blocks get broadcast to all the other node operators who run the transactions independently and verify that they are valid. Any dishonesty shows up as an inconsistency between different nodes. Honest blocks are added to the blockchain and become an immutable part of the chain’s history.
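The energy cost comes from the mining itself: to add a block, a miner must find a nonce that makes the block’s hash fall below a difficulty-dependent target, which can only be done by brute-force guessing. A toy sketch of that loop (my own illustration, far simpler than Ethereum’s real ethash mining) looks like this:

```python
# Toy proof-of-work loop - illustrative only, nothing like a real client's mining code.
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce so sha256(block_data + nonce) has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # valid proof-of-work found
        nonce += 1  # every failed guess is electricity spent for nothing

nonce = mine(b"example block", difficulty_bits=20)
print(f"Found nonce {nonce} - verifying it needs one hash, finding it took ~2^20 tries")
```

Verification is cheap for every node, but producing the proof is deliberately expensive, and that asymmetry is what makes rewriting history prohibitively costly under PoW.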

0x1D40
February 8th, 2022

This is a detailed accompaniment to an introductory article I wrote at ethereum.org

Ethereum has multiple interoperable clients developed and maintained in different languages by independent teams. This is a major achievement and can provide resilience to the network by limiting the effects of a bug or attack to only the portion of the network running the affected client. However, this strength is only realized if users distribute roughly evenly across the available clients. At present, the vast majority of Ethereum nodes run a single client, inviting unnecessary risk to the network.
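To make the risk concrete, here is a rough sketch (my own, with made-up share numbers) of the thresholds that matter under the proof-of-stake consensus discussed next: a bug in a client run by more than 1/3 of the network can stall finality, and a supermajority client above 2/3 could finalize an invalid chain.

```python
# Rough client-diversity risk check - the shares are hypothetical, not real measurements.
client_share = {
    "ClientA": 0.70,  # hypothetical supermajority client
    "ClientB": 0.20,
    "ClientC": 0.07,
    "ClientD": 0.03,
}

for client, share in client_share.items():
    if share > 2 / 3:
        risk = "a consensus bug here could finalize an invalid chain"
    elif share > 1 / 3:
        risk = "a bug here could prevent the chain from finalizing"
    else:
        risk = "a bug here is an inconvenience, not a network-level failure"
    print(f"{client}: {share:.0%} - {risk}")
```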

Ethereum will soon undergo one of the most significant upgrades to its architecture since its inception - the merge from proof-of-work (PoW) to proof-of-stake (PoS). This will fundamentally change the way the network comes to consensus about the true state of the blockchain and the way network security is maintained. This new architecture brings security, scalability and sustainability benefits, but at the same time amplifies the risks associated with single-client dominance. This article will explore why…