In DeFi, all financial transactions are public by definition. Yet compiling a DAO's financial data on a quarterly or annual basis is not always as simple as it is in traditional finance, where monitoring two or three bank accounts is usually enough to gather all transactions. In BadgerDAO's case, the system comprises at least 250 different addresses, making it a challenge to generate a complete overview of all transactions.
In the past couple of years, numerous products have entered the market with the goal of solving this very issue. Many of them, however, fall short on coverage: as soon as a tool cannot price an ERC-20 token, or does not support a specific type of action or protocol, it renders an incomplete report. Badger therefore had no choice but to write an application from scratch, in which the custom business logic for every single transaction that ever occurred is correctly captured.
For the most part, on-chain data is captured using ApeWorX's ape, custom Dune queries and Etherscan's API. Each data source has its own advantages and disadvantages, which is why it is important to pull from multiple sources. An obvious angle is leveraging Ethereum's event logs, which both ape and Dune handle well. Badger even added events to existing smart contracts where they were missing, in order to capture them for reporting.
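To illustrate the log-based approach, here is a minimal sketch of turning a raw `eth_getLogs`-style entry (the shape returned by Etherscan's API) into a flat transfer record. The sample log, addresses and amounts below are made up for illustration and are not real Badger transactions; only the Transfer topic hash is the standard ERC-20 constant.

```python
# keccak256("Transfer(address,address,uint256)") -- the standard ERC-20 event topic
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def decode_transfer(log):
    """Turn one raw log entry into a flat {token, from, to, value, block} record."""
    topics = log["topics"]
    if topics[0] != TRANSFER_TOPIC:
        return None  # not an ERC-20 Transfer event
    return {
        "token": log["address"],
        "from": "0x" + topics[1][-40:],  # indexed args are left-padded to 32 bytes
        "to": "0x" + topics[2][-40:],
        "value": int(log["data"], 16),   # the unindexed uint256 amount
        "block": int(log["blockNumber"], 16),
    }

# Illustrative log entry; the sender, recipient and amount are invented.
sample_log = {
    "address": "0x3472a5a71965499acd81997a54bba8d852c6e53d",  # BADGER token
    "topics": [
        TRANSFER_TOPIC,
        "0x" + "00" * 12 + "11" * 20,  # from (padded to 32 bytes)
        "0x" + "00" * 12 + "22" * 20,  # to (padded to 32 bytes)
    ],
    "data": "0x0de0b6b3a7640000",      # 1e18, i.e. 1.0 token with 18 decimals
    "blockNumber": "0xe4e1c0",         # block 15,000,000
}

record = decode_transfer(sample_log)
```

The same records can of course be produced by ape's event queries or a Dune query; the point is that every source is normalised into one flat schema before the later stages run.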
Alternatively, one could try to categorise all transactions based on their origin, destination or calldata. While this has worked in the past, what if the system gets upgraded and some addresses get replaced? What if an external system starts adding a different type of ERC-20 into the mix? Having a complete set of events in your smart contract system that logs everything is a must if you want to make sense of what is happening later.
Once all events are captured, an important next step is pricing them. For this, the system uses historical data from both CoinGecko and Badger's API; the latter even allows pricing a token at a specific block. This makes it possible to calculate the value of complex transactions dating back months or even years.
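Pricing at the time of occurrence can be sketched as a lookup into a cached series of historical prices. In this sketch the series and all numbers are made up, standing in for data that would in practice come from CoinGecko or Badger's API.

```python
from bisect import bisect_right

def price_at(series, timestamp):
    """Return the most recent known price at or before `timestamp`.

    `series` is a list of (unix_timestamp, usd_price) tuples, sorted by time.
    """
    times = [t for t, _ in series]
    i = bisect_right(times, timestamp) - 1
    if i < 0:
        raise ValueError("no price known before this timestamp")
    return series[i][1]

# Invented daily price points for illustration -- not real market data.
badger_usd = [
    (1640995200, 9.40),   # 2022-01-01
    (1641081600, 9.85),   # 2022-01-02
    (1641168000, 10.12),  # 2022-01-03
]

# Value a hypothetical transfer of 250 tokens at its moment of occurrence.
value_usd = 250 * price_at(badger_usd, 1641100000)
```

A block-based lookup works the same way, with block numbers as the sort key instead of timestamps.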
Finally, with all events now priced at their value at time of occurrence, all transactions need to be categorised in accordance with traditional accounting specifications and requirements. Which type of revenue is this specific ERC-20 token transfer? Are these costs recurring or one-offs? Is this transaction internal or external? This is where Badger's programmatic approach, made possible by a transparent and open-source protocol, really shines. In traditional systems, much of this labelling is done manually based on uploaded invoices and domain-specific knowledge. The vast majority of Badger's transactions, however, are labelled automatically based on the underlying events and smart contracts. Only a tiny fraction still has to be labelled manually, for which Utopia's app on top of the Safe multisig contracts proved to be very useful. It currently processes all outgoing transactions, and even those are mostly tagged by preset rules already.
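A rule-based labelling pass along these lines might look like the following. The addresses, event names and labels are hypothetical placeholders, not Badger's actual chart of accounts; the fall-through to manual review mirrors the small remainder described above.

```python
# Hypothetical addresses -- placeholders for illustration only.
TREASURY = "0x" + "aa" * 20
VAULT = "0x" + "bb" * 20

# (predicate, label) pairs, evaluated in order; first match wins.
RULES = [
    (lambda tx: tx["to"] == TREASURY and tx["event"] == "Harvest", "revenue:yield"),
    (lambda tx: tx["from"] == TREASURY and tx["to"] == VAULT, "internal:transfer"),
    (lambda tx: tx["from"] == TREASURY, "cost:outgoing"),
]

def label(tx):
    """Label a decoded transaction by its events; fall back to manual review."""
    for predicate, tag in RULES:
        if predicate(tx):
            return tag
    return "manual-review"  # the tiny remainder still gets tagged by hand

harvest = {"from": VAULT, "to": TREASURY, "event": "Harvest"}
tag = label(harvest)
```

Because the rules key off events and known contract addresses rather than invoices, the bulk of labelling stays fully automatic and repeatable.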
This ETL process eventually results in a database of dated, priced and labelled transactions. Those can then be read by data reporting tools such as Google's Data Studio, in which various perspectives and reports on the data can be defined. Take, for example, 2022's profit & loss report in USD, all costs expressed in their native token, or the treasury's current non-native holdings.
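The end of the pipeline can be sketched as a roll-up of those dated, priced, labelled transactions into a simple P&L breakdown, analogous to what a reporting tool would render; the figures here are invented for illustration.

```python
from collections import defaultdict

# Illustrative output of the earlier stages: dated, priced, labelled records.
transactions = [
    {"date": "2022-03-01", "label": "revenue:yield", "usd": 1200.0},
    {"date": "2022-03-02", "label": "cost:gas", "usd": -35.5},
    {"date": "2022-03-05", "label": "revenue:yield", "usd": 800.0},
]

def profit_and_loss(txs):
    """Sum USD values per label and return (breakdown, net result)."""
    by_label = defaultdict(float)
    for tx in txs:
        by_label[tx["label"]] += tx["usd"]
    return dict(by_label), sum(by_label.values())

breakdown, net = profit_and_loss(transactions)
```

In practice the grouping and filtering happens in the reporting layer, but the underlying query is exactly this kind of sum-by-label over the transaction database.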
Badger will continue to advance its ability to understand and report on end-to-end financial performance:
- Strengthen the reliable and scalable system to extract, transform and load data into financial statements
- Trend performance over time against budgets, and integrate that analysis into product and treasury results
- Make more informed, better-supported decisions overall through enhanced financial planning and analysis
Be Relentless. Be Badgers