Inscriptions EVM Madness

Since we run our own EVM infrastructure and don't rely on any third-party services, we have to deal with every new trend that lets bots and users push Geth performance to new lows on the same hardware. In December it was Inscriptions.

For whatever reason, someone decided that moving BRC-20 to EVM chains was a good idea. It can be implemented cheaply by submitting transactions with nothing but data in the input field.

inscriptions raw

These are easy to parse with the ClickHouse JSONExtractString function:

inscriptions parsed
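As a quick illustration (a minimal sketch using a made-up payload, not a real transaction), the same decoding can be reproduced in a single ClickHouse query:

        WITH hex('data:,{"p":"erc-20","op":"mint","tick":"gwei","amt":"1000"}') AS input
        SELECT
            -- keep only the JSON part of the calldata, starting from the first '{'
            substring(unhex(input), position(unhex(input), '{')) AS input_json,
            JSONExtractString(input_json, 'p') AS standard,
            JSONExtractString(input_json, 'op') AS operation,
            JSONExtractString(input_json, 'tick') AS ticker,
            JSONExtractString(input_json, 'amt') AS amount;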

So everyone started deploying and "minting" whatever they could, nodes caught fire, infrastructure across multiple chains got massively de-synced, block explorers were lagging, and all the other things we love whenever we go through the next "performance test" like that.

Following @cygaar's post on Inscriptions

and intrigued by what the hell of a thunderstorm was causing nodes to de-sync on Polygon and BSC, I did my own research. In a nutshell, whoever wants to build a dApp dealing with inscriptions needs to run their own indexer, so that whenever a transaction carrying an inscription comes in, they can update the state of the asset accordingly. The indexer operates off-chain, and as long as everyone participating follows the brc-20 standard in the metadata they put into the transaction input, anyone indexing all the transactions can reconstruct the balance state of every ticker. Tickers themselves are not unique, just like symbols in ERC-20, so the unique combination to track is the deployer's from address plus the ticker name.
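To make that uniqueness point concrete, here is a small sketch (running against the `inscriptions` table defined further down in this post) that lists tickers deployed by more than one address:

        -- tickers alone are not unique: the same tick can be deployed by many addresses,
        -- so an indexer has to key asset state by (deployer, ticker)
        SELECT
            ticker,
            count() AS deployments,
            groupArray(from_address) AS deployers
        FROM inscriptions
        WHERE operation = 'deploy'
        GROUP BY ticker
        HAVING deployments > 1
        ORDER BY deployments DESC
        LIMIT 20;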

Following that pattern, and looking into the EVM Inscriptions 📝 Dune dashboard created by @hildobby, I decided to tackle the thing myself and dropped a couple of materialized views on top of our transactions table in the ClickHouse cluster powering Guru Warehouse:

CREATE TABLE `inscriptions` $on_cluster
        (
            `block_number` UInt64,
            `block_timestamp` UInt32,
            `transaction_hash` String CODEC(ZSTD(1)),
            `from_address` String CODEC(ZSTD(1)),
            `to_address` Nullable(String) CODEC(ZSTD(1)),
            `transaction_index` UInt32,
            `gas_used` UInt64,
            `gas_price` UInt256,
            `standard` LowCardinality(String),
            `operation` LowCardinality(String),
            `ticker` String CODEC(ZSTD(1)),
            `amount` String CODEC(ZSTD(1)),
            `inscription` JSON CODEC(ZSTD(1)),
            INDEX blocks_timestamp block_timestamp TYPE minmax GRANULARITY 1
        )
        ENGINE = ${replicated}ReplacingMergeTree
        ORDER BY (block_number, transaction_hash)
        SETTINGS index_granularity = 8192;



        CREATE MATERIALIZED VIEW `inscription_mv_parsed_transactions` $on_cluster
        TO `inscriptions`
        AS
        SELECT
            block_number as block_number,
            block_timestamp as block_timestamp,
            hash AS transaction_hash,
            from_address AS from_address,
            to_address AS to_address,
            transaction_index AS transaction_index,
            ifNull(receipt_gas_used, 0) as gas_used,
            ifNull(receipt_effective_gas_price, 0) as gas_price,
            JSONExtractString(input_json, 'p') AS standard,
            JSONExtractString(input_json, 'op') AS operation,
            JSONExtractString(input_json, 'tick') AS ticker,
            JSONExtractString(input_json, 'amt') AS amount, -- keep amounts as strings to avoid Int64 overflow
            input_json AS inscription
        FROM
        (
            SELECT 
                block_number,
                block_timestamp,
                hash,
                from_address,
                to_address,
                transaction_index,
                receipt_gas_used,
                receipt_effective_gas_price,
                substring(unhexed_input, position(unhexed_input, '{')) AS input_json, -- Extracts the JSON part from the input
                unhex(input) AS unhexed_input -- Converts hex input to binary string
            FROM transactions
        )
        WHERE
            JSONHas(input_json, 'p') AND
            JSONHas(input_json, 'op'); -- Check if JSON extraction is successful
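
One caveat worth mentioning for anyone reproducing this: a materialized view only transforms rows inserted into `transactions` after the view is created, so historical transactions need a one-off backfill with the same parsing logic. A sketch of that backfill (same assumptions and simplifications as the view above):

        -- column order matches the `inscriptions` table definition above
        INSERT INTO inscriptions
        SELECT
            block_number,
            block_timestamp,
            hash,
            from_address,
            to_address,
            transaction_index,
            ifNull(receipt_gas_used, 0),
            ifNull(receipt_effective_gas_price, 0),
            JSONExtractString(input_json, 'p'),
            JSONExtractString(input_json, 'op'),
            JSONExtractString(input_json, 'tick'),
            JSONExtractString(input_json, 'amt'),
            input_json
        FROM
        (
            SELECT
                *,
                substring(unhex(input), position(unhex(input), '{')) AS input_json
            FROM transactions
        )
        WHERE JSONHas(input_json, 'p') AND JSONHas(input_json, 'op');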

Having those tables in the warehouse allowed me to start building queries:

inscriptions.all added to warehouse.dex.guru

Added Some Queries

queries in warehouse.dex.guru
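For a flavor of what those look like (a representative sketch, not necessarily one of the saved queries), a daily breakdown by standard and operation is a one-liner over the new table:

        SELECT
            toDate(toDateTime(block_timestamp)) AS day,
            standard,
            operation,
            count() AS inscriptions_count
        FROM inscriptions
        GROUP BY day, standard, operation
        ORDER BY day ASC, inscriptions_count DESC;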

All of those are combined into a dashboard showing overall activity as well as particular operation types.

Inscriptions dashboard in warehouse.dex.guru

Some funny tickers have been overminted tremendously:

overminted tickers on arb
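The ranking behind that chart boils down to something along these lines (again a sketch, not the exact dashboard query; amounts are parsed from strings, so malformed values fall back to zero):

        SELECT
            ticker,
            count() AS mint_txs,
            sum(toUInt256OrZero(amount)) AS total_minted
        FROM inscriptions
        WHERE operation = 'mint'
        GROUP BY ticker
        ORDER BY mint_txs DESC
        LIMIT 20;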

Outcomes

DexGuru Warehouse Dashboards Integration POC

We've decided to embed the Inscriptions warehouse dashboard into our Markets Overview page and test how the Warehouse <-> DexGuru integration looks overall, so we can reuse the same idea for more dashboards in the future, placing them on different pages and bringing community-driven analytics to the same place where trading happens.

Inscriptions API

All those queries are available via API, so anyone can build their own dashboards and analytics on top of them. https://warehouse.dex.guru/queries/158#-1

Inscriptions as a way to sync on-chain/off-chain Runtimes

While doing this research I kept looking for a way to use inscriptions for something genuinely useful, and landed on the idea of using them as reward points in an off-chain/on-chain rewards system: a user completes a quest off-chain (on a Web2 website) and can then claim points represented on-chain in inscription form. Benefits there:

  • Inscriptions are cheap to mint

  • Activities can happen off-chain (Web2) while still being recorded by submitting them to the inscriptions indexer.

So I'm looking forward to building something on top of them.
