Uncovering hidden gems in the NFT market can feel like searching for a needle in a haystack. Evaluating and comparing the value of each NFT against its listing price is time-consuming, and often by the time you've found a good deal, it's already been snatched up. But what if you had an AI-powered sidekick that could identify the most valuable NFTs in a collection with just a few clicks, improving your cost basis and boosting your overall returns?
This is where Atlas comes in. We leverage cutting-edge machine learning techniques to deliver real-time, best-in-class valuations for NFTs, so instead of haphazardly purchasing the lowest-priced NFTs in a collection, you can use a sophisticated system that helps you make informed decisions and maximize your returns:
1. Search for the collection you're interested in buying into.
2. The collection will automatically be sorted by best value, but you can also sort by lowest price for comparison.
3. Use filters for traits or price ranges, take a closer look at a few of the top NFTs, and purchase or sweep with confidence.
Thousands of data points per item. Thousands of items per collection. Thousands of collections.
Building a robust, real-time machine learning pipeline for fast inference using billions of data points is not easy. And that’s why we’re excited to share our learnings with you. ⚡
We calculate a valuation distribution for each individual NFT in every collection we support. But none of those valuations would be useful if they didn't react in real time to changing markets! We knew both breadth of coverage and speed of valuation were critical when designing a useful machine learning system, and we were up to the challenge. To accomplish this, we've segmented our learning system into two parts: a static metadata system and a real-time inference system.
The static metadata system precomputes all of the characteristics of an NFT that are unchanging, or at least change less frequently than our model needs to update: all of the trait combinations, the number of traits an NFT has, how many are gold traits, which other collections it is similar to, how long the collection has been around, the collection's historical volume…really anything that doesn't change often but contributes to the value of an NFT.
This static system then allows us to do deep inference on all that metadata. Think of this as doing all of the ‘heavy lifting’ of comparing all of those different combinations. We use neural networks to create embeddings of this metadata for each NFT so that they can be easily processed downstream in real-time as new order events come in. One technique we use to embed these features is Deep Hyperedges. With this method, we represent an NFT collection as a hypergraph. A hypergraph can be thought of as a big Venn diagram: points in the Venn diagram represent NFTs and circles represent traits. So, if there’s a circle for “gold fur” and another circle for “coin eyes”, an NFT with both of these traits would live in the intersection.
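The Venn-diagram picture above can be made concrete with a minimal sketch: each trait is a hyperedge whose members are the NFTs carrying it, and an NFT with two traits lives in the intersection of those two hyperedges. The token IDs and trait names here are hypothetical, and this is only the data representation, not the Deep Hyperedges embedding model itself.

```python
# Minimal sketch of a trait hypergraph over an NFT collection.
# Hypothetical data: token_id -> list of traits.
from collections import defaultdict

nfts = {
    1: ["gold fur", "coin eyes"],
    2: ["gold fur"],
    3: ["brown fur", "coin eyes"],
}

# Build the hypergraph: one hyperedge ("circle") per trait,
# containing the set of token IDs that carry that trait.
hyperedges = defaultdict(set)
for token_id, traits in nfts.items():
    for trait in traits:
        hyperedges[trait].add(token_id)

# NFTs in the intersection of "gold fur" and "coin eyes"
both = hyperedges["gold fur"] & hyperedges["coin eyes"]
print(both)  # {1}
```

Representing traits as hyperedges rather than pairwise graph edges keeps the many-to-many trait relationships intact, which is what makes the relational modeling downstream possible.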
This representation allows us to build a very powerful relational model and extract a large amount of implied information about an NFT's value from the market behavior of other NFTs in the collection.
But like we said earlier, if the valuations aren't in real time, what's the point!? Enter the real-time inference system. We feed the output of the heavy-lifting static system into the real-time system and then add our real-time signals, accounting for all of the NFT market structures that are in flux. The current floor, the current collection-wide bid ceiling, the current NFT listings, and the sales that happened just seconds ago are all events we stream through our system to perform real-time inference. These real-time market factors have far fewer combinations and can be distilled into numeric features much more directly than the static parts, so tracking them is (relatively!) less computationally intense to process.
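The split described above can be sketched as follows: the expensive static embedding is precomputed once, and at inference time it is simply concatenated with a small vector of live market signals. The feature names, dimensions, and function names here are illustrative assumptions, not Atlas's actual model interface.

```python
# Sketch of combining a precomputed static embedding with live signals.
# All names and dimensions are hypothetical.
import numpy as np

def realtime_features(floor, bid_ceiling, lowest_listing, last_sale):
    # Distill the fluctuating market state into a small numeric vector.
    return np.array([floor, bid_ceiling, lowest_listing, last_sale],
                    dtype=np.float32)

def inference_input(static_embedding, rt_features):
    # The heavy lifting happened offline; inference just concatenates
    # the precomputed embedding with the streamed real-time signals.
    return np.concatenate([static_embedding, rt_features])

static_emb = np.zeros(8, dtype=np.float32)  # placeholder precomputed embedding
x = inference_input(static_emb, realtime_features(60.0, 58.5, 65.0, 61.2))
print(x.shape)  # (12,)
```

Because only the small real-time vector changes per market event, each incoming order or sale triggers cheap feature updates rather than a full recomputation of the metadata embedding.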
Each individual NFT’s orderbook and transaction history gives us insight into its sentiment, market performance, and demand relative to other NFTs in the collection. By analyzing the historical and current offers and listings for each NFT in the collection, we can get a clear sense of what the open market is willing to do at a baseline. For instance, if an NFT has an offer at 5 ETH and is listed at 7 ETH, we know its value lies between 5 and 7 ETH. This information is used to calculate various technical indicators (such as the floor:ceiling ratio) that help us understand liquidity and market sentiment. Our sequence models process this with a bit more complexity and comprehension than a simple signal could.
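The offer/listing bound from the example above, and one plausible reading of a floor:ceiling ratio (collection floor price over collection-wide bid ceiling), can be sketched in a few lines. The exact definition Atlas uses isn't stated in the post, so treat the ratio below as an assumed illustration.

```python
# Sketch of simple orderbook-derived signals; definitions are assumptions.

def value_bounds(best_offer, listing_price):
    # An open offer and an active listing bound the NFT's market value:
    # someone will pay at least best_offer, and it can be bought at listing_price.
    return best_offer, listing_price

def floor_ceiling_ratio(floor_price, bid_ceiling):
    # One plausible definition: collection floor divided by the top
    # collection-wide bid. Near 1.0 means bids sit close to the floor
    # (a liquid, tight market); well above 1.0 means thin bid-side demand.
    return floor_price / bid_ceiling

lo, hi = value_bounds(5.0, 7.0)  # the 5 ETH offer / 7 ETH listing example
print(lo, hi)                    # 5.0 7.0
print(floor_ceiling_ratio(6.0, 5.0))  # 1.2
```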
Above is a visualization of the desirability rank of Bored Apes over time, based purely on their offers and listings (the orderbook), using a ranking algorithm we created. This serves to “sanity check” the valuations we provide.
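The actual ranking algorithm is Atlas's own, but a naive stand-in shows the flavor of an orderbook-only desirability rank: order tokens by their best open offer, breaking ties by the tightness of the bid-ask spread. The data and the ranking rule here are both hypothetical.

```python
# Naive stand-in for an orderbook-based desirability ranking.
# Hypothetical data: token_id -> (best_offer, lowest_listing).
orderbook = {
    101: (5.0, 7.0),
    102: (6.5, 6.9),
    103: (4.2, 9.0),
}

def desirability_rank(book):
    # Higher best offer = more demand; tighter spread breaks ties.
    return sorted(book, key=lambda t: (-book[t][0], book[t][1] - book[t][0]))

print(desirability_rank(orderbook))  # [102, 101, 103]
```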
A considerable amount of signal comes from Twitter for many collections. We leverage natural language models for processing related tweets, as well as graph neural networks to analyze the follower graph surrounding a collection. By analyzing social media data, we can determine the NFT collection’s reach and popularity, which can have a significant impact on its market cap. Similarly, we use the performance of other assets such as crypto and equities to gain insight into market conditions and how they may affect the collection’s value. We incorporate these other market factors into our dynamic real-time models as features to give traders not just a sense of how the NFTs are valued relative to the NFT market, but also relative to the overall market.
We’re glad you asked. We’ve asked the same questions of other NFT valuation services in the market, and that’s why we’ve built a robust system of interpretability tools to give you the information you need to leverage Atlas’ AI in your own workflow as a trusted companion. Stay tuned for our next articles on Liquidity Analysis and Interpretability which will dive into details about how you can use Atlas effectively and confidently in your trading workflow.
Remember, executing trades on Atlas means you’ll be saving on gas with the most gas-efficient aggregator around, and you’ll also be earning points toward the eventual airdrop!
Atlas is a revolutionary trading and financial products platform for NFTs, powered by cutting-edge machine learning. The team is deeply technical with a strong track record of building great products, having founded or been on the founding team of an NFT unicorn, a DeFi unicorn, and a highly successful quantitative trading firm. We’ve been building NFT marketplaces since 2018, quant and AI trading strategies since 2010, and have scaled multiple products from 0 to millions. Join Atlas today and be a part of the future of NFT trading.