State of Crypto+AI 2024

TL;DR

We conducted an in-depth analysis of 67 Crypto+AI projects, categorizing them from a GenAI perspective. Our classification covers:

  1. GPU DePIN

  2. Decentralized Compute (Training + Inference)

  3. Verification (ZKML + OPML)

  4. Crypto LLM

  5. Data (General + AI-Specific)

  6. AI Creator Apps

  7. AI Consumer Apps

  8. AI Standards (Tokens + Agents)

  9. AI Economy

Why we’re writing this

The Crypto+AI narrative has captured a lot of attention so far. Many reports on Crypto+AI are emerging, but they either cover just part of the AI story or interpret AI solely from the perspective of Crypto.

This article will look at the topic from an AI perspective, exploring how Crypto supports AI and how AI can benefit Crypto, to better understand the current landscape of the Crypto+AI industry.

Part I: Decoding the GenAI Landscape

Let's explore the entire GenAI landscape starting from the AI products we use every day. These products typically consist of two main components: an LLM and a UI. For the LLM, there are two key processes: model creation and model utilization, commonly known as Training and Inference. As for the UI, it comes in various forms, including conversation-based products like GPT, visual-based products like LumaAI, and many others that integrate inference APIs into existing product interfaces.
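To make the inference side concrete, here is a minimal sketch of how a product UI might call a hosted LLM's inference API. It uses the OpenAI Python SDK purely as an illustration; the model name and prompt are placeholders, and any provider with a chat-style API would look broadly similar.

```python
# Minimal sketch of the inference side: a product UI sending a prompt
# to a hosted LLM and displaying the completion. Illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Explain GPU DePIN in one sentence."}],
)
print(response.choices[0].message.content)
```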

Compute

Diving deeper, compute is fundamental to both training and inference, and both rely heavily on underlying GPU resources. While the physical interconnection of GPUs may differ between training and inference, GPUs serve as the foundational infrastructure component across AI products. Above this sits the orchestration of GPU clusters, known as Clouds. These can be split into traditional general-purpose clouds and Vertical Clouds[1], with Vertical Clouds being AI-focused and optimized for AI computing scenarios.

Storage

Regarding storage, AI data storage can be divided into traditional storage solutions like AWS S3 and Azure Blob Storage, and specific storage optimized for AI datasets. These specialized storage solutions, like Google Cloud's Filestore, are designed to enhance data access speeds in specific scenarios.

Training

Continuing with AI infrastructure, it's crucial to distinguish between Training and Inference, as they differ significantly. Beyond general-purpose compute, both involve a substantial amount of AI-specific logic.

For training, the infrastructure can generally be divided into[2]:

  • Platforms: Purpose-built training platforms that help AI developers train large language models effectively, often bundling software-level acceleration, such as MosaicML.

  • Base Model Providers: This category includes platforms like Hugging Face, which offer base models that users can further train or fine-tune.

  • Frameworks: Lastly, there are the foundational training frameworks used to build models from scratch, such as PyTorch and TensorFlow.

Inference

For Inference, the landscape can generally be divided into:

  • Optimizers: These specialize in a series of optimizations targeted at inference and particular use cases, such as parallel processing or algorithmic enhancements for media generation. One example is fal.ai, which has optimized inference for text-to-image workloads, improving diffusion speed by 50% compared to general approaches.

  • Deployment Platforms: These provide general model inference cloud services, such as Amazon SageMaker, facilitating the deployment and scaling of AI models across different environments.

Application

While there are countless AI applications, they can broadly be categorized based on user groups into two main types: creator and consumer[3].

  • AI Consumer: This group primarily uses AI products and is willing to pay for the value these products bring. A typical example in this category is ChatGPT.

  • AI Creator: On the other hand, applications for AI Creators are more about inviting creators onto their platforms to build agents and share knowledge, and then sharing profits with them, with the GPT marketplace being one of the most famous examples.

These two categories encompass nearly all AI applications. While more detailed classifications exist, this article will focus on these broader categories.

Part II: How Crypto helps AI

Before answering this question, let's summarize the main advantages of Crypto that AI could leverage: Monetization, Inclusivity, Transparency, Data ownership, Cost reduction, and more.

A high-level summary of crypto+AI intersections, from vitalik.eth's blog[4]

These key synergies[4] primarily help the current landscape by:

  • Monetization: Through unique Crypto mechanisms such as tokenization, monetization, and incentivization, disruptive innovations can be made in AI creator applications, ensuring that the AI economy is open and fair.

  • Inclusivity: Crypto enables participation without the need for permission, breaking the various constraints imposed by the closed, centralized AI companies that dominate the market today. This allows AI to achieve true openness and freedom.

  • Transparency: Crypto can make AI fully verifiable and open by utilizing ZKML/OPML technologies to put the entire training and inference process of LLMs on-chain, ensuring that AI remains open and permissionless.

  • Data ownership: On-chain transactions can establish data ownership for accounts (users), allowing users to truly own their AI data. This is especially beneficial at the application layer, helping users effectively secure their AI data rights.

  • Cost reduction: By incentivizing with tokens, the future value of computing power can be monetized up front, significantly reducing the current cost of GPUs. This approach greatly reduces the cost of AI at the compute layer.

Part III: Exploring the Crypto+AI Landscape

Applying these Crypto advantages to the categories above yields a new view of the AI landscape through the lens of Crypto.

LLM Layer

1. GPU DePIN

We continue to outline the Crypto+AI blueprint following the AI landscape above, starting with the LLM layer. At its foundation, the GPU level, a longstanding narrative in Crypto has been cost reduction.

Through blockchain incentivization, costs can be reduced significantly by rewarding GPU providers with tokens. This narrative is currently known as GPU DePIN. GPUs are used not only for AI but also for gaming, AR, and other scenarios, and the GPU DePIN track generally covers all of these.

Projects focused on the AI track include io.net, Aethir, and AIOZ Network, while those dedicated to visual rendering include Render Network, among others.
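To make the incentive mechanism concrete, here is a hypothetical sketch of how a DePIN network might score GPU providers and split a per-epoch token emission among them. The scoring weights and field names are illustrative, not any specific protocol's design.

```python
# Hypothetical sketch: splitting an epoch's token emission among GPU providers
# in proportion to useful work delivered (compute-hours weighted by uptime).
from dataclasses import dataclass

@dataclass
class Provider:
    address: str
    gpu_hours: float   # verified compute-hours served this epoch
    uptime: float      # fraction of the epoch the node was reachable, 0..1

def epoch_rewards(providers: list[Provider], emission: float) -> dict[str, float]:
    scores = {p.address: p.gpu_hours * p.uptime for p in providers}
    total = sum(scores.values())
    if total == 0:
        return {addr: 0.0 for addr in scores}
    return {addr: emission * s / total for addr, s in scores.items()}

providers = [
    Provider("0xA11ce", gpu_hours=120.0, uptime=0.99),
    Provider("0xB0b",   gpu_hours=80.0,  uptime=0.90),
]
print(epoch_rewards(providers, emission=10_000.0))
```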

2. Decentralized Compute

Decentralized computing is a narrative that has existed since the inception of blockchain and has developed significantly over time. However, due to the complexity of computing tasks (compared to decentralized storage), it often requires narrowing the supported computing scenarios.

AI, as the latest computing scenario, has naturally given rise to a series of decentralized computing projects. Compared to GPU DePIN, these decentralized computing platforms not only offer cost reduction but also cater to more specific computing scenarios: Training and Inference. They orchestrate over wide-area networks to significantly enhance scalability[5].

Scale and cost-efficiency, by gensyn.ai[5]

For example, platforms focused on Training include AI Arena, Gensyn, DIN, and Flock.io; those focused on Inference include Allora, Ritual, and Justu.ai; and those handling both aspects include Bittensor, 0G, Sentient, Akash, Phala, Ankr and Oasis.
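As a rough illustration of the orchestration problem these networks tackle, the sketch below shards a batch of inference requests across heterogeneous nodes in proportion to their throughput. It is a toy scheduler under assumed inputs, not any particular project's algorithm.

```python
# Toy scheduler: assign a batch of inference requests to decentralized nodes
# in proportion to each node's advertised throughput. Illustrative only.
def shard_batch(num_requests: int, node_throughput: dict[str, float]) -> dict[str, int]:
    total = sum(node_throughput.values())
    shares = {n: int(num_requests * t / total) for n, t in node_throughput.items()}
    # hand any rounding remainder to the fastest node
    remainder = num_requests - sum(shares.values())
    fastest = max(node_throughput, key=node_throughput.get)
    shares[fastest] += remainder
    return shares

print(shard_batch(1000, {"node-a": 50.0, "node-b": 30.0, "node-c": 20.0}))
# {'node-a': 500, 'node-b': 300, 'node-c': 200}
```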

3. Verification

Verification is a unique category within Crypto+AI, primarily because it ensures that the entire AI computing process, whether Training or Inference, can be verified on-chain.

This is crucial for maintaining complete decentralization and transparency of the processes. Additionally, technologies like ZKML also safeguard data privacy and security, allowing users to have 100% ownership of their personal data.

Depending on the algorithm and verification process, this can be divided into ZKML and OPML. ZKML uses zero-knowledge (ZK) technology to convert AI Training/Inference into ZK circuits, making the process verifiable on-chain, as seen with platforms like EZKL, Modulus Labs, Succinct, and Giza. OPML, by contrast, takes an optimistic approach: results are computed off-chain and posted on-chain, with fraud proofs available to challenge incorrect results during a dispute window, as demonstrated by Ora and Spectral.
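The two approaches differ mainly in when verification happens. The self-contained toy below contrasts the flows: ZKML verifies a proof before an output is accepted, while OPML accepts a result optimistically and relies on a challenger re-executing the computation. The "proofs" here are simulated with hashes; none of these functions mirror a real project's SDK.

```python
# Toy contrast between ZKML and OPML verification flows. Illustrative only;
# real systems use ZK circuits and interactive fraud-proof games.
import hashlib

def run_model(x: int) -> int:
    return x * 2  # stand-in for LLM inference

def zkml_flow(x: int) -> int:
    # ZKML: prove correctness up front; the chain only accepts verified outputs.
    y = run_model(x)
    proof = hashlib.sha256(f"{x}:{y}".encode()).hexdigest()  # stand-in for a ZK proof
    assert proof == hashlib.sha256(f"{x}:{run_model(x)}".encode()).hexdigest()
    return y  # accepted only after verification

def opml_flow(x: int, claimed_y: int) -> bool:
    # OPML: accept the claimed result optimistically; a challenger can
    # re-execute the computation and dispute it during a challenge window.
    honest_y = run_model(x)          # challenger's re-execution
    return claimed_y == honest_y     # False -> claim would be slashed and reverted

print(zkml_flow(21))        # 42, verified before acceptance
print(opml_flow(21, 42))    # True: optimistic claim survives the challenge window
print(opml_flow(21, 40))    # False: a fraud proof would reject this claim
```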

4. Crypto Base Model

Unlike general-purpose LLMs like ChatGPT or Claude, Crypto Base Models are further trained or fine-tuned on extensive crypto data, giving these base models a specialized knowledge base in cryptocurrency.

These base models can provide powerful AI capabilities to crypto-native applications such as DeFi, NFTs, and GameFi. Currently, examples of such base models include Pond and Chainbase.

5. Data

Data is a critical component of the AI field. Datasets play a crucial role in AI Training, and during Inference, the vast volume of user prompts and knowledge bases also demands substantial storage.

Decentralizing data storage not only significantly reduces storage costs but, more importantly, ensures traceability and ownership rights of the data.

Traditional decentralized storage solutions like Filecoin, Arweave, and Storj can store large volumes of AI data at very low costs.

Meanwhile, newer AI-specific data storage solutions are optimized for the unique characteristics of AI data. For example, Space and Time and OpenDB optimize data pre-processing and querying, while Masa, Grass, Nuklai, and KIP Protocol focus on monetizing AI data. Bagel Network concentrates on user data privacy.

These solutions leverage the unique advantages of Crypto to innovate in areas of data management within the AI field that have previously received less attention.

Application Layer

1. Creator

In the Crypto+AI application layer, creator applications are particularly noteworthy. Given Crypto's inherent capability for monetization, incentivizing AI Creators naturally follows.

For AI Creators, the focus splits between low/no-code users and developers. Low/no-code users, such as bot creators, use these platforms to create bots and monetize them through tokens/NFTs. They can quickly raise funds via an ICO or NFT mint, and then reward long-term token holders through shared ownership, such as revenue sharing. This opens up their AI products completely through community co-ownership, thus completing the AI Economy Lifecycle[6].

Moreover, as Crypto AI creator platforms, they address the challenges of early-to-mid-stage funding and long-term profitability for AI creators. This is done by leveraging the tokenization capabilities inherent to Crypto and offering services at a fraction of the take rates typical of Web2, demonstrating the near-zero operational costs enabled by Crypto's decentralization[7].
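As a toy illustration of the revenue-sharing piece of this lifecycle, the sketch below distributes an agent's earnings pro rata to token holders. The split, field names, and balances are hypothetical, not MagnetAI's or any other platform's actual mechanism.

```python
# Toy pro-rata revenue sharing for a tokenized AI agent: the creator keeps a
# fixed share and the remainder is split among token holders by balance.
# Percentages and field names are hypothetical.
def distribute_revenue(revenue: float, creator_share: float,
                       holder_balances: dict[str, float]) -> dict[str, float]:
    payouts = {"creator": revenue * creator_share}
    pool = revenue - payouts["creator"]
    supply = sum(holder_balances.values())
    for holder, balance in holder_balances.items():
        payouts[holder] = pool * balance / supply
    return payouts

holders = {"0xA11ce": 600.0, "0xB0b": 300.0, "0xCara": 100.0}
print(distribute_revenue(revenue=1_000.0, creator_share=0.30, holder_balances=holders))
# creator gets 300; holders split the remaining 700 as 420 / 210 / 70
```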

In this sector, platforms like MagnetAI, Olas, Myshell, Fetch.ai, Virtual Protocol, and Spectral cater to low/no-code users by providing agent creator platforms. For AI model developers, MagnetAI and Ora offer model developer platforms. Additionally, for other categories such as AI+Social creators, there are platforms like Story Protocol and CreatorBid tailored specifically to their needs, while SaharaAI focuses on monetizing knowledge bases.

2. Consumer

Consumer refers to using AI to directly serve crypto users. Currently, there are fewer projects in this track, but those that exist are irreplaceable and unique, such as Worldcoin and ChainGPT.

3. Standard

Standards are a distinctive track within Crypto, characterized by the development of independent blockchains, protocols, or protocol improvements to create AI dApp blockchains, or by enabling existing infrastructure, such as Ethereum, to support AI applications.

These standards enable AI dApps to embody Crypto advantages like transparency and decentralization, providing fundamental support to both creator and consumer products.

Examples include Ora, which extends ERC-20 to offer revenue sharing, and 7007.ai, which extends ERC-721 to tokenize model inference assets. Additionally, platforms like Talus, Theoriq, Alethea, and Morpheus are creating on-chain VMs to provide execution environments for AI Agents, while Sentient offers comprehensive standards for AI dApps.
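To illustrate what "tokenizing model inference assets" could mean in practice, here is a hypothetical off-chain accounting sketch in which a token-like record grants its holder a quota of inference calls and accrues a share of fees paid by other users. It is not 7007.ai's or Ora's actual standard; all names and numbers are assumptions.

```python
# Hypothetical model of a tokenized inference asset: the token grants its
# holder free inference calls and a share of third-party fees. Illustrative
# only; not any project's actual token standard.
from dataclasses import dataclass

@dataclass
class InferenceAsset:
    token_id: int
    holder: str
    call_quota: int       # free inference calls granted to the holder
    fee_share: float      # fraction of third-party fees routed to the holder
    accrued_fees: float = 0.0

    def holder_call(self) -> bool:
        if self.call_quota <= 0:
            return False
        self.call_quota -= 1
        return True

    def record_paid_call(self, fee: float) -> None:
        self.accrued_fees += fee * self.fee_share

asset = InferenceAsset(token_id=1, holder="0xA11ce", call_quota=100, fee_share=0.05)
asset.holder_call()           # holder consumes one of their free calls
asset.record_paid_call(2.0)   # a paying user's fee accrues 0.10 to the holder
print(asset.call_quota, asset.accrued_fees)  # 99 0.1
```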

4. AI Economy

AI Economy is a significant innovation within the Crypto+AI domain, emphasizing the use of Crypto's tokenization, monetization, and incentivization to democratize AI.

AI Economy Lifecycle, by MagnetAI[6]

It highlights the AI sharing economy, community co-ownership, and sharing of ownership rights. These innovations substantially drive the further prosperity and development of AI.

Among them, Theoriq and Fetch.ai focus on agent monetization; Olas emphasizes tokenization; Mind Network offers restaking benefits; and MagnetAI integrates tokenization, monetization, and incentivization into a single cohesive platform.

Last Part: Conclusion

AI and Crypto are natural partners. Crypto helps make AI more open and transparent, and provides irreplaceable support for its further prosperity.

AI, in turn, expands the boundaries of Crypto, attracting more users and attention. As a narrative that resonates with all of humanity, AI also brings the Crypto world an unprecedented mass-adoption story.

References

  1. “The Cloud Killed Infrastructure, Long Live Infrastructure!”, by A16Z: https://a16z.com/the-cloud-killed-infrastructure-long-live-infrastructure/

  2. AI Infrastructure Landscape: https://blog.segmind.com/the-generative-ai-infrastructure-landscape-by-segmind/

  3. “Generative AI’s Act Two”, by Sequoia: https://www.sequoiacap.com/article/generative-ai-act-two/

  4. “The promise and challenges of crypto + AI applications”, by Vitalik Buterin: https://vitalik.eth.limo/general/2024/01/30/cryptoai.html

  5. “Scale and cost-efficiency”, by Gensyn: https://docs.gensyn.ai/litepaper#scale-and-cost-efficiency

  6. “Rethinking AI Economy”, by MagnetAI: https://mirror.xyz/magnetai.eth/EmrtTXyRlgiSCCOY-4RZ6bPtMpJ9n96jjb8M9Cg5DYk

  7. “Your take rate is my opportunity”, by Chris Dixon: https://a16zcrypto.com/posts/article/going-from-web-2-to-web-3-your-take-rate-is-my-opportunity/


This article is written by the MagnetAI Research Team. This is version 1, and we will continue to update our coverage of the Crypto+AI landscape. If you're interested, please subscribe to our Mirror.

If you are building within Crypto+AI intersections, DM us.

About MagnetAI

MagnetAI is pioneering the first ModelFi protocol, designed to create financial opportunities for AI creators, consumers, and investors through blockchain-based principles like tokenization, incentivization, and monetization.

ModelFi transforms the AI economy through 3 primary components:

  • Model Store: A decentralized AI marketplace that enables creators to monetize their AI products, such as models, bots, agents, and knowledge bases.

  • Model Economy: Allows creators to tokenize their AI products, facilitating fractional ownership and benefits such as revenue sharing with "Model tokens."

  • Model Network: Integrates Web2, Web3, and Virtual GPU nodes into a cohesive inference network that is ZKP-based and on-chain verifiable.

Discover more about how MagnetAI is shaping a new AI economy on our homepage: https://magnetai.xyz/

Stay engaged with MagnetAI: Follow us on X, and join our Discord and Telegram for the latest news and developments.
