Small language models challenge Big Tech’s AI giants

Community-owned small language model project introduces a framework for incentivized data sharing, aiming to redefine AI development and user interaction.

Assisterr targets the monopolization of AI by Big Tech, advocating for data ownership and the democratization of AI through community-owned small language models (SLMs), which offer tailored and efficient solutions.

The debate on whether artificial intelligence (AI) poses a global threat often misses a crucial point: the real danger lies not in AI itself but in its potential monopolization by the tech giants — commonly known as Big Tech — and governmental bodies. These powerful entities can misuse AI to subtly shape public perceptions and behaviors to serve their own ends, whether for profit maximization or political control.

Far from being a dystopian fantasy, this scenario reflects our current reality, demanding immediate intervention. Data ownership lies at the heart of the issues with AI technology. Big Tech has effectively appropriated the collective knowledge of humanity, training their large language models (LLMs) on free information and then locking it behind $20 monthly subscriptions.

Google's $60 million annual investment for access to Reddit’s treasure trove of user-generated content underscores the disparity between the value created by community contributions and the compensation (or lack thereof) received by those contributors.

Empowering communities with small language models

Against this backdrop, Assisterr — a Cambridge-based data layer for decentralized AI — positions itself as a force for change by creating an infrastructure that backs decentralized AI data inference and a network of community-owned SLMs, empowering the very people who feed the data ecosystem.

Assisterr provides a data infrastructure layer for small language models. Source: Assisterr

SLMs represent a targeted approach to AI, honed to address specific use cases with greater efficiency and lower costs compared to their larger counterparts. Marrying efficiency with high-quality assistance, SLMs excel in automating and enhancing real-time interactions and support for developers within the Web3 ecosystem.

Assisterr’s integration of blockchain technology facilitates a transparent mechanism for tracking community contributions and incentivizes the sharing of previously inaccessible knowledge and data through rewards.
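To make the idea concrete, here is a minimal, purely hypothetical sketch of what incentive-driven contribution tracking could look like: an append-only ledger that hashes each contribution for later verification and credits the contributor with a flat reward. The names (`ContributionLedger`, `REWARD_PER_CONTRIBUTION`) and the flat-reward scheme are illustrative assumptions, not Assisterr's actual on-chain implementation.

```python
import hashlib
from dataclasses import dataclass, field

# Assumed flat token reward per accepted contribution (illustrative only)
REWARD_PER_CONTRIBUTION = 10

@dataclass
class ContributionLedger:
    """Toy append-only ledger for community data contributions."""
    entries: list = field(default_factory=list)
    balances: dict = field(default_factory=dict)

    def record(self, contributor: str, data: bytes) -> str:
        # Hash the contribution so its provenance can be verified later
        digest = hashlib.sha256(data).hexdigest()
        self.entries.append((contributor, digest))
        # Credit the contributor with an incentive reward
        self.balances[contributor] = (
            self.balances.get(contributor, 0) + REWARD_PER_CONTRIBUTION
        )
        return digest

ledger = ContributionLedger()
ledger.record("alice", b"solana docs excerpt")
ledger.record("alice", b"near sdk example")
print(ledger.balances["alice"])  # 20
```

In a real deployment, the ledger entries and reward logic would live on-chain, which is what makes the contribution record transparent and tamper-resistant; the sketch only shows the bookkeeping shape of the idea.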

Community-owned SLMs have two pivotal advantages over LLMs:

  • SLMs are more efficient and cheaper to train and maintain, making them ideally suited for specific business or technical needs.

  • The importance of a dynamic data pipeline often surpasses that of sheer model size, as regular updates to data are essential for keeping AI models relevant.


Assisterr addresses a key challenge in developing AI-powered solutions: the reluctance of individuals and organizations to share data. It does so with infrastructure for the quick setup of models and a framework that rewards data sharing with incentives.

Assisterr solves data inference bottlenecks by facilitating quick model setups and motivating data sharing through incentives.

At its core, Assisterr enables the creation of SLMs specialized for specific domains or business functions, which can be integrated with user interfaces and improved through community contributions. The data layer’s SLMs are highly effective in their designated areas, benefitting from the expertise and continuous data updates provided by individual contributors.

From gatekeepers to contributors: Changing the AI narrative

Assisterr trained AI-powered developer relations agents (DevRel AI agents) for platforms including Solana, Near, Particle Network and Light Link. Trained using extensive tech documentation and codebases, DevRel AI agents improved customer service by handling up to 95% of support requests, reducing wait times and identifying areas for documentation improvement.

Assisterr’s model ensures the SLMs’ expertise in specific fields and champions community data ownership. The project’s approach includes an AI infrastructure layer for the interoperability of community-owned models and a mechanism for incentive-driven data contribution and verification, ensuring the models remain up-to-date and efficient.

Alongside initiating a contributors program, Assisterr is scheduled to launch its testnet and deploy 100 AI agents in the second quarter of 2024. The mainnet transition, Solana integration and beta release of the AI Lab will follow, along with the AI Lab's official launch and Monad integration.

The goal of Assisterr is to maintain an up-to-date knowledge base for each AI model, ensuring an optimal user experience and interface. Beyond providing support to the developer community, Assisterr envisions expanding its capabilities to include the development of DApps and applications on behalf of users in the near future.

Learn more about Assisterr

Original post was on CoinTelegraph
