Stable Diffusion and Web3

Stable Diffusion and Web3: Transforming image2image data with the decentralized cloud

Artificial intelligence text-to-image tools like Stable Diffusion, Midjourney, and DALL-E 2 are rapidly unlocking new possibilities for memes, marketing, and predictive learning for advertising.

Stable Diffusion was officially released into public beta on August 22, 2022 as an open-source alternative to DALL-E 2.

Because its weights and training code are open-source, anyone can download and experiment with the trained machine learning model. Stable Diffusion does not impose the restrictions on political or sensitive content that DALL-E 2 does, so users must leverage the model at their own risk.

Stable Diffusion is quickly becoming more popular than DALL-E 2

As such, these open-source models are especially well suited for web3-based decentralized cloud storage solutions, like Storj DCS.

Storj DCS is a decentralized object storage solution where data is encrypted and erasure coded by the client – and then distributed across a network of uncorrelated nodes around the world.

A “network of uncorrelated nodes” means resilience through diversity – due to its unique architecture, data stored on the network can’t be withheld, censored, or “held hostage” by any one individual, company, or state actor.

As part of a web3 stack, this decentralized cloud storage protocol can replace Amazon S3-backed solutions, with Storj DCS acting as a distributed, decentralized, and multi-cloud IPFS pin – one actually worthy of the “web3” moniker.

Hyper-Parallelism means better performance

Storj breaks each file into 64 MB segments, and each segment is erasure coded into 80 pieces distributed across the network – any 29 of which can rebuild the segment. This means that when you stream a file, instead of streaming it as a single sequential thread from a datacenter like AWS US-EAST-1, you are downloading it over multiple, concurrent connections.
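As a back-of-the-envelope sketch of what those numbers imply (illustrative arithmetic only – the actual redundancy parameters are tuned by the network):

# Back-of-the-envelope math for 29-of-80 erasure coding on 64 MB segments (illustrative)
SEGMENT_MB = 64       # segment size
TOTAL_PIECES = 80     # pieces distributed across nodes
MIN_PIECES = 29       # pieces required to rebuild a segment

piece_mb = SEGMENT_MB / MIN_PIECES              # each piece is ~2.2 MB
expansion_factor = TOTAL_PIECES / MIN_PIECES    # ~2.76x storage expansion
tolerable_losses = TOTAL_PIECES - MIN_PIECES    # up to 51 pieces can disappear

print(f"Piece size: ~{piece_mb:.1f} MB")
print(f"Expansion factor: {expansion_factor:.2f}x")
print(f"Pieces that can be lost per segment: {tolerable_losses}")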

This method of decentralization + erasure coding + parallel packet streaming has a number of advantages vs pinning to a centralized datacenter like AWS.

Critically, rather than relying on a single service provider, Storj facilitates a competitive, open, and global market for saturating bandwidth – where the fastest-responding nodes worldwide supply the pieces, and the file is recreated at the source (client-side). In addition to better resiliency and uptime, this quality of decentralized cloud storage also means better performance, everywhere.
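The uplink client handles this parallel reconstruction for you, but the underlying idea – saturating bandwidth with many concurrent connections instead of one sequential stream – can be illustrated with a short sketch. The example below simply downloads a file in parallel byte ranges over HTTP, using the direct-download link to the example image from this guide and assuming the endpoint honors Range requests; it is not how the Storj protocol itself is implemented:

# Illustration only: fetch a file in parallel byte ranges, mimicking the idea of
# many concurrent streams saturating bandwidth (not the actual uplink protocol).
import requests
from concurrent.futures import ThreadPoolExecutor

URL = "https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg"
CHUNK = 1 << 20  # download in 1 MiB ranges

def fetch_range(start, end):
    # Request a single byte range of the object
    resp = requests.get(URL, headers={"Range": f"bytes={start}-{end}"})
    resp.raise_for_status()
    return start, resp.content

# Determine the total object size, then split it into byte ranges
size = int(requests.head(URL, allow_redirects=True).headers["Content-Length"])
ranges = [(s, min(s + CHUNK, size) - 1) for s in range(0, size, CHUNK)]

# Fetch all ranges concurrently and reassemble them in order
with ThreadPoolExecutor(max_workers=8) as pool:
    parts = sorted(pool.map(lambda r: fetch_range(*r), ranges))

with open("BearDrawing.jpeg", "wb") as f:
    f.write(b"".join(chunk for _, chunk in parts))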


In this developer walkthrough, we will utilize this cutting-edge AI tool to pull an image from a bucket on the decentralized cloud, transform/diffuse it, and upload the result back to a new bucket.

The code for this solution can be found on Github here: https://github.com/keleffew/decentralized-diffusion/blob/main/Img2Img_Stable_Diffusion_on_Decentralized_Cloud.ipynb

The weights, model card and code for the Stable Diffusion model can be viewed here:  https://huggingface.co/CompVis/stable-diffusion.


Step 1: Upload a base reference image to the decentralized cloud

To get started, let’s first upload a base image – the one we will later transform – to the decentralized cloud.

This can be done via the command line toolkit (https://docs.storj.io/dcs/getting-started/quickstart-uplink-cli/uploading-your-first-object/), or by using the Web Portal App at storj.io/login.

For this guide, I created a sketch of a bear, using the best of my artistic talents, and uploaded it to the decentralized cloud. You can see the results below.

A sketch image of a bear made with an online Paint tool.

The image is sharded and dispersed across the decentralized cloud. We can find the realtime node distribution below:

Sketch image source: https://link.storjshare.io/s/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg

After uploading the file, grab the link from Storj linkshare and replace the /s/ path segment with /raw/ – indicating a direct download.

Example: https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg
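If you are scripting this step, the substitution is a one-liner (a minimal sketch using the share link from above):

# Convert a Storj linkshare URL (/s/) into a direct-download URL (/raw/)
share_url = "https://link.storjshare.io/s/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg"
raw_url = share_url.replace("/s/", "/raw/", 1)
print(raw_url)
# https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg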


Step 2: Create Storj S3 Gateway Credentials

The S3 Gateway lets you use existing AWS libraries (like Boto3 for Python) to upload data directly to the decentralized cloud.

For the codebase linked on GitHub above, we will need to input the S3 credentials: an access key, a secret key, and the gateway endpoint. The docs for generating these are located at https://docs.storj.io/dcs/api-reference/s3-compatible-gateway/.

Generating S3 Gateway Credentials for S3-compatibility on web3
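Once your credentials are generated, the same Boto3 setup used at the end of the notebook can also pull the base image back down from the bucket. Below is a minimal sketch – 'ACCESS' and 'SECRET' are placeholders for your gateway keys, and the bucket and object names are the ones from Step 1:

# Download the base image from the decentralized cloud via the Storj S3 gateway
import boto3

s3 = boto3.resource('s3',
                    endpoint_url='https://gateway.storjshare.io',
                    aws_access_key_id='ACCESS',       # placeholder – your gateway access key
                    aws_secret_access_key='SECRET')   # placeholder – your gateway secret key

# Bucket and key from Step 1 of this walkthrough
s3.Bucket('machine-learning-test').download_file('BearDrawing.jpeg', 'BearDrawing.jpeg')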



Step 3: Updating the configuration

Navigate to: https://colab.research.google.com/drive/1hyHOBsy9UqS79vewLjcmBXeYTBkM6qpK#scrollTo=uMqbvbHGR6O8 and start to run the code blocks as described.

Run Steps 1-2 in the Jupyter notebook to import the relevant dependencies.

To access the Stable Diffusion model, you will need to create a Hugging Face account (https://huggingface.co/) and generate a User Access Token to import in step 3.

# Authenticate with Hugging Face so the model weights can be downloaded
from huggingface_hub import notebook_login

notebook_login()
Hugging Face Access Token Page

In the final Step, enter the relevant Access Key, Secret Key, and Gateway Endpoint config information.

# Export the transformed image to the decentralized cloud

# Boto3 is the AWS SDK for Python; the Storj S3 gateway is S3-compatible
import boto3

# Pull S3 credentials from Storj.io – for config docs see:
# https://docs.storj.io/dcs/api-reference/s3-compatible-gateway/
s3 = boto3.resource('s3',
                    endpoint_url='https://gateway.storjshare.io',
                    aws_access_key_id='ACCESS',
                    aws_secret_access_key='SECRET')

# Upload the generated file to the bucket
with open('brittish-gosling.png', 'rb') as data:
    s3.Bucket('machine-learning-test').put_object(Key='test.jpg', Body=data)

Step 4: Running the Decentralized Diffusion Script

Now that the configuration has been updated, we can run the script, and an image will be pulled from the Decentralized Cloud, transformed, and then re-uploaded in parallel.
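For reference, the core of the notebook's transform step looks roughly like the sketch below, which loads the weights from Hugging Face with the diffusers StableDiffusionImg2ImgPipeline and feeds it the /raw/ link from Step 1. The prompt, strength, and guidance values here are illustrative placeholders, and older diffusers releases name the image argument init_image instead of image:

# Sketch of the img2img step: fetch the base image, diffuse it, save the result
import requests
import torch
from io import BytesIO
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Direct-download link to the base image from Step 1
raw_url = "https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg"
init_image = Image.open(BytesIO(requests.get(raw_url).content)).convert("RGB").resize((512, 512))

# Load the Stable Diffusion weights from Hugging Face (requires the access token login above)
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,
).to("cuda")

# Example prompt – adjust strength / guidance_scale to control how far the sketch is transformed
result = pipe(prompt="a photorealistic bear in a forest",
              image=init_image,
              strength=0.75,
              guidance_scale=7.5).images[0]

result.save("brittish-gosling.png")  # then upload with the Boto3 snippet above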

Let’s check out our result:

Stable Diffusion output resulting from earlier script

Congratulations! 

You have successfully used Stable Diffusion with Storj DCS, storing the transformed data in a decentralized manner on web3!


Kevin Leffew is the Chief Product Officer at Europa Labs, a software company building web3 applications and infrastructure for companies, creators, and brands.
