The human experience, for roughly our entire existence, has been shaped by the ways we communicate and share stories. From cave paintings to the postal system, the telegraph, the telephone, email, and now social media, we have always found ways to aggregate our experiences and broadcast them to those we hold dear. What's more powerful is that these tools also let us share them with those we didn't. Our inherent drive to connect pushed us to invent ever more efficient ways of doing so, and so we began to imagine.
The earliest iteration of what we call the internet today was the Advanced Research Projects Agency Network (ARPANET), a network developed by the United States Department of Defense (DoD) in the late 1960s to protect critical infrastructure, such as telecommunications and nuclear weaponry, from possible Soviet Union attacks during the Cold War. Having a single point of failure meant that the nation's defenses were vulnerable during times of disaster. With this in mind, the design goals for this communication layer had to be:
To get this network off the ground, the agency funded numerous research facilities and universities across the country, allotting each of them computers connected to ARPANET. Over the decades, this network grew larger, encompassing not only institutions in the United States but ones in Europe as well. Although its initial function was to serve military purposes, the bright minds of the past saw the possibilities this network presented and worked to make it accessible to everyone. That visionary lens is what brought us to this point.
In the early 1990s, Tim Berners-Lee, a British scientist working at CERN, introduced the World Wide Web (WWW). He saw the increased popularity of decentralized network usage among research institutions and thought of a way to meet its growing demands. The fundamental element of Berners-Lee's vision was to merge the emerging fields of computers, networking, and hypertext into a robust information system.
He took this vision further and invented HyperText Markup Language (HTML), the markup language used to create the first-ever WWW website. This ushered in the Read-Only era of the internet, because the web was completely static and non-interactive. Most people hosting websites during this era had the technical expertise to do so, simply because the existing infrastructure was not easy to build on. There were no advertisements, content creators, or social media for that matter. In fact, trying to sell things online during this era was frowned upon. People simply read.
By 1993, the WWW accounted for 1% of all traffic on the internet. We had tapped into a collective network that spanned the globe in mere seconds, and with it came new ways of expression, storytelling, communication, and commerce we had yet to imagine. And boy did we imagine.
By the early 2000s, a new era of the internet had emerged: the Social Web, or Web 2.0. One that shifted from the static, read-only landscape to a more dynamic and interactive one. For the first time, user-generated content became commonplace and expanded our 1-way communications over the internet to N-way. We could not only read content but write it as well. Sharing and consuming media became much more convenient and accessible.
Users no longer needed technical expertise to participate online, because developers focused more on users' ability to share their experiences. With the introduction of technologies like CSS, PHP, AJAX, and SQL, they created a myriad of web applications that met our growing demand for media creation and consumption:
Social media companies recognized that to scale their systems and businesses, they would need to design centralized systems to house and process all this information. This offered them the means to:
In exchange for the speed and convenience of social media, we neglected the terms & conditions and willingly gave up control over our digital privacy and information. Every kilobyte (KB) of content we uploaded was logged, analyzed, and sold to third-party corporations for a profit, a profit users rarely got to benefit from.
The era of the Social Web was an enormous step forward in human interaction and commerce; however, the convenience of these interactions came at a cost. The systems were extractive, invasive, closed-source, censorship-heavy, insecure, and not private. Our data, confined within single entities and their data centers, formed patterns about our behaviors, spending habits, fears, insecurities, likes, and so on. I mean, just look at what happened with Facebook and Cambridge Analytica.
We imagined a world where we could communicate across continents in near-instant time. We imagined a world where we could commercialize our talents, ideas, and experiences. And now, we are imagining one where the content we provide does not belong to an Inc., a Co., or an org, but to us.
To talk about the next era of the internet, the Decentralized Web, or Web 3.0, I feel it's important we talk about why it first became a topic of discussion.
Until about a decade ago, it was widely accepted that the strength and superb management of global financial conglomerates made them exceedingly unlikely to fail. More significantly, these institutions were considered too large to fail: if they did collapse, they could count on the government's monetary policies to keep them solvent. During the global financial crisis of 2008, people quickly realized that the money they stored in banks was never theirs. They could neither withdraw, transfer, nor spend their hard-earned currency. It was an alarming wake-up call.
What rose from this mistrust was Bitcoin: a digital currency that promised to deliver us from the controlling clutches of financial institutions.
Bitcoin was not only revolutionary for the conversations it sparked about money, but for popularizing a technology that would fundamentally change how we would exchange information and transact online. That technology was Blockchain.
To me, a blockchain can be explained as a global network of computers not owned by a single entity. This network can store, exchange, and secure information such as media or even digital currency. To ensure that the information exchanged maintains its integrity, each transfer or transaction is verified and cemented by the computers expending energy to keep the network running, i.e., the network validators. As long as >51% of these validators maintain an accurate record of every submitted transaction, those with malicious intent cannot corrupt or shut down the network. This is what makes blockchains decentralized, immutable, and secure.
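To make that idea concrete, here is a minimal sketch in TypeScript of the hash-chaining that gives a blockchain its immutability. It is an illustration only, under the assumption of a simple SHA-256 hash chain; it leaves out the consensus, signatures, and proof-of-work or proof-of-stake that real validator networks perform.

```typescript
import { createHash } from "crypto";

// A minimal, illustrative block: real blockchains add consensus,
// signatures, and proof-of-work/stake on top of this structure.
interface Block {
  index: number;
  data: string;     // e.g., a transaction or a piece of content
  prevHash: string; // hash of the previous block
  hash: string;     // hash of this block's contents
}

function hashBlock(index: number, data: string, prevHash: string): string {
  return createHash("sha256").update(`${index}${data}${prevHash}`).digest("hex");
}

function appendBlock(chain: Block[], data: string): Block[] {
  const prev = chain[chain.length - 1];
  const index = prev ? prev.index + 1 : 0;
  const prevHash = prev ? prev.hash : "0".repeat(64);
  return [...chain, { index, data, prevHash, hash: hashBlock(index, data, prevHash) }];
}

// Any later edit to a block breaks every hash after it, which is
// exactly what honest validators would detect and reject.
function isValid(chain: Block[]): boolean {
  return chain.every((b, i) => {
    const prevHash = i === 0 ? "0".repeat(64) : chain[i - 1].hash;
    return b.prevHash === prevHash && b.hash === hashBlock(b.index, b.data, prevHash);
  });
}

let chain = appendBlock([], "alice pays bob 1 coin");
chain = appendBlock(chain, "bob posts a photo");
console.log(isValid(chain)); // true
chain[0].data = "alice pays mallory 100 coins"; // tampering with history
console.log(isValid(chain)); // false
```

The takeaway is that history is cheap to verify and expensive to rewrite; that asymmetry, enforced by a majority of validators, is what the word "immutable" refers to.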
Since these are essentially global computers, the types of applications that can be built on top of them are nearly limitless. Almost all the applications we currently use to communicate and transact can be re-imagined on blockchains, but with one key feature: ownership.
Today, the Big Data age is compromising consumer privacy across digital contexts. Large third-party companies profit by collecting, analyzing, correlating, and monetizing large volumes of our data. We simply have no control over whom this information is given to, how much of it, or what is shared.
To illustrate how blockchain technology improves on previous models, I will use Lens Protocol as an example.
At its most basic level, Lens Protocol is an open-source tool that allows developers to build social media applications on top of it. It implements a modular social graph, meaning that any data and connections you contribute to one application built on Lens can be ported over to another with ease and at your discretion.
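As a rough illustration of what a portable social graph means in practice, here is a hypothetical data model, not the actual Lens Protocol API, showing two different front-ends rendering the same profiles and follow connections:

```typescript
// Hypothetical, simplified model of a portable social graph. The profile and
// its follow edges live in the protocol, so any client app can render them.
interface Profile {
  id: string;     // on-chain profile id, owned by the user's wallet
  handle: string; // e.g. "alice.lens"
  owner: string;  // wallet address that controls the profile
}

interface Follow {
  follower: string; // profile id doing the following
  followed: string; // profile id being followed
}

// Two different front-ends reading the same underlying graph.
function renderFeed(appName: string, profiles: Profile[], follows: Follow[], viewer: string) {
  const followedIds = follows.filter(f => f.follower === viewer).map(f => f.followed);
  const followed = profiles.filter(p => followedIds.includes(p.id));
  console.log(`[${appName}] ${viewer} follows:`, followed.map(p => p.handle));
}

const profiles: Profile[] = [
  { id: "0x01", handle: "alice.lens", owner: "0xA11CE" },
  { id: "0x02", handle: "bob.lens", owner: "0xB0B" },
];
const follows: Follow[] = [{ follower: "0x01", followed: "0x02" }];

// The same connections show up in either app; nothing has to be re-created.
renderFeed("PhotoApp", profiles, follows, "0x01");
renderFeed("VideoApp", profiles, follows, "0x01");
```

The design choice to keep the graph in the protocol, rather than in any single company's database, is what lets users walk away from an application without losing their audience.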
Today, there are over 50 applications built on this protocol. A few notable ones are:
As discussed earlier, the media companies of today control almost every byte of the content we consume. However, only a small percentage of creators are properly compensated for the value they add to these networks. According to a 2021 survey conducted by Musictech, most of the top 1% of artists on Spotify made less than $50,000 in streaming revenue. If these are the numbers for the heaviest hitters on the platform, then what about the remaining 99%?
Within this new decentralized era, content creators would interface directly with their audience. No middlemen dictating how much, or how little, you earn so that they can meet their quarterly profit targets. Fans can invest in your vision and journey as a content creator, and in turn earn special privileges for their continued support; of course, that is up to the artist. Each sale would go directly to the creator's wallet, and if they set up secondary-market royalties, they could dictate what percentage they receive whenever their art is resold, a feature unavailable in traditional art sales.
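As a back-of-the-envelope sketch of how such a sale might settle, the snippet below handles a primary sale and a secondary resale. The royalty percentage, marketplace fee, and the settleSale helper are assumptions for illustration, not any particular marketplace's contract:

```typescript
// Simplified settlement of NFT sales with creator royalties (illustrative only).
interface Sale {
  price: number;        // sale price, e.g. in ETH
  isSecondary: boolean; // is this a resale on a secondary market?
}

const ROYALTY_PERCENT = 10;  // assumed: set by the creator at mint time
const MARKETPLACE_FEE = 2.5; // assumed: example marketplace fee

function settleSale(sale: Sale): { creator: number; seller: number; marketplace: number } {
  const marketplace = (sale.price * MARKETPLACE_FEE) / 100;
  if (!sale.isSecondary) {
    // Primary sale: proceeds (minus the marketplace fee) go straight to the creator's wallet.
    return { creator: sale.price - marketplace, seller: 0, marketplace };
  }
  // Secondary sale: the reseller keeps the proceeds, the creator receives their royalty.
  const creator = (sale.price * ROYALTY_PERCENT) / 100;
  return { creator, seller: sale.price - creator - marketplace, marketplace };
}

console.log(settleSale({ price: 1.0, isSecondary: false })); // { creator: 0.975, seller: 0, marketplace: 0.025 }
console.log(settleSale({ price: 2.0, isSecondary: true }));  // { creator: 0.2, seller: 1.75, marketplace: 0.05 }
```

The point is not the exact numbers but who sets them: the creator, not an intermediary.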
To see this for yourself, check out a few marketplaces to discover how creators are monetizing their work in this new era:
Mirror's Writing NFTs (Non-Fungible Tokens) allow people who enjoy reading your work to show their support by collecting it digitally for the fee you, the writer, set. If the writer has others contributing to their work, they can automatically set up revenue sharing using 0xSplits.
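Conceptually, that revenue sharing is just a percentage split applied to every collect. Here is a minimal sketch in the spirit of 0xSplits; the recipients, percentages, and distribute helper are illustrative assumptions, not the protocol's actual interface:

```typescript
// Assumed split configuration: each contributor gets a fixed percentage.
const split = [
  { recipient: "writer.eth", percent: 70 },
  { recipient: "editor.eth", percent: 20 },
  { recipient: "illustrator.eth", percent: 10 },
];

// Divide the revenue from one collect among the contributors.
function distribute(collectRevenue: number) {
  return split.map(s => ({
    recipient: s.recipient,
    amount: (collectRevenue * s.percent) / 100,
  }));
}

console.log(distribute(0.5)); // 0.35 to the writer, 0.10 to the editor, 0.05 to the illustrator
```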
The beauty is that it works for art and music as well.
We continuously innovated on the internet so we could share our stories more seamlessly. From just reading to then writing, we exchanged and consumed experiences across continents. However, we never owned them in the way that we should have. With emerging technologies like blockchain, we are beginning to re-imagine our current social models. From data to content, we are beginning to re-imagine our current ownership models. This technology has evolved greatly over the past decade and, in my opinion, it will continue to do so. However, it is up to us to steer it in a direction that encourages openness, fairness, security, and community.
If you enjoyed reading this, you'll enjoy the resources I drew inspiration and information from. Check them out below!