Token Engineering Fundamentals Module 1: Summary & Discussion

Greetings, dear readers. In this series, I will write about what I have learned from the Token Engineering Academy courses, module by module. Each article summarizes what is taught and discusses the topics further to deepen understanding. In this article, I summarize and discuss Module 1: Introduction to Token Engineering. In this module, token engineering fundamentals are described in five articles: “Towards a Practice of Token Engineering” by Trent McConaghy, “Foundations of Cryptoeconomic Systems” by Shermin Voshmgir and Michael Zargham, “The Web3 Sustainability Loop” by Trent McConaghy, “Cryptonetwork Governance as Capital” by Joel Monegro, and “Engineering Ethics in Web3” by Michael Zargham.

Why do we need Tokenized Ecosystems?

We need tokenized ecosystems because they incentivize people to do things without dictating what they must do; being told what to do is not in our nature. Natural participation can only be achieved with incentives as the driving force rather than impositions, so tokens as incentives are a great way to organize people. Blockchains are tokenized ecosystems in which tokens drive network participants to fulfill the network's requirements. With them, we can create the ultimate Web3 experience!

What is Token Engineering?

Token Engineering (TE) can be defined as the engineering discipline for designing, verifying, and analyzing tokenized ecosystems with theory, practice, and tools (McConaghy, “Towards a Practice of Token Engineering”). We can also think of TE as the Swiss army knife of engineering for tokenized systems, combining concepts from computer science, electrical and electronics engineering, AI, distributed systems, finance, law, economics, optimization design, game theory, and more. I made a meme that summarizes Token Engineering; the original image is by Davesrightmind on DeviantArt.

Token Engineering: Swiss Army Knife of Engineering.

Why do we need Token Engineering?

We need it because engineering principles such as ethics, economics, theory, and practice leveraging science let us holistically design and improve a tokenized ecosystem that fulfills our requirements while remaining resilient and sustainable. To meet those requirements, we need to be clear when setting the objective function and constraints. This is an essential but hard-to-achieve task in many optimization practices, such as AI, blockchain, and simulation software. Constraints and objective functions need to be well designed so the system we are aiming for can scale to the levels we want, accurately. In “Can Blockchains Go Rogue?”, Trent McConaghy underlines the importance of constraints with examples like the Paperclip Maximizer: a scenario about the gap between human intent and what the machine understands, and what happens when constraints are chosen incorrectly. If you want to know more, you can access the article here.

📎🤖📎🤖📎🤖📎🤖📎🤖📎🤖📎🤖📎🤖

To comprehend this importance further, I would like to give an example from my background. I am an individual with a Chemical Engineering background, and I work on simulations that evaluate processes and their outcomes with scalable models to save cost, time, and effort. In modeling for simulations (e.g., COMSOL), it is really important to set the correct constraints so that you get accurate, precise results for the system you are working on. If these constraints are not set correctly, you cannot reach your objectives, because the machine cannot understand your intent. The result is errors in the model, a solution that fails to converge, or inaccurate solutions that may lead to catastrophic outcomes. Briefly, we need Token Engineering principles to analyze, design, and optimize scalable, resilient, and sustainable tokenized ecosystems.
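To make the point concrete, here is a tiny, purely illustrative sketch (the objective, the "energy" side effect, and all the numbers are made up): the same optimizer produces very different answers depending on whether our intent is encoded as a constraint.

```python
# A toy optimizer: maximizes an objective over candidate solutions,
# honoring only the constraints it is explicitly given.
def optimize(objective, candidates, constraints=()):
    feasible = [c for c in candidates if all(g(c) for g in constraints)]
    return max(feasible, key=objective)

# Hypothetical system: choose an operating level h (arbitrary units).
revenue = lambda h: 10 * h      # what we told the machine to maximize
energy  = lambda h: h ** 1.5    # a side effect we care about but didn't encode

# Without the energy constraint, the optimizer simply maxes out the level:
print(optimize(revenue, range(101)))                                 # -> 100

# Encoding the intent ("stay under an energy budget") changes the answer:
print(optimize(revenue, range(101), [lambda h: energy(h) <= 500]))   # -> 62
```

The machine only "understands" what the objective and constraints express; everything left out, like the energy term above, is invisible to it.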

Tokenized Ecosystems: Public Blockchains

As mentioned previously, public blockchains are tokenized ecosystems with different capabilities. Blockchains can be described as decentralized, immutable trust machines with many capabilities, such as smart contracts. Blockchains are also incentive machines: they define objective functions in the form of block rewards and communicate what they want from network participants. This communication and incentive mechanism makes blockchains potential life forms. Blockchains have:

  • Autonomous mobility

  • Self-replication

  • Decision making

  • A survival loop: they pay people to keep them alive ↔ they perform a useful service that people will pay them to perform

Properties like these make blockchains artificial life forms that show vital signs on the network! However, these life forms may have disadvantages if we can’t set our constraints correctly and accurately.

When we think about the problems humankind faces, we can say that our biggest one is energy. Fossil energy resources are not distributed uniformly around the globe, so they come with accessibility problems, and they harm the environment through greenhouse gas emissions that drive climate change. Renewable energy technologies like solar, wind, and geothermal are still not mature enough and remain highly dependent on fossil energy sources across their complete life cycles. As the population increases significantly, there is not yet a complete solution for our energy needs.

With respect to this fact given in the paragraph above, let’s think about Bitcoin’s rewards function:

Bitcoin's objective function, maximizing security by maximizing the hash rate requires maximizing electricity usage.

When we check the relationship in the objective function above, we can say that Bitcoin is optimizing itself like a living creature. However, we should also remember that Bitcoin is doing this via energy consumption, which is our number one problem. This looks similar to the Paperclip Maximizer scenario, where an AI optimizes for its objective function non-stop, regardless of consequences outside the objective. Hence, it is important to set constraints holistically, with respect to different perspectives and scenarios. That is why incentives and constraints are crucial when building tokenized ecosystems.
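The hash-rate/electricity relationship above can be sketched as a back-of-the-envelope equilibrium model. All the numbers here are illustrative placeholders, not real network data; the point is only the direction of the feedback.

```python
# Hypothetical equilibrium: competitive miners add hash power (and thus
# electricity) until energy cost roughly equals block-reward revenue.
# All numbers are illustrative, not real network measurements.
def equilibrium_energy(price_usd, reward_btc=6.25, blocks_per_day=144,
                       cost_per_mwh_usd=50.0):
    daily_revenue = price_usd * reward_btc * blocks_per_day
    return daily_revenue / cost_per_mwh_usd   # MWh/day the network can "afford"

# A higher token price directly raises the energy the objective "wants":
print(equilibrium_energy(40_000) / equilibrium_energy(20_000))  # -> 2.0
```

Under these assumptions, doubling the price doubles the equilibrium energy use, which is exactly the unconstrained-objective behavior the Paperclip Maximizer warns about.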

What should be the approach when we are designing a token?

To see Token Engineering in action, we can use different tools and patterns and analyze token and tokenized-ecosystem designs from different perspectives, which will be explained in upcoming sections. Before we dive deeper, however, it is good to look at token design pathways. For example, we can approach token design as AI optimization design: the block rewards we encode in the token design serve the same purpose as the objective in AI optimization, making tokenized ecosystem design similar to the design of evolutionary algorithms. If we simplify tokenized ecosystems:

  • Goals: Block reward functions

  • Measurements and Tests: Proof of Work, Proof of Stake

  • System agents: Miners and holders of the token in a network

  • System Clock: Block reward intervals → works in either:

    • Batches/epochs: a Lamport-style logical clock; blocks and block rewards are generated with respect to this system clock

    • Continuous time

  • Incentives and disincentives: Giving tokens and slashing respectively
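The simplified ecosystem above can be sketched as a toy agent-based model. All names and numbers are hypothetical; the system clock is a loop over epochs, block production is stake-weighted, honest work earns the block reward, and misbehavior is slashed.

```python
# A minimal agent-based sketch: epochs as the system clock, stake-weighted
# selection as the "measurement", rewards and slashing as (dis)incentives.
import random

random.seed(0)
balances = {"alice": 100, "bob": 100, "carol": 100}
BLOCK_REWARD, SLASH = 10, 30

def run_epoch(misbehaving=()):
    agents, stakes = zip(*balances.items())
    producer = random.choices(agents, weights=stakes)[0]  # stake-weighted pick
    if producer in misbehaving:
        balances[producer] = max(0, balances[producer] - SLASH)  # slashing
    else:
        balances[producer] += BLOCK_REWARD                       # block reward

for _ in range(100):               # the "system clock": 100 epochs
    run_epoch(misbehaving={"carol"})

print(balances)  # honest agents grow; carol's stake is driven toward zero
```

Even this crude loop shows the selection pressure the incentives create: misbehaving stake shrinks and so does its influence over future blocks.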

When using the optimization pathway for Token Design, we should follow these steps:

  1. Formulating the problem

  2. Trying an existing solver

  3. Creating a new solver

In this design pattern inspired by AI optimization, the first step is to formulate the problem with all constraints and objectives in mind. This formulation is hard to achieve: the difficulty of the formulation, and of its solution, varies with the complexity of the constraints and objective functions. For example, if the formulation is NP-hard, it will be difficult to solve, so methods and techniques can be used to convert such hard problems into solvable ones, such as convex problems. This step shows the importance of selecting the constraints and objective functions accurately and formulating your intent.

The second step is trying an existing solver. After formulating the problem in the most suitable manner, the next step is using existing solvers to observe whether a solution converges. If it converges, great, you made it! If your solver does not converge, try another existing solver. If the solver converges too easily, your problem may be under-constrained: modify it or make your constraints more detailed and accurate so that they formulate your intent. If existing solvers are not good enough to give satisfying results for your problem, you may need to create a new solver to obtain accurate results.
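The three steps above can be sketched with a deliberately simple 1-D problem (everything here is hypothetical): formulate, try an "existing" solver, and keep a fallback "new" solver for when the first one fails.

```python
# Step 1 -- formulation: minimize f(x) = (x - 3)^2 subject to x >= 0.
f = lambda x: (x - 3) ** 2

# Step 2 -- an "existing solver": plain projected gradient descent.
def gradient_descent(f, x0=0.0, lr=0.1, steps=200, h=1e-6):
    x = x0
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)  # numeric gradient
        x = max(0.0, x - lr * grad)             # project onto x >= 0
    return x

print(round(gradient_descent(f), 3))  # converges near the optimum x = 3

# Step 3 -- if it hadn't converged, a custom solver (here, a crude grid
# search) could serve as the "new solver" of last resort.
def grid_search(f, lo=0.0, hi=10.0, n=10001):
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    return min(xs, key=f)

print(round(grid_search(f), 3))  # -> 3.0
```

Real token-design problems are far messier, of course, but the workflow (formulate, reach for an off-the-shelf solver, only then build your own) carries over directly.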

This design pattern is not specific to AI optimization. As I mentioned before, I have a Chemical Engineering background, and I use computational modeling software to model and solve energy storage problems. This software has existing solvers and modules that solve problems such as electrochemical reactions and mass transfer in a battery. However, if the battery you are working on is a novel technology involving different parameters and detailed physics, existing solvers may not converge due to the specific and complex structure of the problem. In these cases, engineers may need to write their own User Defined Functions (UDFs) to express the physics (similar to constraints) happening inside this novel battery. Also, if the parameters and conditions are not set correctly inside the geometry, your solution cannot converge.

Token design already has some building blocks, and given this space’s pace of innovation, many new ones are being added. The figure below summarizes the building blocks that shape core token mechanics, as described by Trent McConaghy in the “Towards a Practice of Token Engineering” article.

Token Design Patterns, Building blocks that can be implemented to token mechanics.

Some of these concepts are still works in progress, and some are mature enough to be safely included in the token design process. Curation can be used for various kinds of membership. For binary membership, a Token Curated Registry (TCR) can be used to obtain a good list of actors. For discrete-valued membership, a combination of curation and finite-state machines can be used to create a Stake Machine, modeling governance that incentivizes the community to participate in the network. Identity can be handled with Decentralized Identifiers (DIDs), which increase an individual’s control over their information, provide cryptographic guarantees for the validity of attestations, make data portable, and enable anti-Sybil mechanisms. This eliminates dependency on centralized entities or third-party services for use cases ranging from voting to authentication. Reputation systems have similar functionality to identity and curation, and are more like a combination of the two. When a token has governance functionality in its design, it is utilized in decision-making processes and adds another dimension to network participation. However, the majority of the governance functionalities we see today have many problems in their decision-making structure. These problems make governance plutocratic, meaning it is controlled by the wealthy. We can see examples of this system’s disadvantages in DeFi ecosystems, such as the Mango hack. In a decentralized world, fair governance should not run directly on economic power; it should also consider other aspects of the network participants. Concepts such as reputation, proof of humanity, and DIDs can be considered alongside token amounts in governance to make it fairer.
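The curation idea can be sketched as a tiny Token Curated Registry. This is a drastically simplified, hypothetical model (names, deposits, and vote counts are made up): applicants stake a deposit to be listed, challengers trigger a stake-weighted vote, and a losing listing is delisted and slashed.

```python
# A minimal TCR sketch: staked applications plus stake-weighted challenges.
registry = {}
MIN_DEPOSIT = 10

def apply_listing(name, deposit):
    # Listing requires skin in the game: a minimum staked deposit.
    if deposit >= MIN_DEPOSIT:
        registry[name] = {"deposit": deposit, "listed": True}

def challenge(name, votes_for, votes_against):
    # Votes are stake-weighted token amounts; losers are slashed.
    if name in registry and votes_against > votes_for:
        registry[name]["listed"] = False
        registry[name]["deposit"] = 0

apply_listing("good-actor", deposit=50)
apply_listing("spammer", deposit=15)
challenge("spammer", votes_for=20, votes_against=120)

print([n for n, e in registry.items() if e["listed"]])  # -> ['good-actor']
```

A production TCR also needs challenge deposits, vote commit/reveal, and reward distribution, but the incentive skeleton (stake to list, stake to dispute, slash the loser) is the one above.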

Concepts like proof of computing “work” or proof of humanity can be used to evaluate the objective function. Additionally, when designing a tokenized ecosystem, concepts like how tokens are distributed, token standards (e.g., ERC20, ERC721), token valuation, and the organization of the compute stack are also important.

Different tools can be used to simulate, analyze, and design tokenized ecosystems. These can be agent-based models from complexity science and artificial general intelligence, or existing consensus-algorithm design simulators. If a token engineer models the system with differential equations, the solution can be reached via DE solvers such as SPICE.
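As a sketch of the differential-equation view, here is a toy demand model with entirely hypothetical dynamics, integrated by a hand-rolled Euler loop. A dedicated DE solver like SPICE would do this far more robustly; the point is only what "modeling with DEs" means here.

```python
# Toy dynamics: demand D grows with a constant utility inflow and decays
# proportionally to its level, i.e. dD/dt = utility - decay * D.
# The analytic equilibrium is utility / decay.
def simulate(d0=1.0, utility=0.5, decay=0.1, dt=0.01, t_end=50.0):
    d, t = d0, 0.0
    while t < t_end:
        d += (utility - decay * d) * dt   # explicit Euler step
        t += dt
    return d

print(round(simulate(), 2))  # settles near utility/decay = 5.0
```

Swapping in richer right-hand sides (price feedback, emission schedules, adoption curves) is where a real DE solver earns its keep.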

To verify tokenized systems with respect to their performance metrics, indicators like “worst-case performance” and “n-sigma performance” can be used. These are conventional methods to measure and verify the performance of a system; “n-sigma” is used to evaluate the failure rate.
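A quick Monte Carlo sketch shows how those two indicators are computed in practice. The metric and its distribution here are stand-ins (a Gaussian around a made-up spec), not data from any real system.

```python
# Monte Carlo estimate of worst-case and n-sigma performance for a
# hypothetical scalar metric (e.g. protocol revenue per epoch).
import random, statistics

random.seed(42)
samples = [100 + random.gauss(0, 15) for _ in range(10_000)]

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)

worst_case = min(samples)               # worst observed performance
three_sigma_floor = mean - 3 * sigma    # "3-sigma" lower bound
failures = sum(s < 70 for s in samples)  # spec: metric must stay >= 70
failure_rate = failures / len(samples)

print(round(three_sigma_floor, 1), round(failure_rate, 3))
```

The same loop works with the output of an agent-based or DE simulation in place of the Gaussian samples, which is how verification would plug into the tools above.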

For simulating tokenized ecosystems, CAD programs used in circuit simulation can be utilized. However, these tools have design limitations, because circuits are not quite the same case as tokenized ecosystems. These ecosystems are economies with humans in the loop, so the simulation will be different and less accurate: we are not rational creatures, and this makes evaluating such systems complex and detailed. We cannot guarantee that simulations are perfect or flawless, but we can make designs flexible and scalable so they are resilient under any conditions. Governance, staking, and the many other utilities mentioned previously are good examples of making designs flexible and resilient while implementing incentive/disincentive mechanisms. For Token Engineering case studies, I recommend reading the “Token Engineering Case Studies” article by Trent McConaghy.

Token Engineering Case Study for Bitcoin by Trent McConaghy

Recently, tools like Machinations have attracted significant attention due to their detailed structure for analyzing and predicting game economies and systems. Having tools like Machinations is really great, and I believe that taking game economies as a basis is a pretty good starting point for Token Engineering. Game economies such as those of World of Warcraft and RuneScape can help us understand how communities behave under different conditions, and they can also help us evaluate the behavioral and psychological aspects of a system.

Despite recent developments, existing tools to simulate and design tokenized ecosystems still have limitations. These limitations come from the nature of tokenized ecosystems: they are cryptoeconomic networks that are dynamic, responsive, resilient, multi-scale, and adaptive. Therefore, in order to understand and design better cryptoeconomic systems, we need to conduct interdisciplinary research and combine what we have learned.

Cryptoeconomic networks are complex systems, and they require a multi-disciplinary approach. This complex structure needs different methodologies and tools for analysis and design, and Token Engineers should consider these methods and disciplines together in order to create a holistic cryptoeconomic network. In cryptoeconomic networks, operations research & management science, economics & game theory, AI optimization & control theory, computer science & cryptography, philosophy, law & ethics, political science & governance, and psychology & decision science are key concepts to consider in the token design process. These concepts can be summarized as:

  • Operations research & management science: Provides analytical methods for improving decisions and allocating scarce resources, such as scheduling, logistics, and the optimization of a network’s operations.

  • Economics & Game theory: We don’t like to be controlled or dictated to; it is much better and more efficient for us to be incentivized and disincentivized when participating in cryptoeconomic networks, so that the incentives act as a driving force rather than a forced action. Game theory assesses the strategic behavior of the participants, and economics is utilized to evaluate how a network’s tokenomics affects their behavior.

  • AI optimization & control theory: This concept helps us to further improve cryptoeconomic networks with the utilization of AI optimization and control theory techniques. Utilization will also help us to have a better understanding of network dynamics and decisions.

  • Computer science & cryptography: With these concepts, the security and integrity of the cryptoeconomic system can be ensured.

  • Philosophy, law & ethics: A crucial foundation of a decentralized network, because decentralization is also an idea that significantly depends on ethical considerations and legal implications. Automation of the network, the privacy of participants, and governance can only truly be shaped if these areas are considered.

  • Political science & governance: The acts of different network participants under different conditions are analyzed here. Without a solid understanding of these concepts, it is not possible to achieve a successful cryptoeconomic network governance that is dynamic, resilient, and decentralized.

  • Psychology & decision science: Helps us assess and interpret decisions made by network participants. These decisions mostly depend on specific psychological factors and ethos.

All of these concepts are powerful when we are designing a cryptoeconomic network. Just like the Infinity Stones, they have different use cases and strengths on their own; when we combine them, we get the Infinity Gauntlet of the Token Engineer! You can check the great visualization of these concepts in a Venn diagram by Shermin Voshmgir and Michael Zargham in the “Foundations of Cryptoeconomic Systems” article, which also provides a more detailed explanation of these concepts in a cryptoeconomic context.

SNAP of a Token Engineer

Cryptoeconomic systems such as blockchains provide a governance infrastructure for various socioeconomic activities across several resource layers:

  • Physical: Electricity & Hardware

  • Financial: Tokens & Fiat money

  • Social: Attention, governance participation, and evangelism

In cryptoeconomic systems, it is crucial to identify who makes decisions, under which conditions, to whom they are accountable, whether this changes over time, how individuals decide, and how the system can be engineered to coordinate individuals’ decisions into a collective decision-making process.

To analyze and design a cryptoeconomic system, we should look at it from different perspectives: institutional economics, the evolution of cooperation, multi-scale analysis, and network science. Considering these perspectives holistically helps us design and analyze cryptoeconomic systems better, since different perspectives can compensate for each other’s insufficiencies. Within these perspectives:

  • Institutional Economics: Cryptoeconomic networks can be seen as a novel form of governance, one that relies on code and math rather than conventional institutional hierarchy. For example, decentralized autonomous organizations (DAOs) operate based on smart contracts and are governed by token holders.

  • Evolution of cooperation: Cryptoeconomic networks like blockchains allow the formation of decentralized communities and organizations. These systems rely on incentive and disincentive structures to organically coordinate the behavior of network participants, such as miners and token holders, in a decentralized manner. When this perspective is well considered, the network can encourage participants to coordinate with each other in an equilibrium that prevents individuals from acting selfishly or maliciously, and the long-term stability and success of the network can be achieved (e.g., miners validate transactions to preserve network integrity, or DAOs incentivize token holders to vote on proposals that benefit the organization).

  • Multi-scale: Cryptoeconomic systems have multi-scaled structures, so they cannot be analyzed and designed by looking at one scale alone. To observe agent-to-agent interactions, we need the micro-scale perspective; to observe how agents interact with the network as a whole, we need the larger scales. Since cryptoeconomic systems contain both, micro-, meso-, and macro-scale evaluations should be conducted together.

  • Network science perspective: Cryptoeconomic systems are complex systems in which different entities interact to form a collective network, so they can be considered multigraphs with different types of edges and vertices. Edges are pairs of vertices and represent their interactions; they help us evaluate relations such as entity to entity, entity to account, entity to node, account to account, and node to node. Vertices define the types of components in the network. These are summarized well in the tables below:

Cryptoeconomic networks in network science perspective, vertex types explained. Table by Shermin Voshmgir and Michael Zargham
Edge types with respect to vertex interactions. Table by Shermin Voshmgir and Michael Zargham

In cryptoeconomic networks, nodes represent the computation and communication network, addresses represent the financial network, and entities represent the off-chain socioeconomic network. Therefore, when designing and analyzing cryptoeconomic networks, it is really important to include a network science perspective to evaluate component interactions.
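The multigraph view can be sketched directly in code. The vertices and edges below are invented examples, but they follow the typed-vertex/typed-edge scheme of the tables above, so the different relation classes can be queried separately.

```python
# Typed vertices (entity / account / node) and typed edges, per the
# multigraph view of a cryptoeconomic network. Example data is made up.
vertices = {
    "alice": "entity", "bob": "entity",
    "addr1": "account", "addr2": "account",
    "node7": "node",
}
# Edges as (source, target, edge_type) triples.
edges = [
    ("alice", "bob",   "social"),     # entity-to-entity (off-chain)
    ("alice", "addr1", "controls"),   # entity-to-account
    ("bob",   "addr2", "controls"),
    ("addr1", "addr2", "transfer"),   # account-to-account (financial)
    ("alice", "node7", "operates"),   # entity-to-node (computation)
]

def edges_between(t1, t2):
    # All edges connecting a vertex of type t1 to a vertex of type t2.
    return [(s, d, k) for s, d, k in edges
            if vertices[s] == t1 and vertices[d] == t2]

print(edges_between("account", "account"))  # -> [('addr1', 'addr2', 'transfer')]
```

Separating the layers this way is what lets one analysis look only at financial flows while another looks only at off-chain social structure.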

Tokens help us achieve durable and provable digital value in cryptoeconomic systems. They can represent states in a system, such as ownership of an asset, the right to participate in an activity, or the completion of a task, and they can prevent problems such as double spending of digital value over public networks. With tokens as state in a cryptoeconomic network, we can represent various things on top of network decisions:

  • Tokens can be used to represent ownership of various types of assets, such as real estate, artwork, and commodities. This allows for the creation of a digital market for these assets, where they can be bought, sold, and traded in a decentralized and transparent manner.

  • Tokens can be used to access or use certain platforms or services.

  • Tokens can be used to represent voting rights in decentralized autonomous organizations (DAOs). This allows token holders to vote on proposals and make decisions about the organization's direction.

  • Tokens can be used to represent ownership of a financial product. This allows for the creation of a digital market for these financial products, where they can be traded in a decentralized and transparent manner.

  • Tokens can be used to represent reputation points for identities in the network. This allows for the creation of a decentralized identity system where users have control over their personal information.

As you can see, tokens unlock significant value and functionality for us. However, if these systems are not engineered well, with different perspectives and principles in mind, the result will be catastrophic. For example, identity and reputation systems may end up as something authoritarian, similar to China’s social credit system. Therefore, when designing cryptoeconomic systems, Token Engineering that combines the perspectives and principles mentioned above must be applied, and we need to build on what we have learned from experience. That way, we can achieve sustainable, comprehensive ecosystems.

Designing Sustainability Loops in Web3: Token Engineering in Action

To see Token Engineering in action, let’s look at Trent McConaghy’s “The Web3 Sustainability Loop” article, which demonstrates a systems design approach for Web3 projects. Along the way, it describes different models of growth and sustainability loops, such as company business models, nation-level models, and Web3 approaches. Make sure to check the full article to see how sustainability loops evolved across Web1, Web2, and company- and nation-level business models, and how that evolution informed the construction of Web3 sustainability loops.

So let’s look at a Web3 project, a tokenized ecosystem without any loops. What is missing? Can the project fund itself in the long term? Will it sustain itself? What are the incentives for the token, and how will it increase in value as its activity and utility increase? Here are the steps of this system design.

  1. $TOKEN is generated and distributed at the beginning

  2. Initial funding via a $TOKEN sale distributing part of the allocation (ICO, IDO, private and seed rounds).

  3. After funding: enhancing and expanding the team, and building the product

  4. Project release. Trying to add and improve token mechanics to increase $TOKEN value as the usage increases.

Web3 Projects without loops

So what is missing? How will the team fund itself in the long term? This structure is not a sustainable or healthy token design: it has no feedback loops to sustain itself or to help the team fund itself. The team will therefore end up selling more $TOKEN for fiat to sustain its operations, which increases supply and sell pressure on the $TOKEN and thus decreases its value. As Token Engineers, we need to do better.

In the figure below, we see an improved version of the first design, with additions like revenue generation and a loop structure. However, $TOKEN generation and distribution still occur entirely at the beginning. Even though the network revenue addition is good for sustainability, and revenue is sent back in a curated way to workers in the ecosystem, it is not enough for the long term: the network is highly dependent on that revenue, and this structure cannot sustain the team over the long run. How can we fix this?

Web3 growth model with the addition of network revenue that is sent back for curation.

First of all, to obtain a decent runway for a tokenized ecosystem, we can do one simple thing in the tokenomics: change the duration of token distribution. Instead of distributing all tokens at the beginning, distributing them over a longer time interval gives teams more space to operate and more runway. Teams can then work much longer to increase the value of the $TOKEN and its ecosystem without extreme inflation. The figure below summarizes this proposed system design.

In this loop, projects are introduced and curated by the community, and funding is conducted via network revenue and rewards. This creates a positive feedback loop: more work is done to earn more network revenue, and this increases the $TOKEN’s value since it increases its usage too. This design shows the importance of incentives and how, where, and why to distribute them. With incentive structures like this, we make our tokenized ecosystems:

  • Resilient and sustainable → automated funding with a positive feedback loop built in

  • Providing conditions for perfect competition → more innovation
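The longer-distribution idea can be sketched numerically. The sketch assumes simple linear vesting and a made-up total supply; real schedules are usually more elaborate (cliffs, decaying emissions), but the runway contrast is the same.

```python
# Circulating supply under linear vesting (hypothetical numbers).
TOTAL_SUPPLY = 1_000_000

def circulating(year, vesting_years):
    # Linear release: the full supply unlocks evenly over vesting_years.
    return TOTAL_SUPPLY * min(year, vesting_years) // vesting_years

# All-at-once: the team's only runway is selling from a fixed pool.
print(circulating(1, vesting_years=1))    # -> 1000000

# Gradual release leaves most supply, and therefore runway, for later years.
print(circulating(1, vesting_years=10))   # -> 100000
print(circulating(5, vesting_years=10))   # -> 500000
```

With the ten-year schedule, only a tenth of the supply circulates after year one, so sell pressure per year is far lower and the team can keep funding work from future unlocks.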

To see whether this system is working in the desired way, we have to observe projects adding sufficient value to the ecosystem, and this value addition must affect the token value/price positively. “Sufficient value” means that when a project adds value, the added value must be higher than the funding it received, i.e. its Return on Investment (ROI). The average ROI should be > 1.0 for successful growth of the ecosystem; then the net value is positive and the positive feedback loop is created successfully. There are dApp examples with similar, successful system designs on different blockchains, and I really want to conduct case studies on them in upcoming articles. L1s like Fantom and Canto introduced similar system designs at the network level to incentivize builders to build on top of them.
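The ROI > 1.0 condition can be illustrated with a toy funding loop (the treasury size, funding rate, and ROI values are all invented): each epoch the ecosystem funds projects, and projects return value at some average ROI.

```python
# Toy funding loop: each epoch a share of the treasury funds projects,
# which return funding * roi to the ecosystem. Numbers are illustrative.
def run_loop(treasury, roi, epochs=10, funding_rate=0.2):
    for _ in range(epochs):
        funding = treasury * funding_rate
        treasury += funding * roi - funding   # net value added per epoch
    return treasury

print(round(run_loop(100.0, roi=1.5)))  # -> 259  (ROI > 1: value compounds)
print(round(run_loop(100.0, roi=0.5)))  # -> 35   (ROI < 1: the loop bleeds)
```

With ROI above 1.0 the treasury compounds by a factor of (1 + funding_rate × (roi − 1)) per epoch; below 1.0, the same loop drains value instead, which is exactly the failure mode the design has to avoid.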

Fantom proposed the dApp Gas Monetization program to create this positive feedback loop for its network. The system takes an approach similar to Web2 social media monetization, where optimization focuses on ad revenue and the success of content creators. This inspired the Fantom Foundation to optimize for increased demand for the block space of dApps built on Fantom. The aim is to attract more developers to consider building on Fantom, give dApps a way to obtain extra network rewards, and positively affect the Fantom network overall, since increased activity and incentive-driven competition increase usage of the FTM token. For this purpose, the monetization system redirects 10% of network transaction fees to dApps and distributes it according to each dApp’s block-space demand. You can check further details here.
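The mechanics can be sketched as a simple proportional split. The 10% share comes from the program description above; the dApp names, fee total, and demand figures are made up for illustration.

```python
# Fantom-style gas monetization sketch: a fixed share of network fees is
# redirected to dApps in proportion to the block-space demand each drives.
DAPP_SHARE = 0.10  # the program's 10% of network transaction fees

def distribute(total_fees, demand_by_dapp):
    pool = total_fees * DAPP_SHARE
    total_demand = sum(demand_by_dapp.values())
    return {dapp: pool * d / total_demand
            for dapp, d in demand_by_dapp.items()}

payouts = distribute(1000.0, {"dex": 600, "game": 300, "bridge": 100})
print(payouts)  # -> {'dex': 60.0, 'game': 30.0, 'bridge': 10.0}
```

Because payouts scale with demand, a dApp's best strategy for earning rewards is to generate more genuine usage, which is the feedback loop the program is after.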

Canto introduced a revenue-share model called Contract Secured Revenue (CSR). The main objective of this system design is to incentivize public goods creation with DeFi. The development team Plex, which contributes to Canto’s core infrastructure, states that the conventional value accrual of protocols does not really follow the spirit of public goods: protocols usually deploy tokens for governance and/or use them for protocol fee payments. Canto’s vision is the opposite, since contributors believe governance tokens in their current state are highly speculative and protocol fees make access harder due to expense. Canto contributors introduced CSR to prevent these unsustainable scenarios and to incentivize builders. CSR enables a portion of the gas fees spent on a specific smart contract to be collected by a corresponding CSR NFT. The CSR base fee share is initially 20%, but this can be changed through Canto governance decisions. Contributors believe this will unlock a new era for public goods and maximize their benefits. CSR also aims to achieve sustainable growth per protocol without the community grants we have seen before in other ecosystems, which were not that effective long term.
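A CSR-style accrual can be sketched in a few lines. The 20% base share is from the CSR description above; the contract address, NFT id, and fee amounts are hypothetical, and the real design lives in smart contracts rather than a dictionary.

```python
# Canto-style Contract Secured Revenue sketch: a cut of the gas spent on a
# registered contract accrues to that contract's CSR NFT.
CSR_BASE_SHARE = 0.20   # adjustable via governance in the real design

csr_nft_balances = {}                     # nft_id -> accrued fees
registered = {"0xPublicGood": "nft-1"}    # contract address -> CSR NFT

def process_tx(contract, gas_fee):
    nft = registered.get(contract)
    if nft is not None:
        accrued = gas_fee * CSR_BASE_SHARE
        csr_nft_balances[nft] = csr_nft_balances.get(nft, 0.0) + accrued
        return gas_fee - accrued   # remainder follows the normal fee path
    return gas_fee                 # unregistered contracts: no CSR cut

process_tx("0xPublicGood", 50.0)
process_tx("0xPublicGood", 150.0)
print(round(csr_nft_balances["nft-1"], 2))  # -> 40.0
```

Making the revenue claim an NFT is the interesting design choice: the right to future gas revenue becomes a transferable asset, so a public-goods team can hold it, sell it, or even use it as collateral.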

Canto Contract Secured Revenue model explained by Plex, core infra contributors.

These revenue-share models that Fantom and Canto introduced to their networks are interesting. After some comprehensive investigation, Token Engineers may develop something even better and take a big step forward in tokenized ecosystem design. These models closely parallel the Web3 Sustainability Loop and, in my opinion, are great ways of creating positive feedback loops. I would like to make a detailed case study comparing the FTM and Canto revenue-sharing models in the next articles.

Governance in Cryptoeconomic Systems

Capital is the power to organize the economic resources of a social system, and its value is proportional to the value of the resources it governs. This insight reveals the inherent value of crypto network governance as capital and helps us understand tokens with governance rights as new kinds of capital assets. Intangible forms of capital also exhibit this quality, such as political capital, which governs the rules of markets, and social capital, which drives human attention. Crypto networks are a new form of social organization, and it is useful to think about these ideas through the crypto economic circle.

The two pillars of trust of a cryptonetwork are its cryptoeconomic and governance models. The cryptoeconomic model defines the rules of the system (what the unit of work is, how users pay, how miners are compensated, the token supply model, etc.), while the governance model defines who has the power to change those rules and under which conditions. Assets under this power include the token itself, productive resources, and flows of value. Proof-of-stake systems are good examples of this idea: miners are required to lock a certain amount of tokens to be granted the right to work for the network. Tokens that can be staked are a form of capital in that they represent the power to organize some of the economic resources of the network, such as production capacity and distribution of income. As the value of these resources grows, so does the value of the capital that governs them.
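The "governance tokens as capital" idea can be sketched as a stake-weighted vote, where the amount of locked tokens determines each participant's power over the rules. This is an illustrative toy model, not any specific protocol's governance module; all names and numbers are made up.

```python
# Toy stake-weighted governance: voting power is proportional to staked tokens,
# so staked tokens act as capital governing the network's rules.

def tally(votes, stakes):
    """votes: voter -> 'yes'/'no'; stakes: voter -> staked token amount.
    Returns True if a simple stake-weighted majority approves."""
    weight = {"yes": 0.0, "no": 0.0}
    for voter, choice in votes.items():
        weight[choice] += stakes.get(voter, 0.0)
    total = weight["yes"] + weight["no"]
    return weight["yes"] > total / 2

stakes = {"alice": 400.0, "bob": 250.0, "carol": 350.0}
votes = {"alice": "yes", "bob": "no", "carol": "yes"}
print(tally(votes, stakes))  # True: 750 of 1000 staked tokens voted yes
```

Note how bob's 250 tokens grant him proportionally less say than alice's 400: as the staked resources grow in value, so does the power (and value) of the capital that governs them.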

Ethics is Engineering, Engineering is Ethics

With Web3, we can finally use technological developments to create outcomes that prioritize humanity's needs. This dynamic community introduced a new concept, Token Engineering, to implement our intentions better and create sustainable tokenized ecosystems that generate mutual value. Since Token Engineering is a multi-disciplinary engineering practice based on conventional engineering principles, it requires similar approaches to design, build, and maintain solutions.

The Web3 ecosystem is continuously evolving and developing. It is an emerging technology, and it requires further understanding of how to apply engineering principles to it. It is safe to say that this ecosystem is tightly connected with social and economic constructs, so it requires deeper understanding in these areas as well. However, while exploring what to implement within Web3 and how, it is crucial to have common values as the foundation.

Common values require ethics, an indispensable part of engineering. Without ethical grounds, it is not possible to build a sustainable and widely accepted ecosystem. Therefore, Token Engineers must build on solid foundations where participants and contributors of the ecosystem reach consensus and work in harmony. In Web3, two foundational topics are important to discuss: privacy and transparency. Even though these two topics are in conflict, discussing their trade-offs when designing ecosystems can develop those ecosystems significantly further. Another important ethical topic in Web3 is agency: individuals are free to act as they choose and hold self-custody, provided they can manage their private keys. Without ethical grounds, this freedom may lead to another problem, since one person's actions may harm others. That is why tokenized ecosystems have built-in incentive/disincentive mechanisms and specify unacceptable actions to prevent undesired conditions. However, these mechanisms are sometimes insufficient and require further discussion.
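The disincentive mechanisms mentioned above can be sketched as a minimal slashing rule: a participant who commits a protocol-defined unacceptable action forfeits part of their stake. The 5% penalty and all names here are hypothetical, chosen purely for illustration.

```python
# Hypothetical slashing sketch: part of an offender's staked tokens is
# forfeited when they commit a protocol-defined unacceptable action.

SLASH_FRACTION = 0.05  # illustrative 5% penalty per offense

def slash(stakes, offender):
    """Deduct the penalty from the offender's stake and return it.
    The forfeited amount could be burned or redistributed to harmed parties."""
    penalty = stakes[offender] * SLASH_FRACTION
    stakes[offender] -= penalty
    return penalty

stakes = {"validator1": 1000.0}
burned = slash(stakes, "validator1")
# validator1 now holds roughly 950 tokens; the rest was forfeited
```

Because the penalty scales with stake, participants with the most to lose have the strongest incentive to behave, which is exactly the economic alignment these mechanisms aim for, even if, as noted, they are not always sufficient on their own.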

The Ethereum hard fork following the DAO attack is an example of the Ethereum community fragmenting over value differences. The Ethereum Classic community kept the notion that "code is law": acts committed in bad faith (grabbing someone else's money by exploiting a line of code rather than taking action to safeguard the system) were to be respected, since the code itself was the determining factor. The Ethereum community as a whole took drastic measures to stop the fraudulent conduct and triggered an irregular state transition, thereby eliminating the hackers' assets and redistributing them to the affected parties. Neither side was absolutely correct, but the incident was a highly public demonstration of leaders' values and judgment.

The DAO Hack, step by step explanation by Samuel Falcon in "The Story of the DAO — Its History and Consequences" article

Briefly, while designing tokenized ecosystems, it is crucial to think about common values to build a community in consensus. In addition, Token Engineers in Web3 must be highly flexible, adapting to changes in the Web3 landscape and helping build technological solutions that fulfill the needs of ecosystem participants.

Conclusion

In the dynamic world of blockchain and decentralized systems, the concept of tokenized ecosystems and Token Engineering emerges as a powerful force. These ideas reshape our understanding of value, incentives, and the harmony between technology and human behavior.

At the center of tokenized ecosystems lies the art of crafting incentives. The belief is that people respond more fervently to positive incentives than to rigid mandates. In this framework, tokens become the driving force, propelling individuals within networks to achieve specific goals. A symbiotic relationship develops: as tokens gain value, the network flourishes, and as the network prospers, tokens accrue value. It's a beautiful synergy that aligns individual aspirations with collective progress.

Token Engineering amplifies this synergy, encompassing a multifaceted approach to design, verify, and fine-tune tokenized ecosystems. Drawing inspiration from a variety of fields, from computer science to economics, Token Engineering provides a versatile toolkit. It's akin to the Infinity Gauntlet, where each concept is an "infinity stone" with its unique strengths, but combining them creates an all-powerful whole.

However, the journey isn't without pitfalls. The challenge lies in constructing sustainable and resilient systems. The initial token distribution sets the tone, where elongated intervals can grant projects the necessary runway for value creation. An ideal ecosystem intertwines incentives and distribution, creating a harmonious loop where innovation and competition thrive.

Interesting examples, like Fantom's dApp gas monetization and Canto's Contract Secured Revenue model, demonstrate the potential of positive feedback loops. These models reward builders, stimulate public goods, and encourage sustainable growth, a testament to the power of well-crafted incentives. With comprehensive investigation, we can build further and create better tokenized ecosystems by learning from what is right, wrong, or missing.

Yet, these systems are not solely driven by algorithms and code. Governance is the other pillar of trust in cryptoeconomic networks. Tokens endowed with governance rights become forms of capital, steering resources and shaping the network's rules. As these networks mature, the ethical dimensions of engineering become paramount. Balancing transparency and privacy, navigating individual agency, and cultivating shared values emerge as foundational considerations.

Token Engineers, in this ever-evolving landscape, are akin to architects of value and ethics. They wield the tools of technology and principles of economics to craft ecosystems that resonate with human needs and aspirations. As the Web3 era unfolds, these architects must keep pace with change, guided by a vision of a decentralized world that thrives not just through innovation, but through ethical foundations that resonate with our shared humanity.
