a16z Crypto Entrepreneurship Course: After "Token Design" Comes "Protocol Design"

By Eddy Lazzarin

Compilation: Sissi

Introduction:

**a16z has established an important position in the crypto field, guiding the industry's development with in-depth articles that give us the perspective we need to upgrade and transform our thinking. Recently, a16z has been focusing on issues beyond token economics. It started with a talk on "Token Design", followed by "Tokenology: Beyond Token Economics", and now the much-anticipated "Protocol Design" course. The course's lecturer, Eddy Lazzarin, CTO of a16z crypto, repeatedly emphasized that the key to moving beyond token economics lies in protocol design, and that token design is only an auxiliary means. In this course on protocol design, he spoke for more than an hour, bringing valuable insights and inspiration to entrepreneurs and helping them understand in depth the key role protocol design plays in a project's success. This article is an abridged translation; for more, see the link to the full translation.**

Inherent laws of protocol evolution

Internet Protocol: The Bond of Interaction

The Internet is a network of protocols of many different kinds. Some protocols are concise, such as the state diagram of HTTP, while others are quite complex, such as the interaction diagram of the Maker protocol. The figure below shows a variety of protocols, including Internet protocols, physical protocols, and political protocols. On the left of the image, we see an interaction diagram of a street intersection, which feels familiar and interesting.

What these protocols have in common is that they are all formalized systems of interaction that facilitate complex group behavior, which is the core of what a protocol is. The power of Internet protocols lies in their ability to connect not only humans to humans, but software to software. We know that software is highly adaptable and efficient and can embed sophisticated mechanisms. As such, Internet protocols are arguably among the most important types of protocols we have, if not the most important.


Protocol evolution: Web1 - Web2 - Web3

In the chart below, the horizontal axis represents the protocol's degree of decentralization or centralization, that is, the degree of control over the protocol. The vertical axis represents the protocol's economic model, specifically whether that model is explicit or unspecified. The distinction may seem subtle, but it has important implications.


Web1: Decentralized & No Clear Economic Model

Protocols of the Web1 period (such as NNTP, IRC, SMTP, and RSS) were neutral in terms of value flow, ownership, access rights, and payment mechanisms, with no clear economic model. Among them, NNTP is the protocol behind Usenet, a Reddit-like system for exchanging posts and files. IRC was an early and widely used chat protocol, while SMTP and RSS were used for email and content subscriptions respectively.


Usenet is organized as a taxonomy, allowing users to post content into newsgroups for specific categories. It was an important part of early Internet culture and existed outside of HTTP. Using Usenet requires a dedicated client and an Internet Service Provider (ISP) that supports it. Usenet is distributed across a large, ever-changing set of news servers that anyone can run, and posts are automatically forwarded to other servers, forming a decentralized system. While users rarely paid for Usenet access directly, in the late 2000s some began paying for commercial Usenet servers. Overall, Usenet lacks a clear protocol-level economic model, and users have to arrange access through their own out-of-band transactions.

These Web1 protocols are architecturally similar and derive from the same values. Even with little knowledge of protocols, we can still understand how they work, which shows the importance of the **legibility and clear templates of Web1 protocols.** However, over time these protocols gradually failed or changed. The failures can be attributed to two causes: first, a lack of specific features, leaving them unable to compete with Web2 rivals; second, difficulty obtaining funding. Ultimately, a protocol's success depends on whether it can remain decentralized while developing a sustainable economic model that funds specific features. In summary, Web1 protocols can be classified as decentralized but lacking a clear economic model.


Web2: Centralization & Clear Economic Model

Web2 brought an interesting trend: Reddit replaced forums such as Usenet, and centralized messaging systems such as WhatsApp and iMessage replaced chat protocols such as IRC. Email still exists, but it is challenged by the spam problem, and RSS did not compete well with Twitter. **Web2 addressed the limitations of Web1 protocols by providing specific functionality.** Email and other decentralized protocols cannot verify message legitimacy, sender identity, authority, or economic relationships, so dealing with spam becomes a problem. Where immature decentralized systems lack these features, centralized competitors can outcompete their predecessors by offering them as unique features.


**Web2 protocols are fully under their owner's control, limited only by business policy and law.** To drive the Web1 protocols forward, a more explicit economic model is needed. However, without the tools of decentralized consensus, verifiable computation, and cryptography, it is impossible to achieve a clear economic model while maintaining decentralization. **Protocols typically drift from the lower-left corner of this design space toward the upper-right.** Sometimes protocols become de facto centralized, as email did: with more than 50% of email handled by centralized email service providers, email has become highly centralized. Email is under pressure from spam, the lack of an economic model, DNS registration costs, and high switching costs.


In the absence of a viable economic model, email can only be sustained as a side project of the big tech companies. Methods of reducing spam rely on economies of scale and data aggregation; it is easier for companies hosting millions of email accounts to detect anomalies. Switching costs are another important factor. We now need to recognize two key centralizing forces that affect different components of a protocol and **that are constantly at play at every turn in the protocol design process: network effects and switching costs.**


**Network effects are the phenomenon of power accumulating as a system scales and becomes widely used. Switching costs are the economic, cognitive, or time costs a user must incur to leave the current system for another.** In the email example, switching costs are critical for Gmail users. If you use Gmail but do not have your own domain, switching costs are high. If you own your own domain, however, you are free to switch mail providers and keep receiving mail through any of them. A company can increase switching costs through protocol design, forcing or encouraging users to use specific components and thereby reducing the likelihood that users switch to other providers.

Take Reddit, a system that allows moderators to unilaterally control subforums, blurring the line between decentralization and centralization. While letting anyone become a moderator might be seen as a form of decentralization, such systems are still fully centralized if ultimate power remains in the hands of administrators (such as the Reddit team). A high-quality user experience has nothing to do with centralized power, but providing one often requires financial support. **In the Web1 era, decentralized protocols often could not provide a good user experience because they lacked funding.** Funding plays an important role in delivering a high-quality user experience.

Web3: Decentralized & Clear Economic Model

On **Web2 platforms like Twitter, Facebook, Instagram, or TikTok, user choice is limited and subject to the platform's interface decisions.** How, then, do the decentralized components introduced by Web3 change the protocol? Using cryptography and blockchain technology can reduce reliance on trust while making the economics explicit and supporting decentralization. **Web3 offers openness, interoperability, and open source, with a clear economic model and the ability to integrate funding into the protocol itself, enabling sustainable development without monopolizing all the value.**


**As a developer, building on a decentralized system with a clear economic model is the best choice. It ensures that the system will continue to exist and that you understand the economic relationships around it, without those relationships having to develop outside the protocol.** Stability and value capture need to be considered separately. Choosing to build on a decentralized system matters because it avoids potential risks and lets you build a project that is durable and has the potential to become the largest possible system.

Building on the Internet is no longer regarded as crazy, because the Internet itself is a fully decentralized system. Likewise, using open-source programming languages and relying on web browsers has become a solid foundation for building ambitious projects. Building on a centralized system can be limiting and constrain a project's scale and scope. Web3 attracts great developers who can build bigger, more ambitious projects. New systems and platforms may emerge that comply with regulation, hold a competitive advantage, and compete fiercely with existing Web2 platforms.

The biggest problem with Web2 networks is their fragility and over-optimized business models. These networks optimize for specific metrics while ignoring anything unrelated to those goals, resulting in a lack of innovation and of new products. Their network effects, while strong, are not enough to form a monopoly, and they are vulnerable to competitors who target their weaknesses.

In contrast, **Web3 provides a more resilient and innovative space through decentralization and a clear economic model.** Like a rich and diverse rainforest ecosystem, Web3 systems establish infrastructure and protocols on which all kinds of interesting things can develop, providing more fertile soil for innovation. By leveraging cryptocurrencies and token economic models, participants can be confident that their creativity and risk-taking will be rewarded, furthering the development of the system.

Therefore, **Web3 offers better ecosystem sustainability and innovation potential, rather than relying solely on the accumulation of economic resources.** Its clear economic model and decentralization allow Web3 to achieve genuine innovation and development, away from the trap of over-optimizing in a single area and of centralized accumulation. By introducing cryptography and token economic models, Web3 gives participants greater creative space and reward mechanisms, pushing the system to develop in a more valuable and lasting direction.

Web3 protocol design case

Case background and design goals

Let's start with an interesting example. "Stable Horde" is a free image-generation system and a Web2 protocol. It provides a collaboration layer that lets users ask others who are willing to help to generate images. The client submits a task to the work queue, a worker performs the inference and sends the result to the result store, and the client retrieves the result from there and pays the worker in Kudos points. In Stable Horde, Kudos is a free points system used to prioritize tasks. However, because computing resources are donated and therefore limited, the longer the queue, the longer it takes to generate an image.
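To make that flow concrete, here is a minimal sketch of the client → work queue → worker → result store → Kudos loop. The type and function names are illustrative assumptions, not Stable Horde's actual API.

```typescript
// Minimal sketch of the Stable Horde task flow described above.
// All names here are illustrative, not the project's real API.

interface GenerationTask {
  id: string;
  prompt: string;        // text prompt the client wants turned into an image
  kudosOffered: number;  // Kudos the client will credit on completion
}

interface GenerationResult {
  taskId: string;
  imageUrl: string;      // where the worker uploaded the generated image
  workerId: string;
}

const workQueue: GenerationTask[] = [];
const resultStore = new Map<string, GenerationResult>();
const kudosBalances = new Map<string, number>();

// Client side: submit a task to the shared work queue.
function submitTask(task: GenerationTask): void {
  workQueue.push(task);
}

// Worker side: take the oldest task, run inference, publish the result.
function processNextTask(
  workerId: string,
  runInference: (prompt: string) => string,
): void {
  const task = workQueue.shift();
  if (!task) return; // nothing queued
  const imageUrl = runInference(task.prompt);
  resultStore.set(task.id, { taskId: task.id, imageUrl, workerId });
}

// Client side: fetch the result and credit the worker with Kudos.
function retrieveAndPay(task: GenerationTask): GenerationResult | undefined {
  const result = resultStore.get(task.id);
  if (result) {
    const balance = kudosBalances.get(result.workerId) ?? 0;
    kudosBalances.set(result.workerId, balance + task.kudosOffered);
  }
  return result;
}
```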


We face an interesting problem: how do we scale this system to make it bigger and more specialized, while keeping it open and interoperable and without risking a centralization that would destroy the original spirit of the project? **One proposal is to convert Kudos points into ERC20 tokens and record them on a blockchain. However, simply adding a blockchain can cause a series of problems, such as false-result attacks.**

Let's rethink the protocol design process. **You should always start with a clear goal, then consider the constraints, and finally define the mechanisms.** Designing a system requires measurable goals and the identification of effective mechanisms. Constraints come in endogenous and exogenous forms, and by restricting the design space they make the mechanisms easier to pin down. Mechanisms are the substance of the protocol: clearing, pricing, staking, incentives, payments, verification, and so on. The design should fit within the constraints and meet well-defined goals.

Web3 Protocol Example: Unstable Confusion

Let's move on to a brand-new Web3 protocol called "Unstable Confusion". Below we outline some interesting directions for converting the existing Web2 protocol "Stable Horde" into the Web3 protocol "Unstable Confusion".

As mentioned earlier, there is the problem of workers sending false results, so there needs to be a mechanism to ensure that users get what they asked for; this is called "verified inference". In simple terms, we need to verify the inference to ensure its result is what was expected. Another problem concerns the workers in Stable Horde. Workers request the next task from the database, and tasks are assigned to the workers who asked earliest. But in a system where money is involved, workers may claim large numbers of tasks in order to be paid more without actually intending to complete them. They may compete for low latency, grab tasks, and congest the system.

To address these problems, several design requirements are proposed. The first is "pay proportional to contribution": workers are paid according to their contribution and compete for tasks in a way that benefits the network. The second is "flexible participation": workers can join or leave the system freely at low cost, attracting more participants. The last is "low latency": how responsive and fast the application feels, which is critical to the user experience. **Returning to our goal: to build a decentralized, interoperable marketplace for image generation. We still have some key constraints, and details could be added, modified, or made more specific later. Now we can evaluate the feasibility of different mechanisms.**
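To keep this framing in view while evaluating mechanisms, the goal, constraints, and candidate mechanisms discussed here can be written down as a simple checklist. The structure and field names below are only an illustrative convention, not anything prescribed in the talk.

```typescript
// Illustrative checklist of the "goal → constraints → mechanisms" framing,
// filled in with the Unstable Confusion example. Not a formal specification.

interface ProtocolDesign {
  goal: string;
  constraints: string[];          // endogenous and exogenous limits on the design space
  candidateMechanisms: string[];  // the substance of the protocol, evaluated below
}

const unstableConfusion: ProtocolDesign = {
  goal: "A decentralized, interoperable marketplace for image generation",
  constraints: [
    "Pay proportional to contribution",
    "Flexible participation (low cost to join or leave)",
    "Low latency for end users",
  ],
  candidateMechanisms: [
    "Verified inference (dispute games, sampled audits, zero-knowledge proofs)",
    "Pricing (on-chain or off-chain order book, gas-like compute metric)",
    "Storage of results",
    "Task allocation",
  ],
};

console.log(unstableConfusion.goal);
```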

Potential Mechanism Design


1. Verification mechanism

We can use approaches such as game theory and cryptography to ensure the accuracy of inference. Game-theoretic mechanisms can be used in dispute-resolution systems, where users escalate disputes to be arbitrated by specific roles. Continuous or sampled auditing is another approach: review workers' output, ensure tasks are assigned across different workers, and record which workers pass audits. Zero-knowledge proofs from cryptography can generate efficient proofs that verify the correctness of inference. Traditional approaches include trusted third parties and user reviews, but these carry centralization risks and network effects.

Another possible verification mechanism is to have multiple workers complete the same task and let the user choose among the results. This may be costly, but if the cost is low enough it can be a viable approach.
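One way to make redundant checking affordable is to audit only a sample of completed tasks. Below is a minimal sketch of that idea, assuming inference can be made reproducible (for example by fixing the random seed); the names, the audit rate, and the hashing choice are illustrative assumptions, not part of the original proposal.

```typescript
// Sketch of sampled auditing: re-run a random fraction of completed tasks and
// compare result digests against what the original worker reported.
import { createHash } from "crypto";

const AUDIT_RATE = 0.05; // audit roughly 5% of completed tasks (illustrative)

function digest(imageBytes: Uint8Array): string {
  return createHash("sha256").update(imageBytes).digest("hex");
}

interface CompletedTask {
  taskId: string;
  prompt: string;
  seed: number;         // fixed seed so the inference can be reproduced
  workerId: string;
  resultDigest: string; // hash of the image the worker returned
}

// Returns true if the task passes: either it was not sampled for audit,
// or the re-computed result matches the reported digest.
function auditTask(
  task: CompletedTask,
  rerun: (prompt: string, seed: number) => Uint8Array,
): boolean {
  if (Math.random() > AUDIT_RATE) return true; // not sampled this time
  const recomputed = digest(rerun(task.prompt, task.seed));
  return recomputed === task.resultDigest;     // mismatch => flag the worker
}
```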

2. Pricing strategy

For pricing, an order book can be established on-chain. It is also possible to use an on-chain, verifiable proxy metric for computing resources, similar to gas. This differs from a simple free market, where users simply post what they are willing to pay for an inference and workers either accept or bid against each other for the task. Instead, a gas-like proxy metric can be defined: a specific inference requires a certain amount of computing resources, and that amount directly determines the price. This simplifies the operation of the whole mechanism.
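As a toy illustration of the gas-like idea, the protocol could estimate "compute units" from the request parameters and multiply by a network-wide unit price. The cost model below (pixels times sampling steps) and all names are assumptions made purely for illustration.

```typescript
// Sketch of gas-like pricing: compute units derived from the request,
// multiplied by a single network-wide unit price.

interface InferenceRequest {
  width: number;  // output image width in pixels
  height: number; // output image height in pixels
  steps: number;  // number of diffusion/sampling steps
}

// Rough proxy for compute consumed: pixels processed per step.
function computeUnits(req: InferenceRequest): number {
  return req.width * req.height * req.steps;
}

// Price in tokens, given a unit price set by the protocol (or a market for units).
function quotePrice(req: InferenceRequest, tokenPerUnit: number): number {
  return computeUnits(req) * tokenPerUnit;
}

// Example: a 512x512 image with 30 sampling steps.
const price = quotePrice({ width: 512, height: 512, steps: 30 }, 1e-9);
console.log(`Quoted price: ${price} tokens`);
```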

Alternatively, an off-chain order book could be used, which is less expensive to run and potentially very efficient. The problem, however, is that whoever owns that order book may concentrate the network effect on themselves.

3. Storage mechanism

The storage mechanism is important for ensuring that the results of the work are delivered to the user correctly, but it is hard to reduce the need for trust and to prove that the work was delivered correctly. Users may dispute whether something was delivered, much as one might complain about never receiving an expected package. Auditors may need to verify the computation and check the accuracy of the output. Therefore, the output should be visible to the protocol and stored where the protocol can access it.

In terms of storage, we have several options. One is to store the data on-chain, but this is expensive. Another is to use a dedicated storage crypto-network, which is more complex but attempts to solve the problem in a peer-to-peer fashion. Alternatively, the data can be stored off-chain, but this raises other issues: whoever controls that storage system could influence other aspects such as the verification process and the release of the final payment.
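One middle-ground sketch implied by the discussion above: keep the image bytes wherever is cheapest, but commit their hash where the protocol can see it, so delivery can be checked before payment is released. The function and field names here are hypothetical.

```typescript
// Sketch: commit a content hash to the protocol while storing the bytes elsewhere.
import { createHash } from "crypto";

interface ResultCommitment {
  taskId: string;
  contentHash: string; // hash the protocol records, e.g. in an on-chain event
  storageUri: string;  // where the actual bytes live (a storage network, a server, ...)
}

// Worker side: hash the result and publish the commitment.
function commitResult(
  taskId: string,
  imageBytes: Uint8Array,
  storageUri: string,
): ResultCommitment {
  const contentHash = createHash("sha256").update(imageBytes).digest("hex");
  return { taskId, contentHash, storageUri };
}

// Client side: after downloading from storageUri, confirm the bytes match the
// commitment before accepting delivery (and before payment is released).
function verifyDelivery(commitment: ResultCommitment, downloaded: Uint8Array): boolean {
  const actual = createHash("sha256").update(downloaded).digest("hex");
  return actual === commitment.contentHash;
}
```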

4. Task allocation strategy

The way tasks are distributed also needs consideration, and it is a relatively complex area. Workers could choose tasks themselves after submission, the protocol could assign tasks once they are submitted, or the user or end user could select a specific worker. Each approach has pros and cons, and combinations are possible, such as the protocol deciding which workers may request which tasks.

Task assignment involves many interesting details. For example, in a protocol-driven system, the protocol needs to know whether a worker is online and available before deciding whether to assign it a task. It also needs to know each worker's capacity and load. Various additional factors therefore need to be considered in the protocol that may not have appeared in the initial, simple implementation.
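Below is a minimal sketch of protocol-driven assignment that uses exactly the extra state mentioned above: whether a worker is online and how much spare capacity it has. Everything here is an illustrative assumption rather than a mechanism described in the talk.

```typescript
// Sketch: pick the online worker with the most spare capacity for the next task.

interface WorkerStatus {
  workerId: string;
  online: boolean;
  capacity: number;    // max concurrent tasks the worker advertises
  activeTasks: number; // tasks currently assigned to it
}

// Returns the chosen worker, or undefined if no worker can take the task.
function assignTask(workers: WorkerStatus[]): WorkerStatus | undefined {
  const candidates = workers.filter(w => w.online && w.activeTasks < w.capacity);
  if (candidates.length === 0) return undefined;
  candidates.sort(
    (a, b) => (b.capacity - b.activeTasks) - (a.capacity - a.activeTasks),
  );
  const chosen = candidates[0];
  chosen.activeTasks += 1; // a real protocol would also handle timeouts/reassignment
  return chosen;
}
```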

Key Points of Decentralized Protocol Design


7 key design elements that can lead to centralization risk

These include namespaces (as seen in the email example), payment systems, reputation, storage, matching, pricing systems, and verification systems. These elements may become centralized through network effects or high switching costs. Govern the protocol by mitigating the accumulation of network effects, channeling network effects into the protocol itself, and building a decentralized control layer into the protocol to ensure the long-term health of the system. Decentralized control can be achieved with volatile tokens or with other governance designs such as reputation systems or rotating elections.

Reduce switching costs and promote interoperability

In order to encourage entrepreneurs to build applications on the system, it is important to reduce switching costs and promote interoperability between different systems. Avoid introducing high switching costs and reduce over-reliance on off-chain order books or third-party verification systems.

Using Web3 technology to create a decentralized system

Leverage Web3 tools and principles to design systems that empower entrepreneurs and avoid excessive centralization. Protocols that embrace Web3 principles typically have greater scale, longer life, and more vibrant ecosystem vitality, providing fertile areas of innovative exploration beyond the boundaries set by the largest incumbents.

Deep research and selection of the best solution

When designing a protocol and settling on a strategy, each aspect needs in-depth study. For verification, cryptographic solutions are usually the best choice. For pricing, a proxy metric based on on-chain verifiable computing resources can adapt to a variety of inference or machine-learning tasks. For task assignment, a protocol that updates worker capability and status in real time can distribute tasks fairly while letting workers decide whether to accept them. For storage, approaches such as prototype sharding can be considered, handling the problem within a short time window using temporary storage.

When designing a decentralized system, the above considerations can help to build a system with long-term robustness and decentralization properties.

Original: Protocol design: Why and how

Link to translated full-text version:
