IOSG: An Illustrated Vision of the Integration of AI and Web3
At first glance, AI and Web3 appear to be independent technologies, each based on fundamentally different principles and serving different functions. A deeper look, however, reveals that they have the opportunity to balance each other's trade-offs, and that their unique strengths can complement and enhance one another. Balaji Srinivasan eloquently articulated this concept of complementary capabilities at the SuperAI conference, inspiring a detailed comparison of how these technologies interact.
Crypto emerged from the efforts of anonymous cypherpunks to achieve decentralization, and evolved over the course of a decade through the collaboration of numerous independent entities worldwide. In contrast, artificial intelligence is being developed through a top-down approach dominated by a few technology giants. These companies set the pace and direction of the industry, and the barrier to entry is determined more by resource intensity than by technical complexity.
The two technologies also have fundamentally different natures. At its core, crypto is a deterministic system that produces immutable, reproducible results, such as the predictability of hash functions or zero-knowledge proofs. This contrasts sharply with the probabilistic and generally unpredictable nature of artificial intelligence.
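The determinism mentioned above can be made concrete with a tiny sketch: a cryptographic hash always maps the same input to the same digest, which is exactly the property that makes on-chain verification possible and that a generative model lacks.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Return the hex SHA-256 digest of the input bytes."""
    return hashlib.sha256(data).hexdigest()

# The same input always yields the same digest: a deterministic,
# verifiable result, unlike the probabilistic output of a generative model.
assert content_hash(b"AI x Web3") == content_hash(b"AI x Web3")

# A one-byte change produces an entirely different digest.
assert content_hash(b"AI x Web3!") != content_hash(b"AI x Web3")
```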
Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and establishing trustless processes and systems, whereas artificial intelligence focuses on generating rich digital content. But as rich digital content proliferates, verifying its source and preventing identity theft become real challenges.
Fortunately, crypto offers the counterpart to digital abundance: digital scarcity. It provides relatively mature tools that can be applied to AI-generated content to ensure the reliability of its source and to guard against identity theft.
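As a minimal sketch of the provenance idea above: a creator can publish an attestation (a content hash plus an authentication tag) alongside each piece of generated content, so anyone trusting the creator's key can verify its origin. This toy version uses an HMAC with a shared secret to stay standard-library-only; a real system would use public-key signatures so anyone can verify without the secret. All names here are illustrative assumptions, not an existing protocol.

```python
import hashlib
import hmac

SECRET_KEY = b"creator-secret-key"  # assumption: distributed out-of-band

def attest(content: bytes, key: bytes = SECRET_KEY) -> tuple[str, str]:
    """Return (content digest, tag over the digest) as a provenance record."""
    digest = hashlib.sha256(content).hexdigest()
    tag = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return digest, tag

def verify(content: bytes, tag: str, key: bytes = SECRET_KEY) -> bool:
    """Check the tag, i.e. that the content came from the key holder."""
    digest = hashlib.sha256(content).hexdigest()
    expected = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

digest, tag = attest(b"generated image bytes")
assert verify(b"generated image bytes", tag)
assert not verify(b"tampered bytes", tag)
```

In practice the attestation would be anchored on-chain, making the record immutable and timestamped.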
One notable advantage of crypto is its ability to attract large amounts of hardware and capital into a coordinated network serving a specific goal. This is particularly valuable for resource-intensive artificial intelligence: mobilizing underutilized resources to provide cheaper compute can significantly improve the economics of AI workloads.
By comparing these two technologies, we can not only appreciate their respective contributions but also see how together they open new paths for technology and the economy. Each can complement the other, creating a more integrated and innovative future. In this post, we explore the emerging AI x Web3 industry landscape, focusing on several new vertical fields at the intersection of these technologies.
Source: IOSG Ventures
2.1 Compute Networks
The industry chart opens with compute networks, which attempt to solve the constrained GPU supply problem and to reduce computing costs in various ways. The following projects are worth watching:
2.2 Training and Inference
Compute networks serve two primary functions: training and inference. Demand for these networks comes from both Web 2.0 and Web 3.0 projects. On the training side, Web 3.0 projects like Bittensor use compute resources for model fine-tuning. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This focus has given rise to verifiable inference as a market vertical, with projects exploring how to integrate AI inference into smart contracts while preserving decentralization.
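The simplest building block of verifiable inference is a commitment to the model itself: a contract stores a hash of the model weights, and any claimed inference can at least be pinned to that exact model. This is only a sketch of the commitment step; real verifiable-inference schemes (zkML, optimistic verification) go further and prove the computation itself. The names and values below are illustrative assumptions.

```python
import hashlib
import json

def commit_weights(weights: list[float]) -> str:
    """Canonically serialize the weights and hash them into a commitment."""
    blob = json.dumps(weights, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

# Set once at deployment time; a smart contract would store this value.
ONCHAIN_COMMITMENT = commit_weights([0.12, -0.7, 3.4])

def verify_claim(claimed_weights: list[float]) -> bool:
    """The on-chain check: does the claimed model match the commitment?"""
    return commit_weights(claimed_weights) == ONCHAIN_COMMITMENT

assert verify_claim([0.12, -0.7, 3.4])       # matching model passes
assert not verify_claim([0.12, -0.7, 3.5])   # any weight change fails
```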
2.3 Agent Platforms
Next up are agent platforms; the map outlines the core problems that startups in this category need to solve:
These features emphasize the importance of flexible and modular systems, which can be seamlessly integrated into a variety of blockchain and artificial intelligence applications. AI agents have the potential to completely change the way we interact with the Internet, and we believe that agents will use infrastructure to support their operations. We envision AI agents relying on infrastructure in the following aspects:
Source: IOSG Ventures
2.4 Data Layer
In the fusion of AI x Web3, data is a core component. Data is a strategic asset in the AI race, ranking alongside compute as a critical resource. Yet this category is often overlooked, because most of the industry's attention is focused on the compute layer. In fact, Web3 primitives open up many interesting directions in data acquisition, falling mainly into two high-level categories:
Access to public Internet data: This direction aims to build distributed web-crawler networks that can crawl the entire Internet within days, assemble massive datasets, or access very specific Internet data in real time. Crawling at that scale places heavy demands on the network, requiring at least a few hundred nodes before any meaningful work can begin. Fortunately Grass, a distributed crawler-node network, already has over 2 million nodes actively sharing Internet bandwidth with the goal of crawling the entire Internet. This demonstrates the enormous potential of economic incentives for attracting valuable resources.
Although Grass levels the playing field for public data, the challenge of accessing proprietary datasets remains. A large amount of data is kept private because of its sensitive nature. Many startups are applying cryptographic tools that let AI developers build and fine-tune large language models on proprietary datasets while keeping the sensitive information confidential.
Federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation provide different levels of privacy protection with different trade-offs. Bagel's research article () gives an excellent overview of these technologies. They not only protect data privacy during machine learning but also enable comprehensively privacy-preserving AI solutions at the compute layer.
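Of the techniques listed above, differential privacy is the easiest to show in a few lines: a query result is released with calibrated Laplace noise, so the aggregate stays useful while any individual's contribution is masked. This is a textbook sketch of the Laplace mechanism for a counting query (sensitivity 1), not any particular project's implementation.

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample from Laplace(0, scale) via the inverse CDF of a uniform draw."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(values: list[int], epsilon: float, rng: random.Random) -> float:
    """Differentially private count. A counting query has sensitivity 1,
    so the noise scale is 1/epsilon: smaller epsilon => more privacy, more noise."""
    return len(values) + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(0)  # seeded for reproducibility in this sketch
noisy = private_count([1] * 1000, epsilon=0.5, rng=rng)
# The released value is close to the true count of 1000 but not exact.
```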
2.5 Data and Model Provenance
Data and model provenance technologies aim to establish processes that assure users they are interacting with the expected models and data, and to provide guarantees of authenticity and origin. Watermarking is one example of a model provenance technique: a signature is embedded directly into the machine learning model, more specifically into the model weights, so that at retrieval time an inference can be verified to have come from the expected model.
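A toy version of the watermarking idea above: embed a signature bitstring into the parity of quantized model weights, then read it back to check provenance. Real weight-watermarking schemes embed the signature during training via regularization so it survives fine-tuning; this sketch only illustrates the embed/extract mechanics, and the quantization scale is an arbitrary assumption.

```python
SCALE = 10_000  # assumed quantization step: weights treated at 1e-4 precision

def embed_watermark(weights: list[float], bits: list[int]) -> list[float]:
    """Force the parity of each quantized weight to match a signature bit."""
    marked = []
    for w, b in zip(weights, bits):
        q = round(w * SCALE)
        if q % 2 != b:        # parity mismatch: nudge by one quantum
            q += 1
        marked.append(q / SCALE)
    return marked

def extract_watermark(weights: list[float], n_bits: int) -> list[int]:
    """Recover the signature from the weight parities."""
    return [round(w * SCALE) % 2 for w in weights[:n_bits]]

signature = [1, 0, 1, 1, 0, 0, 1, 0]
weights = [0.1234, -0.5678, 0.9999, 0.0001, -0.25, 0.5, -0.75, 0.3333]
marked = embed_watermark(weights, signature)
assert extract_watermark(marked, len(signature)) == signature
```

Each weight shifts by at most one quantization step, so the model's behavior is essentially unchanged while the signature remains recoverable.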
2.6 Application
On the application side, the design possibilities are endless. In the industry landscape above, we list some particularly anticipated uses of AI technology in the Web 3.0 field. As these use cases are largely self-explanatory, we will not comment on them further here. It is worth noting, however, that the intersection of AI and Web 3.0 has the potential to reshape many verticals, since these new primitives give developers more freedom to create innovative use cases and to optimize existing ones.
Summary
The fusion of AI and Web3 brings a promising and innovative future. By leveraging the unique strengths of each technology, we can address various challenges and pave the way for new technological paths. In exploring this emerging industry, the synergy between AI and Web3 can drive progress, reshaping our future digital experiences and our interactions on the web.
The fusion of digital scarcity and digital abundance, the mobilization of underutilized resources to achieve computational efficiency, and the establishment of data practices for security and privacy protection will define the era of next-generation technological evolution.
However, we must recognize that this industry is still in its infancy, and the current landscape may become outdated in short order. The rapid pace of innovation means that today's cutting-edge solutions may soon be displaced by new breakthroughs. Nevertheless, the fundamental building blocks discussed here, such as compute networks, agent platforms, and data protocols, highlight the tremendous potential of the integration of artificial intelligence and Web 3.0.