
Dual-layer Architecture


AegisAI introduces a dual-layer architecture that separates blockchain consensus from resources required by AI applications, optimizing both security and performance.

The system comprises two distinct layers, the Consensus Layer and the Resource Layer, each with a specialized role that enhances the network’s ability to manage AI workloads while preserving decentralization.

Consensus Layer

The Consensus Layer is a specialized blockchain that maintains the state of AI tasks, distributes fees, and slashes fraudulent resource nodes in the Resource Layer.

The Consensus Layer is composed of up to 48,000 validator nodes. This extensive validator set bolsters decentralization and prevents dominance by any single party.
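
To make the three responsibilities above concrete, here is a minimal sketch of how they might be modeled: an on-chain task record plus a settlement step that either distributes the fee or slashes the offending node. All names (AITaskState, SettleTask, the status values, and the fee/slash callbacks) are hypothetical assumptions for illustration, not a published AegisAI interface.

```go
// Illustrative sketch only: all type and field names are hypothetical, chosen
// to mirror the Consensus Layer's three responsibilities described above:
// tracking AI task state, distributing fees, and slashing fraudulent nodes.
package consensus

type TaskStatus int

const (
	TaskPending TaskStatus = iota
	TaskAssigned
	TaskCompleted
	TaskDisputed
)

// AITaskState is the on-chain record the Consensus Layer keeps per AI task.
type AITaskState struct {
	TaskID       string
	Requester    string     // address that submitted the task
	ResourceNode string     // address of the node executing it
	FeePaid      uint64     // fee escrowed by the requester, in smallest $ASI unit
	Status       TaskStatus
	ResultHash   []byte     // commitment to the inference output
}

// SettleTask releases the escrowed fee to the resource node on success,
// or slashes the node's stake and refunds the requester on proven fraud.
func SettleTask(task *AITaskState, fraudProven bool,
	slash func(node string), pay func(to string, amount uint64)) {
	if fraudProven {
		slash(task.ResourceNode)          // penalize the fraudulent node's stake
		pay(task.Requester, task.FeePaid) // refund the requester
		task.Status = TaskDisputed
		return
	}
	pay(task.ResourceNode, task.FeePaid) // distribute the fee to the honest node
	task.Status = TaskCompleted
}
```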

Resource Layer

The Resource Layer is the computational powerhouse of AegisAI: its resource nodes deliver the processing power, storage, and networking capacity needed to execute AI tasks. To participate, resource nodes must meet two requirements (a sketch combining both follows the list):

  • Staking for Security (PoS requirement): Resource nodes stake $ASI tokens as collateral, aligning their incentives with honest and efficient performance.

  • Hardware Requirements (PoW requirement): Resource nodes must meet minimum hardware standards to compute and verify the outputs of AI models.
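
The sketch below combines the two requirements into a single hypothetical registration check. The field names and thresholds (minStakeASI, the HardwareSpec minimums) are illustrative assumptions, not published AegisAI network parameters.

```go
// Illustrative sketch only: it simply combines the two requirements above,
// a PoS-style stake in $ASI and minimum PoW-style hardware capabilities.
package resource

import "errors"

// HardwareSpec describes the capabilities a resource node advertises.
type HardwareSpec struct {
	GPUMemoryGB   uint32
	CPUCores      uint32
	StorageGB     uint64
	BandwidthMbps uint32
}

// Registration is what a prospective resource node submits to join the network.
type Registration struct {
	NodeAddress string
	StakedASI   uint64 // collateral, in smallest $ASI unit
	Hardware    HardwareSpec
}

// Example thresholds; the real network parameters may differ.
const minStakeASI = 1_000_000

var minHardware = HardwareSpec{GPUMemoryGB: 24, CPUCores: 8, StorageGB: 512, BandwidthMbps: 100}

// Validate checks both requirements before the node is admitted.
func (r Registration) Validate() error {
	if r.StakedASI < minStakeASI {
		return errors.New("insufficient $ASI stake")
	}
	hw := r.Hardware
	if hw.GPUMemoryGB < minHardware.GPUMemoryGB ||
		hw.CPUCores < minHardware.CPUCores ||
		hw.StorageGB < minHardware.StorageGB ||
		hw.BandwidthMbps < minHardware.BandwidthMbps {
		return errors.New("hardware below minimum requirements")
	}
	return nil
}
```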