Quick Start

Step 1: Install
pip install ainzos
Step 2: Create Kernel
from ainzos.kernel import AinzOSKernel

kernel = AinzOSKernel()

Use Cases

LLM Agent Orchestration

Route requests to different models based on complexity. Fast tasks hit the fast model. Complex tasks hit the strong model.

Heterogeneous Agent Pool

Different agents have different SLAs. Database agents are fast but limited. API agents are slower but more flexible.
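Independent of AinzOS's own API, the dispatch idea can be sketched in plain Python: send each task to the least-loaded agent that advertises the required capability and still has spare capacity. The names here (`PoolAgent`, `pick_agent`) are illustrative, not part of AinzOS.

```python
from dataclasses import dataclass

@dataclass
class PoolAgent:
    name: str
    capacity: int          # max concurrent tasks (the agent's SLA)
    capabilities: list
    load: int = 0          # tasks currently running

def pick_agent(agents, required_capability):
    """Return the least-loaded capable agent with spare capacity, or None."""
    candidates = [a for a in agents
                  if required_capability in a.capabilities and a.load < a.capacity]
    if not candidates:
        return None
    return min(candidates, key=lambda a: a.load / a.capacity)

agents = [
    PoolAgent("db-agent", capacity=50, capabilities=["query"]),            # fast but limited
    PoolAgent("api-agent", capacity=5, capabilities=["query", "mutate"]),  # slower, flexible
]
print(pick_agent(agents, "mutate").name)  # only api-agent can mutate
```

Dividing load by capacity means a high-capacity agent absorbs more work before the scheduler spills over to a constrained one.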

Priority-Based Processing

Some requests matter more: VIP users, time-sensitive operations, critical alerts. Priority scheduling handles them.
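The policy itself is easy to sketch with Python's heapq; this is a conceptual sketch, not AinzOS's internal scheduler. Lower number means higher priority, and an insertion counter keeps FIFO order among equal priorities.

```python
import heapq
import itertools

class PriorityQueue:
    """Min-heap: lower priority number runs first; ties keep FIFO order."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker so equal priorities stay FIFO

    def enqueue(self, priority, task):
        heapq.heappush(self._heap, (priority, next(self._counter), task))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = PriorityQueue()
q.enqueue(5, "batch report")
q.enqueue(0, "critical alert")
q.enqueue(1, "VIP request")
print(q.dequeue())  # critical alert
```

The counter matters: without it, two tasks with the same priority would be compared by payload, which is arbitrary (or a TypeError for uncomparable payloads).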

LLM Orchestration Example

llm_orchestration.py
from ainzos.kernel import AinzOSKernel
from ainzos import Agent, Task  # import path for Agent/Task assumed; adjust to your install

kernel = AinzOSKernel()

# Fast model for simple tasks
fast_llm = Agent(name="gpt-4-mini", capacity=20,
                 capabilities=["summarization", "classification"])
kernel.register_agent("fast", fast_llm)

# Powerful model for complex reasoning
strong_llm = Agent(name="gpt-4", capacity=5,
                   capabilities=["reasoning", "code_generation"])
kernel.register_agent("strong", strong_llm)

# Route tasks by complexity: simple tasks go to the fast model,
# complex ones to the strong model
if task_complexity < 5:
    kernel.enqueue_task(Task(
        id=task_id,
        required_capability="summarization",
        payload={"prompt": user_prompt}
    ))
else:
    kernel.enqueue_task(Task(
        id=task_id,
        required_capability="reasoning",
        payload={"prompt": user_prompt}
    ))

Ready to build?

Start with the documentation or jump straight into the code.