# Axio
A highly extensible, streaming-first agent framework for Python.
Axio gives you a minimal but complete foundation for building LLM-powered agents. Every integration point is a protocol — bring your own transport, context store, tools, and permission guards.
- Install the TUI and start chatting with an LLM agent in under a minute.
- Write a minimal agent from scratch with the core library.
- Understand the agent loop, protocols, tools, events, and the plugin system.
- Step-by-step guides for writing custom tools, transports, and guards.
- Overview of every package in the monorepo and their entry points.
## How to draw an owl
```python
import aiohttp

from axio import Tool, ToolHandler


class Fetch(ToolHandler):
    """Fetch the text content of a URL."""

    url: str

    async def __call__(self) -> str:
        async with aiohttp.ClientSession() as session:
            async with session.get(self.url) as r:
                return (await r.text())[:2000]


fetch = Tool(name="fetch", description=Fetch.__doc__, handler=Fetch)
```
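Because the handler's parameters are declared as model fields (`url: str` above), a JSON schema can be derived from them automatically. A minimal sketch of that idea using plain Pydantic, assuming `ToolHandler` behaves like a `pydantic.BaseModel` (the `FetchParams` class here is an illustrative stand-in, not Axio's API):

```python
from pydantic import BaseModel


class FetchParams(BaseModel):
    """Fetch the text content of a URL."""

    url: str


# Pydantic derives a JSON schema directly from the field definitions,
# so the tool's parameter schema never has to be written by hand.
schema = FetchParams.model_json_schema()
print(schema["properties"]["url"]["type"])  # string
print(schema["required"])                   # ['url']
```

The schema's `description` is taken from the docstring, which is why `Tool` above can reuse `Fetch.__doc__` for its own description.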
```python
import asyncio

from axio import Agent, MemoryContextStore
from axio_transport_openai import OpenAITransport


async def main() -> None:
    agent = Agent(
        system="You are a helpful assistant.",
        tools=[fetch],
        transport=OpenAITransport(),
    )
    reply = await agent.run(
        "What's the weather tomorrow? Use geoip to detect my location and wttr.in for the weather.",
        MemoryContextStore(),
    )
    print(reply)


asyncio.run(main())
```
## Why Axio?
- **Extensible by design.** Every integration point is a runtime-checkable `Protocol` or ABC. Swap transports, context stores, tools, and guards without touching framework code.
- **Streaming by default.** All LLM I/O flows through typed `StreamEvent` values. No hidden buffering: you see every token, tool call, and result as it happens.
- **Tools are Pydantic models.** Define parameters as fields, get JSON schema for free, and override `__call__` for execution. Guards gate every tool call through a composable permission chain.
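The extensibility point can be illustrated with plain `typing.Protocol`. The `ContextStore` protocol below is hypothetical (Axio's real protocol names and method signatures may differ); it shows how a runtime-checkable protocol lets any structurally conforming class slot in without inheriting from framework code:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class ContextStore(Protocol):
    """Hypothetical store protocol; Axio's actual interface may differ."""

    def append(self, message: str) -> None: ...
    def history(self) -> list[str]: ...


class ListStore:
    """A custom store: no inheritance needed, just matching methods."""

    def __init__(self) -> None:
        self._messages: list[str] = []

    def append(self, message: str) -> None:
        self._messages.append(message)

    def history(self) -> list[str]:
        return list(self._messages)


store = ListStore()
store.append("hello")
# Structural typing: ListStore satisfies the protocol without subclassing it.
print(isinstance(store, ContextStore))  # True
```

Note that `runtime_checkable` `isinstance` checks only verify that the methods exist, not their signatures, so static type checking remains the stronger guarantee.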