# OAF — OpenAgentFramework

Minimal, transparent AI agent framework for Python.
Build AI agents that call tools, manage conversation context, and stream responses — with zero magic. Every prompt, tool call, and LLM decision is fully inspectable. Works with OpenAI and Anthropic out of the box.
## Why OAF?

| | OAF | Typical frameworks |
|---|---|---|
| Abstraction | Flat — one agent loop, one tool decorator | Deep chains, hidden prompt wrangling |
| Debuggability | Full prompt/response inspection via hooks | Opaque internal state |
| Surface area | ~25 top-level exports | Hundreds of classes |
| Tool definition | Decorate any async function | Special base classes, schemas, descriptors |
| Context | Single subclass point — you own the prompt | Scattered across prompt templates, chains, memory |
| Providers | OpenAI + Anthropic, same API | Often single-provider or heavy adapter layer |
## Install

```bash
pip install scope-oaf
```

Requires Python 3.11+. Core deps: `openai`, `anthropic`, `tiktoken`.
## Quick Start

```python
import asyncio

from oaf import Agent, ToolRegistry

registry = ToolRegistry()

@registry.register
async def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: 72°F, sunny"

agent = Agent(model="gpt-4.1-nano", tools=registry)

response = asyncio.run(agent.message("What's the weather in Tokyo?", role="user"))
print(response.text)
```

A full agent with tool calling in 10 lines.
## What's Inside

### Tools

Decorate any async function — type hints become JSON Schema automatically. Supports `str`, `int`, `float`, `bool`, `list[T]`, `dict[K, V]`, `Optional[T]`, `Union`, and `Enum`.

```python
@registry.register
async def search(query: str, max_results: int = 5) -> str:
    """Search the web."""
    return f"Results for {query}"
```
Group related tools with `BaseToolGroup`:

```python
from oaf import BaseToolGroup, tool_method

class MathTools(BaseToolGroup):
    name = "math"

    @tool_method
    async def add(self, a: int, b: int) -> str:
        return str(a + b)

registry.register_group(MathTools())  # → "math.add"
```

Role-based filtering and per-tool timeouts built in:

```python
@registry.register(allowed_roles={"leader"}, timeout=10.0)
async def sensitive_op(cmd: str) -> str: ...
```
### Context System

`BaseContext` is the single subclass point for prompt engineering. The default `ConversationalInMemory` handles message/token limits. Build your own for RAG, vector DB injection, or custom retention policies:

```python
from oaf import BaseContext

class MyRAGContext(BaseContext):
    def build_messages(self, **kwargs):
        # You control everything the LLM sees
        ...
```

Multiple agents can share one context safely — internal tools are passed at call time, never stored.
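A fuller sketch of the RAG idea, under stated assumptions: that `BaseContext()` needs no required constructor arguments, that `build_messages` returns the exact role/content list handed to the provider, and that `retriever` and `self.turns` are your own code, not OAF APIs:

```python
from oaf import BaseContext

class DocsContext(BaseContext):
    def __init__(self, retriever):
        super().__init__()
        self.retriever = retriever     # your retrieval layer (hypothetical)
        self.turns: list[dict] = []    # history maintained by this context

    def build_messages(self, **kwargs):
        # Ground the system prompt in the top-3 snippets for the latest user turn.
        query = self.turns[-1]["content"] if self.turns else ""
        docs = self.retriever.search(query, k=3)
        system = "Answer using these documents:\n" + "\n".join(docs)
        return [{"role": "system", "content": system}, *self.turns]
```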
### Agent Loop

Generator-based execution yields typed events for real-time UIs:

```python
async for event in agent.run("What's the weather?", role="user"):
    match event.type:
        case "tool_call_start": print(f"Calling {event.data['name']}...")
        case "tool_call_end": print(f" → {event.data['result']}")
        case "message_complete": print(event.data["response"].text)
```

Or use the simple `agent.message()` / `agent.message_stream()` wrappers.
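For instance, a token-by-token console stream might look like this; the chunk type is an assumption, since only the wrapper name is documented above:

```python
async def stream_reply(agent, prompt: str) -> None:
    # Assumption: message_stream yields incremental text chunks.
    async for chunk in agent.message_stream(prompt, role="user"):
        print(chunk, end="", flush=True)
    print()
```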
### Internal Tools

Built-in tools controlled via config — filesystem (sandboxed), thinking/reasoning, credential management, and channel switching:

```python
from oaf import Agent, InternalToolConfig

agent = Agent(
    model="gpt-4.1-nano",
    internal_tool_config=InternalToolConfig(
        thinking=True,
        filesystem=True,
        filesystem_settings={"base_path": "./workspace"},
        credentials=True,
    ),
)
```

Config auto-propagates to subagents with optional overrides.
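The override shape isn't spelled out here; one plausible form, assuming `Subagent` (introduced below) accepts its own `internal_tool_config`, would be:

```python
from oaf import InternalToolConfig, Subagent

# Hypothetical override: this subagent keeps the inherited config
# but has filesystem access switched off.
sandboxed = Subagent(
    name="sandboxed",
    description="Research only, no filesystem access",
    model="gpt-4.1-nano",
    internal_tool_config=InternalToolConfig(filesystem=False),
)
```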
### Multi-Agent

Spawn subagents with shared mailboxes. Results flow back automatically:

```python
from oaf import Agent, Project, Subagent, InMemoryMailbox, InMemoryCredentialStore

researcher = Subagent(
    name="researcher",
    description="Researches topics and summarizes findings",
    model="gpt-4.1-nano",
)

project = Project(
    name="my-project",
    mailbox=InMemoryMailbox(),
    credential_store=InMemoryCredentialStore(),
)

leader = project.agent(
    model="gpt-4.1-nano",
    role="leader",
    subagents=[researcher],
)

# Leader gets delegate.task + delegate.status tools automatically
```
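Driving the leader then uses the same `message` call as Quick Start; delegation happens when the model invokes the auto-added `delegate.*` tools:

```python
import asyncio

# Continues the snippet above; call shape mirrors Quick Start.
reply = asyncio.run(
    leader.message("Research async agent frameworks and summarize.", role="user")
)
print(reply.text)
```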
### Credentials

Store secrets with auto-injection into tool parameters:

```python
from oaf import Cred, InMemoryCredentialStore

store = InMemoryCredentialStore()
await store.add("api_key", "sk-secret-123")

@registry.register
async def call_api(query: str, api_key: Cred) -> str:
    """Cred params resolve from the credential store at call time."""
    return f"Called with {api_key}"
```
### Hooks

14 lifecycle events — subclass or register ad-hoc. `before_*` events support mutation:

```python
from oaf import Hooks

class MyHooks(Hooks):
    async def before_message(self, message, role, messages):
        print(f"→ {message}")

    async def after_tool_call(self, tool_name, arguments, result):
        print(f" {tool_name}({arguments}) = {result}")
```
### LLM Client (standalone)

Use independently of the agent framework:

```python
from oaf.llmclient import LLMClient, SyncLLMClient, Message

client = LLMClient()
response = await client.chat("gpt-4.1-nano", [Message(role="user", content="Hello")])

# Sync wrapper, streaming, embeddings all supported
```
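Assuming the sync wrapper mirrors the async surface (the call shape below is inferred, not documented here):

```python
sync_client = SyncLLMClient()
reply = sync_client.chat("gpt-4.1-nano", [Message(role="user", content="Hello")])
print(reply.text)  # .text assumed, mirroring the agent response shown earlier
```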
## Architecture

```
┌──────────────────────────────────────────────────┐
│                 Your Application                 │
├──────────────────────────────────────────────────┤
│             OpenAgentFramework (OAF)             │
│  ┌────────┐ ┌────────┐ ┌─────────┐ ┌─────────┐   │
│  │ Agent  │ │ Tools  │ │ Context │ │  Hooks  │   │
│  └────────┘ └────────┘ └─────────┘ └─────────┘   │
│  ┌────────┐ ┌────────┐ ┌─────────┐ ┌─────────┐   │
│  │Project │ │ Creds  │ │ Mailbox │ │Channels │   │
│  └────────┘ └────────┘ └─────────┘ └─────────┘   │
├──────────────────────────────────────────────────┤
│        llmclient (built-in, standalone)          │
│           OpenAI + Anthropic providers           │
└──────────────────────────────────────────────────┘
```
## Contributing

```bash
git clone https://github.com/devincii-io/scope-oaf.git
cd scope-oaf
pip install -e ".[dev]"
pytest tests/ -v
```

All development goes to the dev branch. A push to master triggers a PyPI publish via GitHub Actions.
## License

MIT