
OAF — OpenAgentFramework
Minimal, transparent AI agent framework for Python



Build AI agents that call tools, manage conversation context, and stream responses — with zero magic. Every prompt, tool call, and LLM decision is fully inspectable. Works with OpenAI and Anthropic out of the box.

Full Documentation →

Why OAF?

| | OAF | Typical frameworks |
| --- | --- | --- |
| Abstraction | Flat — one agent loop, one tool decorator | Deep chains, hidden prompt wrangling |
| Debuggability | Full prompt/response inspection via hooks | Opaque internal state |
| Surface area | ~25 top-level exports | Hundreds of classes |
| Tool definition | Decorate any async function | Special base classes, schemas, descriptors |
| Context | Single subclass point — you own the prompt | Scattered across prompt templates, chains, memory |
| Providers | OpenAI + Anthropic, same API | Often single-provider or heavy adapter layer |

Install

pip install scope-oaf

Requires Python 3.11+. Core deps: openai, anthropic, tiktoken.

Quick Start

import asyncio
from oaf import Agent, ToolRegistry

registry = ToolRegistry()

@registry.register
async def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: 72°F, sunny"

agent = Agent(model="gpt-4.1-nano", tools=registry)
response = asyncio.run(agent.message("What's the weather in Tokyo?", role="user"))
print(response.text)

A full agent with tool calling in 10 lines.

What's Inside

Tools

Decorate any async function — type hints become JSON Schema automatically. Supports str, int, float, bool, list[T], dict[K,V], Optional[T], Union, Enum.

@registry.register
async def search(query: str, max_results: int = 5) -> str:
    """Search the web."""
    return f"Results for {query}"
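Conceptually, the decorator reads the function signature and emits a JSON Schema parameter spec. The mapping can be sketched in plain Python (an illustration of the idea, not OAF's actual implementation; only primitive types are handled here):

```python
import inspect
import typing

# Illustrative sketch: derive a JSON Schema parameter spec from a
# function signature, the way a tool decorator might.
PRIMITIVES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_for(func):
    hints = typing.get_type_hints(func)
    sig = inspect.signature(func)
    props, required = {}, []
    for name, param in sig.parameters.items():
        if name not in hints:
            continue
        props[name] = {"type": PRIMITIVES.get(hints[name], "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value => required
    return {"type": "object", "properties": props, "required": required}

async def search(query: str, max_results: int = 5) -> str:
    """Search the web."""
    return f"Results for {query}"

print(schema_for(search))
# {'type': 'object',
#  'properties': {'query': {'type': 'string'},
#                 'max_results': {'type': 'integer'}},
#  'required': ['query']}
```

Parameters with defaults (like `max_results`) drop out of `required`, so the model can omit them.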

Group related tools with BaseToolGroup:

from oaf import BaseToolGroup, tool_method

class MathTools(BaseToolGroup):
    name = "math"

    @tool_method
    async def add(self, a: int, b: int) -> str:
        return str(a + b)

registry.register_group(MathTools())  # → "math.add"

Role-based filtering and per-tool timeouts built in:

@registry.register(allowed_roles={"leader"}, timeout=10.0)
async def sensitive_op(cmd: str) -> str: ...
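A per-tool timeout is typically just `asyncio.wait_for` around the call; a minimal sketch of the pattern (not OAF's internals):

```python
import asyncio

# Illustrative sketch: wrap a tool coroutine in asyncio.wait_for with
# the configured limit, converting timeouts into an error result.
async def call_with_timeout(tool, timeout, *args, **kwargs):
    try:
        return await asyncio.wait_for(tool(*args, **kwargs), timeout)
    except asyncio.TimeoutError:
        return f"error: {tool.__name__} exceeded {timeout}s"

async def slow_op(cmd: str) -> str:
    await asyncio.sleep(0.2)  # stands in for slow work
    return f"done: {cmd}"

print(asyncio.run(call_with_timeout(slow_op, 0.05, "ls")))
# error: slow_op exceeded 0.05s
```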

Context System

BaseContext is the single subclass point for prompt engineering. The default ConversationalInMemory handles message/token limits. Build your own for RAG, vector DB injection, or custom retention policies:

from oaf import BaseContext

class MyRAGContext(BaseContext):
    def build_messages(self, **kwargs):
        # You control everything the LLM sees
        ...

Multiple agents can share one context safely — internal tools are passed at call time, never stored.
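The retention behavior of a `ConversationalInMemory`-style context can be approximated with a simple token-budget trim. This is illustrative only: token counting here is a crude whitespace split, where the real implementation would use tiktoken:

```python
# Illustrative retention policy: keep the newest messages that fit a
# token budget, always preserving the system prompt.
def trim_to_budget(messages, max_tokens):
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    budget = max_tokens - sum(len(m["content"].split()) for m in system)
    kept = []
    for msg in reversed(rest):  # walk newest-first
        cost = len(msg["content"].split())
        if cost > budget:
            break
        kept.append(msg)
        budget -= cost
    return system + list(reversed(kept))

history = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "first question about the weather"},
    {"role": "assistant", "content": "first answer"},
    {"role": "user", "content": "second question"},
]
print([m["content"] for m in trim_to_budget(history, 8)])
# → ['You are helpful.', 'first answer', 'second question']
```

The oldest user turn is dropped first; the system prompt always survives.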

Agent Loop

Generator-based execution yields typed events for real-time UIs:

async for event in agent.run("What's the weather?", role="user"):
    match event.type:
        case "tool_call_start": print(f"Calling {event.data['name']}...")
        case "tool_call_end":   print(f"  → {event.data['result']}")
        case "message_complete": print(event.data["response"].text)

Or use the simple agent.message() / agent.message_stream() wrappers.

Internal Tools

Built-in tools controlled via config — filesystem (sandboxed), thinking/reasoning, credential management, and channel switching:

from oaf import Agent, InternalToolConfig

agent = Agent(
    model="gpt-4.1-nano",
    internal_tool_config=InternalToolConfig(
        thinking=True,
        filesystem=True,
        filesystem_settings={"base_path": "./workspace"},
        credentials=True,
    ),
)

Config auto-propagates to subagents with optional overrides.

Multi-Agent

Spawn subagents with shared mailboxes. Results flow back automatically:

from oaf import Agent, Project, Subagent, InMemoryMailbox, InMemoryCredentialStore

researcher = Subagent(
    name="researcher",
    description="Researches topics and summarizes findings",
    model="gpt-4.1-nano",
)

project = Project(
    name="my-project",
    mailbox=InMemoryMailbox(),
    credential_store=InMemoryCredentialStore(),
)

leader = project.agent(
    model="gpt-4.1-nano",
    role="leader",
    subagents=[researcher],
)
# Leader gets delegate.task + delegate.status tools automatically
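The shared-mailbox pattern behind delegation can be sketched with `asyncio.Queue` (conceptual only; `InMemoryMailbox`'s real interface may differ):

```python
import asyncio

# Conceptual mailbox: the leader posts a task, a worker picks it up
# and posts the result back on a reply queue.
async def worker(inbox: asyncio.Queue, outbox: asyncio.Queue):
    task = await inbox.get()
    await outbox.put(f"summary of: {task}")

async def main():
    inbox, outbox = asyncio.Queue(), asyncio.Queue()
    asyncio.create_task(worker(inbox, outbox))
    await inbox.put("research quantum computing")
    result = await outbox.get()
    print(result)  # summary of: research quantum computing

asyncio.run(main())
```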

Credentials

Store secrets with auto-injection into tool parameters:

from oaf import Cred, InMemoryCredentialStore

store = InMemoryCredentialStore()
await store.add("api_key", "sk-secret-123")

@registry.register
async def call_api(query: str, api_key: Cred) -> str:
    """Cred params resolve from the credential store at call time."""
    return f"Called with {api_key}"
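The injection mechanism can be illustrated in plain Python: inspect the annotations and fill any `Cred`-typed parameter from the store before calling. This is a sketch of the idea, not OAF's implementation (`Cred` and the dict store here are stand-ins):

```python
import asyncio
import typing

class Cred(str):
    """Stand-in marker type: Cred-annotated params resolve from the store."""

STORE = {"api_key": "sk-secret-123"}  # stand-in for a credential store

async def inject_and_call(func, **kwargs):
    # Fill any Cred-annotated parameter from the store, keyed by name.
    for name, hint in typing.get_type_hints(func).items():
        if hint is Cred and name not in kwargs:
            kwargs[name] = STORE[name]
    return await func(**kwargs)

async def call_api(query: str, api_key: Cred) -> str:
    return f"Called {query} with {api_key}"

print(asyncio.run(inject_and_call(call_api, query="weather")))
# Called weather with sk-secret-123
```

The caller never handles the secret; it appears only at call time.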

Hooks

14 lifecycle events — subclass Hooks or register handlers ad hoc. before_* events support mutating their arguments:

from oaf import Hooks

class MyHooks(Hooks):
    async def before_message(self, message, role, messages):
        print(f"→ {message}")
    async def after_tool_call(self, tool_name, arguments, result):
        print(f"  {tool_name}({arguments}) = {result}")

LLM Client (standalone)

Use independently of the agent framework:

from oaf.llmclient import LLMClient, SyncLLMClient, Message

client = LLMClient()
response = await client.chat("gpt-4.1-nano", [Message(role="user", content="Hello")])

# Sync wrapper, streaming, embeddings all supported
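A sync wrapper over an async client is typically just `asyncio.run` behind a facade; an illustrative pattern (not necessarily how `SyncLLMClient` is built, and `AsyncClient` here is a stand-in):

```python
import asyncio

# Illustrative sync facade: each call drives the async method to
# completion with asyncio.run. SyncLLMClient may be built differently.
class AsyncClient:
    async def chat(self, model: str, prompt: str) -> str:
        await asyncio.sleep(0)  # stands in for a network call
        return f"[{model}] reply to: {prompt}"

class SyncFacade:
    def __init__(self, inner: AsyncClient):
        self._inner = inner

    def chat(self, model: str, prompt: str) -> str:
        return asyncio.run(self._inner.chat(model, prompt))

client = SyncFacade(AsyncClient())
print(client.chat("gpt-4.1-nano", "Hello"))
# [gpt-4.1-nano] reply to: Hello
```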

Architecture

┌──────────────────────────────────────────────────┐
│                  Your Application                │
├──────────────────────────────────────────────────┤
│             OpenAgentFramework (OAF)             │
│  ┌────────┐ ┌────────┐ ┌─────────┐ ┌─────────┐  │
│  │ Agent  │ │ Tools  │ │ Context │ │  Hooks  │  │
│  └────────┘ └────────┘ └─────────┘ └─────────┘  │
│  ┌────────┐ ┌────────┐ ┌─────────┐ ┌─────────┐  │
│  │Project │ │ Creds  │ │Mailbox  │ │Channels │  │
│  └────────┘ └────────┘ └─────────┘ └─────────┘  │
├──────────────────────────────────────────────────┤
│          llmclient (built-in, standalone)        │
│           OpenAI + Anthropic providers           │
└──────────────────────────────────────────────────┘

Contributing

git clone https://github.com/devincii-io/scope-oaf.git
cd scope-oaf
pip install -e ".[dev]"
pytest tests/ -v

All development happens on the dev branch. Pushing to master triggers a PyPI publish via GitHub Actions.

License

MIT
