pytest-llm-assert

Natural language assertions for pytest.

Testing a text-to-SQL agent? Validating LLM-generated code? Checking if error messages are helpful? Now you can:

def test_sql_agent_output(llm):
    sql = my_agent.generate("Get names of users over 21")
    
    assert llm(sql, "Is this a valid SQL query that selects user names filtered by age > 21?")

The LLM evaluates your criterion and returns pass/fail — no regex, no parsing, no exact string matching.

Features

  • Semantic assertions — Assert meaning, not exact strings
  • 100+ LLM providers — OpenAI, Azure, Anthropic, Ollama, Vertex AI, Bedrock via LiteLLM
  • pytest native — Works as a standard pytest plugin/fixture
  • Response introspection — Access tokens, cost, and reasoning via llm.response
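The introspection feature above can be sketched in a test. A minimal sketch, assuming the `llm` fixture from the Quick Start below: this page confirms that `llm.response` carries tokens, cost, and reasoning, but the exact attribute names used here are assumptions, not the documented API.

```python
# Sketch of response introspection after an assertion (hypothetical attribute
# names — only "tokens, cost, and reasoning" are confirmed by the feature list).
def test_sql_with_introspection(llm):
    sql = "SELECT name FROM users WHERE age > 21"
    assert llm(sql, "Is this a valid SELECT query filtered by age > 21?")

    # Inspect why the LLM passed (or failed) the criterion, and what it cost.
    print(llm.response.reasoning)
    print(llm.response.cost)
```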

Installation

pip install pytest-llm-assert

Quick Start

# conftest.py
import pytest
from pytest_llm_assert import LLMAssert

@pytest.fixture
def llm():
    return LLMAssert(model="openai/gpt-5-mini")

# test_my_agent.py
def test_generated_sql_is_correct(llm):
    sql = "SELECT name FROM users WHERE age > 21 ORDER BY name"
    assert llm(sql, "Is this a valid SELECT query that returns names of users over 21?")

def test_error_message_is_helpful(llm):
    error = "ValidationError: 'port' must be an integer, got 'abc'"
    assert llm(error, "Does this explain what went wrong and how to fix it?")

def test_summary_captures_key_points(llm):
    summary = generate_summary(document)  # your code under test
    assert llm(summary, "Does this mention the contract duration and parties involved?")

Setup

Works out of the box with cloud identity — no API keys to manage:

# Azure (Entra ID)
export AZURE_API_BASE=https://your-resource.openai.azure.com
az login

# Google Cloud (Vertex AI)
gcloud auth application-default login

# AWS (Bedrock)
aws configure  # Uses IAM credentials

Supports 100+ providers via LiteLLM — including API key auth for OpenAI, Anthropic, Ollama, and more.
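With one of the identities above configured, the Quick Start fixture only needs a matching model string. A sketch, assuming standard LiteLLM provider prefixes; the specific model identifiers below are illustrative placeholders, not recommendations.

```python
# conftest.py — select a provider via the LiteLLM model string.
import pytest
from pytest_llm_assert import LLMAssert

@pytest.fixture
def llm():
    # Azure (Entra ID):  LLMAssert(model="azure/<your-deployment>")
    # Vertex AI:         LLMAssert(model="vertex_ai/<model-name>")
    # AWS Bedrock:       LLMAssert(model="bedrock/<model-id>")
    # Local Ollama:      LLMAssert(model="ollama/<model-name>")
    return LLMAssert(model="openai/gpt-5-mini")
```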

Related

  • pytest-aitest — Full framework for testing MCP servers, CLIs, and AI agents
  • Contributing — Development setup and guidelines

Requirements

  • Python 3.11+
  • pytest 8.0+
  • An LLM provider (OpenAI, Azure, Anthropic, etc.) or a local Ollama instance

Security

  • Sensitive data: test content is sent to your LLM provider; review your data-handling policies before asserting on confidential material

License

MIT
