LLM plugin for OpenAI

Project description

llm-openai-plugin

LLM plugin for OpenAI models.

This plugin is a preview. LLM currently ships with OpenAI models as part of its default collection, implemented using the Chat Completions API.

This plugin implements those same models using the new Responses API.

Currently the only reason to use this plugin over the LLM defaults is to access o1-pro, which can only be used via the Responses API.

Installation

Install this plugin in the same environment as LLM.

llm install llm-openai-plugin

Usage

To run a prompt against o1-pro, do this:

llm -m openai/o1-pro "Convince me that pelicans are the most noble of birds"

Run this to see a full list of models - they start with the openai/ prefix:

llm models -q openai/

Here's the output of that command:

OpenAI: openai/gpt-4o
OpenAI: openai/gpt-4o-mini
OpenAI: openai/gpt-4.5-preview
OpenAI: openai/gpt-4.5-preview-2025-02-27
OpenAI: openai/o3-mini
OpenAI: openai/o1-mini
OpenAI: openai/o1
OpenAI: openai/o1-pro
OpenAI: openai/gpt-4.1
OpenAI: openai/gpt-4.1-2025-04-14
OpenAI: openai/gpt-4.1-mini
OpenAI: openai/gpt-4.1-mini-2025-04-14
OpenAI: openai/gpt-4.1-nano
OpenAI: openai/gpt-4.1-nano-2025-04-14
OpenAI: openai/o3
OpenAI: openai/o3-2025-04-16
OpenAI: openai/o3-streaming
OpenAI: openai/o3-2025-04-16-streaming
OpenAI: openai/o4-mini
OpenAI: openai/o4-mini-2025-04-16
OpenAI: openai/codex-mini-latest
OpenAI: openai/o3-pro
OpenAI: openai/gpt-5
OpenAI: openai/gpt-5-mini
OpenAI: openai/gpt-5-nano
OpenAI: openai/gpt-5-2025-08-07
OpenAI: openai/gpt-5-mini-2025-08-07
OpenAI: openai/gpt-5-nano-2025-08-07
OpenAI: openai/gpt-5-codex
OpenAI: openai/gpt-5-pro
OpenAI: openai/gpt-5-pro-2025-10-06

Add --options to that command to see the full list of options that can be provided to each model.
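As a purely illustrative sketch (not part of the plugin's code), the `-q openai/` query above narrows the model list to IDs matching that string. The equivalent filtering over a plain list of model IDs looks like this in Python; the `anthropic/` entry is a hypothetical ID from another plugin, included only to show something being filtered out:

```python
# Illustrative only: filter model IDs the way `llm models -q openai/`
# narrows the list on the command line.
model_ids = [
    "openai/gpt-4o",
    "openai/o1-pro",
    "anthropic/claude-example",  # hypothetical ID from another plugin
]

# Keep only the IDs that contain the query string.
openai_models = [m for m in model_ids if "openai/" in m]
print(openai_models)
```

This prints only the two `openai/`-prefixed IDs, mirroring the filtered output shown above.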

The o3-streaming model ID exists because o3 currently requires a verified OpenAI organization to support streaming. If you have a verified organization, use o3-streaming; everyone else should use o3.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-openai-plugin
python -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

python -m pytest

This project uses pytest-recording to record OpenAI API responses for the tests, and syrupy to capture snapshots of their results.

If you add a new test that calls the API you can capture the API response and snapshot like this:

PYTEST_OPENAI_API_KEY="$(llm keys get openai)" pytest --record-mode once --snapshot-update

Then review the new snapshots in tests/__snapshots__/ to make sure they look correct.

Project details


Download files

Download the file for your platform.

Source Distribution

llm_openai_plugin-0.7.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

llm_openai_plugin-0.7-py3-none-any.whl (12.4 kB)

Uploaded Python 3

File details

Details for the file llm_openai_plugin-0.7.tar.gz.

File metadata

  • Download URL: llm_openai_plugin-0.7.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_openai_plugin-0.7.tar.gz

  • SHA256: 865d3116f4daf9823f470f5702bdc3fe71ea59d7bc67e525817aa35413f21e9f
  • MD5: 2eecbf96d096773449ac60ead0171471
  • BLAKE2b-256: d922be5f1e3e17a11e5710771ebe22080e0cd4f6e101a4af3d0ff0be8d17dfd0

Provenance

The following attestation bundles were made for llm_openai_plugin-0.7.tar.gz:

Publisher: publish.yml on simonw/llm-openai-plugin

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_openai_plugin-0.7-py3-none-any.whl.

File metadata

File hashes

Hashes for llm_openai_plugin-0.7-py3-none-any.whl

  • SHA256: ba3f02194e0cad0eded015dd19492f72d8a81b22ecdc1f69562c70a3ef52c029
  • MD5: acc15fbe9957af4e9bacbb2f278aa81d
  • BLAKE2b-256: 4d714aec9bd35d3a305bc372567ea15ff8a26d6c5fd53b95582d09f91f6eaf62

Provenance

The following attestation bundles were made for llm_openai_plugin-0.7-py3-none-any.whl:

Publisher: publish.yml on simonw/llm-openai-plugin

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
