llm-openai-plugin
LLM plugin for OpenAI models.
This plugin is a preview. LLM currently ships with OpenAI models as part of its default collection, implemented using the Chat Completions API.
This plugin implements those same models using the new Responses API.
Currently the only reason to use this plugin over the LLM defaults is to access o1-pro, which can only be used via the Responses API.
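For reference, the two APIs differ at the HTTP level roughly like this (a sketch against the public OpenAI endpoints; requires an `OPENAI_API_KEY` and makes billable network calls):

```shell
# Chat Completions API - what LLM's default OpenAI models use.
# Conversation is sent as a "messages" array.
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}'

# Responses API - what this plugin uses. Note "input" instead of
# "messages", and the /v1/responses endpoint. o1-pro is only
# available here.
curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "o1-pro", "input": "Hello"}'
```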
Installation
Install this plugin in the same environment as LLM.
```shell
llm install llm-openai-plugin
```
Usage
To run a prompt against o1-pro, do this:
```shell
llm -m openai/o1-pro "Convince me that pelicans are the most noble of birds"
```
Run this to see a full list of models; they all start with the openai/ prefix:
```shell
llm models -q openai/
```
Here's the output of that command:
```
OpenAI: openai/gpt-4o
OpenAI: openai/gpt-4o-mini
OpenAI: openai/gpt-4.5-preview
OpenAI: openai/gpt-4.5-preview-2025-02-27
OpenAI: openai/o3-mini
OpenAI: openai/o1-mini
OpenAI: openai/o1
OpenAI: openai/o1-pro
OpenAI: openai/gpt-4.1
OpenAI: openai/gpt-4.1-2025-04-14
OpenAI: openai/gpt-4.1-mini
OpenAI: openai/gpt-4.1-mini-2025-04-14
OpenAI: openai/gpt-4.1-nano
OpenAI: openai/gpt-4.1-nano-2025-04-14
OpenAI: openai/o3
OpenAI: openai/o3-2025-04-16
OpenAI: openai/o3-streaming
OpenAI: openai/o3-2025-04-16-streaming
OpenAI: openai/o4-mini
OpenAI: openai/o4-mini-2025-04-16
OpenAI: openai/codex-mini-latest
OpenAI: openai/o3-pro
OpenAI: openai/gpt-5
OpenAI: openai/gpt-5-mini
OpenAI: openai/gpt-5-nano
OpenAI: openai/gpt-5-2025-08-07
OpenAI: openai/gpt-5-mini-2025-08-07
OpenAI: openai/gpt-5-nano-2025-08-07
OpenAI: openai/gpt-5-codex
OpenAI: openai/gpt-5-pro
OpenAI: openai/gpt-5-pro-2025-10-06
```
Add --options to see a full list of options that can be provided to each model.
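For example, you could inspect the options for one model and then pass one on the command line with -o. The reasoning_effort option name below is an assumption based on OpenAI's reasoning models; confirm the exact name in the --options output before relying on it:

```shell
# Show the options accepted by the o3 model
llm models --options -q openai/o3

# Pass an option with -o (option name assumed, verify via --options)
llm -m openai/o3 -o reasoning_effort high \
  "Summarize the plot of Hamlet in one sentence"
```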
The o3-streaming model ID exists because o3 currently requires a verified organization in order to support streaming. If you have a verified organization you can use o3-streaming; everyone else should use o3.
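The two model IDs are used the same way; only the streaming behavior differs (both commands below need an OpenAI API key configured in LLM):

```shell
# Verified organizations: tokens stream to the terminal as generated
llm -m openai/o3-streaming "Write a haiku about pelicans"

# Everyone else: the full response is returned once complete
llm -m openai/o3 "Write a haiku about pelicans"
```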
Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```shell
cd llm-openai-plugin
python -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```shell
llm install -e '.[test]'
```
To run the tests:
```shell
python -m pytest
```
This project uses pytest-recording to record OpenAI API responses for the tests, and syrupy to capture snapshots of their results.
If you add a new test that calls the API you can capture the API response and snapshot like this:
```shell
PYTEST_OPENAI_API_KEY="$(llm keys get openai)" pytest --record-mode once --snapshot-update
```
Then review the new snapshots in tests/__snapshots__/ to make sure they look correct.
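If a model's behavior changes and existing cassettes go stale, pytest-recording's other record modes can refresh them (modes none, once, new_episodes, rewrite, and all come from pytest-recording itself; the workflow below is a sketch):

```shell
# Re-record every cassette from live API calls and refresh snapshots
PYTEST_OPENAI_API_KEY="$(llm keys get openai)" \
  pytest --record-mode rewrite --snapshot-update

# Subsequent runs replay the recorded cassettes, no API key needed
python -m pytest
```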
Download files
File details
Details for the file llm_openai_plugin-0.7.tar.gz.
File metadata
- Download URL: llm_openai_plugin-0.7.tar.gz
- Upload date:
- Size: 12.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 865d3116f4daf9823f470f5702bdc3fe71ea59d7bc67e525817aa35413f21e9f |
| MD5 | 2eecbf96d096773449ac60ead0171471 |
| BLAKE2b-256 | d922be5f1e3e17a11e5710771ebe22080e0cd4f6e101a4af3d0ff0be8d17dfd0 |
Provenance
The following attestation bundles were made for llm_openai_plugin-0.7.tar.gz:
Publisher: publish.yml on simonw/llm-openai-plugin

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_openai_plugin-0.7.tar.gz
- Subject digest: 865d3116f4daf9823f470f5702bdc3fe71ea59d7bc67e525817aa35413f21e9f
- Sigstore transparency entry: 585874022
- Permalink: simonw/llm-openai-plugin@4f1787d1c81523ceae1dd2b34075fa9632502795
- Branch / Tag: refs/tags/0.7
- Owner: https://github.com/simonw
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@4f1787d1c81523ceae1dd2b34075fa9632502795
- Trigger Event: release
File details
Details for the file llm_openai_plugin-0.7-py3-none-any.whl.
File metadata
- Download URL: llm_openai_plugin-0.7-py3-none-any.whl
- Upload date:
- Size: 12.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ba3f02194e0cad0eded015dd19492f72d8a81b22ecdc1f69562c70a3ef52c029 |
| MD5 | acc15fbe9957af4e9bacbb2f278aa81d |
| BLAKE2b-256 | 4d714aec9bd35d3a305bc372567ea15ff8a26d6c5fd53b95582d09f91f6eaf62 |
Provenance
The following attestation bundles were made for llm_openai_plugin-0.7-py3-none-any.whl:
Publisher: publish.yml on simonw/llm-openai-plugin

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_openai_plugin-0.7-py3-none-any.whl
- Subject digest: ba3f02194e0cad0eded015dd19492f72d8a81b22ecdc1f69562c70a3ef52c029
- Sigstore transparency entry: 585874029
- Permalink: simonw/llm-openai-plugin@4f1787d1c81523ceae1dd2b34075fa9632502795
- Branch / Tag: refs/tags/0.7
- Owner: https://github.com/simonw
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@4f1787d1c81523ceae1dd2b34075fa9632502795
- Trigger Event: release