
Kryten LLM

AI-powered chat bot service for CyTube, part of the Kryten ecosystem.

Features

  • Trigger-Based Responses: Configurable trigger words with probabilities
  • Direct Mentions: Responds when mentioned by name
  • LLM Integration: Multiple LLM provider support (OpenAI, OpenRouter, local)
  • Rate Limiting: Per-user, per-trigger, and global rate limits
  • Spam Detection: Automatic spam detection with exponential backoff penalties
  • Context Awareness: Tracks chat history and video context
  • Hot-Reload: Reload configuration without restart (SIGHUP)
  • Service Discovery: Publishes heartbeats and responds to discovery polls
  • Dry-Run Mode: Test responses without sending to chat
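To illustrate how trigger words with probabilities might behave, here is a minimal sketch. The names (`TRIGGERS`, `should_respond`) and the exact matching rules are hypothetical, not the service's actual API; the real behavior is driven by the `triggers` section of the configuration.

```python
import random
import re

# Illustrative trigger table: regex pattern -> probability of responding
TRIGGERS = {
    r"\bsmeg\b": 0.5,      # respond to roughly half of matching messages
    r"\btoaster\b": 1.0,   # always respond
}

def should_respond(message: str, rng: random.Random) -> bool:
    """Return True if a trigger matches and its probability roll succeeds."""
    for pattern, probability in TRIGGERS.items():
        if re.search(pattern, message, re.IGNORECASE):
            return rng.random() < probability
    return False
```

Passing in the `random.Random` instance keeps the probability roll deterministic under test, which is also what makes a dry-run mode reproducible.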

Requirements

  • Python 3.10+
  • uv
  • NATS server
  • kryten-py library

Installation

Using uv

# Install dependencies
uv sync

# Copy example configuration
cp config.example.json config.json

# Edit configuration with your settings
# See config.example.json for all options

Using pip

pip install kryten-llm

Configuration

Configuration is stored in a JSON file. See config.example.json for a complete example.

Key Sections

Section         Description
nats            NATS connection settings
channels        CyTube channel connections
personality     Bot character and behavior
llm_providers   LLM API configurations
triggers        Trigger words with patterns and probabilities
rate_limits     Rate limiting rules
spam_detection  Spam detection settings
testing         Dry-run and logging options
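A minimal config.json touching each section might look like the sketch below. The field names are plausible guesses based on the section names above, not a verified schema; config.example.json is the authoritative reference.

```json
{
  "nats": { "servers": ["nats://localhost:4222"] },
  "channels": [{ "name": "my-channel" }],
  "personality": { "name": "Kryten" },
  "llm_providers": [{ "provider": "openrouter", "model": "example/model" }],
  "triggers": [{ "pattern": "smeg", "probability": 0.5, "enabled": true }],
  "rate_limits": { "per_user_per_minute": 2 },
  "spam_detection": { "enabled": true },
  "testing": { "dry_run": false }
}
```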

Environment Variables

Override configuration with environment variables:

export OPENROUTER_API_KEY="your-api-key"
export KRYTEN_LLM_DRY_RUN="true"
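One common way such overrides are layered is file first, environment second, so the environment wins. The sketch below shows that pattern for `KRYTEN_LLM_DRY_RUN`; the mapping from variable to config key is an assumption, not the service's documented precedence rules.

```python
import json
import os

def load_config(path: str) -> dict:
    """Load the JSON config, letting environment variables override file values."""
    with open(path) as f:
        config = json.load(f)
    # Hypothetical mapping from env var to config location
    if "KRYTEN_LLM_DRY_RUN" in os.environ:
        config.setdefault("testing", {})["dry_run"] = (
            os.environ["KRYTEN_LLM_DRY_RUN"].lower() == "true"
        )
    return config
```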

Usage

Running the Service

# Using uv
uv run kryten-llm --config config.json

# Direct Python execution
python -m kryten_llm --config config.json

Command Line Options

Option              Description
--config PATH       Path to configuration file (default: config.json)
--log-level LEVEL   Logging level: DEBUG, INFO, WARNING, ERROR
--dry-run           Generate responses but don't send to chat
--validate-config   Validate configuration and exit

Validation Mode

Validate your configuration without starting the service:

uv run kryten-llm --config config.json --validate-config

Dry-Run Mode

Test responses without sending to chat:

uv run kryten-llm --config config.json --dry-run

Hot-Reload (POSIX)

Reload configuration without restarting the service:

# Send SIGHUP to reload configuration
kill -HUP $(pgrep -f kryten_llm)

Safe changes that can be hot-reloaded:

  • Triggers (patterns, probabilities, enabled status)
  • Rate limits
  • Spam detection settings
  • Personality configuration
  • LLM provider settings

Unsafe changes (require restart):

  • NATS connection settings
  • Channel configuration
  • Service name
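The safe/unsafe split above can be implemented by re-reading the file on SIGHUP but only applying whitelisted keys. The class below is an illustrative sketch, not the service's internals:

```python
import json
import signal

# Keys that are safe to swap at runtime (mirrors the safe list above)
SAFE_KEYS = {"triggers", "rate_limits", "spam_detection",
             "personality", "llm_providers"}

class ConfigReloader:
    def __init__(self, path: str):
        self.path = path
        with open(path) as f:
            self.config = json.load(f)

    def reload(self, signum=None, frame=None):
        """SIGHUP handler: re-read the file but apply only safe keys."""
        with open(self.path) as f:
            fresh = json.load(f)
        for key in SAFE_KEYS & fresh.keys():
            self.config[key] = fresh[key]

    def install(self):
        # Register the handler for SIGHUP (POSIX only)
        signal.signal(signal.SIGHUP, self.reload)
```

Keys outside the whitelist (NATS settings, channels, service name) are simply ignored on reload, so changing them takes effect only after a full restart.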

Production Deployment

Systemd Service

Install the systemd service file:

# Copy service file
sudo cp kryten-llm.service /etc/systemd/system/

# Create config directory
sudo mkdir -p /etc/kryten-llm
sudo cp config.json /etc/kryten-llm/

# Create log directory
sudo mkdir -p /var/log/kryten-llm
sudo chown kryten:kryten /var/log/kryten-llm

# Enable and start service
sudo systemctl daemon-reload
sudo systemctl enable kryten-llm
sudo systemctl start kryten-llm

Service Management

# Check status
sudo systemctl status kryten-llm

# View logs
sudo journalctl -u kryten-llm -f

# Reload configuration
sudo systemctl reload kryten-llm

# Restart service
sudo systemctl restart kryten-llm

Development

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=kryten_llm

# Run specific test file
uv run pytest tests/test_trigger_engine.py -v

Code Quality

# Linting
uv run ruff check .

# Formatting
uv run black .

# Type checking
uv run mypy kryten_llm

Architecture

The service processes messages through a pipeline:

Chat Message
    ↓
MessageListener (parse/filter)
    ↓
TriggerEngine (detect triggers/mentions)
    ↓
RateLimiter (check rate limits)
    ↓
SpamDetector (check spam)
    ↓
ContextManager (gather context)
    ↓
PromptBuilder (build LLM prompt)
    ↓
LLMManager (call LLM API)
    ↓
ResponseValidator (validate response)
    ↓
ResponseFormatter (format for chat)
    ↓
Send to Chat
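The pipeline above can be pictured as a chain of stages, each of which may drop the message. The stage names mirror the diagram, but the implementation details below are a sketch, not the service's actual code:

```python
from typing import Callable, Optional

# A stage takes a message dict and returns it (possibly modified) or None to drop it
Stage = Callable[[dict], Optional[dict]]

def run_pipeline(message: dict, stages: list[Stage]) -> Optional[dict]:
    """Pass the message through each stage; a stage returning None drops it."""
    for stage in stages:
        message = stage(message)
        if message is None:
            return None
    return message

# Two toy stages standing in for TriggerEngine and RateLimiter
def trigger_engine(msg):
    return msg if "kryten" in msg["text"].lower() else None

def rate_limiter(msg):
    return msg if msg.get("user_msg_count", 0) < 5 else None
```

This shape keeps each concern (triggers, rate limits, spam, context) independently testable, and short-circuiting on None means later, more expensive stages (the LLM call) only run for messages that survive the cheap checks.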

Service Discovery

The service publishes lifecycle events via kryten-py's built-in ServiceConfig:

  • Startup event: When service connects to NATS
  • Heartbeat: Every 10s (configurable via heartbeat_interval_seconds) with health status
  • Shutdown event: When service stops gracefully

All lifecycle events use the kryten.lifecycle.<service>.<event> subject pattern:

  • kryten.lifecycle.llm.startup
  • kryten.lifecycle.llm.heartbeat
  • kryten.lifecycle.llm.shutdown
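A consumer watching these events can subscribe with the wildcard subject `kryten.lifecycle.llm.>` and split out the event type. The helper below is illustrative, not part of kryten-py:

```python
def parse_lifecycle_subject(subject: str) -> tuple[str, str]:
    """Split 'kryten.lifecycle.<service>.<event>' into (service, event)."""
    parts = subject.split(".")
    if len(parts) != 4 or parts[:2] != ["kryten", "lifecycle"]:
        raise ValueError(f"not a lifecycle subject: {subject}")
    return parts[2], parts[3]
```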

License

MIT License - see LICENSE for details.
