
Local LLM Made Simple - Consumer-friendly AI assistant

Project description

LMAPP

Local LLM CLI – AI everywhere. Easy, simple, and undeniable.
Online or offline. The future is yours to command.

License: MIT PyPI CI codecov Status

v0.4.0 - Agent Mode. Production Ready. Fully Featured. Free.

See Demo & Features for examples and use cases.


🚀 Quick Start

Full installation and setup: see QUICKSTART.md. Customize your AI's behavior: see Roles & Workflows Guide.

Everyday commands:

lmapp chat          # Start chatting locally
lmapp chat --agent  # Start in auto-Agent Mode (Copilot-like)
lmapp server start  # Start API server (Web App & VS Code)
lmapp status        # Check backend/model status

๐ŸŒ Web Interface (New in v0.4.0)

Access lmapp from your browser without installing the desktop app.

  1. Start the server: lmapp server start
  2. Open http://localhost:8000 in your browser.
  3. Chat, run workflows, and manage settings.
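The web UI talks to the same local server, so you can also hit it programmatically. The sketch below builds a chat request body; note that the endpoint path and payload schema shown here are assumptions for illustration, not the documented contract — see API_REFERENCE.md for the real one.

```python
import json

# Hypothetical client-side helper. BASE_URL matches the server started by
# `lmapp server start`; the "/api/chat" path and message schema are ASSUMED,
# not taken from lmapp's documentation.
BASE_URL = "http://localhost:8000"

def build_chat_request(message: str, model: str = "mistral") -> dict:
    """Build a JSON body for a hypothetical chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }

body = json.dumps(build_chat_request("Explain quantum computing"))
```

From here you would POST `body` to the server with any HTTP client once you have confirmed the actual endpoint.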

🎥 Demonstrations

Recommended: Check out our Feature Tour to see lmapp in action!

We offer two ways to explore lmapp:

  1. Feature Tour (Read) - A visual walkthrough of features, use cases, and the "first run" experience.
  2. Interactive Tour (Run) - A script you can run on your own machine to experience lmapp firsthand.

Explore real-world use cases including:

  • 🤖 auto-Agent Mode (Autonomous terminal & file operations)
  • 📝 Content Generation (Blogs, Emails, Code)
  • 🔍 Document Analysis (Summarization, Q&A)
  • 📊 Data Extraction (JSON from text)

🎯 Features

🤖 auto-Agent Mode (New in v0.4.0)

Turn your terminal into an autonomous coding assistant.

$ lmapp chat --agent
> Create a python script to calculate fibonacci
[AGENT] Creating fib.py...
[AGENT] Running fib.py...
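The transcript above doesn't show the fib.py the agent produces; a minimal version of such a script (an illustration, not the agent's actual output) might look like:

```python
def fib(n: int) -> int:
    """Return the n-th Fibonacci number (fib(0) = 0, fib(1) = 1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

if __name__ == "__main__":
    # First ten Fibonacci numbers
    print([fib(i) for i in range(10)])
```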

💬 Chat

$ lmapp chat --model mistral
โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—
โ•‘        Chat with Mistral (Local)           โ•‘
โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

You: Explain quantum computing in simple terms

AI: Quantum computers use quantum bits (qubits) instead of regular bits.
While regular bits are 0 or 1, qubits can be both at once (superposition).
This lets them solve certain problems exponentially faster...

You: What are the use cases?

AI: Key use cases include:
  • Drug discovery (molecular simulation)
  • Finance (portfolio optimization)
  • Cryptography (breaking encryption)
  • Machine learning (optimization)

VS Code Integration

Turn VS Code into an AI-powered IDE with our extension.

  1. Start the Server:
    lmapp server start
    
  2. Open Dashboard: Go to http://localhost:8000 to see status and chat.
  3. Install Extension: Install lmapp-vscode (coming soon to marketplace).
  4. Enjoy: Get inline code completions and chat directly in your editor.

🔍 RAG (Semantic Search)

$ lmapp rag index ~/my_docs
๐Ÿ“ Indexing documents...
โœ“ Processed: README.md (1,234 tokens)
โœ“ Processed: GUIDE.pdf (5,678 tokens)
โœ“ Processed: NOTES.txt (892 tokens)
โœ“ Index created: 7,804 tokens in 12 documents

$ lmapp rag search "how to optimize python code"
📊 Search Results (3 matches):

1. GUIDE.pdf - Line 45 (score: 0.92)
   "Optimization techniques include: list comprehensions,
    caching, and using built-in functions instead of loops"

2. NOTES.txt - Line 12 (score: 0.88)
   "Profile code with cProfile before optimizing"

3. README.md - Line 89 (score: 0.81)
   "Performance tips for production code"

$ lmapp chat --with-context
You: Summarize the best Python optimization tips from my docs

AI: Based on your documents, here are the key optimization tips:
  1. Use list comprehensions instead of loops
  2. Profile with cProfile before optimizing
  3. Leverage built-in functions (map, filter, etc.)
  4. Implement caching for expensive operations
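The relevance scores in the search output (0.92, 0.88, 0.81) are typically cosine similarities between embedding vectors. The toy version below computes a cosine over word-count vectors to show the idea; lmapp's actual RAG pipeline presumably uses dense embeddings, which is an assumption here.

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Toy relevance score: cosine between word-count vectors.
    Real semantic search replaces word counts with learned embeddings."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    norm = norm_a * norm_b
    return dot / norm if norm else 0.0

score = cosine_similarity("how to optimize python code",
                          "profile code with cProfile before optimizing")
```

Documents are then ranked by this score, highest first.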

📦 Batch Processing

$ lmapp batch create inputs.json
Processing 5 queries in batch...
[████████████████████] 100% (5/5)

Job created: batch_20250211_143022
Estimated time: 45 seconds

$ lmapp batch results batch_20250211_143022 --json
{
  "job_id": "batch_20250211_143022",
  "status": "completed",
  "results": [
    {"input": "Explain AI", "output": "AI is..."},
    {"input": "What is ML?", "output": "Machine learning..."},
    ...
  ],
  "completed_at": "2025-02-11T14:30:47Z"
}
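Judging from the results payload, inputs.json is a list of query strings; that shape is an assumption here (the schema `lmapp batch create` actually expects isn't documented in this README). Generating such a file is a one-liner:

```python
import json
import os
import tempfile

# ASSUMED inputs.json shape: a flat list of prompt strings,
# matching the "input" fields in the batch results above.
queries = ["Explain AI", "What is ML?", "Define RAG"]

path = os.path.join(tempfile.mkdtemp(), "inputs.json")
with open(path, "w") as f:
    json.dump(queries, f, indent=2)

with open(path) as f:
    loaded = json.load(f)
```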

🔌 Plugins

$ lmapp plugin list
Available Plugins:
  ✓ translator     - Real-time translation (8 languages)
  ✓ summarizer     - Extract key points from long text
  ✓ code-reviewer  - Analyze code and suggest improvements
  ✓ sql-generator  - Write SQL queries from descriptions
  ✓ regex-helper   - Build and test regex patterns
  ✓ json-validator - Validate and format JSON
  ✓ git-helper     - Explain git commands and operations
  ✓ api-tester     - Test REST APIs interactively

$ lmapp plugin install translator
Installing translator plugin...
✓ Downloaded (245 KB)
✓ Installed successfully
Ready to use: lmapp translate --help

$ lmapp translate --text "Hello World" --to spanish
Translation (Spanish):
"¡Hola Mundo!"

⚙️ Configuration

$ lmapp config show
Current Configuration:
  Model: mistral (7B)
  Temperature: 0.7
  Max Tokens: 2048
  Context Size: 4096
  System Prompt: You are a helpful AI assistant

$ lmapp config set temperature 0.3
✓ Configuration updated

$ lmapp config --set-prompt
Enter your custom system prompt:
> You are a Python expert. Help with code, explain concepts clearly.
✓ System prompt saved
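The temperature setting controls sampling randomness: logits are divided by the temperature before the softmax, so lower values (like the 0.3 set above) sharpen the distribution toward the most likely token, while higher values flatten it. A minimal sketch of the mechanism, not lmapp's internal code:

```python
import math

def softmax_with_temperature(logits, temperature=0.7):
    """Temperature-scaled softmax: lower temperature -> more deterministic
    sampling, higher temperature -> more varied output."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

p_focused = softmax_with_temperature([2.0, 1.0, 0.0], temperature=0.3)
p_default = softmax_with_temperature([2.0, 1.0, 0.0], temperature=0.7)
```

With the same logits, the top token gets noticeably more probability mass at temperature 0.3 than at 0.7.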

$ lmapp status
Status Report:
  ✓ Backend: Ollama (running)
  ✓ Model: mistral (7.4B)
  ✓ Memory: 6.2 GB / 16 GB
  ✓ Performance: 45 tokens/sec

💡 Who Is This For?

Perfect Fit

  • Developers - Code explanations, debugging, documentation, CLI workflows
  • Students & Researchers - Study partner, research assistance, offline-first
  • SysAdmins - Command lookups, automation scripts, system analysis
  • Professionals - Writing, analysis, research, note-taking
  • Privacy-Conscious Users - Want AI without cloud dependencies
  • Gamers - Quick tips, cheat codes, game help without searching websites
  • Travelers - Offline knowledge access (camping, flights, road trips, deployment)
  • Anyone who values control over convenience

🌟 Real-World Examples

Example 1: No Internet? No Problem!

Scenario: Camping, road trip, flight, deployment - anywhere without internet
Question: "Why does the sky look blue?"

$ lmapp chat
> AI: Hello, how may I help you?
> You: Why does the sky look blue?

> AI: Short answer: the sky is blue because air molecules scatter
     sunlight more strongly at short (blue) wavelengths than at long
     (red) wavelengths — a process called Rayleigh scattering.

     A bit more detail:
     • Sunlight contains all visible colors
     • Rayleigh scattering scales as 1/λ^4
     • Blue light (~450 nm) is scattered many times more than red (~650 nm)
     • At sunrise/sunset, blue light is scattered out → the sky looks red/orange
     • Clouds look white (Mie scattering off larger water droplets)

     Extra neat fact: scattered sky light is partially polarized!

> AI: Is there anything else I can help you with?

Key Point: Full scientific answers WITHOUT internet
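The 1/λ^4 scaling quoted in the answer can be sanity-checked with quick arithmetic:

```python
# Rayleigh scattering intensity scales as 1/wavelength^4, so the ratio of
# blue-light to red-light scattering is just (650/450)^4:
blue_nm, red_nm = 450, 650
ratio = (red_nm / blue_nm) ** 4
# Blue light is scattered roughly 4.4x more strongly than red.
```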


Example 2: Gaming Power User

Scenario: Playing GTA San Andreas, need cheat codes and tips

$ lmapp chat
> You: I'm playing GTA San Andreas, show me cheat codes and top 20 tips

> AI: Nice! Which platform? (PC, PS2/3/4/5, Xbox)
> You: PC

> AI: Here are PC cheat codes (type during gameplay):
     HESOYAM - Health, Armor, $250,000
     AEZAKMI - Never Wanted
     BAGUVIX - Infinite Health
     FULLCLIP - Infinite Ammo
     [20+ more codes...]

     Top 20 Tips & Tricks:
     1. Save often โ€” use multiple save slots
     2. Complete side jobs for cash/stat boosts (Ambulance, Firefighter)
     3. Grind stats: stamina, muscle, driving/flying skill
     4. Learn weapon ranges, aim for headshots
     5. Keep Armor on you at all times
     [15+ more detailed tips...]

> AI: Need anything else? Specific mission help?

Key Point: Skip endless website searches, get comprehensive game help instantly


📖 Basic Usage

# Start chat
lmapp chat

# Use specific model
lmapp chat --model mistral

# Check status
lmapp status

# View configuration
lmapp config show

Supported Backends: Ollama, llamafile (auto-detected). Extensible architecture supports custom backends.
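Backend auto-detection generally means probing for known runtimes on the machine. The sketch below shows one plausible approach (checking PATH for the backend binaries); lmapp's real detection logic may differ, e.g. by also probing Ollama's default port.

```python
import shutil

def detect_backend() -> str:
    """Guess which local LLM backend is available by probing PATH.
    Illustrative only -- NOT lmapp's actual implementation. A real
    detector might also check Ollama's default port (11434) or config."""
    for name in ("ollama", "llamafile"):
        if shutil.which(name):
            return name
    return "none"

backend = detect_backend()
```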

See QUICKSTART.md for complete usage guide.


✅ Quality & Features

  • 🧪 587 tests (100% coverage)
  • 🔒 100% private (no cloud, no tracking)
  • ⚡ Fast & lightweight (<200 ms startup)
  • 🔌 8 production plugins
  • 🔍 RAG system (semantic search)
  • 📦 Batch processing
  • 💾 Session persistence
  • 🌐 Web UI (optional)

๐Ÿ” Privacy & Security

  • 100% Local - Everything runs on your device
  • No Cloud - No internet after setup
  • No Telemetry - Zero tracking
  • Open Source - MIT licensed
  • Your Data - You own it all

🗺️ Roadmap

v0.4.x (Current) - Agent Mode, Web UI, production ready
v0.5.0+ - Mobile/desktop apps, team features, enterprise tier


🤝 Contributing

Help wanted! See Contributing Guide for code contributions, bug reports, or feature ideas.


All contributions welcome: bug fixes, features, documentation, tests, and ideas.



⚙️ Troubleshooting

Issue                 Solution
command not found     Add ~/.local/bin to $PATH, or use pipx install lmapp
ModuleNotFoundError   Reinstall: pip install --upgrade lmapp
Debian/Ubuntu issues  Use pipx install lmapp instead of pip

See Troubleshooting Guide for more.


โ“ FAQ

Q: How do I install?
pip install lmapp

Q: How do I update?
pip install --upgrade lmapp

Q: Can I use commercially?
Yes! MIT License allows it. See LICENSE.

Q: Does it collect data?
No. 100% local, no telemetry.

More questions? See Troubleshooting Guide.



📄 License

MIT License - See LICENSE file

This means:

  • ✅ Use commercially
  • ✅ Modify and distribute
  • ✅ Include in closed-source projects
  • ✅ Just include the license

Third-Party Licenses

  • Ollama: MIT License
  • llamafile: Apache 2.0 License
  • Pydantic: MIT License
  • Pytest: MIT License
  • AI Models: Various (see model documentation)

🙏 Built With

  • Ollama - LLM management platform
  • llamafile - Portable LLM runtime
  • Pydantic - Data validation
  • Pytest - Testing framework
  • Meta, Mistral, and other amazing AI model creators

⭐ Show Your Support

If lmapp helps you, please:

  • โญ Star this repository
  • ๐Ÿ› Report bugs and suggest features
  • ๐Ÿ“ข Share with friends and colleagues
  • ๐Ÿค Contribute improvements
  • ๐Ÿ“ Share your use cases

📞 Get Started Now

pip install lmapp
lmapp chat

📖 Documentation Map

Document               Purpose
QUICKSTART.md          5-minute setup guide ⭐ Start here
docs/installation.md   Installation methods for all platforms
docs/CONFIGURATION.md  Configuration, environment, and settings
docs/development.md    Developer workflow and tips
TROUBLESHOOTING.md     Solutions for common issues
SECURITY.md            Security policy and vulnerability reporting
CHANGELOG.md           Release history
CONTRIBUTING.md        Contribution guidelines
CODE_OF_CONDUCT.md     Community standards
LICENSE                License terms
DEMO.md                Live examples and feature tour
API_REFERENCE.md       Lightweight CLI + HTTP API reference


Welcome to the future of local AI. 🚀 This is the way...


Download files

Download the file for your platform.

Source Distribution

lmapp-0.4.1.tar.gz (247.1 kB)

Uploaded Source

Built Distribution


lmapp-0.4.1-py3-none-any.whl (212.8 kB)

Uploaded Python 3

File details

Details for the file lmapp-0.4.1.tar.gz.

File metadata

  • Download URL: lmapp-0.4.1.tar.gz
  • Upload date:
  • Size: 247.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for lmapp-0.4.1.tar.gz
Algorithm Hash digest
SHA256 8fa455a2637aabea4eac6fb6a6a631e595f91468e45f82a06e049185ce518416
MD5 1b36dc7e73b4f740d10a88703115d490
BLAKE2b-256 243fb8419d25557556f15dfd951dc6f8367d47eac2b881f0bd49ad2d9db377f1


Provenance

The following attestation bundles were made for lmapp-0.4.1.tar.gz:

Publisher: publish.yml on nabaznyl/lmapp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file lmapp-0.4.1-py3-none-any.whl.

File metadata

  • Download URL: lmapp-0.4.1-py3-none-any.whl
  • Upload date:
  • Size: 212.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for lmapp-0.4.1-py3-none-any.whl
Algorithm Hash digest
SHA256 4c7c87fd05296494ffcdd80e42be4e58c3e20f32c773c3ba391c9ab057c57f19
MD5 4b0841243ef41c9b3e479dd35150f025
BLAKE2b-256 7a2acb1d0ecc551e5fcb6764f86a7887d20e1f97e16689b4149d5c7f9f0643da


Provenance

The following attestation bundles were made for lmapp-0.4.1-py3-none-any.whl:

Publisher: publish.yml on nabaznyl/lmapp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
