Local LLM Made Simple - Consumer-friendly AI assistant
LMAPP
Local LLM CLI: AI everywhere. Easy, simple, and undeniable.
Online or offline. The future is yours to command.
v0.4.0 - Agent Mode. Production Ready. Fully Featured. Free.
See Demo & Features for examples and use cases.
Quick Start
Full installation and setup: see QUICKSTART.md. Customize your AI's behavior: see Roles & Workflows Guide.
Everyday commands:
lmapp chat # Start chatting locally
lmapp chat --agent # Start in auto-Agent Mode (Copilot-like)
lmapp server start # Start API server (Web App & VS Code)
lmapp status # Check backend/model status
Web Interface (New in v0.4.0)
Access lmapp from your browser without installing the desktop app.
- Start the server: lmapp server start
- Open http://localhost:8000 in your browser.
- Chat, run workflows, and manage settings.
Demonstrations
Recommended: Check out our Feature Tour to see lmapp in action!
We offer two ways to explore lmapp:
- Feature Tour (Read) - A visual walkthrough of features, use cases, and the "first run" experience.
- Interactive Tour (Run) - A script you can run on your own machine to experience lmapp firsthand.
Explore real-world use cases including:
- auto-Agent Mode (Autonomous terminal & file operations)
- Content Generation (Blogs, Emails, Code)
- Document Analysis (Summarization, Q&A)
- Data Extraction (JSON from text)
Features
auto-Agent Mode (New in v0.4.0)
Turn your terminal into an autonomous coding assistant.
$ lmapp chat --agent
> Create a python script to calculate fibonacci
[AGENT] Creating fib.py...
[AGENT] Running fib.py...
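The fib.py the agent writes will vary with the model, but it might look something like this minimal sketch (illustrative only; not a guaranteed agent output):

```python
# fib.py - the kind of script the agent might generate (illustrative).
def fibonacci(n: int) -> int:
    """Return the n-th Fibonacci number, with fib(0) = 0 and fib(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

if __name__ == "__main__":
    # Print the first ten Fibonacci numbers.
    print([fibonacci(i) for i in range(10)])
```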
Chat
$ lmapp chat --model mistral
┌────────────────────────────────────────────┐
│ Chat with Mistral (Local)                  │
└────────────────────────────────────────────┘
You: Explain quantum computing in simple terms
AI: Quantum computers use quantum bits (qubits) instead of regular bits.
While regular bits are 0 or 1, qubits can be both at once (superposition).
This lets them solve certain problems exponentially faster...
You: What are the use cases?
AI: Key use cases include:
• Drug discovery (molecular simulation)
• Finance (portfolio optimization)
• Cryptography (breaking encryption)
• Machine learning (optimization)
VS Code Integration
Turn VS Code into an AI-powered IDE with our extension.
- Start the server: lmapp server start
- Open the dashboard: go to http://localhost:8000 to see status and chat.
- Install the extension: lmapp-vscode (coming soon to the marketplace).
- Enjoy: get inline code completions and chat directly in your editor.
RAG (Semantic Search)
$ lmapp rag index ~/my_docs
Indexing documents...
✓ Processed: README.md (1,234 tokens)
✓ Processed: GUIDE.pdf (5,678 tokens)
✓ Processed: NOTES.txt (892 tokens)
✓ Index created: 7,804 tokens in 12 documents
$ lmapp rag search "how to optimize python code"
Search Results (3 matches):
1. GUIDE.pdf - Line 45 (score: 0.92)
"Optimization techniques include: list comprehensions,
caching, and using built-in functions instead of loops"
2. NOTES.txt - Line 12 (score: 0.88)
"Profile code with cProfile before optimizing"
3. README.md - Line 89 (score: 0.81)
"Performance tips for production code"
$ lmapp chat --with-context
You: Summarize the best Python optimization tips from my docs
AI: Based on your documents, here are the key optimization tips:
1. Use list comprehensions instead of loops
2. Profile with cProfile before optimizing
3. Leverage built-in functions (map, filter, etc.)
4. Implement caching for expensive operations
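The relevance scores in the search output come from semantic similarity between the query and document chunks. The scoring idea can be sketched with plain cosine similarity over term-count vectors; this is a toy stand-in for real embeddings, not lmapp's actual implementation:

```python
import math
from collections import Counter

def cosine_score(query: str, doc: str) -> float:
    """Cosine similarity between term-count vectors (toy embedding stand-in)."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    dot = sum(q[t] * d[t] for t in set(q) & set(d))
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return dot / norm if norm else 0.0

# Toy corpus echoing the indexed files from the example above.
docs = {
    "GUIDE.pdf": "optimization techniques include list comprehensions caching",
    "NOTES.txt": "profile code with cProfile before optimizing",
    "README.md": "performance tips for production code",
}
query = "how to optimize python code"
ranked = sorted(docs, key=lambda name: cosine_score(query, docs[name]),
                reverse=True)
```

A real RAG index replaces the word-count vectors with dense embeddings, which is what lets "optimize" match "optimization techniques".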
Batch Processing
$ lmapp batch create inputs.json
Processing 5 queries in batch...
[████████████████████] 100% (5/5)
Job created: batch_20250211_143022
Estimated time: 45 seconds
$ lmapp batch results batch_20250211_143022 --json
{
"job_id": "batch_20250211_143022",
"status": "completed",
"results": [
{"input": "Explain AI", "output": "AI is..."},
{"input": "What is ML?", "output": "Machine learning..."},
...
],
"completed_at": "2025-02-11T14:30:47Z"
}
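Because `--json` emits plain JSON, batch results are easy to consume programmatically. A minimal sketch, with the field names taken from the sample output above:

```python
import json

# Sample payload mirroring the `lmapp batch results --json` output above.
sample = """{
  "job_id": "batch_20250211_143022",
  "status": "completed",
  "results": [
    {"input": "Explain AI", "output": "AI is..."},
    {"input": "What is ML?", "output": "Machine learning..."}
  ],
  "completed_at": "2025-02-11T14:30:47Z"
}"""

job = json.loads(sample)
answers = {}
if job["status"] == "completed":
    # Map each batch input to its generated output.
    answers = {r["input"]: r["output"] for r in job["results"]}
```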
Plugins
$ lmapp plugin list
Available Plugins:
✓ translator - Real-time translation (8 languages)
✓ summarizer - Extract key points from long text
✓ code-reviewer - Analyze code and suggest improvements
✓ sql-generator - Write SQL queries from descriptions
✓ regex-helper - Build and test regex patterns
✓ json-validator - Validate and format JSON
✓ git-helper - Explain git commands and operations
✓ api-tester - Test REST APIs interactively
$ lmapp plugin install translator
Installing translator plugin...
✓ Downloaded (245 KB)
✓ Installed successfully
Ready to use: lmapp translate --help
$ lmapp translate --text "Hello World" --to spanish
Translation (Spanish):
"¡Hola Mundo!"
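Plugin systems like the one above typically amount to a name-to-handler registry that the CLI dispatches into. Here is a generic sketch of that pattern (illustrative only; the `register`/`run_plugin` names are assumptions, not lmapp's actual internals):

```python
from typing import Callable, Dict

# Registry mapping plugin names to text-handling functions.
PLUGINS: Dict[str, Callable[[str], str]] = {}

def register(name: str):
    """Decorator that installs a handler under the given plugin name."""
    def wrap(fn: Callable[[str], str]):
        PLUGINS[name] = fn
        return fn
    return wrap

@register("summarizer")
def summarize(text: str) -> str:
    # Toy handler: a real plugin would prompt the local model instead.
    return text.split(".")[0] + "."

def run_plugin(name: str, text: str) -> str:
    """Dispatch to an installed plugin, failing loudly if it is missing."""
    if name not in PLUGINS:
        raise KeyError(f"plugin not installed: {name}")
    return PLUGINS[name](text)
```

The dictionary-dispatch design is what makes `lmapp plugin install` cheap: installing a plugin just adds one more entry to the registry.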
Configuration
$ lmapp config show
Current Configuration:
Model: mistral (7B)
Temperature: 0.7
Max Tokens: 2048
Context Size: 4096
System Prompt: You are a helpful AI assistant
$ lmapp config set temperature 0.3
✓ Configuration updated
$ lmapp config --set-prompt
Enter your custom system prompt:
> You are a Python expert. Help with code, explain concepts clearly.
✓ System prompt saved
$ lmapp status
Status Report:
✓ Backend: Ollama (running)
✓ Model: mistral (7.4B)
✓ Memory: 6.2 GB / 16 GB
✓ Performance: 45 tokens/sec
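Settings like those shown by `lmapp config show` usually persist to a small on-disk file. A hedged sketch of that load/merge/save cycle (the path and schema here are assumptions for illustration, not lmapp's documented layout):

```python
import json
from pathlib import Path

# Assumed location - `lmapp config show` is the real source of truth.
CONFIG_PATH = Path.home() / ".config" / "lmapp" / "config.json"

# Defaults mirroring the values shown in the transcript above.
DEFAULTS = {"model": "mistral", "temperature": 0.7, "max_tokens": 2048}

def load_config(path: Path = CONFIG_PATH) -> dict:
    """Merge saved settings over defaults; a missing file yields defaults."""
    cfg = dict(DEFAULTS)
    if path.exists():
        cfg.update(json.loads(path.read_text()))
    return cfg

def set_option(key: str, value, path: Path = CONFIG_PATH) -> dict:
    """Equivalent of `lmapp config set KEY VALUE` in this toy model."""
    cfg = load_config(path)
    cfg[key] = value
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(cfg, indent=2))
    return cfg
```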
Who Is This For?
Perfect Fit
- Developers - Code explanations, debugging, documentation, CLI workflows
- Students & Researchers - Study partner, research assistance, offline-first
- SysAdmins - Command lookups, automation scripts, system analysis
- Professionals - Writing, analysis, research, note-taking
- Privacy-Conscious Users - Want AI without cloud dependencies
- Gamers - Quick tips, cheat codes, game help without searching websites
- Travelers - Offline knowledge access (camping, flights, road trips, deployment)
- Anyone who values control over convenience
Real-World Examples
Example 1: No Internet? No Problem!
Scenario: Camping, road trip, flight, deployment - anywhere without internet
Question: "Why does the sky look blue?"
$ lmapp chat
> AI: Hello, how may I help you?
> You: Why does the sky look blue?
> AI: Short answer: the sky is blue because air molecules scatter
sunlight more strongly at short (blue) wavelengths than at long
(red) wavelengths, a process called Rayleigh scattering.
A bit more detail:
• Sunlight contains all visible colors
• Rayleigh scattering scales as 1/λ^4
• Blue light (~450 nm) is scattered many times more than red (~650 nm)
• At sunrise/sunset, blue light is scattered out, so the sky looks red/orange
• Clouds look white (Mie scattering off larger water droplets)
Extra neat fact: scattered sky light is partially polarized!
> AI: Is there anything else I can help you with?
Key Point: Full scientific answers WITHOUT internet
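The 1/λ^4 scaling quoted in the answer above pins down the blue-to-red ratio directly:

```latex
\frac{I_{\text{blue}}}{I_{\text{red}}}
  = \left(\frac{\lambda_{\text{red}}}{\lambda_{\text{blue}}}\right)^{4}
  = \left(\frac{650\,\text{nm}}{450\,\text{nm}}\right)^{4}
  \approx 4.4
```

so blue light is scattered roughly four to five times more strongly than red, which is why the clear daytime sky looks blue rather than white.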
Example 2: Gaming Power User
Scenario: Playing GTA San Andreas, need cheat codes and tips
$ lmapp chat
> You: I'm playing GTA San Andreas, show me cheat codes and top 20 tips
> AI: Nice! Which platform? (PC, PS2/3/4/5, Xbox)
> You: PC
> AI: Here are PC cheat codes (type during gameplay):
HESOYAM - Health, Armor, $250,000
AEZAKMI - Never Wanted
BAGUVIX - Infinite Health
FULLCLIP - Infinite Ammo
[20+ more codes...]
Top 20 Tips & Tricks:
1. Save often โ use multiple save slots
2. Complete side jobs for cash/stat boosts (Ambulance, Firefighter)
3. Grind stats: stamina, muscle, driving/flying skill
4. Learn weapon ranges, aim for headshots
5. Keep Armor on you at all times
[15+ more detailed tips...]
> AI: Need anything else? Specific mission help?
Key Point: Skip endless website searches, get comprehensive game help instantly
Basic Usage
# Start chat
lmapp chat
# Use specific model
lmapp chat --model mistral
# Check status
lmapp status
# View configuration
lmapp config show
Supported Backends: Ollama, llamafile (auto-detected). Extensible architecture supports custom backends.
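Backend auto-detection of this kind usually amounts to probing for the runtime executables on PATH. An illustrative sketch (not lmapp's actual detection code):

```python
import shutil

# Runtimes named in the docs; lmapp's real list may differ.
KNOWN_BACKENDS = ["ollama", "llamafile"]

def detect_backends() -> list:
    """Return the known backend executables found on PATH, in priority order."""
    return [name for name in KNOWN_BACKENDS if shutil.which(name)]
```

`shutil.which` returns the full path of an executable or `None`, so the list comprehension keeps only the backends actually installed on this machine.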
See QUICKSTART.md for complete usage guide.
Quality & Features
- 587 tests (100% coverage)
- 100% private (no cloud, no tracking)
- Fast & lightweight (<200ms startup)
- 8 production plugins
- RAG system (semantic search)
- Batch processing
- Session persistence
- Web UI (optional)
Privacy & Security
- 100% Local - Everything runs on your device
- No Cloud - No internet after setup
- No Telemetry - Zero tracking
- Open Source - MIT licensed
- Your Data - You own it all
Roadmap
v0.4.x (Current) - Agent Mode, Web UI, production ready
Upcoming - Mobile/desktop apps, team features, enterprise tier
Contributing
Help wanted! See the Contributing Guide to get started. All contributions are welcome: bug fixes, features, documentation, tests, and ideas.
Support
- Found a bug? Open an Issue
- Questions? See Troubleshooting Guide
- Discussions? Use GitHub Discussions
Troubleshooting
| Issue | Solution |
|---|---|
| command not found | Add ~/.local/bin to your $PATH, or install with pipx install lmapp |
| ModuleNotFoundError | Reinstall: pip install --upgrade lmapp |
| Debian/Ubuntu issues | Use pipx install lmapp instead of pip |
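The `command not found` fix above boils down to putting pip's user-level script directory on your PATH. For bash or zsh, something like:

```shell
# pip installs user-level console scripts (including lmapp) into
# ~/.local/bin; prepend it to PATH for the current shell session:
export PATH="$HOME/.local/bin:$PATH"
# To make this permanent, add the line above to ~/.bashrc or ~/.zshrc.
```

pipx handles this for you, which is why it is the recommended route on Debian/Ubuntu.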
See Troubleshooting Guide for more.
FAQ
Q: How do I install?
pip install lmapp
Q: How do I update?
pip install --upgrade lmapp
Q: Can I use commercially?
Yes! MIT License allows it. See LICENSE.
Q: Does it collect data?
No. 100% local, no telemetry.
More questions? See Troubleshooting Guide.
Documentation
License
MIT License - See LICENSE file
This means:
- ✓ Use commercially
- ✓ Modify and distribute
- ✓ Include in closed-source projects
- ✓ Just include the license
Third-Party Licenses
- Ollama: MIT License
- llamafile: Apache 2.0 License
- Pydantic: MIT License
- Pytest: MIT License
- AI Models: Various (see model documentation)
Built With
- Ollama - LLM management platform
- llamafile - Portable LLM runtime
- Pydantic - Data validation
- Pytest - Testing framework
- Meta, Mistral, and other amazing AI model creators
Show Your Support
If lmapp helps you, please:
- Star this repository
- Report bugs and suggest features
- Share with friends and colleagues
- Contribute improvements
- Share your use cases
Get Started Now
pip install lmapp
lmapp chat
Documentation Map
| Document | Purpose |
|---|---|
| QUICKSTART.md | 5-minute setup guide (start here) |
| docs/installation.md | Installation methods for all platforms |
| docs/CONFIGURATION.md | Configuration, environment, and settings |
| docs/development.md | Developer workflow and tips |
| TROUBLESHOOTING.md | Solutions for common issues |
| SECURITY.md | Security policy and vulnerability reporting |
| CHANGELOG.md | Release history |
| CONTRIBUTING.md | Contribution guidelines |
| CODE_OF_CONDUCT.md | Community standards |
| LICENSE | License terms |
| DEMO.md | Live examples and feature tour |
| API_REFERENCE.md | Lightweight CLI + HTTP API reference |
Additional references:
- docs/ERROR_DATABASE.md - Known errors and fixes
Welcome to the future of local AI. This is the way...
File details
Details for the file lmapp-0.4.1.tar.gz.
File metadata
- Download URL: lmapp-0.4.1.tar.gz
- Upload date:
- Size: 247.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8fa455a2637aabea4eac6fb6a6a631e595f91468e45f82a06e049185ce518416 |
| MD5 | 1b36dc7e73b4f740d10a88703115d490 |
| BLAKE2b-256 | 243fb8419d25557556f15dfd951dc6f8367d47eac2b881f0bd49ad2d9db377f1 |
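You can verify a downloaded archive against the published SHA256 digest with nothing but stdlib Python (digest copied from the table above; the filename assumes you downloaded the sdist into the current directory):

```python
import hashlib

# SHA256 published on the package page for lmapp-0.4.1.tar.gz.
EXPECTED_SHA256 = "8fa455a2637aabea4eac6fb6a6a631e595f91468e45f82a06e049185ce518416"

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large archives never load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Usage (after downloading the sdist):
#   assert sha256_of("lmapp-0.4.1.tar.gz") == EXPECTED_SHA256
```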
Provenance
The following attestation bundles were made for lmapp-0.4.1.tar.gz:
Publisher: publish.yml on nabaznyl/lmapp
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: lmapp-0.4.1.tar.gz
- Subject digest: 8fa455a2637aabea4eac6fb6a6a631e595f91468e45f82a06e049185ce518416
- Sigstore transparency entry: 768880152
- Sigstore integration time:
- Permalink: nabaznyl/lmapp@3de6b8dbc2df4762de9f5810dbae93cf5bded766
- Branch / Tag: refs/tags/v0.4.1
- Owner: https://github.com/nabaznyl
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@3de6b8dbc2df4762de9f5810dbae93cf5bded766
- Trigger Event: push
File details
Details for the file lmapp-0.4.1-py3-none-any.whl.
File metadata
- Download URL: lmapp-0.4.1-py3-none-any.whl
- Upload date:
- Size: 212.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4c7c87fd05296494ffcdd80e42be4e58c3e20f32c773c3ba391c9ab057c57f19 |
| MD5 | 4b0841243ef41c9b3e479dd35150f025 |
| BLAKE2b-256 | 7a2acb1d0ecc551e5fcb6764f86a7887d20e1f97e16689b4149d5c7f9f0643da |
Provenance
The following attestation bundles were made for lmapp-0.4.1-py3-none-any.whl:
Publisher: publish.yml on nabaznyl/lmapp
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: lmapp-0.4.1-py3-none-any.whl
- Subject digest: 4c7c87fd05296494ffcdd80e42be4e58c3e20f32c773c3ba391c9ab057c57f19
- Sigstore transparency entry: 768880161
- Sigstore integration time:
- Permalink: nabaznyl/lmapp@3de6b8dbc2df4762de9f5810dbae93cf5bded766
- Branch / Tag: refs/tags/v0.4.1
- Owner: https://github.com/nabaznyl
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@3de6b8dbc2df4762de9f5810dbae93cf5bded766
- Trigger Event: push