llmtop
llmtop is an intelligent system monitoring tool that combines real-time system metrics with LLM-powered insights. It provides a dynamic terminal interface showing system performance metrics enhanced with AI-driven analysis.
Note: This project is currently in beta testing. Features and interfaces may change.
- Real-time system metrics monitoring (CPU, Memory, Disk, Network)
- Process monitoring with resource usage
- AI-powered system analysis using either OpenAI or Ollama
- Smart alerting system for resource thresholds
- Dynamic terminal UI with auto-updating metrics
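The threshold alerting described above can be pictured as simple limit checks over sampled metrics. The sketch below is illustrative only; the metric names and threshold values are assumptions for demonstration, not llmtop's actual internals:

```python
# Illustrative sketch of threshold-based alerting, NOT llmtop's actual code.
# Metric names and threshold values are assumptions for demonstration.

def check_thresholds(metrics, thresholds):
    """Return a list of alert strings for any metric at or above its threshold."""
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value >= limit:
            alerts.append(f"{name} at {value:.0f}% (threshold {limit:.0f}%)")
    return alerts

sample = {"cpu_percent": 92.0, "memory_percent": 61.0, "disk_percent": 97.0}
limits = {"cpu_percent": 90.0, "memory_percent": 85.0, "disk_percent": 95.0}
print(check_thresholds(sample, limits))
```

In llmtop itself, alerts like these would feed both the terminal display and the LLM analysis context.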
Install directly from PyPI:
pip install llmtop
llmtop supports two LLM backends for system analysis:
- Set your OpenAI API key:
export OPENAI_API_KEY='your-api-key-here'
- Run with OpenAI:
llmtop --use-openai
- Install Ollama from ollama.ai
- Start the Ollama service
- Run llmtop:
llmtop
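Before launching, you can confirm the Ollama service is actually reachable on its default port (11434). This probe is a convenience sketch, not part of llmtop:

```python
# Convenience sketch (not part of llmtop): check whether the local Ollama
# service is reachable on its default port (11434) before launching llmtop.
import urllib.request
import urllib.error

def ollama_running(url="http://127.0.0.1:11434", timeout=2):
    """Return True if the Ollama HTTP endpoint responds, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

if not ollama_running():
    print("Ollama does not appear to be running; start it with `ollama serve`.")
```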
llmtop [OPTIONS]
Options:
--update-frequency INTEGER Update frequency in seconds (default: 5)
--use-openai Use OpenAI instead of local model
--history-length INTEGER Number of historical data points to keep (default: 60)
--help Show this message and exit
- The tool defaults to Ollama for analysis, which is free but requires a local installation
- OpenAI mode provides more detailed analysis but requires an API key and has associated costs
- Adjust update frequency based on your system's performance and monitoring needs
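The interaction between `--update-frequency` and `--history-length` can be pictured as a fixed-size rolling buffer: at the default 5-second interval, 60 samples cover the last five minutes. A minimal sketch of that structure (an assumption for illustration, not llmtop's internals):

```python
# Illustrative rolling-history buffer (assumed structure, not llmtop's code).
from collections import deque

HISTORY_LENGTH = 60    # --history-length: number of samples kept
UPDATE_FREQUENCY = 5   # --update-frequency: seconds between samples

history = deque(maxlen=HISTORY_LENGTH)  # oldest samples drop off automatically

for tick in range(100):                 # pretend we sampled 100 times
    history.append({"tick": tick, "cpu_percent": 50.0})

window_seconds = len(history) * UPDATE_FREQUENCY
print(len(history), window_seconds)     # buffer caps at 60 samples = 300 s
```

Raising `--update-frequency` shrinks the covered time window unless you raise `--history-length` to match.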
- Experimental support for Windows systems
- Update frequency might need adjustment on slower systems
- Some process names may be truncated in the display
This project is in beta, and we welcome contributions! Please feel free to:
- Report bugs
- Suggest features
- Submit pull requests
MIT License - see LICENSE file for details.
Built using:
- Rich for terminal UI
- OpenAI/Ollama for LLM integration
- psutil for system metrics