Promptery is a cross-platform (Linux, macOS, Windows) frontend for Ollama, designed to provide flexible prompt and context settings during chat interactions with AI models.
- Multiple Chat Management: Switch between multiple chat sessions effortlessly.
- System Prompts: Configure prompts to guide the AI's behavior.
- Content Pages: Manage content pages for referencing during conversations.
- Decorator Prompts: Customize how the AI presents its responses.
- External Files Integration: Incorporate external text files into chats.
- Dynamic Model Switching: Change models mid-conversation without losing context.
- Context Customization: Tailor conversation context by enabling, disabling, or rerunning former messages.
Planned features include:
- Image upload
- Multi-step queries
- Additional backends
- Tool usage
- Markdown rendering
- Writing external files
Promptery is a hobby project in its early stages. It is not ready for production use and may lose stored data. Expect bugs and incomplete error handling. Save and back up regularly. Currently, all user data is stored in a settings directory at your platform's standard location for such files.
Prebuilt releases for macOS (Apple Silicon) and Windows are available on GitHub.
A C++20 development environment with CMake and Qt6 is required to build Promptery.
git clone https://github.com/promptery/promptery
cd promptery
mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . # -j <job count> for parallel build
Ensure Ollama is reachable and serving at least one model. Start Promptery afterwards.
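To verify that Ollama is up before launching, you can query its HTTP API; the `/api/tags` endpoint lists the locally installed models. A minimal standalone check, assuming Ollama's default address `http://localhost:11434` (adjust the host if your setup differs):

```python
import json
import urllib.request
import urllib.error

def ollama_models(host="http://localhost:11434"):
    """Return the list of model names served by Ollama, or None if unreachable."""
    try:
        # /api/tags returns {"models": [{"name": ...}, ...]}
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    print(ollama_models())
```

If this prints `None`, start the Ollama server (`ollama serve`) and pull at least one model (e.g. `ollama pull <model>`) before launching Promptery.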
Community contributions are welcome, especially in testing and packaging. Contribute via issues and pull requests.
Promptery is licensed under GPL-3.0.