---
inclusion: always
---
# Technology Stack

## Framework & Environment
- Platform: Ignis desktop environment (Python-based GTK4 framework)
- Python Version: 3.10+
- UI Framework: GTK4 via Ignis widgets
- Async/Threading: GLib for main loop, Python threading for background tasks
## Key Dependencies
- `ignis` - Desktop environment framework providing widgets and window management
- `ollama` - Python package for Ollama API integration
- GTK4 / GLib (via `gi.repository`) - UI toolkit and event loop
## Architecture Patterns

### Widget System
- Uses Ignis widget abstractions (`widgets.Box`, `widgets.RevealerWindow`, etc.)
- Material Design 3 styling via CSS classes
- Revealer-based slide animations
### API Communication
- Direct HTTP calls to Ollama REST API (no external HTTP library)
- Uses `urllib.request` for HTTP operations
- Timeout handling: 2s for health checks, 5s for model lists, 120s for chat
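The health check can be sketched with the standard library alone. The endpoint and the 2-second timeout come from the notes above; the function name is illustrative, not the module's actual API:

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://127.0.0.1:11434"

def ollama_available(timeout: float = 2.0) -> bool:
    """Return True if the Ollama server answers its /api/tags endpoint."""
    try:
        with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, TimeoutError):
        # Server not running or unreachable: report unavailable
        # instead of raising, so the UI can degrade gracefully.
        return False
```

The same pattern extends to the model-list and chat calls, with the longer timeouts noted above.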
### State Management
- Conversation persistence via JSON files
- Atomic file writes using `tempfile` and `os.replace()`
- In-memory caching for model lists
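The atomic-write pattern looks roughly like this (function name and JSON layout are illustrative): write to a temporary file in the same directory, then rename it over the target, so a crash mid-write never leaves a truncated conversation file.

```python
import json
import os
import tempfile

def save_conversation(path: str, messages: list) -> None:
    """Persist messages as JSON atomically via tempfile + os.replace()."""
    directory = os.path.dirname(os.path.abspath(path))
    # Temp file must be on the same filesystem for the rename to be atomic.
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(messages, f, ensure_ascii=False, indent=2)
        os.replace(tmp_path, path)  # atomic on POSIX and Windows
    except BaseException:
        os.unlink(tmp_path)  # don't leave stray temp files behind
        raise
```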
### Threading Model
- UI operations on GLib main thread
- AI requests in background daemon threads
- `GLib.idle_add()` for thread-safe UI updates
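The hand-off works like this minimal stdlib analog. In the real module the result is marshalled back with `GLib.idle_add()`; here a queue drained by the "main loop" stands in for it, and the names are illustrative:

```python
import queue
import threading

# Stand-in for the GLib main loop's pending-callback queue.
ui_queue: "queue.Queue[str]" = queue.Queue()

def request_reply(prompt: str) -> None:
    """Run the slow AI call in a background daemon thread."""
    def worker():
        reply = f"echo: {prompt}"  # placeholder for the blocking Ollama call
        ui_queue.put(reply)        # with GLib: GLib.idle_add(update_ui, reply)
    threading.Thread(target=worker, daemon=True).start()

def drain_ui_queue() -> list:
    """Called on the main thread; applies any pending UI updates."""
    updates = []
    while not ui_queue.empty():
        updates.append(ui_queue.get())
    return updates
```

Keeping all widget mutation on the main thread is what makes this safe; the background thread only ever touches the queue.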
### Error Handling
- Graceful degradation when Ollama is unavailable
- Availability monitoring with 30-second polling interval
- User-facing error messages instead of exceptions
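One way to enforce the "messages, not exceptions" rule is a small wrapper at the API boundary; the helper name and message text below are assumptions, not the module's actual code:

```python
import urllib.error

FRIENDLY_ERROR = "Ollama is not reachable - is the server running?"

def run_safely(request_fn):
    """Call an API function and return (ok, payload_or_message),
    so network failures surface as text instead of tracebacks."""
    try:
        return True, request_fn()
    except (urllib.error.URLError, TimeoutError, ConnectionError):
        return False, FRIENDLY_ERROR
```

The availability monitor can then flip the UI into a degraded state whenever a poll (every 30 seconds) comes back `(False, ...)`.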
## Common Commands
Since this is an Ignis module, there are no build/test commands. The module is loaded directly by Ignis:
```sh
# Reload Ignis to apply changes
ignis reload

# Run Ignis with console output for debugging
ignis

# Check Ollama status
curl http://127.0.0.1:11434/api/tags

# List installed Ollama models
ollama list
```