From 55aad289bc6313fd78e1664c1fac14ad2fe4190e Mon Sep 17 00:00:00 2001
From: Melvin Ragusa
Date: Sun, 26 Oct 2025 10:05:33 +0100
Subject: [PATCH] docs(readme): Comprehensive update to AI Sidebar documentation

- Completely restructure README with detailed project overview
- Add comprehensive installation and usage instructions
- Include architecture diagram and component descriptions
- Expand on features, requirements, and configuration options
- Provide clear examples for keyboard shortcuts and slash commands
- Improve formatting and readability of technical documentation
- Describe core system components and their responsibilities
- Add details about conversation management and reasoning mode
---
 README.md | 318 ++++++++++++++++++++++++++++++++++++++----------------
 1 file changed, 224 insertions(+), 94 deletions(-)

diff --git a/README.md b/README.md
index 169044e..4762b2e 100644
--- a/README.md
+++ b/README.md
@@ -1,124 +1,254 @@
-## AI Sidebar for Ignis
+# AI Sidebar for Ignis
 
-A sleek AI chat sidebar that integrates with your Ignis desktop, sliding in from the left side with Ollama AI integration.
+A local AI chat interface for the Ignis desktop environment with streaming responses, conversation management, and reasoning mode support.
 
-### Features
+## Features
 
-- **Slide-in Animation**: Smoothly slides in from the left side (opposite of QuickCenter)
-- **Ollama Integration**: Chat with local AI models via Ollama
-- **Conversation Persistence**: Your conversations are automatically saved and restored
-- **Material Design 3**: Matches your existing Ignis theme perfectly
-- **Keyboard Toggle**: Bind a key to toggle the sidebar visibility
+- **Streaming Responses**: Real-time token-by-token display with smooth scrolling
+- **Reasoning Mode**: Toggle between standard and thinking models for enhanced problem-solving
+- **Conversation Management**: Auto-archiving, conversation history, and resume capabilities
+- **Slash Commands**: Quick actions for managing conversations and models
+- **Graceful Degradation**: Automatic Ollama availability monitoring with user notifications
+- **Material Design 3**: Seamless integration with Ignis theming
 
-### How to Use
-
-#### Open/Close the Sidebar
-
-You can toggle the sidebar using:
-1. **Python/Script**: Call `window_manager.toggle_window("AISidebar")`
-2. **Keyboard Shortcut**: Add a binding in your window manager config
-
-#### Setting up a Keyboard Shortcut
-
-For **Niri**, add this to your `~/.config/niri/config.kdl`:
-
-```kdl
-binds {
-    // ... your other bindings
-
-    // Toggle AI Sidebar with Super+G (or any key you prefer)
-    Mod+G { spawn "ignis" "run" "ignis.window_manager.WindowManager.get_default().toggle_window('AISidebar')"; }
-}
-```
-
-For **Hyprland**, add this to your `~/.config/hypr/hyprland.conf`:
-
-```conf
-# Toggle AI Sidebar with Super+G
-bind = SUPER, G, exec, ignis run "ignis.window_manager.WindowManager.get_default().toggle_window('AISidebar')"
-```
-
-For **Sway**, add this to your `~/.config/sway/config`:
-
-```
-# Toggle AI Sidebar with Super+G
-bindsym $mod+G exec ignis run "ignis.window_manager.WindowManager.get_default().toggle_window('AISidebar')"
-```
-
-### Requirements
+## Requirements
 
 - **Ignis** desktop environment
 - **Python 3.10+**
-- **Ollama** with at least one model installed
-- **ollama Python package**: `pip install ollama`
+- **Ollama** running locally with models installed
+- **GTK4** (provided by Ignis)
 
-### Configuration
+## Installation
 
-The sidebar will automatically:
-- Detect your default Ollama model
-- Store conversations in `~/.config/ignis/modules/aisidebar/data/conversations/`
-- Apply your current Ignis theme colors
+1. Clone or copy this module to your Ignis modules directory:
+   ```bash
+   ~/.config/ignis/modules/aisidebar/
+   ```
+
+2. Install Ollama and pull the required models:
+   ```bash
+   # Install Ollama (if not already installed)
+   curl -fsSL https://ollama.com/install.sh | sh
+
+   # Pull the default models
+   ollama pull hf.co/unsloth/Qwen3-4B-Instruct-2507-GGUF:Q8_K_XL
+   ollama pull hf.co/unsloth/Qwen3-4B-Thinking-2507-GGUF:Q8_K_XL
+   ```
+
+3. Start Ollama:
+   ```bash
+   ollama serve
+   ```
+
+4. Reload Ignis:
+   ```bash
+   ignis reload
+   ```
+
+## Usage
+
+### Keyboard Shortcuts
+
+Bind a key in your window manager configuration:
+
+**Niri** (`~/.config/niri/config.kdl`):
+```kdl
+binds {
+    Mod+G { spawn "ignis" "toggle-window" "AISidebar"; }
+}
+```
+
+### Chat Interface
+
+- **Enter**: Send message
+- **Shift+Enter**: New line in input
+- **Click outside**: Close sidebar
+
+### Reasoning Mode
+
+Toggle the reasoning mode button (🧠) to switch between:
+- **Standard Mode**: Fast responses using Qwen3-4B-Instruct
+- **Reasoning Mode**: Detailed thinking process using Qwen3-4B-Thinking
+
+When reasoning mode is enabled, responses include a collapsible thinking section showing the model's internal reasoning process.
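+
+In practice, toggling the mode just changes which of the two installed models handles the next request. The snippet below is only an illustrative sketch of that mapping, reusing the model tags from the Installation section; the constant and function names are assumptions, not the module's actual API:
+
+```python
+# Illustrative sketch only -- the real selection logic lives in the module itself;
+# the names below are assumptions, not its API.
+REASONING_MODELS = {
+    False: "hf.co/unsloth/Qwen3-4B-Instruct-2507-GGUF:Q8_K_XL",  # standard mode
+    True: "hf.co/unsloth/Qwen3-4B-Thinking-2507-GGUF:Q8_K_XL",   # reasoning mode
+}
+
+def select_model(reasoning_enabled: bool) -> str:
+    """Return the Ollama model tag to use for the next request."""
+    return REASONING_MODELS[reasoning_enabled]
+```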
+
+### Slash Commands
+
+Execute commands by typing them in the chat:
+
+- `/new` or `/clear` - Start a new conversation (archives current)
+- `/models` - List available Ollama models
+- `/model <name>` - Switch to a specific model
+- `/list` - Show archived conversations
+- `/resume <id>` - Resume an archived conversation
+
+## Architecture
+
+```
+aisidebar/
+├── __init__.py              # Module entry point
+├── aisidebar.py             # RevealerWindow implementation
+├── chat_widget.py           # Main chat UI and message handling
+├── ollama_client.py         # HTTP client for Ollama REST API
+├── ollama_monitor.py        # Availability monitoring with callbacks
+├── conversation_manager.py  # Active conversation persistence
+├── conversation_archive.py  # Multi-conversation management
+├── command_processor.py     # Slash command system
+├── reasoning_controller.py  # Reasoning mode state management
+├── streaming_handler.py     # Token-by-token streaming display
+├── style.css                # GTK4 CSS styling
+└── data/
+    └── conversations/       # JSON conversation files
+        ├── default.json     # Active conversation
+        └── archive_*.json   # Archived conversations
+```
+
+### Core Components
+
+**AISidebar** (`aisidebar.py`)
+- RevealerWindow that slides in from the left
+- Manages window visibility and keyboard focus
+- Integrates with Ignis WindowManager
+
+**ChatWidget** (`chat_widget.py`)
+- Complete chat interface with multi-line input
+- Message list with auto-scrolling
+- Background threading for AI requests
+- Command processing and reasoning mode toggle
+
+**OllamaClient** (`ollama_client.py`)
+- Direct HTTP calls to Ollama REST API
+- Streaming and non-streaming chat support
+- Model listing with caching
+- Graceful error handling
+
+**StreamingHandler** (`streaming_handler.py`)
+- Token buffering for smooth UI updates
+- Separate handling for thinking and response content
+- Thread-safe GLib.idle_add integration
+
+**ConversationManager** (`conversation_manager.py`)
+- JSON-based persistence with atomic writes
+- Message validation and timestamp tracking
+- Auto-trimming to keep recent messages
+
+**ConversationArchive** (`conversation_archive.py`)
+- Multi-conversation support with unique IDs
+- Metadata extraction for conversation lists
+- Archive and resume functionality
+
+**OllamaMonitor** (`ollama_monitor.py`)
+- Periodic availability checking (30s interval)
+- Callback-based state change notifications
+- Non-blocking GLib timeout integration
+
+**CommandProcessor** (`command_processor.py`)
+- Extensible slash command system
+- Command registration and execution
+- Structured result handling
+
+**ReasoningController** (`reasoning_controller.py`)
+- Reasoning mode state persistence
+- Model selection based on mode
+- Model-specific parameter optimization
+
+## Configuration
+
+### Data Storage
+
+Conversations are stored in:
+```
+~/.config/ignis/modules/aisidebar/data/conversations/
+```
+
+User preferences are stored in:
+```
+~/.config/aisidebar/preferences.json
+```
+
+### Auto-Archiving
+
+The sidebar automatically archives old messages when the conversation exceeds 20 messages, keeping only the most recent ones in the active conversation. Archived messages are saved with timestamps for later retrieval.
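+
+In other words, only the most recent messages stay in `default.json`; everything older is moved into an `archive_*.json` file. A minimal sketch of that split, assuming the 20-message threshold described above (the constant and function names are illustrative, not the module's actual code):
+
+```python
+# Illustrative sketch of the auto-archiving rule described above;
+# the real logic lives in conversation_manager.py and conversation_archive.py.
+MAX_ACTIVE_MESSAGES = 20
+
+def split_for_archive(messages: list[dict]) -> tuple[list[dict], list[dict]]:
+    """Return (messages to move to the archive, messages kept active)."""
+    if len(messages) <= MAX_ACTIVE_MESSAGES:
+        return [], messages
+    return messages[:-MAX_ACTIVE_MESSAGES], messages[-MAX_ACTIVE_MESSAGES:]
+```
+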
 ### Customization
 
-#### Change Width
-
-Edit `aisidebar.py` line 19:
+**Change sidebar width** (edit `aisidebar.py`):
 
 ```python
-self.content_box.width_request = 400  # Change to desired width
+self.content_box.width_request = 400  # Default: 400px
 ```
 
-#### Change Animation Speed
-
-Edit `aisidebar.py` line 24:
+**Change animation speed** (edit `aisidebar.py`):
 
 ```python
-transition_duration=300,  # Change to desired milliseconds
+transition_duration=300,  # Default: 300ms
 ```
 
-#### Custom CSS Styling
-
-Edit `~/.config/ignis/styles/aisidebar.scss` to customize:
-- Colors (uses Material Design 3 color tokens)
-- Border radius
-- Padding/margins
+**Modify styling** (edit `style.css`):
+- TextView background and colors
+- Thinking box appearance
 - Message bubble styling
 
-### Troubleshooting
+## Troubleshooting
 
 **Sidebar doesn't appear:**
-- Restart Ignis: `ignis reload`
-- Check Ollama is running: `curl http://127.0.0.1:11434/api/tags`
-- Check console for errors: `ignis`
+```bash
+# Reload Ignis
+ignis reload
+
+# Check the Ignis console output for errors
+ignis
+```
 
 **No AI responses:**
-- Ensure Ollama is running
-- Ensure `ollama` Python package is installed in Ignis's Python environment
-- Check that you have at least one model: `ollama list`
+```bash
+# Verify Ollama is running
+curl http://127.0.0.1:11434/api/tags
 
-**CSS not applying:**
-- Restart Ignis: `ignis reload`
-- Check SCSS compilation: Look for errors in Ignis console output
+# Check installed models
+ollama list
 
-### Architecture
-
-```
-~/.config/ignis/modules/aisidebar/
-├── __init__.py              # Module exports
-├── aisidebar.py             # Main RevealerWindow class
-├── chat_widget.py           # Chat UI widget
-├── ollama_client.py         # Ollama API wrapper
-├── conversation_manager.py  # Conversation persistence
-└── data/
-    └── conversations/       # Saved conversations (auto-created)
+# Start Ollama if not running
+ollama serve
 ```
 
-### Visual Design
+**Ollama connection issues:**
+- The sidebar will display "Ollama not running" in the model label
+- Input will be disabled until connection is restored
+- Automatic monitoring checks every 30 seconds
+- A notification appears when connection is restored
 
-The AI Sidebar follows the same visual language as QuickCenter:
-- Material Design 3 color system
-- 20px border radius on container
-- Surface elevation with shadows
-- Smooth slide-in transitions
-- Translucent overlay backdrop
+**Reasoning mode not working:**
+- Ensure both models are installed (see Installation)
+- Check that Ollama is running
+- Try toggling reasoning mode off and on
 
-Clicking outside the sidebar will close it (same as QuickCenter behavior).
+## Development
+
+### Adding New Commands
+
+Register commands in `chat_widget.py`:
+
+```python
+# CommandResult is the structured result type provided by command_processor.py
+def _register_commands(self):
+    self._command_processor.register_command("/mycommand", self._cmd_my_handler)
+
+def _cmd_my_handler(self, args: str) -> CommandResult:
+    # Implementation goes here
+    return CommandResult(success=True, message="Done!")
+```
+
+### Modifying Model Parameters
+
+Edit `reasoning_controller.py` to adjust temperature, top_p, and other parameters for each model mode.
+
+### Custom Styling
+
+The sidebar uses GTK4 CSS classes:
+- `.ai-sidebar` - Main container
+- `.ai-sidebar-content` - Content area
+- `.thinking-box` - Reasoning display
+- `.thinking-header` - Collapsible header
+- `.thinking-content` - Reasoning text
+
+## License
+
+This project is provided as-is for use with the Ignis desktop environment.