feat(aisidebar): implement Ollama availability handling and graceful startup
- Add comprehensive Ollama connection error handling strategy
- Implement OllamaClient with non-blocking initialization and connection checks
- Create OllamaAvailabilityMonitor for periodic Ollama connection tracking
- Update design and requirements to support graceful Ollama unavailability
- Add new project structure for AI sidebar module with initial implementation
- Enhance error handling to prevent application crashes when Ollama is not running
- Prepare for future improvements in AI sidebar interaction and resilience
@@ -97,7 +97,37 @@
- Update message rendering to handle reasoning metadata
- _Requirements: 5.5_

- [ ] 8. Add error handling and edge cases
- [-] 8. Implement graceful Ollama unavailability handling
- [ ] 8.1 Update OllamaClient initialization
- Modify `__init__()` to never raise exceptions during initialization
- Add connection check that sets internal availability flag
- Update `list_models()` to return empty list instead of raising on connection failure
- Update `chat()` and `stream_chat()` to return error messages instead of raising
- _Requirements: 6.1, 6.3, 6.5_
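Task 8.1's contract can be sketched roughly as below. The class name and the "never raise" behavior come from the tasks above; the HTTP details (Ollama's default port 11434, the `/api/tags` and `/api/chat` endpoints) are Ollama's documented REST API, while the exact error strings and method signatures are assumptions of this sketch, not the project's actual code.

```python
import json
import urllib.request


class OllamaClient:
    """Sketch of task 8.1: construction never raises, calls degrade gracefully."""

    def __init__(self, base_url="http://localhost:11434"):
        self.base_url = base_url
        self.available = False
        # Probe the server but swallow any failure; the flag records the result.
        try:
            with urllib.request.urlopen(self.base_url, timeout=2):
                self.available = True
        except OSError:
            self.available = False

    def list_models(self):
        """Return installed model names, or an empty list when unreachable."""
        try:
            with urllib.request.urlopen(f"{self.base_url}/api/tags", timeout=5) as resp:
                data = json.load(resp)
            self.available = True
            return [m["name"] for m in data.get("models", [])]
        except (OSError, ValueError):
            self.available = False
            return []

    def chat(self, model, messages):
        """Return a reply string, or an error message instead of raising."""
        if not self.available:
            return "Error: cannot reach Ollama. Start it with: ollama serve"
        payload = json.dumps(
            {"model": model, "messages": messages, "stream": False}
        ).encode()
        req = urllib.request.Request(
            f"{self.base_url}/api/chat",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        try:
            with urllib.request.urlopen(req, timeout=60) as resp:
                return json.load(resp)["message"]["content"]
        except (OSError, ValueError, KeyError):
            self.available = False
            return "Error: request to Ollama failed. Is it still running?"
```

Note that `urllib.error.URLError` subclasses `OSError`, so one `except OSError` branch covers refused connections, DNS failures, and timeouts alike.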

- [ ] 8.2 Create OllamaAvailabilityMonitor
- Create `ollama_monitor.py` with OllamaAvailabilityMonitor class
- Implement periodic availability checking using GLib.timeout_add (30s interval)
- Add callback mechanism to notify UI of state changes
- Ensure checks are non-blocking and don't impact UI responsiveness
- _Requirements: 6.4_
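A minimal sketch of task 8.2's monitor, assuming the shape described above. The probe here uses a short blocking timeout for brevity; a production version would run the check on a worker thread (or async I/O) so even a 2-second stall never touches the main loop. `start()` uses `GLib.timeout_add_seconds`, the seconds-based variant of the `timeout_add` mentioned in the task, and requires PyGObject.

```python
import urllib.request


class OllamaAvailabilityMonitor:
    """Sketch of task 8.2: periodic availability checks with change callbacks."""

    def __init__(self, base_url="http://localhost:11434", interval_seconds=30):
        self.base_url = base_url
        self.interval_seconds = interval_seconds
        self.available = False
        self._callbacks = []

    def add_callback(self, callback):
        """Register callback(available: bool); fired only when the state flips."""
        self._callbacks.append(callback)

    def _probe(self):
        try:
            # Short timeout so a dead server cannot stall a check for long.
            with urllib.request.urlopen(self.base_url, timeout=2):
                return True
        except OSError:
            return False

    def check_now(self):
        """Run one check; notify callbacks only on a state change."""
        state = self._probe()
        if state != self.available:
            self.available = state
            for callback in self._callbacks:
                callback(state)
        return True  # truthy return keeps the GLib timeout scheduled

    def start(self):
        """Schedule periodic checks on the GLib main loop (needs PyGObject)."""
        from gi.repository import GLib
        GLib.timeout_add_seconds(self.interval_seconds, self.check_now)
```

Firing callbacks only on transitions (not on every tick) is what lets the UI show a one-time "connection restored" notification rather than spamming it every 30 seconds.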

- [ ] 8.3 Update SidebarWindow for Ollama unavailability
- Initialize OllamaAvailabilityMonitor in SidebarWindow
- Display "Ollama not running" status message when unavailable at startup
- Update model label to show connection status
- Disable input field when Ollama unavailable, show helpful message
- Add callback to re-enable features when Ollama becomes available
- _Requirements: 6.1, 6.2, 6.4_
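The state handling in task 8.3 can be prototyped without any GTK widgets; the sketch below is toolkit-agnostic, and the attribute names and label strings are placeholders, not the project's real widget names. In the real SidebarWindow, `on_availability_changed` would be registered with the monitor and would update the actual label and entry widgets.

```python
class OllamaStatusPresenter:
    """Toolkit-agnostic sketch of task 8.3's state transitions (names assumed)."""

    def __init__(self):
        # Startup default: assume unavailable until the first check succeeds.
        self.model_label = "Ollama not running"
        self.input_enabled = False
        self.input_placeholder = "Start Ollama to begin chatting"

    def on_availability_changed(self, available):
        """Intended as the OllamaAvailabilityMonitor callback."""
        if available:
            self.model_label = "Connected"
            self.input_enabled = True
            self.input_placeholder = "Type a message..."
        else:
            self.model_label = "Ollama not running"
            self.input_enabled = False
            self.input_placeholder = "Start Ollama to begin chatting"
```

Keeping this logic out of the widget code makes both directions of the transition (disable on loss, re-enable on recovery) trivially unit-testable.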

- [ ] 8.4 Add user-friendly error messages
- Display clear instructions when user tries to chat without Ollama
- Show notification when Ollama connection is restored
- Update all command handlers to check Ollama availability
- Provide actionable error messages (e.g., "Start Ollama with: ollama serve")
- _Requirements: 6.2, 6.3_
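One way to make "update all command handlers to check Ollama availability" uniform is a decorator. This is a sketch under stated assumptions: the owning object is assumed to expose a `monitor` with an `available` flag and a `show_system_message()` method; both names are hypothetical, not from the project.

```python
OLLAMA_HINT = "Ollama is not running. Start it with: ollama serve"


def require_ollama(handler):
    """Sketch for task 8.4: gate a command handler on Ollama availability.

    Assumes `self.monitor.available` and `self.show_system_message()`
    exist on the decorated object (hypothetical names).
    """
    def wrapper(self, *args, **kwargs):
        if not self.monitor.available:
            # Actionable message instead of a crash or a silent failure.
            self.show_system_message(OLLAMA_HINT)
            return None
        return handler(self, *args, **kwargs)
    return wrapper
```

Applied as `@require_ollama` on each chat or command method, the guard lives in one place, so every handler gives the same actionable hint.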

- [ ] 9. Add error handling and edge cases
- Implement stream timeout handling (60s limit) with cancellation
- Add connection error recovery for streaming failures
- Handle command execution during active streaming
@@ -105,10 +135,11 @@
- Implement graceful degradation for missing preferences file
- _Requirements: 1.4, 3.5, 4.3, 4.4_
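The 60-second stream limit in task 9 can be sketched as a generator wrapper that stops consuming once a deadline passes. Caveat: this only cancels between chunks; a read that blocks inside the underlying iterator can still overrun the deadline, so a production version would also close the HTTP response or run the read on a worker thread. The marker string is an illustration, not the project's actual wording.

```python
import time


def stream_with_timeout(chunks, limit_seconds=60):
    """Sketch of task 9's stream limit: cancel by ending the generator early."""
    deadline = time.monotonic() + limit_seconds
    for chunk in chunks:
        if time.monotonic() > deadline:
            # Cancel the stream and surface a visible marker instead of hanging.
            yield f"\n[stream cancelled after {limit_seconds}s timeout]"
            return
        yield chunk
```

Because the wrapper is itself a generator, the UI code that renders streamed tokens needs no changes to gain the timeout behavior.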

- [ ] 9. Polish and integration
- [ ] 10. Polish and integration
- Add CSS styling for system messages, reasoning content, and streaming indicator
- Implement `/help` command to display available commands
- Add visual feedback for command execution (loading states)
- Ensure all UI updates maintain smooth scrolling behavior
- Test keyboard focus management across all new widgets
- _Requirements: 1.3, 2.3, 3.5, 5.5_
- Add status indicator in UI showing Ollama connection state
- _Requirements: 1.3, 2.3, 3.5, 5.5, 6.2_
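The `/help` command from task 10 might render from a single command table like the sketch below. Only `/help` itself is named in the tasks; the other command names and descriptions here are placeholders for the sidebar's real command set.

```python
# Command names other than /help are placeholders, not the project's real set.
COMMANDS = {
    "/help": "Show this list of commands",
    "/clear": "Clear the conversation",
    "/model": "Switch the active model",
}


def render_help():
    """Sketch of task 10's /help output as a plain-text block."""
    lines = ["Available commands:"]
    for name in sorted(COMMANDS):
        lines.append(f"  {name:<8} {COMMANDS[name]}")
    return "\n".join(lines)
```

Driving `/help` from the same table used to dispatch commands keeps the listing from drifting out of sync as commands are added.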