Addresses multiple issues related to GTK4 Layer Shell initialization and Ollama integration:

- Reorders initialization to ensure the layer shell is set up before window properties.
- Adds error detection for layer shell initialization failures.
- Implements a focus event handler to prevent focus-out warnings.
- Introduces a launcher script to activate the virtual environment, force native Wayland, and preload the GTK4 Layer Shell library.
- Warns users of incorrect `GDK_BACKEND` settings.
- Updates the Ollama client to handle responses from both older and newer versions of the Ollama SDK.

These changes improve the application's stability, compatibility, and functionality on Wayland systems.
GTK4 Layer Shell and Ollama Fixes
Problems Identified
You were experiencing multiple issues when running the application:
- "Failed to initialize layer surface, not on Wayland"
- Multiple "GtkWindow is not a layer surface" warnings (9 times)
- "GtkText - did not receive a focus-out event" warnings
- "No content received from Ollama" - Ollama responses not working
Root Causes
1. Wrong GDK Backend
Your environment had GDK_BACKEND=x11 set, which forces GTK to use XWayland instead of native Wayland. GTK4 Layer Shell only works with native Wayland, not XWayland.
2. Initialization Order
The layer shell was being initialized after window properties (title, size) were set. GTK4 Layer Shell must be initialized immediately after super().__init__().
3. Library Linking Order
The GTK4 Layer Shell library needs to be loaded before libwayland-client.so, which requires using LD_PRELOAD.
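Before relying on `LD_PRELOAD`, it is worth confirming the library actually exists at the expected path. The sketch below assumes the Arch Linux install path used throughout this document; Debian/Ubuntu typically install it under `/usr/lib/x86_64-linux-gnu/` instead:

```shell
#!/bin/sh
# Confirm the layer-shell library is present before preloading it.
# /usr/lib/libgtk4-layer-shell.so is the Arch Linux path; adjust
# LAYER_SHELL_LIB for other distributions.
LAYER_SHELL_LIB="${LAYER_SHELL_LIB:-/usr/lib/libgtk4-layer-shell.so}"

if [ -f "$LAYER_SHELL_LIB" ]; then
    echo "found: $LAYER_SHELL_LIB"
else
    echo "missing: $LAYER_SHELL_LIB (install gtk4-layer-shell)" >&2
fi
```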
4. Missing Focus Event Handler
The Gtk.Entry widget wasn't properly handling focus-out events, causing GTK to emit warnings.
5. Virtual Environment Not Activated
The launcher script wasn't activating the Python virtual environment (.venv), so the ollama package wasn't available even though it was installed in the venv.
6. Ollama SDK API Change (Pydantic Objects)
The newer ollama package (v0.6.0) returns Pydantic objects instead of dictionaries. The OllamaClient code was using .get() methods which don't work on Pydantic objects, causing responses to appear empty. This caused all Ollama API calls to return empty content with "No content received from Ollama".
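The failure mode is easy to reproduce without the SDK at all. In this sketch, `FakeMessage` and `FakeResponse` are hypothetical stand-ins for the Pydantic models returned by newer `ollama` releases, not the real SDK classes:

```python
# Demonstrates why dict-style access breaks on the new SDK's responses.
# FakeMessage/FakeResponse are stand-ins for the Pydantic models that
# newer ollama releases return; they are not the real SDK classes.
class FakeMessage:
    def __init__(self, role: str, content: str) -> None:
        self.role = role
        self.content = content

class FakeResponse:
    def __init__(self, message: FakeMessage) -> None:
        self.message = message

old_style = {"message": {"role": "assistant", "content": "hi"}}
new_style = FakeResponse(FakeMessage("assistant", "hi"))

# Dict access works on the old-style response...
print(old_style.get("message")["content"])  # -> hi

# ...but the object response has no .get() method at all.
try:
    new_style.get("message")
except AttributeError as exc:
    print(f"AttributeError: {exc}")

# Attribute access is required for the object form.
print(new_style.message.content)  # -> hi
```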
Fixes Applied
1. Reordered Initialization (sidebar_window.py:26-41)
```python
def __init__(self, **kwargs) -> None:
    super().__init__(**kwargs)

    # CRITICAL: Layer shell must be initialized BEFORE any window properties
    self._setup_layer_shell()

    self.set_default_size(360, 720)
    self.set_title("Niri AI Sidebar")
    # ... rest of initialization
```
2. Added Error Detection (sidebar_window.py:44-65)
```python
def _setup_layer_shell(self) -> None:
    if Gtk4LayerShell is None:
        return

    Gtk4LayerShell.init_for_window(self)

    # Verify initialization succeeded before configuring
    if not Gtk4LayerShell.is_layer_window(self):
        return
    # ... rest of layer shell configuration
```
3. Added Focus Event Handler (sidebar_window.py:110-113, sidebar_window.py:173-176)
```python
# Add focus event controller to properly propagate focus-out events
focus_controller = Gtk.EventControllerFocus()
focus_controller.connect("leave", self._on_entry_focus_out)
self._entry.add_controller(focus_controller)
```
4. Created Launcher Script (run.sh)
```bash
#!/bin/bash
# Get the directory where this script is located
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"

# Activate virtual environment if it exists
if [ -f "$SCRIPT_DIR/.venv/bin/activate" ]; then
    source "$SCRIPT_DIR/.venv/bin/activate"
fi

# Force GTK to use native Wayland backend (not XWayland)
export GDK_BACKEND=wayland

# Preload GTK4 Layer Shell library to ensure proper initialization
export LD_PRELOAD=/usr/lib/libgtk4-layer-shell.so

# Run the application
exec python3 "$SCRIPT_DIR/main.py" "$@"
```
Key additions:
- Activates `.venv` if present (fixes Ollama integration)
- Sets `GDK_BACKEND=wayland` (forces native Wayland)
- Preloads the GTK4 Layer Shell library (fixes linking order)
5. Added Environment Detection (main.py:41-50)
Warns users if they're running with the wrong backend configuration.
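The check itself is not reproduced in this document; a minimal sketch of what such a warning could look like follows. The function name, variable names, and message wording are illustrative, not the actual `main.py` code:

```python
# Hypothetical sketch of the backend check added to main.py; names and
# message text are illustrative, not the real code.
import os
import sys

def warn_on_bad_backend() -> None:
    backend = os.environ.get("GDK_BACKEND", "")
    if backend and backend != "wayland":
        print(
            f"Warning: GDK_BACKEND={backend} forces a non-Wayland backend, "
            "but gtk4-layer-shell needs native Wayland. Unset it or use run.sh.",
            file=sys.stderr,
        )
    if not os.environ.get("WAYLAND_DISPLAY"):
        print(
            "Warning: WAYLAND_DISPLAY is not set; are you on a Wayland session?",
            file=sys.stderr,
        )
```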
6. Fixed Ollama SDK Compatibility (ollama_client.py:59-76, 94-109)
Updated OllamaClient to handle both dictionary responses (old SDK) and Pydantic objects (new SDK v0.6.0+):
```python
# Handle both dict responses (old SDK) and Pydantic objects (new SDK)
if isinstance(result, dict):
    message = result.get("message") or {}  # guard against a missing message
    role = message.get("role") or "assistant"
    content = message.get("content") or ""
else:
    # Pydantic object (ollama SDK >= 0.4.0); fields may also be None
    message = getattr(result, "message", None)
    role = getattr(message, "role", None) or "assistant"
    content = getattr(message, "content", None) or ""
```
This ensures compatibility with both old and new versions of the ollama Python package.
How to Run
Use the launcher script:

```bash
./run.sh
```

Or set the environment variables manually:

```bash
GDK_BACKEND=wayland LD_PRELOAD=/usr/lib/libgtk4-layer-shell.so python3 main.py
```
Do NOT run the application directly with `python3 main.py` if you have `GDK_BACKEND=x11` in your environment, as this will cause the layer shell initialization to fail.
Expected Behavior
After these fixes:
- ✅ No "Failed to initialize layer surface" warnings
- ✅ No "GtkWindow is not a layer surface" warnings
- ✅ Reduced "GtkText - did not receive a focus-out event" warnings (GTK4 internal issue, mostly mitigated)
- ✅ Window properly anchored to the left edge of your screen
- ✅ Window appears as a layer surface in Niri
- ✅ Ollama integration working - receives and displays responses
- ✅ Conversation history persisted properly
Testing
Run the application with the launcher script and verify:
- Minimal warnings in the console output (only harmless Vulkan warnings may appear)
- Window appears on the left edge of the screen
- Window stays anchored when switching workspaces
- Text input works properly
- Ollama responses are received and displayed
- Conversations are saved and restored on restart
Quick Test
```bash
./run.sh
# Type a message in the UI and press Enter
# You should see a response from Ollama
```
Troubleshooting
"No content received from Ollama" Error
Symptom: The application displays "No content received from Ollama" or similar errors.
Causes:
- The `ollama` Python package is not installed
- The virtual environment is not activated
- The Ollama server is not running
Solutions:
```bash
# Ensure Ollama is installed and running
curl -s http://127.0.0.1:11434/api/tags

# Install the ollama package in your venv
source .venv/bin/activate
pip install ollama

# Always use the launcher script (it activates the venv)
./run.sh
```
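To distinguish "server down" from "package missing", a stdlib-only reachability check is enough. This is a diagnostic sketch, not part of the application code; it assumes Ollama's default port 11434:

```python
# Stdlib-only reachability check for a local Ollama server. 11434 is
# Ollama's default port; pass a different base_url if you changed it.
import json
import urllib.request

def ollama_reachable(base_url: str = "http://127.0.0.1:11434") -> bool:
    """Return True if GET /api/tags answers with parseable JSON."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            json.load(resp)  # /api/tags returns {"models": [...]}
            return True
    except (OSError, ValueError):
        # OSError covers connection refused/timeouts (URLError subclasses it);
        # ValueError covers a non-JSON reply from something else on the port.
        return False

if __name__ == "__main__":
    print("reachable" if ollama_reachable() else "not reachable")
```

If this returns `False`, start the server first; if it returns `True` but the app still shows "No content received from Ollama", the issue is on the client side (venv not activated or the SDK version mismatch described above).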
Layer Shell Initialization Fails
Symptom: "Failed to initialize layer surface" warning appears.
Causes:
- `GDK_BACKEND=x11` is set (forces XWayland instead of native Wayland)
- GTK4 Layer Shell library not installed
- Not running on a Wayland compositor
Solutions:
```bash
# Check your environment
echo $GDK_BACKEND      # Should be empty or "wayland"
echo $WAYLAND_DISPLAY  # Should show your Wayland display (e.g., "wayland-1")

# Unset GDK_BACKEND if it's set to x11
unset GDK_BACKEND

# Install GTK4 Layer Shell (Arch Linux)
sudo pacman -S gtk4-layer-shell

# Use the launcher script (it sets the correct environment)
./run.sh
```