# Web Search RAG Added to Ollama Colab Private Chat
Ollama Colab Private Chat now supports Web Search RAG. Just flip the 🔍 Web Search toggle in the chat UI to include DuckDuckGo search results as context when sending messages to the LLM.
## What's New
### Web Search Toggle
A 🔍 Web Search toggle button has been added to both the Inline and Standalone chat UIs. It is off by default, so existing behavior is completely unchanged — search only runs when you explicitly enable it.
### LLM-Optimized Query Generation
Rather than sending the user's raw input directly to the search engine, the local LLM first extracts a concise keyword query from the message before searching. This keeps search accuracy high even when questions are written in natural, conversational language.
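As a minimal sketch of this step, assuming a generic `ask_llm` callable that wraps a request to the local model (the function and prompt wording here are illustrative, not the notebook's actual code):

```python
def extract_search_query(user_message, ask_llm):
    """Ask the local LLM to distill a conversational message into search keywords.

    `ask_llm` is any callable that sends a prompt string to the model and
    returns its text reply (e.g. a thin wrapper around Ollama's /api/chat).
    """
    prompt = (
        "Extract a concise web search query (a few keywords, no punctuation) "
        "from the following message. Reply with the query only.\n\n"
        f"Message: {user_message}"
    )
    return ask_llm(prompt).strip()

# Usage with a stand-in LLM callable:
fake_llm = lambda prompt: "ollama colab setup"
print(extract_search_query("Hey, how do I set up Ollama on Colab?", fake_llm))
# → ollama colab setup
```

Decoupling the query-extraction prompt from the search call also makes it easy to swap in a different search backend later.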
### Source Citations Below Responses
After each LLM response, the titles and links of the referenced search results are displayed as a source list, so you can verify the basis of any answer right in the chat.
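One way such a source list can be rendered as markdown under the reply (the `format_sources` helper and the `title`/`href` result keys are assumptions for illustration, not the notebook's actual code):

```python
def format_sources(results):
    """Render search results as a numbered markdown list of titles and links."""
    lines = [
        f"{i}. [{r['title']}]({r['href']})"
        for i, r in enumerate(results, 1)
    ]
    return "Sources:\n" + "\n".join(lines)

print(format_sources([{"title": "Ollama", "href": "https://ollama.com"}]))
# → Sources:
#   1. [Ollama](https://ollama.com)
```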
### TTL Cache
Search results for identical queries are cached for 5 minutes, reducing the number of requests sent to DuckDuckGo even when the same topic is asked about repeatedly.
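A TTL cache like this can be sketched in a few lines (names such as `cached_search` and the injectable `do_search` callable are illustrative, not the notebook's actual implementation):

```python
import time

SEARCH_CACHE_TTL = 300  # seconds; 5 minutes, matching the release note
_CACHE = {}             # query -> (timestamp, results)

def cached_search(query, do_search, now=time.time):
    """Return cached results for `query` if still fresh, else search and cache.

    `do_search` is any callable that performs the actual web search.
    """
    entry = _CACHE.get(query)
    if entry is not None and now() - entry[0] < SEARCH_CACHE_TTL:
        return entry[1]  # cache hit: skip the network request
    results = do_search(query)
    _CACHE[query] = (now(), results)
    return results
```

Because entries are only checked for staleness on lookup, repeated questions on the same topic within the TTL window reuse the first response instead of hitting DuckDuckGo again.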
## Web Search Configuration Parameters
The following parameters have been added to the top of the `Server` cell and can be adjusted via the Colab form.
| Parameter | Description | Default |
|---|---|---|
| `SEARCH_MAX_RESULTS` | Maximum number of search results to retrieve | 5 |
| `SEARCH_BODY_LENGTH` | Maximum number of characters extracted from each result body | 300 |
| `SEARCH_TIME_LIMIT` | Time-range filter for search results | No limit |
| `SEARCH_REGION` | Region / language for search results | English |
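As a sketch, the form fields at the top of the cell might look like the following. The `#@param` annotations are Colab's standard form syntax; the `us-en` region code and the empty-string/`d`/`w`/`m`/`y` time-limit values follow DuckDuckGo conventions and are assumptions here, not confirmed details of the notebook:

```python
#@title Web Search settings
SEARCH_MAX_RESULTS = 5    #@param {type:"integer"}  — max results to retrieve
SEARCH_BODY_LENGTH = 300  #@param {type:"integer"}  — chars kept per result body
SEARCH_TIME_LIMIT = ""    #@param ["", "d", "w", "m", "y"]  — "" means no limit
SEARCH_REGION = "us-en"   #@param {type:"string"}  — region/language code
```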
## Other Changes
- The default value of `num_ctx` has been raised from 4096 to 8192. A longer context window is now the default to accommodate search results being injected as RAG context.
- The `Model Registry`, `Server`, and `Chat UI — Standalone` cells are now collapsed by default, improving notebook readability.
- Inline mode streaming and Tunnel mode connection handling have been refactored and simplified.
## Getting Started
No environment setup is required. Open the Colab link below and run the cells from top to bottom.
- Run on Google Colab: Ollama Colab Private Chat (English)
- Browse the source: hiroaki-com/colab-ollama-private-chat on GitHub
Feedback and Pull Requests are always welcome.
