Open WebUI is a user-friendly web interface for LLMs. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
See the demo with Ollama backend:

> Want to run your own AI chatbot (like ChatGPT) locally? You can do that with Nix. Powered by services-flake (#NixOS), using @OpenWebUI and @ollama. See example: https://t.co/dyItC93Pya
>
> — NixOS Asia (@nixos_asia), June 18, 2024
## Getting Started
```nix
# In `perSystem.process-compose.<name>`
{
  services.open-webui."open-webui1".enable = true;
}
```
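The listen address can be customised as well; a minimal sketch, assuming the `host` and `port` options that services-flake exposes for this service (the port value below is purely illustrative):

```nix
# In `perSystem.process-compose.<name>`
{
  services.open-webui."open-webui1" = {
    enable = true;
    # `host` and `port` control where Open WebUI listens;
    # 8081 is just an example value, not a default
    host = "127.0.0.1";
    port = 8081;
  };
}
```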
## Examples

### Open WebUI with Ollama backend
```nix
{ config, ... }:
{
  services = {
    # Backend service to perform inference on LLM models
    ollama."ollama1" = {
      enable = true;
      # Models are usually huge; downloading them into every project directory
      # can lead to a lot of duplication, so share them from one location
      dataDir = "$HOME/.services-flake/ollama1";
      models = [ "llama2-uncensored" ];
    };
    # Get a ChatGPT-like UI, but open-source, with Open WebUI
    open-webui."open-webui1" = {
      enable = true;
      environment =
        let
          inherit (config.services.ollama.ollama1) host port;
        in
        {
          OLLAMA_API_BASE_URL = "http://${host}:${toString port}";
          WEBUI_AUTH = "False";
        };
    };
  };
  # Start the Open WebUI service only after the Ollama service has finished
  # initializing and loading the models
  settings.processes.open-webui1.depends_on.ollama1-models.condition = "process_completed_successfully";
}
```
See Ollama for more customisation of the backend.
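Open WebUI can also point at an Ollama instance that is not managed by the same flake; a minimal sketch, assuming Ollama is already running externally on its default port (11434):

```nix
{
  services.open-webui."open-webui1" = {
    enable = true;
    environment = {
      # Ollama's default listen address; adjust if your instance differs
      OLLAMA_API_BASE_URL = "http://127.0.0.1:11434";
      # Disables the login screen; suitable for local, single-user use only
      WEBUI_AUTH = "False";
    };
  };
}
```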
## Open browser on startup
```nix
{ pkgs, lib, config, ... }:
{
  services.open-webui."open-webui1".enable = true;
  # Open the browser once the Open WebUI service reports healthy
  settings.processes.open-browser = {
    command =
      let
        inherit (config.services.open-webui.open-webui1) host port;
        opener = if pkgs.stdenv.isDarwin then "open" else lib.getExe' pkgs.xdg-utils "xdg-open";
        url = "http://${host}:${toString port}";
      in
      "${opener} ${url}";
    depends_on.open-webui1.condition = "process_healthy";
  };
}
```