It has some other nice features; in particular, I use:
- RAG (retrieval augmented generation) - easily include local and web based documents in chats and context
- Managing models and modelfiles on an Ollama server
- OpenAI model support alongside local models
- Users & RBAC to let friends and family play for free :)
Pipelines, tools, functions etc.
These are 3 different solutions to the same problem: programmatically interacting with LLM conversations.
- pipelines: combine LLM interactions
- tools/toolkits: give LLMs tools that they can use to get context from other places
- functions: code to run on conversations, e.g. to implement chat limits
Pipelines
Pipelines let you do complex stuff with LLM interactions through openwebui. They have all the familiar pipeline-like concepts: different modules with inlets and outlets, where modules can be custom Python code.
A function is a single module in the pipeline. Functions can be applied and used individually, or combined into pipelines, and then the pipelines used.
Tools and toolkits
Tools can be used in the LLM call, as if the LLM were using them itself. For example, a "getname" tool could read the user's session name.
To use pipelines or functions in openwebui, there are normally four steps:
- get the code into open webui
  - normally through somewhere in the workspaces tab, or add pipelines through admin > pipelines
- enable the code to run
  - same place: once something is added, there's normally a toggle "enable" button next to it
- associate the code with a model to use it
  - for functions and tools, in workspaces > models, select the model you want to use and tick the tools or functions you want to use
- use it
  - for tools (and web search), in the actual chat window, the + sign must be pressed and the toolkit enabled
Channels, bots and dombot
openwebui 0.5 just dropped, and it adds basic support for channels, like Slack or IRC channels: chats which can have multiple human users in them. There's also an API for the channels, which can be used to send messages to them, like a bot.
There's a sample MVP for a bot here: https://github.com/open-webui/bot, which is just a Python script.
So what I'm going to try to do is create a chat with qwencode and see if we can develop an MVP bot that speaks on the channel, and run it in an openwebui chat. The main problem I can foresee is that the code seems to keep an open connection (a websocket or something), which I'd guess openwebui's Python env won't like.
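For reference, here's roughly the shape I'd expect posting a channel message to take. This is a hedged sketch, not the sample bot's actual code: the endpoint path and payload shape are assumptions I'd verify against the open-webui/bot repo, and the real sample keeps a socket.io connection open (the long-lived connection mentioned above) rather than making plain POSTs.

```python
import json
import urllib.request


def build_request(base_url: str, token: str, channel_id: str, text: str):
    """Build the URL, headers, and JSON body for posting a channel message.

    The endpoint path and the {"content": ...} payload are assumptions
    about the channels API, not a verified contract.
    """
    url = f"{base_url}/api/v1/channels/{channel_id}/messages/post"
    headers = {
        "Authorization": f"Bearer {token}",  # an Open WebUI API key
        "Content-Type": "application/json",
    }
    body = json.dumps({"content": text}).encode()
    return url, headers, body


def post_channel_message(base_url: str, token: str, channel_id: str, text: str):
    """Send the message; only call this against a running Open WebUI instance."""
    url, headers, body = build_request(base_url, token, channel_id, text)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If the stateless-POST route works, it would neatly sidestep the open-connection problem; if the channels API really does require a websocket, the bot will need somewhere persistent to run.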