

🔧 Self-Hosted Chat Solutions for LLMs: Open WebUI


Section: 🔧 Programming
🔗 Source: dev.to

I have been experimenting with various Large Language Models (LLMs) for things like proofreading, code suggestions, and learning Japanese, using platforms like ChatGPT and Claude AI. However, I keep hitting the daily limits, and I have to go to different websites to use their chat interfaces. Their subscriptions are not cheap.

I then tried some local LLMs using Ollama. Ollama doesn't have an official chat interface, so I looked around and found Open WebUI. It is super easy to set up, and it supports both the OpenAI API and Ollama out of the box using your API keys. Although there are still costs associated with purchasing credit to use the API keys, these expenses are significantly lower than the subscriptions. Since I already have credits for some of my personal projects, this is a plus.

Open WebUI can be extended to use Anthropic API keys (and other LLM providers' API keys) using Pipelines.

Step 1: Install Ollama (Optional)

Ollama is an open-source tool that enables us to run various LLMs locally on our own machines. To get started, download the installer from https://ollama.com/. Once installed, use the CLI to pull a model. For reference, the available models are listed here.

For example, to pull the llama3.2 model, run ollama pull llama3.2 in your terminal.
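Assuming Ollama is installed and its background service is running, the basic workflow from the terminal looks like this (the model name is just an example):

```shell
# Download the model weights (several GB, one-time)
ollama pull llama3.2

# List the models available locally
ollama list

# Chat with the model directly in the terminal
ollama run llama3.2 "Translate 'good morning' into Japanese."
```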

Step 2: Set Up Open WebUI

The setup is straightforward using Docker (recommended). The Open WebUI Getting Started page also has instructions for alternative installation methods.
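For reference, the Docker command follows the pattern from the Getting Started page; the image tag, port mapping, and volume name below are the documented defaults, so double-check them against the current docs:

```shell
# Run Open WebUI, mapping the container's internal port 8080 to localhost:3000
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

The named volume keeps your chats and settings across container restarts, and the --add-host flag lets the container reach services running on the host machine.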

Once the setup process is completed, Open WebUI will be available at http://localhost:3000

Navigate to Admin Panel > Settings > Connections to enter your OpenAI API key.

Step 3: Install Pipelines (Optional)

Pipelines are like plugins for Open WebUI. They enable us to perform actions using non-OpenAI LLM providers, and much more.

Again, setting up Pipelines is pretty straightforward with Docker. Instructions are available here.
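The command mirrors the one in the Pipelines README (again, verify the image tag and port against the current instructions):

```shell
# Run the Pipelines server on its default port 9099
docker run -d \
  -p 9099:9099 \
  --add-host=host.docker.internal:host-gateway \
  -v pipelines:/app/pipelines \
  --name pipelines \
  --restart always \
  ghcr.io/open-webui/pipelines:main
```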

The Docker Desktop containers page should look like this (I'm running Open WebUI on port 3080 instead of 3000).

Navigate to Admin Panel > Settings > Connections > OpenAI API , and add a new entry:

API Base URL: http://localhost:9099
API Key: 0p3n-w3bu!

If Open WebUI is also running in a Docker container, use the following configuration:

API Base URL: http://host.docker.internal:9099
API Key: 0p3n-w3bu!
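If both services run under Docker, a single compose file is a convenient way to manage them together. A minimal sketch based on the commands above; the service names, ports, and volume names are assumptions matching the documented defaults:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
    extra_hosts:
      - "host.docker.internal:host-gateway"
    restart: always
  pipelines:
    image: ghcr.io/open-webui/pipelines:main
    ports:
      - "9099:9099"
    volumes:
      - pipelines:/app/pipelines
    restart: always
volumes:
  open-webui:
  pipelines:
```

With Compose, both containers share a default network, so the connection could also be addressed by service name (http://pipelines:9099) instead of host.docker.internal.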

The Connections section should now look like this.

We will add the Anthropic Manifold Pipeline as an example. All the official pipelines are available here. It's important to fetch pipelines only from sources you trust.

Paste this URL into the Install from GitHub URL field and then click the install button:

https://github.com/open-webui/pipelines/blob/main/examples/pipelines/providers/anthropic_manifold_pipeline.py

After installing the pipeline, enter your Anthropic API Key and click Save.

Now I can pick Anthropic, OpenAI or Ollama models using the same app.
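Beyond the chat UI, this setup can also be scripted: Open WebUI exposes an OpenAI-compatible chat completions endpoint of its own. Below is a minimal standard-library sketch; the port, the /api/chat/completions path, and the API key placeholder are assumptions — generate a key under Settings > Account and adjust the base URL to your install:

```python
import json
import urllib.request

API_BASE = "http://localhost:3000/api"  # assumption: default Open WebUI port
API_KEY = "sk-your-open-webui-key"      # assumption: key from Settings > Account

def build_chat_request(model: str, prompt: str):
    """Build an OpenAI-style chat completion request for Open WebUI."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    return req, payload

# To actually send the request (needs a running Open WebUI instance):
# with urllib.request.urlopen(build_chat_request("llama3.2", "Hello")[0]) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the model field can name any model the admin panel exposes, whether it is served by Ollama, OpenAI, or a pipeline.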

Open WebUI is a powerful tool that can be customized to meet different needs. To explore the various features it offers and how they can enhance your experience, be sure to check out their features page here.
