
Connecting Ollama with Open-WebUI - Basic, CPU-based, and Slow


Please feel free to remove this if it violates any rules or has already been posted. I searched but wasn't able to find these simple instructions here; there was one related question, but the answer there did not work for me. I had the worst time trying to set up Ollama with Open-WebUI, so I wanted to share what finally got it going for me using Docker. I'm sharing this because the setup is really simple, yet for some reason there was no readily available guide for running Ollama and Open-WebUI in separate Docker containers on an Ubuntu server, accessible from a web browser anywhere on the local network. Here are my notes on what I did to get it working. Feel free to try them:

Create the Ollama container, pulling the image and attaching a named volume for persistence:

docker run -d --name ollama -p 11434:11434 -v ollama_volume:/root/.ollama ollama/ollama:latest
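The named volume ollama_volume is what keeps downloaded models around even if the container is removed and recreated. If you want to confirm Docker created it (optional sanity check; the volume name matches the command above):

docker volume inspect ollama_volume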

Check to ensure the container was created:

docker ps
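If the container is up, the Ollama API should be listening on port 11434. As another optional check, Ollama's root endpoint returns a short status message, so this should print "Ollama is running":

curl http://localhost:11434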

Create the Open-WebUI container, pulling the image and attaching its own volume:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
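A note on the flags: --add-host=host.docker.internal:host-gateway lets the Open-WebUI container reach services published on the host, which is how it finds Ollama on port 11434. In my setup the auto-detection worked, but if Open-WebUI doesn't find Ollama on its own, the project documents an OLLAMA_BASE_URL environment variable you can pass to point it there explicitly. A sketch of that variant:

docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main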

Pull llama3.2 into Ollama and run it as a test:

docker exec -it ollama ollama run llama3.2

To exit, type /bye
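If you'd rather download the model without opening an interactive chat, ollama pull works too, and you can confirm the model answers by hitting the REST API directly (the endpoint and JSON fields below follow Ollama's documented /api/generate API):

docker exec ollama ollama pull llama3.2
docker exec ollama ollama list
curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'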

Connect using a browser:

[server IP]:3000
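For example, if your Ubuntu server's LAN address were 192.168.1.50 (a hypothetical address; substitute your server's actual IP), you would browse to:

http://192.168.1.50:3000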

Keep in mind this is just a basic, CPU-only instance. It is slow compared to GPU-accelerated LLMs.
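I haven't tested this part myself, but if you have an NVIDIA GPU and the NVIDIA Container Toolkit installed on the host, Ollama's image can use it. A sketch of the GPU-enabled variant of the first command (assuming the toolkit is already set up):

docker run -d --gpus=all --name ollama -p 11434:11434 -v ollama_volume:/root/.ollama ollama/ollama:latest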

I'll share this on Open-WebUI as well, so you may find it there too. Stack Overflow has helped me tremendously with various projects over the years, so I wanted to return the favor where I can. Have a fantastic day!

