this post was submitted on 12 Dec 2024
15 points (100.0% liked)

LocalLLaMA

2792 readers

Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.

founded 2 years ago
Fixed it (sh.itjust.works)
submitted 3 months ago* (last edited 3 months ago) by [email protected] to c/[email protected]
 

Seriously though, does anyone know how to use openwebui with the new version?

Edit: if you go into the ollama container with sudo docker exec -it ollama bash, you can pull models, for example with ollama pull llama3.1:8b, and they will then be available.
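For reference, the steps from the edit look roughly like this as a full sequence (the container name "ollama" is an assumption; substitute whatever name your docker ps output shows):

```shell
# List running containers to find the Ollama one
# (assumed to be named "ollama" below)
sudo docker ps

# Open a shell inside the Ollama container
sudo docker exec -it ollama bash

# Inside the container: pull a model, then confirm it appears
ollama pull llama3.1:8b
ollama list
```

You can also skip the interactive shell and run the pull in one step, e.g. sudo docker exec -it ollama ollama pull llama3.1:8b.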

1 comment
[–] [email protected] 1 points 3 months ago

For some reason, there are now two model settings pages: one in the workspace, and another in the admin settings (the old one was moved there). The feature you are looking for was probably just moved to the admin settings page.