this post was submitted on 21 Feb 2024

Technology

[–] [email protected] 31 points 1 year ago (2 children)

Direct link to the GitHub repo:
https://github.com/nickbild/local_llm_assistant?tab=readme-ov-file

It's a small model by comparison. If you want something that runs offline and actually comes close to ChatGPT 3.5, you'll want the Mixtral 8x7B model instead (running on a beefy machine):

https://mistral.ai/news/mixtral-of-experts/
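
For reference, here's a rough sketch of what running Mixtral 8x7B Instruct with the Hugging Face `transformers` library might look like (the settings below are just one possible setup; the full-precision weights need far more VRAM than a typical desktop GPU, so expect to quantize on anything smaller):

```python
# Rough sketch: Mixtral 8x7B Instruct via Hugging Face transformers.
# Assumes a machine with a lot of GPU memory; quantization (e.g. 4-bit) is needed on smaller rigs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # spread layers across available GPUs/CPU
    torch_dtype=torch.float16,
)

messages = [{"role": "user", "content": "Turn off the living room lights."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```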

[–] [email protected] 6 points 1 year ago (1 children)

Nice! That's a cool project, I'll have to give it a try. I love the idea of self-hosting local LLMs. I've been playing around with https://lmstudio.ai/ and it downloads models directly from Hugging Face.
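
If it helps anyone: LM Studio can also run a local server that speaks an OpenAI-style chat API, so you can script against whatever model you've loaded. A minimal sketch (the port is the default last I checked; the model name is just a placeholder, LM Studio answers with whichever model is loaded):

```python
# Minimal sketch: querying LM Studio's local OpenAI-compatible server.
# Assumes the local server is running (default http://localhost:1234) and a model is loaded.
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; the loaded model responds regardless
        "messages": [{"role": "user", "content": "Summarize what a local LLM is."}],
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```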

[–] [email protected] 2 points 1 year ago

There's also Ollama, which seems similar. Not sure if LM Studio is open source, but Ollama is.
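
Ollama also exposes a small HTTP API on localhost, so scripting it is about the same amount of code. A sketch, assuming you've already pulled a model (e.g. `ollama pull mistral`):

```python
# Minimal sketch: calling a locally running Ollama server.
# Assumes Ollama is installed and the "mistral" model has already been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "Explain the difference between Ollama and LM Studio in one sentence.",
        "stream": False,  # return a single JSON response instead of a token stream
    },
    timeout=120,
)
print(resp.json()["response"])
```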

[–] [email protected] 26 points 1 year ago (1 children)

Can we have smaller, more domain-specific models that don't require more than casual hardware? Like a small model for coding, one for medicine, one for history, and so on?

[–] [email protected] 16 points 1 year ago (1 children)

Check out Hugging Face! Honestly, fine-tuned models for specific domains seem very popular (if for nothing else, because training smaller models is just easier!).
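
As a concrete example, a small code-focused model runs fine on casual hardware. A sketch using the `transformers` pipeline API (the model below is just one example of a compact code model on Hugging Face; there are plenty of newer ones to pick from):

```python
# Sketch: running a small, code-specialized model on modest hardware.
# "Salesforce/codegen-350M-mono" is one example of a compact code model; swap in your own pick.
from transformers import pipeline

generator = pipeline("text-generation", model="Salesforce/codegen-350M-mono")

prompt = "def fibonacci(n):"
result = generator(prompt, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"])
```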

[–] [email protected] 6 points 1 year ago

That’s gonna be a no from me dawg