herseycokguzelolacak

joined 3 days ago
[–] [email protected] 10 points 7 hours ago

It's open source, which is way better than ClosedAI or anyone else. Show some gratitude, folks.

[–] [email protected] 4 points 8 hours ago (1 children)

This is beautiful

[–] [email protected] 5 points 9 hours ago

You mean like how Israel destroyed almost every hospital in Gaza?

Israel started this way. Now they are getting the same treatment they gave to Palestine or Lebanon.

[–] [email protected] 5 points 9 hours ago (2 children)

Iran is just defending itself from Israeli terrorism.

[–] [email protected] 1 points 1 day ago

Not off the top of my head, but there must be something. llama.cpp and vllm have basically solved the inference problem for LLMs. What you need is a RAG solution on top that also brings in web search.
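
Roughly, the layering looks like this. A minimal sketch, assuming a llama.cpp (`llama-server`) or vLLM instance already running with an OpenAI-compatible endpoint on localhost; the `web_search()` helper and the port are placeholders for whatever search backend and server config you actually use:

```python
# RAG-over-web-search sketch on top of a local OpenAI-compatible server
# (llama.cpp's llama-server and vLLM both expose one).
import requests

def web_search(query: str, k: int = 5) -> list[str]:
    """Placeholder: return the top-k result snippets for `query`."""
    raise NotImplementedError("plug in your search backend here")

def answer_with_context(question: str) -> str:
    snippets = web_search(question)
    context = "\n\n".join(snippets)
    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",  # assumed local endpoint/port
        json={
            "model": "local",  # llama.cpp ignores this; vLLM wants the served model name
            "messages": [
                {"role": "system", "content": "Answer using the provided context."},
                {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        },
        timeout=120,
    )
    return resp.json()["choices"][0]["message"]["content"]
```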

[–] [email protected] 5 points 2 days ago (2 children)

For coding tasks you need web search and RAG. It's not the size of the model that matters, since even the largest models find solutions online.

[–] [email protected] 89 points 2 days ago (5 children)

This is a "The worst person you know just made a great point" moment, isn't it?

[–] [email protected] 1 points 2 days ago

LLMs are great at automating tasks where we know the solution. And there are a lot of workflows that fall in this category. They are horrible at solving new problems, but that is not where the opportunity for LLMs is anyway.

[–] [email protected] 1 points 2 days ago

For VLMs I love Moondream2. It's a tiny model that punches way above its weight. Llama.cpp supports it.
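
If it helps, here's roughly what the transformers route looks like (a sketch based on the vikhyatk/moondream2 model card; the API has shifted between releases, so pin a revision in practice; the llama.cpp/GGUF route works too):

```python
# Sketch of running Moondream2 via Hugging Face transformers.
# Assumptions: the encode_image / answer_question helpers from the
# vikhyatk/moondream2 model card; the image path is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from PIL import Image

model_id = "vikhyatk/moondream2"
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

image = Image.open("photo.jpg")            # any local image
enc = model.encode_image(image)            # vision encoder pass
print(model.answer_question(enc, "Describe this image.", tokenizer))
```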