This post was submitted on 13 Nov 2024
704 points (100.0% liked)

Technology

[–] [email protected] 10 points 4 months ago (1 children)

I just want a portable, self-hosted LLM for specific tasks like programming or language learning.

[–] [email protected] 9 points 4 months ago (1 children)

You can install Ollama in a Docker container and use it to pull models to run locally. Some are really small and still pretty effective: Llama 3.2 is only 3B, and some models are as small as 1B. You can access it through the terminal, or use something like Open WebUI for a more "ChatGPT"-like interface.
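
For anyone wondering what that looks like in practice, here is a minimal sketch (not from the comment above) that sends a prompt to a locally running Ollama server over its REST API. It assumes Ollama is listening on its default port 11434 and that a model tagged `llama3.2` has already been pulled; the `ask` helper is just an illustrative name.

```python
# Minimal sketch: query a local Ollama server over its REST API.
# Assumes Ollama is running on the default port (11434) and that
# `ollama pull llama3.2` has already been run to fetch the model.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def ask(prompt: str, model: str = "llama3.2") -> str:
    """Send one prompt to the local model and return its full reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Explain Python list comprehensions in two sentences."))
```

Open WebUI, mentioned above, essentially wraps this same API in a browser-based chat interface.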

[–] [email protected] 9 points 4 months ago (4 children)

So long, see you all in the next hype. Any guesses?

[–] [email protected] 9 points 4 months ago (6 children)

Seems to me the rationale is flawed. Even if it isn't strong or general AI, LLM-based AI has found a lot of uses. I also don't see the claimed ignorance about the limitations of current AI models among the people who actually work with them.

[–] [email protected] 8 points 4 months ago

Nice, looking forward to it! So much money and time wasted on pipe dreams and hype. We need to get back to some actually useful innovation.

[–] [email protected] 8 points 4 months ago (1 children)

Short the AI stocks before they crash!

[–] [email protected] 8 points 4 months ago

Sigh. I hope LLMs get dropped from the AI bandwagon, because I do think they have some really cool use cases, and I love just running my little local models. Cutting government spending like a madman, writing the next great American novel, or eliminating actual jobs are not those use cases.

[–] [email protected] 8 points 4 months ago

I hope it all burns.

[–] [email protected] 7 points 4 months ago

Fingers crossed.

[–] [email protected] 7 points 4 months ago

Oh nice, another Gary Marcus "AI hitting a wall" post.

Like his "Deep Learning Is Hitting a Wall" post on March 10th, 2022.

Indeed, not much has changed in the world of deep learning between spring 2022 and now.

No new model releases.

No leaps beyond what was expected.

\s

Gary Marcus is like a reverse Cassandra.

Consistently wrong, and yet regularly listened to, amplified, and believed.

[–] [email protected] 6 points 4 months ago (1 children)

The tech priests of Mars were right; death to abominable intelligence.

[–] [email protected] 5 points 4 months ago

There's no bracing for this. The OpenAI CEO said the same thing like a year ago, and people are still shovelling money at this dumpster fire today.
