this post was submitted on 27 Feb 2025
11 points (100.0% liked)

Technology

I somehow doubt this, but interesting read nonetheless

top 18 comments
[–] [email protected] 16 points 1 month ago* (last edited 1 month ago)

Here's a summary of the predictions made, from never all the way up to within the year. It seems to me that the closer you get to the dollar bill, the sooner the projections become.

"Some experts predict it will never happen..."

"Some experts argue that human intelligence is more multifaceted than what the current definition of AGI describes." (That AGI is not possible.)

"Most agree that AGI will arrive before the end of the 21st century."

"Some researchers who’ve studied the emergence of machine intelligence think that the singularity could occur within decades."

"Current surveys of AI researchers are predicting AGI around 2040"

"Entrepreneurs are even more bullish, predicting it around ~2030"

"The CEO of Anthropic, who thinks we’re right on the threshold—give it about 12 more months or so."

[–] [email protected] 12 points 1 month ago

Predictions across the field range from a few months to a few decades

lol worthless.

[–] [email protected] 11 points 1 month ago

Humanity won't be able to finish a lego build in the next 12 months.

[–] [email protected] 8 points 1 month ago

I tried searching the web for a particular comic - I think it might have been smbc - where each person's prediction of when the singularity would arrive was inversely proportional to how long they had left to live, but I can't find it.

The last panel was an old guy saying "The singularity will arrive by Friday! Hopefully before 5..."

[–] [email protected] 7 points 1 month ago (1 children)

I also doubt it, even after reading the article. I remain sceptical that we will ever even reach it.

[–] [email protected] 2 points 1 month ago

I think we'll sooner discover that human intelligence is more "on the rails" and a "super complex flow chart" than we'll discover AGI.

[–] [email protected] 6 points 1 month ago (1 children)

I will happily bet any amount of money and possessions at any ratio to any "expert" that we won't.

My entire life's earnings, every possession I own, times 100, at a 1000:1 ratio. Please, "experts", bet me; I would love to go all in on this prediction.

[–] [email protected] 5 points 1 month ago (1 children)

You're leaving money on the table if you don't leverage your bet first by taking out loans to bet more.

[–] [email protected] 2 points 1 month ago

Reminds me of a line from the recurring Daily Show “sports” segment: “Gambling. Because home ownership is a burden.”

[–] [email protected] 4 points 1 month ago* (last edited 1 month ago)

However, not everyone thinks AGI is a dead certainty. Some experts argue that human intelligence is more multifaceted than what the current definition of AGI describes. For example, some AI experts think of the human mind in terms of eight intelligences, of which “logical-mathematical” is just one (alongside it exists, for example, interpersonal, intrapersonal, and existential intelligence). Deep learning pioneer Yann LeCun thinks AGI should be rebranded to “advanced machine intelligence,” and argues that human intelligence is too specialized to be replicable. The report also suggests that, while AI can be an important tool in making new discoveries, it can’t make these discoveries on its own.

This is more realistic.

LLMs aren’t even close to intelligent, they just regurgitate information. Human thought is many times more complex. Even basic animals are more complex.

[–] [email protected] 3 points 1 month ago (1 children)
[–] [email protected] 1 points 1 month ago

Yawn. Ok buddy scientist.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago)

The singularity is an interesting idea, but further analysis indicates to me that physical barriers will prevent it from ever happening.
Yes, development proceeds at an increasing pace, but the barriers to further improvement grow even faster as we approach physical limits. So we are not approaching the singularity; we are approaching what may be the peak of fast progress, especially in living standards, where the peak may already have passed for the developed world.

Ray Kurzweil is a brilliant man, but I think he miscalculated regarding the singularity.

[–] [email protected] 2 points 2 weeks ago

Another classic case of someone just sharing an article and then getting bombed with downvotes.

@[email protected] I am sorry. Here is my upvote. Have a good day.

[–] [email protected] 2 points 1 month ago (2 children)

Scientists say it's possible, but let's see Lemmy users deny this possibility.

[–] [email protected] 7 points 1 month ago

I canceled my Popular Mechanics subscription 20 years ago because of all the improbable headlines like this. They keep doing it.

[–] [email protected] 3 points 1 month ago

A Boltzmann brain is possible. Just not bloody likely.
