this post was submitted on 21 Sep 2023
279 points (97.0% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

This is a science community. We use the Dawkins definition of meme.



 
top 12 comments
[–] [email protected] 32 points 2 years ago* (last edited 2 years ago) (2 children)

Dude on the right is correct that perturbed gradient descent with threshold functions and backprop feedback was implemented before most of us were born.
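For reference, here is roughly what that decades-old recipe looks like, as a toy sketch (the layer sizes, learning rate, and XOR task are illustrative, not anyone's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Smooth "threshold" activation, in use since at least the 1980s.
    return 1.0 / (1.0 + np.exp(-x))

# Classic toy task: XOR is not linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small random ("perturbed") initial weights, one hidden layer.
W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))

lr = 1.0
for _ in range(5000):
    # Forward pass through the threshold units.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: the chain rule, i.e. backprop feedback.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Plain gradient descent step.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0, keepdims=True)

print(out.round(2))  # should land near [[0], [1], [1], [0]]
```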

The current boom is an embarrassingly parallel task meeting an architecture designed to run that kind of task.
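To put a number on that, here is a toy sketch (sizes invented): the core workload is mostly huge piles of independent multiply-accumulates, and that independence is exactly what vectorized and GPU hardware is built to exploit.

```python
import time
import numpy as np

rng = np.random.default_rng(1)
n = 256
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Naive loop: note that every output cell is independent of every other.
t0 = time.perf_counter()
C_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        C_loop[i, j] = A[i, :] @ B[:, j]
t_loop = time.perf_counter() - t0

# Because the cells are independent, the whole job collapses into one
# vectorized BLAS call (or, at scale, one GPU kernel launch).
t0 = time.perf_counter()
C_vec = A @ B
t_vec = time.perf_counter() - t0

assert np.allclose(C_loop, C_vec)
print(f"loop: {t_loop:.3f}s   vectorized: {t_vec:.5f}s")
```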

[–] [email protected] 17 points 2 years ago

> The current boom is an embarrassingly parallel task meeting an architecture designed to run that kind of task.

Plus organizations outside of the FAANGs have hit critical mass on data that's actually useful for mass-comparison multiple-correlation analyses, and data-as-a-service platforms are making things seem sexier to management in those organizations.

[–] [email protected] 11 points 2 years ago (1 children)

Random, but why are "embarrassing" or similar adjectives so often used to describe a parallel program? What's embarrassing about it?

[–] [email protected] 14 points 2 years ago (1 children)

"Embarrassingly parallel" is just the term for a process that can be perfectly parallelized.

[–] [email protected] 6 points 2 years ago (3 children)

rather odd choice of adjective though

[–] [email protected] 13 points 2 years ago

I think the usage implies it's so easy to parallelize that any competent programmer should be embarrassed if they weren't running it in parallel. Whereas many classes of problems can be extremely complex or impossible to parallelize, and running them sequentially would be perfectly acceptable.
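In code, the "embarrassing" part is how little effort it takes. A toy sketch (the work function and inputs are made up):

```python
from multiprocessing import Pool

def score(item: int) -> int:
    # Each item is handled with no shared state and no communication
    # between workers; that independence is the whole trick.
    return item * item

if __name__ == "__main__":
    items = list(range(1000))
    with Pool() as pool:                  # one worker per CPU core
        results = pool.map(score, items)  # trivially split across workers
    print(sum(results))
```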

[–] [email protected] 2 points 2 years ago

It's commonly used in some corners of computer science.

[–] [email protected] 1 points 2 years ago

It's in the same spirit as the phrase "an embarrassment of riches". So a bit of an archaic usage.

[–] [email protected] 11 points 2 years ago* (last edited 2 years ago)

Man, I don't know. I had an introductory lecture on ML and we were told about some kernel stuff, where you implicitly map into a feature space that could be infinite-dimensional, yet the kernel function lets you do all the math in the original low-dimensional space, and your separation still works.

That isn't some black box art form, that is clearly black magic.
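For anyone who wants to inspect the spellbook, here is a tiny sketch of the trick (points and kernel choice made up), using the degree-2 polynomial kernel because its feature map is small enough to write out, unlike the RBF kernel's infinite-dimensional one:

```python
import numpy as np

# Two points in ordinary 2-D input space.
x = np.array([1.0, 2.0])
z = np.array([0.5, -1.0])

# Degree-2 polynomial kernel: k(x, z) = (x . z)^2, computed entirely in 2-D.
k = (x @ z) ** 2

def phi(v):
    # Explicit feature map for that kernel: (v1^2, v2^2, sqrt(2)*v1*v2).
    return np.array([v[0] ** 2, v[1] ** 2, np.sqrt(2) * v[0] * v[1]])

# Same number, computed the hard way in the higher-dimensional space.
assert np.isclose(k, phi(x) @ phi(z))

# The RBF kernel plays the same game, except its implicit feature space
# is infinite-dimensional: phi can never be written down, but
# k(x, z) = exp(-||x - z||^2 / 2) is still an inner product there.
k_rbf = np.exp(-np.sum((x - z) ** 2) / 2.0)
print(k, k_rbf)
```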

[–] [email protected] 6 points 2 years ago

They reached the right end pretty quickly. One of the reasons I gave up on ML rather fast: hyperparameter tuning is really, really random.

[–] [email protected] 4 points 2 years ago* (last edited 2 years ago)

There is truth in this, but it isn't as true as some people seem to think. It's true that trial and error is a real part of working in ML, but it isn't just luck whether something works or not. We do know why some models work better than others for many tasks, there are cases in which some manual hyperparameter tuning is good, there has been a lot of progress in the last 50 years, and so on.
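For instance, the trial and error is usually structured rather than blind. A sketch using scikit-learn's RandomizedSearchCV (the model, parameter ranges, and data here are placeholders):

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# Sample hyperparameter candidates from sensible ranges and keep
# whatever cross-validates best: randomness, but with bookkeeping.
search = RandomizedSearchCV(
    SVC(kernel="rbf"),
    param_distributions={"C": loguniform(1e-2, 1e2),
                         "gamma": loguniform(1e-4, 1e0)},
    n_iter=20,
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```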

[–] [email protected] 2 points 2 years ago

Maybe we should try sacrificing a farm animal to ML. If we're getting into the realm of magic, there are established practices going back thousands of years.