hotsox

joined 2 years ago
11
Jubilee Jim Fisk (en.wikipedia.org)
[–] [email protected] 23 points 9 months ago* (last edited 9 months ago)

Universal function approximation - neural networks; a large enough network can approximate any continuous function.

Auto-differentiation - the algorithmic calculation of partial derivatives (aka gradients).

Backpropagation - when training a neural network (or most ML models, actually), you compute the difference between the model's predictions and the true labels, and that error is sent backward through the network as gradients of the loss function.

Parameter dimensionality - the "neurons" in the neural network, i.e., the weight matrices. A minimal sketch of how these pieces fit together is below.
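
Here's a toy sketch, assuming PyTorch (my illustration, not something from this thread): the weight matrices are the parameters, `loss.backward()` is the auto-differentiation step, and the gradients it produces are what backpropagation sends back.

```python
# Hypothetical toy example: forward pass, loss, and backprop in PyTorch.
import torch
import torch.nn as nn

# A small feed-forward network; its weight matrices are the "parameters".
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

x = torch.randn(16, 4)  # a batch of 16 made-up inputs
y = torch.randn(16, 1)  # made-up labels

pred = model(x)                          # model prediction
loss = nn.functional.mse_loss(pred, y)   # difference from the labels

loss.backward()  # auto-differentiation: fills in gradients of the loss
print(model[0].weight.grad.shape)  # partial derivatives w.r.t. one weight matrix
```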

If that's your argument, it's worse than statistics, IMO. At least statistics has solid theorems and proofs (albeit under very controlled distributional assumptions). All DL has right now is a bunch of papers, published most often by large tech companies, which may or may not work for the problem you're working on.

The universal function approximation theorem is pretty dope tho. I'm not saying ML isn't interesting; some of it is, but most of it is meh. It's fine.
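
For context, a common informal statement of that theorem (my paraphrase of the classic Cybenko/Hornik result, not something from this thread):

```latex
% Universal approximation, one hidden layer (informal): for any continuous
% f on a compact set K \subset \mathbb{R}^n and any \varepsilon > 0, there
% exist N and weights v_i, w_i, b_i such that
g(x) = \sum_{i=1}^{N} v_i \,\sigma\!\left(w_i^{\top} x + b_i\right),
\qquad \sup_{x \in K} \left| f(x) - g(x) \right| < \varepsilon,
% where \sigma is a fixed non-polynomial activation (e.g. a sigmoid).
```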

[–] [email protected] 15 points 2 years ago

Bojack Horseman.

[–] [email protected] 37 points 2 years ago (4 children)

If you're in the US, you can usually borrow audiobooks from public libraries for free through an app. It's very convenient.

[–] [email protected] 3 points 2 years ago

It's not the same comparison tho. Tech is a far more lucrative career than coal mining or primary education, so inequity in tech affects a lot more than inequity in those other fields.