this post was submitted on 19 Oct 2025
736 points (98.5% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

[–] Aceticon@lemmy.dbzer0.com 1 points 3 days ago (2 children)

I haven't really done Neural Networks in two decades, and was under the impression that NNs pretty much dominate Machine Learning nowadays, whilst things like Genetic Algorithms are far less popular or not used at all anymore.

Is that the case?

[–] howrar@lemmy.ca 1 points 3 days ago

Neural networks are a class of models. Genetic algorithms are a class of learning algorithms. You use learning algorithms to train models. Genetic algorithms are a valid way of training neural networks, but this is not currently in vogue. They're typically trained via gradient descent.
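To make "trained via gradient descent" concrete, here's a minimal sketch (a toy NumPy example on XOR with a made-up two-layer network — illustrative only, not any particular library's training loop): the network is the model, and the repeated gradient updates are the learning algorithm.

```python
import numpy as np

# Toy setup: learn XOR with one hidden layer, trained by gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(2000):
    # Forward pass: continuous functions all the way through.
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: chain rule through those same continuous functions,
    # then nudge every weight a small step against its gradient.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)
```

There is no population and no selection here: a single set of weights is improved incrementally, which is the key contrast with a genetic algorithm.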

[–] Nikls94@lemmy.world 1 points 3 days ago (1 children)

I don’t know if it’s still the case today, but all those models behave like genetic algorithms and IF-functions with a little RNG sprinkled on top.

[–] Aceticon@lemmy.dbzer0.com 3 points 3 days ago* (last edited 3 days ago)

You mean that they're actually competing multiple variants of a model against each other to see which ones get closer to generating the expected results, and picking the best ones to create the next generation?

Because that's how Genetic Algorithms work and get trained, which is completely different from how Neural Networks work and get trained.
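The generational loop described above can be sketched like this (a toy Genetic Algorithm on the classic "OneMax" problem — evolve bit-strings toward all ones; the problem and all names are illustrative):

```python
import random

# Toy genetic algorithm: evolve bit-strings toward all ones ("OneMax").
random.seed(0)
GENES, POP, GENERATIONS = 20, 30, 60

def fitness(individual):
    return sum(individual)  # more ones = closer to the expected result

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
initial_best = max(fitness(ind) for ind in population)

for _ in range(GENERATIONS):
    # Compete the variants against each other; keep the fittest half.
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]
    # Create the next generation: crossover of two parents, plus a
    # sprinkle of random mutation.
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, GENES)
        child = a[:cut] + b[cut:]
        if random.random() < 0.3:
            i = random.randrange(GENES)
            child[i] ^= 1  # flip one bit
        children.append(child)
    population = parents + children

best = max(population, key=fitness)
```

Note the difference from gradient descent: nothing here needs a gradient, or even a differentiable model — only a way to score candidates and breed new ones.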

Also, the connections in Neural Networks don't use IF-functions at all: the output of a neuron is just a mathematical operation on the values of all its inputs (basically a continuous function applied to a weighted sum of the input numbers, though nowadays there are also cyclic elements). The whole thing is just floating-point values being passed down the network (or back up the network during training) whilst being transformed by some continuous function or other, with no discontinuity like you would get with IF involved.
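To make the "no IF-functions" point concrete, a single neuron can be sketched as follows (illustrative, assuming a tanh activation):

```python
import math

# One neuron: a weighted sum of the inputs passed through a smooth,
# continuous activation function -- no IF branches anywhere.
def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(z)  # continuous and differentiable, unlike a step/IF

out = neuron([0.5, -1.0], [2.0, 0.3], 0.1)  # z = 0.8, output = tanh(0.8)
```

Because tanh is smooth everywhere, gradients can flow back through it during training — which is exactly what a hard IF-style step function would break.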