I haven’t really done Neural Networks in two decades, and was under the impression that NNs pretty much dominate Machine Learning nowadays, whilst things like Genetic Algorithms are way less popular or hardly used at all anymore.
Is that the case?
I don’t know if it is the case for the world today, but all those models behave like genetic algorithms and IF-functions with a little RNG sprinkled on top of them.
You mean that they’re actually pitting multiple variants of a model against each other to see which ones get closer to generating the expected results, and picking the best ones to create the next generation?
Because that’s how Genetic Algorithms work and get trained, which is completely different from how Neural Networks work and get trained.
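To make the contrast concrete, here’s a minimal sketch of that select-and-mutate loop in Python (the fitness function, population size, and mutation rate are all made up for illustration):

```python
import random

def fitness(candidate):
    # Toy fitness: how close the candidate vector is to an arbitrary target.
    target = [0.5, -1.0, 2.0]
    return -sum((c - t) ** 2 for c, t in zip(candidate, target))

def mutate(candidate, rate=0.1):
    # Perturb each gene with a little RNG.
    return [c + random.gauss(0, rate) for c in candidate]

# Start with a random population of candidate "models".
population = [[random.uniform(-3, 3) for _ in range(3)] for _ in range(20)]

for generation in range(100):
    # Compete the variants against each other: rank them by fitness.
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]  # pick the best ones
    # Next generation: mutated copies of the survivors.
    population = [mutate(random.choice(survivors)) for _ in range(20)]

print(max(population, key=fitness))
```

Notice there is no gradient anywhere: the only feedback is the fitness ranking, which is exactly what makes this a different training regime from backpropagation.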
Also, the links in Neural Networks don’t use IF-functions at all: the output of a neuron is just a mathematical operation on the values of all its inputs (basically a weighted sum of the input values passed through an activation function, though nowadays there are also recurrent, i.e. cyclic, connections). The whole thing is just floating-point values being passed down the network (or back up the network during training) whilst being transformed by some continuous function or other, with no discontinuities like you would get with IFs involved.
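For comparison, here’s a minimal sketch of what a single neuron computes, with made-up weights and a tanh activation (any smooth activation would do):

```python
import math

def neuron(inputs, weights, bias):
    # A neuron is just arithmetic: a weighted sum of its inputs...
    pre_activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    # ...passed through a smooth, continuous activation function.
    return math.tanh(pre_activation)

# Floating-point values flow forward through the network; during training,
# gradients of these same continuous functions flow back the other way.
hidden = [neuron([0.2, -0.7], w, b)
          for w, b in [([0.5, 1.2], 0.1), ([-0.3, 0.8], -0.2)]]
output = neuron(hidden, [1.0, -1.5], 0.0)
print(output)
```

Everything here is differentiable end to end, which is precisely what gradient-based training exploits and what an IF-style branch would break.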