• Nikls94@lemmy.world · 5 hours ago

    I don’t know if it’s still the case today, but all those models behave like genetic algorithms and IF-functions with a little RNG sprinkled on top.

    • Aceticon@lemmy.dbzer0.com · 5 hours ago

      You mean that they’re actually competing multiple variants of a model against each other to see which ones get closer to generating the expected results, and picking the best ones to create the next generation?

      Because that’s how Genetic Algorithms work and get trained, which is completely different from how Neural Networks work and get trained.
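
      For illustration, a rough sketch of that kind of training loop might look something like this (the fitness function, population size, and mutation rate here are just made-up examples, not anyone's actual setup):

      ```python
      import random

      TARGET = [0.1, 0.5, -0.3, 0.9]

      def fitness(candidate):
          # Higher is better: negative squared distance to the target vector.
          return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

      def mutate(candidate, rate=0.1):
          # Small random perturbation of each gene.
          return [c + random.gauss(0, rate) for c in candidate]

      def crossover(a, b):
          # Pick each gene from one of the two parents.
          return [random.choice(pair) for pair in zip(a, b)]

      population = [[random.uniform(-1, 1) for _ in TARGET] for _ in range(50)]

      for generation in range(100):
          # Compete the variants: rank by fitness and keep the best ones...
          population.sort(key=fitness, reverse=True)
          parents = population[:10]
          # ...then breed and mutate the winners to create the next generation.
          population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(50)]

      best = max(population, key=fitness)
      ```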

      Also, the links in Neural Networks don’t use IF-functions at all: the output of a neuron is just a mathematical operation on the values of all its inputs (basically a weighted sum of the inputs passed through an activation function, though nowadays there are also cyclic elements). The whole thing is just floating-point values being passed down the network (or back up the network during training) whilst being transformed by some continuous function or other, with none of the discontinuities you would get with an IF involved.
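
      As a rough sketch of what a single neuron actually does (plain feed-forward case, with a sigmoid picked here just as one example of a smooth activation):

      ```python
      import math

      def neuron(inputs, weights, bias):
          # Weighted sum of all the inputs plus a bias term...
          z = sum(w * x for w, x in zip(weights, inputs)) + bias
          # ...squashed by a smooth, differentiable function (sigmoid here),
          # which is what lets gradients flow back up the network during
          # training. No IF-style branching anywhere in the data path.
          return 1.0 / (1.0 + math.exp(-z))

      output = neuron([0.2, -0.7, 1.3], weights=[0.5, -0.1, 0.8], bias=0.05)
      ```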