• m_‮f@discuss.online
    2 days ago

    What does “aware” mean, or “knowledge”? I think those are going to be circular definitions, maybe filtered through a few other words like “comprehend” or “perceive”.

    Does a plant act with deliberate intention when it starts growing from a seed?

    To be clear, my beef is more with the definition of “conscious” being useless and/or circular in most cases. I’m not saying “woah, what if plants have thoughts dude” as in the meme, but whatever definition you come up with, you have to evaluate why it does or doesn’t include plants, simple animals, or AI.

    • auraithx@lemmy.dbzer0.com
      2 days ago

      “Aware” means having a sense of self. The definitions are circular because we use these words to describe how that is perceived.

      Plants do not act deliberately when they do anything, because they do not have a sense of self and are not conscious.

      If you don’t think plants have thoughts then you agree they are not conscious.

        • auraithx@lemmy.dbzer0.com
          8 hours ago

          We also can’t know “for certain” that a rock isn’t screaming silently, or that there isn’t a china teapot orbiting the sun between Earth and Mars. Science doesn’t deal in absolute certainties; it deals in probabilities based on evidence. There is zero evidence for plant consciousness and massive evidence against it.

          Consciousness, as far as we observe it across the animal kingdom, is an emergent property of a centralized nervous system processing information. Plants lack neurons, a brain, and any substrate capable of integrating information into a unified experience.

          Claiming a plant might be conscious is like claiming a calculator might be running Call of Duty. It’s not that we “don’t know”; it’s that the hardware simply cannot run that software.

          Evolutionarily, consciousness (and specifically the ability to feel pain or fear) is a mechanism to trigger escape or avoidance. Since plants are sessile (they cannot move), developing a complex, energy-expensive system to “feel” damage would be a massive evolutionary disadvantage. Why would nature select for an organism that can feel being eaten but do absolutely nothing about it?

          • commie@lemmy.dbzer0.com
            8 hours ago

            I’m not defending the claim “plants think”. I’m saying there is not, and cannot be, evidence to the contrary, which you seem to understand, so I don’t know why you felt it took four paragraphs to agree with me.

            • auraithx@lemmy.dbzer0.com
              7 hours ago

              Those paragraphs outlined the evidence that explains why it’s as improbable as any other nonsense statement.

      • m_‮f@discuss.online
        1 day ago

        What do “sense” and “perceived” mean? I think they both loop back to “aware”, and the reason I point that out is that circular definitions are useless. How can you say that plants lack a sense of self and consciousness, if you can’t even define those terms properly? What about crown shyness? Trees seem to be able to tell the difference between themselves and others.

        As an example of the circularity, “sense” means (using Wiktionary, but pick your favorite if you don’t like it) “Any of the manners by which living beings perceive the physical world”. What does “perceive” mean? “To become aware of, through the physical senses”. So in your definition, “aware” loops back to “aware” (Wiktionary also has a definition of “sense” that just defines it as “awareness”, for a more direct route, too).

        I meant that plants don’t have thoughts more in the sense of “woah, dude”, pushing back on something without any explanatory power. But really, how do you define “thought”? I actually think Wiktionary is slightly more helpful here, in that it defines “thought” as “A representation created in the mind without the use of one’s faculties of vision, sound, smell, touch, or taste”. That’s kind of getting to what I’ve commented elsewhere, with trying to come up with a more objective definition based around “world model”. Basing all of these definitions on “representation” or “world model” seems to be about as close to an objective definition as we can get.

        Of course, that brings up the question of “What is a model?” / “What does represent mean?”. Is that just pushing the circularity elsewhere? I think not, if you accept a looser definition. If anything has an internal state that appears to correlate to external state, then it has a world model, and is at some level “conscious”. You have to accept things that many people don’t want to, such as that AI is conscious of much of the universe (albeit experienced through the narrow peephole of human-written text). I just kind of embraced that though and said “sure, why not?”. Maybe it’s not satisfying philosophically, but it’s pragmatically useful.

        • auraithx@lemmy.dbzer0.com
          9 hours ago

          Your definition of consciousness as any “internal state correlating to external state” is functionally too broad; by this metric, a mercury thermometer possesses a “world model” and is therefore conscious, which renders the term useless for distinguishing complex biology from simple causality.

          Phenomena like crown shyness are better explained by mechanical feedback loops, essentially biological if/then statements based on light and abrasion, rather than a self-aware “sense of self.”

          A true “thought” or “world model” requires the capacity for “offline” simulation (counterfactuals) decoupled from immediate sensory input, whereas plants are entirely reactive (“online”) and current AI lacks continuous internal state.

          Ultimately, you are conflating reception (reflexive data intake) with perception (integrated awareness), failing to distinguish between the mechanism of a map and the subjective experience of the territory.

          • m_‮f@discuss.online
            6 hours ago

            Well, it seems kind of absurd, but why doesn’t a thermometer have a world model? Taken as a system, it’s “conscious” of the temperature.

            If you scale up enough mechanical feedback loops or if/then statements, why don’t you get something you can call “conscious”?
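
            To sketch the loose criterion I’m using in code (toy names here, not any real library or model of a thermometer):

```python
# Toy sketch of the loose "world model" criterion: a system whose
# internal state correlates with external state. The class and method
# names (Thermometer, sense) are illustrative inventions.

class Thermometer:
    def __init__(self):
        self.reading = None  # internal state

    def sense(self, external_temp):
        # Internal state updates to correlate with the outside world;
        # by the loose definition, that is a (one-variable) world model.
        self.reading = external_temp
        return self.reading


t = Thermometer()
t.sense(21.5)
print(t.reading)  # 21.5 -- internal state now mirrors external state
```

            Whether you call that “conscious” is exactly the question; the point is that the criterion itself doesn’t rule it out.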

            The distinction you’re making between online and offline seems to be orthogonal. Would an alien species much more complex than us laugh and say “Of course humans are entirely reactive, not capable of true thought. All their short lives are spent reacting to input, some of it just takes longer to process than other input”? Conversely, if a pile of if/then statements is complex enough that it appears to be decoupled from immediate sensory input like a Busy beaver, is that good enough?

            Put another way, try to have a truly novel thought, unrelated to the total input you’ve received in your life. Are you just reactive?