• m_‮f@discuss.online
    1 day ago

    What do “sense” and “perceived” mean? I think they both loop back to “aware”, and the reason I point that out is that circular definitions are useless. How can you say that plants lack a sense of self and consciousness, if you can’t even define those terms properly? What about crown shyness? Trees seem to be able to tell the difference between themselves and others.

    As an example of the circularity, “sense” means (using Wiktionary, but pick your favorite if you don’t like it) “Any of the manners by which living beings perceive the physical world”. What does “perceive” mean? “To become aware of, through the physical senses”. So in your definition, “aware” loops back to “aware” (Wiktionary also has a definition of “sense” that just defines it as “awareness”, for a more direct route, too).

    I meant that plants don’t have thoughts more in the sense of “woah, dude”, pushing back on something without any explanatory power. But really, how do you define “thought”? I actually think Wiktionary is slightly more helpful here, in that it defines “thought” as “A representation created in the mind without the use of one’s faculties of vision, sound, smell, touch, or taste”. That gets at what I’ve commented elsewhere, about trying to come up with a more objective definition based around “world model”. Basing all of these definitions on “representation” or “world model” seems to be about as close to an objective definition as we can get.

    Of course, that brings up the question of “What is a model?” / “What does represent mean?”. Is that just pushing the circularity elsewhere? I think not, if you accept a looser definition. If anything has an internal state that appears to correlate to external state, then it has a world model, and is at some level “conscious”. You have to accept things that many people don’t want to, such as that AI is conscious of much of the universe (albeit experienced through the narrow peephole of human-written text). I just kind of embraced that though and said “sure, why not?”. Maybe it’s not satisfying philosophically, but it’s pragmatically useful.
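    To make that loose criterion concrete, here’s a minimal toy sketch (my own example, not anything from the thread; the signal, the smoothing constant, and the 0.95 bar are all arbitrary): a thermometer-like tracker whose internal state is just a moving average of an external signal still ends up highly correlated with it, so it clears this definition of “world model”.

```python
import math

# External state: a slowly varying "temperature" signal.
external = [20 + 5 * math.sin(t / 50) for t in range(1000)]

# Internal state: an exponential moving average tracking it,
# i.e. the dumbest possible "thermometer".
internal, state = [], 20.0
for reading in external:
    state += 0.3 * (reading - state)  # thermometer-like tracking
    internal.append(state)

# Pearson correlation between internal and external trajectories.
n = len(external)
mx, my = sum(external) / n, sum(internal) / n
cov = sum((x - mx) * (y - my) for x, y in zip(external, internal))
r = cov / (sum((x - mx) ** 2 for x in external)
           * sum((y - my) ** 2 for y in internal)) ** 0.5

print(r > 0.95)  # True: a mere tracker clears the bar
```

    Which is exactly the bullet being bitten here: under the loose definition, this tracker is “conscious” of the temperature.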

    • auraithx@lemmy.dbzer0.com
      9 hours ago

      Your definition of consciousness as any “internal state correlating to external state” is functionally too broad; by this metric, a mercury thermometer possesses a “world model” and is therefore conscious, which renders the term useless for distinguishing complex biology from simple causality. Phenomena like crown shyness are better explained by mechanical feedback loops, essentially biological if/then statements based on light and abrasion, rather than a self-aware “sense of self.” A true “thought” or “world model” requires the capacity for “offline” simulation (counterfactuals) decoupled from immediate sensory input, whereas plants are entirely reactive (“online”) and current AI lacks continuous internal state. Ultimately, you are conflating reception (reflexive data intake) with perception (integrated awareness), failing to distinguish between the mechanism of a map and the subjective experience of the territory.
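      The “biological if/then” reading of crown shyness can be sketched in a few lines (a hypothetical toy model, not botany; the gap size, margin, and one-dimensional growth are all invented for illustration): each branch tip extends until a purely local trigger fires, and the canopy gap stabilizes with no tree modeling the other.

```python
GAP = 100          # distance between two trunks (arbitrary units)
STOP_MARGIN = 10   # local trigger: stop when this close to the neighbor

def grow(left_tip, right_tip):
    """One season: each tip extends unless its local if/then fires."""
    if (GAP - right_tip) - left_tip > STOP_MARGIN:   # room remains
        left_tip += 1
    if (GAP - left_tip) - right_tip > STOP_MARGIN:
        right_tip += 1
    return left_tip, right_tip

left, right = 0, 0
for _ in range(200):
    left, right = grow(left, right)

print(left, right, GAP - left - right)  # 45 45 10: gap settles at the margin
```

      No state referring to a “self” appears anywhere; the stable gap falls out of two blind threshold rules.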

      • m_‮f@discuss.online
        edited · 6 hours ago

        Well, it seems kind of absurd, but why doesn’t a thermometer have a world model? Taken as a system, it’s “conscious” of the temperature.

        If you scale up enough mechanical feedback loops or if/then statements, why don’t you get something you can call “conscious”?

        The distinction you’re making between online and offline seems orthogonal to consciousness. Would an alien species much more complex than us laugh and say “Of course humans are entirely reactive, not capable of true thought. All their short lives are spent reacting to input; some of it just takes longer to process than other input”? Conversely, if a pile of if/then statements is complex enough that it appears decoupled from immediate sensory input, like a Busy Beaver machine, is that good enough?
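        For concreteness, a Busy Beaver really is just a pile of if/then statements. This is the standard 2-state, 2-symbol champion machine (the table is the known BB-2 winner; the simulator code itself is a minimal sketch): four rules, no sensory input at all, yet it churns “offline” before halting.

```python
# 2-state, 2-symbol busy beaver: (state, symbol) -> (write, move, next_state)
RULES = {
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "H"),  # H = halt
}

def run():
    """Run the machine on a blank tape; return (steps taken, ones written)."""
    tape, head, state, steps = {}, 0, "A", 0
    while state != "H":
        write, move, state = RULES[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
        steps += 1
    return steps, sum(tape.values())

print(run())  # (6, 4): halts after 6 steps, leaving 4 ones on the tape
```

        Scale the rule table up a few states and the behavior becomes wildly decoupled from anything an observer would call “immediate input”, which is the point of the question above.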

        Put another way, try to have a truly novel thought, unrelated to the total input you’ve received in your life. Are you just reactive?