• altphoto@lemmy.today
    45 minutes ago

    I’ve never seen a tree wearing a ring, but they say if you count the rings you can tell how old it is. Sounds like an asshole playa to me.

  • Viking_Hippie@lemmy.dbzer0.com
    9 hours ago

    How to know thing

    Know thing.

    How to know thing from past

    Remember thing.

    How to know thing from future

    Predict thing.

    I’ll take my payment in a combination of Catalan Peseta, Cypriot Lira, and D-Marks, thanks.

  • tomiant@piefed.social
    13 hours ago

    To determine the age of a tree you would need to examine and perform measurements on the tree so as to learn its age.

    • stupidcasey@lemmy.world
      10 hours ago

Could that include the incriminating of predetermined units within the constraints given from within the tree and its altercations of the environment?

  • expatriado@lemmy.world
    13 hours ago

Trees are already 4-12 years old when planted. I guess that margin of error won't matter as much for really old trees.
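A minimal sketch of the point above, assuming you know the year the tree was planted (the 4-12 year nursery-age offset is taken from the comment; the function name and signature are my own illustration):

```python
def age_range(planting_year: int, current_year: int,
              nursery_min: int = 4, nursery_max: int = 12) -> tuple[int, int]:
    """Estimate a planted tree's possible age range.

    Years since planting, plus the typical age of nursery stock at
    planting time (assumed 4-12 years, per the comment above).
    """
    since_planting = current_year - planting_year
    return since_planting + nursery_min, since_planting + nursery_max

# A tree planted in 2000, checked in 2024, would be somewhere
# between 28 and 36 years old under these assumptions.
print(age_range(2000, 2024))
```

As the comment notes, the 8-year spread matters a lot for a 30-year-old tree and hardly at all for a 300-year-old one.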

  • F/15/Cali@threads.net@sh.itjust.works
    13 hours ago

    Considerations: years are counted differently in some cultures. If you’re part of an uncontacted tribe, you might have an alternative calendar or numerical system.

    Trees: trees trees trees trees trees prompt: how to trees trees trees trees trees trees

    • hoshikarakitaridia@lemmy.world
      13 hours ago

“Figure it out” is not technically wrong, but it’s worse than “it depends”, so it’s definitely very unhelpful, which is kind of the opposite of what an LLM should be.

      • Switorik@lemmy.zip
        12 hours ago

I’m only playing devil’s advocate. I’m not a fan of LLMs myself.

You need to know how to use the tool to get the correct output. In this case, it’s giving you a literal answer. Craft your question so that it gives you what you’re looking for. Look up “prompt engineer” for a more thorough answer. It’s how we thought LLMs were going to be to begin with.

        • hoshikarakitaridia@lemmy.world
          5 hours ago

Disagree. The short-term solution is for you to change your prompt, but it is definitely a shortcoming of the AI when the answer is strictly useless.

          It’s like crime: it should be safe everywhere anytime because of police and laws, but since it’s not, you can’t go everywhere anytime. That’s not on you, but you have to deal with it.

        • wischi@programming.dev
          8 hours ago

          Though the phrase “prompt engineer” is so funny. Has literally nothing to do with engineering at all. Like having a PhD in “Google Search” 🤣