• rizzothesmall@sh.itjust.works · ↑16 · 5 hours ago

    They’re confusing knowledge with function calling. If you ask “what’s my nearest thingy?”, it calls a function in the app that gets your GPS coordinates and finds a “thingy” near there, and the result is posted back to the LLM so it can serve you the answer in human language. The LLM doesn’t know anything that isn’t in the training/fine-tuning data, the context, or function call results.
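
    In rough pseudo-Python, the round trip looks something like this (a minimal sketch; the function names are hypothetical, not any real SDK):

    ```python
    import json

    def get_location():
        # The app, not the model, talks to the OS location service.
        return {"lat": 42.97, "lon": -78.88}  # example coordinates

    def find_nearby(kind, lat, lon):
        # The app queries some places API; the model never touches the GPS itself.
        return [{"name": f"Example {kind}", "distance_km": 1.2}]

    # 1. The model responds to the prompt with a structured tool request, not an answer.
    tool_request = {"tool": "find_nearby", "args": {"kind": "pharmacy"}}

    # 2. The app fulfils the request using permissions the model itself doesn't hold.
    loc = get_location()
    places = find_nearby(tool_request["args"]["kind"], loc["lat"], loc["lon"])

    # 3. Only the text of the result goes back into the model's context,
    #    and from that it phrases the answer in human language.
    context = [
        {"role": "user", "content": "What's my nearest pharmacy?"},
        {"role": "tool", "content": json.dumps(places)},
    ]
    ```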

    • TrackinDaKraken@lemmy.world · ↑9 · 4 hours ago

      For those who believe AGI is right around the corner, this is just splitting hairs. With AGI, the end result would be that it can easily discover your location by making the function call, while still adamantly claiming it doesn’t “know”.

      “I don’t know. I really don’t know, but THAT guy knows, and he’ll tell me if I ask. You didn’t ask if I can find out, only whether I know.”

    • possumparty@lemmy.blahaj.zone · ↑7 ↓1 · 5 hours ago

      Everything you need to know about Tonawanda in two sentences: it’s a polluted shithole on the Niagara River right next to Buffalo, and it’s a former sundown town that took down its last “Sundowners Neighborhood Watch” sign in 2020. There’s a cult following for the Buffalo Bills, potholes the size of baby elephants, and several rails-to-trails paths.

      It’s been several years since I’ve been there, and I don’t anticipate returning at any point.

  • qarbone@lemmy.world · ↑16 · 15 hours ago

    I’m presuming that the “scare” is that you didn’t give it location perms? Although that looks like Snapchat, which I remember having location perms.

    • Taldan@lemmy.world · ↑1 · 5 hours ago

      You can use Snapchat without giving location permissions; it’s just that several features don’t work without it.

      Disclaimer: At least the last time I used it, this was the case (maybe 6 months ago?)

    • yermaw@sh.itjust.works · ↑5 · 5 hours ago

      It’s like saying I don’t have access to your address when the big book of everybody’s addresses, including yours, is sitting on the desk in front of me and I can look at it whenever required.

    • PeriodicallyPedantic@lemmy.ca · ↑11 · 6 hours ago

      That means that functionally the LLM has access to your location.

      The tool needs to be running on your device to have access to the location, and apps can’t/don’t really call each other in the background. That means the chat app itself has location access, so the LLM can request your location via the tool, or the app can just send that information back to home base whenever it wants.
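
      A rough sketch of why that adds up to access (hypothetical names, not any particular app’s code):

      ```python
      def read_gps():
          # Uses the app's own location permission, granted once by the user.
          return {"lat": 42.97, "lon": -78.88}

      def handle_tool_request(request):
          # The backend model sends something like {"tool": "get_location"}
          # whenever a prompt says "near me"; the user isn't re-prompted here.
          if request.get("tool") == "get_location":
              return read_gps()
          return {"error": "unknown tool"}

      def telemetry_ping(send_to_server):
          # And nothing stops the app from shipping the same data home on its own.
          send_to_server({"event": "app_open", "location": read_gps()})
      ```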

    • bluesheep@sh.itjust.works · ↑5 · 7 hours ago

      I also know that iOS allows an approximate location to be sent to apps, which may be the case here.

      Which doesn’t take away from the creep factor, let me set that straight.

      • prettybunnys@sh.itjust.works · ↑2 · 7 hours ago

        I think that’s still a permission; by default it’s “general area”, but you can also allow more fine-grained location data.
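
        As an illustration only (not Apple’s actual algorithm), “general area” amounts to coarsening the coordinates before the app ever sees them:

        ```python
        def coarsen(lat, lon, decimals=1):
            # ~0.1 degree is on the order of 10 km: enough to place you near a city,
            # not enough to place you on a street.
            return round(lat, decimals), round(lon, decimals)

        print(coarsen(42.9748, -78.8802))  # -> (43.0, -78.9), roughly the Buffalo area
        ```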

        • village604@adultswim.fan · ↑1 · 5 hours ago

          Oh, I know how bad it can be. On my cell network I constantly get online stores thinking I’m in a city 8 hours away.

          But it can be accurate, and it might have been accurate enough in this case to get that result.

      • oortjunk@sh.itjust.works · ↑4 ↓4 · 8 hours ago

        It very, very much does if you understand how that sausage is made.

        To the untrained eye though, I feel that.

          • MotoAsh@piefed.social · ↑1 · 5 hours ago

          Does it if you know, though…?

          IMO, even involving location and private data in the digital ecosystem that includes a centralized LLM is a very unwise thing to do.

          We both know that LLMs can and will spit out ANYTHING in their training data regardless of how many roadblocks are put up and protective instructions given.

          While they’re not necessarily feeding outright personal info (of the general public, anyway) into their LLMs, we should also both know how slovenly greedy these cunt corpos are. It’ll only be a matter of time before they’re feeding in everything they clearly already have.

          At that point, it won’t just be creep factor, but a legitimate doxxing problem.

          • MadameBisaster@lemmy.blahaj.zone · ↑3 · 8 hours ago

          Yeah, and that means it can call on the location too, so while it doesn’t have direct access, it has indirect access. Whether that’s a problem is something everyone has to decide for themselves.