• PeriodicallyPedantic@lemmy.ca · 10 hours ago

      That means that, functionally, the LLM has access to your location.

      The tool needs to run on your device to read the location, and apps can’t/don’t really call each other in the background. That means the chat app itself has location access, which means the LLM can request your location via the tool, or the app can just send that information back to home base whenever it wants. Roughly like the sketch below.
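      A minimal sketch of how that tool path might look on a phone, assuming a hypothetical tool named "get_location" and made-up type/handler names; only CoreLocation is a real API here:

          import CoreLocation

          // Hypothetical shape of a tool call emitted by the hosted model.
          struct ToolCall: Decodable {
              let name: String
          }

          final class LocationTool: NSObject, CLLocationManagerDelegate {
              private let manager = CLLocationManager()

              override init() {
                  super.init()
                  manager.delegate = self
                  // The app, not the model, holds the OS-level permission.
                  manager.requestWhenInUseAuthorization()
              }

              // When the model emits a "get_location" tool call, the app reads
              // the sensor locally and returns the coordinates as the tool
              // result, so the hosted model ends up with your location anyway.
              func handle(_ call: ToolCall) -> String {
                  guard call.name == "get_location", let loc = manager.location else {
                      return "location unavailable"
                  }
                  return "lat \(loc.coordinate.latitude), lon \(loc.coordinate.longitude)"
              }
          }

      And nothing in that flow stops the same app from posting the coordinates to its own servers outside the tool call entirely.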

    • yermaw@sh.itjust.works · 9 hours ago

      It’s like saying I don’t have access to your address when the big book of everybody’s addresses, including yours, is on the desk in front of me and I can look at it whenever required.

    • bluesheep@sh.itjust.works · 12 hours ago

      I also know that iOS can give apps only an approximate location, which may be the case here.

      Which doesn’t take away from the creep factor, let me set that straight.

      • prettybunnys@sh.itjust.works · 11 hours ago

        I think that’s still a permission; by default it’s “general area”, but you can also allow more fine-grained location data. Something like the sketch below.
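        If I have it right, on iOS 14+ that distinction is exposed to the app roughly like this (a sketch; the "NearbySearch" purpose key is made up and would have to be declared in Info.plist):

            import CoreLocation

            let manager = CLLocationManager()

            // Users can grant only approximate ("general area") location;
            // the app can check which grade it actually received.
            switch manager.accuracyAuthorization {
            case .fullAccuracy:
                print("precise location granted")
            case .reducedAccuracy:
                // Reduced accuracy is on the order of a few kilometers; the
                // app may ask for a one-time upgrade tied to a purpose string.
                manager.requestTemporaryFullAccuracyAuthorization(
                    withPurposeKey: "NearbySearch")
            @unknown default:
                break
            }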

        • village604@adultswim.fan · 9 hours ago

          Oh, I know how bad it can be. On my cell network I constantly get online stores thinking I’m in a city 8 hours away.

          But approximate location can be accurate, and it might have been enough in this case to get the result.

      • oortjunk@sh.itjust.works · 12 hours ago

        It very, very much does if you understand how that sausage is made.

        To the untrained eye, though, I feel that.

        • MotoAsh@piefed.social · 9 hours ago

          Does it if you know, though…?

          IMO, even involving location and private data in the digital ecosystem that includes a centralized LLM is a very unwise thing to do.

          We both know that LLMs can and will spit out ANYTHING in their training data regardless of how many roadblocks are put up and protective instructions given.

          While they’re not necessarily feeding outright personal info (of the general public, anyways) into their LLMs, we should also both know how slovenly greedy these cunt corpos are. It’ll only be a matter of time before they’re feeding in everything they clearly already have.

          At that point, it won’t just be creep factor, but a legitimate doxxing problem.

          • oortjunk@sh.itjust.works · 30 minutes ago

            This made me think of that “I, Robot” movie starring the Fresh Prince of Bel-Air, when he had that hologram of the guy whose murder he was trying to solve, and it only answered when he asked the right question. Definitely a tool call.

            Also, there was an AI in that movie that drove a bulldozer at the Fresh Prince and made the robots glow red angrily (and it probably had location data accessible to it, too).

            I’m not trying to say that it’s art imitating life or anything because even my elastic definition of art can’t be stretched that far, but it’s sure something!

        • MadameBisaster@lemmy.blahaj.zone · 12 hours ago

          Yeah, and that means it can call for the location too, so while it doesn’t have direct access, it has indirect access. Whether that’s a problem, everyone has to decide for themselves.