Lemmy, I really would like to hear your opinions on this. I am bipolar. After almost a decade of being misdiagnosed and on medication that made my manic symptoms worse, I found stable employment with good insurance and have been able to find a good psychiatrist. I’ve been consistently medicated for the past 3 years, and this is the most stable I have been in my entire life.

The office has rolled out an app called MYIO. My knee-jerk reaction was to be unhappy about it, but I managed my emotions, took a breath, and vowed to give it a chance. After I was sent the link to validate my account, the app would force-restart my phone at the last step of activation. (I have my phone locked down pretty tight, with lots of Google shit and data sharing disabled, so I’m thinking that might be the cause. My phone is also 4-5 years old, so that could be it too.)

Luckily, I was able to complete the steps on PC and activate that way. Once I was in the account there were standard forms to sign, like the HIPAA release. There was also a form requesting that I consent to the use of AI. Hell to the NO. That’s a no for me dawg.jpg.

I’m really emotional and not thinking rationally. I am hoping for the opinions of cooler heads.

If my doctor refuses to keep me as a patient unless I consent to AI, what should I do? What would you do? Refuse, even though it means losing a provider I have a rapport with, who knows me well enough to know when my meds need adjusting? Or consent, even though this is a major line in the sand for me?

EDIT: This is the text of the AI agreement: As part of their ongoing commitment to provide the best possible service, your provider has opted to use an artificial intelligence note-taking tool that assists in generating clinical documentation based on your sessions. This allows for more time and focus to be spent on our interactions instead of taking time to jot down notes or trying to remember all the important details. A temporary recording and transcript or summary of the conversation may be created and used to generate the clinical note for that session. Your provider then reviews the content of that note to ensure its accuracy and completeness. After the note has been created, the recording and transcript are automatically deleted.

This artificial intelligence tool prioritizes the privacy and confidentiality of your personal health information. Your session information is strictly used for the purpose of your ongoing medical care. Your information is subject to strict data privacy regulations and is always secured and encrypted. Stringent business associate agreements ensure data privacy and HIPAA compliance.

  • slazer2au@lemmy.world · 1 day ago

    I would nope the fuck out and change doctors. A regurgitation machine prone to hallucinations has no place in medical care.

    • oneser@lemmy.zip · 1 day ago

      If this was for a GP, I would agree with this stance. But a good, fitting and competent mental health professional can be harder to find.

      • applebusch@lemmy.blahaj.zone · 1 day ago

        That’s the last fucking profession who should be using LLMs… People can gaslight themselves with chatbots without paying for a trusted professional to reinforce that bullshit.

        • oneser@lemmy.zip · 15 hours ago (edited)

          OP didn’t state this clearly, but I went and looked. The app is not for replacing consults, only billing etc., so I’d put it in the “annoying, but not world-ending” category.

      • Zos_Kia@jlai.lu · 20 hours ago

        By god, they’re going to make OP change doctors just because they hate “le stochastic parrot”. And OP is probably in the US, which makes the whole thing even crueller.

        Literally a horde of teenagers playing with a bipolar’s head because they have big feelings about stuff.

        And all this for a fucking note taking app Jesus Christ. Yeah sure OP is probably risking their mental health in the process but who gives a shit about that when you have an occasion to proclaim that le AI bad.

        • Washedupcynic@lemmy.ca (OP) · 9 hours ago

          I am concerned about what is done with the data generated via the saved recording and transcription. Yes, I live in the USA. Our government is currently kidnapping people off the street and disappearing them for being brown. They are attempting to build databases identifying trans people. So yeah, I’m concerned that the third party my doctor is using, MYIO, could sell the data/transcripts, and before I know it I end up on a government list and get disappeared because I am gay. Could the theft of the data being generated by the app lead to identity theft? MYIO says the videos aren’t stored long term and everything is encrypted, but companies lie, and the monetary penalties are just rolled into the cost of doing business. This isn’t a note-taking app; there are already plenty of transcribers on the market. This is something entirely different.

          I’ve already had my identity stolen and credit cards opened in my name.

          And no one is going to “MAKE” me change doctors. That’s something I decide for myself.

        • WhyJiffie@sh.itjust.works · 17 hours ago

          you seem to have no clue about the problem at hand. it’s the lesser of the issues that the AI transcriber could hallucinate. the worse problem, which is irreversible, is that the treatment session and every private detail that gets discussed is funneled to at-best-questionable companies who will do whatever they want with your private information. once that has happened, you can’t just make them delete what they stored in the process; it is completely unverifiable what they do beyond offering the original service. everything that was told in the session will not stay between the two of you.
          accepting this unknowingly is very dangerous. accepting it knowingly will alter what you say, and the results with it, like going to a therapist who you know personally, which is not allowed for very good reasons.

          • Zos_Kia@jlai.lu · 17 hours ago

            You think therapists and doctors in general don’t use Docs or Notes services that are hosted or backed up in the cloud? You think having your medical data leaked to tech companies is new? Just because the note transcription app is AI doesn’t make it magically worse. In fact it makes the data harder to access, as you need to re-infer the whole enchilada if you want to mine it (as opposed to, say, Google Drive, which can just run a SQL query on your data and get it structured and ready to use).

            It’s nice that mental health is so inconsequential to you that you can balance it against privacy purity politics. It’s really cool for you that you’re in this position of privilege. It’s not cool to be pushing on someone with a clinical condition in a way that will probably get them worse off, in a country with absolutely no mental health safety net. Just like antivax it’s coated in fake concern, but you’re playing a dangerous game with someone else’s life and you’re cool with it because you’re insulated from the consequences.

            You guys really are a pure product of those amoral hyper-individualistic times.

      • phoenixarise@lemmy.world · 1 day ago (edited)

        I don’t believe that. They just don’t want to pay them what they’re worth. Machines don’t ask for days off or health insurance, that’s their rationale. I hope they go out of business.

    • originalucifer@moist.catsweat.com · 1 day ago

      you do know at some point the whole ‘hallucinations’ line is going to be as fresh as calling things ‘woke’, right?

      the ‘does this thing have ai in it’ is already a fucking blur as businesses link to each other via private and public APIs… healthcare is no different.

      these things are already in place in many places. if youre a part of any nationwide health service, youre already impacted.

      its like the fact that a huge % of our GDP is tied to like 10 companies… you cannot live your life in the modern united states without suffering products or services from those 10 companies, full stop. your life with ai will look the same.

      can you work hard to avoid shit and cry about it? yep. yep you can… but thats about it.

      • OwOarchist@pawb.social · 1 day ago (edited)

        you do know at some point the whole ‘hallucinations’ line is going to be as fresh as calling things ‘woke’, right?

        The truth doesn’t care whether it’s “fresh” or not.

        As long as AI still hallucinates, it will be useful for entertainment purposes only and never for anything as serious as healthcare.

        your life with ai will look the same.

        lol, tell that to every other business fad that has come and gone.

        The AI bubble will pop, the economy will crash, and in the long run, that will be a good thing.

        • THE_GR8_MIKE@lemmy.world · 1 day ago

          Dude must be some MBA crypto bro AI slop jock. His grammar isn’t good enough to be one of those idiot CEOs who just learned what artificial intelligence is. Maybe he’s a shareholder for one of those soul-less companies. Probably not that either though. Perhaps he’s just a terrible artist or programmer who uses AI slop for all of his works of shart. The possibilities really are endless these days.

          • originalucifer@moist.catsweat.com · 1 day ago

            im an ex corp drone whose value was replacing humans with automation.

            it sucks, it already exists, it will happen more. llms are already in these pipelines and theres nothing any of us can do to avoid it.

            im not saying its good. im not saying it should be. im saying, it exists right now cuz ive been a part of it.

            • phoenixarise@lemmy.world · 1 day ago (edited)

                  Oh okay, so your only value is the pursuit of material bullshit and not the well being of human beings. Good luck getting AI to pay for your shitty wares when nobody makes money to afford them. 🤭

                  I have no idea what it’s like to be you, and I’m glad I don’t. Enjoy your cold empty heart! 🙂

              • originalucifer@moist.catsweat.com · 1 day ago

                    you lack reading comprehension skills. the value was to my ex employer, obviously.

                    i love how in your infinite ignorance you ignored the fact i bolted purposefully. fuck off you piece of shit

                    and the other point you seem to be missing is the internal pipelines this shit is currently fucking automating. its not about end users using tokens, or idiots generating images with 6 fingers. its about businesses moving and processing data, replacing humans that used to do that shit.

                    it has ZERO to do with “im trying to bundle ai into my new shiny app, buy it!”

                    idiots.

      • Janx@piefed.social · 1 day ago (edited)

        It’s almost like the very businesses that creamed their pants about being able to replace workers and endless “blue ocean” profits exaggerated, lied, and forced AI into every. single. product. That’s not consumers’ faults…

        • originalucifer@moist.catsweat.com · 1 day ago

          i cant understand why people are oblivious to the multi-front war that is AI.

          theres the shit you hear about and see every day (oh look copilot shit the bed! claude cant add! teehehee look at all the extra fingers!) and then theres the shit that is actually being implemented in process models all over the place in nearly every department. from inventory to healthcare analysis to customer service, this shit is in daily use now … and you cannot avoid it.

          ai is just an api call away and software companies suck.

      • kescusay@lemmy.world · 1 day ago

        Ummm, hallucinations are literally how LLMs work. Everything they generate is confabulation, though sometimes it’s useful confabulation.

        • timbuck2themoon@sh.itjust.works · 12 hours ago

          I think we should stop using their terms.

          LLMs spout BULLSHIT half the time. They don’t hallucinate. They confidently state incorrect garbage.

      • slazer2au@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        1 day ago

        you cannot live your life in the modern united states without suffering products or services from those 10 companies

        Well, it’s good that I don’t live there.