• surfrock66@lemmy.world
    15 hours ago

    Devil’s advocate…this is one of the few good opportunities for ML. If you train a model on a specific dataset with expert validation, this has the opportunity to save lives.

    First, radiology isn’t one thing: different radiologists with different expertise can look at the same imaging and see different things. Second, there are not enough radiologists; my wife is an ER doctor who only does overnights, her hospital network has a central radiology center that reviews all films from all the hospitals, and it’s always backlogged; waiting on results impacts outcomes in a real way. Third, there are simply human limits to what we can visually perceive; take a look at this study: https://pmc.ncbi.nlm.nih.gov/articles/PMC3964612/

    Radiological ML models could change healthcare. Imagine a world where, as part of your annual preventative care, you get a full-body CT. The ML model can compare your CT with references in your demographics AND with your own prior years, and find changes or issues before they become crises. That’s simply not an amount of data analysis that could be done by an army of radiologists, and it has the opportunity to spot things like tumors or organ swelling way earlier. I get that the late-stage-capitalism reality is “they’ll use the data to farm money out of you,” but from an actual technological standpoint, this could have real life-saving and life-improving implications for a lot of people, and it removes a huge bottleneck in healthcare.
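    To make the year-over-year idea concrete, here’s a toy sketch: compare each measurement against a demographic reference range and against the prior year’s scan. Every organ name, reference range, and threshold below is invented for illustration; a real clinical model would be nothing this simple.

```python
# Toy sketch: flag per-organ measurements that fall outside an (invented)
# demographic reference range, or that changed sharply since the prior year.

REFERENCE_RANGE = {  # hypothetical demographic norms, in cm^3
    "liver": (1200, 1800),
    "left_kidney": (120, 200),
}

def flag_changes(prior, current, pct_threshold=0.15):
    """Return findings where a measurement is outside its reference range
    or changed by more than pct_threshold versus the prior scan."""
    findings = []
    for organ, (lo, hi) in REFERENCE_RANGE.items():
        before, now = prior[organ], current[organ]
        if not lo <= now <= hi:
            findings.append((organ, "outside reference range"))
        elif abs(now - before) / before > pct_threshold:
            findings.append((organ, "rapid change vs prior year"))
    return findings

prior = {"liver": 1500, "left_kidney": 160}
current = {"liver": 1520, "left_kidney": 205}  # kidney drifted past range
print(flag_changes(prior, current))  # -> [('left_kidney', 'outside reference range')]
```

    The point of the sketch is the second branch: the longitudinal comparison against your own priors is exactly the part that doesn’t scale with human radiologists.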

    • EvergreenGuru@lemmy.world
      15 hours ago

      Even if AI does the job of reading medical imaging extremely well, I’d still want a radiologist to double-check the scans.

      • saimen@feddit.org
        11 hours ago

        And it clearly would be necessary, because even these highly sophisticated models only look for what they are trained on and will produce a lot of arbitrary or non-relevant findings.

        Also, someone has to take responsibility. The moment software firms are willing to take full responsibility, without disclaimers, I will start to believe they might be able to replace some people.

      • surfrock66@lemmy.world
        15 hours ago

        I see it as: bulk imaging goes through the ML, which flags things and orders additional, more detailed targeted scans, and those elevate to radiologists.
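        That workflow is basically a triage router. A minimal sketch, assuming a model emits a confidence score in [0, 1] (the scan IDs, score thresholds, and route names are all invented for illustration):

```python
# Hypothetical triage routing for the workflow above: the ML score decides
# whether a bulk scan is archived, gets a targeted follow-up scan, or goes
# straight to a radiologist. Thresholds here are arbitrary placeholders.

def triage(scan_id, model_score, followup=0.5, urgent=0.9):
    """Route one scan based on an ML confidence score in [0, 1]."""
    if model_score >= urgent:
        return (scan_id, "radiologist_review")   # a human reads it
    if model_score >= followup:
        return (scan_id, "order_targeted_scan")  # more detailed imaging first
    return (scan_id, "archive")                  # nothing flagged

print(triage("CT-1042", 0.95))  # -> ('CT-1042', 'radiologist_review')
```

        The key property is that the model never issues a diagnosis; high scores only escalate to a human, which is the "augment, don't replace" setup.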

        • SpaceNoodle@lemmy.world
          15 hours ago

          Instead, they’re just gonna throw away the radiologists and make everything computer.

          • FauxLiving@lemmy.world
            6 hours ago

            Anyone who does that will find themselves quickly out of business and bankrupt from lawsuits.

            The headline is a fantasy; it’s a tool that augments professionals in some situations. It doesn’t replace them.