• sga@lemmings.world · 1 day ago

      that could be the case. but what I have seen my younger peers do is use these llms to “read” the papers, and then use only the llm’s summaries as the source. In that case, it is definitely not good.

        • sga@lemmings.world · 8 hours ago

          you would find more and more of it these days. people who are not good with the language, and people who are not good with the subject, would both use it.

          • fullsquare@awful.systems · 3 hours ago

            if someone is so bad at a subject that chatgpt offers actual help, then maybe that person shouldn’t write an article on that subject in the first place. the only language chatgpt speaks is bland, nonconfrontational corporate sludge; i’m not sure how it helps

            • sga@lemmings.world · 53 minutes ago

              What I meant was, for example: if someone is weak in, let’s say, english, but understands their shit, then they conduct their research however they do, and then have some llm translate it. that is a valid use case to me.

              Most research papers are written in English, if you want international citations, collaboration, or accolades. A person may even speak english, but not well enough, or they spell badly. In that case the llm is purely a translator/grammar checker.

              But there are people who go beyond that and use it to generate the content itself, and that is bad imo