that could be the case. but what I have seen my younger peers do is use these llms to “read” the papers, and then rely only on the llm’s summaries as their source. In that case, it is definitely not good.
in one of these preprints there were even traces of the prompt used to write the paper itself
you’ll find more and more of it these days. people who are not good with the language and people who are not good with the subject will both use it.
if someone is so bad at a subject that chatgpt offers actual help, then maybe that person shouldn’t write an article on that subject in the first place. the only language chatgpt speaks is bland nonconfrontational corporate sludge; i’m not sure how it helps
What I meant was, for example, if someone is weak in, let’s say, English, but understands their shit, then they conduct their research however they do and then have some llm translate it. that is a valid use case to me.
Most research papers have to be written in English if you want international citations, collaboration, or accolades. A person may even speak English, but not well enough, or they spell badly. In that case the llm is purely a translator/grammar checker.
But there are people who go beyond that and use it to generate the content itself, and that is bad imo