I’m flabbergasted that they admit that ChatGPT said it, rather than copy-pasting it and pretending it’s their own work and hoping you don’t read it closely.
Even plagiarism has become lazy these days. At least do me the respect of concocting a lie.
I have a work colleague who does the copy-pasting. He asks me how I can tell when he’s using AI to write git commit messages, when there’s a sudden spike in capitalised words, correct grammar, emojis, and bullet points (and, on top of that, the message sometimes has nothing to do with what’s in the changes). It’s infuriating when he uses it in a discussion. I thought his lack of skill at making himself understood was bad, but essentially arguing with a chatbot is so much worse.
Some people seem to use it as an appeal to authority. This only works if you think ChatGPT is an authority on anything, though.
I suppose you’re right, which is odd to me, as the phrase “ChatGPT says…” automatically makes me question the validity of the information.
It makes me doubt the credibility of the person who wrote “ChatGPT said”.