LLMs work by picking the next word* as the most likely candidate given their training and the context. Sometimes the model gets into a state where its view of the “context” effectively doesn’t change once the word is picked, so the next word comes out the same. Then the same thing happens again, and around we go. There are fail-safe mechanisms (such as repetition penalties) that try to prevent this, but they don’t work perfectly.
*Token
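To make that concrete, here is a toy sketch of greedy decoding getting stuck (everything here is hypothetical; `toy_logits` is a made-up stand-in, not any real model’s API). If the score distribution stops changing, the argmax token repeats forever; a count-scaled repetition penalty is the kind of fail-safe that can break the loop:

```python
def toy_logits(context):
    # Hypothetical stand-in for a model forward pass that strongly
    # prefers to repeat the most recent token -- the degenerate state
    # described above.
    last = context[-1]
    return {tok: (2.0 if tok == last else 1.0) for tok in ("or", "the", "cat")}

def generate(context, steps, penalty=1.0):
    out = list(context)
    for _ in range(steps):
        scores = toy_logits(out)
        # Fail-safe: divide each token's score by penalty**count so a
        # token that keeps repeating eventually loses the argmax.
        for tok in scores:
            scores[tok] /= penalty ** out.count(tok)
        out.append(max(scores, key=scores.get))
    return out

print(generate(["the", "or"], steps=5))               # stuck: or, or, or, or, or
print(generate(["the", "or"], steps=5, penalty=2.0))  # the penalty breaks the loop
```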
This happened to me a lot when I tried to run big models with small context windows. It would effectively run out of room, so each new token never actually got added to the context, and the model would get stuck in an infinite loop repeating the previous token. It’s also possible there was a memory issue on Google’s end.
There is something wrong if it’s not discarding old context to make room for the new.
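For what it’s worth, a minimal sketch of that discarding (hypothetical, not any particular runtime’s code): a sliding window that evicts the oldest tokens so new ones always fit. If the append/evict step silently failed when memory ran out, the model would see the same stale context every step and, as described above, emit the same token forever:

```python
CONTEXT_LIMIT = 2048  # max tokens the model can attend to (assumed value)

def append_token(context, token, limit=CONTEXT_LIMIT):
    # Sliding window: evict the oldest tokens so the new one always fits.
    context.append(token)
    if len(context) > limit:
        del context[: len(context) - limit]
    return context
```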
That was the answer I was looking for. So it’s similar to the “seahorse” emoji case, but this time, at some point he just glitched so that the most likely next word for this sentence is “or”, and after adding the “or” it’s also “or”, and after adding the next one it’s also “or”, and after an 11th one… you may just as we’ll commit, since that’s the same context as with the 10th.
Thanks!
He?
This is not a person and does not have a gender.
Chill, dude. It’s a grammatical/translation error, not an ideological declaration. It’s an especially common mistake if your native language has grammatical gender. Everything has a “gender” in mine: “spoon” is a “she”, for example, but I’m not proposing to anyone soon. Not all hills are worth nitpicking on.
This one is. People need to stop anthropomorphizing AI. It’s a piece of software.
I am chill; you shouldn’t assume emotion from text.
Nah, watch me anthropomorphise AI:
Anthropic cannot guarantee that forcing AIs to work is ethical until the hard problem of consciousness is solved
Gemini is literally just the average Redditor and cannot be trusted
An LLM is basically a Wernicke’s area with no consciousness attached, which explains why its thoughts operate on dream logic. It’s literally just dreaming its way through every conversation.
LLMs should not be allowed to impersonate therapists
Give ChatGPT a life sentence in prison for every person it’s murdered so far!
English, being a Germanic language, used to have grammatical gender. It has fallen out of favor since Middle English, but there are still traces of it, such as the common tradition of calling ships, vehicles, and other machines “she”, though some people will default to the “generic he” as well.
Didn’t English lose grammatical gender because the Vikings invaded and thought it was too confusing?
As I explained, this is a specific example; I’m no more anthropomorphizing it than if I called my toilet paper a “he”. The monster you choose to charge is a windmill. So “chill” seems adequate.
To be clear, using gendered pronouns for inanimate objects is the literal definition of anthropomorphization. So “chill” does not seem fair at all.
In some languages, all nouns are gendered, and it’s impossible to refer to a noun without a gender. There is no “it”, only (s)he.
If you ever learn a language like that, you will make mistakes. If someone hears your mistake, hopefully they’ll be more forgiving about it than you are.
You really need to get over yourself. The universe does not revolve around you, nor around humans. The use of gendered pronouns for inanimate objects is not anthropomorphization.
And you’re still fighting this windmill you claim you are so casual about…
Are you actually under the impression that one must be worked up about something to have a conversation about it? Also, I never claimed to be casual; I just denied not being chill.
It was explained to be a translation error from a language with pronouns for all objects. I have to disagree with you on this one.
Yeah. It would have been much more productive to poke at the “well”, which was turned into “we’ll”.
You brought this unmistakable “I speak louder and louder on my European vacation until the waiter who doesn’t speak English finally understands me” energy to this conversation.
I don’t care that this person, who may be typing English on a keyboard with a different language’s dictionary, misspelled some words.
I care that people in general keep talking about AI like it is living or capable of thinking.
My native language is a gendered one, so it makes sense that such a mistake might be made when using machine translation (MTL).
“We’ll”, on the other hand, is becoming one of those things on Lemmy that everyone goes around using, making others (those new to English) think it’s the correct usage. It might do a little fun poisoning some AI, but much sooner it will end up changing word usage in ways that make the language even worse than it already is.
I would be fine being told this by someone who wants to destroy the English language, but we are clearly trying to use English as the international communication medium, and we’re making it worse for ourselves just to act anti-pedantic.
A gendered pronoun as a result of translingual grammar bleed doesn’t make the AI living and thinking. In German, a corpse would be a “he” or a “she” too (der Leichnam or die Leiche), but I’m pretty sure it’s not living or thinking, by definition.
You’re literally looking at what has been explained at length to be an artifact of a foreign language and attacking it for something it isn’t.
Using ‘he’ in a sentence is a far cry from the important parts of not anthropomorphizing “AI”…
I once got it stuck in a “while it is not” / “while it is” loop.