Linkerbaan@lemmy.world to Mildly Infuriating@lemmy.world · English · 9 months ago
Google Gemini refuses to answer questions about deaths in Gaza but has no problem answering the same question for Ukraine.
Mr_Dr_Oink@lemmy.world · 9 months ago
I tried a different approach. Here's a funny exchange I had.

eatthecake@lemmy.world · 9 months ago
Why do I find it so condescending? I don't want to be schooled on how to think by a bot.

Viking_Hippie@lemmy.world · 9 months ago
"Why do I find it so condescending?"
Because it absolutely is. It's almost as condescending as it is evasive.

Omniraptor@lemm.ee · edited · 9 months ago
And they recently announced they're going to partner up and train from Reddit. Can you imagine?

Viking_Hippie@lemmy.world · 9 months ago
That sort of simultaneously condescending and circular reasoning makes it seem like they already have been, lol.