- cross-posted to:
- technology@lemmy.ml
So what actually happened seems to be this.
- a user was exposed to another user's conversation.
that's a big oof and really shouldn't happen
- the conversations that were exposed contained sensitive user information
irresponsible user error; everyone and their mom should know better by now
Yeah, you gotta treat ChatGPT like it's a public GitHub repository.
Why is it that whenever a corporation loses or otherwise leaks sensitive user data that was their responsibility to keep private, all of Lemmy comes out to comment about how it’s the users who are idiots?
Except it’s never just about that. Every comment has to make it known that they would never allow that to happen to them because they’re super smart. It’s honestly one of the most self-righteous, tone deaf takes I see on here.
Because people who come to Lemmy tend to be more technical and better on questions of security than the average population. For most people around here, much of this is obvious and we’re all tired of hearing this story over and over while the public learns nothing.
Your frustration is valid. Also, calling people stupid is an easy mistake that a lot of people make.
Well, I'd never use that term to describe a person; it's unnecessarily loaded. Ignorant, naive, etc. might be better.
ChatGPT doesn't leak passwords. Chat histories are leaking, and one of those happens to contain a plaintext password. What's up with the current trend of saying AI did this and that when the AI really didn't?
People are far too willing to believe AI can do anything. How would the AI even have the passwords?
They weren’t there when I used ChatGPT just last night (I’m a pretty heavy user). No queries were made—they just appeared in my history, and most certainly aren’t from me (and I don’t think they’re from the same user either).
This sounds more like a huge fuckup with the site, not the AI itself.
Edit: A depressing amount of people commenting here obviously didn’t read the article…
If you paste plaintext passwords into ChatGPT, the problem is not ChatGPT; the problem is you.
Well, tbf, ChatGPT also shouldn't remember and then leak those passwords lol.
Did you read the article? It didn’t. Someone received someone else’s chat history appended to one of their own chats. No prompting, just appeared overnight.
That’s funny, all I see is ********
Back in the RuneScape days, people would do dumb password scams. My buddy was introducing me to the game. We were sitting in his parents' garage and he was playing and showing me his high-lvl guy. Anyway, he walks around the trading area and someone says something like "omg you can't type your password backwards *****". In total disbelief he tries it out. Instantly freaks out, logs out to reset his password, and fails due to the password already being changed.
That’s golden. With all my hatred towards scammers, there’s a little niche for scams that make people feel smart before undressing them that I can’t bring myself to judge.
you can go hunter2 my hunter2-ing hunter2.
haha, does that look funny to you?
I put on my robe and wizard hat.
Use local and open source models if you care about privacy.
I absolutely agree. Use something like ollama. Do keep in mind that it takes a lot of computing resources to run these models: about 5 GB of RAM, and about a 3 GB file for the smaller-sized ollama-uncensored.
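As a rough sketch of what running it looks like (assuming ollama is already installed; the model name here is illustrative, check ollama's model library for what's actually available):

```shell
# Download a small model once (several GB, so expect a wait),
# then chat with it entirely on your own machine -- no prompts
# or history ever leave your computer.
ollama pull llama2-uncensored
ollama run llama2-uncensored "Why does local inference help privacy?"

# List which models you have downloaded locally
ollama list
```

The whole point is that, unlike a hosted service, there's no server-side chat history to leak in the first place.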
It's not great, but an old GTX GPU can be had cheaply if you look around at refurbs; as long as there's a warranty, you're gold. Stick it into a 10-year-old Xeon workstation off eBay and you can easily have a machine with 8 cores, 32 GB of RAM, and a solid GPU for under $200.
It's the RAM requirement that stings rn. I believe I've got the specs, but I was told (or misremember) a 64 GB RAM requirement for a model.
IDK what you've read, but I have 24GB and can use Dreambooth and fine-tune Mistral no problem. RAM is only needed to load the model briefly before it's passed to VRAM, iirc, and VRAM is the main deal: you need 8GB VRAM as an absolute minimum, and even my 24GB VRAM is often not enough for some high-end stuff.
Plus, RAM is actually really cheap compared to a GPU. Remember, it doesn't have to be super fancy RAM either; DDR3 is fine if you're not gaming on, like, a Ryzen or something modern.