Can you trust locally run LLMs?
mystic-macaroni@lemmy.ml to Privacy@lemmy.ml · 1 year ago
I’ve been playing around with ollama. Given that you download the model, can you trust that it isn’t sending telemetry?
SupraMario@lemmy.world · 1 year ago
And if you don’t want to do that… run it in a VM and unplug your NIC / disconnect Wi-Fi.
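If full air-gapping isn’t practical, a lighter-weight check is to watch what the local ollama server process actually connects to. Here is a minimal sketch using Python’s psutil package; it assumes the server process is literally named "ollama" (adjust if yours differs), and flags anything that reaches beyond loopback:

```python
# Hedged sketch: list open network connections of any running "ollama"
# process so unexpected outbound (telemetry-like) traffic is easy to spot.
# Assumption: the server process name is "ollama"; requires `pip install psutil`.
import psutil

LOOPBACK = {"127.0.0.1", "::1"}

for proc in psutil.process_iter(["pid", "name"]):
    if proc.info["name"] != "ollama":
        continue
    try:
        conns = proc.connections(kind="inet")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue  # skip processes we can't inspect
    for conn in conns:
        if not conn.raddr:  # listening socket, no remote peer
            continue
        note = "" if conn.raddr.ip in LOOPBACK else "  <-- leaves the machine"
        print(f"pid={proc.info['pid']}  "
              f"{conn.laddr.ip}:{conn.laddr.port} -> "
              f"{conn.raddr.ip}:{conn.raddr.port} [{conn.status}]{note}")
```

Run it while a model is answering a prompt; in normal local use you should only see loopback connections between the ollama server and its clients.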