  • Fair point on the notes. You’re right: if a user explicitly types “I am John Doe” in the journal, that string does get passed to the LLM. I can strip request headers and IP addresses, but I can’t perfectly scrub context out of the entry text itself without breaking the analysis.
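
    As a rough sketch of what that scrubbing step can and can’t do (illustrative code, not the app’s actual implementation): easily matched identifiers like emails and IP addresses get redacted before the entry leaves the device, but free-text self-identification still slips through.

    ```python
    import re

    # Illustrative patterns only; real PII detection needs more than two regexes.
    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

    def scrub(entry: str) -> str:
        """Redact easily matched identifiers from a journal entry before it
        is sent to the hosted LLM. A sentence like "I am John Doe" is plain
        context and passes through untouched."""
        entry = EMAIL_RE.sub("[email]", entry)
        entry = IPV4_RE.sub("[ip]", entry)
        return entry
    ```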

    To mitigate that, I use the paid API tier. Unlike with the free tier, Google is contractually prohibited from training on the data. I realize that’s a legal promise rather than a technical guarantee, but it’s the same kind of binding agreement hospitals and banks rely on.

    As for why not local/Ollama? Two reasons:

    1. Intelligence: For psychological pattern recognition, small local models hallucinated way too much in my testing and missed obvious patterns. I need SOTA reasoning to avoid giving bad recovery advice.
    2. Hardware: Local inference kills the battery and requires high-end phones. I want this tool to be accessible to everyone, not just people with $1k devices.

    I’m planning a “Local Only” toggle for the future, but the tech isn’t quite there yet for the average user.
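
    For what that toggle could look like under the hood, here’s a hedged Python sketch (the names, model choices, and env var are placeholders, not the app’s code): route each analysis request either to a local Ollama server via its /api/generate endpoint or to the paid Gemini API through the google-generativeai client.

    ```python
    import os

    import requests
    import google.generativeai as genai

    def analyze(prompt: str, local_only: bool) -> str:
        """Send a scrubbed journal prompt to either a local or a hosted model.
        The flag would be driven by the "Local Only" toggle in settings."""
        if local_only:
            # Local path: Ollama's REST API on its default port.
            r = requests.post(
                "http://localhost:11434/api/generate",
                json={"model": "llama3", "prompt": prompt, "stream": False},
                timeout=120,
            )
            r.raise_for_status()
            return r.json()["response"]
        # Hosted path: paid-tier API key, covered by the paid-tier data terms.
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])
        model = genai.GenerativeModel("gemini-1.5-pro")  # assumed model name
        return model.generate_content(prompt).text
    ```

    The trade-off in the sketch mirrors the two reasons above: the local branch runs whatever small model the phone can handle, while the hosted branch gets the stronger reasoning at the cost of sending the (scrubbed) text off-device.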