Python stack traces give you all files involved in the error, with their line numbers. I don’t know what you’re talking about.
how’s that the same thing as in the picture?
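To make that concrete, here’s a minimal sketch (the file name sketch.py and the functions are made up for illustration) of the kind of traceback Python prints, with a file and line number for every frame in the call chain:

```python
# sketch.py -- hypothetical file name, purely for illustration.
# Python prints one frame per call in the traceback, and each frame
# names its own file and line number; when the call chain crosses
# modules, each frame points at the file it lives in.

def divide(a, b):
    return a / b           # the actual failure point

def compute():
    return divide(1, 0)    # intermediate frame

if __name__ == "__main__":
    compute()
    # Running this prints something like:
    #
    # Traceback (most recent call last):
    #   File "sketch.py", line 13, in <module>
    #     compute()
    #   File "sketch.py", line 10, in compute
    #     return divide(1, 0)
    #   File "sketch.py", line 7, in divide
    #     return a / b
    # ZeroDivisionError: division by zero
```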


The waste of power is usually associated with proof-of-work consensus, but that’s not a requirement of blockchain; there are other ways to reach consensus, like proof of stake.
The bandwidth requirements really depend on what’s being stored, but they’re usually very manageable for a server. And clients not running validation don’t need to store or transfer that much data.
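As a toy illustration of one non-proof-of-work approach, here’s a minimal stake-weighted selection sketch (the names and balances are invented); real proof-of-stake protocols derive their randomness from the protocol itself, but the energy argument is the same: no hash puzzles are needed to pick who proposes the next block.

```python
# Toy sketch of stake-weighted validator selection, the core idea of proof of stake:
# the chance of being picked to propose the next block is proportional to stake,
# so consensus doesn't require burning energy on hash puzzles.
import random

stakes = {"alice": 50, "bob": 30, "carol": 20}  # illustrative balances

def pick_proposer(stakes: dict[str, int], seed: int) -> str:
    # In a real chain the seed comes from the protocol, not a hard-coded value.
    rng = random.Random(seed)
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

print(pick_proposer(stakes, seed=42))
```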


git itself is really not far from a blockchain. Blockchain is fine; it only has a bad rep because of the Ponzi schemes that use it to push crypto, but the technology and its trustless consensus mechanisms are interesting.
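To show how close the two ideas are, here’s a minimal sketch (the Entry class and field names are purely illustrative) of the hash-chaining that git commits and blockchain blocks share: each entry references the hash of its parent, so rewriting history invalidates everything after it. What blockchains add on top is the consensus mechanism for agreeing on the tip.

```python
# Minimal sketch of hash-chaining, shared by git commits and blockchain blocks.
import hashlib
from dataclasses import dataclass

@dataclass
class Entry:
    parent_hash: str   # hash of the previous commit/block ("" for the root)
    payload: str       # commit content / block data

    def digest(self) -> str:
        return hashlib.sha256(
            (self.parent_hash + self.payload).encode()
        ).hexdigest()

# Build a tiny chain: each entry commits to everything before it.
root = Entry(parent_hash="", payload="initial commit")
child = Entry(parent_hash=root.digest(), payload="second commit")

# Tampering with the root invalidates the link the child stores.
tampered_root = Entry(parent_hash="", payload="rewritten history")
assert child.parent_hash == root.digest()
assert child.parent_hash != tampered_root.digest()
print("chain intact:", child.parent_hash == root.digest())
```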


but seriously, we need project management features that are decentralized: issue tracking, kanban, code reviews w/ comments, and ways to extend functionality without relying on a git forge.


it’s larger than one, and also likely larger than an NVIDIA Jetson and some firewalls out there. The record is for the smallest PC capable of running a 100B-parameter LLM locally.
The specs are impressive tbh, but the record is a bit… specific
my bad, I thought this was a Wendy’s
I’m not complaining, I’m stating an observation. You seem to be the one who’s bothered.
any language that allows ternary conditionals
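For anyone who hasn’t run into the construct, a quick Python illustration of what a ternary conditional looks like (the functions are just examples), including the nested form that tends to hurt readability:

```python
# A ternary conditional is the expression form:
#     value_if_true if condition else value_if_false
def sign(x: int) -> str:
    return "non-negative" if x >= 0 else "negative"

# Ternaries also nest, which is usually where readability suffers:
def describe(x: int) -> str:
    return "zero" if x == 0 else ("positive" if x > 0 else "negative")

print(sign(-3))      # negative
print(describe(5))   # positive
```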


“Just pour in money and we’ll solve these little problems, trust me bro, just a few trillion more, we’re almost there” vibes
people really overthinking the joke in the comments huh


Other timezones



right… the backup…


It makes sense for it to be there
my point is that, usually, a plain ls will print more than one file per line


and that’s not even an ls -l


I’d probably give the extra item to someone, even if a stranger, but I certainly wouldn’t put more money in the machine. Especially considering most machines just give the money back if there’s no purchase made. What a dumb question.
- can I ask you a question?
- you just did. Have a good day.
potentially relevant: paperless-ngx recently merged some opt-in LLM features, like chatting with documents and automated title generation based on the extracted OCR text.
https://github.com/paperless-ngx/paperless-ngx/pull/10319