"Project N.O.M.A.D is a self-contained, offline survival computer packed with critical tools, knowledge, and AI to keep you informed and empowered—anytime, anywhere." (Crosstalk-Solutions/project-n...)
While on one hand I do like the all-in-one UI and services, I feel it could have been done better without hosting a mini web server just to serve everything over localhost.
Most, if not all, of the tools here are snapshots of online websites running in a browser, with Docker on top. The intention is good and there are some neat ideas in here, but why not just bundle native, offline FOSS programs that already do the job? For instance, CyberChef can be replaced with the respective Linux programs (e.g. base64, hexdump, grep, awk/sed, and gpg, just to name a few; graphical versions of these exist as well, so it's not like you need the terminal, it's just the most versatile environment for this kind of work). No web server needed.
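To make that concrete, here's a minimal sketch of how a few common CyberChef recipes map onto stock coreutils. The recipe names in the comments are CyberChef's; the sample strings and file name are made up for illustration:

```shell
# "From Base64" recipe -> base64 -d
echo 'aGVsbG8gd29ybGQ=' | base64 -d          # prints: hello world

# "To Hex" recipe -> hexdump (byte-level view of any data)
printf 'ABC' | hexdump -C

# "Extract URLs" recipe -> grep with an extended regex
echo 'see https://example.org for details' | grep -oE 'https?://[^[:space:]]+'

# Case transforms and similar text munging -> tr/sed/awk
echo 'shtf' | tr '[:lower:]' '[:upper:]'     # prints: SHTF

# AES encryption recipes -> gpg in symmetric mode (prompts for a passphrase,
# so it's commented out here; somefile.txt is a hypothetical input)
# gpg --symmetric --cipher-algo AES256 somefile.txt
```

All of these run offline, cold-start instantly, and need no daemon or container runtime underneath.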
However, I will say the offline Wikipedia and maps are cool; unfortunately, they're the only genuinely neat things in this project.
Now let's get to the main point: the AI chatbot. What, does the dev think we have money to burn? Much less if SHTF and NVIDIA RTX GPUs are being scrapped for metal (which they should be anyway)? I know it's local, and presumably it's paired with the bundled offline data to cut down on it huffing its own fumes and hallucinating, but given the sheer amount of power and resources it hogs just to spit out an answer, a plain search over the same data would do just as well, and it wouldn't tie up your GPU in the process.

That's not even getting into the current SSD/GPU/RAM market. The project's own front page recommends 32 GB of RAM; that's steep. The 1 TB SSD I can sort of see, though if most of the information is text you don't really need 1 TB, but better safe than sorry. Still, at today's prices that's expensive. When SHTF, do you really think most people will be rocking killer rigs with 8/16-core CPUs, 32+ GB of RAM, and an RTX GPU? The millionaires and spoiled gamers who already own them, sure, but the masses? They'll mostly be on laptops with 4-6 cores, 8 GB of RAM, and a mid-range GPU if they're lucky, or integrated graphics.
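The search-instead-of-chatbot alternative is cheap to sketch. Assuming the survival knowledge lives as plain text files on disk (the paths and article contents below are invented for illustration), a plain grep already gives keyword lookup at near-zero RAM and zero GPU cost:

```shell
# Hypothetical offline knowledge base: a couple of plain-text articles.
mkdir -p /tmp/kb
printf 'Boil water for at least one minute to purify it.\n' > /tmp/kb/water.txt
printf 'A tourniquet stops severe bleeding from a limb.\n' > /tmp/kb/firstaid.txt

# Keyword lookup: list every file mentioning the query, case-insensitively.
grep -ril 'water' /tmp/kb                    # prints: /tmp/kb/water.txt

# Or show the matching lines with a line of context, like result snippets.
grep -ri -C1 'bleeding' /tmp/kb
```

This runs fine on a 4-core laptop with integrated graphics, which is exactly the hardware a SHTF audience is likely to have.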
Sure, you can say that having AI in it is somehow beneficial and tout how "everyone is using it", but don't get all pissy when your power bank runs out of juice at the worst possible time, or when word gets out, your place gets raided, and your twenty-year-old 5090 is turned into scrap. All because you thought AI was good enough.
All in all, it's a good premise, but it could be executed far better than snapshotting websites, slapping AI on top, and calling it a day.