  • For nonidentical devices you create additional packages prefixed with the specific device name. You don’t need to link all packages at once with stow; pass the name of a package to link it alone.

    Sooo… I find some way to share the dotfiles directory across devices (rsync, syncthing, git, nextcloud, DAV) then make specific subdirs like this?:

    ~
      - dotfiles
          - bash-desktop
             dot-bashrc
             dot-bash_profile
          - bash-laptop
             dot-bashrc
             dot-profile
             dot-bash_profile
    

    But what is the software doing for me? I’m manually moving all these files and putting them together in the specific way requested. Setting the whole thing up is most of the work. Anyone who can write a script to create the structure can just as easily write it to make symlinks. I’m sure I’m missing something here.
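
    For my own notes, if I’m understanding the suggestion, linking one package at a time would look something like this (assuming GNU Stow with its --dotfiles option to translate the dot- prefixes; paths match the layout above):

    cd ~/dotfiles
    stow --dotfiles --target="$HOME" bash-desktop    # on the desktop
    stow --dotfiles --target="$HOME" bash-laptop     # on the laptop
    stow --delete --dotfiles --target="$HOME" bash-desktop   # unlink a package later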


  • yadm is the one I liked best and I tried it a few times. The fact is that I’m unlikely to keep a repo like this even partway up to date. New files are created all the time and never added; old ones don’t get updated or removed. As far as I know, there’s not even a good way to see in a file manager which files are included and which aren’t. yadm doesn’t work with tools like eza that can display the git status of files in repos (and it probably wouldn’t be feasible).

    Plus I have some specific config collections already under change tracking, and it makes more sense to keep them that way. Having so many unrelated files together in one project is too chaotic and distracting.

    It’s not realistic for me to manage merges, modules, cherry-picking, branches, and all that for so many files that change constantly without direct intervention. Quickly enough git will tie itself into some knot that I won’t be able to pick apart.






  • Thanks, I appreciate it. I’ve been around the block enough times to expect maximalist advice in places like this. People who are motivated to hang around in a forum just waiting for someone to ask a question about hard drives are coming from a certain perspective. Honestly, it’s not my perspective. But the information is helpful in totality, even though I’m unlikely to end up doing what any one person suggests.

    RAID is something I’ve seen mentioned over and over again. Every year or two I go read about it more intentionally and never get the impression it’s for me. Too elaborate to solve problems I don’t have.



  • Forget NFS, SSHFS and syncthing as those are too complex and overkill at the moment. SMB is dead simple in a lot of ways and is hard to mess up.

    OTOH, SSHFS and syncthing are already humming along and I’m familiar with them. Is SMB really that easy, or does it have other benefits that would make it better even though I’d have to start from scratch? It looks like it (and/or NFS) can be administered from the Cockpit web interface, which is cool.
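
    From the little I’ve read, “dead simple” seems to mean a share definition of only a few lines, something like this (share name, path, and username are just placeholders):

    # /etc/samba/smb.conf on Desktop
    [shared]
       path = /srv/shared
       read only = no
       valid users = me

    # give the user a Samba password, then mount from Laptop (needs cifs-utils)
    sudo smbpasswd -a me
    sudo mount -t cifs //desktop.lan/shared /mnt/shared -o user=me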

    Now that I look around, I think I actually have a bit of RAM I could put in the PC: the Mac Mini’s original RAM, which is DDR3L, but I read you can put it in a device that wants DDR3. So I will do that next time it’s powered off.

    Thanks for letting me know I could use an expansion card. I was wondering about that but the service manual didn’t mention it at all and I had a hard time finding information online.

    Is this the sort of thing I am looking for: “SATA Card 4 Port with 4 SATA Cables, 6 Gbps SATA 3.0 Controller PCI Express Expansion Card with Low Profile Bracket, Support 4 SATA 3.0 Devices” ($23 USD)? I don’t find anything cheaper than that, but there are various higher price points. I’m assuming none of those would be worthwhile on a crummy old computer like I have. Is there any specific RAID support I should look for?

    I have only the most cursory knowledge of RAID but can tell it becomes important at some point.

    But am I correct in my understanding that putting storage devices in RAID decreases the total capacity? For example, if I have 2x 6 TB in RAID, I have 6 TB of storage, right?
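
    From what I’ve gathered so far (another comprehension check, please correct me):

    RAID 0 (striping):  2x 6 TB -> 12 TB usable, but no redundancy at all
    RAID 1 (mirroring): 2x 6 TB ->  6 TB usable, survives one drive failure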

    Honestly, more than half my data is stuff I don’t care too much about keeping. If I lose all the TV shows I don’t cry over it. Only some of it is stuff I would care enough to buy extra hardware to back up. Those tend to be the smaller files (like documents) whereas the items taking up a lot of space (media files) are more disposable. For these ones “good enough” is “good enough”.

    I really appreciate your time already, and anything further. But I am still wondering: to what extent is all this helping me solve my original question, which is that I want to be able to edit remote files on the Desktop as easily as if they were local on the Laptop? Assuming I got it all configured correctly, is GIMP going to be just as happy with a giant file (lots of layers, undos, etc.) on the Desktop as it would be with the same file on the Laptop?


  • Do you mean take the board out of this case and put it in another, bigger one?

    I actually do have a larger, older tower that I fished out of the trash. Came with a 56k modem! But I don’t know if they would fit together. I also don’t notice anywhere particularly suitable for holding a bunch of storage; I guess I would have to buy (or make?) some pieces.

    Here is the board configuration for the Small Form Factor:

    I did try using #9 and #10 for storage, and I seem to recall it kind of worked but didn’t totally work; I’m not sure of the details. But hey, at least I can use a CD drive and a floppy drive at the same time!


  • Thanks! I have gone to look at TrueNAS or FreeNAS a few times over the years. I’m dissuaded because, hardware-wise, they seem expensive, and on the other hand they’re limited in what they can do.

    Comprehension check. Is the below accurate?

    1. TrueNAS is an OS, it would replace Debian.
    2. Main purpose of TrueNAS is to maintain the filesystem
    3. There are some packages available for TrueNAS; someone mentioned Syncthing supports it, for example
    4. But basically if I run TrueNAS, I will likely need a second computer to run services

    Also for comprehension check:

    • The reason many people are recommending a NAS (or WebDAV, NFS, VPN, etc.) is that with better storage and network infrastructure I would no longer be interested in this caching idea.
    • Better would be to have solid enough file sharing within the LAN that accessing files located on Desktop from Laptop would work.
    • The above would be completely plausible

    How’m I doing?



  • Thanks!

    I elaborated on why I’m using USB HDDs in this comment. I have been a bit stuck on how to proceed to avoid these problems. I am willing to get a new desktop at some point, but I’m not sure what is needed and don’t have unlimited resources. If I buy a new device, I’ll have to live with it for a long time. I have about 6 or 8 external HDDs in total. I will probably eventually consolidate the smaller ones into a larger drive, which would bring the count down. Several are 2-4 TB and could be replaced with 1x 12 TB. But I will probably keep using the existing ones for backup if at all possible.

    Re the VPN: people keep mentioning this, but I am not understanding what it would do. I mostly need to access my files from within the LAN. Certainly not enough to justify the security risk of a dummy like me running a public service. I’d rather just copy files to an encrypted disk for those occasions and feel safe with my ports closed to outsiders.

    Is there some reason to consider a VPN for inside the LAN?


    1. In another comment I ran iperf3 from Laptop (wifi) -> Desktop (ethernet), which was about 80-90 Mbit/s, whereas Desktop -> OtherDesktop was in the 900-950 Mbit/s range. So I think I can say the networking is fine enough when it’s all ethernet. Is there some other kind of benchmarking to do?

    2. Just posted a more detailed description of the desktops in this comment (4th paragraph). It’s not ideal, but for now it’s what I have. I did actually take the time (gnome-disks benchmarking) to test different cables, ports, etc. to find the best possible configuration. While there is an upper limit, if you are forced to use USB this makes a big difference.

    3. Other people suggested ZeroTier or VPNs generally. I don’t really understand the role this component would play. I have a LAN and I really only want local access. Why the VPN?

    4. Ya, I have tried using syncthing for this before and it involves deleting stuff all the time, then re-syncing it when you need it again. And you need to be careful not to accidentally delete while synced, which would propagate the deletion to every copy.

    5. Resilio I used a long time ago. Didn’t realize it was still around! IIRC it was somewhat based on BitTorrent, with the idea of peers providing data to one another.


  • Maybe Syncthing is the way forward. I have used it for years and am reasonably comfortable with it. When it works, it works. The problem is that when it doesn’t work, it’s hard to solve or even to know about. For the present use case it would involve making a lot of shares and manually toggling them on and off all the time. And I would need some kind of error-checking system to avoid deleting unsynced files.

    Others have also suggested NFS, but I am having a difficult time finding basic info about what it is and what I can expect. How is it different from an SSHFS mount? Assuming I continue limping along on my existing hardware, do you think it can do any of the local-caching-type stuff I was hoping for?
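
    For what it’s worth, the basic setup seems to be only a few commands (Debian assumed; the export path and subnet are placeholders):

    # on Desktop (server)
    sudo apt install nfs-kernel-server
    echo '/srv/share 192.168.1.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
    sudo exportfs -ra

    # on Laptop (client)
    sudo apt install nfs-common
    sudo mount -t nfs desktop.lan:/srv/share /mnt/share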

    Re the hardware, thanks for the feedback! I am only recently learning about this side of computing. Am not a gamer and usually have had laptops, so never got too much into the hardware.

    I actually have 2 desktops, both 10+ years old. One is a Mac Mini, so there is no chance of getting the storage properly installed. I believe the CPU is better and it has more RAM because it was upgraded when it was my main machine. The other is a “small” tower (about 14") picked up cheaply to learn about PCs. It has not been upgraded at all other than an SSD for the system drive. Both are running Debian now.

    In another comment I ran iperf3 from Laptop (wifi) -> Desktop (ethernet), which was about 80-90 Mbit/s, whereas Desktop -> OtherDesktop was in the 900-950 Mbit/s range. So I think I can say the networking is fine enough when it’s all ethernet.

    One thing I wasn’t expecting from the tower is that it only supports 2x internal HDDs. I was hoping to get all the loose USB devices inside the box, like you suggest. It didn’t occur to me that I could only get the system drive plus one extra. I don’t know if that’s common, or if there is some way to expand the capacity. There isn’t too much room inside the box, but if there was a way to add trays, most of them could fit inside with a bit of air between them.

    This is the kind of pitfall I wanted to learn about when I bought this machine so I guess it’s doing its job. :)

    Efforts to research what I would like to have instead have led me to be quite overwhelmed. I find a lot of people online who have way more time and resources to devote than I do, who want really high performance. I always just want “good enough”. If I followed the advice I found online I would end up with a PC costing more than everything else I own in the world put together.

    As far as I can tell, the solution for the miniPC type device is to buy an external drive holder rack. Do you agree? They are sooo expensive though, like $200-300 for basically a box. I don’t understand why they cost so much.


  • What would be the role of Zerotier? It seems like some sort of VPN-type application. I don’t understand what it’s needed for though. Someone else also suggested it albeit in a different configuration.

    Just doing some reading on NFS, and it certainly seems promising. Naturally the ArchWiki has a fairly clear instruction document. But I am having a hard time seeing what it is exactly. Why is it faster than SSHFS?

    Using the Cache with NFS > Cache Limitations with NFS:

    Opening a file from a shared file system for direct I/O automatically bypasses the cache. This is because this type of access must be direct to the server.

    Which raises the question: what is “direct I/O”, and is it something I use? This page calls direct I/O “an alternative caching policy”, and the limited amount I can understand elsewhere leads me to infer I don’t need to worry about this. Does anyone know otherwise?
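
    For context, the caching that page is describing is FS-Cache, and from what I can tell, turning it on for an NFS mount looks roughly like this (Debian package name assumed; paths are placeholders):

    # on the client (Laptop)
    sudo apt install cachefilesd          # local cache daemon used by FS-Cache
    # on Debian, also set RUN=yes in /etc/default/cachefilesd
    sudo systemctl enable --now cachefilesd
    # mount the NFS share with the 'fsc' option to opt in to the cache
    sudo mount -t nfs -o fsc desktop.lan:/srv/share /mnt/share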

    The issue with syncing is usually needing to sync everything.

    Yes, this is why syncthing proved difficult when I last tried it for this purpose.

    Beyond the actual files, it would be really handy if some lower-level stuff could be cached/synced between devices, like thumbnails and other metadata. To my mind, remotely perusing the Desktop filesystem from the Laptop should be just as fast as looking through local files. I wouldn’t mind having a reasonable chunk of local storage dedicated to keeping this available.


  • What would be the role of Zerotier? It seems like some sort of VPN-type application. What do I need that for?

    rclone is cool and I have used it before. I was never able to get it to work really consistently, so I always gave up. But that’s probably user error.

    That said, I can mount network drives and access them from within the file system. I think GVFS is doing the lifting for that. There are a couple of different ways I’ve tried, including rclone; none seemed superior performance-wise. I should say the Desktop computer is just old and slow; there is only so much improvement possible if the files reside there. I would much prefer to work on my Laptop directly and move them back to the Desktop for safekeeping when done.

    “vfs cache” is certainly an intriguing term

    Looks like maybe the main documentation is rclone mount > vfs-file-caching, and specifically --vfs-cache-mode full

    In this mode the files in the cache will be sparse files and rclone will keep track of which bits of the files it has downloaded.

    So if an application only reads the starts of each file, then rclone will only buffer the start of the file. These files will appear to be their full size in the cache, but they will be sparse files with only the data that has been downloaded present in them.

    I’m not totally sure what this would be doing, or whether it is exactly what I want or close enough. I am remembering now one reason I didn’t stick with rclone, which is that I find the documentation difficult to understand. This is a really useful lead though.
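
    If I’m reading it right, the invocation would be something along these lines (the remote name and paths are hypothetical; the remote would be set up first with rclone config):

    # mount the Desktop's files on Laptop, caching whatever gets read
    rclone mount desktop:/srv/share ~/mnt/desktop \
        --vfs-cache-mode full \
        --vfs-cache-max-size 20G \
        --vfs-cache-max-age 72h \
        --daemon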





  • A few weeks ago I put some serious time/brainpower into the network and got it waaaay smoother and faster than before. Finally implemented some upgraded hardware that has been sitting on a shelf for too long.

    I tried iperf. Actually iperf3 because that’s the first tutorial I found. Do you have any opinion on iPerf vs iperf3? On Desktop I ran:

    iperf3 -s -p 7673
    

    On Laptop I am currently doing some stuff I didn’t want to quit, so this may not be a totally fair test. I’ll try re-running it later. That said, I ran:

    iperf3 -c desktop.lan -p 7673 --bidir
    

    And what looks like a summary at the bottom:

    [ ID] Interval           Transfer     Bitrate         Retr
    [  5]   0.00-10.00  sec   102 MBytes  86.0 Mbits/sec  152             sender
    [  5]   0.00-10.00  sec   102 MBytes  85.6 Mbits/sec                  receiver
    

    I actually have AnotherDesktop on the LAN, also connected via ethernet. Going from Laptop -> AnotherDesktop gets results similar to the above.

    However, going AnotherDesktop -> Desktop gets 10x better results:

    [  5]   0.00-10.00  sec  1.09 GBytes   936 Mbits/sec    0             sender
    [  5]   0.00-10.00  sec  1.09 GBytes   933 Mbits/sec                  receiver
    

    Laptop has an Intel Dual Band Wireless-AC 8260, whose max speed is 867 Mbps, so it probably isn’t the bottleneck. Although with the distro running at the moment (Fedora) I have a LOT of problems with everything, so possibly things aren’t set up ideally here.
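
    To narrow down whether it’s the wifi link itself, I guess I could also re-run the test in the reverse direction and over UDP at a forced rate, something like:

    iperf3 -c desktop.lan -p 7673 -R           # reverse: Desktop sends, Laptop receives
    iperf3 -c desktop.lan -p 7673 -u -b 200M   # UDP at 200 Mbit/s, to look for packet loss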

    I still didn’t upgrade the actual wireless access point for the network; I don’t recall what the max speed is for the current WAP, but it could be around 100 Mbps.

    So this is an interesting path to optimize. However, I am still interested in solving the original problem, because even when I am directly using the Desktop, things are slow. I do not really want to upgrade it if I can get away with a software solution. There are many items on my list of projects and purchases that I’d rather concentrate on.