For years I’ve had a dream of building a rack-mounted PC capable of splitting its resources to host multiple GPU-intensive VMs:

  • a few gaming VMs
  • a VM for work that can run DaVinci Resolve and Blender renders
  • an LLM server
  • a Stable Diffusion server
  • a media server

Just to name a few possibilities…

Every time I’ve looked into it, it seemed like the technology just wasn’t there yet. I remember a few years ago Linus TT took a shot at it, but in the end suggested the technology (for non-commercial entities) just wasn’t in a comfortable spot yet.

So how far off are we? Obviously AI-focused companies seem to make it work, but what possibilities exist for us self-hosters who might also want to run multiple displays in addition to web-GUI LLM servers? And without forking out crazy money for GPU virtualization software licenses?

  • Sethayy@sh.itjust.works · 5 months ago

    I currently have a setup exactly like this, with a Threadripper 2950X, an RX 6600, and a 2070 Super.

    Let me know if you have any questions about the specifics, but it’s 100% possible.

    The best part of this setup is being able to connect to both via Sunshine on many displays at once.

    • brownmustardminion@lemmy.ml (OP) · 5 months ago

      I’m curious about a more in-depth breakdown of your setup if you don’t mind. What is latency like, and how are you handling switching?

      • Sethayy@sh.itjust.works · 5 months ago (edited)

        I have a rack server in the garage with a gaming PC in it, two PSUs, and the two GPUs mentioned, all running on Debian (which I plan to swap to NixOS soon).
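
        (For anyone copying this: the prerequisite on the host is an enabled IOMMU. On an AMD platform like the Threadripper it looks roughly like the following; the kernel cmdline here is a common baseline, not my verbatim config.)

            # /etc/default/grub -- enable the IOMMU in passthrough mode (AMD CPU)
            GRUB_CMDLINE_LINUX_DEFAULT="quiet amd_iommu=on iommu=pt"

            # apply and reboot
            sudo update-grub
            sudo reboot

            # afterwards, check that the GPUs landed in their own IOMMU groups
            find /sys/kernel/iommu_groups/ -type l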

        The AMD GPU is passed through to a Windows VM with 8 GB or so of RAM, usually for VR development in the garage, though it sometimes gets streamed as well.
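
        The passthrough itself is the usual VFIO setup; a sketch with example IDs and addresses (find your own with lspci, this isn't my literal config):

            # find the card's PCI address and vendor:device ID
            lspci -nn | grep -E "VGA|Audio"

            # bind the AMD card to vfio-pci at boot (1002:73ff is an example
            # RX 6600 ID; the card's HDMI audio function needs binding too)
            echo "options vfio-pci ids=1002:73ff" | sudo tee /etc/modprobe.d/vfio.conf
            sudo update-initramfs -u

        Then hand the card to the Windows VM with virsh attach-device win10 gpu.xml --config, where gpu.xml is along these lines (the PCI address is an example):

            <hostdev mode='subsystem' type='pci' managed='yes'>
              <source>
                <address domain='0x0000' bus='0x0a' slot='0x00' function='0x0'/>
              </source>
            </hostdev>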

        The second, Nvidia GPU goes to my Linux machine running Ubuntu (chosen just for ease of patched Nvidia drivers), with a couple of virtual monitors set up via an X config like this; that VM is my daily driver with 16 GB of RAM.
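
        (Roughly along these lines; the monitor name and resolution are placeholders rather than my exact file. It's the standard trick for a virtual display on an Nvidia card with nothing plugged in:)

            # /etc/X11/xorg.conf inside the Linux VM -- virtual display, no physical monitor
            Section "Device"
                Identifier "nvidia"
                Driver     "nvidia"
                Option     "AllowEmptyInitialConfiguration" "true"
                Option     "ConnectedMonitor" "DFP-0"
            EndSection

            Section "Screen"
                Identifier "Screen0"
                Device     "nvidia"
                SubSection "Display"
                    Virtual 1920 1080
                EndSubSection
            EndSection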

        Both use VirtIO drivers for disk, network, and anything else I’m forgetting, with PCIe passthrough via KVM/QEMU on the host.
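
        In domain-XML terms that means paravirtualized devices along these lines (the disk path and bridge name are examples):

            <!-- disk on the virtio bus (the Windows guest needs the virtio drivers installed) -->
            <disk type='file' device='disk'>
              <driver name='qemu' type='qcow2' cache='none' io='native'/>
              <source file='/var/lib/libvirt/images/win10.qcow2'/>
              <target dev='vda' bus='virtio'/>
            </disk>

            <!-- network as a virtio NIC on a host bridge -->
            <interface type='bridge'>
              <source bridge='br0'/>
              <model type='virtio'/>
            </interface>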

        I’d say the latency hangs around 5 ms when streaming both at once, and it never comes close to saturating the gigabit connection, but I’m sure some optimisations could be done somewhere along the line.

        Clients run on anything from an Xbox Series X to a random PC, and hopefully soon an Orange Pi (though I’m worried about latency there).

        When I have a workload requiring both GPUs, I just keep two Moonlight windows open and use the keybinds to unfocus the mouse, then Alt+Tab to swap between them.
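
        Concretely that's just two moonlight-qt instances pointed at the two VMs (the hostnames here are examples):

            # one stream per VM; "Desktop" is the default app Sunshine exposes
            moonlight stream vm-windows "Desktop" &
            moonlight stream vm-linux "Desktop" &

            # default keybind to release mouse/keyboard capture is Ctrl+Alt+Shift+Z,
            # then Alt+Tab between the two windows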

        I don’t have any complaints, although one time, when my thermal setup was worse, I left two copies of Subnautica running so my wife and I could play Nitrox together, and the Linux machine did start to drop in FPS once we picked it back up after the games had been running AFK for an hour or two.

        Edit to add: I’m mostly using this for gaming right now, but it’s handled everything (within reason) that I’ve thrown at it. I’m also planning to set this up across a couple of other PCs soon, but as of right now the VMs feel like entirely distinct PCs from an external perspective.