• Skullgrid@lemmy.world · 5 points · 1 hour ago

      Supply will catch up with demand. High PC component prices are a temporary thing.

      We've been saying that about housing since 2008.

  • NannerBanner@literature.cafe · 22 points · 6 hours ago

    Lol, I feel bad for anyone new to the PC building community. At least those of us still on 10+ year old computers can play most of the indie games coming out. I AM still surprised by how intensive some games can be when they look like Minecraft downgrades.

    • lightnsfw@reddthat.com · 8 points · 3 hours ago

      Kind of funny story: I launched Stardew Valley yesterday and my displays absolutely shit themselves, even though my graphics card is pretty new. Turned out that Nvidia's stupid app had changed the display settings to something weird. I had to manually flip it back to borderless and that fixed it, but at first I was like “how, out of everything I’ve played, is this the one having problems?”

      • AmbientChaos@sh.itjust.works · 1 point · 2 hours ago

        My wife started Stardew Valley the other day and we also had display issues trying to output 4K. Still had to max out the zoom, and even then the dialog boxes are cut off until you zoom out. Unlucky.

  • rose56@lemmy.zip · 6 points · 7 hours ago

    I guess we won’t do anything about that, especially when we have the power in our hands.

  • bigchungus@piefed.blahaj.zone · 25 points · 10 hours ago

    You see, the big mistake in 2029 was the person installing Windows. Now they can see the horrible data center right outside of their house. As they say, out of sight, out of mind.

  • trashcroissant@lemmy.blahaj.zone · 67 points · 13 hours ago

    Jesus, I had to do a double take because I thought the stick person had somehow trapped a little human inside a pod for their entertainment and I was so confused.

  • real_squids@sopuli.xyz · 27 points · 14 hours ago

    You can pry my “double” slot GPU out of my cold dead hands (good luck trying to run away with it, it’s heavy as fuck and needs a supporting post)

    • autriyo@feddit.org · 11 points · 9 hours ago

      Tbh, the whole card format feels very legacy, even for my Vega 56 “dual slot” card, and that thing “only” consumes ~230W.

      If people back then could’ve foreseen what obscenely power-hungry parts would be shoehorned into the expansion card format, they probably would’ve chosen a different approach for GPUs specifically.

      • real_squids@sopuli.xyz · 1 point · edited 2 hours ago

        I was joking about it being 2.5 slots; tbf, most modern cards should be triple slot. Mine is 300W and it’s pretty chunky to stay below 60°C. The best option for big cards is a horizontal mobo, imo.

        edit: unpopular opinion, but I’d rather have a chunky card that stays cool as fuck than a slim one. That’s why I picked up the Nitro when I had a 6650XT.

      • kibiz0r@midwest.social · 6 points · 7 hours ago

        iGPUs should have been a better option, but they were hamstrung by PCI conventions and graphics APIs favoring discrete VRAM.

        (Just look at how x86 SoC consoles run circles around similar-spec PCs.)

        I’m hoping that ARM is a chance to reset.

        • autriyo@feddit.org · 3 points · 6 hours ago

          I do like the modularity of discrete GPUs though.

          But a cooling setup similar to CPUs would’ve been better for airflow.

          • real_squids@sopuli.xyz · 1 point · edited 2 hours ago

            Kinda hard to do when so many GPU vendors slap their memory and power circuitry all over the place. Even if the die is in the same place, cooler manufacturers would need to test-fit a bajillion models, and on top of that they’d need insane R&D budgets to keep up with new additions, sometimes coming years after the original GPU comes out.

          • kibiz0r@midwest.social · 3 points · 5 hours ago

            Modularity is nice — both for personal preference reasons and incentivizing-market-competition reasons — but it does come at a cost.

            The thing is: even in our modular world right now, you don’t really have many choices. Two CPU companies, three GPU companies (two of them being the same as the CPU companies)…

            We could someday have a world where PC hardware is technically less modular than it is today but consumers have more choices in the marketplace than they do today.