I haven’t thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations, and CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

Is there a platform that challenges that trend?

Edit: Good points were made. There is a lot to disagree with in the article, especially when it focuses on gaming.

Storage: For the love of your data, storage is a WEAR component, especially HDDs. Up until recently storage was so cheap it was crazy not to get new drives every few years.

Power supplies: Just because the computer still boots doesn’t mean the power supply is still good. A PSU will continue to shove power into your system long past its ability to provide clean power. Scope and test an older PSU before you put it in a new build.
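
To make “still good” concrete, here is a minimal sketch of the check a scope or multimeter session performs. The ±5% tolerance on the main positive rails is from the ATX spec; the helper function itself is hypothetical:

```python
# Minimal ATX rail sanity check (a sketch, not a real testing tool).
# The ATX spec allows roughly +/-5% deviation on the main positive
# rails; measure each rail while the system is under load.
NOMINAL_VOLTS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}
TOLERANCE = 0.05  # +/-5%

def rail_in_spec(rail: str, measured: float) -> bool:
    """True if a measured rail voltage is within ATX tolerance."""
    nominal = NOMINAL_VOLTS[rail]
    return abs(measured - nominal) <= nominal * TOLERANCE

# A tired PSU sagging to 11.2 V on the 12 V rail is out of spec,
# even though the machine may still boot.
```

The point of measuring under load is that a worn PSU often holds nominal voltage at idle and only sags or ripples when the GPU and CPU draw hard.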

  • anubis2814@lemmy.today
    link
    fedilink
    English
    arrow-up
    3
    ·
    1 hour ago

    This might be true for Intel, I don’t know; I use AMD. I know the limits of my CPU/GPU pairing. I bought the affordable low-end GPU for the CPU, and in 5 years I’ll upgrade to the upper-end GPU when it’s really cheap. Five years after that, I’ll get a new computer.

  • testaccount372920@piefed.zip
    link
    fedilink
    English
    arrow-up
    4
    ·
    9 hours ago

    The title of this article just doesn’t match reality. It really only (maybe) applies to very high end systems that are already pushing the limits of all components. Most people don’t have the money to waste on that and have plenty of room to upgrade their hardware for a looong time.

    If you don’t need much (e.g. no gaming, 3D rendering, etc.), especially if you don’t need a dedicated GPU, then you can upgrade for at least a decade before running into issues. To be fair, a laptop should last a decade as well in that case, but at a higher price and while being less repairable.

  • verdi@tarte.nuage-libre.fr
    link
    fedilink
    Français
    arrow-up
    7
    arrow-down
    1
    ·
    12 hours ago

    The manufacturing of consent to move your machine to the cloud has begun. We had a good run lads.

    • worhui@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      3
      ·
      5 hours ago

      You are literally the only person saying that out of this whole exchange.

      • verdi@tarte.nuage-libre.fr
        link
        fedilink
        Français
        arrow-up
        4
        ·
        edit-2
        3 hours ago

        "This persistent narrative in the media trying to talk consumers out of desktops as being viable options kind of sneakily ties into the greater “you will own nothing and you will be happy” narrative being pushed by big tech.

        It’s really obvious and it needs to be consistently called out for what it is."

        Literally the most upvoted comment in the linked article.

        I guess some frogs are just too stupid to figure out they’re being slow-boiled, and it’s up to us to carry the dead weight out of the pan…

  • fonix232@fedia.io
    link
    fedilink
    arrow-up
    18
    ·
    17 hours ago

    This is categorically untrue with the latest generations of chipsets, CPUs and GPUs. Just look at AMD instead of Intel: AM4/5 cross-compatibility, DDR4/DDR5 combined support and so on.

    If anything, today you can upgrade beyond your current-gen hardware component by component.

  • themachinestops@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    15
    ·
    20 hours ago

    Honestly, most people just upgrade the GPU and SSD, and after 10–15 years they buy a new desktop. Also, one of the biggest reasons to get a desktop is that it is cheaper than a laptop, lasts longer, and you can change any part that breaks. I had many laptops where one component basically made the entire device useless; if it had been a desktop it could easily have been fixed. Soldered RAM, for example.

    • worhui@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      17 hours ago

      This isn’t against desktops. It’s against the idea that a desktop is significantly more future proof than another form factor.

      • testaccount372920@piefed.zip
        link
        fedilink
        English
        arrow-up
        3
        ·
        9 hours ago

        The previous comment gives a pretty clear argument for why desktops are more future proof, I think. Being more repairable is a pretty big deal for the longevity of the whole system.

      • A_norny_mousse@feddit.org
        link
        fedilink
        English
        arrow-up
        3
        ·
        14 hours ago

        Not sure what “future proof” means, but my PC still has its original case from Windows Vista times, has seen 2 mobo replacements, 1 PSU replacement, and I don’t even know how many hard drive / SSD additions / swaps. RAM extensions too. Used to have a GPU but after the 2nd mobo/CPU replacement I dropped it.

        Different screens, keyboards, and mice.

        None of this would have easily been possible on a laptop.

        In a world where hardware is getting more expensive again you are really sending the wrong message here.

        Not to speak of environmental impact & consumerism.

        • worhui@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          2
          ·
          4 hours ago

          Your history sounds exactly like the spiral of component replacement that is being discussed. It sounds like you replaced everything multiple times but just kept the case.

          • WhyJiffie@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            1
            ·
            36 minutes ago

            Separately, part by part. If they had a laptop they would have needed to buy at least 6 complete laptops by that time, or, more realistically, give up on upgrades.

  • lightnsfw@reddthat.com
    link
    fedilink
    English
    arrow-up
    38
    ·
    1 day ago

    I have been ship of theseusing my desktop and server for 15 years. This article is fucking stupid.

  • brucethemoose@lemmy.world
    link
    fedilink
    English
    arrow-up
    49
    ·
    edit-2
    1 day ago

    That’s a huge generalization, and it depends what you use your system for. Some people might be on an old Threadripper workstation that works fine, for instance, and can just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.

    I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the body and use them all at once.

    I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.

    …That being said, there’s a lot of trends going against people, especially for gaming:

    • There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.

    • We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.

    • Time gaps between generations are growing as silicon gets more expensive to design.

    • …Buyers are collectively stupid and bandwagon. See: the crazy low end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.

    • Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

    • You can still keep your PSU, case, CPU heating, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.

    IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.

    • A_norny_mousse@feddit.org
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      1
      ·
      edit-2
      13 hours ago

      You nailed it, except “huge generalization” is actually being generous. The article is simply wrong. The author is speaking esoteric technobabble:

      The upgrade death spiral (…) happens because upgrading one component of your computer can unbalance the system.

      It’s the sort of argument a husband might give his not tech savvy wife when she asks why he repeatedly needs to spend so much $$$ on something only he uses.

      I think FOMO says it pretty well, or simply consumerism.

      Now that hardware is getting more expensive again, this is really sending the wrong message.

      And OP keeps doubling & tripling down despite basically every comment disagreeing. I think they wrote that article.

    • claymore@pawb.social
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      12 hours ago

      Don’t forget about PCIe expansion. Just yesterday I got a FireWire PCIe card for 20€ to transfer old DV tapes to digital with no quality loss. Plug the card in and you’re done. To get the same result on a laptop I’d need a Thunderbolt port and two adapters, one of which isn’t manufactured anymore and goes for 150€+ on secondhand stores.

      PS. I would remove “CPU heating” from your system if I were you :)

    • kreskin@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      23 hours ago

      If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

      While throwing out working things is terrible, the cost of servicing a motherboard outpaces the cost of replacing it. They can possibly still charge you 200 dollars and tell you the board can’t be fixed, right? I think the right balance is: observe the warranty period, try to troubleshoot it yourself, and then call it a day, unless you have a 400+ dollar motherboard.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        19 hours ago

        Yeah, probably. I actually have no idea what they charge, so I’d have to ask.

        It’d be worth it for a 3090 though, no question.

    • worhui@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      22 hours ago

      Typically I’ve seen a motherboard support about 2 generations of GPU before some underlying technology means it can no longer keep up.

      If you are going from a 30-series to a 50-series GPU, there is going to be a need for increased PCIe bandwidth, in terms of both lanes and PCIe spec, for it to be fully utilized.

      I just saw this play out with a coworker who replaced 2x 3090 with a 5090. The single card is faster, but now he can’t fully task his storage and GPU at the same time due to PCIe lane limits. So it’s a new motherboard, which needs a new CPU, which needs new RAM.

      Basically, a 2-generation GPU upgrade needs a whole new system.

      Each generation of PCIe doubles bandwidth, so a future x2 PCIe 6.0 GPU will need an x8 PCIe 4.0’s worth of bandwidth.
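
      The doubling is easy to sketch; the per-lane figures below are approximate (PCIe 4.0 moves roughly 2 GB/s per lane) and the helper is just an illustration:

```python
# Approximate PCIe link bandwidth: each generation doubles the
# per-lane rate of the previous one (PCIe 4.0 ~= 2 GB/s per lane).
def pcie_bandwidth_gbps(generation: int, lanes: int) -> float:
    per_lane = 2.0 * 2 ** (generation - 4)  # GB/s per lane, anchored at gen 4
    return per_lane * lanes

# An x2 PCIe 6.0 link carries as much data as an x8 PCIe 4.0 link.
assert pcie_bandwidth_gbps(6, 2) == pcie_bandwidth_gbps(4, 8)
```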

      Even then, GPUs and CPUs have been getting more power hungry. Unless you over-spec your PSU, there is a reasonable chance that once you get past 2 GPU generations you need a bigger PSU. Power supplies are wear items: they continue to function, but may not provide power as cleanly once you get to 5+ years of continuous use.

      Sure, you can keep the case and PSU, but literally everything else will run over Thunderbolt or USB-C without penalties.

      At that point, why not run any sizeable storage outside the box? Anything fast runs on internal NVMe.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        19 hours ago

        This doesn’t make any sense, especially the 2x 3090 example. I’ve run my 3090 at PCIe 3.0 over a riser, and there’s only one niche app where it ever made any difference. I’ve seen plenty of benches show PCIe 4.0 is just fine for a 5090:

        https://gamersnexus.net/gpus/nvidia-rtx-5090-pcie-50-vs-40-vs-30-x16-scaling-benchmarks

        A single 5090 uses the same net bandwidth, and half the PCIe lanes, as 2x 3090.
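
        To spell out the lane math (per-lane rates are approximate):

```python
# 2x RTX 3090: PCIe 4.0 x16 each, ~2 GB/s per lane.
# 1x RTX 5090: PCIe 5.0 x16, ~4 GB/s per lane.
GEN4_PER_LANE = 2.0  # GB/s, approximate
GEN5_PER_LANE = 4.0  # GB/s, approximate

two_3090_gbps = 2 * 16 * GEN4_PER_LANE  # 64 GB/s across 32 lanes
one_5090_gbps = 1 * 16 * GEN5_PER_LANE  # 64 GB/s across 16 lanes

# Same aggregate bandwidth, with 16 lanes freed up for other devices.
assert two_3090_gbps == one_5090_gbps
```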

        Storage is, to my knowledge, always on a separate bus from graphics, so that also doesn’t make any sense.

        My literally ancient TX750 still worked fine with my 3090, though it was moved. I’m just going to throttle any GPU that uses more than 420W anyway, as that’s ridiculous and past the point of diminishing returns.

        And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.


        I hate to be critical, and there are potential issues, like severe CPU bottlenecking or even instruction support. But… I don’t really follow where you’re going with the other stuff.

        • worhui@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          2
          ·
          17 hours ago

          And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.

          That is the point of the article.

          The problem my friend has is that he is rendering video, so he has a high-performance SAS host adapter on the same PCIe bus as the GPU. He upgraded both, hoping the 5090 would play nicer with the SAS adapter, but he can’t pull full disk bandwidth and render images at the same time. Maybe it’s OK for gaming, but not for compute plus writing to disk.

          The thing with power supplies is that they continue to provide enough power long after they lose the ability to provide clean power under load. Only when they are really on their last legs will they actually stop providing the rated power. I have seen a persistent networking issue resolved by swapping a power supply. Most of the time you don’t test a power supply under load to check that each rail is staying where it needs to be.

  • sorghum@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    16
    ·
    1 day ago

    Disposable my ass. I just did the final upgrades to my AM4 platform to make it my main rig for the next 5 years. After that it will get a storage upgrade and become a NAS and do other server stuff. This computer, 7 years in, has another 15 left in it.

    • Lfrith@lemmy.ca
      link
      fedilink
      English
      arrow-up
      6
      ·
      19 hours ago

      Yeah, it’s crazy that someone could have gotten, like, a Ryzen 5 1600, then upgraded to a 5800X3D around 5 years later without needing to buy a new motherboard, which usually also means having to buy a new set of RAM.

      For a long time, when Intel was dominant, doing a whole new build whenever you upgraded to a newer CPU was just the way things were.

      • sorghum@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        1
        ·
        17 hours ago

        Yeah, I usually over-spec when I build my main rig because I want it to last so I can repurpose it later down the road. I finally retired a power supply that I bought back in the mid-2000s. It can’t power modern cards anymore, unfortunately. 🫡 PC Power & Cooling single-rail, take a break. You’ve earned it.

  • Kristell@herbicide.fallcounty.omg.lol
    link
    fedilink
    English
    arrow-up
    9
    ·
    22 hours ago

    Yes, desktop PCs challenge that trend. If you’re not chasing the newest of the new, you can keep using your old stuff till it dies. I’ve done one CPU upgrade, and a GPU upgrade, to my desktop in the eight years I’ve owned it, and it handles all of my games fine.

    If you’re changing the motherboard, you’ll usually need a new CPU, and sometimes RAM. As long as your mobo has a PCI/PCIe slot you can shove your old graphics card in there. Unless there’s a new RAM version, you don’t need to replace the RAM, and SATA’s been the standard storage connector for how long now?

    Unless you’re going above your current PSU’s rating that thing’s good until it’s dead.

    I just don’t see how this argument holds up. If your motherboard is old enough that they no longer make your CPU/RAM socket, and you’re looking to upgrade, chances are very good that thing’s lived far longer than most laptops would be expected to. But like. When I built my current desktop 8 years ago, it had 8 GB of RAM, a graphics card I don’t remember, a Pentium G-something processor, and about 1 TB of storage. It has an i7 (don’t remember the generation offhand), an R9 290, 32 GB of RAM, and 7 TB of storage now. Same motherboard. If I replace it I will need a new processor and new RAM (the RAM is actively dying, so I haven’t been using it much), but these parts are all nearly a decade old, with the exception of the RAM. Well, one RAM stick is 8 years old, but that’s beside the point.

    This just doesn’t line up with my own personal experience?

    • worhui@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      3
      ·
      16 hours ago

      Unless you’re going above your current PSU’s rating that thing’s good until it’s dead.

      Power supplies will work well past the point of providing clean, in-spec power on each rail. Lots of parts in a power supply can stop working properly before it physically stops passing power.

      Unless the PSU is relatively new, it’s not a great idea to put it into a new build without testing that it is still in spec on each voltage rail under load.

  • m-p{3}@lemmy.ca
    link
    fedilink
    English
    arrow-up
    27
    ·
    1 day ago

    Personally I still prefer the desktop because I can choose exactly where I prefer performance, and where I can make some tradeoffs. Also, parts are easier to replace when they fail, making them more sustainable. You don’t have that choice with a laptop since it’s all prebuilt.

    • socphoenix@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      ·
      1 day ago

      Desktops also offer better heat dissipation, and peripheral replacements extend the life of the unit. Frankly, it can be difficult for most folks to replace a laptop display or even a battery nowadays.

  • [deleted]@piefed.world
    link
    fedilink
    English
    arrow-up
    10
    ·
    23 hours ago

    AMD challenges that trend, but the article writer dismisses them because of Intel’s market share.

    Terrible article.

  • A_norny_mousse@feddit.org
    link
    fedilink
    English
    arrow-up
    14
    ·
    1 day ago

    CPUs are the same with real performance needed a new chipset and motherboard. At that point you are replacing the whole system.

    I find the quoted statement untrue. You still have all peripherals, including the screen, the PSU, and the case.

    You can replace components as and when it becomes necessary.

    You can add hard drives, instead of replacing a smaller one with a larger one.

    Desktop mobos are usually more upgradeable with RAM than laptops.

    There’s probably more arguments that speak against the gist of this article.

    • worhui@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      2
      ·
      22 hours ago

      All of the peripherals will carry over to any new system. With USB-C, basically all you need to run inside your case is a GPU and NVMe storage.

      Throw in Thunderbolt, and networking as well as HDD-based DAS won’t be bottlenecked.

      Yeah, desktops can have more RAM than laptops, and that is the one case where a desktop can really shine. Even then, there is usually a pretty high RAM ceiling you need to exceed before it matters.

  • Rimu@piefed.social
    link
    fedilink
    English
    arrow-up
    12
    ·
    1 day ago

    Laptop CPUs are crippled garbage compared to desktop CPUs of the same generation. So there’s that.

  • Tywèle@piefed.social
    link
    fedilink
    English
    arrow-up
    8
    ·
    23 hours ago

    I don’t agree with this article. Everyone I know usually upgrades their GPU until the CPU is bottlenecking it heavily and that is only the case after a few GPU upgrades.

    • Lfrith@lemmy.ca
      link
      fedilink
      English
      arrow-up
      3
      ·
      20 hours ago

      Yeah, and when the CPU is the bottleneck, you upgrade the CPU, mobo, and RAM but not the GPU.

      This time, though, I only upgraded the CPU, since AM4 supported multiple generations of CPUs. One of the best things to happen to PC building.