It seems like it would be trivial for them to reduce quality control and have customers just “deal with” chips that aren’t as stable. How come they aren’t doing this?

  • AskewLord@piefed.social · edit-2 · 1 day ago

    Because it would tank their reputation, and nobody would buy their chips.

    You have to understand, chips are commodity items. They aren’t ‘sexy’ or marketed heavily, and people don’t choose one chip brand over another based on how ‘sexy’ it is. They mostly just don’t care as long as it works. Technology at this point is pretty much a commodity/utility for most people and most use cases, apart from high-end, low-volume applications like PC gaming.

    The reason a car manufacturer can get away with this is that the ‘image’ of the car is so appealing that people will overlook how shit it is and how poorly it’s built. Computer chips don’t have sex or image appeal in this way, so they really can’t afford to produce a shitty, unreliable product.

    Consumer-grade chips are already ‘low quality’ compared to the server-end stuff anyway. They have looser tolerances and lack error correction (e.g. ECC memory support) and other features that are necessities for server-level computing. Not to mention the really high-end stuff you find in supercomputing and military/industrial hardware, etc.