Samsung is reportedly preparing to wind down its SATA SSD business, and a notable hardware leaker warns the move could have broader implications for consumer storage pricing than Micron’s decision to end its Crucial RAM lineup. The report suggests reduced supply and short-term price pressure may follow as the market adjusts.
Again, I don’t buy this. The training data isn’t actually that big, and training runs at that scale don’t happen all that frequently.
As we approach the irreducible error limit for LLMs, the floor implied by the scaling laws fit empirically in OpenAI’s 2020 paper and revised in DeepMind’s 2022 follow-up, the required training compute and power costs grow without bound.
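Assuming the papers meant here are the OpenAI scaling-laws paper (Kaplan et al., 2020) and DeepMind’s Chinchilla paper (Hoffmann et al., 2022), the blow-up can be sketched with the Chinchilla loss fit L(N, D) = E + A/N^α + B/D^β. A minimal Python sketch, using the fitted constants reported in the Chinchilla paper (treat them as illustrative, not exact):

```python
# Chinchilla-style scaling law: L(N, D) = E + A/N**alpha + B/D**beta.
# Constants below are the fitted values reported by Hoffmann et al. (2022).
E, A, B = 1.69, 406.4, 410.7      # E is the irreducible ("entropy") loss
alpha, beta = 0.34, 0.28

def min_params_for_loss(target_loss, reducible_split=0.5):
    """Parameter count N needed if a `reducible_split` share of the gap
    above E is closed by the model-size term A/N**alpha.
    Diverges as target_loss approaches E."""
    gap = target_loss - E
    if gap <= 0:
        return float("inf")       # at or below the irreducible loss: impossible
    return (A / (gap * reducible_split)) ** (1 / alpha)

for target in (2.1, 1.9, 1.8, 1.75, 1.70):
    print(f"target loss {target}: ~{min_params_for_loss(target):.3g} parameters")
```

Because the remaining gap (target − E) sits inside a negative power, halving the gap multiplies the required parameter count by roughly 2^(1/α) ≈ 8, which is where the "costs rise to infinity" intuition comes from.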
Beyond that, companies may keep many nearly identical copies of a dataset, each tweaked to try for a different outcome.
Things like books and Wikipedia pages aren’t that bad: Wikipedia itself, compressed, is only about 25 GB, and maybe a few hundred petabytes could store most items of that kind. But images and video are also valid training data, and those are much larger, and then there is readable code. On top of that, every user input has to be stored so it can be referenced again later, if the chatbot offers that service.
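The storage argument above can be put in rough numbers. Only the ~25 GB compressed-Wikipedia figure comes from the text; every other size below is a made-up placeholder purely to illustrate how text is dwarfed by images and video:

```python
# Rough order-of-magnitude sketch of the storage breakdown.
# Only the Wikipedia figure is from the text; the rest are placeholders.
GB, TB, PB = 1, 1_000, 1_000_000   # all sizes expressed in gigabytes

corpus = {
    "wikipedia (compressed)": 25 * GB,    # figure cited in the text
    "books (hypothetical)":   50 * TB,    # placeholder
    "code (hypothetical)":    10 * TB,    # placeholder
    "images (hypothetical)":  5 * PB,     # placeholder
    "video (hypothetical)":   200 * PB,   # placeholder
}

total_gb = sum(corpus.values())
print(f"total: ~{total_gb / PB:.1f} PB")
for name, size in corpus.items():
    print(f"{name:24s} {100 * size / total_gb:6.2f}% of total")
```

Even with generous placeholder numbers for text, the video row dominates the total, which is the point being made: the text corpora are a rounding error next to image and video data.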