This isn’t sustainable. Almost all of our infrastructure runs on computers, and eventually we’ll hit a point where a computer in charge of some vital system fails and you can’t buy replacement parts for it.
We used to get by with much less. If only we could start writing more efficient software again…
There’s existing infrastructure that runs on hardware from the 1980s. Especially in industrial applications, there are still plenty of gigantic machines controlled by a 386 or a C64.
The used vintage market can keep these running for a long time. Eventually you replace them with an emulator or an FPGA that runs the same software.
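At its core a software emulator is just a fetch–decode–execute loop. Rough sketch below for a made-up 8-bit machine (not any real ISA like the 386 or the C64’s 6510, just to show the shape of it):

```python
# Toy fetch-decode-execute loop for an invented 8-bit machine.
# Purely illustrative -- the opcodes and layout are made up.

MEM_SIZE = 256

def run(program: bytes) -> int:
    mem = bytearray(MEM_SIZE)
    mem[: len(program)] = program
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op = mem[pc]                    # fetch opcode
        arg = mem[(pc + 1) % MEM_SIZE]  # fetch operand
        pc = (pc + 2) % MEM_SIZE
        if op == 0x01:                  # LOAD immediate
            acc = arg
        elif op == 0x02:                # ADD immediate
            acc = (acc + arg) & 0xFF
        elif op == 0x03:                # STORE to address
            mem[arg] = acc
        elif op == 0xFF:                # HALT
            return acc
        else:
            raise ValueError(f"bad opcode {op:#x}")

# LOAD 2; ADD 40; HALT  ->  prints 42
print(run(bytes([0x01, 2, 0x02, 40, 0xFF, 0])))
```

A real emulator adds timing, interrupts, and peripheral emulation on top, but the skeleton is exactly this loop, which is why old control software can outlive its hardware.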
Big banking, insurance, airlines, shipping, governments, militaries bought huge IBM mainframes from the 1960s onwards. They ran for decades. Many of these were migrated onto virtual machines, still running their ancient COBOL and FORTRAN code.
There’s also the story of the Minuteman nuclear missiles’ command systems running on 8-inch floppies; those were still operational well into the 2010s. Lots of military weapons systems run on ancient hardware.
Banks, insurance, and aviation all run on very well-tested, established code, and are very, very resistant to change.
But people who know COBOL and Fortran are getting fewer and farther between, so these systems are slowly changing. Fortunately, with modern software development practices you can much more easily write verified software.
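Property-based testing is one of those practices. Quick sketch using the (real) hypothesis library; the money-conversion routine is just a made-up stand-in for some rewritten legacy logic:

```python
from hypothesis import given
from hypothesis import strategies as st

# Hypothetical stand-in for a routine ported off a mainframe:
# convert dollars/cents into integer cents for ledger math.
def to_cents(dollars: int, cents: int) -> int:
    return dollars * 100 + cents

# hypothesis generates hundreds of input combinations and shrinks
# any failure down to a minimal counterexample.
@given(st.integers(min_value=0), st.integers(min_value=0, max_value=99))
def test_roundtrip(dollars, cents):
    assert divmod(to_cents(dollars, cents), 100) == (dollars, cents)
```

Run it with pytest and it tries a pile of generated cases instead of the three you’d have written by hand. Not full formal verification, but a long way past what the original code ever got.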
Don’t forget the IRS! They’ve been on 1980s equipment since…well… the eighties!
Stuff is just getting more expensive because of demand competition with AI. There is no reason to think that production for non-AI computing will ever hit literally zero.
Overall production != local availability or accessibility
Nah, all of the datacenters they build for AI will come into use then.
They will say, "Need computing? Don’t worry, just rent from us, for an ever-increasing and enshittifying subscription."
You can’t interact with a computer in the cloud though without some kind of computer in front of you.
We’ll just return to terminals. Just a screen and input devices connected to a server :(
Right, but surely you still need a CPU and RAM at the very least to process the Remote Desktop connection.
A bit of RAM, but then I’d imagine you only need some purpose-built chip for the connection, input, and display logic. Effectively you’d need little more than a Chromecast-like device.
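Something in this direction, conceptually. A minimal sketch, assuming a made-up length-prefixed framebuffer protocol (not actual RDP or VNC), just to show how little the client end has to do:

```python
import socket
import struct

HOST, PORT = "192.0.2.10", 5900   # placeholder server address

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes or raise if the server goes away."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed connection")
        buf += chunk
    return buf

with socket.create_connection((HOST, PORT)) as sock:
    while True:
        # 4-byte big-endian length header, then raw frame bytes
        (length,) = struct.unpack("!I", recv_exact(sock, 4))
        frame = recv_exact(sock, length)
        # blit(frame)  <- hand off to whatever drives the panel
        # keyboard/mouse events would get struct-packed and sent back
```

Everything heavy (rendering the desktop, running the apps) stays on the server; the client is basically a socket, a decoder, and a display controller.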
I’m going back to my old Mac III. Screw this nonsense
Chips of any kind are the issue. All the fab capacity is being diverted to cover these insane datacenter orders. Like, they’re backordered out over a year at this point. All to boost a tech bubble for a product that doesn’t work.
We can’t even get them to upgrade our infrastructure to the 21st century in some cases, so good luck with that. We’ve still got shit running on Windows 7 or even Windows XP.
Windows 7? Don’t moan, it was the last good Windows. Plus all the themes and hacks you could get for XP. Times were good.
Have you considered that…that is the intent?
A lot of our infrastructure still uses floppy discs.
Yup
BART looking smug because there’s no vacuum tube shortage.