We can’t even get them to upgrade our infrastructure to the 21st century in some cases, so good luck with that. We’ve still got shit running on Windows 7 or even Windows XP.
Nah, all of the datacenters they build for AI will come into use then.
They’ll say, "Need computing? Don’t worry, just rent from us, for an ever-increasing and enshittifying subscription."
Windows 7? Don’t moan, it was the last good Windows. Plus all the themes and hacks you could get for XP. Times were good.
You can’t interact with a computer in the cloud though without some kind of computer in front of you.
We’ll just return to terminals. Just a screen, and input devices connected to a server :(
Right, but surely you still need a CPU and RAM at the very least to process the remote desktop connection.
A bit of RAM, but then I’d imagine you only need some purpose-built chip for the connection, input, and display logic. Effectively you’d need little more than a Chromecast-like device.
Chips of any kind are the issue. All the silicon fabs are being diverted to cover these insane datacenter orders. They’re backordered out over a year at this point. All to prop up a tech bubble for a product that doesn’t work.
I’m going back to my old Mac III. Screw this nonsense.