

You’re talking about the worst of AI, which I agree should be dismantled. There are many smaller projects that do not do the things you mentioned, and it’s possible to support those while shunning corporate AI.


AI is not a monolith; there are a lot of tools out there that you don’t hear about because all the focus is on the large, corporate models that are meant to dehumanize. LLMs like Gemini, Grok, and ChatGPT are awful inventions that should be dismantled, but smaller ML projects found on GitHub shouldn’t be lumped in with them; the few that survive the bubble will do so because they prove effective.


Hey I’m against corporate AI too, but when anyone can create a very basic ML program that runs locally with public domain data, eventually something both useful and ethical will emerge. It’s good to be skeptical, but you don’t have to be an AI bro to see that some specific tools might meet or exceed your standards.
I don’t like image or video generators, but the core tech is really useful for frame interpolation, a use case that is not inherently controversial and badly needs improvement.
Sorry to not-x-it’s-y, but it’s not about forcing the big tool into your workflow; it’s about finding the 1001 little tools that work every time and collecting them. Or wait for those tools to be consolidated.
If I seem naive, it’s because I believe in reclaiming as much from tainted technology as possible.


“Alexander pulls out his bootable USB”


Most people are cool with some AI when you show them the small, non-plagiaristic stuff. It sucks that “AI” is such a big umbrella term, but the truth is that the majority of AI (measured in model size, usage, and output volume) is bad and should stop.
Neural network technology should not progress at the cost of our environment, short term or long term, and shouldn’t be used to dilute our collective culture and intelligence. Let’s not pretend the dangers aren’t obvious, and let’s push for regulation.


What’s Myst 6? Uru??
It gave happiness, that’s the point.


Waterfox has been pretty good lately


Chatbots often read as neurodivergent because they usually model one of our common constructed personalities: the faithful and professional helper that charms adults with their giftedness. Anything adults experienced was fascinating because it was an alien world more logical than the world of kids our own age, so we would enthusiastically chat with adults about subjects we’d memorized but didn’t yet understand, like science and technology.


Since “recursion” in humans has no commonly understood definition, Geoff and ChatGPT are each working off diverging understandings. If users don’t validate definitions, getting abstract with a chatbot will lead to conceptual breakdown… and that does not sound fun to experience.


You might be reading a lot into vague, highly conceptual, highly abstract language, but your conclusion is worth brainstorming about.
Personally, I think Geoff Lewis just discovered that people are starting to distrust him and others, and he used ChatGPT to construct an academic thesis that technically describes this new concept called “distrust,” void of accountability on his end.
“Why are people acting this way towards me? I know they can’t possibly distrust me without being manipulated!”
No wonder AI can replace middle-management…
And he had bespoke animations that were kinda charming
Beard looks good on him


I worked with an AS400 while in vehicle logistics; those things are optimized for simple functions but high data throughput.


If that dude had gotten proper support, he could have done wonders; he made animated icons for his 16-color assembly-coded OS, and a simple 3D racer! All by himself!
There’s a Fairly Odd Parents episode where Timmy wishes for this, and his asshole neighbors still considered themselves the best of the grey blobs.


I’m guessing OP means the build quality, as in the mechanical and material standards needed to recreate the keyboard.
That’s only if HR knew what they were talking about when crafting the listing. Not saying GOG will use AI for good, but we don’t know whether the job will require something like ChatGPT or something in-house that isn’t like GPT.