Altman’s remarks in his tweet drew an overwhelmingly negative reaction.
“You’re welcome,” one user responded. “Nice to know that our reward is our jobs being taken away.”
Others called him a “f***ing psychopath” and “scum.”
“Nothing says ‘you’re being replaced’ quite like a heartfelt thank you from the guy doing the replacing,” one user wrote.
Hahahaha… go fuck yourself, you miserable thief
This dude is an empty husk of a human being. Sim Notman
He and Mr. Beast are the perfect couple
my (non-technical) boss proudly declared they had “vibe coded a new landing page for our contact us page”
when you hit the submit button, it downloaded the website’s .ico to the temp files and nothing else.
Perfect for a malicious code injection!
lets hope if robots take over they dispose of those useless ceos and billionaires
All these AI CEOs seem to be getting pretty desperate these days. Do they know something we don’t?
Yeah, they know that all they have is advanced predictive text and that if people see through the mysticism they’re trying to project (about this predictive text on steroids being the emergence of a technogod), they’re fucked.
Unless I’m mistaken, it took programmers to code AI in the first place.
That’s what he’s saying. He’s thanking them for the effort and is saying the AI they created made them obsolete.
Lol, the “AI” that can write functioning, optimized software, especially for niche stuff? I have yet to see it
Nah, Sam. I think it’s your time that’s soon over…
Incestuous sex pest Sam Altman?
wait what
His sister is suing him and says he sexually assaulted her for the better part of a decade.
I’ll take all the AI they throw at me once the governments start taxing the rich appropriately, and these taxes cover for at least the basic needs of everyone. I’m not holding my breath, they always want everything.
I wish more people brought this up instead of just flinging shit at users/companies using AI.
They’re fighting an unwinnable battle; AI is here to stay and it will keep improving. We have to adapt and ensure people are able to have their basic needs met.
But that’s scary socialism or w/e.
And yet, if you walk into any discussion about LLM use in coding, devs come out in droves to defend and even champion its use…

These companies have more money on their hands than they can carry. Is it too crazy to think they may be somehow buying public perception? I’m not trying to make a point on this, I don’t have the time, just saying: this kind of thing happens all the time, and investing in public opinion is quite cheap for them.
I think most engineers don’t really like it because it’s making things harder and way less fun. The people who usually seem all about it are engineering managers and C-suite idiots. They are all pushing it so hard, and the message is clear: use the AI or be fired. It’s been sold to them and they bought it with everything they have. It’s not going to happen the way they think, though.
AI tools are nice when I use them the way I want to. I like to do the stuff I am good at and have it help me with what I’m not so good at. At work they want us to just use it for everything, though, until it can just replace us. That’s bullshit, and it ruined any benefits we would have gained.
So, we’re just trafficking in misinformation now?
“Sam Altman Thanks Programmers for Their Effort”: True.
“, Says Their Time Is Over”: Complete fiction. Clickbait misinformation.
The tweet:
I have so much gratitude to people who wrote extremely complex software character-by-character. It already feels difficult to remember how much effort it really took.
Thank you for getting us to this point.
I agree the “Says Their Time Is Over” part is clickbaity, but the sentiment is the same. He’s thanking “manual programmers” for getting us to this point, clearly implying that from now on they will no longer be required.
And he’s out of his mind, or lying, just the same. AI needs constant supervision, and if a human has to understand the code well enough to debug it, they may as well learn the code by writing it themselves…
He didn’t say both at the same time, but he is absolutely all about firing them. It’s his favorite pastime.
If you swap “says” with “hints”, it works though.
If you swap “says” with “doesn’t say” it also works.
Yes, the age of all programming is over, because no new libraries or languages will ever be invented, and LLMs will thus always know everything there is to know about coding based on what’s already been written, which will never go obsolete.
Honestly, mocking these things is SO EASY.
What makes you think AI won’t be writing libraries or languages in the future?
Assuming that’s true, and that’s a BIG assumption… What makes you think that would matter? AI has no interiority; it isn’t a thinking blob, it’s a text generator. Think of it as a fancy Markov chain.
Even if it were true, where in the chain do new principles, new techniques, new concepts enter into it? All these forms of generative AI can do is regurgitate what’s been fed into them. The worst thing you can train an AI on is AI-generated output.
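The “fancy Markov chain” comparison above can be made concrete with a toy word-level chain. This is a deliberate oversimplification (real LLMs condition on long contexts with learned weights, not raw successor counts), but it shows the core idea the commenter means: the generator only ever emits successors it has already seen in its training text.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, n=8, seed=0):
    """Walk the chain: repeatedly pick a random observed successor."""
    random.seed(seed)
    out = [start]
    for _ in range(n):
        successors = chain.get(out[-1])
        if not successors:  # dead end: no observed successor
            break
        out.append(random.choice(successors))
    return " ".join(out)

# Tiny hypothetical corpus just for illustration.
corpus = "the model predicts the next word and the next word only"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

Note that every word the generator can produce is already in the corpus; nothing new ever enters the chain, which is exactly the regurgitation point being argued above.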
They used the word future for a reason. The technology is still being developed so basing future predictions on the current state is silly.
Ever heard of skills? You can essentially “teach” it new things that are not directly available in its model. Right now it’s still pretty early, but (to me) it feels like quite a leap compared to model-only usage.
It’s by no means perfect, but I do not think we’re even close to scratching the surface of what can be done with the tech.
I would bet people back at the advent of computers would scoff at many of the things computers can do now as fantasy.
Edit: Right now, context size is a limiting factor, but you can do things like assign sub-agents to specific tasks/skills and have the overall agent call the sub-agent to complete the task, thereby reducing the context size the original agent needs for that skill; it sort of acts as a mediator. Of course, you still need to ensure you’re documenting what does and doesn’t work, and have that available for future tasks in the same vein, so it doesn’t repeat mistakes.
On your point about the data used to train the underlying model, I imagine at some point there will be a breakthrough where it becomes more dynamic; I think skills are kind of a stepping stone to that. Maybe instead of models being gigantic, data is broken down into individual skills that are called to inform specific actions, and those skills can already easily be dynamic.
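The mediator/sub-agent pattern described above can be sketched in a few lines. This is purely illustrative, not any real agent framework’s API; all names (`SUBAGENTS`, `mediator`) are hypothetical. The point is only the shape of the idea: the main agent routes a task by kind, and each sub-agent carries its own small, skill-specific context instead of the main agent loading everything at once.

```python
# Hypothetical sub-agents, each pretending to hold only its own small,
# skill-specific context rather than one giant shared prompt.
SUBAGENTS = {
    "sql": lambda task: f"[sql sub-agent, SQL-only context] handled: {task}",
    "docs": lambda task: f"[docs sub-agent, docs-only context] handled: {task}",
}

def mediator(task, kind):
    """Route the task to the sub-agent for `kind`; the main agent never
    loads that skill's context itself, keeping its own context small."""
    agent = SUBAGENTS.get(kind)
    if agent is None:
        return f"[main agent] no sub-agent for {kind!r}, handling directly: {task}"
    return agent(task)

print(mediator("optimize this query", "sql"))
```

In a real system the sub-agent call would be a fresh model invocation with its own prompt, which is what actually saves context on the mediating agent.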
The arrogance is astounding
Executives, your time is over.
So we have to make a new programming language that uses keywords that are slurs and curses, so LLMs aren’t allowed to use it. PI+: the Politically Incorrect programming language.