Altman’s remarks in his tweet drew an overwhelmingly negative reaction.

“You’re welcome,” one user responded. “Nice to know that our reward is our jobs being taken away.”

Others called him a “f***ing psychopath” and “scum.”

“Nothing says ‘you’re being replaced’ quite like a heartfelt thank you from the guy doing the replacing,” one user wrote.

  • funkless_eck@sh.itjust.works · 1 hour ago

    my (non-technical) boss proudly declared they had “vibe coded a new landing page for our contact us page”

    when you hit the submit button, it downloaded the website’s .ico to the temp files and nothing else.

    Perfect for a malicious code injection!

  • aliser@lemmy.world · 1 hour ago

    let’s hope that if robots take over, they dispose of those useless CEOs and billionaires

    • Earthman_Jim@lemmy.zip · 1 hour ago

      Yeah, they know that all they have is advanced predictive text and that if people see through the mysticism they’re trying to project (about this predictive text on steroids being the emergence of a technogod), they’re fucked.

  • bpinyon@lemmy.zip · 59 minutes ago

    Unless I’m mistaken, it took programmers to code AI in the first place.

    • village604@adultswim.fan · 28 minutes ago

      That’s what he’s saying. He’s thanking them for the effort and is saying the AI they created made them obsolete.

  • selokichtli@lemmy.ml · 3 hours ago

    I’ll take all the AI they throw at me once the governments start taxing the rich appropriately, and these taxes cover at least the basic needs of everyone. I’m not holding my breath; they always want everything.

    • Jakeroxs@sh.itjust.works · 2 hours ago

      I wish more people brought this up instead of just flinging shit at users/companies using AI.

      They’re fighting an unwinnable battle: AI is here to stay and it will keep improving. We have to adapt and ensure people are able to have their basic needs met.

      But that’s scary socialism or w/e.

  • Crozekiel@lemmy.zip · 4 hours ago

    And yet, if you walk into any discussion about LLM use in coding, devs come out in droves to defend and even champion its use…

    • selokichtli@lemmy.ml · 3 hours ago

      These companies have more money on their hands than they can carry; is it too crazy to think they may be somehow buying public perception? I’m not trying to make a point on this, I don’t have the time, just saying: this kind of thing happens all the time, and investing in public opinion is quite cheap for them.

    • RagingRobot@lemmy.world · 3 hours ago

      I think most engineers don’t really like it because it’s making things harder and way less fun. The people who usually seem all about it are engineering managers and C-suite idiots. They are all pushing it so hard. The message is clear: use the AI or be fired. It’s been sold to them and they bought it with everything they have. It’s not going to happen the way they think, though.

      AI tools are nice when I use them the way I want to. I like to do the stuff I am good at and have them help me with what I’m not so good at. At work they want us to just use it for everything, though, until it can just replace us. That’s bullshit, and it ruined any benefits we would have gained.

  • FauxLiving@lemmy.world · 5 hours ago

    So, we’re just trafficking in misinformation now?

    “Sam Altman Thanks Programmers for Their Effort”: true.

    “, Says Their Time Is Over”: complete fiction. Clickbait misinformation.

    The tweet:

    I have so much gratitude to people who wrote extremely complex software character-by-character. It already feels difficult to remember how much effort it really took.

    Thank you for getting us to this point.

    • judgyweevil@feddit.it · 2 hours ago

      I agree, the “Says Their Time Is Over” part is clickbaity, but the sentiment is the same. He’s thanking “manual programmers” for taking us to this point, clearly implying that from now on they will no longer be required.

      • Earthman_Jim@lemmy.zip · 1 hour ago

        And he’s out of his mind or lying, just the same. AI needs constant supervision, and if a human has to understand the code well enough to debug it, they may as well learn the code by writing it themselves…

    • JcbAzPx@lemmy.world · 3 hours ago

      He didn’t say both things at the same time. He is absolutely all about firing them; it’s his favorite pastime.

  • rodneylives@lemmy.world · 2 hours ago

    Yes, the age of all programming is over, because no new libraries or languages will ever be invented, and LLMs will thus always know everything there is to know about coding based on what’s already been written, which will never go obsolete.

    Honestly, mocking these things is SO EASY.

      • rodneylives@lemmy.world · 2 hours ago

        Assuming that’s true, and that’s a BIG assumption… What makes you think that would matter? AI has no interiority; it isn’t a thinking blob, it’s a text generator. Think of it as a fancy Markov chain.

        Even if it were true, where in the chain do new principles, new techniques, new concepts enter into it? All these forms of generative AI can do is regurgitate what’s been fed into them. The worst thing you can train an AI from is AI-generated output.
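        For intuition, the “fancy Markov chain” framing above amounts to a lookup table from each word to its observed successors, with the next word picked at random. A toy sketch (the corpus and function names are illustrative, not any real model’s internals):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=8):
    """Walk the chain, picking a random successor at each step."""
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the model predicts the next word and the next word only"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

        The point of the analogy: the generator can only ever emit transitions that appeared in its training text, which is the commenter’s regurgitation argument in miniature.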

        • village604@adultswim.fan · 23 minutes ago

          They used the word “future” for a reason. The technology is still being developed, so basing future predictions on its current state is silly.

        • Jakeroxs@sh.itjust.works · 2 hours ago

          Ever heard of skills? You can essentially “teach” it new things that are not directly available in its model. Right now it’s still pretty early, but (to me) it feels like quite a leap compared to model-only usage.

          It’s by no means perfect, but I do not think we’re even close to scratching the surface of what all can be done with the tech.

          I would bet people back at the advent of computers would scoff at many of the things computers can do now as fantasy.

          Edit: Right now, context size is a limiting factor, but you can do things like assign sub-agents to specific tasks/skills and have the overall agent call the sub-agent to complete the task, thereby reducing the context size needed for the skill on the original agent call; it sort of acts as a mediator. Of course, you still need to ensure you’re documenting what does and doesn’t work, and have that available for future tasks in the same vein so it doesn’t repeat mistakes.

          On your point about the underlying model used to train it, I imagine at some point there will be a breakthrough where it becomes more dynamic, I think skills are kind of a stepping stone to that. Maybe instead of models being gigantic, data is broken down into individual skills that are called to inform specific actions, and those skills can easily be dynamic already.
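          The delegation pattern described above can be sketched in a few lines: a top-level agent keeps only a routing table in its own context and hands each skill to a sub-agent that carries the skill-specific instructions. This is a minimal illustration with made-up function names, not any real agent framework’s API:

```python
# Hypothetical sub-agents, each standing in for an agent that would
# carry its own (large) skill-specific prompt and context.
def summarize_agent(task):
    return f"summary of: {task}"

def code_review_agent(task):
    return f"review of: {task}"

# The orchestrator's context only needs this routing table, not the
# full instructions for every skill.
SKILLS = {
    "summarize": summarize_agent,
    "review": code_review_agent,
}

def orchestrator(skill, task):
    """Mediator: look up the requested skill and delegate the task,
    keeping skill details out of the top-level agent's context."""
    agent = SKILLS.get(skill)
    if agent is None:
        raise ValueError(f"unknown skill: {skill}")
    return agent(task)

print(orchestrator("summarize", "the release notes"))
```

          In a real system each dictionary entry would dispatch a separate model call with its own prompt and context window; the sketch only shows why the top-level call stays small.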

  • WanderingThoughts@europe.pub · 2 hours ago

    So we have to make a new programming language that uses keywords that are slurs and curses, so LLMs aren’t allowed to use it. PI+: the Politically Incorrect programming language.