I came across this article in another Lemmy community that dislikes AI. I’m reposting instead of cross-posting so that we can have a conversation here about how “work” might be changing with advancements in technology.

The headline is clickbaity: Altman was referring to how farmers of decades past might perceive that the work “you and I do today” (Altman included) doesn’t look like work.

The fact is that most of us work at many levels of abstraction from human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor jobs that humans were forced to do generations ago.

In my first job, IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see, no produce to harvest, no pile of wood transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

As humanity’s core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn’t seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they’re made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
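
To make the bookkeeping example concrete, here’s a minimal sketch of what I mean by “automatically encodes” (no particular accounting package in mind; the account names and the amount are invented for illustration):

```python
# Minimal double-entry sketch: software posts a balanced transaction
# automatically at the point of sale, with no bookkeeper in the loop.
# Account names and amounts are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Ledger:
    entries: list = field(default_factory=list)

    def post(self, debit: str, credit: str, amount: float, memo: str = "") -> None:
        """Record one balanced transaction (debit one account, credit another)."""
        self.entries.append({"debit": debit, "credit": credit,
                             "amount": amount, "memo": memo})

def on_sale(ledger: Ledger, amount: float) -> None:
    # What point-of-sale software does the instant a sale is rung up:
    ledger.post(debit="Cash", credit="Revenue", amount=amount, memo="sale")

ledger = Ledger()
on_sale(ledger, 19.99)  # encoded automatically, no clerk involved
print(ledger.entries)
```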

These days we have fewer bookkeepers - most companies don’t need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn’t have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, and much of what our abstracted work yields is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.

  • SapphironZA@sh.itjust.works · 3 days ago

    Executive positions are probably the easiest to replace with AI.

    1. AI will listen to the employees.
    2. It will try to be helpful by providing context and perspective based on information the employee might not have.
    3. It will accept being told it is wrong and update its advice.
    4. It will leave the employee to get the job done, trusting that the employee will come back if they need more help.
  • Snowclone@lemmy.world · 2 days ago

    I’ve worked for big corporations that employ a lot of people. Every job has a metric showing how much money every single task they do creates. Believe me: they would never pay you if your tasks didn’t generate more money than it costs them to pay you to do the task.

    • Knock_Knock_Lemmy_In@lemmy.world · 2 days ago

      Every job has a metric showing how much money every single task they do creates.

      Management accountants would love to do this. In practice, you can only do it for low-level, commoditised roles.

      • Snowclone@lemmy.world · 6 hours ago

        Mopping a floor has a determined metric. I’m not kidding. It’s a metric. Clean bathrooms are worth a determined dollar amount. It’s not simply sales or production; every task has a dollar amount, and the time a task takes has a dollar value, determined and on paper. Corporations know what every task is worth in dollars. Processing hazmat? Prevents the fine. Removing trash or pallets? Prevents lawsuits and workplace injuries. The level of light reflected from the floor? Has a multiplier effect on sales. Determined. Defined. Training salespeople on language choices has a massive sales effect. They know how much money every single task generates: fines and lawsuits prevented, multiplier effects on average ticket size, training people to say “highest consumer-rated repair services” instead of “extended warranty.” These are defined dollar amounts, on paper. There is NO JOB in which you are paid to do something of no financial value. There are no unprofitable positions or tasks.
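
        Roughly the arithmetic involved, as a sketch; every number and task name below is hypothetical, not a real corporate figure:

        ```python
        # Sketch of task-level dollar valuation: a task is worth its direct
        # revenue effect plus the expected loss it prevents. All figures are
        # invented for illustration.

        def task_value(revenue_uplift: float, loss_avoided: float,
                       loss_probability: float) -> float:
            """Dollar value of doing a task once."""
            return revenue_uplift + loss_avoided * loss_probability

        # Hypothetical examples in the spirit of the ones above:
        mopping = task_value(revenue_uplift=15.0,       # shinier floor, small sales bump
                             loss_avoided=20_000.0,     # slip-and-fall lawsuit avoided
                             loss_probability=0.001)
        hazmat = task_value(revenue_uplift=0.0,
                            loss_avoided=50_000.0,      # regulatory fine avoided
                            loss_probability=0.002)
        print(f"mopping: ${mopping:.2f}, hazmat handling: ${hazmat:.2f}")
        ```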

        • Knock_Knock_Lemmy_In@lemmy.world · 5 hours ago

          Your examples are all commoditized and measurable. Many roles are not this quantifiable.

          There is NO JOB in which you are paid to do something of no financial value.

          Compliance, marketing, social outreach, branding.

          Putting a $ amount on these and other similar roles is very difficult.

          But I agree, if the value added is known to be zero or negative then usually no-one is paid to do it.

          There are no unprofitable positions or tasks.

          Not when they are set up, but they can become unprofitable over time, and get overlooked.

  • sobchak@programming.dev · 3 days ago

    The problem is that the capitalist investor class, by and large, determines what work will be done, what kinds of jobs there will be, and who will work those jobs. They are becoming increasingly out of touch with reality as their wealth and power grow, and they seem to be trying to mold the world into something along the lines of what Curtis Yarvin advocates, which most people would consider very dystopian.

    This discussion also ignores the fact that, currently, 95% of AI projects fail, and studies show that LLM use hurts the productivity of programmers. But yeah, there will almost surely be breakthroughs in the future that produce more useful AI tech; nobody knows what the timeline for that is, though.

    • lemmeLurk@lemmy.zip · 3 days ago

      But isn’t the investment still driven by consumption in the end? They invest in what makes money, and in the end, what makes money is what people are willing to spend money on.

      • Ogy@lemmy.world · 3 days ago

        You’d think so, but unfortunately not. Venture capital is completely illogical, designed around boom-or-bust “moonshot” ideas that are supposed to change everything. So this money isn’t driven by actual consumption but by speculation. I can’t really speak to other forms of investment, but I suspect they don’t get a whole lot better. The economy has become far too financialised, with a fiat currency completely separated from intrinsic value. That’s why a watch can cost more than a family home, which isn’t true consumption - just this weird concept of “wealth.”

      • sobchak@programming.dev · 3 days ago

        They invest in things they think they will be able to sell later at a higher price. Expected consumption is sometimes part of their calculations. But they are increasingly out of touch with reality (see blockchain, the metaverse, Tesla, etc.). Sometimes they knowingly take a loss to gain power over the masses (Twitter, The Washington Post). They are also powerful enough to induce consumption (bribing governments for contracts, laws, bailouts, and regulations that ensure their investments will be fruitful). They are powerful enough to heavily influence which politicians get elected, choosing whom they want to bribe. They are powerful enough to force the businesses they are invested in to buy from and sell to each other. And the largest, most profitable companies produce nearly nothing; they use their near-monopoly positions to extract rent (i.e., enshittification/technofeudalism).

  • mechoman444@lemmy.world · 2 days ago

    It’s funny: years ago, a single developer “killing it” on Steam was almost unheard of. It happened, but instances were few and far between.

    Now, with the advent of powerful engines like Unreal 5 and the latest iterations of Unity, practically anyone outside the Arctic Circle can pick one up and make a game.

    Is tech like that taking jobs away from the game industry? Yes, very much so. But since those programs aren’t technically “AI,” they get a pass. Never mind that they use LLMs to streamline the process; they’re fine because they make games we enjoy playing.

    But that’s missing the point. For every development job that the tech behind a “Schedule 1” or a “Megabonk” replaced, it enabled ten more people to play and benefit from the final product. Those games absolutely used AI in development, work that once would’ve gone to human hands.

    Technology always reduces jobs in some markets and creates new ones in others.

    It’s the natural way of things.

  • LittleBorat3@lemmy.world · 2 days ago

    Productivity will rise again, and we will not be compensated, even if we all get better, cooler jobs and do the same work 10x more efficiently. Which we won’t all get to do; some of us will have no jobs.

    Earnings from AI and automation need to be redistributed to the people. If it works, and AI does not blow up in their faces because it’s a bubble, they will be so filthy rich that they will either not know what to do with the money or lose their grip on reality and try to shape politics, countries, the world, etc.

    See the walking k-hole that tried to make things “more efficient”.

    • lechekaflan@lemmy.world · 3 days ago

      What do we need the mega rich for anyway?

      Supposedly the creation of and investment in industries, then the management of those businesses, which also supposedly provide employment for the thousands who make things for them. Except they’ll find ways to cut costs and maximize profit, like chasing cheaper labor while planning the next megayacht to flex at Monte Carlo next summer.

  • m-p{3}@lemmy.ca · 3 days ago

    CEO isn’t an actual job either; it’s just the 21st century’s title of nobility.

  • Curious Canid@lemmy.ca · 3 days ago

    Sam Altman is a huckster, not a technologist. As such, I don’t really care what he says about technology. His purpose has always been to transfer as much money as possible from investors into his own pocket before the bubble bursts. Anything else is incidental.

    I am not entirely writing off LLMs, but very little of the discussion about them has been rational. They do some things fairly well and a lot of things quite poorly. It would be nice if we could just focus on the former.

    • Tollana1234567@lemmy.today · 3 days ago

      He’s probably afraid it’s going to burst too fast and he’ll be left holding the bag. That’s why Gates, Musk, MS, and Google are trying to stem the bleeding.

  • billwashere@lemmy.world · 3 days ago

    Sam, I say this with all my heart…

    Fuck you very kindly. I’m pretty sure what you do is not “a real job” and should be replaced by AI.

  • 6nk06@sh.itjust.works · 4 days ago

    At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

    No and no. Have you ever coded anything?

    • kescusay@lemmy.world · 4 days ago

      Yeah, I have never spent “days” setting anything up. Anyone who can’t do it without spending “days” struggling with it is not reading the documentation.

      • HarkMahlberg@kbin.earth · 4 days ago

        Ever work in an enterprise environment? Sometimes a single talented developer cannot overcome the calcification of hundreds of people over several decades who care more about the optics of work than actual work. Documentation cannot help if it’s nonexistent or 20 years old. Documentation cannot make teams that don’t believe in automation adopt Docker.

        Not that I expect Sam Altman to understand what it’s like working in a dumpster-fire company; the only job he’s ever held is pouring gasoline.

        • killeronthecorner@lemmy.world · 3 days ago

          Dumpster-fire companies are the ones he’s targeting, because they’re the most likely to look for quick and cheap ways to treat the symptoms of their problems, and the most likely to want to replace their employees with automation.

        • kescusay@lemmy.world · 3 days ago

          Well, if I’m not, then neither is an LLM.

          But for most projects built with modern tooling, the documentation is fine, and they mostly have simple CLIs for scaffolding a new application.
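
          To make “scaffolding” concrete, here’s a toy sketch of the kind of thing those CLIs automate; the layout and file contents are invented for illustration, not any real tool’s output:

          ```python
          # Toy scaffolder: writes a conventional project skeleton, the sort
          # of thing a real scaffolding CLI generates in one command.
          from pathlib import Path

          SKELETON = {
              "src/__init__.py": "",
              "tests/test_smoke.py": "def test_smoke():\n    assert True\n",
              "pyproject.toml": '[project]\nname = "{name}"\nversion = "0.1.0"\n',
              "README.md": "# {name}\n",
          }

          def scaffold(name: str, root: Path = Path(".")) -> None:
              """Write the skeleton files under root/name."""
              for rel, content in SKELETON.items():
                  path = root / name / rel
                  path.parent.mkdir(parents=True, exist_ok=True)
                  path.write_text(content.format(name=name))

          scaffold("example_app")  # seconds, not days
          ```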

          • galaxy_nova@lemmy.world · 3 days ago

            I mean, if you use the codebase you’re working in as context, it’ll probably learn the codebase faster than you will. Not that I’m saying that’s a good strategy; I’d never personally do that.

            • kescusay@lemmy.world · 3 days ago

              The thing is, it really won’t. The context window isn’t large enough, especially for a decently sized application, and that seems to be a fundamental limitation. Make the context window too large, and the LLM gets massively off track very easily, because there’s too much in it to distract it.

              And LLMs don’t remember anything. The next time you interact with one and put the whole codebase into its context window again, it won’t know what it did before, even if the last session was ten minutes ago. That’s why they so frequently create bloat.
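
              You can sanity-check the size argument with back-of-the-envelope math; the ~4 characters/token heuristic and the window sizes below are rough assumptions, not any particular model’s real limits:

              ```python
              # Estimate a codebase's token count and compare it to assumed
              # context-window sizes. Both the chars/token heuristic and the
              # window sizes are rough assumptions.
              from pathlib import Path

              CHARS_PER_TOKEN = 4
              WINDOWS = {"32k": 32_000, "128k": 128_000, "1M": 1_000_000}

              def estimate_tokens(repo: Path, exts=(".py", ".js", ".ts", ".java")) -> int:
                  chars = sum(len(p.read_text(errors="ignore"))
                              for p in repo.rglob("*")
                              if p.is_file() and p.suffix in exts)
                  return chars // CHARS_PER_TOKEN

              tokens = estimate_tokens(Path("."))
              for name, size in WINDOWS.items():
                  verdict = "fits" if tokens <= size else "does NOT fit"
                  print(f"~{tokens:,} tokens {verdict} in a {name} window")
              ```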

      • Bo7a@piefed.ca · 3 days ago

        I know this was aimed at someone else. But my response is “Every day.” What is your follow-up question?

    • nucleative@lemmy.world (OP) · 4 days ago

      If your argument attacks my credibility, that’s fine; you don’t know me. We can find cases where developers use the technology and cases where they refuse.

      Do you have anything substantive to add to the discussion about whether LLMs are anything more than just a tool that lets workers abstract further, advancing every profession they touch toward some combination of better, faster, cheaper, or easier?

      • HarkMahlberg@kbin.earth · 3 days ago

        Yeah, I’ve got something to add. The ruling class will use LLMs as a tool to lay off tens of thousands of workers to consolidate more power and wealth at the top.

        LLMs also advance no profession at all while they can still hallucinate and be manipulated by their owners, producing more junk that requires a skilled worker to fix. Even my coworkers have said, “If I have to fix everything it gives me, why didn’t I just do it myself?”

        LLMs also have dire consequences outside the context of labor. Because of how easy they are to manipulate, they can be used to manufacture consent and warp public consciousness around their owners’ ideals.

        LLMs are also a massive financial bubble, ready to pop and send us into a recession. Nvidia is shoveling money into companies so they can shovel it back into Nvidia.

        Would you like me to continue on about the climate?

      • finitebanjo@lemmy.world · 2 days ago

        I’ve got something to add: in every practical application, AI has increased liabilities and created a vastly inferior product, so it’s not “more than just a tool that allows workers to further abstract” - it’s less than that. And that’s on top of the fact that AI companies can’t turn a profit. So it’s not better, not faster, not cheaper - but it is certainly easier (to do a shit job).