• LobsterJim@slrpnk.net · +43/-13 · 14 hours ago

    There’s nothing to complain about here. Games require tons of placeholders, in art, dialogue, and code. They will iterate dozens of times before the final product, and given Larian’s own production standards, there’s no chance anything but the most inconsequential or forgotten items made by an LLM will stay in.

    • Noja@sopuli.xyz · +45/-4 · 14 hours ago

      Among the devs responding is a former Larian staffer, environment artist Selena Tobin. “consider my feedback: i loved working at @larianstudios.com until AI,” Tobin writes. “reconsider and change your direction, like, yesterday. show your employees some respect. they are world-class & do not need AI assistance to come up with amazing ideas.”

      https://www.rockpapershotgun.com/larian-boss-responds-to-criticism-of-generative-ai-use-its-something-we-are-constantly-discussing-internally

      there’s no chance anything but the most inconsequential or forgotten items made by an LLM will stay in.

      Concept art is not a placeholder. It’s part of the creative process. By using AI to generate text and images, you’ve already influenced the creative process negatively.

      • LobsterJim@slrpnk.net · +11/-15 · 13 hours ago

        The article doesn’t say Larian is using it for concept art.

        Those were hypothetical statements from people outside the studio.

        • Passerby6497@lemmy.world · +22 · 12 hours ago

          Literally the first sentence

          A few hours ago, reports surfaced that Larian are making use of generative AI during development of their new RPG Divinity - specifically, to come up with ideas, produce placeholder text, develop concept art, and create materials for PowerPoint presentations.

          • Alaknár@sopuli.xyz · +4/-2 · 6 hours ago

            The actual quote from Swen is that they use it in the “ideation” phase of concept art. Basically: throwing shit on the wall and seeing what sticks. After that, the process is taken over by any of their almost 30 concept artists on payroll.

        • Noja@sopuli.xyz · +7 · 12 hours ago

          Idk, but it seems pretty obvious to me from reading the quote by Larian CEO Swen Vincke that they used to, or still do, use it to generate or “enhance” concept art, and that it’s a highly discussed topic within the company?

          • Alaknár@sopuli.xyz · +4/-3 · 6 hours ago

            What he actually said was that they use it in the “ideation” phase of concept art. Basically: throwing shit on the wall and seeing what sticks. After that, the process is taken over by any of their almost 30 concept artists on payroll.

  • TommySoda@lemmy.world · +50/-9 · 17 hours ago

    They honestly should have expected this given people’s visceral reaction to anything AI. Personally, I have huge problems with AI and refuse to play most games that have used it. I think it’s poisoning every creative industry and replacing important jobs under the vague excuse that it makes things “easier”, while making the game soulless in the process. I’m willing to give Larian the benefit of the doubt simply because their previous games were amazing, but imma wait for the reviews on this one. This game is still going to be in development for another 4 years and none of us knows what’ll happen between now and then, but for now I’ll remain cautiously optimistic.

    • Riskable@programming.dev · +21/-62 · 17 hours ago

      Most people—even obsessive gamers—don’t give two shits about AI. There’s a very loud minority that gets in everyone’s face saying all AI is evil like we’re John Connor or something. They are so obsessive and extreme about it, it often makes the news (like this article).

      The market has already determined that if a game is fun, people will play it. How much AI was used to make it is irrelevant.

      • CosmoNova@lemmy.world · +4/-2 · 7 hours ago

        I don’t like to admit it but you’re likely right. And there are very cool use cases for machine learning if done right. And some of these concepts are already in successful games.

        Of course there absolutely is slop that I’m refusing to buy and companies do face backlash over it. No doubt about it, but that really doesn’t mean every single use case for AI is bad or makes for a terrible product at all.

        But it is interesting to see how much pushback you’re facing for this comment while most people seem cool with it when Larian does it for some reason. Consumers are hypocrites sometimes.

      • brucethemoose@lemmy.world · +33/-2 · 17 hours ago

        That’s extreme, and put abrasively.

        …But the sentiment isn’t wrong.

        Except it’s not a small minority anymore, which is understandable given how pervasive chatbot enshittification is becoming. Maybe the ‘made with AI’ label isn’t enough to deter everyone, but it’s enough to kill social media momentum, which is largely how games sell these days.

        • Cybersteel@lemmy.world · +7/-10 · 16 hours ago

          I’ve been arrested several times for taking a crowbar to anything AI for a while now, from those waiter bots to my now ex-company’s AI servers. A non-relevant game made by a non-relevant dev is an easy skip/boycott from me.

          • brucethemoose@lemmy.world · +3/-1 · 13 hours ago

            Yeah. That’s a bit extreme.

            You can sit back and let this stuff collapse under its own weight, you know.

            TBH a violent reaction feels like it’s just going to help politicize this LLM mania (and therefore present an excuse to cement the enshittification). Let people see how awful and annoying it is all by itself.

            You should break Meta glasses though. That’s totally warranted.

            • Cybersteel@lemmy.world · +2/-1 · 9 hours ago

              Yeah, did punch a few Ray-Bans back in the day, may or may not have been Meta ones tho, oops.

      • NotASharkInAManSuit@lemmy.world · +12/-9 · 12 hours ago

        I’ve yet to meet a single person in real life who isn’t turned off by AI, and there are fewer and fewer of you grifters in the comments these days trying to defend it.

        Fuck LLMs and diffusion models, it’s not art and it’s not even AI.

  • 13igTyme@piefed.social · +96/-36 · 19 hours ago

    Nothing wrong with using AI to organize or supplement workflow. That’s literally the best use for it.

    • iAmTheTot@sh.itjust.works (OP) · +103/-16 · 19 hours ago

      Except for the ethical question of how the AI was trained, or the environmental aspect of using it.

      • Hackworth@piefed.ca · +35/-11 · 18 hours ago

        There are AIs that are ethically trained. There are AIs that run on local hardware. We’ll eventually need AI ratings to distinguish use types, I suppose.

        • utopiah@lemmy.world · +11 · 12 hours ago

          There are AIs that are ethically trained

          Can you please share examples and criteria?

        • Riskable@programming.dev · +6/-22 · 17 hours ago

          It’s even more complicated than that: “AI” is not even a well-defined term. Back when Quake 3 was still in beta (“the demo”), id Software held a competition to develop “bot AIs” that could be added to a server so players would have something to play against while they waited for more people to join (or you could have players VS bots style matches).

          That was over 25 years ago. What kind of “AI” do you think was used back then? 🤣

          The AI hater extremists seem to be in two camps:

          • Data center haters
          • AI-is-killing-jobs

          The data center haters are the strangest, to me. Because there’s this default assumption that data centers can never be powered by renewable energy and that AI will never improve to the point where it can all be run locally on people’s PCs (and other, personal hardware).

          Yet every day there’s news suggesting that local AI is performing better and better. It seems inevitable—to me—that “big AI” will go the same route as mainframes.

          • acosmichippo@lemmy.world · +18 · 12 hours ago

            colloquially today most people mean genAI like LLMs when they say “AI” for brevity.

            Because there’s this default assumption that data centers can never be powered by renewable energy

            that’s not the point at all. the point is, even before AI, our energy needs have been outpacing our ability/willingness to switch to green energy. Even then we were using more fossil fuels than at any point in the history of the world. Now AI is just adding a whole other layer of energy demand on top of that.

            sure, maybe, eventually, we will power everything with green energy, but… we aren’t actually doing that, and we don’t have time to put off the transition. every bit longer we wait will add to negative effects on our climate and ecosystems.

      • Bronzebeard@lemmy.zip · +5 · 15 hours ago

        No one [intelligent] is using an LLM for workflow organization. Despite what the media will try to convince you, not every AI is an LLM, or even an LLM trained on all the copyrighted shit you can find on the internet.

      • ruuster13@lemmy.zip · +38/-36 · 18 hours ago

        The cat’s out of the bag. Focus your energy on stopping fascist oligarchs then regulating AI to be as green and democratic as possible. Or sit back and avoid it out of ethical concerns as the fascists use it to target and eliminate you.

        • iAmTheTot@sh.itjust.works (OP) · +54/-3 · 17 hours ago

          Holy false dichotomy. I can care about more than one thing at a time. The existence of fascists doesn’t mean I need to use and like AI lmao

        • MoogleMaestro@lemmy.zip · +43/-7 · 17 hours ago

          The cat’s out of the bag

          That’s 👏 not 👏 an 👏 excuse 👏 to be 👏 SHITTY!

          The number of people who think that saying the cat’s out of the bag is somehow redeeming is completely bizarre. Would you have said this about slavery in the 1800s, too? Just because people are doing it doesn’t mean it’s morally or ethically right to do it, nor that we should put up with it.

          • teawrecks@sopuli.xyz · +3/-7 · 14 hours ago

            No one 👏👏 is 👏👏 excusing 👏👏 being 👏👏 shitty.

            The “cat” does not refer to unethical training of models. Tell me, if we somehow managed to delete every single unethically trained model in existence AND miraculously prevent another one from ever being made (ignoring the part where the AI bubble pops), what would happen? Do you think everyone would go “welp, no more AI I guess.” NO! People would immediately get to work making an “ethically trained” model (according to some regulatory definition of “ethical”), and by “people” I don’t mean just anyone, I mean the people who can afford to gather or license the most exclusive training data: the wealthy.

            “Cat’s out of the bag” means the knowledge of what’s possible is out there and everyone knows it. The only thing you could gain by trying to put it “back in the bag” is to help the ultra wealthy capitalize on it.

            So, much like with slavery and animal testing and nuclear weapons, what we should do instead is recognize that we live in a reality where the cat is out of the bag, and try to prevent harm caused by it going forward.

    • UnderpantsWeevil@lemmy.world · +28/-8 · 19 hours ago

      We’ve had tools to manage workflows for decades. You don’t need Copilot injected into every corner of your interface to achieve this. I suspect the bigger challenge for Larian is working in a development suite that can’t be accused of having “AI Assist” hiding somewhere in the internals.

      • Hackworth@piefed.ca · +15/-1 · 19 hours ago

        Yup! Certifying a workflow as AI-free would be a monumental task now. First, you’d have to designate exactly what kinds of AI you mean, which is a harder task than I think people realize. Then, you’d have to identify every instance of that kind of AI in every tool you might use. And just looking at Adobe, there’s a lot. Then you, what, forbid your team from using them, sure, but how do you monitor that? Ya can’t uninstall generative fill from Photoshop. Anyway, that’s why anything with a complicated design process marked “AI-Free” is going to be the equivalent of greenwashing, at least for a while. But they should be able to prevent obvious slop from being in the final product just in regular testing.

          • plateee@piefed.social · +2/-1 · 11 hours ago

            Or just have a hard cut-off for software released after 2022.

            It’s the only way I search for recipes anymore - a date filter from 1/1/1990 to 1/1/2022.

          • Hackworth@piefed.ca · +2 · 18 hours ago

            Coincidentally, this paper published yesterday indicates that LLMs are worse at coding the closer you get to low-level stuff like assembly or binary. Or more precisely, ya stop seeing improvements pretty early on in scaling up the models. If I’m reading it right, which I’m probably not.

        • Bronzebeard@lemmy.zip · +1 · 15 hours ago

          Yeah, do you use any Microsoft products at all (like 98% of corporate software development does)? Everything from Teams to Word to Visual Studio has Copilot sitting there. It would just take one employee asking it a question to render a no-AI pledge a lie.

      • rtxn@lemmy.world · +15/-6 · 18 hours ago

        You know it doesn’t have to be all or nothing, right?

        In the early design phase, for example, quick placeholder objects are invaluable for composing a scene. Say you want a dozen different effigies built from wood and straw – you let the clanker churn them out. If you like them, an environment artist can replace them with bespoke models, as detailed and as optimized as the scene needs. If you don’t like them, you can just chuck them in the trash without having wasted the time of an artist, who can instead work on art that will actually appear in the released product.

        Larian haven’t done anything to make me question their credibility in this matter.

        • UnderpantsWeevil@lemmy.world · +11/-6 · 18 hours ago

          You know it doesn’t have to be all or nothing, right?

          Part of the “magic” of AI is how much of the design process gets hijacked by inference. At some scale you simply don’t have control of your own product anymore. What is normally a process of building up an asset by layers becomes flattened blobs you need to meticulously deconstruct and reconstruct if you want them to not look like total shit.

          That’s a big part of the reason why “AI slop” looks so bad. Inference is fundamentally not how people create complex and delicate art pieces. It’s like constructing a house by starting with the paint job and ending with the framing lumber, then asking an architect to fix where you fucked up.

          If you don’t like them, you can just chuck them in the trash and you won’t have wasted the work of an artist

          If you engineer your art department to start with verbal prompts rather than sketches and rough drawings, you’re handcuffing yourself to the heuristics of your AI dataset. It doesn’t matter that you can throw away what you don’t like. It matters that you’re preemptively limiting yourself to what you’ll eventually approve.

          • Prove_your_argument@piefed.social · +11/-4 · 18 hours ago

            That’s a big part of the reason why “AI slop” looks so bad. Inference is fundamentally not how people create complex and delicate art pieces. It’s like constructing a house by starting with the paint job and ending with the framing lumber, then asking an architect to fix where you fucked up.

            This is just the whole robot sandwich thing to me.

            A tool is a tool. Fools may not use them well, but someone who understands how to properly use a tool can get great things out of it.

            Doesn’t anybody remember how internet search was in the early days? How you had to craft very specific searches to get something you actually wanted? To me this is like that. I use generative AI as a search engine and just like with altavista or google, it’s up to my own evaluation of the results and my own acumen with the prompt to get me where I want to be. Even then, I still need to pay attention and make sure what I have is relevant and useful.

            I think artists could use gen AI to make more good art than ever, but just like a photographer… a thousand shots only results in a very small number of truly amazing outcomes.

            Gen AI can’t think for itself or for anybody, and if you let it do the thinking and end up with slop well… garbage in, garbage out.

            At the end of the day right now two people can use the same tools and ask for the same things and get wildly different outputs. It doesn’t have to be garbage unless you let it be though.

            I will say, gen AI seems to be the only way to combat the insane BEC (business email compromise) attacks we have today. I can’t babysit every single user’s every email, but it sure as hell can bring me a shortlist of things to look at. Something might get through, but before I had a tool a ton of shit got through, and we almost paid tens of thousands of dollars on a single bogus but convincing-looking invoice. It went so far as a fucking bank account penny test (they verified two ACH deposits). Four different people gave their approvals, head of accounting included, before a junior person asked us if we saw anything fishy. This is just one example of why gen AI can have real practical use cases.
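
            To give a rough idea of the shape of that triage, here’s a minimal illustrative sketch. Everything in it is made up for the example: the `Message` record, the red-flag list, and the stubbed `score_with_llm` call standing in for whichever model or mail-security product you’d actually use.

            ```python
            from dataclasses import dataclass

            @dataclass
            class Message:
                sender: str
                subject: str
                body: str

            # Phrases that commonly show up in business email compromise attempts.
            RED_FLAGS = ("wire transfer", "change of bank details", "urgent invoice",
                         "gift cards", "ach deposit", "keep this confidential")

            def heuristic_score(msg: Message) -> int:
                """Cheap first pass: count red-flag phrases before spending any LLM calls."""
                text = f"{msg.subject} {msg.body}".lower()
                return sum(phrase in text for phrase in RED_FLAGS)

            def score_with_llm(msg: Message) -> float:
                """Stub for a call to whatever model your mail-security tooling actually uses.

                A real version would send the message plus context (known vendors, prior
                threads) and ask for a 0-1 'likely BEC' score with a short rationale.
                """
                return min(1.0, heuristic_score(msg) / 3)   # placeholder so the example runs

            def shortlist(messages: list[Message], threshold: float = 0.5) -> list[tuple[float, Message]]:
                # Score everything, keep only the suspicious ones, most suspicious first.
                scored = [(score_with_llm(m), m) for m in messages]
                return sorted((s for s in scored if s[0] >= threshold), key=lambda s: s[0], reverse=True)

            if __name__ == "__main__":
                inbox = [
                    Message("ceo@exarnple.com", "Urgent invoice",
                            "Please process this wire transfer today and keep this confidential."),
                    Message("newsletter@vendor.com", "March release notes",
                            "Here is what changed this month."),
                ]
                for score, msg in shortlist(inbox):
                    print(f"{score:.2f}  {msg.sender}  {msg.subject}")
            ```

            The point isn’t the scoring logic; it’s that a human still reviews the shortlist instead of every single message.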

            • UnderpantsWeevil@lemmy.world · +4/-10 · 17 hours ago

              This is just the whole robot sandwich thing to me.

              If home kitchens were being replaced by pre-filled Automats, I’d be equally repulsed.

              A tool is a tool. Fools may not use them well, but someone who understands how to properly use a tool can get great things out of it.

              The most expert craftsman won’t get a round peg to fit into a square hole without doing some damage. At some point, you need to understand what the tool is useful for. And the danger of LLMs boils down to the seemingly industrial-scale willingness to sacrifice quality for expediency and to defend that choice in the name of business profit.

              Doesn’t anybody remember how internet search was in the early days? How you had to craft very specific searches to get something you actually wanted?

              Internet search was as much constrained by what was online as what you entered in the prompt. You might ask for a horse and get a hundred different Palominos when you wanted a Clydesdale, not realizing the need to be specific. But you’re never going to find a picture of a Vermont Morgan horse if nobody bothered to snap a photo and host it where a crawler could find it.

              Taken to the next level with LLMs, you’re never going to infer a Vermont Morgan if it isn’t in the training data. You’re never going to even think to look for one, if the LLM hasn’t bothered to index it properly. And because these AI engines are constantly eating their own tails, what you get is a basket of horses that are inferred between a Palomino and a Clydesdale, sucked back into training data, and inferred in between a Palomino and a Palomino-Clydesdale, and sucked back into the training data, and, and, and…

              I think artists could use gen AI to make more good art than ever

              I don’t think using an increasingly elaborate and sophisticated crutch will teach you to sprint faster than Usain Bolt. Removing steps in the artistic process and relying on glorified Clipart Catalogs will not improve your output. It will speed up your output and meet some minimum viable standard for release. But the goal of that process is to remove human involvement, not improve human involvement.

              I will say, gen AI seems to be the only way to combat the insane BEC attacks we have today.

              Which is great. Love to use algorithmic defenses to combat algorithmic attacks.

              But that’s a completely different problem than using inference to generate art assets.

          • False@lemmy.world · +3/-2 · 18 hours ago

            How do you think a human decides what to sketch? They talk about the requirements.

    • Darkcoffee@sh.itjust.works · +7/-5 · 19 hours ago

      I was saying that as well.

      I get the knee-jerk reaction because everything has been so horrible everywhere lately with AI, but they’re actually one of the few companies using it right.

  • brucethemoose@lemmy.world · +50/-16 · 19 hours ago

    Yeah, the outrage is overblown.

    This doesn’t mean they’re enforcing a Copilot quota or vibe coding the game or shipping slop; it could be simple autocompletion, or (say) a component that makes the mocap pipeline easier.

    Don’t let Tech Bros poison dumb tools that could help out devs like Larian.


    …Now, if they ship slop into the final game or announce an “OpenAI partnership,” that’s a different story.

    • UnderpantsWeevil@lemmy.world · +15/-2 · 19 hours ago

      …Now, if they ship slop into the final game

      At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.

      I can see legitimate frustration with an industry that seems reliant on increasingly generic and interchangeable assets. AI just becomes the next iteration of this problem. You’ve expanded the warehouse of prefab images, but you’re still stuck with end products that are uncannily similar to everything else on the market.

      And that’s before you get to the IP implications of farming all your content out to a third party that doesn’t seem to care where its base library is populated from.

      • brucethemoose@lemmy.world · +7/-1 · 18 hours ago

        At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.

        Image/video diffusion is a tiny subset of genAI. I’d bet nothing purely autogenerated makes it into a game.

        I can see legitimate frustration with an industry that seems reliant on increasingly generic and interchangeable assets. AI just becomes the next iteration of this problem. You’ve expanded the warehouse of prefab images, but you’re still stuck with end products that are uncannily similar to everything else on the market.

        See above. And in many spaces there is a sea of models to choose from, and an easy ability to tune them to whatever style you want.

        And that’s before you get to the IP implications of farming all your content out to a third party that doesn’t seem to care where its base library is populated from.

        Their tools can be totally in house, disconnected from the outside web, if they wish. They might just be a part of the pipeline on their graphics workstations.


        Keep a distinction between “some machine learning in tedious parts of our workflows” and “a partnership with Big Tech APIs.” Those are totally different things.

        It sounds like Larian is talking about the former, and I’m not worried about any loss of creativity from that.

      • DigitalDilemma@lemmy.ml · +4 · 17 hours ago

        At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.

        That’s a good point. I think a lot of people are dismissing AI content because there’s this fallacy and desire to believe it’s all “slop”. It’s willfully ignorant to wave it all away like that. Sure, we’ve all seen the stupid stuff, and it’s really annoying, but we absorb the good stuff without even knowing it. Anyone claiming they can reliably spot AI generated images is fooling themselves even at this early stage.

        I’d like to know when something is real, especially real art and even pictures of nature, but I don’t think I can.

  • Ashtear@piefed.social · +30/-2 · 19 hours ago

    Considering we’ve already got the one former Larian employee speaking out against this, it’ll be interesting to see how many more show up off the record (or maybe on the record anonymously). I’m sure there was an internal battle over it.

    There aren’t many (possibly none) with more goodwill banked among enthusiast gamers than Vincke, so I feel like we’re about to see just how far a popular figure can step into this particular puddle without coming out soaked.

  • ToiletFlushShowerScream@lemmy.world · +13/-5 · 16 hours ago

    Welp, there goes all of my enthusiasm for the next game. Will have to check out other actual game developers and artists instead of whatever Larian and their genAI prompt monkeys have become.

  • Katana314@lemmy.world · +5 · 16 hours ago

    I’ve had an idea of making a visual novel with gen AI, but I’d want to attach “Placeholder: AI Artwork” in a visible location for each sprite. And I only even consider that because I’m not exactly a known game dev and don’t have ready access to artists.

    Larian should likely expect that if they’re taking shortcuts in their position, they’ll get backlash. I can at least recognize that they’re trying to be moderate about it.

    • Jankatarch@lemmy.world · +2 · 11 hours ago

      Also, you don’t have the infamous AAA deadlines, so you’re not going to end up shipping them in the final product anyway because of “an oopsie.”

  • MoogleMaestro@lemmy.zip · +16/-10 · 17 hours ago

    As someone who owns D:OS2, this is really disappointing and below their standards. I’ll be giving this new Divinity a pass.

    They don’t need to be using AI to create concepts, and if they do, I don’t think the “concepts” will be all that great in the first place. Not to mention the ethical perils of using models trained on the work of other artists who were never licensed or compensated.

    This is some classic CEO “step on a rake and then get mad at everyone else” nonsense. They openly talked about how they liked AI, and get mad at us for saying “cool, that’s a game I’m gonna skip then!”

    • brucethemoose@lemmy.world · +12/-5 · 17 hours ago

      That’s an awfully early point to judge a game, with basically zero knowledge of what they’re actually doing/using.

      What if they’re referencing a small, home grown model to assist with mocap? Or a sketch->3D drafting tool? Would that be enough to write it off?

      • MoogleMaestro@lemmy.zip · +16/-5 · 17 hours ago

        What if it’s a home grown model to assist with mocap?

        Well, (a) that’s not what it is, at least according to the CEO. They used it for concepts, not animations. And (b) I’m not really in a place to give people the benefit of the doubt when they’re using AI trained off stolen materials. I sincerely doubt they’re using a “home grown model” because anyone who knows even a scrap of how LLMs/GANs work knows that the data needed to train a model would be far beyond the reach of a company of Larian’s scale. They’ve likely just licensed it from one of the many grifting oligarch AI peddlers.

        We don’t need defenders coming in here trying to pretend that the CEO hasn’t just clarified that they are using AI for preproduction, we know this and it’s not up for debate now.

        Would that be enough to write it off?

        As someone who really appreciates and likes animation, in that particular example, yes, it would probably be enough to write it off. And frankly, why do I need to play their game when I could just AI generate my own slop and save the 70 bucks? In reality, it’s actually fine for me; I have plenty of games and can replay the old Divinity games from before these guys lost their way. They used to be a company that followed a passion for CRPGs with goodwill behind them, but now that BG3 has been a runaway hit, it seems like they’ve forgotten about the community that got them to where they are today in favor of some AAA gaming nonsense.

        Edit:

        That’s an awfully early point to judge a game, with basically zero knowledge of what they’re actually doing/using.

        Frankly, there are plenty of games that people judge from the outset. There’s a reason why we have the saying “First impressions matter”. They’ve left a bad taste in anyone who dares question the ethics of AI use, but thankfully there might be an audience of people out there who like slop more than I dislike it so they could be ok. No skin off my nose.

        • brucethemoose@lemmy.world · +3/-4 · 14 hours ago

          because anyone who knows even a scrap of how LLMs/GANs work knows that the data needed to train a model would be far beyond the reach of a company of Larian’s scale

          If it’s like an image/video model, they could start with existing open weights and fine-tune it. There are tons to pick from, and libraries to easily plug them into.

          If it’s not, and it’s something really niche that doesn’t already exist to their satisfaction, it probably doesn’t need to be that big a model. A lot of weird stuff like sketch -> 3D tools gets trained on university-student project budgets of time and money (though plenty of those already exist).
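
          To make the “existing open weights plus libraries” point concrete, here’s roughly what pulling an open image model into a local pipeline looks like with Hugging Face’s diffusers. This is a sketch only: the checkpoint id is a placeholder for whatever openly licensed weights a studio would actually vet, the library choice is mine, and nothing here claims to be what Larian does.

          ```python
          # Sketch: running an open-weights image model locally with Hugging Face's `diffusers`.
          # Requires: pip install torch diffusers transformers accelerate
          import torch
          from diffusers import StableDiffusionPipeline

          # Placeholder id - substitute whatever openly licensed checkpoint you've actually vetted.
          CHECKPOINT = "path/or/hub-id/of-open-weights-model"

          pipe = StableDiffusionPipeline.from_pretrained(CHECKPOINT, torch_dtype=torch.float16)
          pipe = pipe.to("cuda")  # runs entirely on a local workstation GPU, no external API

          # Ideation-style prompt; in a studio pipeline this would only ever be a reference or
          # mood image for human concept artists, never a shipped asset.
          image = pipe(
              "wooden effigy wrapped in straw, overcast marsh, rough concept thumbnail",
              num_inference_steps=25,
              guidance_scale=7.0,
          ).images[0]
          image.save("effigy_thumbnail.png")

          # Fine-tuning on in-house art would typically be a LoRA trained with the
          # diffusers/PEFT training scripts, then loaded via pipe.load_lora_weights(...).
          ```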

          We don’t need defenders coming in here trying to pretend that the CEO hasn’t just clarified that they are using AI for preproduction, we know this and it’s not up for debate now.

          No. We don’t know.

          And frankly, why do I need to play their game when I could just AI generate my own slop and save the 70 bucks

          I dunno what you’re on about; that has nothing to do with tools used in preproduction. How do you know they’ll even use text models? Much less that a single one would ever be shipped in the final game? And how are you equating LLM slop to a Larian RPG?

          hit, it seems like they’ve forgotten about the community that got them to where they are today in favor of some AAA gaming nonsense.

          Except literally every word that comes out of their interviews is about care for their developers and their community, which they continue to support.

          Frankly, there are plenty of games that people judge from the outset. There’s a reason why we have the saying “First impressions matter”. They’ve left a bad taste in anyone who dares question the ethics of AI use, but thankfully there might be an audience of people out there who like slop more than I dislike it so they could be ok. No skin off my nose.

          Read that again; pretend it’s not about AI.

          It sounds like the language gamergate followers use as an excuse to hate something they’ve never even played, because they’ve read some headline they don’t like.


          …Look, if Divinity comes out and it has any slop in it, it can burn in hell. If it comes out that they partnered with OpenAI or whomever extensively, it deserves to get shunned and raked over coals.

          But I do not like this zealous, uncompromising hate for something that hasn’t even come out, that we know little about, from a studio we have every reason to give the benefit of the doubt. It reminds me of the most toxic “gamer” parts of Reddit and other cesspools of the internet, and I don’t want it to spread here.

          • MoogleMaestro@lemmy.zip · +2 · 9 hours ago

            I won’t bother engaging with the “gamergate” false equivalency. I think it’s disingenuous to try to tie anything I’ve said so far to some fearmonger-induced culture war, bigoted nonsense when we’re talking about a much broader wealth-extraction mechanism and misanthropic tech movement. I think you’re saying this from a well-meaning place, but I actually don’t think what I’ve said is overzealous at all. The CEO is saying he’s using AI and, if you’re opposed to the social and financial repercussions of this, it’s fair game to boycott a product over it.

            To pick a real-world example, some people won’t eat meat that isn’t free-range. This isn’t really about the quality of the meat; it’s about the inhumane treatment of animals. Not everyone subscribes to this, and sometimes I don’t buy free-range meat either, but it’s not “wrong” for people to choose not to buy meat that isn’t free-range. The same can and should be true of the media we consume, whether it’s games or films.

            If it’s like an image/video model, they could start with existing open weights and fine-tune it. There are tons to pick from, and libraries to easily plug them into.

            If it’s not, and it’s something really niche that doesn’t already exist to their satisfaction, it probably doesn’t need to be that big a model. A lot of weird stuff like sketch -> 3D tools gets trained on university-student project budgets of time and money (though plenty of those already exist).

            …Look, if Divinity comes out and it has any slop in it, it can burn in hell. If it comes out that they partnered with OpenAI or whomever extensively, it deserves to get shunned and raked over coals.

            I won’t get into this too much, but “open weights” is not “open source”, and even “open source” is not real open source when it comes to AI. Really, what you should be talking about is a model built on an open dataset, and there are very few of those in reality. The issue isn’t the weights; the issue is the data that was used to generate the weights in the first place.

            It’s not impossible that they’re using some bespoke model derived from an open-dataset model, but considering the full transcript is now out and he name-dropped ChatGPT in particular, I don’t really have much confidence that there’s some kind of ethical silver lining. Since he was the one who mentioned using AI in previs development, it’s actually up to him to clarify what models they’re using and whether they’re ethically sourced. I don’t really have to prove anything beyond the fact that they’re using AI and that AI isn’t to my personal palate. That’s fine, everyone has their own tastes. To me, I was excited about the new Divinity until this news dropped, and the hype is simply deflated because it goes against my morals. That’s on them, not on me.

            If he wants to push for open datasets as an AI-industry counter-play, then fine – fair play and good riddance to closed-source (closed-data) AI industry players. But until that happens, it’s actually just a fantasy and not based on reality. I’ll stick to what has been said and not extrapolate what could be.

  • Deceptichum@quokk.au · +11/-14 · 11 hours ago

    I only care if the game is good. What tools are used to make it is irrelevant to my enjoyment.

  • Jankatarch@lemmy.world · +4/-5 · 11 hours ago

    Yeah, I’m sure the higher-ups of this one studio alone will use it responsibly because they’re “not like the others” and will definitely resist making a bad decision despite deadlines.

  • Kaput@lemmy.world · +4/-5 · 15 hours ago

    I would play a game with AI NPCs. Not the artwork, but the mechanic. You could run hundreds of interactable characters with simple prompts that state their goals and personalities. Like a modern Radiant AI.
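
    Roughly what I’m imagining, as a minimal illustrative sketch: each NPC is just a short system prompt stating its goals and personality, and whatever local or hosted model you trust fills in the dialogue. The names, fields, and the stubbed `generate_reply` call below are all made up for the example, not how any shipped game does it.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class NPC:
        """A 'radiant AI'-style character: just a name plus a prompt describing it."""
        name: str
        personality: str                                   # e.g. "gruff, suspicious of outsiders"
        goals: list[str] = field(default_factory=list)
        memory: list[str] = field(default_factory=list)    # recent events the NPC has witnessed

        def system_prompt(self) -> str:
            # The whole "character sheet" is just text for the model to condition on.
            return (
                f"You are {self.name}, an NPC in a fantasy RPG.\n"
                f"Personality: {self.personality}\n"
                f"Current goals: {'; '.join(self.goals)}\n"
                f"Things you remember: {'; '.join(self.memory) or 'nothing notable'}\n"
                "Stay in character and answer in one or two short sentences."
            )

    def generate_reply(system_prompt: str, player_line: str) -> str:
        # Stub standing in for a call to whatever local or hosted LLM you'd actually use;
        # it just echoes so the example runs without any model or API key.
        return f"[{system_prompt.splitlines()[0]}] (model reply to: {player_line!r})"

    if __name__ == "__main__":
        hilda = NPC(
            name="Hilda the blacksmith",
            personality="gruff, proud of her craft, suspicious of adventurers",
            goals=["sell the surplus iron daggers", "find out who robbed the granary"],
            memory=["the player returned her stolen hammer yesterday"],
        )
        print(generate_reply(hilda.system_prompt(), "Heard anything about the granary?"))
    ```

    The interesting part is updating each NPC’s goals and memory from world events, so reactions stay grounded in what actually happened in the game rather than in whatever the model feels like inventing.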

    • CosmoNova@lemmy.world · +2 · 7 hours ago

      Why is one thing okay for you but not the other? A generated artwork is an artist not paid, and theft. Generated dialogue is a writer not paid, and also theft. All the popular models are fed on stolen content.

    • Noja@sopuli.xyz · +14 · 14 hours ago

      These games already exist and they’re just boring, no thought behind any dialogue, just a waterfall of LLM slop.

        • Zahille7@lemmy.world · +4 · 13 hours ago

          I’m not sure about games personally, but there’s the Mantella mod for Skyrim which is pretty much exactly what you’re looking for.

          Check out this video if you want to see how it works in practice.

          • Deceptichum@quokk.au · +3/-2 · 11 hours ago

            Shit, that is awesome. Sounds as shit as Skyrim dialogue, but the responsiveness to the world and the player’s actions is such a nice feature to bring life to such a boring game.

  • CallMeAnAI@lemmy.world · +9/-11 · 18 hours ago

    Gamers will and do bitch about anything and everything. I couldn’t care less if they use AI if the product is good.

  • Klanky@sopuli.xyz · +8/-53 · 19 hours ago

    I really hope this marks the end of the internet’s weird love fest with this company and CEO. They don’t care about you in the slightest.

    • realitaetsverlust@piefed.zip · +55/-5 · 19 hours ago

      Yeah because why like a company that has released banger after banger after banger without any MTX or other bullshit attached to it.

      • Klanky@sopuli.xyz · +4/-17 · 19 hours ago

        That may be true. Still a company, and it still doesn’t care about you as anything besides a consumer. I don’t get the weird parasocial relationships people develop with any company/corporation.

      • the_q@lemmy.zip · +7/-18 · 19 hours ago

        Because one day they won’t. The more money a company makes, the closer it gets to being like every other successful company.

          • galaxy_nova@lemmy.world · +10/-4 · 19 hours ago

            Because then you don’t get cool points for going (puts sunglasses on) “I always knew those guys were evil! I’ve been hating them longer than you!”

          • Amnesigenic@lemmy.ml · +2/-1 · 13 hours ago

            Not really “in advance” if they’re openly discussing doing the thing that people are mad about right now, more like right on time

        • _cryptagion [he/him]@anarchist.nexus · +19/-4 · 19 hours ago

          well until that day, we’ll love them. one day the fucking sun will explode, but you won’t catch me saying we should stop using solar generators.

          • the_q@lemmy.zip · +3/-4 · 18 hours ago

            That’s not a very good comparison, but do whatever you want, bud.