• BurgerBaron@piefed.social · 41 minutes ago

    The media has taught me to hate the words could, should, and would when used in headlines, but this is probably true for once.

    • tal@lemmy.todayOP · 1 hour ago

      You mean because indie games tend not to have high minimum memory requirements, like?

      • thingsiplay@lemmy.ml · 1 hour ago

        For the most part they don’t. And that’s only one reason why indie games are important. Another is that the games often tend to be cheaper, are sometimes more innovative, don’t try to put microtransactions in everything, and so on.

  • BiomedOtaku@lemmy.dbzer0.com · 3 hours ago

    Good. Fuck off with high end graphics that need a new graphics card every fucking two months. Give me a good story and gameplay.

    • thingsiplay@lemmy.ml · 1 hour ago

      That is not really true. I use the same graphics card and upgrade it with each new console cycle, or maybe once in between.

      • BiomedOtaku@lemmy.dbzer0.com · 37 minutes ago

        I made that comment in regard to a lot of triple-A publishers wanting to push out high-end graphics to the point where eventually you’ll need to upgrade your card to handle the latest and greatest. I’d much rather have decent graphics but a killer storyline with good gameplay mechanics.

    • ampersandrew@lemmy.world · 3 hours ago

      I’ve been rocking the same graphics card since 2021, and it still plays every new game on high settings. There are very few games that can even afford the production budget that would push a card like that, or even a PS5, to its limits anyway. My most-played game is a 2D game from 2012 that can run on a cheap laptop, and the market at large is mostly focused on games that are so low-spec that they can run on phones too.

        • ampersandrew@lemmy.world · 15 minutes ago

          RX6800 XT. It did cost me an arm and a leg when I bought it, due to the shortage at the time, but it’s lasted a long, long time.

      • Hubi@feddit.org · 2 hours ago

        And even if you’re playing a lot of modern AAA games, the settings beyond medium often just offer diminishing returns. I can barely tell the difference in most cases, and they’ll still tank the FPS by half. It’s just not worth it.

        • Alaknár@sopuli.xyz · 44 minutes ago (edited)

          People, especially those who run games on high-resolution monitors, sleep on the fact that you can easily turn anti-aliasing (AA) off to get a massive boost in performance for virtually no degradation of graphics.

    • Alaknár@sopuli.xyz · 45 minutes ago

      What in the world are you talking about…? I was using my GTX 970 for five years before I got a used RTX 2060, which I sold to my brother after two years. I’m now running an RX 9070 XT, and it looks like it’ll do its job for at least five years.

  • Ryoae@piefed.social · 2 hours ago

    Might just be a good thing, because optimization is what people have been demanding from developers for a long time now.

  • jaaake@lemmy.world · 3 hours ago

    This article is sensationalizing a non-issue. It reads like the author went to the convention with the story already written and then tried (and failed) to find people that supported the premise.

    As someone who attended GDC for the full week this year, I can tell you that not a single conversation I had or panel I attended discussed the RAM shortage. I’m sure this topic arose in some circles, especially anything related to the timing or cost of next-gen hardware. As a professional AAA game designer of 25 years and an occasional game director, I can say this does not affect the way that the games themselves are made. Games on consoles already have their limitations, and games on PC should always be (but aren’t always) optimized to work across a broad spectrum of hardware configurations, with the minimum spec being the lowest system possible without sacrificing playability.

    Even people interviewed in the article are saying the same thing:

    “Does this affect us? No,” Subotnick said. “We’re making games on as many platforms as we can to delight consumers. Could it impact us? Sure. If there’s less devices for people to get their hands on, then we potentially have less consumers to sell to. But right now, I’d argue that there are plenty of consumers with plenty of devices for us to sell these games to. Where it could impact us is, sure, we will have to make decisions around next-gen platforms when they tell us that it’s time to bring content to them. And if they are threatened to have a total addressable market that is viable from a business standpoint, sure that’s a business challenge. But right now all I’d be doing is speculating on a bunch of hypotheticals.”

    • voxthefox@lemmy.blahaj.zone · 2 hours ago

      So journalism as per usual nowadays? I can’t tell you the last mainstream article I’ve read that didn’t read like they had a very preconceived notion they wanted to convey.

  • tal@lemmy.todayOP · 3 hours ago

    The title is a little more dramatic than the body — in fact, the article is mostly more of a “yeah, this is something that we’ve seen before, more-or-less” take — but it is interesting to get some actual perspectives on impact from the developer side on what the likely changes are. It does also confirm that some studios are working on reducing the memory requirements for their upcoming games.

    • tal@lemmy.todayOP · 1 hour ago (edited)

      I think that you have two factors here. GDC isn’t specific to PC gaming, and additionally, a lot of titles will see both PC and console releases.

      For a game that is intended to see only a PC release, my guess is that this might affect the game’s system requirements.

      For games that see console releases, this manifests itself as questions like “will fewer people have consoles?”, because current-gen consoles are very unlikely to change spec, just price. “Is the PlayStation 6 going to be postponed?” is a big deal if you were going to release a game for that hardware.