I asked a question on a forum about why a command wasn’t working. They said I didn’t have an interpreter installed on my computer and made fun of me. I showed them that I had one installed and that wasn’t the problem, but they kept talking to me sarcastically without explaining anything. Only one of them suggested the cause of the problem, and he was right, so I thanked him. Then another guy said that if I couldn’t figure it out myself, I should do something else, and that he was tired of people like me. After that, I deleted my question, and now I’m not sure. I don’t think I want to ask for help ever again.

  • early_riser@lemmy.world · 2 days ago

    It’s true that searching on Google usually solves the problem, but the biggest issue is that it’s hard to know the exact word you need to use.

    I tell people 90% of IT (and development I assume) is knowing what questions to ask, where to ask those questions, and how to interpret the answers. It’s like the search for the ultimate question in The Hitchhiker’s Guide to the Galaxy.

    As for Google, I think it’s getting less useful, so the days of saying “just google it” are gone.

    • okwhateverdude@lemmy.world · 2 days ago

      “Have you asked the clanker yet?”

      Most of my search process prior to LLMs was querying Google with keywords and specific phrases, clicking through several of the first-page links from reputable sources, reading all of that, and synthesizing an answer. Now that Google has completed its enshittification of search, its index is gamed both from outside by the SEO grifters and from inside by Google itself, which makes search worse because worse search ultimately leads to more ad impressions. The quality of those first-page links is directly related to how much the stink of money clings to the topic.

      I now pay for search with Kagi (not an ad, swear) so that incentives are properly aligned. I get very useful first-page links again because I can lens the search to exclude a lot of the SEO shit sites or return only academic sources. I also use their robot assistant now for many searches because it can synthesize good summaries and gists of topics from good source links. I’ve prompted it to always cite shit so I can verify the clanker’s bullshit, but my trust has been getting higher lately as the models get better.

      At some point we will end up with an equivalent of lmgtfy.com, but lmpalfy.com (let me prompt an LLM for you).