Just your normal everyday casual software dev. Nothing to see here.

People can share differing opinions without immediately being on opposing sides. Avoid looking at things as black and white. You can like both waffles and pancakes, just like you can hate both waffles and pancakes.

I've been trying to lower my social presence on these services as of late, so I may go inactive at random as a result.

  • 0 Posts
  • 1.08K Comments
Joined 3 years ago
Cake day: August 15th, 2023


  • Pika@sh.itjust.works to Linux@lemmy.ml · RTFM

    This is an absolutely toxic take on the issue. I took OP's statement less as "I won't read the manual" and more as "I struggle to read manuals."

    There have been many times I read the manual and still had to look the issue up further, because I either missed a poorly written section or misunderstood what it was saying.

    If you want a prime example, go look at ffmpeg and try to figure out how to select a specific subtitle language on a video without looking it up online. It's done via -map, an advanced option described as a parameter for extracting specific streams (which also means you have to map the video and audio streams yourself, since including any -map disables automatic stream selection). But the -map documentation doesn't tell you that subtitle tracks are index:s. It does tell you to look at stream specifiers for valid search options, which do include s as a type, and it lets you know you can use m for metadata matching. You then have to make the connections yourself: the type is s, the metadata search flag is m:language:langcode, and the entire string has to be concatenated, so it's index:s:m:language:langcode. For someone who is learning ffmpeg and video transcoding, that is not a good setup. The stream-specifier section gives a few examples of the possibilities, but the place where it lists the types is in a different area from the one where it lists the metadata keys. At that point, just asking online or searching is far easier.

    Note: this is just an issue I have seen people run into, because ffmpeg is one of the more complicated programs (its man page is over 2300 lines).

    Is it in the manual? Yes. Will someone who doesn't know ffmpeg and is trying to learn it find it? That's debatable.

    If I were in that situation, my next step would be googling it, and if I couldn't find it by searching, I would reach out to communities. At that point "RTFM" is useless to me.
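
    For reference, the incantation that paragraph walks toward ends up looking something like this; input.mkv, output.mkv, and the language code eng are placeholders:

    ```shell
    # -map disables automatic stream selection, so the video and audio
    # streams must be mapped explicitly alongside the metadata-matched
    # subtitle stream. -c copy avoids re-encoding anything.
    ffmpeg -i input.mkv \
      -map 0:v -map 0:a \
      -map 0:s:m:language:eng \
      -c copy output.mkv
    ```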




  • No, I disagree.

    It is not one person's doing; pinning it on one person is the deflection.

    I will not downplay the effect of this by pretending they are the only one involved. Every maintainer who has locked or approved any of these changes is equally at fault here. In fact, one of those linked articles even stated that the primary reason they locked it was that they didn't like the amount of coverage it got. This is a failure of the community as a whole, not of one individual.

    Edit for clarification: by "failure," I mean the projects that are humoring it and actually going through with it without considering the potential side effects of blanket-applying it.

    As for the handful that are currently locked, or posted as "we don't know if we're going to do this yet," I haven't quite put them in that same category, but they're rapidly approaching it.






  • I like that you used the term "significantly" here, because the usual claim is that usage will disappear entirely, which it never will. That being said, open use of image generation isn't going away, and the same is likely true of casual AI chatbots. I do think that eventually, when the bubble pops and investors realize they are blindly tossing money into what is essentially a paper shredder, most commercial usage of it will nosedive.

    That effect tends to be rapid: once one major company drops it, it usually starts to snowball. That said, with fewer commercial avenues, non-commercial projects that rely on cloud-based services for it will likely see prices increase to make up the difference, so you may see some non-commercial projects go down if the models aren't self-hosted. But I don't think it's going anywhere.

    You mentioned it already seems to be decreasing? I might agree with that. It's reached the point where the everyday consumer is saying, "Well, this is cool and makes things easier, but I don't know how far I can trust it." We also have some jurisdictions (such as the US) starting to put restrictions in place that make it harder to copyright the outputs, which lowers a lot of its value in the commercial sector. But at the same time, big companies are still going all-in on it, and that concerns me.


  • Pika@sh.itjust.works to memes@lemmy.world · I hate the beep!!

    Mine is this way as well. It counts button presses as setting the clock, and if you enter an invalid time it throws a hissy fit without making clear what it's asking for. The number of times my grandfather, after losing power, has tried to cook something and gotten frustrated because the clock wanted to set the time to something stupid like 30:22 is beyond annoying.



  • I'm not PC, but one benefit of using a central server for Syncthing is an always-on backup that doesn't require another client device to be powered on. It also makes creating new shares easier.

    For example, with Syncthing you can set the "server" device to auto-accept any shares from trusted devices. Then, when you get a new device, instead of adding it to every device on your Syncthing network, you only add it to the server, and your other clients connect to the server's share instead of device to device. It's easier. You can also configure the server's shares to use encryption by default, since you never really need to see the files on the server; it's basically an install-and-forget client.

    As an example of what I mean:

    I have 10 different devices running Syncthing: 9 clients and a "server" client. The clients are not always on at the same time, so when I change a file, the copies can become desynced and cause conflicts. With a centralized server, as long as the server is on (it always is) and the client itself is online, it will always sync. I don't need to worry about file conflicts between my clients, because the server should always have the newest file.

    Then say, for example, my phone dies. Instead of re-adding every separate client the phone shares with to the new device, I only need to add the phone as a trusted device on the "server" client via the web UI, click "share to that device" on every share the phone needs, and remap the shares to the proper directories on the phone. Compare that with adding every device to the phone, and the phone to every device it needs access to, on top of reconfiguring all the shares. It's simpler, but fair warning: it does create a single point of failure if the server goes offline.
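
    As a sketch of the auto-accept setup described above: Syncthing exposes a per-device autoAcceptFolders option over its REST config API, so the "server" can be told to accept shares from a trusted device. The API key, port, and device ID below are placeholders, and the endpoint shape assumes a reasonably recent Syncthing (the /rest/config endpoints), so verify against your version:

    ```shell
    # Placeholders: $API_KEY, the device ID, and localhost:8384 are all
    # assumptions about your local Syncthing instance.
    curl -s -X PATCH \
      -H "X-API-Key: $API_KEY" \
      -d '{"autoAcceptFolders": true}' \
      http://localhost:8384/rest/config/devices/ABCDEFG-HIJKLMN-OPQRSTU-VWXYZ12-3456789-ABCDEFG-HIJKLMN-OPQRSTU
    ```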





  • Pika@sh.itjust.works to memes@lemmy.world · I hate the beep!!

    THIS NEEDS TO BE MAINSTREAM.

    There is very little reason, given the digital behemoths that microwaves are now, that a simple "sound off" setting can't be offered.

    I would also love a "sound off for this cycle" option, but that might be asking too much.



  • Pika@sh.itjust.works to linuxmemes@lemmy.world · What would you change?

    I had forgotten about the setuid flag. That might actually fix the issue altogether, without a hard-coded sudo path.

    It would also mean I wouldn't have to double-check the commands to make sure no destructive subcommands could be run.

    I might try that later, thanks!
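
    One caveat worth flagging: on Linux the kernel honors the setuid bit only on compiled binaries, not on interpreted scripts, so this only helps if the network helper is (or is wrapped in) a binary. A minimal sketch, with ./net-helper as a hypothetical compiled helper:

    ```shell
    # ./net-helper is a hypothetical compiled helper binary; the setuid
    # bit is ignored on interpreted scripts, so a shell script here
    # would NOT gain root.
    sudo chown root:root ./net-helper
    sudo chmod u+s ./net-helper   # "s" in the owner slot: runs as the file's owner
    ls -l ./net-helper            # permissions now read -rwsr-xr-x
    ```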


  • Pika@sh.itjust.works to linuxmemes@lemmy.world · What would you change?

    I have been really trying to avoid implementing it in the user session. It requires superuser to run the commands, and I don't like the concept of hardcoding sudo paths with NOPASSWD.

    But I probably will end up having to do something similar in the user environment.

    Edit: now that I think about it, I could probably just authorize the network command's path as NOPASSWD for any user, since I don't really see a situation where the logged-in user shouldn't be able to manipulate the network it's connected to.
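
    That narrower approach can be sketched as a drop-in sudoers rule. /usr/bin/nmcli here stands in for whatever network command the tool actually runs, and the group name is an assumption; always edit with visudo -f so a syntax error can't lock you out of sudo:

    ```
    # /etc/sudoers.d/net-helper -- hypothetical example
    # Members of group "users" may run this one command as root, no password.
    # Absolute path only, so a lookalike binary on $PATH can't be swapped in.
    %users ALL=(root) NOPASSWD: /usr/bin/nmcli
    ```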