• 0 Posts
  • 67 Comments
Joined 1 year ago
Cake day: July 11th, 2023


  • To clarify: I have a 100W Ugreen Nexode 4-Port USB charger that I use to charge my laptop (~60W), Steam Deck (~40W), iPhone (~20W) and AirPods (~5?W).

    The problem is that if the original cable has gone walkabout and I need to grab a random one to stand in, there’s no clear way of telling whether I’m accidentally using a cheap 5W-max cable to try and keep my laptop charged while working.

    Obviously there are some context clues, like cable thickness, but with cosmetic braiding becoming so common, even that’s getting harder to rely on.


  • Yes, you can. The charger and the device negotiate which power levels they each support, and pick the highest one they both agree on.

    E.g. my laptop charger can charge my MacBook at its full 100W, but my iPhone at only 20W.

    That bit is pretty straightforward and transparent to end users (there are a few rare cases where the devices can’t agree on the fastest rate and have to fall back to a slower one); the issue is more with cables that don’t have sufficient wire gauge, or that are missing connections, which prevents the charger and device from communicating their full capabilities.
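
    As a rough illustration of that “pick the highest level both sides agree on” step, here’s a minimal sketch in Python - simplified wattage tiers standing in for the real USB Power Delivery message exchange, with all the names and numbers below made up for the example:

    ```python
    # Toy model of charger/device power negotiation: each side advertises the
    # wattage levels it supports, and the highest level common to both wins.
    # Illustration only - not the actual USB PD protocol exchange.

    def negotiate_power(charger_levels_w, device_levels_w):
        """Return the highest wattage both sides advertise, or None if nothing matches."""
        common = set(charger_levels_w) & set(device_levels_w)
        return max(common) if common else None

    charger = {5, 20, 45, 60, 100}   # 100W multi-port charger
    macbook = {5, 20, 45, 60, 100}   # negotiates the full 100W
    iphone  = {5, 20}                # tops out at 20W

    print(negotiate_power(charger, macbook))  # 100
    print(negotiate_power(charger, iphone))   # 20
    ```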


  • It’s been more of a pain in the arse than initially expected.

    Most motherboards (for example) only have 2-4 USB-C ports, meaning that I still need to employ A-C and C-C cables for peripherals etc.

    My main gripe is that the standard just tries to do too many things without clear delineation/markings:

    1. Is it a USB 2.0 (480Mbit), 5Gbit, 10Gbit or 20Gbit cable? Can’t really tell from the plug alone.

    2. More importantly, for charging devices: how the heck do I determine the maximum wattage I can run through a given cable?

    For all its faults, at least the blue colour of a USB 3.0 plug (or additional connectors for B/Micro) made it easy to differentiate!

    Now I’m eyeing up a USB Cable tester just to validate and catalogue my growing collection! 🤦🏻‍♂️
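
    If I do end up cataloguing them, it’d probably boil down to something like this - all the fields and ratings below are just made-up examples, since the real answer depends on what the tester actually reports:

    ```python
    # Hypothetical catalogue entry for a tested USB-C cable: the two things the
    # plug itself won't tell you are the data rate and the power rating.
    from dataclasses import dataclass

    @dataclass
    class CableRecord:
        label: str             # whatever sticker/tag ends up on the cable
        data_rate_gbps: float  # 0.48 (USB 2.0), 5, 10 or 20
        max_watts: int         # e.g. 60 (3A) or 100/240 (5A, e-marked)
        e_marked: bool         # True if the cable carries an e-marker chip

    catalogue = [
        CableRecord("white braided 1m", 0.48, 60, False),
        CableRecord("black 2m laptop cable", 10, 100, True),
    ]

    # Find something that can actually keep a ~60W laptop charged:
    laptop_safe = [c.label for c in catalogue if c.max_watts >= 60]
    print(laptop_safe)
    ```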


  • I don’t think it was any one thing, but more so a build-up over time - a death of a thousand cuts, if you will:

    It was a cultural moment generally; just think back to all of those celebrity commercials (“I’m Mr. T and I’m a Night Elf Mohawk”). All cultural moments pass eventually.

    The third expansion (Cataclysm) was quite weak to begin with, and it was coupled with a lack of content in the tail end of the second (Wrath of the Lich King) - an expansion that was itself incredible, narratively wrapping up the story that began all the way back in Warcraft 3.

    So a lot of people chose that point to bow out of the game; it required a fair bit of time commitment, and the narrative pay-off made it feel like an appropriate moment to do so.

    Lastly, the introduction of a number of in-game tools to automate group composition severely diminished the impact of player reputation on servers. Before then, players who were toxic (stealing items, intentionally killing the group, failing quests) became infamous on their server.

    Once this tool was further opened up to allow groups to form across multiple servers, the sense of community was shattered, as you had no way of knowing whether a person from another server was good or bad. It stopped being about bringing in an individual player, and became about just getting a body in to fill a role.


  • thatKamGuy@sh.itjust.works to memes@lemmy.world · We're sorry... · 2 months ago

    Going by that argument though, EVERYTHING is an indirect act of God.

    Bullet wound? Clearly it was God’s will, for ordering the universe in such a way that an individual was armed at that point in time to cause you harm.

    Cancer? God willed the carcinoma onto your skin.

    Maybe it’s just Argumentum ad absurdum, but insurance companies are basically arguing against their own existence.



  • In Australia, we could call that Carer’s Leave; mental health days are a valid use case (at least at my work).

    Edit: Mind you, we are also a country that needed to implement a public holiday ahead of the AFL Grand Final (similar to the Super Bowl) because a significant portion of the population was taking ‘sick days’! 🤣


  • I specified one generation of hardware backwards compatibility; beyond that, software emulation would be more than sufficient.

    The PS5 is backwards compatible with all but ~6 PS4 titles. Sure, that’s entirely because of the shared x86-64 architecture, but it makes the PS4 stand out like a sore thumb for its lack of direct generational backwards compatibility.

    By the end of the PS3’s lifecycle, the Cell processor had been die-shrunk multiple times, reducing power consumption, heat output and the PCB space required. It could then have shared the rest of the PS4’s existing IO chips and circuitry.

    There was literally no reason for backwards compatibility to be removed beyond corporate greed. Blindly accepting it, and actually trying to justify it as a good thing, is one of the key reasons this hobby has gone down the toilet.