• magic_smoke@lemmy.blahaj.zone
    1 day ago

    Key signing maybe?

    You get a cert which is cryptographically signed by your government. They can prove it’s signed with the government’s root cert, showing that it’s someone over 18, but not who.

    That being said, the key identifiers will probably still be attached to you in some government DB, just not on the porn site.

    Though the government could force the porn site to hand over any logged IDs. Some people would still call that private, since they trust the government not to do anything without a judge’s warrant.
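
    For illustration, here’s a minimal sketch of what issuing and checking such a cert could look like, using Python’s cryptography library. The key handling and the “age_over_18” claim are assumptions for the example, not any real government scheme:

```python
# Minimal sketch (hypothetical): a site verifies an "over 18" attestation
# signed by a government root key, without learning who the holder is.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# --- issuance (done by the government) ---
root_key = Ed25519PrivateKey.generate()   # government root signing key
claim = b"age_over_18"                    # the claim carries no identity
signature = root_key.sign(claim)          # handed to the citizen as their "cert"

# --- verification (done by the site) ---
root_public: Ed25519PublicKey = root_key.public_key()  # published root key

def is_adult(claim: bytes, sig: bytes, pub: Ed25519PublicKey) -> bool:
    """Return True only if the claim was signed by the government root key."""
    try:
        pub.verify(sig, claim)
        return True
    except InvalidSignature:
        return False

print(is_adult(claim, signature, root_public))  # True
# In practice each cert would also carry a unique (pseudonymous) identifier,
# which is exactly the government-side linkage mentioned above.
```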

    As a trans woman relying on HIPAA to not be put on a list of those on HRT, lmao yeah fucking right. The christian taliban will connect the dots the first chance they get.

    • ArchRecord@lemm.ee
      1 day ago

      They can prove it’s signed with the government’s root cert, showing that it’s someone over 18, but not who.

      This is generally a pretty decent system in concept, but it has some unique flaws.

      A similar system is even being developed by Cloudflare (“Privacy Pass”) to make CAPTCHAs more private by allowing you to anonymously redeem “tokens” proving you’ve solved a CAPTCHA recently, without the CAPTCHA provider having to track any data about you across sites.

      They know that someone who solved a CAPTCHA recently is redeeming a token, but they don’t know who.
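
      To make that concrete, here’s a toy sketch of the token flow in Python. This isn’t Cloudflare’s actual protocol; it’s just the general “sign something you can’t see, redeem it unlinkably” pattern, shown with a textbook RSA blind signature:

```python
# Toy sketch of an unlinkable token: the issuer signs a blinded value,
# so the later redemption can't be tied back to the issuance request.
# Illustrative RSA blind signature only, not the real Privacy Pass protocol.
import hashlib
import secrets

from cryptography.hazmat.primitives.asymmetric import rsa

# --- issuer key (e.g. the CAPTCHA provider) ---
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pub = key.public_key().public_numbers()
n, e = pub.n, pub.e
d = key.private_numbers().d

# --- client: create and blind a random token ---
token = secrets.token_bytes(32)
m = int.from_bytes(hashlib.sha256(token).digest(), "big")
r = secrets.randbelow(n - 2) + 2      # blinding factor (coprime to n w.h.p.)
blinded = (m * pow(r, e, n)) % n      # the issuer never sees m itself

# --- issuer: signs the blinded value once the CAPTCHA is solved ---
blind_sig = pow(blinded, d, n)

# --- client: unblind; the result is an ordinary signature on m ---
sig = (blind_sig * pow(r, -1, n)) % n

# --- verifier: anyone with the public key can check the token later ---
assert pow(sig, e, n) == m
print("token verifies, but can't be linked to the issuance request")
```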

      This type of system will always have one core problem that can’t really be fixed, though: the sale and transfer of authenticated tokens/keys/whatever they get called in a given implementation.

      Someone could simply take their signed cert and allow anybody else to use it. If you let the government see who is using their keys, but not the porn sites, you’ve handed the government a timestamped database of every porn user. If you don’t give the government that ability, even one shared cert defeats the whole system. And if you add a rate limit to work around that, you can end up blocking access whenever a site, browser, or extension is slightly misconfigured in how it requests the cert, or breaking someone’s ability to use their cert the moment it gets leaked.

      And even if someone isn’t voluntarily offering up their cert, it will simply get sold. I’ve investigated sites selling IDs and SSNs for less than a dollar apiece before, and I doubt something even less consequential, like an ID just for accessing online adult content, would sell for even that much.

      I’ve seen other methods too, such as “anonymous” face scans where processing is done locally and only the result of a cryptographic challenge is sent back to prove you’re over 18. But that fails adults who simply look younger, it can be bypassed by the aforementioned sale of personal data to people wanting to verify, and it’s often easily fooled by, for example, videos and photos of people on YouTube.

      • magic_smoke@lemmy.blahaj.zone
        22 hours ago

        Who under the age of 18 will have money to buy these, and who would be willing to sell them for the pittance teenagers would be willing to spend?

        Especially if these get rotated out regularly via a system wide program.

        • ArchRecord@lemm.ee
          20 hours ago

          Who under the age of 18 will have money to buy these

          Anyone with at least $0.25-$1 and access to any method of digital payment (gift cards for most retailers, PayPal, Cash App, Zelle, prepaid or regular debit cards, any cryptocurrency, etc.).

          and who would be willing to sell them for the pittance teenagers would be willing to spend?

          Primarily bad actors who obtain the credentials any number of ways, then either sell them directly or sell them indirectly through third-party storefronts that buy from the bad actors in bulk. Believe me, I’ve watched hundreds of kids in Discord servers publicly sharing and using clearweb sites where they send in a dollar via Cash App, buy a stolen set of bank credentials, and try withdrawing money back to their Cash App account.

          I’ve monitored so many of these sites and seen how easy it is for anybody, even teens with limited payment options, to buy stolen credentials that are far more important, and far better protected, than something whose only purpose is accessing an NSFW site.

          Some of these site owners operate for months before eventually shutting down and re-opening under separate storefronts for anonymity. I know of one who was selling stolen SSNs, IDs, gift cards, and assorted accounts, and who made, by my estimates, at least a million dollars in revenue every month off items almost all within the price range of any child or teenager.

          Especially if these get rotated out regularly via a system wide program.

          Rotation can help, but it doesn’t stop these services from operating. They just sell in smaller, more quickly refilled batches instead of buying large batches and reselling them over longer periods. It can push prices slightly higher, but in the end it doesn’t prevent kids from accessing this content.

          But what it does end up doing is creating perverse incentives.

          It drives people to even less regulated, more harmful porn sites. It leads to the further stealing of credentials and personal information. It creates databases and online footprints that can be used to blackmail people, and it normalizes giving sensitive personal information to random websites online.

          The last thing you want when you’re trying to prevent people from getting scammed is to monetarily encourage scamming people out of their credentials and biometric data, while simultaneously making it easier for people to unknowingly hand over credentials and biometrics by normalizing the process.

          This is something practically every digital rights organization argues against, and for good reason. It’s a generally unsafe system that creates bad incentives and drives people to even more unsafe options.

          The best mechanisms by far to prevent kids from being exposed to harmful material, or at the very least to keep them from experiencing much harm from it, are proper parental controls and general internet monitoring by parents, good sex education, and parents actually talking with their kids instead of fostering the us-vs-them mentality that drives many kids to rebel against these restrictions, even when the restrictions are there to benefit them.

          That’s why news like this is always so upsetting to me. It’s a mom who is understandably upset, but instead of taking accountability for leaving an unsecured, internet-connected laptop easily accessible to her kid without monitoring it at all, she simply blames the platforms her child decided to access, even though we know she could have done many things herself to prevent this without risking anybody’s privacy or safety, unlike what age-gating regulations do in practice.

          • magic_smoke@lemmy.blahaj.zone
            20 hours ago

            Oh no, I wholeheartedly agree age verification is the wrong answer; I was just playing devil’s advocate on the technological side.

            Parents should be parents, and governments shouldn’t be getting in the way of you being just another ghost in the machine.