• 3 Posts
  • 1.11K Comments
Joined 4 years ago
Cake day: January 17th, 2022




  • Historical context: it’s a 1-year-old post.

    TPM itself isn’t the problem; technically it might even be a good solution. What the FSF precisely objects to is that it is “out of the user’s control”. They even mention that it’s not about theoretical ideas but about how it’s actually used. If Microsoft gets to decide HOW your computer works DESPITE you wanting it NOT to behave that way, AND that makes Microsoft itself, or its partners, even more entrenched, then it’s a serious problem: it means “your” computer is their computer.

    What we have all witnessed is that bit by bit, OSes like Windows, but also macOS and Android, are not simply providing stores or tightly controlled channels (with fees for themselves) but ALSO removing entirely, or making radically harder, the ability to install software the user actually wants to install (not malware).

    It’s not about TPM; it’s, as usual, about who controls your computer.


  • Apple still has the most reliable out of the box experience for hardware.

    Out of curiosity, did you try an equivalent, e.g. a Framework, a Tuxedo, or a Steam Deck, or only generic hardware, like a random PC with a random distribution slapped on it?

    I don’t want to presume about your experience, only to highlight that Apple’s out-of-the-box experience had better be flawless precisely because they have very limited hardware to support. In fact, I would argue any distribution, even an obscure one, could fare very well if it only had to support well-known hardware (even hundreds of models), as opposed to an open and thus endless ecosystem.





  • It sure is possible to embed invisible information into videos and images: it’s called metadata. Now you might think of other techniques, e.g. https://en.wikipedia.org/wiki/Steganography, but AFAIK (and I won’t pretend I know the state of the art in the domain) most if not all marks placed within the data itself (thus becoming data, not metadata), e.g. a visible stamp in an image, are made to remain visible. Compression codecs specifically target the visible or audible spectrum: one of the most basic ways to “compress” information lossily (as opposed to losslessly) is precisely to remove the ends of the spectrum that the average human audience does not perceive.

    So… AFAICT it’s either visible, and thus can be spotted (and removed, even if only by adding a black mark over it), or it’s not visible, but then it will most likely be stripped by basic compression codecs without anyone even trying to do so.

    TL;DR: no, and I wouldn’t be until I see this in the wild (not just a research paper claiming it’s technically possible). The quick sketch below shows why a naive invisible mark doesn’t even survive ordinary compression.
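
    A minimal sketch of that, assuming Pillow is installed (pip install Pillow); the file names, pixel positions, and byte value are purely illustrative. It hides one byte in least-significant bits (a classic steganography trick), then round-trips the image through lossy JPEG to show how ordinary compression wipes the hidden data out:

    ```python
    # Illustrative only: hide one byte in the red-channel least-significant bits
    # of the first 8 pixels, then round-trip through lossy JPEG and see whether
    # the "invisible" byte survives.
    from PIL import Image


    def embed_byte(img: Image.Image, value: int) -> Image.Image:
        """Write the 8 bits of `value` into the red LSB of pixels (0,0)..(7,0)."""
        out = img.copy()
        px = out.load()
        for i in range(8):
            r, g, b = px[i, 0]
            px[i, 0] = ((r & ~1) | ((value >> i) & 1), g, b)
        return out


    def extract_byte(img: Image.Image) -> int:
        """Read the byte back from the red LSB of pixels (0,0)..(7,0)."""
        px = img.load()
        return sum((px[i, 0][0] & 1) << i for i in range(8))


    original = Image.new("RGB", (64, 64), (120, 130, 140))
    stamped = embed_byte(original, 0b10110010)

    stamped.save("stamped.png")              # lossless: the hidden byte survives
    stamped.save("stamped.jpg", quality=85)  # lossy: the LSBs get quantized away

    print(extract_byte(Image.open("stamped.png")))  # 178, i.e. 0b10110010
    print(extract_byte(Image.open("stamped.jpg")))  # almost certainly not 178
    ```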


  • When you generalize your position, with your available time and technical knowledge as the limiting factor, to everybody, you are not saying it’s impossible for you, you’re saying it’s impossible for anybody and everybody. That’s the problem. It’s like saying “I don’t like this food” versus “It tastes bad!”: for you they are equivalent, for others they are totally different. I’m not saying you, or anybody else, should learn about self-hosting (federated) social platforms and then set some up; what I’m rejecting is giving up pre-emptively on behalf of others, because that hands power back to BigTech.












  • Worthwhile yet tricky. Companies like OpenAI, Google, Meta, etc. are full of experts in statistics, and they have access to a lot of storage space. If you use a service from those companies, say 4 hrs per day between 7am and 9pm at a certain frequency, e.g. 10 requests/hour, and then suddenly, once you realize you actually do not trust them with your data, you do 10,000 req/hr for 1 hr, that’s a suspect pattern (the toy sketch at the end of this comment shows how trivially such a burst stands out). They might then be able to roll back automatically to just before that “freak” event. They might still present your data to you, as a user, with the changes, but not keep them in their internal databases.

    So… I’m not saying it’s not a good idea, nor that it isn’t useful, but I bet doing it properly is hard. It’s probably MUCH harder than doing a GDPR (or equivalent) takeout request, then a deletion request, AND avoiding all services from these providers that might leverage your data.
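
    Purely as an illustration of that suspect-pattern point (the numbers, threshold, and function name are hypothetical, not anything these companies actually run), a naive per-user check could look like this:

    ```python
    # Hypothetical sketch of a per-user anomaly check: flag any hour whose
    # request count is wildly outside that user's own historical baseline,
    # so a sudden "poison my data" burst is trivial to detect and roll back.
    from statistics import mean, stdev


    def is_suspect(hourly_counts: list[int], new_count: int, z_threshold: float = 5.0) -> bool:
        """True if `new_count` is more than `z_threshold` standard deviations
        away from the user's usual hourly request volume."""
        mu = mean(hourly_counts)
        sigma = stdev(hourly_counts) or 1.0  # avoid division by zero for flat histories
        return abs(new_count - mu) / sigma > z_threshold


    # Months of roughly 10 requests/hour, then one hour with 10,000 requests.
    baseline = [10, 12, 9, 11, 10, 8, 13, 10, 9, 11]
    print(is_suspect(baseline, 12))      # False: looks like normal usage
    print(is_suspect(baseline, 10_000))  # True: trivially flagged as an outlier
    ```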