- cross-posted to:
- privacy@lemmy.ml
The creator of Nearby Glasses made the app after reading 404 Media’s coverage of how people are using Meta’s Ray-Bans smartglasses to film people without their knowledge or consent. “I consider it to be a tiny part of resistance against surveillance tech.”
more at: @feed@404media.co



You know what sucks?
That AR glasses are, in theory, such an interesting technology with lots of potential, and certainly a piece of tech I would love to have and to work with and on. Not to secretly record people, but to, well… augment my field of view with whatever digital tools or displays I'd like. It would be so useful.
It’s honestly kinda saddening to me that it will most likely get completely ruined by our current toxic relationship with technology. A step towards our ever-increasing cyberdystopia, not towards enchanting our limited lives.
Obviously either way I don’t trust Meta, but an open-hardware device running a FOSS AR system? It would be nice…
I still hold out hope that this somehow could be resolved, and I would love to contribute to open software for these devices. Maybe one day soon-ish I will. My expertise should be well applicable, after all
It would be incredibly useful in construction. Having a digital overlay telling you exactly where to put up the framing for a separating wall, or an overlay showing the correct distance between screws, or where wires and pipes are inside a wall? There are so incredibly many awesome possible uses for AR in construction.
I always wanted to build an AR app for inside data centers. Imagine looking at a server and being able to open a terminal or desktop that you can immediately interact with on the floor. or have it display resource information like hardware utilization, temps, network throughput and configuration, etc.
It would make a difficult job just a bit more manageable.
I really like the idea of specially tagged tape that could bring up AR labels and details about whatever it marks. Organization and directions would be so much more useful.
Drop the cameras and microphones and replace them with a couple accelerometers and gyros. Paired with your phone’s GPS tracking, the glasses can tell where you’re looking without actually seeing anything. You can get handy features like a floating ‘turn here’ sign over your exit while driving with GPS navigation without recording anyone or anything at any time. Better battery life, too.
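The camera-free idea above can be sketched out with a bit of math. As a rough illustration (assuming Python, and simplifying the sensor fusion down to a single head-yaw offset from the gyro added to a GPS travel heading), the glasses could decide whether a "turn here" marker falls inside the display's field of view like this:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north, 90 = east)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def turn_marker_offset(gaze_bearing, turn_bearing, fov_deg=40.0):
    """Horizontal angle of the turn relative to the wearer's gaze.

    Returns the signed offset in degrees (negative = left) if the turn is
    within the display's field of view, else None (marker not drawn).
    """
    # Wrap the difference into (-180, 180] so 350° vs 10° reads as a 20° gap.
    diff = (turn_bearing - gaze_bearing + 180) % 360 - 180
    return diff if abs(diff) <= fov_deg / 2 else None

# Gaze bearing = direction of travel from two GPS fixes + head yaw from the gyro.
travel = bearing_deg(52.5200, 13.4050, 52.5201, 13.4052)  # two example fixes
head_yaw = -5.0                                           # gyro says head turned 5° left
gaze = (travel + head_yaw) % 360
```

The FOV value and the single-yaw-offset fusion are placeholders; a real device would also need pitch/roll from the accelerometers and some drift correction, but none of that requires a camera.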
Tbh I wouldn’t even mind cameras that much if they were entirely controlled by the individuals themselves. I have a much bigger issue with streaming my facial recognition data to Evil Megacorp 2™ servers that also feed directly to the “Not Spying… Again” agency, though.
I don’t think that would work particularly well for AR: people get sick if the rendered movement isn’t synced up properly, and not having any cameras or sensors at all would exacerbate that problem.
If you are talking about a simple HUD, then that might be a lot more viable, but for AR and the tech we currently have, some sort of camera or sensor array is kind of a requirement practically speaking.
Except that one cool thing with AR is being able to have it tell you what you’re looking at, not just position things in space. A lot of cool shit that could be done with AR, like real-time text translation, object identification, etc., needs some kind of camera, even if it only sees IR light. Lotta cool shit needs a microphone, too.
I agree, it would be nice.