> Google Glass originally failed due in part to public backlash at being recorded without consent in public spaces. However, it’s also true that in the decade since, people have become more accustomed to being filmed due to the rise of smartphones, vloggers, and TikTok.
This is a massive "citation needed." People around me are still almost always not filming, and I absolutely have not become accustomed to being filmed. I expect that's generally true for most people?
Just search for "glasshole" online and you get the articles from 2014.
Heh, I should have been clearer. I meant citation needed on the other sentence.
Are there new license plate readers that also do object and face detection? Are the AI corporations starting to train their algorithms on all of reality instead of just what is online? Tesla, at least, has long been collecting video data that could be used for this purpose, beyond training self-driving car algorithms.

Vehicles can also be tracked through subsonic vibrations: with an array of geophones, every vehicle movement in the world could be tracked and nobody would ever know it was happening. Perhaps SQUID magnetometers could do the same with far fewer sensors. They are so extremely sensitive that, without shielding, their output looks like noise, but maybe AI could make sense of a signal that looks like noise yet is actually derived from every electromagnetic event on Earth. A scientist I talked to thought the confluence of so many signals would be impossible to disentangle, but maybe this is exactly the kind of problem AI would excel at, by seeing minute correlations that people or heuristics cannot. Instead of an LLM it might require a more time-domain-oriented algorithm. Not sure I'd want to live in such a world, though.
Really glad there are researchers out there coming up with bad actor scenarios, and I hope corporations, privacy advocates, and the government take these into consideration.
These are solvable problems. Not always 100%, but the harms can be mitigated. The truth, though, is that some people don't want these problems solved.
Another example in the wild was Apple AirTags.
[dupe] https://news.ycombinator.com/item?id=41726385
https://news.ycombinator.com/item?id=41724957
among other submissions.
Just focus on the students' source instead of repeating these news posts over and over and over
Source Doc: https://news.ycombinator.com/item?id=41724310
I'm curious about how they do the reverse image search. Do they just use Google?
The article mentions that they used publicly available databases. Their paper specifically mentions PimEyes and FaceCheck ID.
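PimEyes and FaceCheck ID don't publish how their matching works, but a common building block in reverse image search is perceptual hashing: reduce an image to a short bit string that stays stable under small changes, then compare hashes by Hamming distance. A minimal sketch of one such technique (average hashing) on toy grayscale grids — the data and functions here are illustrative, not what these services actually run:

```python
# Sketch of perceptual "average hashing," one common technique behind
# reverse image search. The toy 4x4 grayscale grids below stand in for
# real images; production systems hash detected face crops instead.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values to a bit string:
    each bit is 1 if that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# Two toy "images": the second is the first with slight brightness noise.
img_a = [[10, 200, 30, 220],
         [15, 210, 25, 230],
         [12, 205, 28, 215],
         [11, 198, 31, 225]]
img_b = [[14, 195, 33, 218],
         [18, 205, 22, 228],
         [10, 210, 30, 212],
         [13, 200, 29, 222]]

h_a, h_b = average_hash(img_a), average_hash(img_b)
print(hamming(h_a, h_b))  # → 0: identical hashes despite pixel noise
```

A small Hamming distance between hashes flags two photos as likely the same image, which is how a crawl of public profile photos can be matched against a face captured on the street.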
This use seems like it would violate PimEyes' terms of service, which say it should only be used with the consent of the people in the photos. (In practice, it's clear they don't actually care.)
Stuff like this makes me glad that my family and I practice digital hygiene such that there isn't a single photo of me online that could lead back to me.
For everyone else's sake I hope employing automated (facial) recognition on private citizens becomes so highly illegal, no sane person would ever dare to do crap like this.
All I've ever wanted from AR glasses is the ability to look up people's names so I don't have to remember them.
While I like what they stitched together, it has nothing to do with Meta ... at the end of the day, any camera can do the same.
Meta just released an internet-connected camera disguised as eyewear. That makes this situation quite different. If a person wants to reject being photographed, they can move away from the person holding the camera, or avoid public places that have surveillance systems. The bar has now been raised: to reject being identified, a person must avoid anyone wearing glasses.
Far more people wear glasses than hold a camera.
In basically any public space, there are tons of people on their phones, and the sense that these people could easily be turning a camera towards me is, for me at least, constant. I don’t think this is very new.
You don’t think the camera being on a person’s face is different from them holding it in their hands?
I understand the practical difference of angle, but in terms of how it affects public places for me, I mean that I already feel constantly surveilled. I really don’t feel like it makes much of a difference for me.
How many people would notice a pen sticking out of a pocket?