Meta has a serious problem. New reports show that recordings made by its smart glasses were accessible to people who could watch users in very private situations. The issue has already triggered lawsuits and investigations, and the scale of the problem appears larger than previously thought.
Recordings from home were sent for analysis
Media reports indicate that Meta used an outside contractor to analyze recordings. Employees were tasked with tagging objects, but in practice they saw far more: the material included intimate situations, daily routines, and private moments that users almost certainly did not intend to share. The problem is compounded by the fact that many people may not have realized what exactly their devices were recording.
Not just images: audio was analyzed too
That is not all. Meta also analyzed audio recordings to train its AI, which means user conversations may have been processed, even when they touched on sensitive topics. Worse, there is no simple way to fully opt out of this kind of analysis while still using the AI features.
The matter quickly reached the courts: a lawsuit has been filed in the USA, and an investigation has been launched in the UK. Experts stress that in this business model the data, not the hardware, is what matters most. In practice, this means the user and their surroundings become a source of information for the company. Meta claims its practices are similar to those of other companies and that it filters the data, but this has not reassured critics.
The smart glasses scandal shows how great the risks are in the development of AI and wearable devices. The line between convenience and privacy is growing ever thinner, and users may not have full control over what reaches tech companies.
Source: Svenska Dagbladet, TechCrunch, flatpanelshd
Choose TV editorial team