A private moment, caught by a Roomba, ended up on Facebook. Eileen Guo explains how: Lock and Code S04E03

Categories: Podcast

This week on Lock and Code, we speak with MIT Technology Review reporter Eileen Guo about how an image of a woman on a toilet—captured by a smart vacuum—ended up on Facebook.

In 2020, a photo of a woman sitting on a toilet—her shorts pulled halfway down her thighs—was shared on Facebook by someone whose job it was to look at that photo and, by labeling the objects in it, help train an artificial intelligence system for a vacuum.

Bizarre? Yes. Unique? No.

In December, MIT Technology Review investigated the data collection and sharing practices of iRobot, the developer of the popular Roomba robot vacuums. In its reporting, MIT Technology Review discovered a series of 15 images that were all captured by development versions of Roomba vacuums. Those images were eventually shared with third-party contractors in Venezuela who were tasked with “annotation”—the act of labeling photos with identifying information. This work of, say, tagging a cabinet as a cabinet, or a TV as a TV, or a shelf as a shelf, would help the robot vacuums “learn” about their surroundings when inside people’s homes.

In response to MIT Technology Review’s reporting, iRobot stressed that none of the images found by the outlet came from customers. Instead, the images were “from iRobot development robots used by paid data collectors and employees in 2020.” That meant that the images were from people who agreed to be part of a testing or “beta” program for non-public versions of the Roomba vacuums, and that everyone who participated had signed an agreement as to how iRobot would use their data.

According to the company’s CEO in a post on LinkedIn: “Participants are informed and acknowledge how the data will be collected.”

But after MIT Technology Review published its investigation, people who’d previously participated in iRobot’s testing environments reached out. According to several of them, they felt misled.

Today, on the Lock and Code podcast with host David Ruiz, we speak with the investigative reporter of the piece, Eileen Guo, about how all of this happened, and about how, she said, this story illuminates a broader problem in data privacy today.

“What this story is ultimately about is that conversations about privacy, protection, and what that actually means, are so lopsided because we just don’t know what it is that we’re consenting to.”

Tune in today.
