Robot vacuum cleaners have always seemed like helpful, friendly little guys, but maybe that’s because I haven’t used them much. I like the idea of a little guy running around bumping into things and making my house cleaner. The idea seems less charming now that I’ve found out they can take pictures of you on the toilet that end up on social media without you knowing. That’s a level of weird I don’t want from my mechanical valet.
Eileen Guo at MIT Technology Review delved into how photos taken by iRobot’s Roomba vacuum cleaners, with users’ faces clearly visible, made their way onto social media. The photos include very candid shots of people doing personal things in their homes, including images of women and children on the toilet. Guo also put together an excellent Twitter thread with further explanation and useful links.
The sensitive images were clearly taken from the vantage point of a robot vacuum, presumably in the interest of data collection. Speaking to MIT Technology Review, one affected user explained that he was a product tester for the iRobot Roomba J Series, which meant letting the robot roam his house collecting information in the hope of improving the product.
Expecting your data to be sent securely back to a company that wants to train its cleaning AI is one thing; finding out those images will also end up on social media is another. MIT Technology Review found that after iRobot collects all that data, it forwards it to other companies for data annotation. One such company was Scale AI, which hires outside contractors to help review the uncensored data.
This led to workers sharing images among themselves on social media, from which they of course found their way to the rest of the world. Many users felt this was a breach of their trust, if not their contracts as testers. At least 15 images have made their way out, but many more have likely been shared. Appropriately, iRobot no longer works with Scale AI.
Unfortunately, iRobot hasn’t done much more than that to help those affected or rebuild trust. CEO Colin Angle responded to the MIT Technology Review report in a LinkedIn post that didn’t acknowledge any problem or danger in handing these uncensored images to gig workers. With so little accountability or recourse for testers, it seems like a dangerous practice.
Angle spends the first part of the LinkedIn post talking about how great the company’s Roombas are, attributing that to the data collected from these kinds of testers, then throws them under the bus by pointing out that they aren’t consumers and consented to having their data collected.
It’s good to know this doesn’t happen with regular consumer iRobot products, but the lack of accountability to testers doesn’t make me want to grab a Roomba any time soon.
The LinkedIn post also chastises MIT Technology Review for sharing censored versions of the images in its article, which doesn’t make a lot of sense considering iRobot was already voluntarily sharing uncensored versions with strangers, and those copies ended up online anyway.