OPINION: Amazon buying the maker of Roomba robot vacuum cleaners gives it control over a dangerous volume of in-home data. Is it time for us to take stock and protect our most sacred spaces?
If you own a few select products, Amazon knows an insane amount about what goes on in your home. Own an Amazon Echo speaker? It can tune into your every utterance. Although it’s “not listening” unless you say the wake word, it can hear when you break wind. It can hear when you’re being intimate with a partner. It knows what you like to listen to while getting dressed.
If you connect a few smart products to said speaker, it can work out when you come and go, when you go to bed and when you wake up for that morning coffee. Your Alexa-connected smart light bulbs and your door locks give the game away. If you have a Ring security camera or video doorbell, that’s now owned by Amazon, and it can see who comes and goes.
Have a Fire TV product? Amazon knows what you watch. If you happen to shop at Whole Foods, Amazon knows whether you like to cook extra firm tofu or rib-eye steaks to go with an episode of The Boys. You get the idea, right? That’s a lot of trust you’re placing in a company that has a less-than-exemplary record in that regard (see here, here and here to name but a few incidents we’ve reported on in recent years).
This has happened quite quickly and easily: a full-on incursion into our private spaces, one smart device at a time, all in the name of making our lives easier and more expedient.
Sometimes you have to say it out loud to get your head around it. Amazon (and the rest of the big tech firms) have bugged our homes – our most sacred spaces – and we’re OK with it because voice control is convenient and still something of a novelty.
Sweeping it under the rug
The invasion just got a little more intense. Amazon has purchased iRobot, the firm behind the Roomba vacuum cleaners. “So what?” you may think. Well, Roombas work by creating a map of your house’s layout, so they can avoid banging into things.
So congratulations, Roomba users: Amazon now has a blueprint of the inside of your house. It knows if you could use a dining table, or a proper entertainment centre to replace that crappy TV stand. It knows whether you sleep in a single or a king-sized bed. Add that to the fact it potentially knows when you sleep (via Alexa lights on/off and door lock/unlock commands). And if you bought one of those dumbass Amazon Halo fitness bands, it knows how you slept (although not very effectively).
In isolation, Amazon buying iRobot isn’t the biggest deal in the world. However, it’s the cumulative effect of all of this data that’s so valuable to Amazon, as it continues a grand quest to become the only shop on Earth (and beyond). Roomba users’ data could unquestionably help Amazon sell even more smart home products. An Echo in every room? An Eero mesh Wi-Fi router for that hard-to-reach nook? Another Ring camera for the back yard? You get the idea. This push isn’t going away until every word and every inch of the home is accounted for and documented on an AWS server.
Sleeping with the enemy
The Roomba purchase is also another lesson for us as consumers. If you get yourself invested in a new and exciting start-up ecosystem, there’s always a chance your data will end up in the hands of big tech following an acquisition.
Ring users have already been through what Roomba users are about to. For a more extreme example, talk to some Fitbit users about how they feel about years and years of intimate body data (cycle tracking, sleep, heart rate, exercise, you name it) ending up in Google’s hands.
Is it time for governments to get involved here? From my perspective, Big Tech shouldn’t have access to any user data gathered before an acquisition takes place, unless users give explicit consent.
People should be given the opportunity to opt out. That way they can have a fresh start elsewhere, safe in the knowledge that the most powerful corporations in the world don’t hold intimate data they never agreed to hand over to a company that already knows too much.