Marketing teams around the world are coming up with various ways of selling smart devices, objects that connect to the Internet of Things (IoT), all of which revolve around the concepts of control, security, and efficiency. Doorbells and locks that are digitally controlled via smartphones; smoke detectors that sense smoke before a fire takes hold; cat flaps that tell you whether your cat is in or out of the house; and voice assistants that instantly give you any piece of information you could possibly need.
IoT’s Unfortunate Truth: Sex Toys and Fish Tanks
However, the reality of IoT is vastly different from the security and control that manufacturers claim, and it shines a concerning light on the industry. The software inside smart devices is invariably owned by the manufacturer, not the owner, much like the operating systems in smartphones. This means you can buy and own the hardware yet have no rights over the software, the part that does all the clever things IoT devices do and, more importantly, gathers the vast amounts of data the user creates.
Therefore, the manufacturers have complete control and ownership over the data produced by you and your smart device. There was, for example, a scandal not so long ago when the smart sex toy WeVibe was discovered to be gathering data and selling it to third parties: data on how often you use the toy, where you use it, and how long you use it for.
Did WeVibe’s customers know that this was going on? No. Were they upset when they found out? Oh yes. And thankfully, WeVibe paid through the nose for it, settling a class-action lawsuit over the practice. Which is no surprise when you consider how heavily Donald Trump’s recent Presidential campaign leaned on big data harvested from people’s personal devices and online activity. The idea that this could include stats on how frequently and vigorously one uses their erotic massager doesn’t sit very comfortably, as it were.
And let’s not forget Roomba, the robot vacuum, whose high-end models can map the user’s house in order to increase cleaning efficiency. But then we found out that, for some godforsaken reason, the company behind Roomba was selling these maps to third parties. As to why, one can only guess, but it’s certainly disconcerting to know it’s happening.
IoT is intruding on our privacy, but it is also becoming a very real and very exploitable security risk. Take, for example, the casino robbers who got in through the fish tank.
Yep, one large North American casino had a fish tank connected to the Internet of Things in order to control water temperature, feeding times, and so on. A canny group of data crooks managed to hack the tank and then used that access to move deeper into the casino’s network; they left with over 10GB of private data.
IoT, then, robs us of privacy and threatens our security, a laughable example of oxymoronic irony given the industry’s marketing promises.
Hacking IoT devices, it seems, is child’s play. More importantly, in my opinion, this is an industry that promotes opaque relationships and operates by a questionable moral code. Uptake of IoT has been so great not because people don’t care about the privacy and security risks they are bringing into their homes, but because manufacturers actively bury and withhold much of the pertinent software and data agreements that users enter into.
If more people knew the reality, would they still be so willing to give up ownership of their private lives and data? If they knew how real the risks were, would they continue purchasing IoT products? The more IoT devices you have in your life, the less privacy you have and the more strangers know about you. And nobody can be naive enough to believe that the companies buying and using their data are doing so to help them out, to offer kindness and support, to make the world a better place. No, the data is being used to manipulate you into doing and thinking certain things at certain times.
The more data you give away, the more comprehensive a picture of you strangers can build. They then make assumptions, most of which are scarily accurate, about your likes, dislikes, political views, love life, religious beliefs, style, education, and so on.
This knowledge is then used to coerce you into voting for a certain political candidate; buying a certain brand of cereal; donating to a certain charity; spreading a certain message; living a certain life. Without even noticing it, gradually over time, you’ve stopped making your own decisions because they are, essentially, being made for you; instead, you simply follow the tide.
What’s the solution?
Is there hope for IoT? Can these issues of privacy and security be overcome? Maybe, but only if the buying public makes it morally unacceptable, and then, hopefully, legally unacceptable, to hoodwink customers and gather unreasonable amounts of data in semi-secrecy.
All of our personal data, yours and mine, is incredibly valuable to a large number of people, something I don’t think many members of the public are aware of. If they were, I think they would have something to say about the fact that they are not benefitting from it. The manufacturer makes money on the sale of the device, and then more money again on the sale of the data their customer goes on to produce, every day, day after day.
So, if there is to be hope for the future of IoT, I think two very important things need to happen. Firstly, manufacturers need to prove, time and time again, that their devices are safe, secure, and resistant to hackers. Secondly, customers need to be rewarded rather than used for their data. Whether that reward is monetary, in some sort of cash-back scheme, or perks-orientated is for each individual manufacturer to decide, but if something isn’t offered soon, I can’t imagine it will be long before people clock on to what’s happening and begin rejecting IoT as a whole.
If the government fails to step in with some form of regulation, some set of rules, then the general public needs to stop accepting current practice, demand more clarity, insist on a more acceptable moral code, and ensure they are rewarded for the data they provide. Until then, we are paying twice for our smart devices: first we pay the price at the cash register, and then we pay the price of personal privacy and security.