This Information Age article forms part of a 7-part series on Ethics, covering artificial influencers, facial recognition, IoT, security and more. The series will culminate in an online panel on 11 December. Register to take part in the discussion and send your questions to the ACS Ethics Committee.

Imagine this: you come home after a night out only to find that the fancy new electronic door lock that you have installed won’t recognise your PIN.

A bit puzzled, but not really concerned because you have been having a few problems with gadgets lately, you go around to the back door, which has a normal lock.

The first thing you notice is how very hot it is inside; the second is that music has started blaring from the speakers connected to your piped-sound system.

And then the burglar alarm goes off…

Twenty minutes later, you’ve reset the alarm, turned down the thermostat, turned off the music and reset the door codes.

You decide that going to bed is the best plan.

You go to take a shower, but strangely there’s no hot water.

At 3am you wake in terror as every smoke alarm in the house goes off simultaneously.

And then you get a text from your ex – “just checking to see if everything is OK”…

A nightmare scenario?

Unfortunately, this picture may become more common as ever more devices are connected to in-home networks and the internet, forming the Internet of Things (IoT).

Gartner estimates there were 7 billion connected consumer devices in 2018, and the IoT has indeed already been reported as a tool in domestic abuse cases.

Amazon’s Alexa, Google’s Home and various other smart home hubs from Nest and others are sold as devices to make your life more convenient.

These hubs can be used to control door locks, switch lights on and off, control the heating thermostat, and are connected to sensors, switches, smoke alarms and even the garden sprinkler system.

This is quite apart from internet-connected fridges that can order food when stock gets low, and smart TVs that listen to your conversations waiting for your command, and remember what you like to watch and when.

A quick check of the website of just one provider of internet-connected home automation reveals a startling list of other household systems and appliances that can be controlled: smart beds, garage doors, hot water systems, power outlets, ovens, dishwashers, doorbells, automated window blinds and air filters.
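To make this concrete, here is a minimal sketch of how such a hub-and-device setup can be driven, using the MQTT publish/subscribe protocol common in DIY home automation. The broker address and topic names are hypothetical assumptions, not taken from any real product, and commercial hubs use their own proprietary APIs. The point it illustrates is that whoever can reach the broker, or the cloud account linked to it, can issue exactly the same commands as the legitimate owner.

```python
# A minimal, illustrative sketch: driving smart home devices over MQTT,
# a publish/subscribe protocol common in DIY home automation.
# The broker address and topic names below are hypothetical assumptions.
from paho.mqtt import publish  # pip install paho-mqtt

commands = [
    ("home/thermostat/setpoint", "30"),     # turn the heating right up
    ("home/lounge/speakers/volume", "90"),  # blare the music
    ("home/frontdoor/lock/pin", "0000"),    # overwrite the door code
]

# Whoever can reach the broker (or the account controlling it) can
# issue exactly the same commands as the legitimate owner.
publish.multiple(
    [{"topic": topic, "payload": payload} for topic, payload in commands],
    hostname="192.168.1.10",  # hypothetical in-home broker
)
```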

As an IT professional you may have already suspected that the self-proclaimed hacker that you dated for a while was behind these strange goings-on, but what about other people who are less technically savvy than you?

Is it reasonable to expect them to figure out what is going on, let alone fix the problem without spending large amounts of money for IT assistance?

What about someone who is already under severe stress due to abusive or controlling behaviour by a current or previous domestic partner?

And even if compromising their IoT home automation systems doesn’t lead to physical harm to the victim or their home (a real possibility), the psychological injuries caused could be significant.

Home automation products are designed for convenience.

They have not been designed to preserve or honour human rights.

This makes them especially susceptible to being repurposed as tools to invade privacy and to manipulate the emotions, sense of safety and security, or even the mental state of the victims of abuse.

IoT company Nest says of its system: “It just works. You don’t have to be a tech geek to get things working. Once you connect a product to Nest, it automatically starts doing things for you, without you having to program it”.

As privacy and security are not the central concerns of the manufacturers, the very lack of technical expertise required of typical owners may make it easier for others to compromise these systems, and magnify the impact when they are used to intimidate, frighten, manipulate and even injure victims.

But it’s not only the possibility of domestic abuse that is an issue.

One of the ethical issues with IoT devices is the opt-out rather than opt-in nature of data sharing with the seller of the system.

That is, if you have any option at all not to share your data.

Many of the hub products keep data in the cloud, and by data I mean, of course, your commands and system preferences.
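The difference between the two defaults is easy to state in code. The sketch below is a toy illustration only, with hypothetical field names that don’t come from any real product: under an opt-out default, data leaves your home unless you find the setting and switch it off; under opt-in, nothing is shared until you agree.

```python
# Toy illustration of opt-out versus opt-in data-sharing defaults.
# Field names are hypothetical, not taken from any real product.
from dataclasses import dataclass

@dataclass
class OptOutSettings:
    # Opt-out: sharing is ON until the owner finds and disables it,
    # the pattern this article criticises.
    share_voice_recordings: bool = True
    share_usage_analytics: bool = True

@dataclass
class OptInSettings:
    # Opt-in: nothing leaves the home until the owner explicitly agrees.
    share_voice_recordings: bool = False
    share_usage_analytics: bool = False
```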

There have already been instances of devices recording all conversations and storing them in the cloud, or even sending a recording of your conversations to others, as happened earlier this year with an Amazon Alexa system in Portland.

It’s not hard to imagine a device sending a visual recording of you on your own security camera instead, or mistakenly turning the oven on because it interpreted some dialogue on TV as a command to do so.
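It’s not hard to see why in principle. The sketch below is a deliberately naive keyword matcher, not how any real assistant works (they use wake-word detection and cloud speech models, and the phrases and action names here are hypothetical), but it shows the underlying failure mode: once audio is transcribed to text, a line of TV dialogue can be indistinguishable from a genuine command.

```python
# Deliberately naive sketch of a voice-command pipeline, to show why
# overheard audio can trigger real actions. Phrases and action names
# are hypothetical; real assistants are far more sophisticated, but
# the underlying ambiguity remains.
COMMANDS = {
    "turn on the oven": "kitchen.oven.power_on",
    "unlock the front door": "frontdoor.unlock",
}

def interpret(transcript: str) -> list[str]:
    """Return the device actions triggered by a transcribed audio snippet."""
    text = transcript.lower()
    return [action for phrase, action in COMMANDS.items() if phrase in text]

# A line of TV dialogue looks exactly like a command once transcribed:
print(interpret("In tonight's episode, the chef shouts: turn on the oven!"))
# -> ['kitchen.oven.power_on']
```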

So there is a real risk of IoT systems in the home being used for gaslighting; of causing moral, psychological and physical injury; and of damaging property.

Given these major ethical concerns, why aren’t governments taking notice and requiring improved standards of protection for privacy and human dignity to be designed in?

Perhaps states aren’t as alarmed as we think they ought to be by business’s erosion of the human right to live a life free from constant monitoring.

What do you think?

The Panopticon, anyone?

https://www.zdnet.com/article/iot-devices-will-outnumber-the-worlds-population-this-year-for-the-first-time/

https://www.seattletimes.com/nation-world/smart-home-technology-becomes-the-newest-tool-of-domestic-abusers/

https://nest.com/au/works-with-nest/

https://www.nytimes.com/2018/05/25/business/amazon-alexa-conversation-shared-echo.html

https://en.wikipedia.org/wiki/Gaslighting

https://en.wikipedia.org/wiki/Panopticon_gaze

Michael Wildenauer is Professor of Practice in Management at La Trobe Business School, and Vice Chair of the ACS Ethics Committee.

Register to take part in our Ethics online discussion on 11 December.

Read our entire 2018 Ethics series:

Part 1: Artificial influencers
Part 2: Facial recognition unmasked
Part 3: When IoT goes wrong
Part 4: Who’s to blame for phishing breaches?
Part 5: Could encryption legislation increase risk of being hacked?
Part 6: Would you install a keylogger at your workplace?
Part 7: Do you abide by a professional code of ethics?