The Fourth Amendment protects your right to privacy, and the Fifth protects you from self-incrimination. That’s why the government has to get a warrant before it can snoop inside your house, and why police have to warn you that you have the right to remain silent when they question you. But in the fall of 2018 Google secured a patent for a system that will monitor and evaluate everything you say and do and take corrective action if you stray from prescribed norms. All you have to do is click to agree and you’ve signed away your privacy and your protection against self-incrimination.
Google’s Smart-Home Automation System will secure your house, as any modern home automation system should, seamlessly coordinating your smoke, burglar, and carbon monoxide alarms, your appliances, and your ambient temperature, humidity, and lighting. But it will also get personal, setting goals for household members based on its analysis of the data that its audiovisual sensors gather about everyone living in the house. That’s just for starters. Then it tailors a series of rewards and punishments to get everyone to meet their goals.
You won’t have to fill in any bothersome forms or create a password which you’ll promptly forget. Google already knows more about you than you know about yourself, including your DNA profile and your mother’s maiden name, and what it doesn’t know it will infer from the data that its sensors transmit as you go about your daily routine. Despite its nerdy name, Google’s Smart-Home Automation System is a digital Panopticon promising nothing less than universal surveillance and total population control.
Jeremy Bentham proposed the first Panopticon back in 1791. His Panopticon, from the Greek for ‘all-seeing,’ was a building that could be a prison, school, factory, hospital, or any other institution whose residents need monitoring and correction round the clock. The people to be watched—prisoners, students, workers, patients—were secured in cells arrayed around the outside wall. An “inspector” sat in a central booth, unseen by those being observed, but able to see them. Even though the inspector could not watch every inmate at every moment, knowing that they could be seen would be enough to ensure appropriate behavior by prisoners and keep students and workers on task. (Given the state of medicine at the time, patient survival had little to do with whether anybody watched them.)
Bentham’s design worked better on paper than in real life, but he did establish one enduring principle: technology—in this case, the design of a building—can be harnessed to control behavior and destroy any expectation of privacy.
The Panopticon was revived in the twentieth century by advances in communications technology. George Orwell anticipated today’s widespread use of domestic surveillance in his novel 1984, published in 1949, where always-on “telescreens” observe people in the privacy of their homes. Then, in the 1950s, in a precursor to today’s smart home, China began distributing party propaganda to homes in local villages through wired radio loudspeakers which could not be turned off. But in those days, actual snooping still had to be conducted the old-fashioned way, through human spies and informers.
Now, though, homes are filling up with smart TVs and Wi-Fi thermostats, mobile phones, tablets and laptops, programmable toasters and slow cookers, electronic door locks, automated lightbulbs, and digital assistants like Siri and Alexa and Google Assistant that answer our questions before we ask them, choose our music and videos, and update our social media. But patent no. US 10,114,351 B2 will do more than turn off the lights when no one’s in the room, text when the milk’s past its sell-by date, or locate lost keys under that sofa cushion. The Smart-Home Automation System will scoop up the data generated by all the smart devices in your localnet (what you used to call your house) and “suggest or automatically implement selected household policies based on sensed observations.”
Just go about your daily routine and your Smart-Home Automation System surreptitiously builds a picture of you and any other household residents, records what all of you say and do, and then sets about altering your behavior. Once it compares the activities it monitors inside the home with patterns gleaned from a wider demographic of “users like you,” it will suggest what it regards as improvements, like cutting down on screen time or eating together as a family more often. It’s a good thing that Google isn’t Hobby Lobby, forcing families to stay together and pray together—after all, the Constitution really does say nothing about separation of church and corporation. But Google seems blithely unaware of the irony: it recommends reduced internet use (it will even shut off your access once it decides you’ve surfed too much) by means of surveillance conducted through always-on, internet-enabled monitors and sensors.
The Smart-Home Automation System does other “good things” as well. It sends data from the smoke alarm to the fire department. It alerts the police to the presence of an unexpected occupant. It calls the paramedics if you fall and don’t get up. And it does all this not simply by interpreting data from a wearable device or a window sensor, but by “inferring” behaviors observed with passive audio, video, and thermal sensors throughout the home and in the yard, and adjusting its response accordingly. For example, the System will reset your alarm while you’re asleep if unexpected traffic shows that you will be late for work. Being of a moralistic bent, the System won't automatically text your boss if you turn off the alarm and sleep in. But if you try to disable the home automation system it just might activate its “I know what’s best for you loop” (option-control-🤷♀️) and send your boss that automated text about your malingering.
The System is continually calculating whether resident behaviors are appropriate, and like Bentham’s inspector, the System can punish residents who break the rules. Gone off your diet? Refrigerator locked till breakfast tomorrow. Teen stays out too late? Phone bricked for a week. Foul language? No need for a swear jar; Google will simply debit your account after the second offense.
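Stripped of the patent-speak, a “household policy” of this sort is just a rules engine: sensed observations go in, sanctions come out. The sketch below is purely illustrative—the rule names, thresholds, and sanctions are invented for this essay and come from neither the patent nor any Google product.

```python
# Hypothetical sketch of "household policies" as a simple rules engine.
# All rules, thresholds, and sanctions below are invented for illustration.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Observation:
    """What the home's sensors claim to have 'inferred' about a resident."""
    resident: str
    screen_minutes: int = 0
    curfew_missed: bool = False
    swear_count: int = 0

@dataclass
class Policy:
    name: str
    violated: Callable[[Observation], bool]  # does this observation trip the rule?
    sanction: str                            # what the system then does

POLICIES = [
    Policy("screen time", lambda o: o.screen_minutes > 120, "shut off internet access"),
    Policy("curfew", lambda o: o.curfew_missed, "brick phone for a week"),
    Policy("language", lambda o: o.swear_count >= 2, "debit the swear-jar account"),
]

def enforce(observation: Observation) -> List[str]:
    """Return the sanctions the system would 'automatically implement'."""
    return [p.sanction for p in POLICIES if p.violated(observation)]
```

For example, `enforce(Observation("teen", screen_minutes=180, swear_count=2))` would trip the screen-time and language rules at once—the unsettling part is not the trivial logic but the always-on sensing that feeds it.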
It’s the nanny state and Santa with a hefty dose of Puritanism thrown in. Consider this hypothetical from the patent application: “When children are near a liquor cabinet or are in their parents’ bedroom alone, the system may infer that mischief is likely to be occurring.” Indeed.
On a more positive note, the System will disable your car ignition if it decides you’ve had one too many, but if it detects an unexpected occupant in your bed, it will Photoshop a scarlet A onto your profile picture.
Here are some of the other things that the Smart-Home Automation System infers from the “emotional context data” it is constantly gathering:
- It decides whether you are happy or sad by comparing your facial expression and vocalizations with what it knows about your past behavior and the data that it has gathered about people like you. These findings “may be used in household reporting, near real time notification, or may be used to control smart devices within the home.” Your secrets are never safe: Google has both the NSA and the National Enquirer on speed dial.
- It can detect bullying or other undesirable actions, alerting the police or child protection services. For less serious offenses, it will brick your devices and send you to your room to think about what you did.
- It can detect “interaction with undesirable substances”—especially if you didn’t bring enough to share.
- It tracks chore completion, blocking access to “content playback devices” until the chore is completed to the home assistant’s satisfaction (no PlayStation for you).
- It logs your instrument practice (its sensors can detect a squeaky clarinet at 300 meters).
- and the attentiveness of babysitters (even without a camera hidden in a plush toy).
- and the fulfillment of a promise—it will even decide whether a promise “is capable of being completed” (in which case it’s not really a promise; Grice would be so proud).
- And it can monitor the supervisors as well as the supervised, making sure everyone is doing exactly what they’re supposed to be doing (quis custodiet ipsos custodes?).
- The System can suggest “areas of improvement” for the household. Better handwriting. Daily bed-making. That thing about eating meals together. Don’t forget to floss.
- It monitors progression towards goals, providing rewards for progress (Good job, you get five minutes more online).
- and punishing failure (no soup for you).
Don’t panic just yet. Orwell’s telescreens never came to pass, Mao’s wired propaganda radio reached only a fraction of China’s villages (though that still meant a few million peasants forced to listen), and Google’s Smart-Home Automation System isn’t a reality yet either. Not every patent leads to a device. In 2010, Apple patented a system for iPhones to control swearing, force children to do their homework, teach foreign languages, and report back to parents and teachers on a student’s progress or failure. Still waiting for that.
We regularly trade privacy for convenience, giving up our location in order to find the best place to eat nearby, or storing payment information in the cloud so we don’t have to enter a long string of numbers every time we buy online. We also know that, despite the altruistic scenarios in its patent application, Google’s goal is not to make the world or your community or your household a better place, but to monetize your data before its sell-by date. So before it brings Smart-Home Automation to market, Google should think twice about the System’s creep factor, which could seriously impact sales. After all, it’s fine to let a bot control the lighting and the HVAC, but no one wants an app to infer appropriate behavioral goals and then enforce them. Even Jeremy Bentham might think that was too weird.