Digital Assistants And Your Psychologist
Every morning, I shuffle into the kitchen to complete my morning routine: pour myself some cereal, put the kettle on to brew some tea, and mumble into the darkness, “Alexa, what’s new?” The Amazon Echo Dot on my kitchen counter cheerfully tells me the current weather and the forecast, then plays the news from my selected news sources. That small device, about the size of a hockey puck, has become integral to my daily life: it acts as a timer when I’m cooking (“Alexa, set a timer for 5 minutes”), enters information into my calendar (“Alexa, set a calendar event for today at 3pm to pick up the dry cleaning”), and makes me laugh (“Alexa, tell me a joke”). The rise of the Amazon Echo and Google Home (which I also have) has made home automation more interactive and intuitive. These devices can integrate with thermostats (like Nest) to control your home temperature, with light switches to control your lighting, and with your TV to let you stream video just by asking for it. They are also very affordable ($50, and sometimes as little as $30 on sale).
These devices free our hands and eyes by completing tasks for us. But in order to do those things, they must listen for us. And they listen all the time. For them to hear my call for their attention (whether I say “Alexa…” or “Hey, Google…”), they must be actively listening to all the sounds in the room, and the home, where they’re situated. The reality is that while I’m sitting at the kitchen table talking to my wife about how horrible traffic was, my Echo Dot is listening to that conversation, ready to interject if it hears me say "Alexa." Otherwise, it is simply a silent witness to that conversation, and to the other conversations around it.
While our home is replete with these devices, we will not have such a device in our offices. The reason is quite simple: the privacy of our patients is paramount, and these devices, as they are currently designed, could present a significant challenge in protecting our patients’ privacy.
The Seminal Case
These digital assistants can create a record of a person’s habits, whereabouts, and interactions. Because the assistant is always listening, it may record information that the user never intended it to record. Those communications could potentially be subpoenaed by law enforcement (and others) for use in court proceedings. A recent murder investigation in Arkansas brought the concerns raised by these devices to the forefront. During their investigation, the police discovered that an Amazon Echo was present in the room during the crime. They issued a search warrant to Amazon for access to all the data gathered by the device at the time of the murder (see State of Arkansas v. Bates, Case No. CR-2016-370-2).
In an excellent article from the American Bar Association (https://www.americanbar.org/publications/blt/2017/07/05_boughman.html), the author asks, "Should one expect privacy in the communications he engages in around a voice-activated digital assistant? The Arkansas homeowner’s lawyer seemed to think so: ‘You have an expectation of privacy in your home, and I have a big problem that law enforcement can use the technology that advances our quality of life against us.’" This case bumps up against the Fourth Amendment of the US Constitution, which reads, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated…” Because the crime happened in a private home, the assumption is that the people involved were due privacy in that home, as guaranteed by the Fourth Amendment.
According to the Bar Association article, the data collected by digital assistants would receive no special treatment under the Fourth Amendment, since very little data is actually stored on the device itself. The data captured by the device is stored in "the cloud" on Amazon's servers. "Under existing law, it is likely a court would hold that users of voice-activated technology should expect no greater degree of privacy than search engine users. One who utilizes a search engine and knowingly sends his search inquiries or commands across the Internet to the search company’s servers should expect that the information will be processed, and disclosed as necessary, to provide the requested services."
So, in short: because that information was sent from your home, via the device, to the service provider's servers, you are in essence acknowledging that the information is not so private that you're unwilling to send it to a third party (in this case, Amazon).
Our Protection of Your Privacy
In your psychologist’s office, you have the expectation of absolute privacy. That guarantee of privacy is codified by law in the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the overarching law that protects your private health information. We comply with HIPAA scrupulously, and as such we cannot, and will not, reveal anything about you without your written consent, or without a court order compelling us to reveal that information. Everyone in our office who handles your records is trained extensively on what HIPAA means and what they can and cannot do with your Protected Health Information (PHI). They are legally bound by its provisions, and we take great pains to ensure that HIPAA guidelines are followed to the letter.
Which brings us back to the Amazon Echo and Google Home. Placing one of those devices in our offices would open a new window into that most personal of relationships: the one between patient and clinician. And as of this writing, we would have no control over the data generated by its presence in our offices.
These devices are very handy and, frankly, I love having them in my home. They're fun to interact with, and they do make things easier in our very busy lives. But from the standpoint of a health care professional, these devices are not something we would introduce into the protected environment of our offices.
Some more reading: https://www.forbes.com/sites/forbeslegalcouncil/2017/09/18/is-there-an-echo-in-here-what-you-need-to-consider-about-privacy-protection/#1f59b1df38fd