Siri, Cortana, Alexa, and Google Now are digital (or virtual) assistants offered by Apple, Microsoft, Amazon, and Google. These assistants deliver real convenience while raising privacy concerns, since the user's data is recorded and stored every time an assistant is used. A fitting metaphor is a trail of breadcrumbs left behind each time you ask a question or even order flowers after a breakup. Even though privacy is an issue, these digital assistants are on the rise. Concerns about someone listening in on our calls are nothing new, but digital assistants can do more than record your voice: they can capture everything from your contacts and browsing history to personal preferences, such as your favorite florist for break-up flowers. They use this captured data to pull up traffic or weather reports en route to your destination or to suggest nearby restaurants. Those helpful features are the heart of the dilemma: how do you deliver convenience without sacrificing privacy and security? Saved queries are tied to user accounts, painting an accurate picture of your habits, travels, and preferences, which is valuable information to third parties.
While much of the information from our daily lives seems boring and inconsequential, it's actually a goldmine to the companies listening to and storing this data, with far-reaching consequences. For starters, once a digital assistant is installed in your home and on your devices, information about you is being collected and transmitted to a third party, so it's assumed that no reasonable expectation of privacy exists. Under US Fourth Amendment case law, information you knowingly allow a device to transmit to a third party generally loses that protection, and by consenting to the transmission you've also limited your recourse under the Electronic Communications Privacy Act.
So when are these assistants recording? Who can access the recordings, and can they be deleted? A particular sticking point is the privacy rights of people who aren't the device's owner and never agreed to anything, yet find themselves as guests in a room where their voices are being picked up.
As digital assistants become more popular, concerns about how data is collected, transmitted, and stored will grow with them. Privacy advocacy groups will be monitoring, and advocating for, how these metaphorical breadcrumbs are used in the future.
We'd love to know if you're using a digital assistant in your home. If so, have you changed any of the privacy settings, or left them at the factory defaults?