Ok Google, Can You Repeat That?

Last month, Google launched Duplex, what it calls “a new technology for conducting natural conversations to carry out ‘real world’ tasks over the phone,” in an on-stage keynote presentation at its developer conference. CDT was part of a group of journalists and advocates invited over the past week to experience Duplex hands-on, as Google seeks to relaunch Duplex and address many of the concerns raised earlier. In response to the wide range of public reaction, both expected and unexpected, Google appears to have more fully considered the ethical and privacy critiques it received as it lays out its plan for the development and roll-out of Duplex.

Duplex allows the Google Assistant to place phone calls on a user’s behalf to complete specific tasks. By combining speech recognition, parsing of user intent, and voice synthesis, Duplex lets the Assistant engage with a human over the phone using conversational AI. It is limited to an extremely narrow set of tasks (for now): confirming business holiday hours, making a reservation at a restaurant, and booking an appointment at a hair salon. But it is easy to imagine how this technology could be used in any number of scenarios in which the user may not have the time, desire, or ability to engage in a straightforward transactional conversation like booking an appointment, placing an order, or requesting basic information.

This combination of unencumbered speculation about conversational AI and the magical technical capabilities Google put on display in a choreographed on-stage demo generated significant public interest. Most of the negative reaction centered on the ethics of Google going “too far” by trying to trick the call recipient into thinking that they were talking to a human; concerns were also raised that the user’s private data might be disclosed during those phone calls. Google found itself in the unenviable position of trying to answer questions about ethics and privacy from the deep end of the uncanny valley.

Today, Google is staging a relaunch of sorts for Duplex. This time it is showing that the company can react and iterate quickly in response to public concerns and, more importantly, that it can communicate the thoughtfulness involved in the Duplex development process. The biggest change since the developer conference was Google’s announcement of its AI Principles, which are supposed to guide its “responsible AI innovation.” Transparency and control are now heavily emphasized. At the beginning of every call, Duplex announces its purpose, identifies itself, and states that the call will be recorded. The recipient can even decline to participate in a recorded call, and a business can opt out of Duplex interactions entirely. When that happens, or in any other scenario that Duplex cannot handle, it degrades rather gracefully: it apologizes and ends the call, and a human Google Assistant calls back to complete the request.

Significantly limiting the tasks that Duplex can complete and failing back to human assistants are only temporary measures, however. Google still aims for fully automated completion of user requests, which are likely to become increasingly diverse and complex. Ethical and privacy concerns will be deeply ingrained in those diverse, complex tasks. From a technical perspective, the speech disfluencies (“ums” and “uhs”) and natural-sounding voices lead to a higher success rate in completing the task. From an ethical perspective, those same features may take advantage of and reinforce cultural biases. For example, by restricting the user’s ability to choose the Duplex voice, Google may select a particular dialect or accent purely on the technical basis that it increases the likelihood of successfully making an appointment. However, that increased success rate may rest on a business employee’s existing bias for or against certain cultural groups based on voice alone. Whether Google decides to leverage or challenge existing social biases is a complex internal decision; the gender choices made for several voice assistants may offer insight into that decision-making process.

Debate over participation in the call and over ownership of the call data and metadata may now involve four parties: the human user initiating the request, Google placing the call, the business receiving the call, and the human answering the call. That call data is currently subject to Google’s existing data retention and privacy policies for its users, as well as state and federal call recording laws. However, there is an interesting question as to whether an employee answering the phone should be subject to those user policies, and whether they can or should be able to consent to having their voice recorded, saved, and analyzed by Google as a condition of performing their job duties. Duplex can be a tremendous enabling technology for people with disabilities or for anyone facing a physical or language barrier that makes talking on the phone a challenge. Would an employee or business declining to participate in a Duplex call alienate those users, or perhaps even run afoul of the protections offered by Title III of the Americans with Disabilities Act?

We recently attended Google’s reintroduction of Duplex, which included an ambitious live demo at an actual restaurant with unscripted participants. Duplex was capable of conversing with trained restaurant staff and other untrained call takers (I was certainly in the untrained group) in a natural and effective manner. It even handled my unusual statement, “I’m not sure that I’m allowed to be on a recorded call,” by transferring me to a human Google Assistant on an unrecorded line. This was quite a difference from the pre-recorded calls presented at the developer conference, and a strong counter to speculation about the authenticity of the original demo, the disclosures Duplex makes to call recipients, and the conversational capabilities of Duplex. Google should continue to be just as ambitious in demonstrating its commitment to offering users and third parties transparency and control over their interactions with Google technologies like Duplex.