Silicon Valley startups sometimes seem like they’ve made it their goal to come up with the most dystopian uses of Artificial Intelligence. Earlier this month we had Meta’s racist chatbot and the AI-generated rapper that uses the n-word; before that, back in June, we learned about the Google AI so good it convinced an engineer it was sentient. This time around, Palo Alto-based startup Sanas has introduced to the world an AI with the goal of making foreign call center employees sound accent-neutral, and the effect of making them sound white.
As reported by SFGATE, Sanas is a startup that offers “accent translation” for call center employees, a job that tends to be outsourced to cheaper foreign markets like India and the Philippines. Sanas, which was founded by three Stanford graduates, offers a real-time accent translation service, supposedly to make it easier for call center employees to be understood. It has already received over $30 million in venture capital funding.
“We don’t want to say that accents are a problem because you have one,” Sanas president Marty Sarim told SFGATE. “They’re only a problem because they cause bias and they cause misunderstandings.”
Based on the demo you can try out on Sanas’ website, where you can “hear the magic,” it really does work. The software doesn’t just remove the accent; it replaces the voice with something unsettlingly robotic that approximates a standard American English accent. According to its website, Sanas believes this will allow call center employees to “take back the power of their own voice.”
Sanas’ AI has commonly been compared to the 2018 film Sorry to Bother You, in which the main character, a Black man, adopts a “white voice” in order to garner more sales at his dystopian call center job. While Sanas states that its AI is meant to combat bias, critics assert that “accent translation” is another way to dehumanize an already dehumanizing job.
“On the surface it reflects communication difficulty — people not being able to understand someone else’s speech,” Winifred Poster, a professor of sociology at Washington University in St. Louis told SFGATE. “But, really, it’s coded for a whole bunch of other issues about how accent triggers racism and ethnocentrism.”
The fact that the Sanas AI doesn’t sound human doesn’t help much either. Kiran Mirchandani, a University of Toronto professor whose research focuses on the treatment of Indian call center employees, told SFGATE that people who are already predisposed to dishing out racist abuse to call center employees won’t take kindly to a robotic voice on the phone either.
“Customer racism is likely to increase if workers are further dehumanized when an ‘app’ is placed between worker and customer, especially since there will no doubt be errors made by the app,” she told SFGATE.
Sanas’ president Sarim stressed in his interview with SFGATE that workers will have a choice about whether or not to use the AI’s accent translation. However, those familiar with the exploitation that happens within the foreign call center industry believe if the tech proves to be successful, the workers won’t have much of a choice.
“There is virtually nothing in the labor process of call centers which involves choice by the workers in terms of technology,” Poster told SFGATE. “Already, workers are subject to deeply invasive surveillance, which makes it almost impossible to have an authentic conversation with people on the other end.”