Will technology force everyone to be honest, or will people just develop technology to fool the technology? How invasive will this be? How invasive will technologies not even imagined yet become? Are we living inside a technological construct now, like in the movie The Matrix? Some people think so. It's called the Creation. How much more advanced is God? Who can imagine the infinitely advanced?

Long-Promised, Voice Commands Are Finally Going Mainstream
By Alexander Gelfand 06.04.08

Speech technology has long languished in the no-man's land between sci-fi fantasy ("Computer, engage warp drive!") and disappointing reality ("For further assistance, please say or press 1 ...").

But that's about to change, as advances in computing power make voice recognition the next big thing in electronic security and user-interface design.

A whole host of highly advanced speech technologies, including emotion and lie detection, are moving from the lab to the marketplace.

The Federal Financial Institutions Examination Council has issued guidance requiring stronger security than simple ID and password combinations, which is expected to drive widespread adoption of voice verification by U.S. financial institutions in coming years. Ameritrade, Volkswagen and European banking giant ABN AMRO all employ voice-authentication systems already.

Speech recognition systems that can tell if a speaker is agitated, anxious or lying are also in the pipeline.

Computer scientists have already developed software that can identify emotional states and even truthfulness by analyzing acoustic features like pitch and intensity, and lexical ones like the use of contractions and particular parts of speech. And they are honing their algorithms using the massive amounts of real-world speech data collected by call centers.
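The lexical side of that analysis can be sketched in a few lines of Python. The cues below (contraction use and first-person pronouns) and their per-word rates are illustrative assumptions only; the actual research systems described in the article scan hundreds of acoustic and lexical cues and feed them into trained classifiers.

```python
import re

# Two hypothetical lexical cues of the kind the article describes.
# These patterns are illustrative assumptions, not the real feature set
# used by the SRI or Columbia researchers.
CONTRACTION = re.compile(r"\b\w+'(?:t|s|re|ve|ll|d|m)\b", re.IGNORECASE)
FIRST_PERSON = re.compile(r"\b(?:i|me|my|mine|we|our|us)\b", re.IGNORECASE)

def lexical_features(utterance: str) -> dict:
    """Return per-word rates for a couple of simple lexical cues."""
    n_words = max(len(utterance.split()), 1)  # avoid division by zero
    return {
        "contraction_rate": len(CONTRACTION.findall(utterance)) / n_words,
        "first_person_rate": len(FIRST_PERSON.findall(utterance)) / n_words,
    }
```

In a real system, feature vectors like this one, combined with acoustic measurements such as pitch and intensity, would be passed to a statistical classifier trained on labeled true and false statements.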

A reliable, speech-based lie detector would be a boon to law enforcement and the military. But broader emotion detection could be useful as well.

For example, a virtual call-center agent that could sense a customer's mounting frustration and route her to a live agent would save time and money and preserve customer loyalty.

Companies like Autonomy eTalk claim to have working anger- and frustration-detection systems already, but experts are skeptical. According to Julia Hirschberg, a computer scientist at Columbia University, "The systems in place are typically not ones that have been scientifically tested."

In a study funded by the National Science Foundation and the Department of Homeland Security, Hirschberg and several colleagues used software tools developed by SRI to scan statements that were known to be either true or false. Scanning for 250 different acoustic and lexical cues, "We were getting accuracy maybe around the mid- to upper-60s," she says.

That may not sound so hot, but it's a lot better than the commercial speech-based lie-detection systems currently on the market. According to independent researchers, such "voice stress analysis" systems are no more reliable than a coin toss.

It may be a while before industrial-strength emotion and lie detection come to a call center near you. But make no mistake: They are coming. And they will be preceded by a mounting tide of gadgets that you can talk to — and argue with.

Tom Usher

About Tom Usher

Employment: 2008 - present, website developer and writer; 2015 - present, insurance broker.
Education: Arizona State University, Bachelor of Science in Political Science; City University of Seattle, graduate studies in Public Administration.
Volunteerism: 2007 - present, president of the Real Liberal Christian Church and Christian Commons Project.