
Some folks at icrontic e-mailed us this suggestion:
“Some friends and I had a brief discussion about the technology you are developing. This idea came up: ‘I wonder if they could work with linguistic experts and statisticians to develop databases that essentially mapped common facial expressions to common verbal expressions. For example, you see something cute and say “awwww” … your eyebrows lift and you make the “isn’t that cute” face. If they could automate that and create an API for it along with their lower jaw/mouth/tongue animations, then I think they would be in serious business here.’ A colleague responded: ‘The man you would be looking for is Dr. Paul Ekman and his creation, FACS. He figured out the emotional significance of every facial expression and mapped it to the specific muscles involved.’ So there you have it. Perhaps this idea helps you.”
Yes! There is definitely a need to automate non-verbal facial activity as well, especially in the upper face, and in fact this is something we’re actively working on, based on the correlations between speech and non-speech behavior that you rightly point out. And you’re right – Ekman is definitely the man in terms of understanding the muscular composition of these expressions and their connection to psychological states. From there, the trick is to get reasonable and robust predictions of nonverbal events from speech. We are open to using speech-based information from various levels – acoustics, syntax, lexical semantics – in a statistical framework. In addition, we need to be able to synthesize those events with natural-looking dynamics.
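To make that a little more concrete, here is a toy sketch of the kind of statistical mapping we mean: predict a binary nonverbal event (say, an eyebrow raise) for a stretch of speech from a few speech-derived features. The features, numbers, and model below are purely illustrative placeholders, not our actual pipeline.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-utterance features: [pitch (Hz), energy (dB), emphasized word?]
X_train = np.array([
    [220.0, 62.0, 1],   # high pitch on an emphasized word -> eyebrow raise observed
    [110.0, 48.0, 0],   # low pitch, unstressed -> no raise
    [240.0, 65.0, 1],
    [120.0, 50.0, 0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = eyebrow raise observed, 0 = none

model = LogisticRegression()
model.fit(X_train, y_train)

# Probability of an eyebrow raise for a new stretch of speech.
new_frame = np.array([[230.0, 63.0, 1]])
print(model.predict_proba(new_frame)[0, 1])

In a real system the features would come from many levels at once (acoustics, syntax, lexical semantics), and the training labels from annotated recordings of people talking.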
This is certainly an easier problem than speech (which has insane dynamical properties and also has to be perfectly in sync with an audio channel), but it still demands really good fidelity to be convincing to us humans, who are innately specialized in reading the tiniest movements on faces. It will be a nice complement to our existing lip sync capabilities.
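For the synthesis side, here is an equally rough illustration of what “natural-looking dynamics” means: rather than snapping a brow-raise blendshape on and off, you ramp its weight with a smooth onset and offset. The raised-cosine shape and the timing constants below are just illustrative choices, not tuned values from our system.

import numpy as np

def au_envelope(duration_s, onset_s=0.15, offset_s=0.25, fps=30, peak=1.0):
    """Per-frame activation weights for one facial event, with smooth ramps."""
    n = int(round(duration_s * fps))
    t = np.arange(n) / fps
    w = np.full(n, peak)
    # Smooth onset: raised-cosine ramp from 0 up to peak.
    rising = t < onset_s
    w[rising] = peak * 0.5 * (1.0 - np.cos(np.pi * t[rising] / onset_s))
    # Smooth offset: raised-cosine ramp from peak back down to 0.
    falling = t > duration_s - offset_s
    w[falling] = peak * 0.5 * (1.0 - np.cos(np.pi * (duration_s - t[falling]) / offset_s))
    return w

# Drive, say, an AU1+AU2 brow-raise blendshape over 0.8 seconds.
weights = au_envelope(duration_s=0.8)
print(weights.round(2))

Getting those onsets, holds, and decays to look right, and to line up sensibly with what is being said, is where the real work lies.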
Some other near-term strategies for the upper face are also in the works, so stay tuned!