I just wanted to emphasize that the basic idea here is quite a simple one: two loosely coupled systems tend to fall into sync, or otherwise settle into a steady state of interaction. Historically the phenomenon was discovered when two pendulum clocks mounted next to each other started beating in time.

Personally, I learned about the concept from this article: Iverson, J. M., & Thelen, E. (1999). Hand, mouth and brain: The dynamic emergence of speech and gesture. Journal of Consciousness Studies, 6(11-12), 19–40. (http://cspeech.ucd.ie/Fred/docs/IversonThelen.pdf)

The article seems quite apropos for the study of non-verbal communication -- indeed, it points to an originary link between non-verbal and verbal communication. I'm finding its ideas useful, at least at a metaphorical level, for thinking about how to build AI systems that use language. I'd assert that this line of work will really take off when we see that it's not just about text mining -- which would cor
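
Coming back to the coupling idea at the top of the post: for anyone who'd rather see the effect than take it on faith, here is a little Python sketch of my own (nothing from the paper, and every parameter value is arbitrary). It's the standard Kuramoto-style setup: two phase oscillators with slightly different natural frequencies, each weakly nudged toward the other, and their phase difference settles down instead of drifting forever.

    import math

    def simulate(omega1=1.00, omega2=1.15, coupling=0.5, dt=0.01, steps=5000):
        """Euler-integrate two weakly coupled phase oscillators and watch
        their phase difference settle (all parameter values are arbitrary)."""
        theta1, theta2 = 0.0, math.pi / 2  # start well out of phase
        for step in range(steps):
            # Each oscillator runs at its own frequency, nudged toward the other.
            d1 = omega1 + coupling * math.sin(theta2 - theta1)
            d2 = omega2 + coupling * math.sin(theta1 - theta2)
            theta1 += d1 * dt
            theta2 += d2 * dt
            if step % 1000 == 0:
                # Wrap the phase difference into (-pi, pi] for readability.
                diff = math.atan2(math.sin(theta2 - theta1), math.cos(theta2 - theta1))
                print(f"t={step * dt:6.1f}  phase difference = {diff:+.3f} rad")

    simulate()

Set coupling=0.0 and the difference just keeps drifting; with even modest coupling the two lock into a fixed, small, nonzero offset -- a "steady state of interaction" rather than perfect unison.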