The idea that certain primate vocalizations are information-bearing signals is an old one. In 1892, R. L. Garner used playbacks of monkey vocalizations to deduce that some monkey calls had referential significance: they were what we now recognize as food or alarm calls. But it was much later, only about fifty years ago, that the concepts of 'information' and 'signal' became clear enough to be formalized mathematically by Norbert Wiener and Claude Shannon, giving birth to modern information theory. By isolating and formally defining a quantity termed information, which has surprising affinities with the physicists' concept of entropy, Shannon and Wiener planted the seeds of today's digital world, where diverse types of information can be transformed, stored or transmitted as a pattern of binary digits (or 'bits', a term introduced in Shannon's 1948 paper).

Shannon and Wiener were acutely aware that information (a measurable property of signals) is not to be confused with meaning (which depends on context and interpretation, and exists in the eye of the beholder). Both explicitly set 'meaning' aside as a topic for future work, and it remains formally undefined today.
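To make the measurable side concrete: in the standard formulation, the information carried by a source whose outcomes occur with probabilities $p_i$ is its entropy,

$$H = -\sum_i p_i \log_2 p_i,$$

measured in bits. A fair coin toss, with $p_1 = p_2 = \tfrac{1}{2}$, carries $H = 1$ bit, and the weighted-logarithm form of this expression is precisely the affinity with the physicists' entropy noted above. Nothing in the formula refers to what the outcomes mean; it quantifies only how unpredictable they are.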