Scientists at the NASA Ames Research Center are taking advantage of the nerve activity that occurs near the throat when humans speak to gain information about what a person is saying. The researchers have shown that subauditory, or silent, electrical signals in the throat can be tapped for speech recognition interfaces and communications. They capture the signals through two pairs of sensors placed under the chin and on either side of the throat, then use a computer to interpret them. These nerve signals, the muscle-control commands the brain sends to the tongue and vocal cords, are present whether a person speaks audibly or silently.
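The interpretation step described above, matching a captured signal against known word patterns, can be sketched in miniature. This is a purely illustrative assumption: the template-matching approach, the function names, and the toy data below are not NASA's actual algorithm, just a minimal stand-in for "a computer interprets the signals."

```python
# Hypothetical sketch: classify a captured nerve-signal trace by comparing
# it against stored per-word templates. The matching method and all data
# here are illustrative assumptions, not NASA's actual system.
import math

def distance(a, b):
    """Euclidean distance between two equal-length signal traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(signal, templates):
    """Return the word whose stored template is closest to the signal."""
    return min(templates, key=lambda word: distance(signal, templates[word]))

# Toy templates standing in for averaged sensor readings per word.
templates = {
    "stop": [0.1, 0.9, 0.8, 0.2],
    "go":   [0.7, 0.1, 0.3, 0.9],
}

captured = [0.15, 0.85, 0.75, 0.25]  # noisy reading resembling "stop"
print(classify(captured, templates))  # → stop
```

A real system would work on much longer, multi-channel sensor streams and use statistical pattern recognition rather than raw distance to a single template, but the structure, captured signal in, best-matching word out, is the same.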