The technical term is glossokinetic potential (GKP). As you move your tongue around your mouth, it changes electrical potentials all over your head. Making out with someone isn't likely to result in a cerebral lightning storm, but the changes are nonetheless detectable via EEG. This detectability raises the interesting possibility of using tongue movements to control machines, like computers or wheelchairs.
Tongue input technology based on GKPs is being actively pursued by computer scientist Yunjun Nam and colleagues at the Laboratory for Advanced Brain Signal Processing in Tokyo. In the current issue of IEEE Systems, Man, and Cybernetics Magazine, they describe a tongue-machine interface that can already direct a motorized wheelchair and that shows promise for silent speech recognition, i.e. soundless talking.
The glossokinetic potential is usually considered an EEG "artifact." Such artifacts arise from many sources not directly related to brain activity, such as the pulsing of blood vessels, eye movements, and twitches of the scalp muscles themselves. They are usually treated as pitfalls in EEG interpretation, for the simple reason that they obscure actual brain activity.
But Nam and co., who are clearly tongue fans, see something else, something useful. The tongue is capable of a wide range of motion and, thus, is capable of representing a wide range of information.
"The human tongue is a special organ with dynamic mobility," Nam writes. "One may test the following tongue movements to realize its variety. The tongue can be bent up/down, moved left/right, stuck out forward/retracted backward, rolled clockwise/ counterclockwise, flattened, or rounded. Some people even can fold the lateral edges of the tongue upward to form a tube, commonly referenced as a 'tongue-rolling' trait in old-fashioned introductory genetics courses."
The reason the tongue has this electrical effect is simply that the tip of the tongue carries a negative electric charge with respect to its root. Consequently, when the tongue touches surrounding tissues, the potential drops at the point of contact. By following these patterns of decreased potential in EEG data, Nam and co. found that they could trace actual tongue positions, a useful ability, as it turns out.
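A toy sketch of that idea: if a contact on one side of the mouth drags down the potential at nearby electrodes, a simple left/right asymmetry score over a few channels can hint at where the tongue is touching. To be clear, the electrode names, baseline values, and threshold below are illustrative assumptions, not details from Nam's paper.

```python
# Toy sketch: guess which side of the mouth the tongue is touching from a
# lateralized drop in EEG potentials. Channel names (T7/FT7 left, T8/FT8
# right), units, and the threshold are made up for illustration.

def lateral_contact(potentials, baseline, threshold=2.0):
    """Return 'left', 'right', or 'none' from microvolt readings.

    potentials, baseline: dicts mapping channel name -> microvolts.
    A tongue contact shows up as a *drop* relative to baseline, so we
    measure how far each side has fallen below its resting level.
    """
    left_drop = sum(baseline[ch] - potentials[ch] for ch in ("T7", "FT7"))
    right_drop = sum(baseline[ch] - potentials[ch] for ch in ("T8", "FT8"))
    if max(left_drop, right_drop) < threshold:
        return "none"  # no side has dropped enough to call a contact
    return "left" if left_drop > right_drop else "right"

baseline = {"T7": 0.0, "FT7": 0.0, "T8": 0.0, "FT8": 0.0}
reading = {"T7": -3.0, "FT7": -2.5, "T8": -0.5, "FT8": -0.2}  # big drop on the left
print(lateral_contact(reading, baseline))  # -> left
```

A real system would work on filtered, multi-channel time series rather than single snapshots, but the core signal is the same: a localized dip in potential where tongue meets tissue.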
The first step in Nam's research was to show that tongue movements could be useful at all. This was done through a series of experiments in which subjects responded to visual cues by moving their tongues this way or that while their EEG responses were recorded. Crucially, the team found that the response lag of the tongue interface is roughly on par with the time college-age individuals normally need to recognize a visual cue. They then demonstrated a practical application by building a "tongue rudder" that tracks horizontal tongue movements and translates them into directional control for a wheelchair.
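The "tongue rudder" idea, turning a left/right tongue position into a steering command, can be sketched as a differential-drive mapping: steer toward a side by slowing that side's wheel. The scaling and the wheel model below are assumptions for illustration, not details from the paper.

```python
def rudder_to_wheels(lateral, forward_speed=0.5):
    """Map a tongue 'rudder' position in [-1, 1] (-1 = far left,
    +1 = far right) to (left_wheel, right_wheel) speeds for a
    differential-drive wheelchair. Pushing the rudder right slows
    the right wheel, so the chair turns toward the tongue's side.
    """
    lateral = max(-1.0, min(1.0, lateral))  # clamp noisy estimates
    left = forward_speed * (1.0 + lateral)
    right = forward_speed * (1.0 - lateral)
    return left, right

print(rudder_to_wheels(0.0))  # -> (0.5, 0.5): straight ahead
print(rudder_to_wheels(1.0))  # -> (1.0, 0.0): pivot right
```

Feeding this mapping a continuous lateral estimate, rather than discrete left/right commands, is what makes the control feel like a rudder rather than a keypad.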
Still, perhaps the most interesting potential application of the tongue-machine interface has to do with speech. Capturing this sort of information requires expanding from the horizontal tongue tracking of the wheelchair system to a system that also captures vertical and forward-backward movements, both of which play large roles in speech production. In further experiments, Nam and co. were indeed able to identify the related EEG activity.
"A fast and reliable technique to trace tongue movements is important for speech scientists who try to understand how speech is controlled by the tongue or how the tongue is disrupted in various speech disorders," the paper explains. "GKP can provide a simple and cost-effective method to trace tongue movements that is able to detect the contact of the tongue with other articulatory organs."
Helping people is cool and all, but just imagine the VR applications of a tongue-machine interface.