In fact, a complete dissociation between input and output, or between sensory and motor channels, is meaningless, because of the tight coupling between input and output in the human information processing system. First of all, perception of an external world is meaningless in a system which cannot perform actions on that world. Second, it can be argued that none of the sensory modalities is completely independent of motor output. Vision relies to such an extent on the motor output of the oculomotor muscles that a stationary image will disappear within a second after paralysis of these muscles (due to the absence of optic flow). Hearing depends in two ways on motor output: head orientation helps in sound localization, and the dynamic range of sound perception is regulated by tiny muscles stretching the tympanic membrane. Proprioception is controlled by the gamma-efferents, through which the central nervous system directly regulates the sensitivity of the muscle spindle receptors. Indeed, for each sensory modality or sub-modality it is possible to identify one or more reflexes, which are (more or less automatic) control systems, either open-loop (such as the vestibulo-ocular reflex) or closed-loop (as in the stretch reflex). These reflexes (i.e. their appropriate activation/stimulation by an artificial system) are important for virtual reality but not necessarily for advanced (multimodal) man-machine interaction.
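The contrast between the two reflex architectures mentioned above can be sketched in a few lines of code. This is a purely illustrative model, not a physiological one; the function names, gains, and units are assumptions made for the example.

```python
def open_loop_vor(head_velocity, gain=1.0):
    """Open-loop control (as in the vestibulo-ocular reflex): the eye
    command is computed directly from the sensed head velocity, with no
    feedback from the resulting eye movement."""
    return -gain * head_velocity  # counter-rotate the eyes

def closed_loop_stretch(muscle_length, target_length, gain=0.5):
    """Closed-loop control (as in the stretch reflex): the motor command
    is driven by the error between sensed and desired muscle length."""
    error = target_length - muscle_length
    return gain * error  # corrective contraction proportional to the error

# A head turn of +10 deg/s yields an eye command of -10 deg/s (open loop);
# a muscle stretched 2 units past its target yields a corrective command
# proportional to that error (closed loop).
eye_cmd = open_loop_vor(10.0)
muscle_cmd = closed_loop_stretch(muscle_length=12.0, target_length=10.0)
```

The essential difference is visible in the signatures: the open-loop controller never sees the variable it is controlling, while the closed-loop controller acts only on the error signal.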
In any case, the strongest form of integrated perception and action occurs in somesthesia, kinesthesia, the sense of touch, and haptic perception. Many people (Aristotle as well as his followers, probably including Muller) take for granted that the sense of touch, as the perceptual channel which operates while we touch and manipulate objects, has to do only with the skin, forgetting the crucial role of the muscle/joint receptors. Pure stimulation of the skin without concurrent stimulation of joint/muscle receptors, which is being attempted in some ``advanced'' data gloves, is highly non-physiological and virtually pointless from the man-machine point of view. The concept of haptic perception captures the essence of this problem, although it has not yet acquired official status in neurobiology; Kandel & Schwartz, for example, speak of the ``Somatic Sensory System'', which includes, in an integrated way, pain, thermal sensation, touch-pressure, position sense, and kinesthesia. The main point is that in the tactile exploration of an object it is the ``conjunction'' of information from skin, muscle, and joint receptors, not their ``disjunction'', which is essential. Slight misalignments of the different components can destroy the internal coherence (the Gestalt) of the haptic-perceptual process. Kandel & Schwartz, unlike Shepherd, do not define ``sensory modalities'' per se but subdivide the analysis of sensory-perceptual processes into three main ``Systems'': the somatic, visual, and auditory sensory systems. They consider the ``sense of balance'' a component of the motor control system and omit smell and taste, mainly because of their limited cortical representations. We suggest dropping the term ``sense of touch'' because it is misleading: it can be interpreted either in an all-inclusive or in an extremely selective way.
It is better to distinguish, along with Shepherd, between ``somesthesia'' and ``kinesthesia'' and to define ``haptic perception'' as the conjunction of the two: not a mere sum, but a kind of integration process which builds a common internal representation of space and objects. From this point of view, haptic perception is multimodal in itself.
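The integration idea above, and the way misalignment between channels destroys coherence, can be sketched as a toy fusion model. Everything here is a hypothetical illustration: the cutaneous (skin) and kinesthetic (muscle/joint) channels are reduced to two noisy scalar estimates of a contact position, fused by a simple weighted average, with an arbitrary tolerance standing in for the conditions under which the two channels still bind into a single percept.

```python
def fuse(cutaneous, kinesthetic, w_cut=0.5):
    """Build a single haptic estimate as a weighted combination of the
    cutaneous and kinesthetic channel estimates (a conjunction, not a
    mere disjunction, of the two sources)."""
    return w_cut * cutaneous + (1 - w_cut) * kinesthetic

def coherent(cutaneous, kinesthetic, tolerance=1.0):
    """The two channels only bind into a unified percept (a Gestalt)
    when their estimates agree within some tolerance; a larger
    misalignment breaks the internal coherence."""
    return abs(cutaneous - kinesthetic) <= tolerance

# Aligned channels fuse into one coherent estimate...
estimate = fuse(2.0, 2.5)          # single haptic position estimate
ok = coherent(2.0, 2.5)            # channels agree: coherent percept
# ...while a slight misalignment beyond tolerance destroys the binding.
broken = coherent(2.0, 4.0)        # channels disagree: no unified percept
```

The point of the sketch is only structural: the haptic representation is computed from both channels jointly, so it has no meaning when either channel is missing or when the two are misaligned, which is exactly why skin-only stimulation in data gloves falls short.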