Fig. 2 | Journal of NeuroEngineering and Rehabilitation

From: A multi-dimensional framework for prosthetic embodiment: a perspective for translational research

Representations of the multi-dimensional prosthetic embodiment spectrum in which a biological hand demarcates the embodiment horizon. a 3D representation of prosthetic embodiment as a function of the degree of interaction with the environment and the degree of integration of volition and multi-sensory information. A static object (e.g., a rubber hand) requires few sensory modalities to integrate correctly for embodiment to arise (e.g., visuo-tactile congruency). At the other end of the spectrum, an object dynamically interacting with the environment (e.g., a dexterous prosthetic hand) requires both volition and all multi-sensory inputs to integrate correctly for embodiment to arise. Between these endpoints lie tools such as chopsticks which, owing to their limited sensory feedback, do not fulfill the sensory-integration criteria for embodiment in a static environment. In a more dynamic environment, however, movements of a tool arising from volition can contribute to a partial yet bounded embodiment experience. The grey bands indicate personal and circumstantial factors that can modulate perceived prosthetic embodiment. b 2D projection of the multi-dimensional spectrum of panel a for more compact visualization. c The ideal one-dimensional embodiment scale represents fully dynamic interaction with the environment, which is arguably the ideal operating condition for prosthetics