Fig. 1 | Journal of NeuroEngineering and Rehabilitation
From: MEG-based neurofeedback for hand rehabilitation


Schematic of the BCI used to translate sensorimotor rhythms (SMR) into proportional control of grasping. Beginning in the upper left: first, the power spectrum of data recorded from 36 sensorimotor MEG sensors (shown on a top-down view of the MEG helmet) is computed using 300 ms sliding windows. A mask is applied to these features to remove any components that did not exhibit desynchronization during calibration. A linear decoder then applies weights (W) to the neural signal (N) to compute a hand-velocity value (V_H). The velocity output from the decoder is scaled by a gain (g) to ensure movement speeds are appropriate for the task. The previous hand position (an image from the video sequence) is then updated, toward more closed or more open within the range of motion (ROM), based on the scaled velocity command. The image representing the desired aperture is chosen from 25 possible images; the progressive change of images appeared to participants as a grasping movie with a 76 ms refresh rate.
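The decoding chain in the caption (masked spectral features, linear decoder W applied to neural signal N, gain g, image-index update within the ROM) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all variable names, shapes, the gain value, and the random placeholder features are assumptions; the 300 ms spectral-feature extraction itself is not shown.

```python
import numpy as np

N_SENSORS = 36   # sensorimotor MEG sensors (per the caption)
N_IMAGES = 25    # discrete hand-aperture images spanning the ROM
rng = np.random.default_rng(0)

# Hypothetical spectral features: one band-power value per sensor,
# as would come from a 300 ms sliding-window power spectrum.
features = rng.standard_normal(N_SENSORS)

# Mask drops components that showed no desynchronization during calibration.
mask = rng.random(N_SENSORS) > 0.3

# Linear decoder: hand velocity V_H from weights W applied to neural signal N.
W = rng.standard_normal(N_SENSORS)
v_h = W[mask] @ features[mask]

# Gain g scales the decoder output to task-appropriate movement speeds
# (value here is arbitrary).
g = 0.05
v_scaled = g * v_h

# Update the hand position within the ROM and select the nearest of the
# 25 aperture images; displayed sequentially, these form the grasping movie.
position = 0.5  # normalized aperture: 0 = fully closed, 1 = fully open
position = float(np.clip(position + v_scaled, 0.0, 1.0))
image_index = int(round(position * (N_IMAGES - 1)))
print(image_index)
```

Each 76 ms display refresh would repeat the last three steps with the newest feature window, so the selected image tracks the decoded velocity over time.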
