Fig. 2 | Journal of NeuroEngineering and Rehabilitation

From: Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

Shared control system diagram and robot testing setup. a System diagram for the vision-guided shared control. The blue boxes show the BMI system decoding endpoint translational and grasp velocity. The green boxes show the components of the vision-guided robotic system for grasping. Without shared control, only the output of the BMI system was used to command the arm; with shared control, the control signal of the vision-guided system was blended with that of the BMI system to create the final robot command. b The 7.5 cm cube (yellow) and the target box (clear box) were positioned on the table, as shown, to start the ARAT trials. The subject sat approximately 1 m to the left of the robot. c An example of the central cross-section of the grasp envelope for a stable grasp position on a 7.5 cm cube is outlined by the blue dotted line. The shading shows the gradient of shared control (α value), with white areas completely controlled by the BMI user and darker areas having more robot control. d A trial progression schematic showing when translation and grasp control are under BMI control (blue) or robot control (green). Wrist orientation was always maintained in a neutral posture under computer control.
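The blending in panel a can be illustrated with a minimal sketch. Assuming a convex (linear) blending law in which α = 0 gives the BMI user full control (white regions in panel c) and α = 1 gives the autonomous system full control, the final command could be computed as below; the function name, velocity values, and the linear form itself are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

def blend_command(v_bmi, v_robot, alpha):
    """Blend BMI-decoded and vision-guided velocity commands.

    alpha = 0: fully user-controlled (white regions in panel c);
    alpha = 1: fully robot-controlled (darkest regions).
    Linear blending is an assumption for illustration; in the paper,
    alpha varies over a grasp envelope around the target object.
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return ((1.0 - alpha) * np.asarray(v_bmi, dtype=float)
            + alpha * np.asarray(v_robot, dtype=float))

# Hypothetical endpoint translational velocities (m/s)
v_bmi = [0.05, 0.00, -0.02]    # decoded from the user's neural activity
v_robot = [0.03, 0.01, -0.04]  # vision-guided controller steering toward a stable grasp
print(blend_command(v_bmi, v_robot, alpha=0.4))  # -> [ 0.042  0.004 -0.028]
```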
