Fig. 3 | Journal of NeuroEngineering and Rehabilitation


From: User activity recognition system to improve the performance of environmental control interfaces: a pilot study with patients


Multimodal system processing for one ADL in AIDE mode. The user had to perform different actions to execute the corresponding ADLs; in this example, the user had to switch on the TV. Phases a-g show the behavior of both screens during the task. EEG (a) and EOG (b) signals were acquired to control the ECI online and perform ADLs in a virtual house. When the task started (vertical purple line), scanning through the ECI was driven by EOG activity detection [orange line in (b)]: when HOV activity exceeded the threshold [orange dashed line in (b)], the grid marker moved forward (phases a-e). Once the subject stopped at one grid, a task confirmation was needed (vertical black line) and the ECI 'switched off' the remaining grids to indicate this (phase f). Confirmation was performed by the detection of SMR-ERD [red line in (a)], the action was executed, and the ADL finished (vertical dotted purple line). This ADL was performed in one step, i.e. the user only needed to navigate through the last abstraction level to complete the task. Before the experiment, the user was trained in motor imagery (c) and EOG movements (d) to set up the control system with personalized parameters
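The scan-and-confirm logic described in the caption (EOG threshold crossings advance the grid marker; an SMR-ERD detection confirms the selection) can be sketched as a small state machine. This is a hypothetical illustration, not the authors' implementation; the class name, threshold value, and event-handler names are assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class ECIScanner:
    """Hypothetical sketch of the ECI scan/confirm behavior in Fig. 3.

    Horizontal EOG (HOV) activity above a threshold advances the grid
    marker (phases a-e); a detected SMR-ERD event confirms the currently
    selected grid and triggers the ADL (phases f-g).
    """
    n_grids: int            # number of grids in the current abstraction level
    hov_threshold: float    # personalized threshold from the calibration phase
    position: int = 0       # index of the currently highlighted grid
    confirmed: bool = False

    def on_eog_sample(self, hov_activity: float) -> None:
        # Each HOV threshold crossing moves the marker forward one grid.
        if not self.confirmed and hov_activity > self.hov_threshold:
            self.position = (self.position + 1) % self.n_grids

    def on_smr_erd_detected(self) -> int:
        # SMR-ERD confirms the current grid; the ADL action is executed.
        self.confirmed = True
        return self.position

# Usage: two threshold crossings advance the marker to grid 2, then confirm.
scanner = ECIScanner(n_grids=6, hov_threshold=50.0)
for hov in [10.0, 60.0, 55.0, 5.0]:
    scanner.on_eog_sample(hov)
selected = scanner.on_smr_erd_detected()
print(selected)  # -> 2
```

The same structure generalizes to the multi-level case: confirming a grid in one abstraction level would re-instantiate the scanner over the next level's grids.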
