Fig. 1 | Journal of NeuroEngineering and Rehabilitation


From: Immersive virtual reality gameplay detects visuospatial atypicality, including unilateral spatial neglect, following brain injury: a pilot study

Fig. 1

The Attention Atlas. a Visual search localisation gameplay. Array elements were positioned on a spherical surface with an origin at the headset position, calibrated at the start of each level [31]. This panel depicts the full field level. b Coordinates. Array elements were presented on a spherical grid. Each game level used a subset of possible positions, which could appear within the central field of view (FOV) or towards or beyond its edge, requiring head and eye movements for target localisation. For the axes level, the most eccentric horizontal and vertical positions fell outside the central FOV. c Cue/array trial structure. Cues and arrays were presented until an element was selected. d Game levels are depicted from the first-person perspective. Stimuli are scaled for clarity. Levels with eight elements presented them on a single ring 15° from central vision. The tutorial, excluded from analysis, was a search for a red “T” among blue “Ls”. Axes was a search for a “T” among “Ls” positioned horizontally or vertically on separate trials. Stimuli were food, cards, and balloons on separate trials. For food, the target and distractors were randomly selected at the beginning of each trial from 121 food icons. The queen of diamonds was the target for cards. For balloons, the target was a balloon without a string located among balloons with strings. Depth presented elements simultaneously at two depths: a near surface (2 m) and a far surface (4 m). Full field presented elements in four concentric rings. Free viewing depicted a low-resolution polygon forest, which surrounded the player in 360°. We instructed players to “look around” and report what they could see.
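For readers wishing to reproduce the stimulus geometry described above (elements on a sphere centred at the headset, rings at fixed angular eccentricity, and near/far surfaces at 2 m and 4 m), the following is a minimal sketch of the placement computation. It is not the authors' implementation; the coordinate convention (x right, y up, z forward) and the function name ring_positions are assumptions for illustration.

    import math

    def ring_positions(eccentricity_deg, n_elements, radius_m):
        # Evenly space n elements on a ring at a fixed angular eccentricity
        # from the straight-ahead direction, on a sphere centred at the
        # headset. Assumed convention: x right, y up, z forward.
        ecc = math.radians(eccentricity_deg)
        positions = []
        for i in range(n_elements):
            theta = 2 * math.pi * i / n_elements  # angle around the line of sight
            x = radius_m * math.sin(ecc) * math.cos(theta)
            y = radius_m * math.sin(ecc) * math.sin(theta)
            z = radius_m * math.cos(ecc)
            positions.append((x, y, z))
        return positions

    # Eight-element levels: one ring at 15 deg eccentricity.
    # The depth level could reuse the same angular layout at two radii
    # (near surface at 2 m, far surface at 4 m).
    near = ring_positions(15, 8, 2.0)
    far = ring_positions(15, 8, 4.0)

Because positions are defined by angular eccentricity rather than screen coordinates, the same layout scales with viewing distance, which is consistent with the caption's description of elements placed on a spherical grid around the headset.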
