Fig. 6 | Journal of NeuroEngineering and Rehabilitation

From: A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment

Four representative trials from Experiment One (a–d), showing how the three methods classified gaze events. Gaze angular velocity (computed using Equation 6a) is plotted as a function of time. The top plot in each panel shows the raw data. The three lower plots in each panel show gaze events classified by Manual Classification, Algorithm Classification and SR Classification, respectively. Magenta areas indicate fixations and gold areas indicate saccades. Smooth pursuit of the object on the left is marked in grey and pursuit of the object on the right in red. The black vertical line marks the time point at which the two moving objects appeared in the workspace. Blue areas in the three lower plots indicate unclassified gaze events. All four panels (a, b, c and d) show that the Algorithm Classification gaze events closely matched the Manual Classification gaze events, whereas SR Classification failed to classify fixations on the fixation circles (note the missing magenta areas before the black vertical lines in the last plot of panels b, c and d)
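To give a concrete sense of how gaze events can be labelled from an angular-velocity trace like the one plotted here, the sketch below implements a generic velocity-threshold (I-VT style) classifier. This is not the paper's geometric algorithm or the SR classifier; the function name, the 30 deg/s saccade threshold and the sample velocities are illustrative assumptions only.

```python
import numpy as np

def classify_ivt(velocity_deg_s, saccade_threshold=30.0):
    """Label each gaze sample by a simple velocity threshold (I-VT scheme).

    Samples whose angular velocity exceeds the threshold are labelled
    'saccade'; all others are labelled 'fixation'. Real classifiers
    (including the one described in this article) apply further rules,
    e.g. minimum event durations and smooth-pursuit detection.
    """
    velocity = np.asarray(velocity_deg_s, dtype=float)
    return np.where(velocity > saccade_threshold, "saccade", "fixation")

# Hypothetical angular-velocity samples in deg/s:
labels = classify_ivt([5.0, 12.0, 80.0, 200.0, 10.0])
print(list(labels))  # ['fixation', 'fixation', 'saccade', 'saccade', 'fixation']
```

A single threshold cannot separate fixation from smooth pursuit (both are low-velocity), which is one reason purely velocity-based schemes misclassify pursuit-rich data like the trials shown in this figure.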
