Our client needed to evaluate gaze tracking as a component of their cognitive assessment platform. Specifically, they needed a mobile application that could assess cognitive functioning in real time, distinguishing healthy from impaired behavior across dementia, concussion, and intoxication patient cohorts.
Our team designed a tablet-based application that records how well a person tracks a visual stimulus (a nystagmogram), using healthy, non-concussed individuals as the baseline. We performed 2D and 3D feature engineering on the videos to extract facial landmarks, with corrections for head position, orientation, and angle. We then trained a convolutional neural network to learn where the gaze is focused and to score a new individual's ability to track the visual stimulus. The performance-tuned models delivered responses in real time, within six degrees of accuracy, on a mobile platform.
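To illustrate the kind of pipeline this involves, here is a minimal sketch of two of the steps described above: correcting for head pose from 2D facial landmarks (via OpenCV's solvePnP against a generic 3D face model) and regressing gaze direction with a small convolutional network. The specific landmark set, 3D reference points, and network architecture are illustrative assumptions for this sketch, not the client deliverable.

```python
# Illustrative sketch: head-pose estimation from 2D facial landmarks, plus a
# small CNN that regresses gaze yaw/pitch from an eye-region crop. The 3D face
# model, landmark ordering, and network shape are assumptions for demonstration.
import numpy as np
import cv2
import torch
import torch.nn as nn

# Generic 3D reference points (mm) for six stable facial landmarks:
# nose tip, chin, left/right eye outer corners, left/right mouth corners.
MODEL_POINTS_3D = np.array([
    [0.0,      0.0,    0.0],   # nose tip
    [0.0,   -330.0,  -65.0],   # chin
    [-225.0, 170.0, -135.0],   # left eye outer corner
    [225.0,  170.0, -135.0],   # right eye outer corner
    [-150.0, -150.0, -125.0],  # left mouth corner
    [150.0,  -150.0, -125.0],  # right mouth corner
], dtype=np.float64)


def estimate_head_pose(landmarks_2d: np.ndarray, frame_size: tuple):
    """Return (rotation_vector, translation_vector) for the head.

    landmarks_2d: (6, 2) pixel coordinates matching MODEL_POINTS_3D order.
    frame_size: (height, width) of the video frame.
    """
    h, w = frame_size
    focal_length = w  # rough approximation when camera intrinsics are unknown
    camera_matrix = np.array([
        [focal_length, 0, w / 2],
        [0, focal_length, h / 2],
        [0, 0, 1],
    ], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
    ok, rvec, tvec = cv2.solvePnP(
        MODEL_POINTS_3D, landmarks_2d.astype(np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("solvePnP failed to converge")
    return rvec, tvec


class GazeNet(nn.Module):
    """Small CNN mapping a grayscale eye crop plus head pose to gaze yaw/pitch."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        # 64 image features + 3 head-pose angles -> 2 gaze angles (yaw, pitch)
        self.head = nn.Sequential(nn.Linear(64 + 3, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, eye_crop: torch.Tensor, head_pose: torch.Tensor) -> torch.Tensor:
        x = self.features(eye_crop).flatten(1)
        return self.head(torch.cat([x, head_pose], dim=1))


if __name__ == "__main__":
    # Dummy run: one 36x60 eye crop and a 3-element head-pose vector.
    net = GazeNet()
    eye = torch.randn(1, 1, 36, 60)
    pose = torch.randn(1, 3)
    print(net(eye, pose).shape)  # torch.Size([1, 2]) -> predicted (yaw, pitch)
```

Conditioning the gaze regressor on the estimated head pose, as sketched here, is one common way to make the predicted gaze angle robust to changes in head position and orientation; a production mobile deployment would additionally quantize or prune the network to keep inference within real-time budgets.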