The Web Visual Paired Comparison (VPC-W) was my first project at Emory University. It proposed a web-based test that measures visual recognition memory by observing how a patient views familiar and novel images, an approach that has been shown to be effective in the clinical setting. The project was completed in collaboration with Dr. Jongho Shin, a postdoc at Emory University.
Deficits in memory are the clinical hallmark of Alzheimer's disease. The Visual Paired Comparison (VPC) task is a recognition-memory task that assesses the proportion of time an individual spends viewing a novel picture compared to a recently seen (familiar) picture. The figure above shows a patient tested with VPC using the ASL eye-tracking system (this particular system uses a chinrest to reduce the patient's head movement).
Since both monkeys and humans, infants and adults alike, have an innate preference for novelty, normal individuals spend more time viewing the novel picture if they remember the familiar one. In this project, we evaluate this novelty preference to measure recognition memory. With a delay of 2 minutes between familiarization and test, age-matched controls spend 70% of the time looking at the novel stimulus relative to the familiar stimulus. In contrast, patients with mild cognitive impairment (MCI) do not exhibit preferential looking.
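The novelty-preference score described above is just the fraction of total looking time spent on the novel picture, with 0.5 representing chance. A minimal sketch (the function name and the sample timings are illustrative, not taken from the study):

```python
def novelty_preference(time_on_novel: float, time_on_familiar: float) -> float:
    """Proportion of looking time spent on the novel picture (0.5 = chance)."""
    total = time_on_novel + time_on_familiar
    if total == 0:
        raise ValueError("no viewing time recorded")
    return time_on_novel / total

# Example: 3.5 s on the novel image vs. 1.5 s on the familiar one gives 0.70,
# matching the ~70% preference reported for age-matched controls.
print(novelty_preference(3.5, 1.5))  # 0.7
```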
The VPC-W task is similar to the eye-tracking-based VPC, but the image-viewing interface is specially modified to elicit and capture the subject's image-examination behavior (right figure).
Specifically, two main techniques are used: (1) blurring of the stimulus images and (2) an oval-shaped viewport, or oculus, that tracks the cursor position. As in the VPC task, a subject performing the VPC-W task is shown two images side by side, but the images are blurred so that only the general outline, not the details, is recognizable. The subject directs the location of the viewport (the area of focus) using a computer mouse or touchpad; the viewport reveals the image details by showing the corresponding part of the original, un-blurred image inside the oculus. As illustrated in Figure 2, the viewport is positioned over the right image in both the familiarization and test phases.
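The viewport mechanic can be sketched as two pieces of geometry: deciding which pixels fall inside the oval (and so should be un-blurred), and attributing each cursor sample to the left or right image so viewing time can be accumulated per stimulus. The class, radii, and screen layout below are assumptions for illustration, not the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    cx: float       # cursor (viewport centre) x
    cy: float       # cursor (viewport centre) y
    rx: float = 60  # horizontal radius of the oval, in pixels (illustrative)
    ry: float = 40  # vertical radius, in pixels (illustrative)

    def reveals(self, x: float, y: float) -> bool:
        """True if pixel (x, y) lies inside the oval and should be un-blurred."""
        return ((x - self.cx) / self.rx) ** 2 + ((y - self.cy) / self.ry) ** 2 <= 1.0

def attributed_image(cursor_x: float, screen_width: float) -> str:
    """Attribute a cursor sample to the left or right stimulus image."""
    return "left" if cursor_x < screen_width / 2 else "right"

vp = Viewport(cx=100, cy=100)
print(vp.reveals(120, 110))         # point near the centre: inside the oval
print(attributed_image(700, 1024))  # cursor on the right half of the screen
```

Accumulating `attributed_image` over timestamped cursor samples yields the per-image viewing times that feed the novelty-preference score.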
Cursor trajectory for subject #11 using a mouse.
Cursor trajectory for subject #21 using a touchpad.
However, adapting the VPC task to the Web environment introduces significant challenges for the elderly (the target population for this test), who may have limited prior exposure to computers and, in particular, may not be able to use a computer mouse or touchpad comfortably.
To address this challenge, we also designed a set of calibration tools and tutorials to enhance usability and accessibility.
Average time (in seconds) spent completing the 9 calibration tasks (T1–T9).
A pilot test of VPC-W was performed at the Emory Wesley Woods clinic with 23 elderly patients aged 55–78 (12 MCI patients and 11 age-matched normal controls, or NC). Of these subjects, 21 (91%) were able to successfully complete the task with minimal supervision by the nurse practitioner. For these subjects, VPC-W combined with the associated machine-learning algorithms achieved over 85% accuracy in detecting impaired subjects, which compares favorably to existing human-administered tests.
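The text does not specify which machine-learning algorithms were used. As a minimal, purely illustrative stand-in for the idea, impairment can be flagged when a subject's novelty preference stays near chance (0.5); the 0.55 cutoff and the scores below are hypothetical, not from the study:

```python
IMPAIRMENT_CUTOFF = 0.55  # assumed threshold, not taken from the study

def is_impaired(novelty_preference: float) -> bool:
    """Flag a subject whose novelty preference is near chance level (0.5)."""
    return novelty_preference < IMPAIRMENT_CUTOFF

# Hypothetical per-subject scores: controls near 0.7, MCI patients near chance.
scores = {"NC-1": 0.69, "NC-2": 0.65, "MCI-1": 0.51, "MCI-2": 0.48}
flagged = {subject: is_impaired(p) for subject, p in scores.items()}
print(flagged)  # only the MCI subjects are flagged
```

The actual system presumably uses richer features from the cursor trajectories than this single score.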
In addition, we compared novelty preference between VPC and VPC-W. At the 10-second delay, 13 NC subjects (mean age = 63) viewed the novel picture 65% of the time, which is significantly greater than chance (p < 0.01) and similar to the novelty preference assessed by eye tracking at a comparable delay (67%; Crutcher et al., 2009). At the 1-minute delay, 21 NC subjects (mean age = 61) viewed the novel picture 69% of the time, which is significantly greater than chance (p < 0.001) and again similar to eye tracking at a comparable delay (68%; Crutcher et al., 2009).
| Delay | VPC (eye tracking) | VPC-W |
|---|---|---|
| 10 s (mean age = 63) | 67% | 65% |
| 60 s (mean age = 61) | 68% | 69% |