We developed the New Jersey Institute of Technology Home Virtual Rehabilitation System (NJIT-HoVRS) to facilitate intensive, hand-focused rehabilitation in the home, along with testing simulations designed to give clinicians richer information when performing remote assessments. This paper presents the results of reliability testing examining differences between in-person and remote administration, as well as discriminant and convergent validity testing, of a battery of six kinematic measures collected with NJIT-HoVRS. Two groups of persons with upper extremity (UE) impairments due to chronic stroke participated in two separate experiments.

Data Collection: All data collection sessions included six kinematic tests performed with the Leap Motion Controller. The measures collected were hand opening range, wrist extension range, pronation-supination range, hand opening accuracy, wrist extension accuracy, and pronation-supination accuracy. Therapists who performed the reliability study evaluated system usability with the System Usability Scale (SUS).

Results: Comparing the in-laboratory collection with the first remote collection, the intra-class correlation coefficients (ICC) for three of the six measures were above 0.900, and the other three were between 0.500 and 0.900. Comparing the first and second remote collections, two ICCs were above 0.900 and the other four were between 0.600 and 0.900. The 95% confidence intervals for these ICCs were broad, suggesting that these preliminary analyses need to be confirmed in studies with larger samples. The therapists' SUS scores ranged from 70 to 90, with a mean of 83.1 (SD = 6.4), which is consistent with industry standards for adoption. Kinematic scores differed significantly between the impaired and unimpaired UE for all six measures.
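Test-retest agreement of the kind reported above is typically quantified with a two-way random-effects, absolute-agreement ICC. As a minimal sketch (not the authors' analysis code, and using made-up range-of-motion values in place of the real sessions), ICC(2,1) between an in-lab and a remote session can be computed as:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_sessions) array, e.g. one kinematic score per
    participant for the in-lab and remote collections.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-session means

    # Two-way ANOVA decomposition of the total sum of squares.
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)             # between-subjects mean square
    msc = ss_cols / (k - 1)             # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))  # residual mean square

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical wrist extension ranges (degrees): [in-lab, remote] per subject.
sessions = [[30, 31], [45, 44], [60, 62], [25, 26], [50, 49]]
```

With agreement this close across sessions, `icc2_1(sessions)` falls in the "above 0.900" band discussed in the abstract.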
Five of the six impaired-hand kinematic scores, and five of the six impaired/unimpaired-hand difference scores, demonstrated correlations between 0.400 and 0.700 with Upper Extremity Fugl-Meyer Assessment (UEFMA) scores. Reliability for all measures was acceptable for clinical practice, and discriminant and convergent validity testing suggest that scores on these tests may be meaningful and valid. Further testing in a remote setting is necessary to validate this process.
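Convergent validity coefficients like these are Pearson correlations between each kinematic score and the UEFMA total. A minimal sketch, with hypothetical paired scores standing in for the study data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between a kinematic score and UEFMA totals."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xd = x - x.mean()
    yd = y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

# Hypothetical data: impaired-hand opening range (deg) vs. UEFMA score.
hand_opening = [12, 25, 30, 18, 40, 22, 35]
uefma = [20, 34, 41, 28, 55, 30, 47]
```

A coefficient in the 0.400-0.700 range, as reported in the abstract, would indicate moderate convergence between the kinematic measure and the clinical assessment.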