Rheumatoid arthritis (RA) is a condition for which mobile applications have the potential to improve patient outcomes. However, the results of a recent review showed that there is a lack of high-quality applications that can be used to facilitate the management of patients with the disease (Grainger R, et al. JMIR Mhealth Uhealth. 2017;5:e7).
“The findings of this study suggest that currently available RA apps [applications] for RA disease activity monitoring are of variable quality and generally do not comply with RA management guidelines,” reported Rebecca Grainger, BMedSc, MBChB, FRACP, PhD, Senior Lecturer, Rehabilitation Teaching and Research Unit, Department of Medicine, University of Otago Wellington, New Zealand, and colleagues.
The researchers assessed the functionality and quality of applications for monitoring RA disease activity by identifying relevant applications available through iTunes and Google Play, comparing their features against the RA disease activity monitoring recommendations in the American College of Rheumatology (ACR) and European League Against Rheumatism (EULAR) treatment guidelines, and rating application quality with the recently developed Mobile Application Rating Scale (MARS).
When rating the applications, Dr Grainger and colleagues considered whether they included a validated composite disease activity (CDA) measure and whether they offered features for recording results for future reference. Two independent reviewers rated each application using MARS, which covers engagement, functionality, aesthetics, and information quality; each category is graded on a scale of 1 to 5, with 1 indicating inadequate and 5 excellent.
Of the 19 applications included in the review, 14 included ≥1 validated instrument for measuring RA disease activity, 11 allowed users to enter a joint count (7 of these used the standard 28 swollen and tender joint count), 8 included ≥1 ACR- and EULAR-recommended composite disease activity measure, and 10 offered data storage and retrieval. Only 1 application (ie, Arthritis Power) included an RA CDA measure and also tracked data; however, it did not use the standard 28 tender and swollen joint count.
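To illustrate what such a validated CDA measure computes, the following sketch implements the widely used DAS28-ESR, one of the composite scores built on the standard 28 tender and swollen joint counts. It is offered as an illustration of the kind of calculation these applications embed, not as the scoring method of any specific application reviewed in the study; the function name and thresholds in the comments follow common clinical convention.

```python
import math

def das28_esr(tjc28: int, sjc28: int, esr: float, patient_global: float) -> float:
    """DAS28-ESR: a composite RA disease activity score combining the
    28 tender joint count (tjc28), 28 swollen joint count (sjc28),
    erythrocyte sedimentation rate (esr, mm/h), and the patient's global
    health assessment (patient_global, 0-100 visual analog scale)."""
    return (0.56 * math.sqrt(tjc28)
            + 0.28 * math.sqrt(sjc28)
            + 0.70 * math.log(esr)
            + 0.014 * patient_global)

# Conventional interpretation thresholds:
#   <2.6 remission; 2.6-3.2 low; >3.2-5.1 moderate; >5.1 high disease activity
score = das28_esr(tjc28=4, sjc28=2, esr=20, patient_global=40)
print(round(score, 2))  # a score in the moderate-activity range
```

An application acting as a "simple calculator," as the authors describe, needs only inputs like these from the clinician, whereas a patient-facing tracker would also need to store each score with its date for longitudinal review.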
The median overall MARS score was 3.41 of 5. Of the 6 applications with an overall MARS rating of ≥4, only 1 included a CDA score endorsed by ACR and EULAR; however, that application could not track data. Furthermore, Dr Grainger and colleagues identified no application that scored ≥4 overall on MARS and included all ACR- and EULAR-endorsed disease activity instruments.
“This could be because apps [applications] are designed with either people with RA or rheumatologists as target users where patients do not usually perform joint counts and doctors would not usually need to store patient data in a mobile phone,” Dr Grainger and colleagues posited.
Based on the results of their study, Dr Grainger and colleagues asserted that current mobile applications available to assist in the management of patients with RA are configured as simple calculators for rheumatologists or as data tracking tools for patients, and the latter do not uniformly collect data using validated instruments. They also noted that the development of more appropriate, high-quality applications will require a multidisciplinary effort.
“Developing apps [applications] that are attractive, engaging, simple to use, and having functionalities relevant to the clinical management of the health condition will require collaboration between rheumatologists, people with RA, app developers, and health systems, and due consideration of local regulatory environment, health service delivery, and user experience,” Dr Grainger and colleagues concluded.