Crowdsourcing has become an increasingly popular way to develop machine learning algorithms to solve many clinical problems in a variety of diseases. Today at the American College of Rheumatology (ACR) annual meeting, a multi-center team led by a researcher from the Hospital for Special Surgery (HSS) presented the results of the RA2-DREAM Challenge, a collective effort focused on developing better methods to quantify joint damage in people with rheumatoid arthritis (RA).
Joint damage in people with RA is currently measured by visual inspection and detailed scoring of x-ray images of the small joints of the hands, wrists and feet. This includes both narrowing of the joint space (which indicates loss of cartilage) and bone erosions (which indicate damage from invasion of the inflamed joint lining). The scoring system requires specially trained experts and is time-consuming and expensive. According to the study’s lead author, S. Louis Bridges, Jr., MD, PhD, chief medical officer and chair of the Department of Medicine at HSS:
“If a machine learning approach could provide a quick and accurate quantitative score estimating the degree of joint damage in the hands and feet, it would greatly aid clinical research. For example, researchers could analyze data from electronic health records, genetic tests and other research tests to find biomarkers associated with progressive damage. Having to visually inspect all of the images ourselves would be tedious, and outsourcing it is prohibitively expensive.”
S. Louis Bridges, Jr., MD, PhD, Chief Medical Officer and Chairman, Department of Medicine, HSS
“This approach could also help rheumatologists by quickly assessing whether there is progression of damage over time, which would lead to a change in treatment to avoid further damage,” he added. “This is really important in geographic areas where expert musculoskeletal radiologists are not available.”
To meet the challenge, Dr. Bridges and colleagues partnered with Sage Bionetworks, a nonprofit organization that helps researchers create Dialogue on Reverse Engineering Assessment and Methods (DREAM) challenges. These competitions focus on developing innovative artificial intelligence tools for the life sciences. The investigators issued an open call for entries, with prizes awarded to the winning teams. Competitors included computer scientists, computational biologists and medical scientists; none was a radiologist with expertise or training in reading x-ray images.
For the first part of the challenge, a set of images was provided to the teams, along with known scores that had been generated visually. These were used to train the algorithms. Additional image sets were then provided so that competitors could test and refine the tools they had developed. In the final round, a third set of images was given without scores, and the contestants estimated the amount of joint space narrowing and erosion. Submissions were judged on how faithfully they reproduced the visually generated benchmark scores. Twenty-six teams submitted algorithms, and there were 16 final submissions. In total, the contestants received 674 sets of images from 562 different RA patients, all of whom had participated in previous research studies funded by the National Institutes of Health and led by Dr. Bridges. In the end, four teams were named top performers.
For the organizers of the DREAM Challenge, it was important that any scoring system developed under the project be freely available rather than proprietary, so that it could be used free of charge by researchers and clinicians. “Part of the appeal of this collaboration was that it’s all in the public domain,” Dr. Bridges said.
Dr. Bridges explained that more research and further development of the computational methods are needed before the tools can be widely used, but the current results show that this type of approach is feasible. “We still have to refine the algorithms, but we are much closer to our goal than before the Challenge,” he concluded.