
Health Editor’s Note: Amazing things happen when everyone works together. AI is being put to use in medicine… Carol
Can Crowdsourcing Lead to Artificial Intelligence Tools for Radiation Oncologists?
By Christina Bennett, MS / Cancernetwork
A worldwide crowd innovation contest incentivized programmers around the globe to create automated artificial intelligence (AI) algorithms for a core task in radiation oncology, a study published in JAMA Oncology showed. The study also highlighted the difficulties of creating AI solutions by crowdsourcing and how far AI still has to go in medicine.
Given the global shortage of radiation oncology expertise and the need to meet that demand, the study researchers created a contest in which programmers from around the world would develop AI algorithms to replicate a time-consuming, manual task performed by radiation oncologists: segmenting lung tumors for radiation therapy targeting. The contest was designed to last 10 weeks, take place online, be carried out across three phases, and include financial prizes totaling $55,000.
To train and test their algorithms, contestants were supplied with a lung tumor data set that had been segmented by a single radiation oncologist. Contestants also received real-time feedback during each phase about the performance of their algorithm, allowing them to modify their algorithms to improve accuracy. Contestants submitted their algorithms at the end of each phase, and prizes were awarded to the top performers.
Overall, 34 of the 564 contestants (6%) who registered for the challenge submitted 45 algorithms across the three phases, with the pool shrinking as the phases progressed. Phase 1 had 29 algorithms, phase 2 had 11, and phase 3 had 5.
The task of lung tumor segmentation for radiotherapy is known to have considerable interobserver variability. Using the variability observed among six experts as a benchmark, the top-performing algorithm from phase 3 performed within the interobserver variability range.
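The article does not specify which metric was used to score the submitted contours against the expert benchmark; overlap measures such as the Dice similarity coefficient are a standard choice for segmentation tasks. The sketch below is only an illustration under that assumption: it scores a hypothetical algorithm contour against a hypothetical expert contour and checks whether the score falls within a made-up inter-expert variability band. All masks, scores, and thresholds are invented for the example.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    overlap = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    return 1.0 if total == 0 else 2.0 * overlap / total

# Toy 2D masks standing in for a lung tumor contour (real data would be 3D CT volumes).
rng = np.random.default_rng(0)
reference = rng.random((64, 64)) > 0.7   # hypothetical expert contour
algorithm = reference.copy()
algorithm[:2, :] = False                 # hypothetical algorithm output, slightly off

# Hypothetical Dice scores between pairs of experts, defining the variability band.
expert_vs_expert = [0.78, 0.81, 0.84, 0.80, 0.79]
lower_bound = min(expert_vs_expert)

score = dice_coefficient(algorithm, reference)
print(f"Algorithm vs. reference Dice: {score:.3f}")
print("Within interobserver range" if score >= lower_bound
      else "Below interobserver range")
```

In a contest setting, a scoring routine along these lines could be run on a held-out test set after each submission to generate the real-time feedback the contestants received, though the study's actual evaluation pipeline is not described here.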
However, although algorithms were indeed generated using this crowd-based approach, ultimately none can be used in their current form to expand radiation oncology expertise to areas of the world that lack radiation oncologists, which was the overarching goal of the study. In part, this is because the algorithms were trained using a data set from a single radiation oncologist and may contain biases.
“They spent approximately $50,000 to do this. It’s not a huge amount of money, but it’s not a sneeze either,” said Thompson. By comparison, the 2017 Kaggle Data Science Bowl, a project that crowdsourced AI algorithms for early lung cancer detection, awarded $1 million in prizes to contestants. For the current crowdsourcing effort, the relatively low-cost design made the approach more sustainable and scalable than other efforts, but the end result was a “suboptimal product” that can’t be used, he said.

Carol graduated from Riverside White Cross School of Nursing in Columbus, Ohio, and received her diploma as a registered nurse. She attended Bowling Green State University, where she received a Bachelor of Arts degree in History and Literature. She attended the University of Toledo, College of Nursing, and received a Master of Nursing Science degree as an Educator.
She has traveled extensively, is a photographer, and writes on medical issues. Carol has three children, RJ, Katherine, and Stephen; one daughter-in-law, Katie; two granddaughters, Isabella Marianna and Zoe Olivia; and one grandson, Alexander Paul. She also shares her life with her husband Gordon Duff, many cats, and two rescues.