TY - JOUR
T1 - Point-of-care, smartphone-based, dual-modality, dual-view, oral cancer screening device with neural network classification for low-resource communities
AU - Uthoff, Ross D.
AU - Song, Bofan
AU - Sunny, Sumsum
AU - Patrick, Sanjana
AU - Suresh, Amritha
AU - Kolur, Trupti
AU - Keerthi, G.
AU - Spires, Oliver
AU - Anbarani, Afarin
AU - Wilder-Smith, Petra
AU - Kuriakose, Moni Abraham
AU - Birur, Praveen
AU - Liang, Rongguang
N1 - Funding Information:
RL received research funding from the National Institutes of Health, National Institute of Biomedical Imaging and Bioengineering award UH2EB022623 (https://www.nibib.nih.gov/). RDU received research funding from the National Institutes of Health, National Institute of Biomedical Imaging and Bioengineering, Graduate Training in Biomedical Imaging and Spectroscopy grant T32EB000809 (https://www.nibib.nih.gov/training-careers/postdoctoral/ruth-l-kirschstein-national-research-service-award-nrsa-institutional-research/supported-programs). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Thank you to Pier Morgan and the Center for Gamma-ray Imaging (CGRI) for use and operation of the rapid prototype printer, and to Ken Almonte for guidance on the design of the LED driver. Thank you to TIDI Products for manufacturing custom shields for our device.
Publisher Copyright:
© 2018 Uthoff et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
PY - 2018/12
Y1 - 2018/12
N2 - Oral cancer is a growing health issue in a number of low- and middle-income countries (LMIC), particularly in South and Southeast Asia. The described dual-modality, dual-view, point-of-care oral cancer screening device, developed for high-risk populations in remote regions with limited infrastructure, implements autofluorescence imaging (AFI) and white light imaging (WLI) on a smartphone platform, enabling early detection of pre-cancerous and cancerous lesions in the oral cavity with the potential to reduce morbidity, mortality, and overall healthcare costs. Using a custom Android application, this device synchronizes external light-emitting diode (LED) illumination and image capture for AFI and WLI. Data is uploaded to a cloud server for diagnosis by a remote specialist through a web app, with the ability to transmit triage instructions back to the device and patient. Finally, with the on-site specialist’s diagnosis as the gold-standard, the remote specialist and a convolutional neural network (CNN) were able to classify 170 image pairs into ‘suspicious’ and ‘not suspicious’ with sensitivities, specificities, positive predictive values, and negative predictive values ranging from 81.25% to 94.94%.
AB - Oral cancer is a growing health issue in a number of low- and middle-income countries (LMIC), particularly in South and Southeast Asia. The described dual-modality, dual-view, point-of-care oral cancer screening device, developed for high-risk populations in remote regions with limited infrastructure, implements autofluorescence imaging (AFI) and white light imaging (WLI) on a smartphone platform, enabling early detection of pre-cancerous and cancerous lesions in the oral cavity with the potential to reduce morbidity, mortality, and overall healthcare costs. Using a custom Android application, this device synchronizes external light-emitting diode (LED) illumination and image capture for AFI and WLI. Data is uploaded to a cloud server for diagnosis by a remote specialist through a web app, with the ability to transmit triage instructions back to the device and patient. Finally, with the on-site specialist’s diagnosis as the gold-standard, the remote specialist and a convolutional neural network (CNN) were able to classify 170 image pairs into ‘suspicious’ and ‘not suspicious’ with sensitivities, specificities, positive predictive values, and negative predictive values ranging from 81.25% to 94.94%.
UR - http://www.scopus.com/inward/record.url?scp=85058054065&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85058054065&partnerID=8YFLogxK
U2 - 10.1371/journal.pone.0207493
DO - 10.1371/journal.pone.0207493
M3 - Article
C2 - 30517120
AN - SCOPUS:85058054065
SN - 1932-6203
VL - 13
JO - PLoS One
JF - PLoS One
IS - 12
M1 - e0207493
ER -