A new deep transfer learning strategy for the diagnosis of eye-related conditions and diseases
Abstract
Data from the World Health Organization indicate that billions of cases of visual impairment could be avoided, mainly through regular examinations. However, the absence of specialists in basic health units has led to a lack of accurate diagnosis of systemic or asymptomatic eye diseases, increasing cases of blindness. In this context, the present thesis proposes a set of convolutional neural networks trained through a transfer learning process based on 38,727 high-quality ocular fundus images, in order to improve inference on 7,850 low-quality images acquired with low-cost equipment. Public databases such as Eyepacs, Messidor-2, REFUGE, and ODIR were also used to validate the proposed architecture on the ocular conditions of referable diabetic retinopathy, glaucoma, and cataract, achieving AUC results of 0.951 and 0.953 and accuracies of 99.6% and 99.3%, respectively. On low-quality images, the proposed approach achieved accuracies of 87.4%, 90.8%, 87.5%, and 79.1% for classifying cataract, referable diabetic retinopathy, abnormal excavation, and abnormal blood vessels, respectively. Thus, the proposed approach advances the state of the art by: (i) validating the proposed transfer learning strategy for recognizing related ocular conditions and diseases in low-quality images; (ii) using high-quality images obtained with high-cost equipment only for training the predictive models; and (iii) achieving results comparable to the state of the art even with low-quality images. Beyond the fact that the deep transfer learning strategy is more suitable and feasible for deployment by public health systems in emerging and developing countries, the present thesis also proposes a system that provides Grad-CAM images for pathological predictions with confidence above 75%, supporting the decision-making process.
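As a rough illustration of the explainability component mentioned above, the snippet below sketches the core Grad-CAM computation: channel weights are obtained by global-average-pooling the gradients of the class score with respect to the last convolutional feature maps, and a ReLU-rectified weighted sum yields the heatmap. This is a minimal NumPy sketch under assumed array shapes, not the thesis's actual implementation.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (C, H, W) activations of the last conv layer.
    gradients:    (C, H, W) gradients of the class score w.r.t. those maps.
    Returns a (H, W) heatmap normalized to [0, 1].
    """
    # Channel weights alpha_c: global-average-pool the gradients.
    weights = gradients.mean(axis=(1, 2))  # shape (C,)
    # Weighted sum over channels, then ReLU to keep positive evidence only.
    cam = np.maximum((weights[:, None, None] * feature_maps).sum(axis=0), 0.0)
    # Normalize so the heatmap can be overlaid on the fundus image.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Illustrative shapes only: 8 channels over a 7x7 spatial map.
rng = np.random.default_rng(0)
maps = rng.standard_normal((8, 7, 7))
grads = rng.standard_normal((8, 7, 7))
heatmap = grad_cam(maps, grads)
print(heatmap.shape)  # (7, 7)
```

In a full pipeline, such a heatmap would be upsampled to the input resolution and blended with the fundus photograph, and (per the thesis's design) shown only when the model's predicted pathological probability exceeds 75%.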