Understanding inherent image features in CNN-based assessment of diabetic retinopathy

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Understanding inherent image features in CNN-based assessment of diabetic retinopathy. / Reguant, Roc; Brunak, Søren; Saha, Sajib.

In: Scientific Reports, Vol. 11, 9704, 2021.

Research output: Contribution to journal › Journal article › Research › peer-review

Harvard

Reguant, R, Brunak, S & Saha, S 2021, 'Understanding inherent image features in CNN-based assessment of diabetic retinopathy', Scientific Reports, vol. 11, 9704. https://doi.org/10.1038/s41598-021-89225-0

APA

Reguant, R., Brunak, S., & Saha, S. (2021). Understanding inherent image features in CNN-based assessment of diabetic retinopathy. Scientific Reports, 11, [9704]. https://doi.org/10.1038/s41598-021-89225-0

Vancouver

Reguant R, Brunak S, Saha S. Understanding inherent image features in CNN-based assessment of diabetic retinopathy. Scientific Reports. 2021;11. 9704. https://doi.org/10.1038/s41598-021-89225-0

Author

Reguant, Roc ; Brunak, Søren ; Saha, Sajib. / Understanding inherent image features in CNN-based assessment of diabetic retinopathy. In: Scientific Reports. 2021 ; Vol. 11.

BibTeX

@article{cdd91657b50f4f7eace2e6239d069269,
title = "Understanding inherent image features in CNN-based assessment of diabetic retinopathy",
abstract = "Diabetic retinopathy (DR) is a leading cause of blindness and affects millions of people throughout the world. Early detection and timely checkups are key to reduce the risk of blindness. Automated grading of DR is a cost-effective way to ensure early detection and timely checkups. Deep learning or more specifically convolutional neural network (CNN)-based methods produce state-of-the-art performance in DR detection. Whilst CNN based methods have been proposed, no comparisons have been done between the extracted image features and their clinical relevance. Here we first adopt a CNN visualization strategy to discover the inherent image features involved in the CNN's decision-making process. Then, we critically analyze those features with respect to commonly known pathologies namely microaneurysms, hemorrhages and exudates, and other ocular components. We also critically analyze different CNNs by considering what image features they pick up during learning to predict and justify their clinical relevance. The experiments are executed on publicly available fundus datasets (EyePACS and DIARETDB1) achieving an accuracy of 89 ~ 95% with AUC, sensitivity and specificity of respectively 95 ~ 98%, 74 ~ 86%, and 93 ~ 97%, for disease level grading of DR. Whilst different CNNs produce consistent classification results, the rate of picked-up image features disagreement between models could be as high as 70%.",
author = "Roc Reguant and S{\o}ren Brunak and Sajib Saha",
year = "2021",
doi = "10.1038/s41598-021-89225-0",
language = "English",
volume = "11",
journal = "Scientific Reports",
issn = "2045-2322",
publisher = "nature publishing group",

}

RIS

TY - JOUR

T1 - Understanding inherent image features in CNN-based assessment of diabetic retinopathy

AU - Reguant, Roc

AU - Brunak, Søren

AU - Saha, Sajib

PY - 2021

Y1 - 2021

N2 - Diabetic retinopathy (DR) is a leading cause of blindness and affects millions of people throughout the world. Early detection and timely checkups are key to reduce the risk of blindness. Automated grading of DR is a cost-effective way to ensure early detection and timely checkups. Deep learning or more specifically convolutional neural network (CNN)-based methods produce state-of-the-art performance in DR detection. Whilst CNN based methods have been proposed, no comparisons have been done between the extracted image features and their clinical relevance. Here we first adopt a CNN visualization strategy to discover the inherent image features involved in the CNN's decision-making process. Then, we critically analyze those features with respect to commonly known pathologies namely microaneurysms, hemorrhages and exudates, and other ocular components. We also critically analyze different CNNs by considering what image features they pick up during learning to predict and justify their clinical relevance. The experiments are executed on publicly available fundus datasets (EyePACS and DIARETDB1) achieving an accuracy of 89 ~ 95% with AUC, sensitivity and specificity of respectively 95 ~ 98%, 74 ~ 86%, and 93 ~ 97%, for disease level grading of DR. Whilst different CNNs produce consistent classification results, the rate of picked-up image features disagreement between models could be as high as 70%.

AB - Diabetic retinopathy (DR) is a leading cause of blindness and affects millions of people throughout the world. Early detection and timely checkups are key to reduce the risk of blindness. Automated grading of DR is a cost-effective way to ensure early detection and timely checkups. Deep learning or more specifically convolutional neural network (CNN)-based methods produce state-of-the-art performance in DR detection. Whilst CNN based methods have been proposed, no comparisons have been done between the extracted image features and their clinical relevance. Here we first adopt a CNN visualization strategy to discover the inherent image features involved in the CNN's decision-making process. Then, we critically analyze those features with respect to commonly known pathologies namely microaneurysms, hemorrhages and exudates, and other ocular components. We also critically analyze different CNNs by considering what image features they pick up during learning to predict and justify their clinical relevance. The experiments are executed on publicly available fundus datasets (EyePACS and DIARETDB1) achieving an accuracy of 89 ~ 95% with AUC, sensitivity and specificity of respectively 95 ~ 98%, 74 ~ 86%, and 93 ~ 97%, for disease level grading of DR. Whilst different CNNs produce consistent classification results, the rate of picked-up image features disagreement between models could be as high as 70%.

U2 - 10.1038/s41598-021-89225-0

DO - 10.1038/s41598-021-89225-0

M3 - Journal article

C2 - 33958686

VL - 11

JO - Scientific Reports

JF - Scientific Reports

SN - 2045-2322

M1 - 9704

ER -
