With the growth of the aging population, the incidence of eye disease continues to rise. Traditional manual diagnosis is subjective and limited, whereas computer-aided diagnosis can both accelerate diagnosis and improve its accuracy. Conventional convolutional neural networks do not fully capture the effective features of an image, which limits classification accuracy. The computer-aided diagnosis algorithm proposed in this paper builds on image de-watermarking and data augmentation and integrates DenseNet with Squeeze-and-Excitation Networks (SENet), so that fundus image features are fully extracted and exploited while the network's use of global feature information is improved. Experimental results show that the model achieves a classification accuracy of 0.9528 on fundus images; compared with other convolutional networks, SE-DenseNet achieves the highest accuracy.
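The abstract does not include the authors' implementation. As a minimal sketch only, the following PyTorch code illustrates one way a DenseNet backbone can be combined with a Squeeze-and-Excitation block for fundus image classification: a DenseNet-121 from torchvision (an assumption; the paper does not specify the backbone depth here), an SE block that recalibrates the final feature maps channel-wise, and a hypothetical number of output classes.

```python
# Minimal sketch (not the authors' released code) of an SE-DenseNet-style model,
# assuming a torchvision DenseNet-121 backbone and a hypothetical num_classes=5.
import torch
import torch.nn as nn
from torchvision import models


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels using learned global statistics."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Squeeze: global average pooling to one scalar per channel.
        w = x.mean(dim=(2, 3))
        # Excitation: channel attention weights in (0, 1), broadcast back over H x W.
        w = self.fc(w).view(b, c, 1, 1)
        return x * w


class SEDenseNet(nn.Module):
    """DenseNet-121 backbone with SE recalibration applied to its final feature maps."""

    def __init__(self, num_classes: int = 5):
        super().__init__()
        backbone = models.densenet121(weights=None)
        self.features = backbone.features                      # dense blocks + transitions
        self.se = SEBlock(backbone.classifier.in_features)     # 1024 channels for DenseNet-121
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(backbone.classifier.in_features, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.se(self.features(x))
        f = self.pool(f).flatten(1)
        return self.classifier(f)


if __name__ == "__main__":
    model = SEDenseNet(num_classes=5)
    out = model(torch.randn(2, 3, 224, 224))   # fundus images resized to 224 x 224
    print(out.shape)                           # torch.Size([2, 5])
```

In the paper, SE modules may instead be inserted inside the dense blocks themselves; the sketch above only shows the simplest attachment point, after the last dense block, to make the channel-recalibration idea concrete.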
Wenwen Yuan, Xiaowei Xu, Xiaowei Li, Ye Tao, and Xiaodong Wang,
"Classification and recognition method of fundus images based on SE-DenseNet", Proc. SPIE 11720, Twelfth International Conference on Graphics and Image Processing (ICGIP 2020), 117201F (27 January 2021); https://doi.org/10.1117/12.2589339