F. MohiEldeen Alabbasy’s research while affiliated with Mansoura University and other places

Publications (1)


[Preview figures from the article: Fig. 1. The teacher-student architecture in the KD technique (Gou et al., 2021); Fig. 2. The knowledge distillation process; Fig. 3. Illustrations of the three types of knowledge (Gou et al., 2021); Fig. 4. Response-based knowledge distillation; Fig. 5. Feature-based knowledge distillation; plus 19 further figures not shown.]

Compressing Medical Deep Neural Network Models for Edge Devices using Knowledge Distillation
  • Article
  • Full-text available

June 2023 · 483 Reads · 10 Citations

Journal of King Saud University - Computer and Information Sciences

F. MohiEldeen Alabbasy · [co-authors not shown]

Recently, deep neural networks (DNNs) have been used successfully in many fields, particularly in medical diagnosis. However, deep learning (DL) models are expensive in terms of memory and computing resources, which hinders their deployment on resource-constrained devices and in delay-sensitive systems. These deep models therefore need to be accelerated and compressed to smaller sizes so that they can be deployed on edge devices without noticeably affecting their performance. In this paper, recent DNN acceleration and compression approaches are analyzed and compared with respect to their performance, applications, benefits, and limitations, with a particular focus on knowledge distillation as a successful emergent approach in this field. In addition, a framework is proposed for developing knowledge-distilled DNN models that can be deployed on fog/edge devices for automatic disease diagnosis. To evaluate the proposed framework, two compressed medical diagnosis systems based on knowledge-distilled deep neural models are proposed, one for COVID-19 and one for malaria. The experimental results show that these knowledge-distilled models are compressed to 18.4% and 15% of the original model size and their responses accelerated by 6.14x and 5.86x, respectively, with no significant drop in performance (0.9% and 1.2%, respectively). Furthermore, the distilled models are compared with other pruned and quantized models; the obtained results reveal the superiority of the distilled models in terms of compression rate and response time.
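The page does not reproduce the paper's training details, but the response-based distillation it builds on (Fig. 4 above) follows a well-known recipe: a small student is trained against a frozen teacher's temperature-softened logits alongside the ground-truth labels. The following is a minimal PyTorch sketch of that generic technique, not the authors' exact setup; the temperature and alpha values are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Weighted sum of a soft-target KL term and a hard-label CE term.

    temperature and alpha are illustrative hyperparameters, not
    values taken from the paper.
    """
    # Soften both output distributions with the same temperature T.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale the KL term by T^2 so its gradients stay on the same
    # scale as the cross-entropy term (Hinton et al., 2015).
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean")
    kd = kd * (temperature ** 2)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

def train_step(student, teacher, optimizer, images, labels):
    """One distillation step: the frozen teacher provides soft
    targets, and only the smaller student is updated."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A higher temperature exposes more of the teacher's non-target class probabilities (the "dark knowledge"), which is what lets the compact student approach the teacher's accuracy at a fraction of its size.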

Citations (1)


... This process of KD not only facilitates model compression but also enhances the generalization capabilities of the student model [10]. The success of KD is inherently tied to the quality and diversity of the datasets used during training, as well as to the broad range of applications of KD-based learning processes [1,12,14-19]. ...

Reference:

Citing article: Knowledge Distillation in Image Classification: The Impact of Datasets

Cited article: Compressing Medical Deep Neural Network Models for Edge Devices using Knowledge Distillation, Journal of King Saud University - Computer and Information Sciences