Article Information
Development of a Lightweight DNN Model Using Channel Pruning with Transfer Learning
Abstract
English
Deep neural networks (DNNs) have been widely used in various applications; however, their computational complexity and memory requirements are becoming increasingly challenging, especially on resource-constrained devices such as mobile phones and embedded systems. In this paper, we propose a lightweight DNN model that uses channel pruning to reduce the computational and memory demands of DNNs on resource-constrained devices. Our approach combines channel pruning with transfer learning to maintain accuracy. Evaluation on the CIFAR-10 dataset shows improved performance over the unpruned model, with 78% test accuracy, 89% training accuracy, and 73% validation accuracy. The pruned model is therefore suitable for applications with limited computational resources.
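The sketch below is a minimal illustration (not the authors' code) of the general idea described in the abstract: start from a pretrained backbone (transfer learning), apply structured L1-norm channel pruning to the convolutional layers, and fine-tune on CIFAR-10. The ResNet-18 backbone, the 0.3 pruning ratio, and all hyperparameters are assumptions for illustration, not values taken from the paper.

```python
# Hedged sketch: channel pruning + transfer learning with PyTorch/torchvision.
# Backbone, pruning ratio, and training settings are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
import torchvision
from torchvision import transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# 1) Transfer learning: load an ImageNet-pretrained backbone and replace the
#    classifier head for the 10 CIFAR-10 classes.
model = torchvision.models.resnet18(weights=torchvision.models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 10)
model = model.to(device)

# 2) Channel pruning: zero out a fraction of output channels in each Conv2d
#    layer, ranked by the L1 norm of their filters (structured pruning, dim=0).
PRUNE_AMOUNT = 0.3  # illustrative ratio, not from the paper
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=PRUNE_AMOUNT, n=1, dim=0)
        prune.remove(module, "weight")  # make the pruned channels permanent

# 3) Fine-tune the pruned network on CIFAR-10 to recover accuracy.
transform = transforms.Compose([
    transforms.Resize(224),                      # match the ImageNet input size
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406),  # ImageNet statistics
                         (0.229, 0.224, 0.225)),
])
train_set = torchvision.datasets.CIFAR10(root="./data", train=True,
                                         download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

model.train()
for epoch in range(5):  # small number of fine-tuning epochs for illustration
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Note that this sketch only zeroes pruned channels; physically removing them to shrink the model, as a deployment-oriented pipeline would, requires rebuilding the affected layers and is omitted here for brevity.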
Table of Contents
1. Introduction
2. Related Works
3. Proposed Methodology
3.1. Proposed Framework
4. Experiments
4.1. Experimental setup
4.2. Dataset
4.3. Experimental result
5. Conclusions
Acknowledgment
References