This paper introduces a novel pruning method for transfer-learning models in computer vision that leverages XGBoost to enhance model efficiency through simultaneous depth and width pruning. Contemporary computer vision tasks often rely on transfer learning because of its efficiency after short training periods. However, the pre-trained architectures involved, while robust, are typically overparameterized, leading to inefficiencies when they are fine-tuned for simpler problems. Our approach uses Parallel Tree Boosting as a surrogate for the neural classifier to evaluate and effectively prune unnecessary convolutional filters in these architectures. The method addresses the interference that often arises in transfer-learning models and enables a significant reduction in model size. We demonstrate that depth and width pruning can be performed concurrently, allowing substantial reductions in model complexity without compromising performance. Experimental results show that the proposed XGB-based Pruner not only reduces model size by nearly 50% but also improves accuracy by up to 9% on test data compared to conventional models. Moreover, it can shrink the model by more than a hundredfold with minimal accuracy loss, demonstrating its effectiveness across various architectures and datasets. This approach marks a significant advancement in optimizing transfer-learning models, promising enhanced performance in real-world applications.
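The abstract does not spell out how the boosted trees score filters, so the following is only a minimal, hypothetical sketch of one way an XGBoost classifier could act as a surrogate for ranking convolutional filters before width pruning. The function name rank_filters_with_xgb, the use of globally average-pooled activations as features, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import xgboost as xgb

def rank_filters_with_xgb(pooled_activations, labels):
    """Rank convolutional filters of one layer by importance.

    pooled_activations: (n_samples, n_filters) array of globally
        average-pooled feature maps from a calibration set (assumed setup).
    labels: (n_samples,) class labels for the same samples.

    Returns filter indices sorted from least to most important, so the
    front of the returned array is the natural candidate set for pruning.
    """
    # Fit a gradient-boosted tree classifier that predicts the class
    # label directly from per-filter activation statistics.
    surrogate = xgb.XGBClassifier(
        n_estimators=100, max_depth=4, tree_method="hist"
    )
    surrogate.fit(pooled_activations, labels)

    # Each feature corresponds to one filter; its importance score is a
    # proxy for how much that filter contributes to the decision.
    importance = surrogate.feature_importances_
    return np.argsort(importance)

# Hypothetical usage: keep the most important half of a layer's filters.
# order = rank_filters_with_xgb(pooled, y_calib)
# keep = order[len(order) // 2:]
```

Under this reading, a layer whose filters all receive negligible importance could be removed entirely, which is how width-level scores might also drive the depth pruning described above; the actual criterion used in the paper may differ.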