

Breast cancer is a dangerous disease and a leading cause of mortality in women. Early detection plays a pivotal role in improving survival rates, and breast ultrasound is considered an effective method for diagnosing breast diseases early. Because ultrasound is inexpensive, easy to perform, non-invasive, and painless, doctors often prescribe it when the nature of clinically palpable lesions or related breast symptoms must be examined. In this paper, we introduce a method based on transfer learning and deep feature fusion to classify breast cancer using ultrasound images. Experiments on 780 breast ultrasound images across three categories (benign, malignant, and normal) showed that models using fused deep features outperformed the original CNN: element-wise max fusion of deep features improved accuracy by about 1% to 4% over the original model, and concatenation fusion of VGG19 and ViT features likewise delivered a 1% to 4% accuracy gain over the original model alone.
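The two fusion strategies mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature vectors stand in for backbone outputs (e.g. from VGG19 and ViT), and the 512-dimensional size is an assumption chosen so that element-wise max fusion is well defined.

```python
import numpy as np

# Stand-ins for deep features extracted from the same ultrasound image
# by two backbones (e.g. VGG19 and ViT). Dimensions are assumed.
rng = np.random.default_rng(0)
vgg19_feat = rng.standard_normal(512)  # assumed VGG19 feature size
vit_feat = rng.standard_normal(512)    # assumed ViT feature size

# Max fusion: element-wise maximum of the two feature vectors
# (requires both vectors to have the same dimension).
max_fused = np.maximum(vgg19_feat, vit_feat)

# Concatenation fusion: stack the two vectors into one longer descriptor,
# which a downstream classifier head would then consume.
concat_fused = np.concatenate([vgg19_feat, vit_feat])

print(max_fused.shape)     # (512,)
print(concat_fused.shape)  # (1024,)
```

Max fusion keeps the descriptor size fixed by taking the stronger activation per dimension, while concatenation preserves both feature sets at the cost of a larger input to the classifier.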