Deep learning models based on the Transformer architecture have revolutionized the state of the art in NLP tasks. Since English is the language in which most significant advances are made, languages such as Spanish require dedicated training, but that training has such a high computational cost that only large corporations with servers and GPUs are capable of producing these models. This work explores how to derive a model for the Spanish language from a large multilingual model; specifically, a model for text summarization, a very common NLP task. The results, measured by summarization quality (ROUGE score), show that these small, language-specific models achieve results comparable to those of much larger models, with reasonable training demands in terms of time and computational power, and are significantly faster at inference.
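The summarization quality mentioned above is reported as a ROUGE score, which measures n-gram overlap between a generated summary and a reference summary. As an illustration only (not the evaluation code used in this work), a minimal ROUGE-1 F1 computation can be sketched in pure Python; the function name and whitespace tokenization are simplifying assumptions, whereas official implementations also apply stemming and report ROUGE-2 and ROUGE-L:

```python
from collections import Counter

def rouge_1_f1(reference: str, candidate: str) -> float:
    """Illustrative ROUGE-1 F1: unigram overlap between a candidate
    summary and a reference summary, using naive whitespace tokens."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Overlap counts each shared unigram at most min(ref, cand) times.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

# Example: a candidate missing one reference word ("negro").
score = rouge_1_f1("el gato negro duerme", "el gato duerme")
```

Here precision is 3/3 and recall is 3/4, giving an F1 of 6/7 ≈ 0.857; a perfect match scores 1.0 and disjoint summaries score 0.0.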